Thursday, February 28, 2019

Chemical potential raised states of the quantum vacuum state

We can construct an analog of a thermal state by using a number operator in place of the Hamiltonian; the result is unlike a thermal state because the number operator, and therefore the state it generates, is Lorentz invariantly defined.
This derivation has always rather delighted me. If μ is very large, this state is arbitrarily close to the vacuum state; as μ becomes smaller, more and more Lorentz invariant Gaussian noise is added into the system. This "extra quantum noise" state is an infinite energy mixed state that is unitarily inequivalent both to the vacuum state and to thermal states.
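A minimal sketch, in my own normalization (with [a(k), a†(k′)] = (2π)³2ωₖδ³(k−k′)), of the kind of state I mean for a free scalar field:
\[
\hat\rho_\mu=\frac{e^{-\mu\hat N}}{\operatorname{Tr}e^{-\mu\hat N}},\qquad
\hat N=\int\frac{\mathrm{d}^3k}{(2\pi)^3\,2\omega_k}\,a^\dagger(k)a(k),
\]
which gives every mode the same occupation number \(\bar n=1/(e^{\mu}-1)\), independent of \(k\), so that
\[
\langle\phi(x)\phi(y)\rangle_\mu
=\langle 0|\phi(x)\phi(y)|0\rangle
+\frac{2}{e^{\mu}-1}\int\frac{\mathrm{d}^3k}{(2\pi)^3\,2\omega_k}\cos\!\bigl(k\cdot(x-y)\bigr).
\]
The added term is a Lorentz invariant multiple of the symmetrized vacuum two-point function, precisely because \(\bar n\) does not depend on \(k\); a thermal state instead has \(\bar n=1/(e^{\omega_k/T}-1)\), which singles out a rest frame. As \(\mu\to\infty\) the added term vanishes and we recover the vacuum.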

This post was prompted by my leaving a comment on Azimuth, which included this: "This state could be said to be in a mixed state because it’s interacting with another Lorentz invariant field (which has been traced out) as a zero temperature bath with which it exchanges particles in a Lorentz invariant way. For any algebra of observables that doesn’t include absolutely every field (including dark matter, dark energy, or anything else we haven’t found yet, …), the state over that subalgebra must be a mixed state if there’s any interaction. For the EM field in interaction with a Dirac spinor field, for example, the vacuum state for the EM field with the Dirac spinor field traced out should be a mixed deformation of the free field vacuum state, which is pure. Haag’s theorem, after all, insists that the interacting vacuum sector must not be unitarily equivalent to the free vacuum sector."
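A finite-dimensional caricature of the tracing-out in that comment (a toy example of mine, not anything from the Azimuth thread): if interaction has entangled the observed system with a partner that we trace out,
\[
|\Psi\rangle=\sum_i c_i\,|i\rangle\otimes|\chi_i\rangle,\qquad
\langle\chi_i|\chi_j\rangle=\delta_{ij},
\]
then the state restricted to the observed system is
\[
\rho=\operatorname{Tr}_{\mathrm{partner}}|\Psi\rangle\langle\Psi|
=\sum_i |c_i|^2\,|i\rangle\langle i|,
\]
which is mixed whenever more than one \(c_i\) is nonzero, even though the joint state is pure.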

I first noticed this kind of structure about 15 years ago: a much less pretty derivation is discussed in my Phys. Lett. A 338 (2005) 8, https://arxiv.org/abs/quant-ph/0411156, which has the succinct title "A succinct presentation of the quantized Klein–Gordon field, and a similar quantum presentation of the classical Klein–Gordon random field". That succinct presentation has been my slowly evolving companion ever since; the derivation given above, which I can imagine occupying many, many impenetrable pages in a textbook QFT formalism, is part of that evolution.

Tuesday, February 05, 2019

"Lost in Math" - a review in perspective

Sabine Hossenfelder's "Lost in Math" is an enjoyable and worthwhile read. Supersymmetry (SUSY), String Theory, and various aspects of Quantum Gravity and Cosmology get healthy doses of criticism, but from my perspective the book is interesting more for how little attention even such a critique of particle physics gives to the two aspects of Quantum Mechanics and Quantum Field Theory that I think are most important, and for the springboard it provides to discuss them:
  1. our understanding of the relationship between CM and QM, and more specifically and necessarily between classical fields and quantum fields; and
  2. our understanding of quantum fields in the face of the mathematical ill-definedness of the theory, which requires that we regularize and renormalize integrals to give finite numbers.
My review, therefore, is a review of the few times Sabine comments on these two issues. Quite a number of reviews of everything else can be found elsewhere.

First up, here's a summary, on page 209, that mentions both aspects above, the measurement problem and renormalization, and Sabine also admits that she actually does want a helping of beauty in her math when she's doing some physics:
Sabine's response to Xiao-Gang Wen's Qubit Lattice Theory is initially very negative, which I suppose is because there's only a bare minimum of mathematical beauty in it, but she gets over it a little when she understands that Xiao-Gang is trying to avoid renormalization. This is one of only two mentions of that dirty math question, on pages 192-3 (for the other, see below):

On page 217, George Ellis addresses the measurement problem, which I read as being about fields (maybe it isn't, but it makes me happy to think so):
All of Chapter 6 is given over to the interpretation of QM (with, I think, essentially no mention, as George Ellis promised, of the measurement problem for QFT, despite Lost in Math being decidedly a QFT book), which explicitly considers this selection:
  • Copenhagen
  • QBism
  • de Broglie-Bohm
  • Many-worlds
  • Collapse
There's also what I think is an eminently reasonable discussion of the shut-up-and-calculate approach, conducted partly through a conversation with Chad Orzel. Sabine ends with what seems a rather pessimistic summary:
  • "Quantum mechanics works great, but many physicists complain that it's unintuitive and ugly.
  • Intuition can be built by experience, and quantum mechanics is a fairly young theory. Future generations may find it more intuitive.
  • In the foundations of quantum mechanics too, it is unclear what is the actual problem that requires a solution.
  • Maybe understanding quantum mechanics is just harder than we thought."
This doesn't say anything more than many have been saying for decades, that a new idea is required, which in a moment I will suggest is best provided by understanding an old idea ...

On page 29, Sabine notes that the unification of Heisenberg's and Schrödinger's formulations of quantum mechanics was a big deal. That was done by Dirac and others pointing out that both are Hilbert space formalisms in which measurements are represented by operators:

So if one wanted to unify classical and quantum mechanics, a natural suggestion would be to present Classical Mechanics in a Hilbert space formalism, which was done by Koopman in 1931, and then see just how close or how different CM and QM are, in detail. It turns out that one can construct isomorphisms both between the state spaces and between the operator spaces that represent measurements, so that if one has a measurement theory that one can live with for CM, the same measurement theory will work for QM (conversely, if you don't have anything you like for QM, the situation for CM is arguably just as bad).
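A bare-bones sketch of Koopman's construction, in my own notation and with one common sign convention, for a classical Hamiltonian H(q,p):
\[
\mathcal H_{\mathrm{cl}}=L^2(\mathbb R^{2n},\mathrm dq\,\mathrm dp),\qquad
(\hat f\psi)(q,p)=f(q,p)\,\psi(q,p),
\]
\[
\hat L=\mathrm i\{H,\cdot\},\qquad
\mathrm i\,\partial_t\psi_t=\hat L\,\psi_t,\qquad
\psi_t=e^{-\mathrm i\hat L t}\psi_0,
\]
so classical observables become (commuting) multiplication operators, the Liouville operator \(\hat L\) is self-adjoint and generates unitary time evolution, and expected values are \(\langle\psi|\hat f|\psi\rangle\): the same algebraic shape as the quantum formalism, which is what makes a detailed comparison of measurement theories possible.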
Now I'm gonna bang a drum. For these, see arXiv:1901.00526 (which is more elementary) and arXiv:1709.06711 (which discusses random fields and quantum fields, so the math is harder; also, assuming I can accommodate a referee's comments in the next two weeks, a version of it will appear in Physica Scripta in due course). Sorry, but such a good story is barely to be found anywhere else; however, you'll find in those two papers references to some of the Koopman-von Neumann work by others, which has been slowly taking off since about 2000. Perhaps one of those references will excite you more than my own attempt to make the math as clear as I can.

On pages 156-7, we have this beauty, which is never expanded upon, but we will use it here to jump into a discussion of the relationship between random fields and quantum fields:
Yes, that's warm! Chaos, in this kind of loose discussion, is as much about whether the initial conditions are hot or cold; thought about in a more detailed way, it could be about the spectrum of measured values of the system at a given time, or about the statistics of measurements. Chaos used loosely is no different from noise used loosely. In any case, chaos/noise is not formless: there is mathematics we can do, including but not only statistics, to describe different kinds of chaos/noise. If we can modulate the chaos/noise, we can use it to send signals and to do other useful stuff, and we can think of everything we can do with it as signal analysis and signal processing.
The simplest mathematical model we can use to construct powerful descriptions of noise is a random variable with a probability distribution. More than that, we will want to model the chaos in different places, for which we use an indexed set of random variables, Fa, Fb, Fc, ..., which is usually called a random field. The a, b, c, ... can be anything that indicates what it is that is measured, including as many details about the different components of measurement apparatuses as are necessary for a full description.
Suppose we measure Fa, Fb, Fc, ... in intergalactic space, at the same place at one hour intervals. There's noise, so we won't measure the same value every time: assuming the results are all numbers, we can present the measurement results, when we have enough of them, in a bar graph and calculate mean values, the standard deviation, and other statistics. In an operator formalism, as in arXiv:1901.00526, we can write the mean values as ρ(Fa), ρ(Fb), ρ(Fc), ..., correlation functions as the mean values of products such as ρ(FaFb), and we can measure the mean values of any sums of products of the Fa, Fb, Fc, ..., up to any degree, ρ(FaFbFcFdFeFf⋅⋅⋅). All this is either classical or quantum, so far, depending on structural details.
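As a toy version of that measurement record (entirely illustrative: the model, names, and numbers here are mine), here is a short Python sketch that draws repeated joint samples of three observables and estimates ρ(Fa), ρ(FaFb), and a higher moment as sample averages:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy noise model: three jointly Gaussian observables Fa, Fb, Fc with
    # zero mean and a chosen positive semi-definite covariance matrix M.
    M = np.array([[1.0, 0.3, 0.1],
                  [0.3, 0.5, 0.2],
                  [0.1, 0.2, 0.8]])

    # One row per hourly measurement, one column per observable.
    samples = rng.multivariate_normal(mean=np.zeros(3), cov=M, size=100_000)

    # Estimates of rho(Fa), rho(Fb), rho(Fc): the sample means.
    means = samples.mean(axis=0)

    # Estimates of rho(Fa Fb), etc.: the matrix of second moments, which for
    # this zero-mean toy model should be close to M.
    second_moments = samples.T @ samples / len(samples)

    # A higher moment, e.g. rho(Fa Fa Fb Fc), is also just a sample average.
    higher = np.mean(samples[:, 0] ** 2 * samples[:, 1] * samples[:, 2])

    print(means, second_moments, higher, sep="\n")

Whether such a record is "classical" or "quantum" is not visible in this bookkeeping; it shows up in the structural details of the algebra that the Fa, Fb, Fc, ... generate.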
There is a special idealized state, written ρ(⋅⋅⋅) as above and called the Gaussian state, for which ρ(Fa)=0, ρ(FaFb)=M(a,b), and all the higher moments can be written in terms of the matrix entries Mij=M(ai,aj), where we've numbered the indices a=a1, b=a2, c=a3, ..., and where the whole matrix M must be positive semi-definite. With this, we can construct a Hilbert space (using what is called the GNS construction, after Gelfand, Naimark, and Segal), which, if we make the a, b, c, ... and the M(⋅⋅⋅,⋅⋅⋅) just right, as in arXiv:1709.06711, can be either a random electromagnetic field or a quantized electromagnetic field, and we can construct isomorphisms between both the states and the measurements.
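Written out (this is the standard Wick/Isserlis formula for Gaussian, or "quasi-free", states, not anything specific to those papers), the higher moments are
\[
\rho(F_{a_1}F_{a_2}\cdots F_{a_{2n}})
=\sum_{\text{pairings }\pi}\ \prod_{(i,j)\in\pi,\ i<j} M(a_i,a_j),
\qquad
\rho(F_{a_1}\cdots F_{a_{2n+1}})=0,
\]
for example \(\rho(F_aF_bF_cF_d)=M(a,b)M(c,d)+M(a,c)M(b,d)+M(a,d)M(b,c)\). Keeping the order \(i<j\) inside each pair matters in the quantum case, where \(M(a,b)\neq M(b,a)\) in general; positive semi-definiteness of \(M\) is exactly what the GNS construction needs to turn these numbers into an inner product on a Hilbert space.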
Warming up to it yet?

A Gaussian state is called a free quantum field in QFT. Interactions modify a Gaussian state, which is as idealized as a spherical cow, to be something different, closer to whatever the measured values are in that intergalactic place, closer to the real cow. Those measured values are whatever they are, and whatever theoretical ideal state we introduce has to come usefully close to matching them. Feynman integrals, and the regularization and renormalization scheme that makes them give finite values, give us one way to generate values for an idealized state ρ(FaFbFcFdFeFf⋅⋅⋅), with lattice QFT giving us another way (slightly less ugly than Xiao-Gang's Qubit Lattice Theory, but not by much, IMO), but since 1950 we have barely looked for an alternative to using an ill-defined Lagrangian density to describe how states should be different from the Gaussian state: time not exactly wasted, but we could have done more. It's comforting to think that virtual particles cause interactions between particles, but we could learn to love and intuitively use a different generating system for nontrivial states in a decade or two.
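Schematically, for time-ordered correlation functions, the textbook recipe generates the interacting state's values as deformations of the Gaussian state ρ0 (this is just the usual Gell-Mann-Low/path-integral formula rewritten in the ρ(⋅⋅⋅) notation above, and it is exactly where the regularization and renormalization have to happen):
\[
\rho_{\mathrm{int}}\bigl(\mathrm T\,F_{a_1}\cdots F_{a_n}\bigr)
=\frac{\rho_0\!\bigl(\mathrm T\,F_{a_1}\cdots F_{a_n}\,
       e^{\,\mathrm i\int\mathcal L_{\mathrm{int}}(x)\,\mathrm d^4x}\bigr)}
      {\rho_0\!\bigl(\mathrm T\,e^{\,\mathrm i\int\mathcal L_{\mathrm{int}}(x)\,\mathrm d^4x}\bigr)},
\]
where T is time-ordering; expanding the exponential and applying the Wick formula above term by term is what produces the Feynman integrals, and those integrals are what come out infinite until they are regularized and renormalized.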
Sabine is very far indeed from being the only modern physicist not to care much about renormalization, but if she had interviewed Paul Dirac 40 years ago, she would have had more material than the few quotes on page 32 and would have been more reluctant to say, after two paragraphs, almost casually, "despite Dirac's disapproval, quantum electrodynamics is still part of the foundations of physics". She would have had an extra chapter about renormalization, and for me it's sad that there isn't one.

How, then, are we to avoid those infinities that have to be regularized and renormalized? We want a way to construct generating functions for an idealized state, giving us values for ρ(FaFbFcFdFeFf⋅⋅⋅) that are worth having for practical engineering. Since the 1950s, the starting point for this kind of thinking has been the Wightman axioms, which we can present, adapted from Rudolf Haag's book Local Quantum Physics, as:
  • A Hilbert space H supports a unitary representation of the Poincaré group; there is a unique Poincaré invariant vacuum state, of lowest energy.
  • Quantum fields are operator-valued distributions, linear maps from a measurement description space (the a, b, c, ...) into operators Fa, Fb, Fc, ... in a *-algebra A.
  • Quantum fields support a nontrivial representation of the Poincaré group.
  • Microcausality: commutativity at space-like separation (no faster-than-light signalling).
  • Completeness: the action of the quantum field is irreducible (that is, states must be pure).
  • Time-slice axiom (the state now determines the future state).
The mentions of the Poincaré group and of microcausality are empirically quite well justified, but at least three of these constraints are blatantly a priori, introduced more to make the math work nicely than to make the math physically useful: (1) that the vacuum state should be of lowest energy (a thermal equilibrium state does not satisfy this axiom); (2) that the Fa, Fb, Fc, ... must be linear functionals of the a, b, c, ... (classically, ρ(FaFb) can be understood to be an Fa response to an Fb modulation, which would not be expected to be linear in both a and b); and (3) completeness (again, a thermal equilibrium state does not satisfy this axiom; but also, if any degrees of freedom are traced out, which dark matter and dark energy are, the resulting state is a mixed state: we can, for example, usefully measure just the electromagnetic field, only inferring some aspects of, but not explicitly measuring, electric currents). One consequence of removing the second constraint, linearity, can be found in arXiv:1507.08299, though I think I might construct that paper in a somewhat different way now than I did four years ago. Others of these axioms could also be weakened or changed; perhaps one or more might even have to be strengthened to make the resulting system a better engineering tool: whatever must be done to allow us to match the experimental values must be done. The three changes above already give us a plethora of models to characterize and to check off against nature.
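To make (1) and (3) concrete (a standard observation, written here in finite-volume form): a thermal equilibrium state at inverse temperature β,
\[
\rho_\beta(A)=\frac{\operatorname{Tr}\!\bigl(e^{-\beta\hat H}A\bigr)}{\operatorname{Tr}\,e^{-\beta\hat H}},
\]
is a perfectly good physical state, but it is mixed rather than pure and it is not a lowest-energy state; in infinite volume it has to be characterized by the KMS condition instead, and its GNS representation is not unitarily equivalent to the vacuum representation, so it already sits outside the letter of these axioms.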

I should be clear that I'm not a good enough mathematician to do what I want to do (or, at least, it takes me a very long time to do it). John Baez is, however, so you can go for a look at how he's been constructing functors, in a series of blog posts, between a restricted category of classical mechanical systems and a restricted category of quantum mechanical systems:

  • Part 1: the mystery of geometric quantization: how a quantum state space is a special sort of classical state space.
  • Part 2: the structures besides a mere symplectic manifold that are used in geometric quantization.
  • Part 3: geometric quantization as a functor with a right adjoint, ‘projectivization’, making quantum state spaces into a reflective subcategory of classical ones.
  • Part 4: making geometric quantization into a monoidal functor.
  • Part 5: the simplest example of geometric quantization: the spin-1/2 particle.
  • Part 6: quantizing the spin-3/2 particle using the twisted cubic; coherent states via the adjunction between quantization and projectivization.
  • Part 7: the Veronese embedding as a method of ‘cloning’ a classical system, and taking the symmetric tensor powers of a Hilbert space as the corresponding method of cloning a quantum system.
  • Part 8: cloning a system as changing the value of Planck’s constant.

Lovely! And completely out of my league: you can see me try to make a connection with his work in the comments, and fail and be batted down. I was sad 😧! But I also think his approach is over-complicated: in all that formal structure you might not notice that he's constructing a new type of solution to the measurement problem. I also really hope he does field theories eventually, but he hasn't yet gotten to them.