Sunday, March 07, 2010

Algebra in Wonderland

Since I was a mathematics undergraduate at Christ Church, Oxford, from 1975-78, Charles Dodgson has inevitably held a certain fascination for me, albeit one I haven't pursued. This New York Times article tells me things I probably ought to have known already. Charles Dodgson seems to have been rather the curmudgeon, but it's not clear from this article whether he had a spark as a mathematics tutor or whether he escaped from his students as much as he could. Teaching a thousand 19th-century mathematics undergraduates brilliantly would probably not make a hundredth of the cultural impact that Alice in Wonderland has made, however.

Friday, March 05, 2010

Modulation of a random signal

Partly thanks to Built on Facts, where there is a post about "Hearing The Uncertainty Principle", and partly because I have been analyzing the datasets of Gregor Weihs' experiment (arXiv:quant-ph/9810080, but it's good to look at his thesis as well), I suggested there that we can say that "QFT is about modulation of a random signal", in contrast to a common signal-processing approach, in which we talk about modulation of a periodic signal.
Comment #9 more-or-less repeats what I said in my #3 (the part where I say "There is no quantum noise/fluctuations in your post, and there's none in the paper I cite above, so there's no Planck constant, which is, needless to say, a big difference."), but then goes on to something conventional, but unsupportable: "when you look for the QM particle, you will only find it in one (random) location". No to that. When you insert a high-gain avalanche photodiode somewhere in an experiment, (1) changing the configuration of the experiment will cause interference effects in other signals; (2) the avalanche photodiode signal will from time to time (by which I mean not periodically) be in the avalanche state (for the length of time known as the dead time). The times at which avalanche events occur will in some cases be correlated with eerie precision with the times at which avalanche events occur at remote places in the apparatus. Although it's entirely conventional to say that a "particle" causes an avalanche event in the avalanche photodiode, that straitjackets your understanding of QFT, and is, besides, only remotely correct if you back far away from any lingering classical ideas of what a "particle" is that aren't explicitly contained in the mathematics of Hilbert space operators and states.
Try saying, instead, "QFT is about modulation of a random signal". The post more-or-less talks about modulation of a periodic signal, but we can also talk about modulation of a Lorentz invariant vacuum state. If we use probability theory to model the vacuum state (we could also use stochastic processes, but that's a different ballgame), the mathematics is raised a level above ordinary signals, in the sense that we have introduced probability measures over the linear space of ordinary signals, as a result of which the tensor product emerges quite naturally.
For me, Matt Springer's posts are somewhat variable, perhaps because he's attempting to keep it simple, which, as we know, is one of the hardest things to attempt, but he hits the spot often enough to remain interesting. For my comment #3, see his post.

The eeriness of the correlations of the times at which avalanche events happen in avalanche photodiodes that I mention above is pretty extreme in Gregor Weihs' experiment and others like it. There's a central parametric down-conversion apparatus that feeds two fiber optic cables that are 500 meters long, which at the speed of light is equivalent to about 1600 nanoseconds. When an avalanche photodiode is set up at each of the two remote ends of the two 500 meter fiber optic cables, about 1/20th of the time avalanche events happen within 1 nanosecond of each other, compared to 1600 nanoseconds. The other 19/20ths of the time, there's not much of a match. We can plot the avalanche events that match within 200 nanoseconds from 2 seconds of Gregor Weihs' data:

In this plot, there's some additional information, which shows, at "Alice"'s end of the experiment, the direction at which an electromagnetically controlled polarization device was set (0 or 90 degrees is one setting, 45 or 135 degrees is the other; the setting is switched at random, but on average every few hundred nanoseconds), and in which of two avalanche photodiodes there was an avalanche event (in effect choosing between 0 and 90 degree polarization or choosing between 45 and 135 degree polarization).
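
As a rough sketch of how the matching behind this plot might be done, suppose Alice's and Bob's event timestamps have already been read into numpy arrays of seconds (t_alice, t_bob and the per-event setting and detector arrays are hypothetical names here, not the layout of Weihs' actual data files). Something like the following finds, for each of Alice's events, the closest of Bob's events and keeps the pairs that fall within a 200 nanosecond window:

import numpy as np

def nearest_matches(t_alice, t_bob, window=200e-9):
    """For each of Alice's events, find the closest of Bob's events and
    flag the pair if the time difference is within `window` seconds."""
    t_bob = np.sort(t_bob)
    # index of the first Bob event at or after each Alice event
    idx = np.searchsorted(t_bob, t_alice)
    # candidate neighbours on either side, clipped to the array bounds
    left = np.clip(idx - 1, 0, len(t_bob) - 1)
    right = np.clip(idx, 0, len(t_bob) - 1)
    d_left = np.abs(t_alice - t_bob[left])
    d_right = np.abs(t_alice - t_bob[right])
    nearest = np.where(d_left < d_right, t_bob[left], t_bob[right])
    dt = t_alice - nearest
    keep = np.abs(dt) <= window
    return dt, keep

# dt, keep = nearest_matches(t_alice, t_bob)
# dt[keep] are the signed time differences that would appear in the plot.

The events that survive the window can then be coloured by Alice's setting and by which of her two photodiodes fired, which is the additional information shown in the plot.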

There are lots of events within about 1 nanosecond; there is a small excess of events that have a match within about 20 nanoseconds; then the rest are distributed evenly. Beyond the 200 nanosecond extent of this plot, the time differences between events in "Alice"'s and "Bob"'s avalanche photodiodes that match most closely are just as uniformly distributed as here, out to a difference of about 20,000 nanoseconds; then there's a slightly decreasing density, until all 9711 [edit: this number of events happened in the first 1/4 second; multiply by 8, more-or-less, for the number of events in 2 seconds] of the times of avalanche events in Alice's data are within 120,000 nanoseconds of some avalanche event in Bob's data. The graph below shows how close each of Alice's events is to the closest event in Bob's events in the same 2 second fragment of the dataset [edit: the graph is in fact for the first quarter second of data from the same run; the general features are unchanged]:
There are about 500 of Alice's events that are so close to events in Bob's events that they don't show on the graph; then most of Alice's events are, fairly uniformly, within a time difference of about 2e-5 seconds (= 20,000 nanoseconds). The graph is steep at the far right because there are very few of Alice's events that are so separated in time from any of Bob's events.
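
To give an idea of how that graph might be produced, here is a sketch that reuses the hypothetical nearest_matches from the sketch above with no window applied, and sorts the absolute time differences so that the near-coincident events sit flat against the axis on the left and the few stragglers turn the curve steeply upward at the right (t_alice and t_bob are again hypothetical arrays of timestamps in seconds for the same quarter second of data):

import numpy as np
import matplotlib.pyplot as plt

# Distance from each of Alice's events to the nearest of Bob's events,
# with no coincidence window applied.
dt, _ = nearest_matches(t_alice, t_bob, window=np.inf)

# Sorting the absolute differences gives the rising curve described above.
plt.plot(np.sort(np.abs(dt)))
plt.xlabel("Alice's events, ordered by distance to the nearest of Bob's events")
plt.ylabel("time to the nearest of Bob's events (seconds)")
plt.show()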

Trying to make some sense of large amounts of data, with good dollops of muddy randomness thrown in! Modulations of a random signal.