Archive for the ‘Quantum’ category

Refuting nonlocal realism?

2nd May 2007

Posting has been light of late. I would like to say this is due to the same sort of absorption that JoAnne has described over at Cosmic Variance, but in fact my attention span is currently too short for that and it has more to do with my attempts to work on three projects simultaneously. In any case, a report of an experiment on quantum foundations in Nature cannot possibly go ignored for too long on this blog. See here for the arXiv eprint.

What Gröblacher et al. report on is an experiment showing violations of an inequality proposed by Leggett, aimed at ruling out a class of nonlocal hidden-variable theories, whilst simultaneously violating the CHSH inequality, so that local hidden-variable theories are also ruled out in the same experiment. This is of course subject to the usual caveats that apply to Bell experiments, but let’s grant the correctness of the analysis for now and take a look at the class of nonlocal hidden-variable theories that are ruled out.
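As a reminder of what the CHSH part of the experiment is checking, here is a minimal numerical sketch (my own, in Python; it is not taken from the paper) that computes the CHSH combination for the spin singlet at the standard choice of measurement angles and recovers the Tsirelson value of 2√2, above the local hidden-variable bound of 2.

```python
import numpy as np

# Minimal sketch (not from the paper): the quantum CHSH value for the spin
# singlet.  For measurement directions in the x-z plane at angles a and b,
# the singlet correlation is E(a, b) = -cos(a - b).  Local hidden-variable
# theories obey |S| <= 2, while quantum theory reaches 2*sqrt(2).

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)  # (|01> - |10>)/sqrt(2)

def spin_op(theta):
    """Spin observable along a direction at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

def E(a, b):
    """Correlation of the +/-1 outcomes at angles a and b on the singlet."""
    obs = np.kron(spin_op(a), spin_op(b))
    return np.real(singlet.conj() @ obs @ singlet)

a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S), 2 * np.sqrt(2))  # both ~2.828, above the local bound of 2
```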

It is well-known that Bell’s assumption of locality can be factored into two conditions, stated informally in the list below and more formally just after it.

  • Outcome independence: the probability of the outcome at site A, given the hidden variables and both detector settings, does not depend on the outcome of the experiment at site B.
  • Parameter independence: the probability of the outcome at site A, given the hidden variables, does not depend on the choice of detector setting at site B.
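For definiteness, here is the standard way of writing the decomposition, added for readers who want the formal version (it is not taken from the paper under discussion): A and B denote the outcomes at the two sites, a and b the detector settings, and λ the hidden variable. Bell locality is equivalent to the conjunction of the two conditions.

```latex
% Standard decomposition of Bell locality into two conditions (illustration,
% not taken from the paper under discussion).
\begin{align}
  \text{Bell locality:} \quad
    & P(A,B \mid a,b,\lambda) = P(A \mid a,\lambda)\, P(B \mid b,\lambda), \\
  \text{Outcome independence:} \quad
    & P(A \mid a,b,B,\lambda) = P(A \mid a,b,\lambda), \\
  \text{Parameter independence:} \quad
    & P(A \mid a,b,\lambda) = P(A \mid a,\lambda).
\end{align}
```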

Leggett has proposed to consider theories that maintain the assumption of outcome independence, but drop the assumption of parameter independence.  It is worth remarking at this point that the attribution of fundamental importance to this factorization of the locality assumption can easily be criticized.  Whilst it is usual to describe the outcome at each site by ±1, this is an oversimplification.  For example, if we are doing Stern-Gerlach measurements on electron spins, then the actual outcome is a deflection of the path of the electron either up or down with respect to the orientation of the magnet.  Thus, the outcome cannot be so easily separated from the orientation of the detector, since its full description depends on that orientation.

Nevertheless, whatever one makes of the factorization, it is the case that one can construct toy models that reproduce the quantum predictions in Bell experiments by dropping parameter independence.  Therefore, it is worth considering what other reasonable constraints we can impose on theories when this assumption is dropped.  Leggett’s proposal amounts to assuming that the hidden variable states in the theory can be divided into subensembles, in each of which the two photons have a definite polarization (which may, however, depend on the distant detector setting).  The total ensemble corresponding to a quantum state is then a statistical average over such states.  This is the class of theories that has been ruled out by the experiment.

This is all well and good, and I am certainly in favor of any experiment that places constraints on the space of possible interpretations of quantum theory.  However, the experiment has been sold in some quarters as a “refutation of nonlocal realism”, so we should consider the extent to which this is true.  The first point to make is that there are perfectly good nonlocal realistic models, in the sense of reproducing the predictions of quantum theory, that do not satisfy Leggett’s assumptions – the prime example being Bohmian mechanics.  In the Bohm theory, photons do not have a well-defined value of polarization; instead, the measurement outcome is determined nonlocally via the quantum potential.  Therefore, if we regard this as a reasonable theory, then no experiment that simply confirms the predictions of quantum theory can be said to rule out nonlocal realism.

Why is many-worlds winning the foundations debate?

11th April 2007

Almost every time the foundations of quantum theory are mentioned in another science blog, the comments contain a lot of debate about many-worlds. I find it kind of depressing the extent to which many people are happy to jump on board with this interpretation without asking too many questions. In fact, it is almost as depressing as the fact that Copenhagen has been the dominant interpretation for so long, despite the fact that most of Bohr’s writings on the subject are pretty much incoherent.

Well, this year is the 50th anniversary of Everett’s paper, so perhaps it is appropriate to lay out exactly why I find the claims of many-worlds so unbelievable.

WARNING: The following rant contains Philosophy!

Traditionally, philosophers have made a distinction between analytic and synthetic truths. Analytic truths are those things that you can prove to be true by deduction alone. They are necessary truths and essentially they are just the tautologies of classical logic, e.g. either this is a blog post or this is not a blog post. On the other hand, synthetic truths are things we could imagine to have been another way, or things that we need to make some observation of the world in order to confirm or deny, e.g. Matt Leifer has never written a blog post about his pet cat.

Perhaps the central problem of the philosophy of science is whether the correctness of the scientific method is an analytic or a synthetic truth. Of course this depends a lot on how exactly you decide to define the scientific method, which is a topic of considerable controversy in itself. However, it’s pretty clear that the principle of induction is not an analytic truth, and even if you are a falsificationist you have to admit that it has some role in science, i.e. if a single experiment contradicts the predictions of a dominant theory then you call it an anomaly rather than a falsification. Of the other alternatives, if you’re a radical Kuhnian then you’re probably not reading this blog, since you are busy writing incoherent postmodern junk for a sociology journal. If you are a follower of Feyerabend then you are a conflicted soul and I sympathize. Anyway, back to the plot for people who do believe that induction has some role to play in science.

Kant’s resolution to this dilemma was to divide the synthetic truths into two categories, the a priori truths and the rest (I don’t know a good name for non-a priori synthetic truths). The a priori synthetic truths are things that cannot be directly deduced, but are nevertheless so basic to our functioning as beings living in this world that we must assume them to be true, i.e. it would be impossible to make any sense of the world without them. For example, we might decide that the fact that the universe is regular enough to perform approximately repeatable scientific experiments and to draw reliable inferences from them should be in the category of a priori truths. This seems reasonable because it is pretty hard to imagine that any kind of intelligent life could exist in a universe where the laws of physics were in continual flux.

One problem with this notion is that we can’t know a priori exactly what the a priori truths are. We can write down a list of what we currently believe to be a priori truths – our assumed a priori truths – but this is open to revision if we find that we can in fact still make sense of the world when we discard some of these assumed truths. The most famous example of this comes from Kant himself, who assumed that the way our senses are hooked up meant that we must describe the world in terms of events happening in space and time, implicitly assuming a Euclidean geometry. As we now know, the world still makes sense if we drop the Euclidean assumption, unifying space and time and working with much more general geometries. Still, even in relativity we have the concept of events occurring at spacetime locations as a fundamental primitive. If you like, you can modify Kant’s position to take this as the fundamental a priori truth, and explain that he was simply misled by the synthetic fact that our spacetime is approximately flat on ordinary length scales.

At this point, it is useful to introduce Quine’s pudding-bowl analogy for the structure of knowledge (I can’t remember what kind of bowl Quine actually used, but he’s making pudding as far as we are concerned). If you make a small chip at the top of a pudding bowl, then you won’t have any problem making pudding with it and the chip can easily be fixed up. On the other hand, if you make a hole near the bottom then you will have a sticky mess in the oven. It will take some considerable surgery to fix up the bowl and you are likely to consider just throwing out the bowl and sitting down at the pottery wheel to build a new one. The moral of the story is that we should be more skeptical of changes in the structure of our knowledge that seem to undermine assumptions that we think are fundamental. We need to have very good reasons to make such changes, because it is clear that there is a lot of work to be done in order to reconstruct all the dependent knowledge further up the bowl that we rely on every day. The point is not that we should never make such changes – just that we should be careful to ensure that there isn’t an equally good explanation that doesn’t require making such a drastic change.

Aside: Although Quine has in mind a hierarchical structure for knowledge – the parts of the pudding bowl near the bottom are the foundation that supports the rest of the bowl – I don’t think this is strictly necessary. We just need to believe that some areas of knowledge have higher connectivity than others, i.e. more other important things that depend on them. It would work equally well if you think knowledge is structured like a power-law graph for example.

The Quinian argument is often levelled against proposed interpretations of quantum theory, e.g. the idea that quantum theory should be understood as requiring a fundamental revision of logic or probability theory rather than these being convenient mathematical formalisms that can coexist happily with their classical counterparts. The point here is that it is bordering on the paradoxical for a scientific theory to entail changes to things on which the scientific method itself seems to depend – after all, we did use logical and statistical arguments to confirm quantum theory in the first place. Thus, if we revise logic or probability then the theory seems to be “eating its own tail”. This is not to say that this is an actual paradox, because it could be the case that when we reconstruct the entire edifice of knowledge according to the new logic or probability theory we will still find that we were right to believe quantum theory, but just mistaken about the reasons why we should believe it. However, the whole exercise is question-begging, because if we allow changes to such basic things then why not make a more radical change and consider the whole space of possible logics or probability theories? There are clearly some possible alternatives under which all hell breaks loose and we are seriously deluded about the validity of all our knowledge. In other words, we’ve taken a sledgehammer to our pudding bowl and we can’t even make jelly (jello for North American readers) any more.

At this point, you might be wondering whether a Quinian argument can be levelled against the revision of geometry implied by general relativity as well. The difference is that we do have a good handle on what the space of possible alternative geometries looks like. We can imagine writing down countless alternative theories in the language of differential geometry and figuring out what the world looks like according to them. We can adopt the assumed a priori truth that the world is describable in terms of events in some spacetime geometry and then we find the synthetic fact that general relativity is in accordance with our observations, while most of the other theories are not. We did some significant damage close to the bottom of the bowl, but it turned out that we could fix it relatively easily. There are still some fancy puddings – like the theory of quantum gravity (baked Alaska) – that we haven’t figured out how to make in the repaired bowl, but we can live without them most of the time.

Now, is there a Quinian argument to be made against the many-worlds interpretation? I think so. The idea is that when we apply the scientific method we assume we can do experiments which have actual definite outcomes. These are the basic data from which we build a confirmation or refutation of our theories. Many-worlds says that this assumption is wrong: there are no fundamental definite outcomes – it just appears that way to us because we are all entangled up in the wavefunction of the universe. This is a pretty dramatic assertion and it does seem to be bordering on the “theory eating its own tail” type of claim. We need to be pretty sure that there isn’t an equally good alternative explanation in which experiments really do have definite outcomes before accepting it. Also, as with the case of revising logic or probability, we don’t have a good understanding of the space of possible theories in which experiments do not have definite outcomes. I can think of one other theory of this type, namely a bizarre interpretation of classical probability theory in which all outcomes that are assigned nonzero probability occur in different universes, but two possible theories do not amount to much in the grand scheme of things. The problem is that on dropping the assumption of definite outcomes, we have not replaced it with an adequate new assumed a priori truth. That the world is describable by vectors in Hilbert space that evolve unitarily seems much too specific to be considered as a potential candidate. Until we do come up with such an assumption, I can’t see why many-worlds is any less radical than proposing a revision of logic or probability theory. Until then, I won’t be making any custard tarts in that particular pudding bowl myself.

Teaching Quantum Theory

26th March 2007

The recent article by Chandralekha Singh, Mario Belloni and Wolfgang Christian on Students’ understanding of Quantum Mechanics in Physics Today provoked an interesting series of letters in response. Both Robert Griffiths and Travis Norsen argue that students’ understanding would be improved by replacing the usual Copenhagen/Orthodox dogma with discussion of some more recent developments in the foundations of quantum theory.

Given that I don’t actually have much experience teaching quantum theory (I have only covered a lecturer’s absence for two lectures), it is perhaps a bit presumptuous for me to contribute my thoughts on this topic. Nevertheless, I do agree wholeheartedly with the basic sentiment of both these letters. I think one can easily see that at least some of the misconceptions that Singh, Belloni and Christian have written about could be remedied by a bit more foundational discussion at the ground level. For example, I think the common misconception that stationary states are the only allowed states of a quantum system could be dispelled by a deeper discussion of the sense in which quantum theory is analogous to classical probability theory.
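To illustrate the stationary-state point concretely, here is a toy sketch (my own, not from the article or the letters): an equal superposition of two energy eigenstates is a perfectly allowed state, and the probability of an outcome in a basis other than the energy basis oscillates at the Bohr frequency rather than staying constant.

```python
import numpy as np

# Toy sketch (my own): a superposition of two energy eigenstates |0> and |1>
# of a two-level system is an allowed state, and the probability of finding
# it in |+> = (|0> + |1>)/sqrt(2) oscillates at the Bohr frequency
# (E1 - E0)/hbar, i.e. the state is not stationary.

E0, E1, hbar = 0.0, 1.0, 1.0
ket0, ket1 = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)

for t in np.linspace(0, 2 * np.pi, 9):
    psi_t = (np.exp(-1j * E0 * t / hbar) * ket0
             + np.exp(-1j * E1 * t / hbar) * ket1) / np.sqrt(2)
    prob_plus = abs(plus.conj() @ psi_t) ** 2  # = (1 + cos((E1 - E0) t / hbar)) / 2
    print(f"t = {t:5.2f}   P(+) = {prob_plus:.3f}")
```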

However, I think both Griffiths and Norsen make a mistake in the approaches they advocate in their letters. Griffiths suggests replacing the orthodoxy with his own favored approach, namely decoherent/consistent histories, and Norsen thinks we should teach students Bohmian mechanics. In fact, in his letter Griffiths gives the misleading impression that his approach is universally and unproblematically accepted by all right-thinking physicists. Whilst the formalism certainly has quite a few adherents in quantum cosmology, it is far from true that it has received universal support from all serious thinkers on the foundations of quantum theory. Similarly, whilst I agree that Bohmian mechanics presents the clearest counterexample to many common misconceptions about quantum theory, it is far from clear that it represents the best road to future progress.

In my view, the problem is not that we are teaching the wrong orthodoxy to students, but rather that we are teaching them any orthodoxy at all, since foundations is a subject that is still mired in controversy to this day. It is hard for me to imagine any physicist who is not directly involved in foundations taking either Griffiths’ or Norsen’s arguments seriously, since their letters directly contradict each other about which approach is best to teach, and a non-specialist really has no way of deciding which one of them to trust. The view that foundations is a murky area, with no clear reason for choosing one approach over any other, is only reinforced by such arguments, and it is unlikely to persuade a skeptic to change their whole teaching strategy.

On the other hand, I do believe that there are a lot of developments in foundations that have made our current understanding much clearer, and these could be usefully communicated to students. For example, we have a much clearer understanding of the “no-go” theorems, such as Bell’s theorem, and their possible loopholes, and a much clearer understanding of the space of possible realist interpretations of quantum theory. We have an improved understanding of the classical limit, via decoherence theory amongst other approaches, and quantum information theory has shown that entanglement and the understanding of quantum theory as a generalized probability theory actually have useful consequences. I believe we should teach these things as a central part of quantum mechanics courses, and not just as peripheral topics covered in the last one or two lectures, which students are instructed not to worry about because it won’t be on the final exam! We should also give students an understanding of the space of possible resolutions to foundational problems, to equip them with a BS detector for statements they are likely to hear about quantum theory. Why do I believe this? Well, simply because I think it will leave students less confused about how to understand quantum theory and because I think these areas are all increasingly fruitful avenues of research that we might want to encourage them to pursue.

The difficult question, I think, is not the why but the how. It would entail battling against the prevailing wisdom that foundations are to be de-emphasised and relegated to the end of the course. Also, good teaching materials at an appropriate level that could supplement the existing curriculum are not readily available, and that is a problem we definitely have to address if we want this to happen.

Foundations Summer School: Apply Now!

12th March 2007

Just a short note to let you know that the application form for the Perimeter Institute Quantum Foundations Summer School is now available online from here. The application deadline is 20th May.

Update: I should have mentioned that for successful applicants who are grad students all expenses will be paid by Perimeter. That should make it easier to persuade your advisor to let you go. You don’t have to be an expert on foundations and we are hoping that students studying a wide variety of areas of Physics will attend.

Update 2: Whether non-students, e.g. postdocs, will be allowed to attend is still an open question. I’m waiting to hear more about this from the organizers. Clearly, the priority for a summer school has to be grad students, so I would speculate that it will depend on the number and quality of applications that we get. I’m just guessing at the moment though and I’ll post another update once I hear the official word.

Update 3: I have just heard that up to 10 places will be made available at the summer school for postdocs and junior faculty.

Foundations at APS, take 2

6th March 2007

It doesn’t seem that a year has gone by since I wrote about the first sessions on quantum foundations organized by the topical group on quantum information, concepts and computation at the APS March meeting. Nevertheless it has, and I am here in Denver after possibly the longest day of continuous sitting through talks in my life. I arrived at 8am to chair the session on Quantum Limited Measurements, which was interesting, but readers of this blog won’t want to hear about such practical matters, so instead I’ll spill the beans on the two foundations sessions that followed.

In the first foundations session, things got off to a good start with Rob Spekkens as the invited speaker explaining to us once again why quantum states are states of knowledge. OK, I’m biased because he’s a collaborator, but he did throw us a new tidbit on how to make an analog of the Elitzur-Vaidman bomb experiment in his toy theory by constructing a version for field theory.

Next, there was a talk by some complete crackpot called Matt Leifer. He talked about this.

Frank Schroeck gave an overview of his formulation of quantum mechanics on phase space, which did pique my interest, but 10 minutes was really too short to do it justice. Someday I’ll read his book.

Chris Fuchs gave a talk which was surprisingly not the same as his usual quantum Bayesian propaganda speech. It contained some new results about Symmetric Informationally Complete POVMs, including the fact that the states the POVM elements are proportional to are minimum uncertainty states with respect to mutually unbiased bases. This should be hitting an arXiv near you very soon.

Caslav Brukner talked about his recent work on the emergence of classicality via coarse graining. I’ve mentioned it before on this blog, and it’s definitely a topic I’m becoming much more interested in.

Later on, Jeff Tollaksen talked about generalizing a theorem proved by Rob Spekkens and myself about pre- and post-selected quantum systems to the case of weak measurements. I’m not sure I agree with the particular spin he gives on it, especially his idea of “quantum contextuality”, but you can decide for yourself by reading this.

Jan-Åke Larsson gave a very comprehensible talk about a “loophole” (he prefers the term “experimental problem”) in Bell inequality tests to do with coincidence times of photon detection. You can deal with it by having a detection efficiency just a few percent higher than that needed to overcome the detection loophole. Read all about it here.

Most of the rest of the talks in this session were more quantum information oriented, but I suppose you can argue they were at the foundational end of quantum information. Animesh Datta talked about the role of entanglement in the Knill-Laflamme model of quantum computation with one pure qubit, Anil Shaji talked about using easily computable entanglement measures to put bounds on those that aren’t so easy to compute and finally Ian Durham made some interesting observations about the connections between entropy, information and Bell inequalities.

The second foundations session was more of a mixed bag, but let me just mention a couple of the talks that appealed to me. Marcello Sarandy and Alioscia Hamma talked about generalizing the quantum adiabatic theorem to open systems, where you don’t necessarily have a Hamiltonian with well-defined eigenstates to talk about, and Kicheon Kang talked about a proposal for a quantum eraser experiment with electrons.

On Tuesday, Bill Wootters won a prize for best research at an undergraduate teaching college. He gave a great talk about his discrete Wigner functions, which included some new stuff about minimum uncertainty states and analogs of coherent states.

That’s pretty much it for the foundations talks at APS this year. It’s all quantum information from here on in. That is unless you count Zeilinger, who is talking on Thursday. He’s supposed to be talking about quantum cryptography, but perhaps he will say something about the more foundationy experiments going on in his lab as well.

Tao on Many-Worlds and Tomb Raider

27th February 2007

Terence Tao has an interesting post on why many-worlds quantum theory is like Tomb Raider.  I think it’s de Broglie-Bohm theory that is more like Tomb Raider though, as you can see from the comments.

Dates for your diary

20th February 2007

Update: I am informed that the Oxford Everett meeting will be in the summer rather than in September and is invitation only.  Also, there will be a Symposium on the Foundations of Modern Physics in Vienna 7th-10th June.  Registration for that is open until the end of March.

I haven’t been contemplating too many quantum quandaries recently because I was away at a workshop on Operator Structures in Quantum Information in Banff (a very interesting meeting and a highly recommended location) and am currently visiting Caltech. My brain is mostly full of mathematics and non-foundations oriented physics. In the meantime, here are some interesting foundations events coming up this summer.

Firstly, Perimeter Institute is organising its first Summer School on Quantum Foundations August 27th-31st. There have been several summer schools in other locations in the past, which have mostly been philosophy/interpretations oriented. The PI School will have a distinctly “physics” flavor, e.g. it will include lectures on experiments amongst other things. I’ve seen the list of speakers and it looks like it’s going to be really interesting. For grad students and postdocs interested in foundations, summer schools are highly recommended because of the sparsity of experts in the subject at most institutions. It’s how I became reasonably competent in the subject at any rate. Please don’t write to me requesting further details because I can’t help you. All the information is going to be posted on your favorite quantum websites/mailing lists very soon. Alternatively, you’ll be able to get to the school website via this link once it is up and running.

Secondly, the Institute for Quantum Computing and Perimeter are jointly running a series of quantum oriented workshops this summer under the banner Taming the Quantum World. There’s lots of interesting events for quantum information folks, so check out the website, but the workshop on Operational Quantum Physics and the Quantum-Classical Contrast, June 4th-7th, organized by Paul Busch and Lucien Hardy will be of special interest to readers of this blog.

Since I’m plugging foundations meetings at my own institutions, I should also mention Many Worlds at 50, organized by Jonathan Barrett, Adrian Kent and David Wallace, taking place September 21st-24th.

Given the number of meetings in Waterloo this year, it is somewhat surprising that the foundations community has also found time to organise some events at other locations. Here’s the rundown of the rest:

– March 5th-9th: APS March Meeting, Denver – Two focus sessions on quantum foundations have been organised.

– March 29th-31st: 15th UK and European Meeting on the Foundations of Physics, Leeds.

– April 13th-15th: New Directions in the Foundations of Physics, Maryland. It’s invitation only (and full) I’m afraid.

– June 11th-16th: Quantum Theory: Reconsideration of Foundations 4, Vaxjo.

– July 2nd-13th: Operational probabilistic theories as foils to quantum theory, Cambridge. It’s invitation only (and full).

– Sometime in September: Everett at 50, Oxford.

If I’ve missed any meetings or you have any new info on any of these then please leave a comment.

Quantum Brains

7th February 2007

OK, I should be preparing a talk, but it is late and my mind is wandering, so it’s not going to happen tonight.  Instead, I’ll pose this puzzler:  If quantum computers are more efficient than classical ones then why didn’t our brains evolve to take advantage of quantum information processing?

I have a vague recollection of seeing this question on a physics blog somewhere before, and it does have a family resemblance to Scott’s infamous post, albeit a more politically correct version.

There are a number of assumptions behind this question:

  • Evolution usually does a very efficient job of coming up with information processing devices.  As evidence for this, note that the best algorithms we have for some tasks simply imitate nature, e.g. neural networks, simulated annealing, etc.
  • Some functions of the brain, such as the ability to solve math problems, are best understood by regarding the brain as a kind of computer.  Note that we don’t need to say that the brain is merely a computer, only that it can be regarded as such for understanding some of its functions, i.e. we don’t need to get into a big philosophical debate about consciousness and artificial intelligence.
  • Further, in these respects the brain is a classical computer and not a quantum one.  It certainly seems that the information processing function of neurons can be understood in classical terms, i.e. neural networks again.  There is a small minority of experts who believe that quantum mechanics plays an essential role in the information processing functions of the brain for whom my question is nonsense.

Here are all the possible explanations I can think of.

  • The set of problems in BQP but not in P does not include anything that would have conferred a significant survival advantage on our ancestors.  Admittedly, efficient factoring could be useful for surviving high-school math class, as well as for cracking codes, but this wouldn’t have mattered so much to cave-people.  This would be disappointing, although not devastating, news for people trying to come up with new quantum algorithms.
  • There is some big problem with building a stable quantum computer of any appreciable size, and so present day experimentalists will eventually run into the same problems that nature did.
  • Dumb luck.  Evolution tends to find local minima in the landscape of all possible species.  A quantum brain may indeed sit in a deeper minimum than our current classical one, but we never got a big enough kick to get over the mountain separating that solution from ourselves.

The first two explanations seem like the most interesting ones.  If the third explanation wasn’t a possibility then there would have to be a tradeoff between the amount of progress possible in developing quantum algorithms and the amount possible in actually building a quantum computer.  Given that much quantum computing funding is predicated on the idea that massive progress is possible in both areas, I’d say we should thank Darwin for dumb luck!

What can decoherence do for us?

24th January 2007

OK, so it’s time for the promised post about decoherence, but where to begin? Decoherence theory is now a vast subject with an enormous literature covering a wide variety of physical systems and scenarios. I will not deal with everything here, but just make some comments on how the theory looks from my point of view about the foundations of quantum theory. Alexei Grinbaum pointed me to a review article by Maximilian Schlosshauer on the role of decoherence in solving the measurement problem and in interpretations of quantum theory. That’s a good entry into the literature for people who want to know more.

OK, let me start by defining two problems that I take to be at the heart of understanding quantum theory:

1) The Emergence of Classicality: Our most fundamental theories of the world are quantum mechanical, but the world appears classical to us at the everyday level. Explain why we do not find ourselves making mistakes in using classical theories to make predictions about the everyday world of experience. By this I mean not only classical dynamics, but also classical probability theory, information theory, computer science, etc.

2) The ontology problem: The mathematical formalism of quantum theory provides an algorithm for computing the probabilities of outcomes of measurements made in experiments. Explain what things exist in reality and what laws they obey in such a way as to account for the correctness of the predictions of the theory.

I take these to be the fundamental challenges of understanding quantum mechanics. You will note that I did not mention the measurement problem, Schroedinger’s cat, or the other conventional ways of expressing the foundational challenges of quantum theory. This is because, as I have argued before, these problems are not interpretation neutral. Instead, one begins with something like the orthodox interpretation and shows that unitary evolution and the measurement postulates are in apparent conflict within that interpretation depending on whether we choose to view the measuring apparatus as a physical system obeying quantum theory or to leave it unanalysed. The problems with this are twofold:

i) It is not the case that we cannot solve the measurement problem. Several solutions exist, such as the account given by Bohmian mechanics, that of Everett/many-worlds, etc. The fact that there is more than one solution, and that none of them have been found to be universally compelling, indicates that it is not solving the measurement problem per se that is the issue. You could say that it is solving the measurement problem in a compelling way that is the issue, but I would say it is better to formulate the problem in such a way that it is obvious how it applies to each of the different interpretations.

ii) The standard way of describing the problems essentially assumes that the quantum state-vector corresponds more or less directly to whatever exists in reality, and that it is in fact all that exists in reality. This is an assumption of the orthodox interpretation, so we are talking about a problem with the standard interpretation and not with quantum theory itself. Assuming the reality of the state-vector simply begs the question. What if it does not correspond to an element of reality, but is just an epistemic object with a status akin to a probability distribution in classical theories? This is an idea that I favor, but now is not the time to go into detailed arguments for it. The mere fact that it is a possibility, and is taken seriously by a significant section of the foundations community, means that we should try to formulate the problems in a language that is independent of the ontological status of the state-vector.

Given this background viewpoint, we can now ask to what extent decoherence can help us with 1) and 2), i.e. the emergence and ontology problems. Let me begin with a very short description of what decoherence is in this context. The first point is that it takes seriously the idea that quantum systems, particularly the sort that we usually describe as “classical”, are open, i.e. interact strongly with a large environment. Correlations between system and environment are typically established very quickly in some particular basis, determined by the form of the system-environment interaction Hamiltonian, so that the density matrix of the system quickly becomes diagonal in that basis. Furthermore, the basis in which the correlations exist is stable over a very long period of time, which can typically be much longer than the lifetime of the universe. Finally, for many realistic Hamiltonians and a wide variety of systems, the decoherence basis corresponds very well to the kind of states we actually observe.
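To make the statement that the density matrix “quickly becomes diagonal in that basis” concrete, here is a toy dephasing model (my own sketch, using an artificial random-phase environment rather than a realistic interaction Hamiltonian): a qubit prepared in an equal superposition becomes correlated with N environment qubits, and the off-diagonal element of its reduced density matrix is an environment-state overlap that is suppressed rapidly as N grows.

```python
import numpy as np

# Toy dephasing sketch (my own, not from any particular reference): a qubit
# starts in (|0> + |1>)/sqrt(2) and each of N environment qubits, initially
# in |+>, picks up a random phase conditioned on the system being in |1>.
# Tracing out the environment, the off-diagonal element of the system's
# density matrix is proportional to the overlap <E_0|E_1>, which shrinks
# rapidly with N: the reduced state becomes diagonal in the {|0>, |1>} basis.

rng = np.random.default_rng(1)

def off_diagonal(n_env):
    thetas = rng.uniform(0, 2 * np.pi, size=n_env)
    # Each environment qubit contributes a factor <+|(|0> + e^{i theta}|1>)/sqrt(2)
    overlap = np.prod((1 + np.exp(1j * thetas)) / 2)
    return abs(0.5 * overlap)  # |rho_01| of the reduced system state

for n in [0, 1, 5, 10, 20, 40]:
    print(f"N = {n:3d}   |rho_01| = {off_diagonal(n):.6f}")
```

In a realistic treatment the same suppression arises dynamically from the system-environment interaction Hamiltonian, and the basis that survives is the stable pointer basis referred to above.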

From my point of view, the short answer to the role of decoherence in foundations is that it provides a good framework for addressing emergence, but has almost nothing to say about ontology.  The reason for saying that should be clear:  we have a good correspondence with our observations, but at no point in my description of decoherence did I find it necessary to mention a reality underlying quantum mechanics.  Having said that, a couple of caveats are in order. Firstly, decoherence can do much more if it is placed within a framework with a well defined ontology. For example, in Everett/many-worlds, the ontology is the state-vector, which always evolves unitarily and never collapses. The trouble with this is that the ontology doesn’t correspond to our subjective experience, so we need to supplement it with some account of why we see collapses, definite measurement outcomes, etc. Decoherence theory does a pretty good job of this by providing us with rules to describe this subjective experience, i.e. we will experience the world relative to the basis that decoherence theory picks out. However, the point here is that the work is not being done by decoherence alone, as claimed by some physicists, but also by a nontrivial ontological assumption about the state-vector. As I remarked earlier, the latter is itself a point of contention, so it is clear that decoherence alone is not providing a complete solution.

The second caveat is that some people, including Max Schlosshauer in his review, would argue that there is no need to answer the ontology question at all. So long as we can account for our subjective experience in a compelling manner, why should we demand any more of our theories? The idea is then that decoherence can solve the emergence problem, and then we are done because the ontology problem need not be solved at all. One could argue for this position, but to do so is thoroughly wrongheaded in my opinion, and this is so independently of my conviction that physics is about trying to describe a reality that goes beyond subjective experience. The simple point is that someone who takes this view seriously really has no need for decoherence theory at all. Firstly, given that we are not assigning ontological status to anything, let alone the state-vector, you are free to collapse it, uncollapse it, evolve it, swing it around your head or do anything else you like with it. After all, if it is not supposed to represent anything existing in reality then there need not be any physical consequences for reality of any mathematical manipulation, such as a projection, that you might care to do. The second point is that if we are prepared to give a privileged status to observers in our physical theories, by saying that physics needs to describe their experience and nothing more, then we can simply say that the collapse is a subjective property of the observer’s experience and leave it at that. We already have privileged systems in our theory on this view, so what extra harm could that do?

Of course, I don’t subscribe to this viewpoint myself, but on both views described so far, decoherence theory either needs to be supplemented with an ontology, or is not needed at all for addressing foundational issues.

Finally, I want to make a couple of comments about how odd the decoherence solution looks from my particular point of view as a believer in the epistemic nature of wavefunctions. The first is that, from this point of view, the decoherence solution appears to have things backwards. When constructing a classical probabilistic theory, we first identify the ontological entities, e.g. particles that have definite trajectories, and describe their dynamics, e.g. Hamilton’s equations. Only then do we introduce probabilities and derive the corresponding probabilistic theory, e.g. Liouville mechanics. Decoherence theory does things in the other direction, starting from Schroedinger mechanics and then seeking to define the states of reality in terms of the probabilistic object, i.e. the state-vector. Whilst this is not obviously incorrect, since we don’t necessarily have to do things the same way in classical and quantum theories, it does seem a little perverse from my point of view. I’d rather start with an ontology and derive the fact that the state-vector is a good mathematical object for making probabilistic predictions, instead of the other way round.
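To spell out the classical construction being contrasted with here, in standard notation (nothing in the display below is original to this post): the ontic phase-space trajectories come first, and the probability density over phase space, which plays a role analogous to the quantum state, comes second.

```latex
% The classical ordering: ontology and dynamics first, probabilities second.
\begin{align}
  \dot{q}_i &= \frac{\partial H}{\partial p_i}, \qquad
  \dot{p}_i = -\frac{\partial H}{\partial q_i}
  && \text{(Hamilton's equations for the ontic trajectories)} \\
  \frac{\partial \rho}{\partial t} &= \{H, \rho\}
  = \sum_i \left( \frac{\partial H}{\partial q_i}\frac{\partial \rho}{\partial p_i}
  - \frac{\partial H}{\partial p_i}\frac{\partial \rho}{\partial q_i} \right)
  && \text{(Liouville's equation for the probability density } \rho(q,p,t)\text{)}
\end{align}
```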

The second comment concerns an analogy between the emergence of classicality in QM and the emergence of the second law of thermodynamics from statistical mechanics. For the latter, we have a multitude of conceptually different approaches, which all arrive at somewhat similar results from a practical point of view. For a state-vector epistemist like myself, the interventionist approach to statistical mechanics seems very similar to the decoherence approach to the emergence problem in QM. Both say that the respective problems cannot be solved by looking at a closed Hamiltonian system, but only by considering interaction with a somewhat uncontrollable environment. In the case of stat-mech, this is used to explain the statistical fluctuations observed in what would be an otherwise deterministic system. The introduction of correlations between system and environment is the mechanism behind both processes. Somewhat bizarrely, most physicists currently prefer closed-system approaches to the derivation of the second law, based on coarse-graining, but prefer the decoherence approach when it comes to the emergence of classicality from quantum theory. Closed-system approaches have the advantage of being applicable to the universe as a whole, where there is no external environment to rely on. However, apart from special cases like this, one can broadly say that the two types of approach are complementary for stat mech, and neither has a monopoly on explaining the second law. It is then natural to ask whether closed-system approaches to emergence in QM are available making use of coarse graining, and whether they ought to be given equal weight to the decoherence explanation. Indeed, such arguments have been given – here is a recent example, which has many precursors too numerous to go through in detail. I myself am thinking about a similar kind of approach at the moment. Right now, such arguments have a disadvantage compared to decoherence in that the “measurement basis” has to be put in by hand, rather than emerging from the physics as in decoherence. However, further work is needed to determine whether this is an insurmountable obstacle.

In conclusion, decoherence theory has done a lot for our understanding of the emergence of classicality from quantum theory. However, it does not solve all the foundational questions about quantum theory, at least not on its own. Further, its importance may have been overemphasized by the physics community because other less-developed approaches to emergence could turn out to be of equal importance.

Universitas Magistrorum et Scholarium

16th January 2007

I have arrived back in Waterloo to start my new hybrid University/Perimeter Institute position.  It’s been quite a long break from posting, because – strangely enough – having two affiliations means I had to do twice the amount of paperwork to get myself set up this time.  As much as I loved being at PI, it is nice to be back in a university and to have some small role in educating the next generation of quantum mechanics.

Over the break, Andrew Thomas has left a few comments about the role of decoherence in interpretations of quantum theory in my Professional Jealousy post.  There are some who think that understanding decoherence alone is enough to “solve” the conceptual difficulties with quantum theory.  This is quite a popular opinion in some quarters of the physics community, where one often finds people mumbling something about decoherence when asked about the measurement problem.  However, there are also many deep thinkers on foundations who have denied that decoherence completely solves the problems, and I tend to agree with them, so we’ll have a post on “What can decoherence do for us?” later on this week.

To clarify, I’m not going to argue that decoherence isn’t an important and real physical effect, nor am I going to say that it has no role at all in foundational studies, so please hold your fire until after the next post if you were thinking of commenting to that effect.