## What can decoherence do for us?

OK, so it’s time for the promised post about decoherence, but where to begin? Decoherence theory is now a vast subject with an enormous literature covering a wide variety of physical systems and scenarios. I will not deal with everything here, but just make some comments on how the theory looks from my point of view about the foundations of quantum theory. Alexei Grinbaum pointed me to a review article by Maximilian Schlosshauer on the role of decoherence in solving the measurement problem and in interpretations of quantum theory. That’s a good entry into the literature for people who want to know more.

OK, let me start by defining two problems that I take to be at the heart of understanding quantum theory:

1) The Emergence of Classicality: Our most fundamental theories of the world are quantum mechanical, but the world appears classical to us at the everyday level. Explain why we do not find ourselves making mistakes in using classical theories to make predictions about the everyday world of experience. By this I mean not only classical dynamics, but also classical probability theory, information theory, computer science, etc.

2) The ontology problem: The mathematical formalism of quantum theory provides an algorithm for computing the probabilities of outcomes of measurements made in experiments. Explain what things exist in reality and what laws they obey in such a way as to account for the correctness of the predictions of the theory.

I take these to be the fundamental challenges of understanding quantum mechanics. You will note that I did not mention the measurement problem, Schroedinger’s cat, or the other conventional ways of expressing the foundational challenges of quantum theory. This is because, as I have argued before, these problems are not *interpretation neutral*. Instead, one begins with something like the orthodox interpretation and shows that unitary evolution and the measurement postulates are in apparent conflict *within that interpretation* depending on whether we choose to view the measuring apparatus as a physical system obeying quantum theory or to leave it unanalysed. The problems with this are twofold:

i) It is not the case that we cannot solve the measurement problem. Several solutions exist, such as the account given by Bohmian mechanics, that of Everett/many-worlds, etc. The fact that there is more than one solution, and that none of them have been found to be universally compelling, indicates that it is not solving the measurement problem per se that is the issue. You could say that it is solving the measurement problem in a *compelling* way that is the issue, but I would say it is better to formulate the problem in such a way that it is obvious how it applies to each of the different interpretations.

ii) The standard way of describing the problems essentially assumes that the quantum state-vector corresponds more or less directly to whatever exists in reality, and that it is in fact all that exists in reality. This is an assumption of the orthodox interpretation, so we are talking about a problem *with the standard interpretation* and not with quantum theory itself. Assuming the reality of the state-vector simply begs the question. What if it does not correspond to an element of reality, but is just an epistemic object with a status akin to a probability distribution in classical theories? This is an idea that I favor, but now is not the time to go into detailed arguments for it. The mere fact that it is a possibility, and is taken seriously by a significant section of the foundations community, means that we should try to formulate the problems in a language that is independent of the ontological status of the state-vector.

Given this background viewpoint, we can now ask to what extent decoherence can help us with 1) and 2), i.e. the emergence and ontology problems. Let me begin with a very short description of what decoherence is in this context. The first point is that it takes seriously the idea that quantum systems, particularly the sort that we usually describe as “classical”, are open, i.e. they interact strongly with a large environment. Correlations between system and environment are typically established very quickly in some particular basis, determined by the form of the system-environment interaction Hamiltonian, so that the density matrix of the system quickly becomes diagonal in that basis. Furthermore, the basis in which the correlations exist is stable over a very long period of time, typically much longer than the lifetime of the universe. Finally, for many realistic Hamiltonians and a wide variety of systems, the decoherence basis corresponds very well to the kind of states we actually observe.
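To make the mechanism concrete, here is a minimal sketch of the standard exactly solvable dephasing model: one system qubit coupled to N environment qubits through σ_z ⊗ σ_z interactions. The couplings `g` are made-up numbers for illustration; the exact result is that the system’s off-diagonal density-matrix element, in the basis singled out by the interaction, gets multiplied by a product of cosines that collapses rapidly:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                          # number of environment qubits
g = rng.uniform(0.5, 1.5, N)    # made-up system-environment couplings

def coherence(t):
    # For H_int = sum_k g_k (sigma_z ⊗ sigma_z^(k)), with every environment
    # qubit prepared in |+>, the system's off-diagonal density-matrix element
    # in the sigma_z basis is multiplied by prod_k cos(2 g_k t) (exact result).
    return np.prod(np.cos(2 * g * t))

for t in [0.0, 0.5, 5.0]:
    print(f"t = {t}: coherence suppression factor = {abs(coherence(t)):.2e}")
```

The basis in which the state diagonalizes (here σ_z) is fixed by the interaction, the suppression sets in on a timescale of order 1/g, and because the factor is a product of cosines with incommensurate frequencies, a revival requires all N factors to return near ±1 simultaneously, which for large N takes an absurdly long time — the stability mentioned above.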

From my point of view, the short answer is that decoherence provides a good framework for addressing emergence, but has almost nothing to say about ontology. The reason for saying so should be clear: we have a good correspondence with our observations, but at no point in my description of decoherence did I find it necessary to mention a reality underlying quantum mechanics. Having said that, a couple of caveats are in order. Firstly, decoherence can do much more if it is placed within a framework with a well-defined ontology. For example, in Everett/many-worlds, the ontology is the state-vector, which always evolves unitarily and never collapses. The trouble with this is that the ontology doesn’t correspond to our subjective experience, so we need to supplement it with some account of why we see collapses, definite measurement outcomes, etc. Decoherence theory does a pretty good job of this by providing rules to describe that subjective experience, i.e. we will experience the world relative to the basis that decoherence picks out. However, the point here is that the work is not being done by decoherence alone, as claimed by some physicists, but also by a nontrivial ontological assumption about the state-vector. As I remarked earlier, the latter is itself a point of contention, so it is clear that decoherence alone does not provide a complete solution.

The second caveat is that some people, including Max Schlosshauer in his review, would deny the need to answer the ontology question at all. So long as we can account for our subjective experience in a compelling manner, why should we demand any more of our theories? The idea is then that decoherence solves the emergence problem, and we are done, because the ontology problem need not be solved at all. One could argue for this position, but to do so is thoroughly wrongheaded in my opinion, and this is so independently of my conviction that physics is about trying to describe a reality that goes beyond subjective experience. The simple point is that someone who takes this view seriously really has no need for decoherence theory at all. Firstly, given that we are not assigning ontological status to anything, let alone the state-vector, you are free to collapse it, uncollapse it, evolve it, swing it around your head or do anything else you like with it. After all, if it is not supposed to represent anything existing in reality, then there need not be any physical consequences for reality of any mathematical manipulation, such as a projection, that you might care to do. The second point is that if we are prepared to give a privileged status to observers in our physical theories, by saying that physics needs to describe their experience and nothing more, then we can simply say that the collapse is a subjective property of the observer’s experience and leave it at that. We already have privileged systems in our theory on this view, so what extra harm could that do?

Of course, I don’t subscribe to this viewpoint myself, but on both views described so far, decoherence theory either needs to be supplemented with an ontology, or is not needed at all for addressing foundational issues.

Finally, I want to make a couple of comments about how odd the decoherence solution looks from my particular point of view as a believer in the epistemic nature of wavefunctions. The first is that, from this point of view, the decoherence solution appears to have things backwards. When constructing a classical probabilistic theory, we first identify the ontological entities, e.g. particles that have definite trajectories, and describe their dynamics, e.g. Hamilton’s equations. Only then do we introduce probabilities and derive the corresponding probabilistic theory, e.g. Liouville mechanics. Decoherence theory does things in the other direction, starting from Schroedinger mechanics and then seeking to define the states of reality in terms of the probabilistic object, i.e. the state-vector. Whilst this is not obviously incorrect, since we don’t necessarily have to do things the same way in classical and quantum theories, it does seem a little perverse from my point of view. I’d rather start with an ontology and derive the fact that the state-vector is a good mathematical object for making probabilistic predictions, instead of the other way round.
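The classical ordering of steps can be sketched concretely (a toy illustration I am improvising, not anything from a particular textbook): the ontology is a set of definite phase-space trajectories obeying Hamilton’s equations, and the probabilistic theory only appears when we place a distribution over the unknown initial conditions and push it through the deterministic dynamics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1, the ontology: each system is a particle with a definite phase-space
# point (q, p), evolving deterministically under Hamilton's equations. For a
# unit-mass, unit-frequency harmonic oscillator the exact solution is
#   q(t) = q0 cos t + p0 sin t,   p(t) = p0 cos t - q0 sin t.
q0 = rng.normal(0.0, 1.0, 100_000)   # Step 2, the probabilities: a
p0 = rng.normal(0.0, 0.5, 100_000)   # distribution over unknown initial data

def evolve(t):
    q = q0 * np.cos(t) + p0 * np.sin(t)
    p = p0 * np.cos(t) - q0 * np.sin(t)
    return q, p

# Step 3, the derived probabilistic theory (Liouville mechanics): push the
# distribution forward along the trajectories and read off its moments.
q, p = evolve(1.0)
print(q.var(), p.var())   # the q and p marginals trade uncertainty over time
```

The probabilistic statements are entirely parasitic on the underlying trajectories, which exist whether or not anyone is uncertain about them. Decoherence theory runs this construction in reverse, trying to read the ontology off the probabilistic object.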

The second comment concerns an analogy between the emergence of classicality in QM and the emergence of the second law of thermodynamics from statistical mechanics. For the latter, we have a multitude of conceptually different approaches, which all arrive at somewhat similar results from a practical point of view. For a state-vector epistemist like myself, the interventionist approach to statistical mechanics seems very similar to the decoherence approach to the emergence problem in QM. Both say that the respective problems cannot be solved by looking at a closed Hamiltonian system, but only by considering interaction with a somewhat uncontrollable environment. In the case of stat-mech, this is used to explain the statistical fluctuations observed in what would otherwise be a deterministic system. The introduction of correlations between system and environment is the mechanism behind both processes. Somewhat bizarrely, most physicists currently prefer closed-system approaches to the derivation of the second law, based on coarse-graining, but prefer the decoherence approach when it comes to the emergence of classicality from quantum theory. Closed-system approaches have the advantage of being applicable to the universe as a whole, where there is no external environment to rely on. However, apart from special cases like this, one can broadly say that the two types of approach are complementary for stat-mech, and neither has a monopoly on explaining the second law. It is then natural to ask whether closed-system approaches to emergence in QM are available, making use of coarse-graining, and whether they ought to be given equal weight to the decoherence explanation. Indeed, such arguments have been given – here is a recent example, which has many precursors too numerous to go through in detail. I myself am thinking about a similar kind of approach at the moment. Right now, such arguments are at a disadvantage relative to decoherence in that the “measurement basis” has to be put in by hand, rather than emerging from the physics. However, further work is needed to determine whether this is an insurmountable obstacle.

In conclusion, decoherence theory has done a lot for our understanding of the emergence of classicality from quantum theory. However, it does not solve all the foundational questions about quantum theory, at least not on its own. Further, its importance may have been overemphasized by the physics community, because other, less-developed approaches to emergence could turn out to be of equal importance.


26th January 2007 at 11:06 am

http://www.m-w.com/cgi-bin/dictionary?va=privileged

26th January 2007 at 1:35 pm

Naive questions, exposing my ignorance…are there more or less realistic models demonstrating in detail how the environment chooses a basis via decoherence? apologies if this is in the review you refer to.

Re: ontology, I would feel more comfortable if I knew any operational way of deciding whether any specific ontology is “correct”, or even a reason to expect that a fully satisfactory solution exists…

thanks,

Moshe

26th January 2007 at 2:51 pm

Moshe said, “are there more or less realistic models demonstrating in detail how the environment chooses a basis via decoherence? apologies if this is in the review you refer to.”

Yes there are. That’s part of the technical programme and is probably the most substantial contribution of decoherence theory to foundational studies. The review article focusses mainly on conceptual/interpretational issues, but it does discuss one model by Zurek as an example, which fits into the “less realistic” category. However, you can find references to the extensive technical literature in the review.

One of the most impressive achievements is that decoherence can select different types of “basis” depending on the relative strengths of the system-environment and internal system Hamiltonians (I use quotation marks because the states selected can be overcomplete). If the internal Hamiltonian is much stronger then the energy basis tends to be selected and if it is much weaker then something approximating the position basis is often found. In intermediate regimes, things like coherent states can be selected. This corresponds well to what we see in the lab, and has significant explanatory power. The fact that other approaches to emergence cannot reproduce this yet is a significant point in favor of the decoherence explanation.
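As a rough illustration of this regime dependence (a toy calculation I am improvising here, not one of the models from the review), consider a qubit with internal Hamiltonian (Ω/2)σ_x, dephased by a static random environment field b along σ_z. When the coupling dominates, the reduced state diagonalizes in the σ_z basis; when the internal Hamiltonian dominates, it diagonalizes (much more slowly) in the σ_x energy eigenbasis:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(2)
bs = rng.normal(0.0, 1.0, 2000)          # sampled static environment fields

psi0 = np.array([np.cos(0.3), np.sin(0.3) * np.exp(0.4j)])  # generic pure state

def reduced_rho(omega, t):
    # H = (omega/2) sx + b sz for a frozen random field b; the reduced state
    # of the qubit is the average over b of the unitarily evolved pure state.
    rho = np.zeros((2, 2), dtype=complex)
    for b in bs:
        evals, V = np.linalg.eigh(0.5 * omega * sx + b * sz)
        U = V @ np.diag(np.exp(-1j * evals * t)) @ V.conj().T
        psi = U @ psi0
        rho += np.outer(psi, psi.conj())
    return rho / len(bs)

def offdiag(rho, basis):
    # magnitude of the off-diagonal element of rho in the given basis
    return abs((basis.conj().T @ rho @ basis)[0, 1])

zbasis = np.eye(2, dtype=complex)
xbasis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

rho_strong = reduced_rho(omega=0.01, t=10.0)   # coupling dominates
rho_weak = reduced_rho(omega=20.0, t=5000.0)   # internal Hamiltonian dominates

print("coupling-dominated:    |rho_01| in z, x bases:",
      offdiag(rho_strong, zbasis), offdiag(rho_strong, xbasis))
print("Hamiltonian-dominated: |rho_01| in z, x bases:",
      offdiag(rho_weak, zbasis), offdiag(rho_weak, xbasis))
```

In the first regime the z-basis off-diagonal is driven towards zero while the x-basis one stays large, and in the second the roles reverse; note how much longer a time is needed in the second case, since a strong internal Hamiltonian suppresses the effect of the coupling. This is of course a caricature; the realistic models in the literature involve genuine environmental dynamics rather than a frozen field.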

“Re: ontology, I would feel more comfortable if I knew any operational way of deciding whether any specific ontology is “correct”, or even a reason to expect that a fully satisfactory solution exists…”

I more or less agree with you. A good choice of ontology should eventually lead to an experimental prediction that it would have been inconceivable to make without it. This might not happen in current QM itself, but perhaps in future theories like quantum gravity. However, in this post, I left the question of what constitutes a “good” ontology open, because different people have different requirements. For example, one could weaken your operational requirement and just say that the ontology should have explanatory power, i.e. be a good way of thinking about the theory that aids intuition. I also don’t have a big problem with having more than one ontology available, so long as you can account for the terms of one ontology consistently in terms of the other. Arguably, we have this situation in classical mechanics, because the different formulations, e.g. Newton’s laws, Hamiltonian mechanics, Lagrangian mechanics, etc., suggest taking different quantities as the fundamental entities of the theory. On the other hand, we shouldn’t end up with competing ontologies that give radically different pictures of what is going on in reality and appear to be genuinely irreconcilable. Arguably, this is the case with current interpretations of QM.

26th January 2007 at 4:28 pm

The emergence of the second law can be (partially) tested and understood through the fluctuation theorems regarding probabilities of second law violations. This is a naive question, but is there a set of such statements regarding the role of decoherence in quantum systems as they are made larger? And have people studied the transition from quantum to classical as the system size/structure is varied?

26th January 2007 at 5:17 pm

The short answer is yes, although I’m no expert on this particular topic at the moment. I think one can view the remnants of interference effects on timescales shorter than the decoherence time as an effect somewhat analogous to fluctuations from the 2nd law. Having said that, I don’t think the analogies between emergence of classicality and emergence of the second law have received sufficient attention from theorists, so there are probably more precise statements that can be made. It’s something I’m interested in pursuing in the future.

9th February 2007 at 6:30 pm

Biologists Fixing Radios, Funny Journal Titles/Paper Graphics, and More! Commenting on the observation that he “doesn’t blog all that often”, Matt Leifer had the following to say:

Most bloggers think of their blog like a newspaper column… what I find interesting is that you don’t have to think …

9th March 2007 at 1:24 pm

[...] talked about his recent work on the emergence of classicality via coarse graining. I’ve mentioned it before on this blog, and it’s definitely a topic I’m becoming much more interested [...]

27th March 2007 at 12:59 pm

I’m an Italian student.

Thank you for this post. Since I started studying quantum mechanics (2 years ago), I have thought that something must be hidden beneath its probabilistic interpretation. Exactly this “ontology problem” has always been what makes QM unsatisfactory for me. I’m grateful to you, because finally I know I am not the only one to think this (in fact, in my university it seemed that I was), and now I know the name of the field of physics that studies it (so I can look for reviews). I apologize for my English.

Thanks,

Dimitri
