# Measurement problem

Not to be confused with the measure problem.

The measurement problem in quantum mechanics is the problem of how (or whether) wave function collapse occurs. The inability to observe such a collapse directly has given rise to different interpretations of quantum mechanics and poses a key set of questions that each interpretation must answer.

The wave function in quantum mechanics evolves deterministically according to the Schrödinger equation as a linear superposition of different states. However, actual measurements always find the physical system in a definite state. Any future evolution of the wave function is based on the state the system was discovered to be in when the measurement was made, meaning that the measurement "did something" to the system that is not obviously a consequence of Schrödinger evolution. The measurement problem is to describe what that "something" is: how a superposition of many possible values becomes a single measured value.
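The contrast can be sketched numerically. Below is a minimal illustration in Python of a generic two-state system (the diagonal Hamiltonian, energies, and time step are assumptions of the sketch, not any specific physical system): unitary Schrödinger evolution deterministically rotates the phases of the amplitudes while preserving the superposition, whereas a projective measurement yields a single definite outcome.

```python
import cmath
import random

# State |psi> = a|0> + b|1> stored as a pair of complex amplitudes.
psi = [1 / 2**0.5, 1 / 2**0.5]   # an equal superposition

def evolve(state, energy_0, energy_1, t, hbar=1.0):
    """Schrodinger evolution under an assumed diagonal Hamiltonian:
    each amplitude deterministically acquires a phase exp(-i E t / hbar)."""
    a, b = state
    return [a * cmath.exp(-1j * energy_0 * t / hbar),
            b * cmath.exp(-1j * energy_1 * t / hbar)]

def measure(state, rng=random):
    """Projective measurement: returns a definite outcome (0 or 1) with
    Born-rule probability |amplitude|^2, and the collapsed state."""
    p0 = abs(state[0]) ** 2
    if rng.random() < p0:
        return 0, [1.0, 0.0]
    return 1, [0.0, 1.0]

# Unitary evolution keeps the state a normalized superposition ...
later = evolve(psi, energy_0=0.0, energy_1=1.0, t=0.7)
norm = abs(later[0]) ** 2 + abs(later[1]) ** 2   # stays 1

# ... but a measurement always finds a definite state.
outcome, collapsed = measure(later)
```

Nothing in the `evolve` function ever produces the discontinuous jump performed by `measure`; that gap between the two rules is the measurement problem in miniature.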

To express matters differently (paraphrasing Steven Weinberg[1][2]), the Schrödinger wave equation determines the wave function at any later time. If observers and their measuring apparatus are themselves described by a deterministic wave function, why can we not predict precise results for measurements, but only probabilities? As a general question: How can one establish a correspondence between quantum and classical reality?[3]

## Schrödinger's cat

A thought experiment often used to illustrate the measurement problem is the "paradox" of Schrödinger's cat. A mechanism is arranged to kill a cat if a quantum event, such as the decay of a radioactive atom, occurs. Thus the fate of a large-scale object, the cat, is entangled with the fate of a quantum object, the atom. Prior to observation, according to the Schrödinger equation and numerous particle experiments, the atom is in a quantum superposition, a linear combination of decayed and undecayed states, which evolves with time. Therefore the cat should also be in a superposition, a linear combination of states that can be characterized as an "alive cat" and states that can be characterized as a "dead cat". Each of these possibilities is associated with a specific nonzero probability amplitude. However, a single, particular observation of the cat does not find a superposition: it always finds either a living cat or a dead cat. After the measurement the cat is definitively alive or dead. The question is: How are the probabilities converted into an actual, sharply well-defined classical outcome?
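The statistical side of this can be made concrete with a short simulation. The amplitudes below are arbitrary illustrative choices (any normalized pair would do); the point is that each individual trial produces a definite "alive" or "dead" outcome, while the long-run frequencies reproduce the Born-rule probabilities |amplitude|².

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Assumed amplitudes for the undecayed ("alive") and decayed ("dead")
# branches of the superposition; 0.8**2 + 0.6**2 = 1, so it is normalized.
amp_alive, amp_dead = 0.8, 0.6
p_alive = amp_alive ** 2            # Born rule: probability 0.64

trials = ["alive" if random.random() < p_alive else "dead"
          for _ in range(100_000)]

# Every single observation is definite -- never a superposition ...
assert set(trials) <= {"alive", "dead"}

# ... yet the relative frequencies recover the quantum probabilities.
freq_alive = trials.count("alive") / len(trials)
```

The simulation, of course, only restates the puzzle: it takes the Born rule as an input rather than deriving it from the deterministic Schrödinger evolution.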

## Interpretations

The Copenhagen interpretation is the oldest and probably still the most widely held interpretation of quantum mechanics.[4][5][6][7] Most generally, it posits that something in the act of observation results in the collapse of the wave function. How this could happen is widely disputed. In general, proponents of the Copenhagen interpretation tend to be impatient with requests for an epistemic explanation of the mechanism behind it. This attitude is summed up in the oft-quoted mantra "Shut up and calculate!"[8]

Hugh Everett's many-worlds interpretation attempts to solve the problem by suggesting that there is only one wave function, the superposition of the entire universe, and it never collapses—so there is no measurement problem. Instead, the act of measurement is simply an interaction between quantum entities, e.g., observer, measuring instrument, electron/positron, etc., which entangle to form a single larger entity, for instance living cat/happy scientist. Everett also attempted to demonstrate how the probabilistic nature of quantum mechanics would appear in measurements; this work was later extended by Bryce DeWitt.

De Broglie–Bohm theory tries to solve the measurement problem very differently: the information describing the system contains not only the wave function, but also supplementary data (a trajectory) giving the position of the particle(s). The role of the wave function is to generate the velocity field for the particles. These velocities are such that the probability distribution for the particle remains consistent with the predictions of orthodox quantum mechanics. According to de Broglie–Bohm theory, interaction with the environment during a measurement procedure separates the wave packets in configuration space, which is where apparent wave function collapse comes from, even though there is no actual collapse.
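The guidance equation that generates this velocity field is v = (ħ/m) Im(ψ′/ψ). A small sketch (with units, mass, and the plane-wave example all chosen for illustration) shows that for ψ = e^{ikx} the equation yields the constant velocity ħk/m, so the particle follows a definite trajectory at all times:

```python
import cmath

HBAR, MASS, K = 1.0, 1.0, 2.0   # illustrative units, not physical values

def psi(x):
    """A plane wave exp(i k x); any wave function could be substituted."""
    return cmath.exp(1j * K * x)

def bohm_velocity(x, dx=1e-6):
    """Guidance equation of de Broglie-Bohm theory:
    v = (hbar/m) * Im( psi'(x) / psi(x) ),
    with the derivative taken by a central finite difference."""
    dpsi = (psi(x + dx) - psi(x - dx)) / (2 * dx)
    return (HBAR / MASS) * (dpsi / psi(x)).imag

# For the plane wave, the velocity is hbar*k/m everywhere, so the
# trajectory x(t) = x0 + (hbar*k/m) * t is well defined without collapse.
v = bohm_velocity(0.3)
```

The definite outcome of a measurement is then simply the position the particle actually had, read off from its trajectory.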

The Ghirardi–Rimini–Weber (GRW) theory differs from other collapse theories by proposing that wave function collapse happens spontaneously. Particles have a non-zero probability of undergoing a "hit", or spontaneous collapse of the wave function, on the order of once every hundred million years.[9] Though collapse is extremely rare, the sheer number of particles in a measurement system means that the probability of a collapse occurring somewhere in the system is high. Since the entire measurement system is entangled (by quantum entanglement), the collapse of a single particle initiates the collapse of the entire measurement apparatus.
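A back-of-the-envelope calculation shows why this works. Taking the quoted hit rate of roughly once per hundred million years per particle, and assuming a macroscopic apparatus of ~10²³ particles (a typical order of magnitude, not a figure from the source), the mean waiting time for the first hit anywhere in the system drops from effectively never to a small fraction of a second:

```python
# Order-of-magnitude GRW estimate; all figures are rough.
SECONDS_PER_YEAR = 3.15e7
hit_rate = 1 / (1e8 * SECONDS_PER_YEAR)  # per particle per second, ~3e-16

n_micro = 1      # a single isolated particle
n_macro = 1e23   # assumed particle count of a macroscopic apparatus

# Hits are independent, so the total rate scales with N, and the mean
# waiting time for the first hit in the system is 1 / (N * rate).
t_micro = 1 / (n_micro * hit_rate)   # ~3e15 s: effectively never
t_macro = 1 / (n_macro * hit_rate)   # ~3e-8 s: essentially instantaneous
```

This is how GRW reproduces both regimes: isolated microscopic systems keep their superpositions for experimentally relevant times, while entangled macroscopic systems collapse almost immediately.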

Erich Joos and Heinz-Dieter Zeh claim that the phenomenon of quantum decoherence, which was put on firm ground in the 1980s, resolves the problem.[10] The idea is that the environment causes the classical appearance of macroscopic objects. Zeh further claims that decoherence makes it possible to identify the fuzzy boundary between the quantum microworld and the world where the classical intuition is applicable.[11][12] Quantum decoherence was proposed in the context of the many-worlds interpretation[citation needed], but it has also become an important part of some modern updates of the Copenhagen interpretation based on consistent histories.[13][14] Quantum decoherence does not describe the actual collapse of the wave function, but it explains the conversion of the quantum probabilities (that exhibit interference effects) to the ordinary classical probabilities. See, for example, Zurek,[3] Zeh[11] and Schlosshauer.[15]
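What decoherence does, and does not do, can be shown with a toy density matrix. In the sketch below the exponential suppression factor stands in for the shrinking overlap of environment states (its exact form is an assumption of the illustration): the off-diagonal interference terms decay, while the diagonal entries survive as ordinary classical probabilities; no single outcome is ever selected.

```python
import math

def superposition_density(p0):
    """Density matrix of the pure superposition
    sqrt(p0)|0> + sqrt(1-p0)|1>, as a 2x2 nested list."""
    a, b = math.sqrt(p0), math.sqrt(1 - p0)
    return [[a * a, a * b],
            [b * a, b * b]]

def decohere(rho, gamma, t):
    """Suppress the off-diagonal (interference) terms by exp(-gamma*t),
    leaving the diagonal occupation probabilities untouched."""
    d = math.exp(-gamma * t)
    return [[rho[0][0], rho[0][1] * d],
            [rho[1][0] * d, rho[1][1]]]

rho = superposition_density(0.64)
rho_late = decohere(rho, gamma=1e9, t=1e-6)  # off-diagonals ~ e^-1000 ~ 0

# The surviving diagonal is a classical probability distribution
# (0.64, 0.36); decoherence has removed interference, but it has
# NOT picked one of the two outcomes.
```

This is why decoherence explains the conversion of quantum probabilities to classical ones without, by itself, describing an actual collapse.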

The present situation is slowly clarifying, as described by Schlosshauer in a 2006 article:[16]

> Several decoherence-unrelated proposals have been put forward in the past to elucidate the meaning of probabilities and arrive at the Born rule ... It is fair to say that no decisive conclusion appears to have been reached as to the success of these derivations. ...
>
> As it is well known, [many papers by Bohr insist upon] the fundamental role of classical concepts. The experimental evidence for superpositions of macroscopically distinct states on increasingly large length scales counters such a dictum. Superpositions appear to be novel and individually existing states, often without any classical counterparts. Only the physical interactions between systems then determine a particular decomposition into classical states from the view of each particular system. Thus classical concepts are to be understood as locally emergent in a relative-state sense and should no longer claim a fundamental role in the physical theory.

Another approach is given by objective-collapse models. In such models, the Schrödinger equation is modified and acquires nonlinear terms. These nonlinear modifications are of stochastic nature and lead to a behaviour that for microscopic quantum objects, e.g. electrons or atoms, is unmeasurably close to that given by the usual Schrödinger equation. For macroscopic objects, however, the nonlinear modification becomes important and induces the collapse of the wave function. Objective-collapse models are effective theories. The stochastic modification is thought to stem from some external non-quantum field, but the nature of this field is unknown. One possible candidate is the gravitational interaction, as in the models of Diósi and Penrose. The main difference between objective-collapse models and the other approaches is that they make falsifiable predictions that differ from standard quantum mechanics. Experiments are already getting close to the parameter regime where these predictions can be tested.[17]

## References and notes

1. Weinberg, Steven (1998). "The Great Reduction: Physics in the Twentieth Century". In Michael Howard & William Roger Louis (eds.). The Oxford History of the Twentieth Century. Oxford University Press. p. 26. ISBN 0-19-820428-0.
2. Weinberg, Steven (November 2005). "Einstein's Mistakes". Physics Today. 58 (11): 31–35. Bibcode:2005PhT....58k..31W. doi:10.1063/1.2155755.
3. Zurek, Wojciech Hubert (22 May 2003). "Decoherence, einselection, and the quantum origins of the classical". Reviews of Modern Physics. 75 (3): 715–775. arXiv:quant-ph/0105127. Bibcode:2003RvMP...75..715Z. doi:10.1103/RevModPhys.75.715.
4. Schlosshauer, Maximilian; Kofler, Johannes; Zeilinger, Anton (August 2013). "A snapshot of foundational attitudes toward quantum mechanics". Studies in History and Philosophy of Science Part B. 44 (3): 222–230. arXiv:1301.1069. Bibcode:2013SHPMP..44..222S. doi:10.1016/j.shpsb.2013.04.004.
5. Sommer, Christoph (2013). "Another Survey of Foundational Attitudes Towards Quantum Mechanics". arXiv:1303.2719 [quant-ph].
6. Norsen, Travis; Nelson, Sarah (2013). "Yet Another Snapshot of Foundational Attitudes Toward Quantum Mechanics". arXiv:1306.4646 [quant-ph].