How does measurement work in quantum mechanics?

Measurement processes in quantum mechanics

In the von Neumann scheme, the object system is coupled unitarily to the measuring apparatus, so that an initial product state evolves according to the Schrödinger equation as

$\Big(\sum_n c_n |n\rangle\Big)\,|\Phi_0\rangle \;\longrightarrow\; \sum_n c_n\,|n\rangle|\Phi_n\rangle ,$

where $|\Phi_0\rangle$ is the ›ready‹ state of the apparatus and the $|\Phi_n\rangle$ are its pointer states. The resulting state obviously does not describe what is observed: instead of a single component

$|n\rangle|\Phi_n\rangle$

with probability

$|c_n|^2 ,$

a superposition of all possible ›results‹ develops deterministically. This discrepancy is called the ›problem (or paradox) of the quantum mechanical measurement process‹.
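As an illustration (not part of the original article), the unitary premeasurement can be sketched with numpy for a two-state object; the CNOT-like coupling and the amplitudes are assumptions chosen for the example.

```python
import numpy as np

# Object system: |psi> = c0|0> + c1|1>  (assumed amplitudes c_n)
c0, c1 = np.sqrt(0.3), np.sqrt(0.7)
psi_obj = np.array([c0, c1])

# Apparatus starts in a fixed 'ready' state |Phi_ready> = |0>
phi_ready = np.array([1.0, 0.0])

# A CNOT serves here as an ideal measurement coupling: |n>|ready> -> |n>|Phi_n>
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Unitary (Schrödinger) evolution of the combined system
psi_total = cnot @ np.kron(psi_obj, phi_ready)

# The result is the entangled superposition c0|0>|Phi_0> + c1|1>|Phi_1>,
# not a single component |n>|Phi_n>:
expected = c0 * np.kron([1, 0], [1, 0]) + c1 * np.kron([0, 1], [0, 1])
print(np.allclose(psi_total, expected))  # True
```

The deterministic evolution never selects one outcome; it only correlates object and apparatus states.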

The usual answer to the problem is to claim that the wave function describes only probabilities. Leaving aside that it is usually not specified whether these are meant as probabilities for quantum mechanical states or for classical or other (›hidden‹) quantities, the decisive objection is that in all verifiable cases a superposition proves to be different from an ensemble of its components: it defines entirely new properties. In particular, such correlated (›entangled‹) states sometimes show surprising features, such as the violation of Bell's inequalities. For this reason Heisenberg's original idea of merely imprecisely determinable classical quantities has proven inadequate. Likewise, the argument that a measurement amounts to an ›uncontrollable disturbance‹ of the system cannot be upheld.
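The difference between a superposition and an ensemble of its components can be made concrete in a minimal numpy sketch (an illustration added here, assuming a single qubit and the Pauli observable $\sigma_x$ as the interference test):

```python
import numpy as np

# Superposition |+> = (|0> + |1>)/sqrt(2) vs. an equal-weight ensemble of |0>, |1>
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_superposition = np.outer(plus, plus)
rho_ensemble = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

# Both give identical statistics in the |0>,|1> ("pointer") basis ...
print(np.allclose(np.diag(rho_superposition), np.diag(rho_ensemble)))  # True

# ... but an interference observable (Pauli sigma_x) distinguishes them:
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
print(np.isclose(np.trace(rho_superposition @ sigma_x), 1.0))  # True: full interference
print(np.isclose(np.trace(rho_ensemble @ sigma_x), 0.0))       # True: no interference
```

This is the sense in which a superposition defines new properties not possessed by any ensemble of its components.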

The macroscopic nature of the measuring apparatus does not help here either. The microscopic dynamics within each component

$|n\rangle|\Phi_n\rangle$

may be very complicated (e.g. ergodic); nevertheless, an entangled state of the above form necessarily arises. A famous illustration of this is Schrödinger's cat experiment.

If the object system alone is described by means of the density matrix (density operator), the above dynamics take the form

$\rho = \mathrm{Tr}_{\text{apparatus}}\,|\Psi\rangle\langle\Psi| = \sum_{n,m} c_n c_m^*\,\langle\Phi_m|\Phi_n\rangle\,|n\rangle\langle m| \;\approx\; \sum_n |c_n|^2\,|n\rangle\langle n| ,$

which shows a disappearance of the off-diagonal elements, since the pointer states are approximately orthogonal. At first sight this seems to confirm the phenomenological theory. However, it is not a derivation of the collapse of the wave function, but merely a consistency check. An ensemble interpretation of this density matrix is inadmissible, since the object system possesses no state (wave function) of its own at all, not even an unknown one. Density matrices for subsystems are therefore also called improper mixtures. This distinction is not merely formal, but has its essential physical background in the non-locality of quantum states (EPR paradox, Bell's inequality).

Formally analogous results hold for the measuring apparatus. If the probability rules are applied to it, one obtains, as expected, pointer position n with probability

$|c_n|^2 .$

This is known as the ›shiftability of the cut‹ between object and observer. Because of this freedom, it is empirically very difficult, if not impossible, to decide whether and at what point a collapse actually occurs. One must merely not place the cut too close to the object: as long as interference can still be observed there, contradictions would arise. Towards the observer, on the other hand, the cut can be shifted at will, in the extreme case to the subjective observer himself, as for example von Neumann and Wigner suggested.
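That the predicted probabilities do not depend on where the cut is placed can be seen in a small numpy sketch (an added illustration with assumed amplitudes): tracing out either side of an entangled post-measurement state yields the same Born probabilities.

```python
import numpy as np

c = np.array([np.sqrt(0.2), np.sqrt(0.8)])  # assumed amplitudes c_n

# Post-measurement state  sum_n c_n |n>|Phi_n>  with orthogonal pointer states
psi = c[0] * np.kron([1, 0], [1, 0]) + c[1] * np.kron([0, 1], [0, 1])
rho = np.outer(psi, psi).reshape(2, 2, 2, 2)

# Cut between object and apparatus: trace out the apparatus ...
rho_object = np.trace(rho, axis1=1, axis2=3)
# ... or between apparatus and observer: trace out the object
rho_apparatus = np.trace(rho, axis1=0, axis2=2)

# Either way, the Born probabilities |c_n|^2 are the same:
print(np.allclose(np.diag(rho_object), c**2))     # True
print(np.allclose(np.diag(rho_apparatus), c**2))  # True
```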

Suggested solutions

The ways out of this dilemma can be roughly divided into three categories: (1) theories that leave or extend the quantum mechanical kinematics, (2) theories that modify the dynamics (the Schrödinger equation), and (3) proposals for interpreting the formally resulting entangled ›non-classical‹ states.

Group (1) includes Bohm's theory, which adds to the wave function, which evolves according to the Schrödinger equation (and therefore still contains all the problematic superpositions), particles and their trajectories as an essential element. In this theory it is assumed that only the particles are relevant for observation (are perceived), while the wave function takes on the role of a guiding field. The collapse of the wave function is replaced by a statistical assumption about the distribution of the particles. As Bell emphasized, the term ›hidden variables‹, commonly used for quantities intended to supplement the quantum mechanical description by wave functions, is misleading here, since it is, on the contrary, the wave function that is ›hidden‹.

Of the theories of group (2), the theory of spontaneous localization developed by Ghirardi, Rimini and Weber has been investigated most thoroughly. Such models modify the Schrödinger equation (usually by a stochastic correction term) in order to obtain a collapse or equivalent effects, and can therefore in principle be distinguished experimentally from ordinary quantum mechanics. The subjective observer plays no distinguished role, since perception can be assumed to run parallel to the states of certain objects (e.g. parts of the brain). Such a ›psycho-physical parallelism‹ was originally von Neumann's reason for introducing the collapse as a supplement to the Schrödinger dynamics. This is certainly the basis for the attractiveness of such models.

However, the deviations from quantum mechanics predicted by such models are masked, especially in the macroscopic domain of interest, by the decoherence effect described below.

The third group retains both the kinematics (i.e. the wave function) and its dynamics. It includes, above all, the interpretations going back to Everett, sometimes also referred to as many-worlds theories. In these, the Schrödinger equation is taken as the sole dynamics. As a consequence, the above superpositions of macroscopically different states necessarily arise. Schrödinger's cat is thus both dead and alive. The same holds for every observer, who then necessarily exists in different versions (in every component of the global wave function), which, however, cannot perceive one another.

Recent developments

In recent decades it has become increasingly clear that the von Neumann scheme of a measurement process described above is unrealistic in one decisive respect: the description of the measuring apparatus as an isolated system that evolves according to the Schrödinger equation (possibly coupled to an object to be measured) corresponds to a situation we never find in the real world. It turns out that macroscopic bodies interact very strongly with their natural environment. As a result, an entangled (quantum correlated) state develops in such a way that interference between different macrostates is no longer observable. This phenomenon is called decoherence.

For example, the position of a macroscopic body is constantly and unavoidably ›measured‹ by the scattering of photons or molecules, i.e. the scattering states contain information about the position of the object. This happens completely analogously to the unitary measurement dynamics described above, which is why one also speaks of measurement-like interactions. All macroscopic objects are therefore always strongly quantum correlated with their environment. Quantitative estimates show that this non-isolability from the natural environment is essential right down to the level of molecules. In fact, fully quantum mechanical behavior is observed only for very small molecules (such as hydrogen or ammonia).

Decoherence means that very many systems can no longer be found in superpositions of certain states (superselection rule). Schrödinger's cat therefore always appears either dead or alive; the non-classical superposition state is dynamically unstable and extremely short-lived. In the macroscopic domain, the effects of this irreversible coupling to the environment are much faster than thermal relaxation processes.
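Why such superpositions are extremely short-lived can be illustrated by a toy calculation (added here; all numbers are purely illustrative assumptions, not physical estimates): each scattering event multiplies the off-diagonal element of the object's density matrix by the overlap of the two scattered environment states, and macroscopic scattering rates are enormous.

```python
# Toy model of decoherence: every environmental scattering event multiplies the
# coherence rho_01 by the overlap |<E_0|E_1>| of the scattered environment states.
# The overlap and the rate below are assumed, illustrative numbers only.
overlap_per_event = 0.999       # assumed overlap per single scattering event
events_per_second = 1e20        # assumed scattering rate on a macroscopic body

rho_01 = 0.5                    # initial coherence of a 'cat' superposition
t = 1e-15                       # elapsed time: one femtosecond
n_events = events_per_second * t
rho_01_after = rho_01 * overlap_per_event ** n_events

print(rho_01_after)             # vanishingly small already after 1 fs
```

Even though each individual event barely disturbs the coherence, the exponential accumulation destroys it on timescales far shorter than any thermal relaxation.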

For certain (›classical‹) degrees of freedom, an effective restriction of the superposition principle of quantum theory thus follows from the non-locality of quantum states (the latter, ironically, itself being a consequence of the superposition principle). Such superselection rules therefore appear to be dynamically justifiable, in contrast to theories in which they are axiomatically postulated.

These realistic and quantitative considerations show that classical behavior (in the sense of the absence of interference) has less to do with the ›size‹ of a system than with the dynamical openness of most objects.

The possibility of studying decoherence effects experimentally in the mesoscopic range enables important tests of quantum theory. On the other hand, the unavoidability of the coupling to the environment is a major obstacle for the designers of quantum computers, since these require the controlled and sustained manipulation of (at least mesoscopic) superpositions.

The strong environmental coupling of macroscopic objects consequently leads to the framework of quantum cosmology. This must also include a quantum theory of gravity, which so far exists only in rudimentary form. For reasons of consistency, such considerations necessarily lead to the concept of a ›wave function of the universe‹. But by definition the universe also contains all of its observers. Here the interpretation problem of quantum theory shows itself in full sharpness.


Von Neumann, J.: Mathematical Foundations of Quantum Mechanics, Springer, 1932, 1981.
An analysis that appeared shortly after quantum theory was formulated. Chapter VI on the measurement process is still particularly worth reading today. The book also contains the (historically significant, but unfortunately useless) ›proof‹ of the non-extendability of quantum theory.
Jammer, M.: The Philosophy of Quantum Mechanics, Wiley, 1974.
A treasure trove of the historical development of the theory, with many citations and references.
Wheeler, J. A. and Zurek, W. H.: Quantum Theory and Measurement, Princeton University Press, 1983.
An annotated collection of contributions that have played a role in the discussion of the interpretation of quantum theory.
d'Espagnat, B.: Veiled Reality, Addison-Wesley, 1995.
One of the most thorough analyses of the question of what quantum theory has to tell us.
Giulini, D., Joos, E., Kiefer, C., Kupsch, J., Stamatescu, I.-O. and Zeh, H. D.: Decoherence and the Appearance of a Classical World in Quantum Theory, Springer, 1996.
Discusses recent developments in the foundations of quantum theory, with an emphasis on questions of interpretation.