This article describes the very basics of interactive decoherence theory, outlining its bearing on alleged connections between quantum theory and philosophy of mind or cognitive science.
The ‘Connection’ Between Minds and Quantum Mechanics
Why should a cognitive scientist or philosopher of mind care about quantum mechanics?
Two principal concerns prompt some theorists to link quantum mechanics to cognitive science by way of theories of consciousness. On the one hand, it is commonly thought that consciousness plays an important role in quantum measurement and that any good account of consciousness ought to explain how it can play such a role. On the other hand, a few people think the framework of classical physics is not, by itself, up to the task of explaining consciousness and that a good explanation must rely on special features of quantum mechanics, such as state vector reduction, or on purported special capabilities of physical substrates existing in hypothesised coherent states of linear superposition. Talk of Schrödinger’s cat, an example many believe shows that quantum mechanics predicts bizarre superposed states for ordinary objects like cats, is often mixed in for good measure.
But comparatively large and high-temperature items like cats (or neurons) simply do not exist in persisting states of linear superposition capable of exhibiting interference effects, and quantum mechanics, contrary to received wisdom, offers no reason to think they should. Cats, bats, lumps of wax, and even physicists’ friends spend their time in well-defined classical states; their behaviour, even after interaction with thoroughly quantum systems like decaying atoms, can be described perfectly well with ordinary probability calculus. It turns out that effective classicality extends, under almost all conditions, far below the neural level to that of medium-sized molecules. Yet this is entirely consistent with the notion that modern quantum mechanics is a universal theory applicable to everything in the cosmos, and even to the cosmos itself.
Breaking the ‘Connection’
My general strategy for dispensing with the above alleged connection rests on two basic ideas from modern quantum theory:
- State vector reduction need not be considered a real physical process; calculational results empirically indistinguishable from it can be derived solely through mechanisms of interactive decoherence. Thus, quantum measurement does not rely on the presence of a conscious observer in any way whatsoever.
- The phenomenon of interactive decoherence means that most items larger than medium-sized molecules do not exist in persistent states of linear superposition. Thus, the kinds of substrates of typical concern to cognitive scientists, such as neuron-sized, room-temperature biological cells, offer no opportunity for hypothesising about special properties due to quantum effects.
(Until the unlikely event of something motivating me to attempt a conversion of several equations in bra and ket notation into HTML, this discussion will have to remain abbreviated and make do without equations. Readers interested in the full story of interactive decoherence, couched in the language of consistent histories, should go straight to Roland Omnès’s excellent 1994 book The Interpretation of Quantum Mechanics. A condensed version, together with added philosophical analysis and argument about the place of quantum mechanics in cognitive theory and consciousness research, appears in my book Mind Out of Matter.)
The upshot of decoherence is that, regardless of whether there are any conscious observers around or not, objects which we would expect to behave essentially classically do exactly that. Interaction between objects and their environments, both external and internal, does the job of ‘observation’ previously accorded to conscious observers, effecting a process which is experimentally indistinguishable from state vector reduction.
The basic principle which explains the fact of interactive decoherence is that, under ordinary circumstances for a macroscopic object coupled to an environment, whether external or internal, wavefunctions for macroscopically distinct states very rapidly become orthogonal. This is due to the dense energy spectrum of the environment and the immense dimensionality of the relevant Hilbert space. It is analogous to the fact in ordinary geometry that as the dimensionality of a space grows without bound, the proportion of pairs of its subspaces which intersect transversally, i.e. which lie in general position relative to one another, approaches unity. (‘General position’ indicates that two subspaces intersect in the smallest possible dimension while their span has the greatest possible dimension; for example, two planes in general position in three dimensions intersect in a line, while two planes in general position in four dimensions intersect in a point.) Simplifying hugely, the probability of randomly selecting two wavefunctions in a high-dimensional Hilbert space which are not orthogonal is vanishingly small. So long as there is even a very small coupling between an object and its environment, this tendency of the environment’s wavefunctions rapidly to become orthogonal means that coherent phase relationships between macroscopically distinct states are destroyed, off-diagonal terms of the reduced density matrix vanish, and interference effects become impossible. This amounts to an extremely rapid and efficient decrease in the number of an object’s possible states which can be distinguished through their effects on the environment (in other words, which are experimentally meaningful). To give an idea of the efficiency of the process, Joos and Zeh (1985) calculate a decoherence time of 10⁻³⁶ seconds for a spherical grain of dust one micron in radius at standard temperature and pressure.
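The geometric point can be checked numerically. The following sketch (my illustration, not part of the original argument) draws pairs of random unit vectors in complex spaces of increasing dimension and shows that their mean overlap shrinks roughly as one over the square root of the dimension, so that randomly chosen wavefunctions in a high-dimensional Hilbert space are very nearly orthogonal:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_state(dim):
    """Draw a uniformly distributed unit vector in C^dim."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

# Mean magnitude of the inner product <a|b> over 200 random pairs,
# for spaces of increasing dimension.
mean_overlap = {}
for dim in (2, 100, 10_000):
    overlaps = [abs(np.vdot(random_state(dim), random_state(dim)))
                for _ in range(200)]
    mean_overlap[dim] = float(np.mean(overlaps))
    print(f"dim={dim:>6}: mean |<a|b>| = {mean_overlap[dim]:.4f}")
```

A genuine Hilbert space for a macroscopic environment has vastly more dimensions than ten thousand, so the near-orthogonality is correspondingly more extreme; the toy calculation only exhibits the trend.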
And once the off-diagonal terms of the reduced density matrix disappear, of course, we can treat different eigenstates of what Omnès dubs the ‘collective observables’ (those of the collective quasi-classical object) with ordinary probability calculus.
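To illustrate why vanishing off-diagonal terms license ordinary probability calculus, here is a toy construction of my own (not drawn from the text): a two-state system whose branches have become correlated with random states of an environment. After tracing out the environment, the off-diagonal terms of the reduced density matrix are proportional to the overlap of the two environment states, which is tiny for a large environment:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_env(dim):
    """A random environment state: a uniformly distributed unit vector in C^dim."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

# Equal superposition a|0>|E0> + b|1>|E1> of two 'pointer' branches.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)

offdiag_mag = {}
for dim in (2, 100, 10_000):
    e0, e1 = random_env(dim), random_env(dim)
    # Reduced density matrix of the system after tracing out the environment:
    # rho = |a|^2 |0><0| + |b|^2 |1><1| + a b* <E1|E0> |0><1| + h.c.
    off_diag = a * np.conj(b) * np.vdot(e1, e0)
    rho = np.array([[abs(a) ** 2, off_diag],
                    [np.conj(off_diag), abs(b) ** 2]])
    offdiag_mag[dim] = abs(off_diag)
    print(f"env dim={dim:>6}: |off-diagonal| = {offdiag_mag[dim]:.4f}, "
          f"diagonal = {rho[0, 0].real:.2f}, {rho[1, 1].real:.2f}")
```

With the off-diagonal terms negligible, interference between the branches is unobservable, and the diagonal entries can be read as ordinary classical probabilities for the two outcomes.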
The important point for the measurement problem is that it is decoherence in the measuring apparatus which transfers the quantum property of a microscopic system into something real, distinguishable, and observationally meaningful in the macroscopic world. After impressive proofs of a string of theorems establishing the equivalence of measurement data and microscopic properties, the equivalence of the respective probabilities, and the outcome of repeated measurements of the same system with respect to the same observable, Omnès (1994, p. 338) derives the general form for state vector reduction. Ironically, in light of the comment sometimes made that decoherence offers merely a “calculational tool” (Kiefer 1991, p. 379), it is this rule for state vector reduction which emphatically does not describe a real physical process: it is merely a computational convenience for predicting the outcomes of measurements, and as an actual physical process it may be dispensed with altogether. Instead, the wavefunctions of measuring apparatuses and the like could, in principle, be followed in minute detail, nonetheless turning up, on account of decoherence in the measuring apparatus, the very same results. With its stipulation that macroscopic objects behave classically (a stipulation, incidentally, which is explicitly inconsistent with the theory’s universal applicability), the Copenhagen interpretation guarantees the same calculational result, but on the present view both the quasi-classical macroscopic behaviour and the mathematical rule of state vector reduction can instead be derived.
Perhaps the most significant point about all this is that the theory of interactive decoherence grows naturally out of existing quantum theory. The mathematics of decoherence is incontrovertible: decoherence is not something someone has merely proposed might happen; it is a necessary consequence of the existing mathematical framework.
Whether decoherence answers all the philosophical questions we’d like answered, and in particular whether it overcomes John Bell’s (1990) criticism of interpretations of quantum mechanics which start and end by pointing out that quantum theory gives the right answers, interpretations he derides as ‘FAPP’ (short for ‘For All Practical Purposes’), is addressed carefully in my book Mind Out of Matter.
This article was originally published by Dr Greg Mulhauser.