Standard Misunderstandings of Quantum Mechanics

(See also Interpretations of Quantum Mechanics and Quantum Mind Theories)
  1. QM1: Waves and Particles.
    1. QM1.1: Particles and Waves.
    2. QM1.2: Particles and Particles.
    3. QM1.3 How wavy are waves?
  2. QM2: Indeterminism
    1. QM2.1 Defining Determinism
    2. QM2.2 Necessary and sufficient Causation
    3. QM2.3 Probabilistic Causation
    4. QM2.4: Microscopic and Macroscopic Randomness.
  3. QM3 Uncertainty
    1. QM3.1: Uncertainty and Indeterminism.
    2. QM3.2: Uncertainty and Epistemic Limitation.
    3. QM3.3: Uncertainty and Disturbance.
    4. QM3.4: Uncertainty and Complementarity.
    5. QM3.5: Waves/Particle duality and complementarity: momentum and position measurement in the same experiment
  4. QM4: Quantum Mechanics and Locality.
    1. QM4.1 What Locality Is
    2. QM4.2 The Relationship Between Locality and Determinism
    3. QM4.3 Tests of Locality and Determinism I: The EPR Paper
    4. QM4.4 Tests of Locality and Determinism II: Bell's Theorem
    5. QM4.5 Tests of Locality and Determinism III: Aspect's Experiment
    6. QM4.6 Tests of Locality and Determinism IV: Reactions to the Aspect Experiment
    7. QM4.7 Local Realism
  5. QM5: Quantum Mechanics and Time.
    1. QM5.1: Causal Reversal.
    2. QM5.2: Recovering Time from a Timeless Multiverse.
  6. QM6: Measurement and Reality.
    1. QM6.1: Consciousness and Collapse.
    2. QM6.2: What is real in Quantum Mechanics?
    3. QM6.3: The reality of collapse
    4. QM6.4: The reality of waves
    5. QM6.5: The reality of particles
  7. QM7: Mathematical Aspects of QM
    1. QM7.1: The digital universe: is the cosmos made of integers?
    2. QM7.2: Quantum and classical physics
  8. QM8. Myths about the Copenhagen interpretation

QM1 Waves and Particles:

QM1.1 Particles and Waves

There is a problem of waves and particles in classical physics because there is empirical evidence of both wave-like and particle-like behaviour, but the maths of classical physics cannot embrace both behaviours.

Quantum physics solves this problem -- note that it does not create a W/P problem, it solves one -- with a mathematical description (the wave function/state vector) which can embrace both behaviours. The WF/SV is not a toggle switch that goes from 100% wave to 100% particle; it is quite capable of describing in-between states.

"The miracle is the fact that these seemingly gross absurdities of experimental fact -- that waves are particles and particles -- can be accomodated within a beautiful mathematical formalism."
Roger Penrose, the Road to Reality

Particles-behaving-as-waves are just as readily observable as particles-behaving-as-particles. A radio antenna observes photons as waves -- it extracts frequency/momentum information, but gives you no positional information.

back to top

QM1.2 Particles and Particles.

Most people have heard that particles are things like protons, electrons and photons. Most people have also heard that a proton, electron or (especially) photon can be both a particle and a wave. How can a particle not be a particle? The confusion is verbal. A particle-as-opposed-to-a-wave is characterised by being well-localised in space, by having a well-defined position. A particle-as-a-quantum (sometimes called a wavicle) is an indivisible unit -- you can't have half a photon, even if the photon is spread out over a large volume of space.

QM1.3 How wavy are waves?

Earlier we gave the impression that any quantum entity spread over a relatively wide area is a wave. This isn't quite right, or at least some people dispute it. Strictly speaking, a wave must have a repeating structure. Not all viable wave functions (perhaps we should say state vectors) actually do. From this point of view, the wave and the particle are both special cases.

(In fact, it can be further argued that a 100% well-defined state is never achieved, since Dirac delta functions are not normalisable (Penrose).)

"Now we know how the electrons and light behave. But what can I call it? If I say they behave like particles I give the wrong impression; also if I say they behave like waves. They behave in their own inimitable way, which technically could be called a quantum-mechanical way. They behave in a way that is like nothing that you have ever seen before. Your experience with things that you have seen before is incomplete." (R.P. Feynman, The Character of Physical Law, MIT Press, 1967)

back to top

QM2. Indeterminism.

QM2.1 Defining Determinism

The classic definition of causal determinism is Pierre-Simon de Laplace's:-

"An intellect which at any given moment knew all the forces that animate Nature and the mutual positions of the beings that comprise it, if this intellect were vast enough to submit its data to analysis, could condense into a single formula the movement of the greatest bodies of the universe and that of the lightest atom: for such an intellect nothing could be uncertain; and the future just like the past would be present before our eyes".

Determinism is often confused with predictability. But whether something is determined or not is an objective fact, and not dependent on the predictive abilities of any particular observer. That is why Laplace's definition invokes a vast Intellect: it is a way of getting the limitations of observers out of the way in order to focus on the events themselves. Determinism implies that everything is predictable in principle, but it does not imply we can predict everything, since our limitations might get in the way. Given our inability to predict everything, underlying strict determinism might still be true, but then so might a form of indeterminism.

Causal determinism means that every event is strictly necessitated and inevitable. There are thus two positions opposed to it: the position that nothing is caused in any way, and the position that some things are only partially caused. The position that some things are only partly caused is the kind of indeterminism indicated by QM. QM is not a free-for-all. Some things are impossible under certain circumstances. Some are very likely. This uneven distribution of probabilities makes quantum mechanical laws testable despite their lack of determinism.

QM2.2 Necessary and Sufficient Causation

Determinists often claim that 'everything has a cause' is both a self-evident principle and one with significant philosophical import. However, the truth of the latter depends, as philosophical questions tend to, on what one means by 'cause'.

Sufficient cause: If C, then E. C's cannot occur without E's following on. C's are sufficient to cause E's. But something else, C* could also cause E. "There is more than one way to skin a cat".

Necessary Cause: If E, then C. If E has occurred, C must have occurred. C is necessary for E. E cannot occur without C. But C's occurrence alone may not be sufficient for E. In order to bake bread, an oven is necessary, but yeast is also necessary.

Quantum events always have necessary causes. The apparatus always has to be set up in a certain way in order to obtain results, even if they are only probabilistic results.

QM2.3 Probabilistic Causation

Let's look a bit more closely at this idea. The idea of probabilistic causation is that a cause changes the likelihood of its effect or effects. Strict deterministic causation can be seen as a special case, where there is one effect that occurs with 100% probability. (Nothing-is-caused indeterminism is an even more generous case, where no event has a probability of zero. But under probabilistic causation, some events are still impossible.) The idea that probabilistic causation is fundamental to nature is somewhat new, dating from the advent of quantum mechanics. The basic notion of probabilistic causation is not new; it appears, for instance, in phrases like 'smoking causes cancer', which means 'smoking makes cancer more likely', not '100% of everybody who smokes will get cancer'. The innovation is the idea that there is a lack of determinism, not just a lack of predictability: that even if you knew every detail of what is going on, you would still not be able to predict outcomes exactly.
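
The distinction can be made concrete with a toy simulation (a minimal Python sketch; the numbers are hypothetical, chosen only for illustration): a probabilistic cause C raises the frequency of its effect E without necessitating it.

import random

random.seed(0)

# Hypothetical probabilities: C raises the chance of E without necessitating it.
P_C = 0.3             # probability that the cause occurs
P_E_given_C = 0.6     # P(E | C)
P_E_given_notC = 0.1  # P(E | not C)

trials = 100_000
e_with_c = e_without_c = n_c = 0
for _ in range(trials):
    c = random.random() < P_C
    e = random.random() < (P_E_given_C if c else P_E_given_notC)
    if c:
        n_c += 1
        e_with_c += e
    else:
        e_without_c += e

print("P(E|C)  ~", e_with_c / n_c)
print("P(E|~C) ~", e_without_c / (trials - n_c))
# C is a probabilistic cause: it raises the likelihood of E (0.6 vs 0.1),
# but E neither always follows C nor requires C.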

Strict determinism is not a pre-requisite for science. Probabilistic laws can still be confirmed and falsified, and many areas of science have always operated statistically.

Adherents of the strict version of causality, who believe that for a cause to be a cause it must necessitate its effects, often say that in the case of probabilistic causality it is only a lack of fine-grained information about the details of a physical situation that causes the appearance of merely probabilistic causation. This is not a claim about what probabilistic causation means, since probabilistic causation is equally well understood by people who don't believe in hidden determining factors. It is not an empirical fact either, since, by definition, hidden determining factors are not apparent. Nor can it be claimed as something that can be argued for logically, since arguments for strict determinism need to refute non-strict, probabilistic causation, and cannot do that without appealing, in a vicious circle, to the very assumption of underlying determinism in question: if there are hidden determining factors, they exist for a reason, and the reason is the truth of strict determinism.

back to top

QM2.4 Microscopic and Macroscopic Randomness

A common reaction to QM is that it doesn't matter, since quantum randomness will never manifest itself at the macroscopic level -- that is, in the world of sticks and stones we can see with the naked eye. An appeal is usually made to the "law of large numbers", according to which random fluctuations at the atomic (or lower) level will cancel each other out in a macroscopic object, so that what is seen is an averaged-out behaviour that is fairly predictable.
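
A quick simulation (a Python sketch of my own, assuming idealised 50/50 "quantum" events) shows the averaging at work: the relative fluctuation shrinks roughly as 1/sqrt(N).

import random

random.seed(1)

# N independent 50/50 "quantum" events (e.g. spins measured up or down).
for n in (100, 10_000, 1_000_000):
    ups = sum(random.random() < 0.5 for _ in range(n))
    mean = ups / n
    print(f"N={n:>9}: mean={mean:.4f}, deviation={abs(mean - 0.5):.5f}")
# The absolute number of "excess" events grows with N, but the *relative*
# fluctuation shrinks roughly as 1/sqrt(N) -- so the averaged-out behaviour
# of a macroscopic object looks deterministic.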

Something like this must be happening in some cases, assuming QM is a correct description of the micro-world, or there would not even be an appearance of a deterministic macro-world. Since deterministic classical physics is partially correct, there must be a mechanism that makes the QM micro-world at least approximate to the classical description.

However, if it were the case that all macroscopic objects behaved in a 100% deterministic fashion, there would be no evidence for QM in the first place -- since all scientific apparatus is in the macro-world! A Geiger counter is able to amplify the impact of a single particle into an audible click. Richard Feynman suggested that if that wasn't macroscopic enough, you could always amplify the signal further and use it to set off a stick of dynamite! It could be objected that these are artificial situations. This is rather desperate, however, because there is a well-known natural mechanism that could do the same job: classical chaos.

A classically chaotic system is by definition one that is critically sensitive to its initial conditions. "Critically" sensitive means that any variation in initial conditions, no matter how slight, can bring about a change in the macroscopic behaviour of the system, no matter how large. Since there is no lower limit to critical sensitivity, it must extend all the way "down" to the microscopic world of quantum physics. Thus, hurricanes need not be started by butterfly wings; they can be started by electrons!
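
Critical sensitivity is easy to demonstrate in miniature (a Python sketch; the logistic map is my choice of standard chaotic example, not the article's): two trajectories differing by one part in 10^12 diverge completely within a few dozen steps.

# Logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
r = 4.0
x1, x2 = 0.2, 0.2 + 1e-12  # initial conditions differing by one part in 10^12

for step in range(1, 61):
    x1 = r * x1 * (1 - x1)
    x2 = r * x2 * (1 - x2)
    if step % 10 == 0:
        print(f"step {step:2d}: |x1 - x2| = {abs(x1 - x2):.3e}")
# The separation grows roughly exponentially until it is of order 1:
# an arbitrarily small difference "below" any given scale ends up
# changing the large-scale behaviour.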

The term "classical" misleads some people. Chaos can be defined within the framework of classical physics, which is strictly deterministic. This is sometimes taken to mean that any chaotic system encountered in nature (such as a weather system) is classical and deterministic. However, when we talk about ordinary, non-chaotic systems being classical, we mean they are *approximately* classical. Classical physics is not entirely wrong; it worked for hundreds of years, after all. But it is not entirely right either. "Classical" systems are quantum systems that approximate classical behaviour.

Thus any chaotic system that you can actually encounter, such as a weather system, is only approximately classical. It has no underlying determinism. At the most fundamental level it is a quantum system -- because everything is.

So we can have classical systems that behave predictably (ordinary Newtonian physics), quantum systems that behave predictably on the macroscopic level (through the law of large numbers), classical systems that behave unpredictably (through classical chaos) and quantum systems that behave unpredictably on the macroscopic as well as the microscopic level (chaos and other "quantum amplifiers").

In fact, this is not just theoretical. Conventional big-bang theories generally require an input of quantum indeterminism to provide the large-scale structure of the universe. A singularity exploding according to classical laws would expand evenly in every direction, leading to a boring universe consisting of an evenly dispersed cloud of gas. So when you look at the night sky, you are seeing evidence for macroscopic randomness!

One last word: the Heisenberg uncertainty principle does involve a constant, h-bar, and it is very small. But it is not an upper limit that prevents uncertainty from leaking into the macroscopic world. In fact, the mathematical form of the uncertainty principle:

delta_x . delta_p >= h_bar / 2

is an inequality. It sets a lower limit on the amount of uncertainty but no upper limit.

back to top

QM3 Uncertainty

QM3.1 Uncertainty and Indeterminism

A perennial confusion about QM is the failure to realise that there are actually two sources of indeterminacy: the Heisenberg Uncertainty Principle and the collapse (or reduction) of the state vector (or wave function). The H.U.P. prevents a particle simultaneously having a well-defined position and momentum. It is a fuzziness, a limitation on the amount of information available about a physical situation. Indeterminism, on the other hand, means that the actual outcome of a situation cannot be known in advance, even in principle. The mathematical apparatus of QM assigns probabilities to a series of outcomes (for instance, an atom decays after 1 second, or 2 seconds, or 3 seconds). If a set of experiments is performed (either a series of single-particle experiments over time, or an experiment on an "ensemble" of particles) the frequencies of outcomes can be correlated against the predicted probabilities. Thus quantum theory can be checked, although it does not make precise predictions.
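
A sketch of how such checking works (Python; the mean lifetime is an arbitrary assumption of mine): simulate an ensemble of decay times, then compare the observed frequencies with the predicted exponential distribution.

import math
import random

random.seed(2)

# QM predicts P(decay before t) = 1 - exp(-t/tau) for an unstable nucleus.
tau = 2.0  # assumed mean lifetime, in seconds
n = 100_000
decays = [random.expovariate(1 / tau) for _ in range(n)]

for t in (1, 2, 3):
    observed = sum(d < t for d in decays) / n
    predicted = 1 - math.exp(-t / tau)
    print(f"t={t}s: observed {observed:.4f}, predicted {predicted:.4f}")
# No single decay time is predicted, but the frequencies over the ensemble
# match the predicted probabilities -- the theory is testable.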

back to top

QM3.2 Fundamental uncertainty and epistemic limitation

The Heisenberg Uncertainty Relation is often described by non-specialists as an epistemic limitation, an inability to know what the position and momentum of a particle both are -- with the implication that they really do have simultaneous values, knowable from some God's-eye point of view. However, the justification of the Uncertainty Principle is a picture of what sort of thing a (wave-)particle is -- and what sort of thing momentum is. This is easiest to see in terms of the wave-packet approach. The momentum of a wave-particle is "really" the (spatial) frequency of the wave-packet. The position is the location of maximum amplitude of the wave-packet. A pure sine wave has the best-defined frequency, but the worst-defined position, since the amplitude is the same everywhere. A delta function, or "spike", has the best-defined position, but its frequency is indeterminable. (In terms of Fourier analysis, you have to superimpose sine waves of every frequency to get a "spike" -- thus the spike occupies all of "frequency" space.)

Of course, this all assumes the "picture" provided by standard QM is correct. It might not be, but within the theory, the Uncertainty Principle arises from the ontological nature of a "wavicle". It is not an epistemic limitation, a "veil", on what can be known about a wavicle. A wavicle that is spread out over space just does not have a well-defined position; a spike does not have a well-defined frequency.
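
The Fourier picture can be illustrated numerically (a minimal Python/numpy sketch of my own, in units where h-bar = 1): the narrower a Gaussian packet is in position space, the broader its Fourier transform, with the product of the two spreads pinned at about 0.5.

import numpy as np

def spread(grid, amp, w):
    """Standard deviation of |amp|^2 on the given grid (cell width w)."""
    prob = np.abs(amp)**2 * w
    mean = np.sum(grid * prob)
    return np.sqrt(np.sum((grid - mean)**2 * prob))

x = np.linspace(-50, 50, 8192)
dx = x[1] - x[0]
p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))  # h-bar = 1
dp = p[1] - p[0]

for sigma in (0.5, 1.0, 2.0):
    psi = np.exp(-x**2 / (4 * sigma**2))         # Gaussian packet, width sigma
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)  # normalise
    phi = np.fft.fftshift(np.fft.fft(psi)) * dx  # momentum-space amplitude
    phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)
    print(f"sigma={sigma}: dx * dp = {spread(x, psi, dx) * spread(p, phi, dp):.3f}")
# Each product is ~0.5 (= h-bar/2): the sharper the position, the broader
# the momentum. A spike in x occupies all of p-space, and vice versa.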

Thus any claim to the effect that the Uncertainty Principle is an epistemic limitation is a claim to the effect that QM is basically wrong or incomplete in some way. Explained further here.

back to top

QM3.3 Uncertainty and measurement disturbance

"In quantum mechanics, however, indeterminacy is of a much more fundamental nature, having nothing to do with errors or disturbance."--wikipedia

"The uncertainty principle in quantum mechanics is sometimes erroneously explained by claiming that the measurement of position necessarily disturbs a particle's momentum. Heisenberg himself may have initially offered explanations which suggested this view." -- wikipedia

Uncertainty emerged from Heisenberg's highly abstract matrix formulation. In order to give a physical justification for it, he came up with the disturbance idea: the idea that attempting to measure the position or velocity of a particle using another particle will inevitably disturb the first particle. The idea has lived on in the popular literature ever since. However, it suffers from the problem of implying that particles really do have well-defined positions and momenta at all times, if only you could get at them without disturbing the system. The more modern approach is that the information is just not there in the first place. For instance, the uncertainty principle is implicated in processes like the decay of unstable nuclei, where the "disturbance" element is absent -- the time a nucleus takes to decay is related to its energy levels by the time-energy uncertainty principle.

Nonetheless disturbance lives on in the minds of the reading public.

http://members.tripod.com/~Glove_r/Folse4.html

Though it forms no part of complementarity, the disturbance principle was frequently defended as part of the Copenhagen Interpretation and often identified with Bohr's view in the years following Heisenberg's discovery. From the perspective of Heisenberg, it appeared that the basis of the disturbance principle lay in the fact that the instruments doing the observing "disturbed" the observed system such that its state after observation is no longer what was determined in the measurement. This interpretation compares observation of atomic systems to measuring, for example, the inner workings of a wrist watch using a yardstick.

However, the disturbance interpretation plays havoc with the facts behind the genesis of the uncertainty principle and its status within the mathematical formalism of quantum mechanics. The principle is a straightforward deductive consequence of the quantum theoretical formalism, which provides a highly confirmed means of predicting the outcome of interaction between radiation and matter. There is no mention of disturbance in the derivation of the principle itself, nor of how to go about determining the relevant parameters.

The design of experiments is relevant only to interpreting the physical significance of the principle. The assumption that the atomic system really exists in a classical mechanical state raises the question of whether an experiment could be designed which would yield greater knowledge about the state of the atomic system than the uncertainty principle allows. If this could be done, the theory would be properly judged incomplete.

The disturbance interpretation's mistake becomes apparent when we realise that, according to it, we could only approach the classical ideal of strict determinism if our measuring instruments were the size of atoms. However, it is only the immense difference between the dimensions of ordinary human experience and those involved in atomic processes that made strict determinism seem a nearly obtainable goal. If our instruments were the same size as atoms, then the role of the quantum in an interaction would be ever increasing rather than decreasing, as the disturbance interpretation suggests.

In classical mechanics, the observation also "disturbs" the observed, but the disturbance is either negligible or "controllable", and so can be accounted for in defining the state of an isolated system after the observation interaction. In quantum theory, ordinarily the effect of the interaction can be considered neither negligible nor "controllable". Since the disturbance interpretation makes it appear that the uncertainty principle is an empirical generalisation, it is unable to explain why this alleged disturbance cannot be determined within the quantum framework, which would allow a return to the classical deterministic formalism.

QM3.4 Uncertainty and Complementarity

Complementarity can be taken as the "black and white" claim that an observation can only be of a pure wave (complete accuracy about momentum) or a pure particle (complete accuracy about position). Alternatively, it could be the shades-of-grey claim that more accuracy about position means less accuracy about momentum. This kind of complementarity is already implicit in the uncertainty principle, which allows both uncertainties to be above zero and on the same scale.

There are perhaps as many as four interpretations of the principle of complementarity:

  1. evolution -- black and white
  2. evolution -- shades of grey (the complementarity implied by the uncertainty principle)
  3. collapse -- black and white (a further complementarity occurring on collapse)
  4. collapse -- shades of grey

Is Afshar's "refutation" of the Copenhagen Interpretation concerned with 3 or 4? Refuting black-and-white complementarity does not refute every assumption of the C.I., nor does it dispose of wave-particle duality, since the shades-of-grey version is already implicit in the UP, which he does not refute.

back to top

QM3.5: Waves/Particle duality and complementarity: momentum and position measurement in the same experiment

If complementarity is the "black and white" claim that an observation can only be of a pure wave (complete accuracy about momentum) or a pure particle (complete accuracy about position) but not anything intermediate, then it has been experimentally disproven (that is, there is no need for anything beyond the shades-of-grey duality which is already supplied by uncertainty, no further axiom being needed).

"At the end of the 1980s, three Indian physicists came up with a new suggestion for an experiment which could show single photons behaving both as particles and as waves at the same time. Dipankar Home, Partha Ghose and Girish Agarwal...".
John Gribbin, Q for Quantum

Neutron interferometry has been used to make simultaneous approximate position and momentum measurements.

back to top

QM4 Quantum Mechanics and Locality

QM4.1 What Locality Is

Locality means that events are influenced only by events nearby in space, or at least that the amount of causal influence declines with distance. It is very much a counterpart or corollary of determinism -- at least in scientific circles. As far as philosophers are concerned, determinism stands supreme.

QM4.2 The Relationship Between Locality and Determinism

Neither principle is an a priori truth. Both are, up to a point, necessary presuppositions. If you don't assume that the way you have prepared your apparatus has any effect on the results obtained, there is no way of performing an experiment. (That's determinism.) If you can't exclude the possibility that your experiment is being affected by all the things outside your laboratory that you have no control over, you again cannot perform an experiment. (That's locality.) Experiments require repetition, and repetition means repetition of a local state. The state of the whole universe (probably) never repeats.

QM4.3 Tests of Locality and Determinism I: The EPR Paper

The story of the testing of locality and determinism in quantum mechanics starts with a paper co-authored by Albert Einstein, Boris Podolsky and Nathan Rosen in the 1930s.

This was intended to be a dismissal, a reductio ad absurdum, of quantum mechanics. The EPR paper, as it is known, shows that particles that were once in an "entangled" state continue to be entangled after they are separated, no matter how far apart they are. (An "entangled state" means that only certain combinations of states of the two particles are allowed: if particle 1 is in state A, particle 2 must be in state B; if particle 1 is in state B, then particle 2 must be in state A.) In conjunction with the quantum mechanical principle that the state of a particle is not decided until an observation is made, it suggests that when one member of an entangled pair is observed, the other must adapt its state to the complementary one -- instantaneously, across any distance.

QM4.4 Tests of Locality and Determinism II: Bell's Theorem

The baton was taken up in the 1960s by the Northern Irish physicist John S. Bell -- like Einstein, an opponent of quantum indeterminism -- who found a way of using entangled pairs to test for the existence of hidden determinism. The usual objection to quantum indeterminism is that a system is "really" determined, and we just don't have enough information to predict it. In other words, there is extra information in the physical system that isn't in the mathematical description. This extra information is traditionally called "hidden variables".

"In quantum mechanics, Bell's Theorem states that a Bell inequality must be obeyed under any local hidden variable theory but can in certain circumstances be violated under quantum mechanics (QM). The term 'Bell inequality' can mean any one of a number of inequalities -- in practice, in real experiments, the CHSH or CH74 inequality, not the original one derived by John Bell. It places restrictions on the statistical results of experiments on pairs of particles that have taken part in an interaction and then separated. A Bell test experiment is one designed to test whether or not the real world obeys a Bell inequality."
wikipedia
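
The numerical content of this can be sketched in a few lines (Python; my own illustration, assuming the standard optimal CHSH angles and the textbook singlet correlation E(a,b) = -cos(a-b), neither of which the article derives): local hidden variable theories bound the CHSH combination S by 2, while QM reaches 2*sqrt(2).

import math

def E(a, b):
    # Quantum correlation for spin measurements at analyser angles a, b
    # on a singlet pair: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# Standard CHSH settings (radians): a = 0, a' = pi/2, b = pi/4, b' = 3pi/4.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)      # 2.828... = 2*sqrt(2)
print(S > 2)  # True: violates the local-hidden-variable bound |S| <= 2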

QM4.5 Tests of Locality and Determinism III: Aspect's Experiment

The final chapter is the performance of an experimental test of Bell's inequalities, by the Frenchman Alain Aspect and his co-workers. The test (and subsequent improvements) showed quantum mechanics to be correct, and that a classical theory -- that is, a theory that is both local and deterministic -- is untenable.

Note that this result is contrary to the expectations of both Einstein and Bell. In science, it is nature that ultimately decides, not famous scientists.

QM4.6 Tests of Locality and Determinism IV: Reactions to the Aspect Experiment

The Aspect experiment does not flatly disprove hidden variables, but rather shows that they cannot operate locally. Informally, locality means that causes have to be in the vicinity of their effects. Formally, it means that information does not travel faster than light. The non-existence of local hidden variables means that the only deterministic theory of QM you could have is a holistic one. The popularity of holistic theories is therefore likely to be a reflection of the popularity of determinism (rather than an enthusiasm for holism per se). The only fully worked-out theory of holistic-deterministic QM is Bohm's, which seems to have some problems:

http://www.arxiv.org/PS_cache/quant-ph/pdf/0206/0206196.pdf

Even going down the other route allowed by EPR/Bell/Aspect -- standard QM indeterminism -- there are still non-local correlations between events, but they do not violate the technical definition of locality, since no information can be sent, because of the very indeterminism itself. Although two observations are correlated, you cannot 'force' the outcome of either of them, so you cannot signal. The much-derided QM indeterminism 'censors' the troublesome non-locality, as it were.
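
A sketch of this 'censorship' (Python; the singlet-pair joint probability formula is a standard result, assumed here rather than derived): however the distant setting b is chosen, the local marginal statistics are unchanged, so the correlations cannot carry a signal.

import math

def joint(outcome_a, outcome_b, a, b):
    # Singlet-pair probability for outcomes +/-1 at analyser angles a, b:
    # P = (1 - oa * ob * cos(a - b)) / 4.
    return (1 - outcome_a * outcome_b * math.cos(a - b)) / 4

a = 0.0
for b in (0.0, math.pi / 4, math.pi / 2):
    # Marginal probability of "+1" on the near side, summing over far outcomes:
    p_plus = joint(+1, +1, a, b) + joint(+1, -1, a, b)
    print(f"b={b:.2f}: P(+1 locally) = {p_plus}")
# Prints 0.5 every time (up to float rounding): the correlations are
# nonlocal, but the local statistics carry no information about the
# distant setting b.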

QM4.7 Local Realism

There is a tradition in physics, going back to Einstein, of using the word "realism" to refer to the claim that every observable quantity has, simultaneously with every other one, a pre-existing value, which is simply "read out" by the measurement process.

This principle is sometimes expressed as "every object has pre-existing properties", but this is misleading. The tendency or disposition to come up with "answers" to an observational "query", even probabilistically, even "on the fly", is still a property of sorts, a propensity or dispositional property. Different systems will come up with different probability-distributions, and that characterises them as different systems: whatever characterises a system is a property of the system.

This Einsteinian usage of "realism" is much more specific than the usual (and philosophical) one, which simply requires that things don't exist only "in the mind". It is perfectly possible that quantum entities have a mind-independent existence, with dispositional properties, as I have described. More in Travis Norsen's paper "Against 'Realism'".

QM5 Quantum Mechanics and Time

QM5.1: Causal Reversal.

Gerard 't Hooft's "deterministic" picture, like TI, requires reversed causality. http://www.phys.uu.nl/~thooft/quantloss/sld020.htm

J. Baez on time operators: http://math.ucr.edu/home/baez/uncertainty.html

back to top

QM5.2: Recovering Time from a Timeless Multiverse.

A standard quantum multiverse is based on the evolution of the Universal Wave Function according to Schrodinger's Equation, or something similar. Whilst more possibilities are allowed than in single-universe theories, there are still some impossibilities (0-measure worlds, in many-worlds jargon).

In Julian Barbour's version of the multiverse -- an omniverse or Platonia -- no possibility is excluded, and there is no evolution from one state to another. Each element of Platonia is a three-dimensional configuration of matter, a "Now" in Barbour's terminology. A Now is defined entirely by its contents, so that if the same configuration of matter is repeated, the same Now has been visited twice (although the notion of going from one Now to another in any way is dubious).

Whatever temporal and causal order exists in Platonia has to be derived from the inherent structure of Nows. Barbour thinks that what is important are time capsules or "records" of other Nows existing within a given Now. (However, they are not records in the standard sense, since they were not caused by anything.)

Given this picture, we could ask whether we can be sure that we have the right memories or "time capsules". The answer is that we can hardly be wrong! There is no ultimate metaphysical fact about histories in Platonia; an external observer can pick out histories by applying criteria of ordering and sorting Nows. (A sort of principle of least action, or least difference between Nows, seems to be important in arriving at the conventional laws of physics.) So having physically realistic memories means having a set of time capsules which match our chosen criteria. The agreement between 'subjective' memories and 'objective' physics is achieved by a distinctly Kantian mechanism: they will agree because they are both ultimately subjective!

Another quirk of this system is that memories are not the only kind of internal representation people have; there are also dreams and hallucinations. Standardly, we would distinguish these by their causal history: memories are caused by veridical perceptions, dreams by cheese, and hallucinations by LSD. However, in Barbour's scheme they may well be peeps into distant corners of Platonia; if there is a possible configuration of matter corresponding to your dream, it exists. Presumably what makes memories memories is an internal criterion, a certain logical consistency within. This gets back to the previous point: memories are bound to agree with history, because the criterion that selects Nows as part of a sensible history is the criterion that selects mental representations as memories, not dreams or hallucinations.

back to top

QM6 Measurement and Reality.

The Copenhagen Interpretation says that you can be sure that what you have measured is real. It doesn't say that what is unmeasured is unreal. Officially, it says nothing either way about what is not a measurement-result. The inference that "what is unobserved is unreal" is often made, but it is unsupported by the Copenhagen Interpretation, which is itself only an interpretation.

QM6.1 Consciousness and Collapse

There is no variable in the mathematical formalism of QM that represents consciousness. There is no instrument in the physicist's laboratory that can measure it. Strictly speaking, it is not and cannot be part of QM or any other theory in physics as we understand it. It can feature in interpretations of theories, however. Since there are many interpretations of the theory of QM, "consciousness causes collapse" can only be an opinion.

Whether

1. collapse occurs
2. occurs because of something
3. occurs because of interaction with macroscopic systems
4. occurs because of observation as opposed to other interactions
5. occurs because of conscious observation as opposed to automated measurement by a machine

is all debatable (increasingly so as you go down the list).

There is also another, contrary way of thinking about these issues: not consciousness-causes-collapse but collapse-causes-consciousness. Proponents include Sir Roger Penrose and Henry Stapp. This theory has its critics (who often point out that the brain is too big, hot and wet to sustain complex quantum superpositions), but it is essentially a completely different theory to consciousness-causes-collapse.

A sturdy rebuttal of consciousness-causes-collapse can be found here

back to top

"Of course the introduction of the observer must not be misunderstood to imply that some kind of subjective features are to be brought into the description of nature. The observer has, rather, only the function of registering decisions, i.e., processes in space and time, and it does not matter whether the observer is an apparatus or a human being; but the registration, i.e., the transition from the "possible" to the "actual," is absolutely necessary here and cannot be omitted from the interpretation of quantum theory."
Heisenberg, Physics and Philosophy, p. 137

QM6.2 What is real in Quantum Mechanics?

"We may think we are making sense when we talk about what the world is doing whether we observe it or not. It seems perfectly reasonable, for example, to say there is a sound in the forest when a tree falls, whether or not there is anyone around to hear it. Quantum mechanics gives no support to this notion. The world on the atomic scale, at least, does not seem to be some particular way, whether physicists observe it or not. The atomic world appears to have particular qualities only as the result of measurements physicists make. Quantum mechanics is a way of talking about nature that allows physicists to predict how the world will respond to being measured. So long as we stick to this understanding, quantum mechanics raises no problems. If, on the other hand, we persist in demanding to know how the world is, independent of how it appears to be in experiments, we, in Feynman's words, "will get 'down the drain', into a blind alley from which no one has yet escaped." "

-- Bruce Gregory, "Inventing Reality", p 98

But we should not be tempted into thinking that there is no way the world is at all in the absence of measurement. Even if the only thing we can say about a system is that it has a disposition or propensity to "answer a question" in a particular way, the propensities of system A are not those of system B.

That one propensity can only be realised at the expense of another not being realisable at that time is not at all mysterious; a coin has the possibility to land heads up or tails up, but not both at once. That making a measurement alters a system is also unmysterious -- it is the classical alternative of ghostly observation that is indefensible (when taken literally).

The problem seems to be that, whereas in classical physics a system always possesses a set of fully-defined actual properties, with dispositions being merely hypothetical if-then statements about what would happen under non-actual circumstances, QM seems to require that dispositional properties go all the way down, and are just as real as actual properties.

This is parallel to the causality 'problem'. Classicists are happy to accept probabilities as abstract constructs relating to lack of information about a system. QM likewise suggests probabilities are real and intrinsic.

back to top

QM6.3 The reality of collapse

"Suppose that our source can be tuned so that it emits photons in either a left- or a right-hand polarised state. On a particular occasion, it emits a right-handed photon (and takes note of this fact). After the photon has encountered a beam splitter, the photon's state is now a linear combination [..]"

|psi+> = |tau+> + |rho->

"Let us place our detector in the transmitted beam. Then if .. the source registers that it has emitted the right-handed photon, but the detector fails to register, so that it has not received the photon, then it must be concluded the state has jumped (upon 'non-detection' by the source) to the reflected left-hand state |rho->. The point I am making here is that the full projection postulate is required to ascertain the nature of this resulting state."

Null Measurement (Road To Reality, R Penrose, p 548)
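
Penrose's null measurement can be sketched numerically (Python with numpy; a minimal two-state model of my own, with normalisation added, and basis labels following his |tau+> transmitted / |rho-> reflected states): projecting out the undetected branch and renormalising is exactly the projection postulate at work.

import numpy as np

# Two-component state in the (transmitted, reflected) basis.
tau = np.array([1.0, 0.0])  # |tau+>: photon in the transmitted beam
rho = np.array([0.0, 1.0])  # |rho->: photon in the reflected beam

psi = (tau + rho) / np.sqrt(2)  # state after the 50/50 beam splitter

# Detector in the transmitted beam: projector onto |tau+>.
P_detect = np.outer(tau, tau)
p_click = psi @ P_detect @ psi  # probability the detector fires: 0.5

# Null result: project onto the complement and renormalise.
P_null = np.eye(2) - P_detect
psi_after = P_null @ psi
psi_after /= np.linalg.norm(psi_after)

print(p_click)    # 0.5
print(psi_after)  # [0, 1]: the photon is now definitely in |rho->
# Even *not* detecting the photon has collapsed the state.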

The Quantum Zeno effect: repeatedly measuring an unstable particle extends its lifetime.
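
A toy calculation (Python; an idealised two-level model of my own) shows the effect: if the state would rotate by a total angle theta into the "decayed" state, N evenly spaced measurements each leave a survival probability of cos^2(theta/N), and the product approaches 1 as N grows.

import math

theta = math.pi / 2  # total rotation that would carry the state to "decayed"
for n in (1, 10, 100, 1000):
    survival = math.cos(theta / n) ** (2 * n)
    print(f"{n:4d} measurements: survival probability = {survival:.4f}")
# Survival rises from 0 toward 1 as n grows: the more often you "look",
# the more you pin the state in place.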

EPR-type experiments. Making one measurement has an instantaneous effect on entangled particles -- something must be connecting the particles.

"the 'collapse' or 'reduction' of the wave function. This was introduced by Heisenberg in his uncertainty paper [3] and later postulated by von Neumann as a dynamical process independent of the Schrodinger equation."
Kiefer, C., On the Interpretation of Quantum Theory - from Copenhagen to the Present Day

back to top

QM6.4: The reality of waves

"Whereas some physicists have indeed taken the view that all measurements are ultimately measurements of position, I would myself regard such a perspective as being much too narrow. Indeed, the way the quantum formalism is normally presented does not require all measurements to be only of position." (R. Penrose, "Road to Reality", p 517)

Waves are directly detectable -- for instance, an aerial detects a photon without strongly localising it (no particular atom absorbs the photon; it is absorbed by a sea of free electrons) while extracting its frequency information (hence "tuning in"). In quantum mechanics, the wave aspect corresponds to frequency/momentum and the particle aspect to position, so photons can be detected as waves.

Interference effects belong very much to the wave aspect of matter. Wave-particle duality applies not just to photons: electrons, neutrons and even fullerene (C60) molecules have shown interference effects -- and hence a wave nature -- in experiment.

The wave nature of electrons can be seen in this image

"Optical" effects using C60 are shown here

Interference with large molecules is described here TBD

back to top

QM6.5: The reality of particles

TBD

back to top

QM7 Mathematical Aspects of QM

QM7.1 The digital universe: is the cosmos made of integers?

QM does not in fact suggest this. It is not a theory that suggests everything comes in discrete quantities -- any more than relativity is a theory that suggests everything is relative. In fact, some observables in QM are quite specifically and formally continuous:-

"The corresponding eigenvalues x and p and eigenvectors |x> and |p> satisfy the equations X|x> = x|x>, P|p> = p|p>, which, in general, could constitute a continuous spectrum of eigenvalues and eigenvectors."
http://www.nyu.edu/classes/tuckerman/stat.mech/lectures/lecture_12/node4.html

There is a speculative theory to the effect that everything is quantised at the level of the Planck length, Planck time, and so on, but it is not quantum mechanics, and there is as yet no specific evidence for it.

"More importantly for the physical theory, from Archimedes, through Galileo and Newton, to Maxwell, Einstein, Schrodinger, Dirac and the rest, a crucial role for the real number system has been that it provides a necessary framework for the standard formulation of the calculus. All successful dynamical theories have required notions of the calculus for their formulations. Now the conventional approach to calculus requires the infinitesimal nature of the reals to be what it is. That is to say, on the small end of the scale, it is the entire range of real numbers that is being made use of. The ideas of calculus underlie other physical notions, such as velocity, momentum and energy. Consequently the real-number system enters our successful physical theories in a fundamental way for the description of all these quantities also."
(R. Penrose, Road to Reality, p 61)

"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities."
Richard Feynman, The Character of Physical Law, page 57

Einstein on continuous models

back to top

QM7.2 Quantum and Classical Physics

The mathematical formalism of quantum physics is largely derived from a version of classical physics called the canonical formulation, plus the introduction of non-commuting operators. An operator is a kind of meta-function that transforms one function into another. Non-commutation means that the order in which the operators are applied makes a difference. Where operators are interpreted as making a measurement on a system, non-commutation means that the operator "disturbs" the system so that it is no longer in the same state.
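
Non-commutation is easy to exhibit with concrete matrices (Python with numpy; the Pauli spin matrices are my choice of standard example, not the article's):

import numpy as np

# Pauli spin matrices: observables for spin-1/2 along x and y.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

print(sx @ sy)            # [[ 1j, 0], [0, -1j]]
print(sy @ sx)            # [[-1j, 0], [0,  1j]]
print(sx @ sy - sy @ sx)  # commutator [sx, sy]: nonzero (= 2i * sz)
# Because the order of application matters, measuring spin-x then spin-y
# leaves the system in a different state than the other way round.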

back to top

QM8. Myths about the Copenhagen interpretation

The Copenhagen Interpretation is stated as the standard in most texts, particularly the more introductory ones. The most persistent myths about it are:

1. It was unequivocally defined and agreed on by Bohr and Heisenberg
2. It is still standard
3. It has been completely abandoned.

Bohr was more irrealist/positivist than Heisenberg. Heisenberg became more realist after the development of wave mechanics by Schrodinger, which was more "intuitive" than his matrix mechanics.

back to top