From Certainty to Uncertainty, Pt. 1
Peat, F. David. From Certainty to Uncertainty: The Story of Science and Ideas in the Twentieth Century. Washington D.C.: Joseph Henry Press, 2002.
Chapter One: Quantum Uncertainty
In 1900, Lord Kelvin spoke of the triumphs of physics and how Newton’s theory of motion could be extended to embrace the phenomena of light and heat. His address went on to mention “two clouds” that obscured the “beauty and clearness” of the theory: the first involved the way light travels through space, the second was the problem of distributing energy equally among vibrating molecules. The solution Kelvin proposed, however, proved to be way off the mark. Ironically, what Kelvin had taken to be clouds on the horizon were in fact two bombshells about to create a massive explosion in twentieth century physics. Their names were relativity and quantum theory, and both theories had something to say about light.
Light, according to physicists like Kelvin, is a vibration, and like every other vibration it should be treated by Newton’s laws of motion. But a vibration, physicists argued, has to be vibrating in something. And so physicists proposed that space is not empty but filled with a curious jelly called “the luminiferous ether.” But this meant that the speed of light measured in laboratories on earth-the speed with which vibrations appear to travel through the ether-should depend on how fast and in what direction the earth is moving through the ether. Because the earth revolves around the sun this direction is always varying, and so the speed of light measured from a given direction should vary according to the time of year. Scientists therefore expected to detect a variation in the speed of light measured at various times of the year, but very accurate experiments showed that this was not the case. No matter how the earth moves with respect to the background of distant stars, the speed of light remains the same.
This mystery of the speed of light and the existence, or nonexistence, of the ether was only solved with Einstein’s special theory of relativity, which showed that the speed of light is a constant, independent of how fast you or the light source is traveling.
The other cloud on Kelvin’s horizon, the way in which energy is shared by vibrating molecules, was related to yet another difficult problem-the radiation emitted from a hot body. In this case, the solution demanded a revolution in thinking that was just as radical as relativity theory-the quantum theory.
Bohr and Einstein
Special relativity was conceived by a single mind-that of Albert Einstein. Quantum theory, however, was the product of a group of physicists who largely worked together and acknowledged the Danish physicist Niels Bohr as their philosophical leader. As it turns out, the tensions between certainty and uncertainty that form the core of this book are nowhere better illustrated than in the positions on quantum theory taken by these two great icons of twentieth century physics, Einstein and Bohr. By following their intellectual paths we are able to discover the essence of this great rupture between certainty and uncertainty.
When the two men debated together during the early decades of the twentieth century they did so with such passion for truth that Einstein said that he felt love for Bohr. However, as the two men aged, the differences between their respective positions became insurmountable to the point where they had little to say to each other. The American physicist David Bohm related the story of Bohr’s visit to Princeton after World War II. On that occasion, the physicist Eugene Wigner arranged a reception for Bohr that would also be attended by Einstein. During the reception, Einstein and his students stood at one end of the room and Bohr and his colleagues at the other.
How did this split come about? Why, with their shared passion for seeking truth, had the spirit of open communication broken down between the two men? The answer encapsulates much of the history of twentieth century physics and concerns the essential dislocation between certainty and uncertainty. The break between them involves one of the deepest principles of science and philosophy-the underlying nature of reality. To understand how this happened is to confront one of the great transformations in our understanding of the world, a leap far more revolutionary than anything Copernicus, Galileo, or Newton produced. To find out how this came about we must first take a tour through twentieth century physics.
Einstein’s name is popularly associated with the idea that “everything is relative.” This word “relative” has today become loaded with a vast number of different associations. Sociologists, for example, speak of “cultural relativism,” suggesting that what we take for “reality” is to a large extent a social construct and that other societies construct their realities in other ways. Thus, they argue, “Western science” can never be a totally objective account of the world for it is embedded within all manner of cultural assumptions. Some suggest that science is just one of the many equally valid stories a society tells itself to give authority to its structure; religion being another.
In this usage of the words “relative” and “relativism” we have come far from what Einstein originally intended. Einstein’s theory certainly tells us that the world appears different to observers moving at different speeds, or who are in different gravitational fields. For example, relative to one observer lengths will contract, clocks will run at different speeds, and circular objects will appear ellipsoidal. Yet this does not mean that the world itself is purely subjective. Laws of nature underlie relative appearances, and these laws are the same for all observers no matter how fast they are moving or where they are placed in the universe. Einstein firmly believed in a totally objective reality to the world and, as we shall see, it is at this point that Einstein parts company with Bohr.
Perhaps a note of clarification should be added here since that word “relativity” covers two theories. In 1905, Einstein (in what was to become known as the special theory of relativity) dealt with the issue of how phenomena appear different to observers moving at different speeds. He also showed that there is no absolute frame of reference in the universe against which all speeds can be measured. All one can talk about is the speed of one observer when measured relative to another. Hence the term “relativity.”
Three years later the mathematician Hermann Minkowski addressed the 80th Assembly of German Natural Scientists and Physicians at Cologne. His talk opened with the famous words: “Henceforth space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.” In other words, Einstein’s special theory of relativity implied that space and time were to be unified into a new four-dimensional background called space-time.
Einstein now began to ponder how the force of gravity would enter into his scheme. The result, published in 1916, was his general theory of relativity (his earlier theory now being a special case that applies in the absence of gravitational fields). The general theory showed how matter and energy act on the structure of space-time and cause it to curve. In turn, when a body enters a region of curved space-time its speed begins to change. Place an apple in a region of space-time and it accelerates, just like an apple that falls from a tree on earth. Seen from the perspective of general relativity the force of gravity acting on this apple is none other than the effect of a body moving through curved space-time. The curvature of this space-time is produced by the mass of the earth.
Now let us return to the issue of objectivity in a relative world. Imagine a group of scientists here on earth, another group of scientists in a laboratory that is moving close to the speed of light, and a third group located close to a black hole. Each group observes and measures different phenomena and different appearances, yet the underlying laws they deduce about the universe will be identical in each of the three cases. For Einstein, these laws are totally independent of the state of the observer.
This is the deeper meaning of Einstein’s great discovery. Behind all phenomena are laws of nature, and the form of these laws, their most elegant mathematical expression, is totally independent of any observer. Phenomena, on the other hand, are manifestations of these underlying laws but only under particular circumstances and contexts. Thus, while phenomena appear different for different observers, the theory of relativity allows scientists to translate, or transform, one phenomenon into another and thus to return to an objective account of the world. Hence, for Einstein the certainty of a single reality lies behind the multiplicity of appearance.
Relativity is a little like moving between different countries and changing money from dollars into pounds, francs, yen, or euros. Ignoring bank charges, the amount of money is exactly the same; only its physical appearance-the bank notes in dollars, pounds, yen, euros, and so on-changes. Similarly a statement made at the United Nations is simultaneously translated into many different languages. In each particular case the sound of the statement is quite different but the underlying meaning is the same. Observed phenomena could be equated to statements in different languages, but the underlying meaning that is the source of these various translations corresponds to the objective laws of nature.
This underlying reality is quite independent of any particular observer. Einstein felt that if the cosmos did not work in such a way it would simply not make any sense and he would give up doing physics. So, in spite of that word “relativity,” for Einstein there was a concrete certainty about the world, and this certainty lay in the mathematical laws of nature. It is on this most fundamental point that Bohr parted company with him.
If Einstein stood for an objective and independent reality what was Niels Bohr’s position? Bohr was an extremely subtle thinker and his writings on quantum theory are often misunderstood, even by professional physicists! To discover how his views on uncertainty and ambiguity evolved we must go back to 1900, to Kelvin’s problem of how energy is distributed amongst molecules and an even more troubling, related issue, that of blackbody radiation.
A flower, a dress, or a painting is colored because it absorbs light at certain frequencies while reflecting back other frequencies. A pure black surface, however, absorbs all light that falls on it. It has no preference for one color over another or for one frequency over another. Likewise, when that black surface is warmer than its surroundings it radiates its energy away and, being black, does so at every possible frequency without preferring one frequency (or color) over another.
When physicists in the late nineteenth century used their theories to calculate how much energy is being radiated, the amount they arrived at, absurdly, was infinite. Clearly this was a mistake, but no one could discover the flaw in the underlying theory.
Earlier that century the Scottish physicist James Clerk Maxwell had pictured light in the form of waves. Physicists knew how to make calculations for waves in the ocean, sound waves in a concert hall, and the waves formed when you flick a rope that is held fixed at the other end. Waves can be of any length, with an infinite range of gradations. In the case of sound, for example, the shorter the wavelength-the distance between one crest and the next-the higher the pitch, or frequency, of the sound because the shorter the distance between wave crests, the more crests pass a particular point, such as your ear, in a given length of time. The same is true of light: long wavelengths lie toward the red end of the spectrum, whereas blue light is produced by higher frequencies and shorter wavelengths.
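The inverse relation between wavelength and frequency described above can be put in one line: frequency equals the wave speed divided by the wavelength. A minimal numerical sketch (the 700 nm and 450 nm values are illustrative wavelengths for red and blue light, not figures from the text):

```python
# Frequency of a wave from its wavelength: nu = c / lambda.
C = 2.998e8  # speed of light in m/s

def frequency(wavelength_m):
    """Return the frequency (Hz) of light with the given wavelength (m)."""
    return C / wavelength_m

red = frequency(700e-9)   # ~700 nm, toward the red end of the spectrum
blue = frequency(450e-9)  # ~450 nm, toward the blue end

print(f"red:  {red:.3e} Hz")
print(f"blue: {blue:.3e} Hz")
# The shorter blue wavelength gives the higher frequency, as the text says:
# more crests pass a given point each second.
```

The same relation holds for sound, with the speed of sound in place of c.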
By analogy with sound and water waves, the waves of light radiated from a hot body were assumed to have every possible length and every possible frequency; in other words, light had an infinite number of gradations from one wavelength to the next. In this way an infinity crept into the calculation and emerged as an infinite amount of energy being radiated.
In 1900, Max Planck discovered the solution to this problem. He proposed that all possible frequencies and wavelengths are not permitted, because light energy is emitted only in discrete amounts called quanta. Rather than continuous radiation emerging from a hot body, there is a discontinuous, and finite, emission of a series of quanta.
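Planck's proposal can be stated as a single formula: the energy of one quantum of light is its frequency multiplied by a new constant of nature, h (Planck's constant). A sketch of the arithmetic (the 500 nm wavelength is an illustrative value):

```python
# Energy of a single quantum of light: E = h * nu (Planck's relation).
H = 6.626e-34   # Planck's constant, in joule-seconds
C = 2.998e8     # speed of light, in m/s

def quantum_energy(wavelength_m):
    """Energy in joules of one quantum of light of the given wavelength."""
    nu = C / wavelength_m   # frequency of the light
    return H * nu

# A green photon (~500 nm) carries a tiny but definite, indivisible energy:
e = quantum_energy(500e-9)
print(f"{e:.2e} J")  # on the order of 4e-19 joules
```

The energy of each packet is minute, which is why light had always looked continuous; the discreteness only matters at the atomic scale.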
With one stroke the problem of blackbody radiation was solved, and the door was opened to a whole new field that eventually became known as quantum theory. Ironically, Einstein was the first scientist to apply Planck’s ideas. He argued that if light energy comes in the form of little packages, or quanta, then when light falls on the surface of a metal it is like a hail of tiny bullets that knock electrons out of the metal. In fact this is exactly what is observed in the “photoelectric effect,” the principle behind such technological marvels as the “magic eye.” When you stand in the doorway of an elevator you interrupt a beam of light that is supposed to be hitting a photocell. This beam consists of light quanta, or photons, that knock electrons from their atoms and in this way create an electrical current that activates a relay to close the door. A person standing in the doorway interrupts this beam and so the door does not close.
The next important step in the development of quantum theory came in 1913 from the young Niels Bohr, who suggested that not only light, but also the energy of atoms, is quantized. This explains why, when atoms emit or lose their energy in the form of radiation, the energy given out by a heated atom is not continuous but consists of a series of discrete frequencies that show up as discrete lines in that atom’s spectrum. Along with contributions from Werner Heisenberg, Max Born, Erwin Schrödinger, and several other physicists, the quantum theory was set in place. And with it uncertainty entered the heart of physics.
Just as relativity taught that clocks can run at different rates, lengths can contract, and twins on different journeys age at different rates, so too quantum theory brought with it a number of curious and bizarre new concepts. One is called wave-particle duality. In some situations an electron can only be understood if it is behaving like a wave delocalized over all space. In other situations, an electron is detected as a particle confined within a tiny region of space. But how can something be everywhere and at the same time also be located at a unique point in space?
Niels Bohr elevated duality to a universal principle he termed “complementarity.” A single description “this is a wave” or “this is a particle,” he argued, is never enough to exhaust the richness of a quantum system. Quantum systems demand the overlapping of several complementary descriptions that when taken together appear paradoxical and even contradictory. Quantum theory was opening the door to a new type of logic about the world.
Bohr believed that complementarity was far more general than just a description of the nature of electrons. Complementarity, he felt, was basic to human consciousness and to the way the mind works. Until the twentieth century, science had dealt in the certainties of Aristotelian logic: “A thing is either A or not-A.” Now it was entering a world in which something can be “both A and not-A.” Rather than creating exhaustive descriptions of the world or drawing a single map that corresponds in all its features to the external world, science was having to produce a series of maps showing different features, maps that never quite overlap.
Chance and the Irrational in Nature
If complementarity shook our naive belief in the uniqueness of scientific physical objects, certainty was to receive yet another shock in the form of the new role taken by chance. Think, for example, of Marie Curie’s discovery of radium. This element is radioactive, which means that its nuclei are unstable and spontaneously break apart or “decay” into the element radon. Physicists knew that after 1,620 years only half of this original radium will be left-this is known as its half-life. After a further 1,620 years only a quarter will remain, and so on. But an individual atom’s moment of decay is pure chance-it could decay in a day, or still be around after 10,000 years.
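The half-life arithmetic above is simple: after each half-life, half of what remains decays, so after n half-lives a fraction (1/2)^n of the original sample is left. Yet no formula gives the decay time of any single atom. A sketch of both facts (a minimal simulation, not taken from the text):

```python
import random

HALF_LIFE = 1620  # years, the half-life of radium quoted in the text

def fraction_remaining(years):
    """Fraction of the original radium left after the given time."""
    return 0.5 ** (years / HALF_LIFE)

print(fraction_remaining(1620))  # 0.5  -> half remains
print(fraction_remaining(3240))  # 0.25 -> a quarter remains

# For a single atom only a probability can be given: each year it decays
# with a fixed chance, independent of how long it has already survived.
def decay_year(rng):
    """Simulate one atom: return the year in which it happens to decay."""
    p = 1 - 0.5 ** (1 / HALF_LIFE)  # per-year decay probability
    year = 0
    while rng.random() > p:
        year += 1
    return year

print(decay_year(random.Random()))  # pure chance: a day or 10,000 years
```

The ensemble behaves with clockwork regularity while each member is wholly unpredictable, which is exactly the insurance analogy that follows.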
The result bears similarity to life insurance. Insurers can compute the average life expectancy of 60-year-old men who do not smoke or drink, but they have no idea when any particular 60-year-old will die. Yet there is one very significant difference. Even if a 60-year-old does not know the hour of his death, it is certain that his death will be the result of a particular cause-a heart attack, a traffic accident, or a bolt of lightning. In the case of radioactive disintegration, however, there is no cause. There is no law of nature that determines when such an event will take place. Quantum chance is absolute.
To take another example, chance rules the game of roulette. The ball hits the spinning wheel and is buffeted this way and that until it finally comes to rest on a particular number. While we can’t predict the exact outcome, we do know that at every moment there is a specific cause, a mechanical impact, that knocks the ball forward. But because the system is too complex to take into account all the factors involved-the speed of the ball, the speed of the wheel, the precise angle at which the ball hits the wheel, and so on-the laws of chance dominate the game. As with life insurance, chance is another way of saying that the system is too complex for us to describe. In this case chance is a measure of our ignorance.
Things are quite different in the quantum world. Quantum chance is not a measure of ignorance but an inherent property. No amount of additional knowledge will ever allow science to predict the instant a particular atom decays because nothing is “causing” this decay, at least in the familiar sense of something being pushed, pulled, attracted, or repelled.
Chance in quantum theory is absolute and irreducible. Knowing more about the atom will never eliminate this element. Chance lies at the heart of the quantum universe. This was the first great stumbling block, the first great division between Bohr and Einstein, for the latter refused to believe that “the Good Lord plays dice with the universe.”
Einstein: The Last Classical Physicist
Even now, half a century after Einstein’s death, it is too soon to assess his position in science. In some ways his stature could be compared to that of Newton who, following on from Galileo, created a science that lasted for 200 years. Newton made so grand a theoretical synthesis that it embraced the whole of the universe. Some historians of science also refer to Newton as the last magus, a man with one foot in the ideas of the middle ages and the other in rationalistic science. Newton was deeply steeped in alchemy and sought the one Catholick Matter. He had a deep faith in a single unifying principle of all that is.
Likewise Einstein, who was responsible for the scientific revolution of relativity as well as some of the first theoretical steps into quantum theory, is regarded by some as the last of the great classical physicists. As with Shakespeare, great minds such as Newton’s and Einstein’s appear to straddle an age, in part gazing forward into the future, in part looking back to an earlier tradition of thought.
When Einstein spoke of “the Good Lord” as not playing dice with the universe, he was referring not to a personal god but rather to “the God of Spinoza,” or, as with Newton, to an overarching principle of unity that embraces all of nature. The cosmos for Einstein was a divine creation and thus it had to make sense, it had to be rational and orderly. It had to be founded upon a deep and aesthetically beautiful principle. Its underlying structure had to be satisfyingly simple and uniform. Reality, for Einstein, lay beyond our petty human wishes and desires. Reality was consistent. It reflected itself at every level. Moreover, the Good Lord had given us the ability to contemplate and understand such a reality.
Einstein could have sat down at Newton’s dinner table and discussed the universe with him, something he was ultimately unable to do with Bohr. Bohr and quantum theory spoke of absolute chance. “Chance” to Einstein was a shorthand way of referring to ignorance, to a gap in a theory, to some experimental interference that had not yet been taken into account.
Wolfgang Pauli, another of the physicists who helped to develop quantum theory, put the counterargument most forcefully when he suggested that physics had to come to terms with what he called “the irrational in matter.” Pauli himself had many conversations with the psychologist Carl Jung, who had discovered what Pauli termed an “objective level” to the unconscious. It is objective because this collective unconscious is universal and lies beyond any personal and individual events in a person’s life. Likewise, Pauli suggested that just as mind had been discovered to have an objective level, so too would matter be found to have a subjective aspect. One feature of this was what Pauli called the “irrational” behavior of matter. Irrationality, for Pauli, included quantum chance, events that occur outside the limits of causality and rational physical law.
The gap between Pauli’s irrationality of matter and Einstein’s objective reality is very wide. What made this gap unbridgeable was an even more radical uncertainty-whether or not an underlying reality exists at the quantum level, whether or not there is any reality independent of an act of observation.
Heisenberg’s Uncertainty Principle
This disappearance of an ultimate reality has its seed in Werner Heisenberg’s famous uncertainty principle. When Heisenberg discovered quantum mechanics he noticed that his mathematical formulation dictated that certain properties, such as the speed and position of an electron, couldn’t be simultaneously known for certain. This discovery was then expressed as Heisenberg’s uncertainty principle.
When astronomers want to predict the path of a comet all they need to do is measure its speed and position at one instant. Given the force of gravity and Newton’s laws of motion, it is a simple matter to plug speed and position into the equations and plot out the exact path of that comet for centuries to come. But when it comes to an electron, things are profoundly different. An experimenter can pin down its position, or its speed, but never both at the same time without a measure of uncertainty or ambiguity creeping in. Quantum theory dictates that no matter how refined the measurements, the level of uncertainty can never be reduced.
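Heisenberg's limit is quantitative: the product of the uncertainty in position and the uncertainty in momentum can never fall below a fixed amount, roughly Planck's constant. A sketch of the trade-off (using the standard bound Δx·Δp ≥ ħ/2 for the best possible case; the widths are illustrative values, not figures from the text):

```python
# Heisenberg's uncertainty principle: delta_x * delta_p >= hbar / 2.
# The best any measurement can do is saturate this bound, so narrowing
# the position spread necessarily widens the momentum spread.
HBAR = 1.055e-34  # reduced Planck constant, in joule-seconds

def min_momentum_spread(delta_x):
    """Smallest momentum uncertainty allowed for a given position spread."""
    return HBAR / (2 * delta_x)

# Pin an electron down to an atom-sized region (~1e-10 m):
dp_atom = min_momentum_spread(1e-10)
# Pin it down 100 times more tightly, and the unavoidable momentum
# spread grows 100-fold:
dp_tight = min_momentum_spread(1e-12)

print(f"{dp_atom:.2e} kg*m/s")
print(f"{dp_tight:.2e} kg*m/s")
```

For a comet, ħ is so minute relative to the quantities involved that the bound is invisible; for an electron it dominates, which is why the astronomer's method of plugging in speed and position fails.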
How does this come about? It turns out to be a direct result of Max Planck’s discovery that energy, in all its forms, is always present in discrete packets called quanta. This means a quantum cannot be split into parts. It can’t be divided or shared. The quantum world is a discrete world. Either you have a quantum or you don’t. You can’t have half or 99 percent of a quantum.
This fact has a staggering implication when it comes to our knowledge of the atomic world. Scientists learn about the world around them by making observations and taking measurements. They ask: How bright is a star? How hot is the sun? How heavy is Newton’s apple? How fast is a meteor?
Whenever a measurement is made something is recorded in some way. If no record were created, if no change had occurred, then no measurement would have been made or registered. This may not be obvious at first sight so let’s do an experiment: Measure the temperature of a beaker of water. Put a thermometer in the water and register how high the mercury rises. For this to happen some of the heat of the water must have been used to heat up and expand the mercury in the thermometer. In other words, an exchange of energy between the water and the thermometer is necessary before a measurement can be said to have been recorded.
What about the position or the speed of a rocket? Electromagnetic waves are bounced off the rocket, picked up on a radar dish, and processed electronically. From the returned signals it is a simple matter to determine the rocket’s position. These same signals can also be used to find