Posted: 01/23/2013 10:29 pm
Former NASA researcher;
computational scientist; emeritus professor of mathematics, James
Madison University; author, 'Reason and Wonder'
"God does not play dice." --Albert Einstein
Since my Jan. 2 post, we've been discussing assumptions that science initially embraced, either explicitly or tacitly, only to abandon later as invalid or unnecessary. Most of the following fall into that category: dualism, determinism, reductionism, absolute time, absolute space, the principle of locality, materialism, and realism.
The last post
addressed absolute time and space. Today we ring the death knell for
determinism, the collateral damage of two revolutionary scientific
developments of the 20th century. Let's review them in reverse
historical order.
* * * * *
"If I have seen further than others, it is by standing on the
shoulders of giants," acknowledged Newton. He didn't name the giants,
but it's clear that Galileo and Kepler were among them. Galileo Galilei,
the father of experimental physics, laid foundations for the science of
motion. Johannes Kepler, the "Protestant Galileo," laid the foundations
of celestial mechanics. Both men developed descriptive laws. Galileo
described the now-familiar parabolic trajectory of a thrown object;
Kepler teased from Tycho Brahe's astronomical data the fact that the
planets travel in elliptical orbits. Neither explained
why.
It took the peculiar genius of Newton to move from
descriptions of natural phenomena to
explanations.
By combining 1) his three laws of motion, 2) the calculus, which he
termed "fluxions," and 3) an inverse-square law of gravitation, Newton
proved that the planetary orbits naturally and exactly satisfy all three
of Kepler's descriptive laws.
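Newton proved this with pen, geometry and fluxions; today anyone can watch it happen numerically. Here is a minimal sketch (my illustration, not Newton's derivation) that integrates the inverse-square law in arbitrary units chosen so that GM = 1. The distance from the attracting body oscillates between two fixed turning points, the signature of a closed Keplerian ellipse with the Sun at one focus.
```python
import numpy as np

# A minimal numerical sketch (not an analytic proof): integrate the
# inverse-square law of gravitation in units where GM = 1 and watch
# Kepler's ellipse emerge.
GM, dt, steps = 1.0, 1e-3, 40000
r = np.array([1.0, 0.0])          # initial position
v = np.array([0.0, 0.8])          # slower than circular speed -> elliptical orbit

def accel(r):
    return -GM * r / np.linalg.norm(r) ** 3   # inverse-square acceleration

radii = []
for _ in range(steps):
    v += 0.5 * dt * accel(r)      # leapfrog (kick-drift-kick): stable for orbits
    r += dt * v
    v += 0.5 * dt * accel(r)
    radii.append(np.linalg.norm(r))

# The distance from the attracting body shuttles between two fixed values,
# perihelion and aphelion -- a closed Keplerian ellipse.
print(f"perihelion ≈ {min(radii):.3f}, aphelion ≈ {max(radii):.3f}")
```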
Newton's achievement was monumental. To those of his era, it seemed
that Newton had illuminated every scientific corner, leaving nature
bereft of secrets. The French mathematician Pierre Simon de Laplace, the
"Newton of France," extrapolated beyond Newton to envision the day when
science could predict the future "movements of the greatest bodies of
the universe and those of the tiniest atom." What was necessary to
preordain the future? "An intellect which at a certain moment would know
all forces that set nature in motion, and all positions of all items of
which nature is composed." The difficulties with
Laplace's daemon,
as this hypothesized intelligence became known, were practical rather
than conceptual. By the 1950s Laplace's daemon was incarnate in the
digital computer. Two decades later men stood on the moon, navigated
there not by compass but by a digital daemon that could solve Newton's
equations of celestial motion in real time.
* * * * *
Just as it seemed that the computer might fulfill the dream of
scientific determinism, it turned against Laplace. The year was 1961,
the place was MIT, and the person was Edward Lorenz, a meteorologist.
While playing with a "toy" weather model on a primitive,
vacuum-tube Royal McBee computer, Lorenz stumbled onto something
unexpected:
chaos.
Lorenz's system consisted of just three simple differential equations. However, the equations were
nonlinear,
a mathematical term indicating that the system's output is not directly
proportional to its input. On impulse, Lorenz restarted an old
computation at its midpoint in time. The old and new solutions followed
nearly the same trajectories for a while, then they diverged
dramatically, eventually losing all resemblance. Suspecting a computer
malfunction, Lorenz checked and rechecked the Royal McBee. It was in
perfect working order.
Pausing to ponder, Lorenz was stunned by the truth. For the original
calculation, he had entered the initial data to six significant digits;
in the repeat calculation, he had done so only to three. Tiny
differences in the input were magnified by the system to yield enormous
differences in output, a phenomenon known to mathematicians as
sensitive dependence on initial conditions, and to the general public as the
butterfly effect, thanks to James Gleick's bestseller
Chaos
(1988). In a nutshell, nonlinear systems can be extraordinarily
sensitive to their initial states. Change the initial data a smidgeon
and you'll get a wholly different solution.
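You can see this in a few lines of code. The sketch below (an illustration, not Lorenz's original weather model) integrates the three-equation Lorenz system with its classic parameters, twice, from starting points that differ only in the fourth decimal place, much like Lorenz's truncated restart.
```python
import numpy as np

# The three-equation Lorenz system with the classic parameters
# sigma=10, rho=28, beta=8/3, run twice from nearly identical states.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 0.001, 40000

def step(state):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])   # crude Euler step, enough to show divergence

a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0001, 1.0, 1.0])                 # perturbed in the fourth decimal place

for i in range(steps):
    a, b = step(a), step(b)
    if i % 10000 == 0:
        print(f"t = {i*dt:5.1f}  separation = {np.linalg.norm(a - b):.6f}")
```
The gap between the two runs grows from one part in ten thousand to the full width of the attractor within a few dozen time units: same equations, indistinguishable starts, utterly different weather.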
* * * * *
Quantum mechanics rankled Einstein, steeped as he was in the
determinism of the philosopher Spinoza. Quanta (subatomic particles) are
defiantly probabilistic and therefore non-deterministic.
With the discovery of radioactivity in 1896, probability thrust its
ugly head into physics. The moment at which a radioactive element decays
is inherently unpredictable. One can ascertain the probability of an
aggregation of decay events from a lump of radium with sufficient
precision to define its half-life spot on. But as to when an individual
atom will shed its alpha particle, one is powerless to say.
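A quick simulation makes the contrast vivid. The sketch below is an illustration only; the 1,600-year half-life is the textbook figure for radium-226. It draws a million random decay times from the exponential law: the aggregate half-life comes out razor-sharp, yet no individual decay time could have been foreseen.
```python
import numpy as np

# Illustration: exponential decay with an assumed half-life of 1600 years
# (roughly radium-226's). Aggregates are predictable; individuals are not.
rng = np.random.default_rng(0)
half_life = 1600.0                          # years (assumed)
lam = np.log(2) / half_life                 # decay constant

decay_times = rng.exponential(1 / lam, size=1_000_000)

# The median decay time of the whole lump reproduces the half-life precisely...
print("aggregate half-life ≈", round(np.median(decay_times), 1), "years")
# ...while any single atom may go early or hang on for millennia.
print("one atom decays at year", round(decay_times[0], 1))
print("another at year", round(decay_times[1], 1))
```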
With the advent of quantum mechanics in the early 1900s, probability
was here to stay. Each of the nearly 120 elements that comprise the
universe emits a unique fingerprint, its visible
spectrum, when
heated or burned. No one knew why. Niels Bohr's adaptation of Ernest
Rutherford's atomic model explained the hitherto mysterious spectral
lines, but at the expense of quantizing the orbits of electrons. Each
spectral line originates when an electron jumps from one orbit to
another, Bohr reasoned. But Bohr's model troubled Rutherford: The
leaping electrons seemed "to choose not only the timing of their leap
but the destination [orbit] too."
The harder physicists tried to explain away the probabilistic nature
of the quantum, the more resilient it became. Erwin Schroedinger, a titan
of quantum mechanics, grumbled, "If we're going to put up with these
damn quantum jumps, I am sorry that I ever had anything to do with
quantum theory." Einstein fought amicably with Bohr over the issue for
more than two decades, and lost.
Dominating the landscape of quantum mechanics is Werner Heisenberg's
uncertainty principle,
which quantifies what can and cannot be known about quantum objects.
Specifically, one can never know precisely and simultaneously the
position and velocity of, say, an electron. You can know where it is but
not where it's going. Or you can know where it's going, but not where
it is. You can even split the difference, accepting some fuzziness in
each. But you can never precisely know its complete initial state.
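A back-of-the-envelope calculation shows just how stingy the principle is. The sketch below (an illustration; the one-angstrom confinement is an assumed figure, roughly the width of an atom) applies Heisenberg's bound Δx·Δp ≥ ħ/2 to an electron.
```python
# Heisenberg's bound dx * dp >= hbar/2, applied to an electron confined
# to an atom-sized region (assumed: dx = 1 angstrom).
hbar = 1.054571817e-34        # J*s, reduced Planck constant
m_e = 9.1093837015e-31        # kg, electron mass

dx = 1e-10                    # metres: position pinned down to one atomic width
dp_min = hbar / (2 * dx)      # minimum allowed momentum uncertainty
dv_min = dp_min / m_e         # corresponding velocity uncertainty

print(f"minimum velocity uncertainty ≈ {dv_min:.2e} m/s")
```
Pin the electron down to a single atom's width and its velocity is already uncertain by more than half a million meters per second.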
* * * * *
On the shoal of uncertainty, determinism founders. On the reef of
chaos, it sinks. By and large, nature has a strong predilection for the
nonlinear. Newton's law of gravitation is nonlinear. Maxwell's equations
of electrodynamics are not, but they're the exceptions. General
relativity is strongly nonlinear. So are the strong and weak nuclear
forces. As a result, so is most chemistry. And because life involves
chemistry, life's basic mechanisms are inherently nonlinear.
To accurately predict the evolution of events in a nonlinear universe
-- in which the flapping of a butterfly's wing in Brazil literally
affects Peoria's weather days later -- Laplace's daemon would need
absolute precision in its initial data: the position and velocity of
every molecule of air and butterfly. Heisenberg's uncertainty principle
forbids just that. In the wonderfully pithy summary of Lorenz: "The
present predicts the future, but the approximate present does not
predict the approximate future."
The universe is contingent, not deterministic. For sentient beings
this implies that the cosmos is also participatory. Individual
intentions and actions, however seemingly insignificant, shape the
cosmic future.
This blog post is adapted from Chapters 1 and 11 -- "The
Clockmaker and the Clockwork" and "Through the Looking Glass,"
respectively -- of my recent book Reason and Wonder.