FAIR USE NOTICE


A BEAR MARKET ECONOMICS BLOG

OCCUPY THE SCIENTIFIC METHOD


This site may contain copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in an effort to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law.

In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml

If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.

FAIR USE NOTICE: This page may contain copyrighted material the use of which has not been specifically authorized by the copyright owner. This website distributes this material without profit to those who have expressed a prior interest in receiving the included information for scientific, research and educational purposes. We believe this constitutes a fair use of any such copyrighted material as provided for in 17 U.S.C § 107.

Read more at: http://www.etupdates.com/fair-use-notice/#.UpzWQRL3l5M | ET. Updates

All Blogs licensed under Creative Commons Attribution 3.0

Saturday, April 6, 2013

Science's Sacred Cows (Part 9): Conclusion



Posted: 04/04/2013 11:59 am
Dave Pruett

 

Science's Sacred Cows (Part 8): Materialism



Posted: 03/29/2013 7:39 pm
Dave Pruett

 

Science's Sacred Cows (Part 7): Reductionism



Posted: 03/19/2013 7:27 pm

Science's Sacred Cows (Part 6): Realism



Posted: 03/12/2013 4:35 pm

Dave Pruett





"The universe is not only queerer than we suppose, it is queerer than we can suppose." -- J.B.S. Haldane


In the previous post, we discussed a fiendishly clever gedanken experiment posed in 1935 by Einstein and co-workers and designed to expose presumed flaws in quantum mechanics (QM). When the so-called "EPR paradox" was finally tested experimentally in 1972 at Lawrence Berkeley Laboratory, the results were a resounding victory for QM, while ringing the death knell for Einstein's cherished principle of local causes. The "EPR paradox" -- and Bell's Theorem, which ultimately led to its resolution -- firmly established the notion of quantum entanglement. Further experiments over the past three decades have moved entanglement from the status of novelty into the mainstream of physics. Today we discuss further experiments involving entanglement that call into question another of science's tacit assumptions: realism.

Realism is the philosophical stance of most sane human beings, scientists included. Realism asserts: "All measurement outcomes depend on pre-existing properties of objects that are independent of the measurement." In layman's terms, there's a real world out there independent of us (although we may see it differently because of variations in our measurement devices, i.e., our eyes and lenses). After all, if I ski down a slope and collide with a tree in my path, it makes little difference whether or not I believe the tree is real. Most persons -- a few Zen philosophers excepted ("What sound does a falling tree make if no one hears?") -- would assert that the tree has a reality all its own, independent of me. But in the microscopic world of the quantum, things are very different from what our macroscopic experiences would lead us to believe.

Recall from the previous post that the paper by Einstein, Podolsky, and Rosen (hence the moniker "EPR") involved two-particle (say two photon) systems in which twin particles are spatially separated by a vast distance. Certain implications of QM, fretted Einstein, suggested that measuring the polarization of one photon would instantaneously set the polarization of its twin, violating the principle of local causes.

Let's carefully parse Einstein's words when he sprang the EPR trap to conclude that QM is incomplete.
"One can escape from this conclusion [that quantum theory is incomplete] only by either assuming the measurement of [photon 1 telepathically] changes the real situation at [photon 2] or by denying independent real situations as such to things which are spatially separated from each other. Both alternatives appear to me entirely unacceptable."
By referring to "the real situation" and "independent real situations," Einstein explicitly assumed physical realism. Furthermore, by assuming that spatially separated particles could not instantly "communicate," he invoked the principle of local causes. Thus Einstein made two fundamental and independent assumptions: locality and realism.

Because of the profound implications of entanglement, researchers around the world continue to propose and conduct experiments to further clarify the remarkable phenomenon. To summarize their collective results prior to 2007: experiments based upon Bell's Theorem prove "that all hidden-variable theories based on the joint assumption of locality and realism [emphases added] are at variance with the predictions of quantum mechanics." The logical conclusion from these results is that theories based on local realism fail to be consistent with the predictions of QM because at least one assumption fails. But which?

In 2007, a group of researchers at premier institutions in Austria and Poland collaborated to perform yet another stunning EPR-like investigation. The experiment, by Simon Gröblacher and a host of coworkers, tested a new theorem by A. Leggett (2003) that further refines Bell's theorem. Leggett's theorem yields a mathematical inequality that, when combined with Bell's inequality, permits the independent testing of Einstein's two assumptions: locality and realism. The results of Gröblacher's team, published in Nature (April 19, 2007), once again validated quantum mechanical predictions. That was to be expected. The unexpected occurred in identifying which assumption of local realism failed. Locality or realism? Surprisingly, both. The authors concluded:
"Our result suggests that giving up the concept of locality is not sufficient to be consistent with quantum experiments, unless certain intuitive features of realism are [also] abandoned."
An independent reality, it now appears, has become the latest casualty of QM. The universe is nonlocal, nondeterministic, and apparently "unreal" as well.
Haldane was right: the universe, at least at the quantum level, is "queerer" than we can imagine.

This essay is extracted from Chapter 11 -- "Through the Looking Glass" -- of the author's book Reason and Wonder.

Science's Sacred Cows (Part 5): Locality



Posted: 02/22/2013 5:00 pm

Dave Pruett





In three previous posts we've discussed assumptions that science once embraced, later to discard as invalid or unnecessary. Thus far we've dispensed with absolute time and space (Part 2), determinism (Part 3), and dualism (Part 4). Today we examine the principle of local causes.

*****
 
Einstein, who deposed Newton, grew intellectually stodgy in old age. With respect to quantum mechanics (QM), he was positively reactionary. A die-hard determinist, Einstein rejected the statistical implications of quantum theory. In a letter to his friend and fellow physicist Max Born, Einstein confided: "Quantum mechanics is very impressive. But an inner voice tells me that it is not yet the real McCoy. The theory produces a good deal but hardly brings us closer to the secret of the Old One."

By 1935, Einstein believed he had found the Achilles heel of QM. With two coworkers, he made a heroic attempt to expose its fatal flaw by posing what became known as the "EPR paradox."

Strictly speaking, paradoxes are unresolved; not so for EPR. Einstein lost, and QM won. However, in 1935, when Einstein, Boris Podolsky, and Nathan Rosen posed their fiendishly clever gedanken experiment, a betting person might have put odds on Einstein's team. QM must be incomplete, Einstein felt instinctively. The probabilistic appearances of quantum events -- such as the spontaneous decay of a radioactive element -- were mirages, he surmised, the result not of any propensity of the natural world toward statistics but rather of our incomplete knowledge of nature. Einstein hypothesized the existence of "hidden variables" of which the current theory remained ignorant. If the hidden variables were exposed to the light of day, he argued, the apparent statistical predilection of nature would evaporate.

*****
 
As the originator of the theory of relativity, Einstein held the principle of local causes to be inviolate. According to special relativity, the velocity of light in vacuo, denoted c, plays the role of a universal speed limit: travel faster than c is physically impossible. Nothing -- neither matter nor information -- can make its way from point A to point B faster than a beam of light can make the transit.

To illustrate, suppose we humans unwisely break the atmospheric test-ban treaty, detonating a nuclear weapon. Aliens presumably will be unaware of the ominous event until photons from the flash on earth reach their distant planet. Most physicists, Einstein included, regarded as absurd the thought that an event at one point in space could instantaneously effect an outcome in a distant region.

QM, Einstein inferred, violated the locality principle. Sporting the title "Can Quantum-Mechanical Descriptions of Physical Reality Be Considered Complete?," the EPR paper considered simple, two-particle quantum systems. Quanta -- whether electrons, positrons, or photons -- involve a property called spin. It is a helpful analogy to think of such particles as spinning about an axis like a top. Spin, a form of angular momentum, is defined by an axis of rotation and an angular velocity. For subatomic particles, spin is quantized: spin magnitudes must be whole or half multiples of the reduced Planck constant ħ. For photons of light, spin is synonymous with polarization. Photons carry spin values of plus or minus 1. No other values are permitted by nature.

Now consider the simplest aggregate of quantum particles: a two-particle system consisting of one spin-up (+1) photon and one spin-down (−1) photon. The net spin momentum is thereby zero, the sum of the individual spins. By conservation of angular momentum, the net spin of the system must remain zero for all time.

And now things get interesting. As a quantum of angular momentum, spin is subject to Heisenberg's uncertainty principle, with two enormous implications. First, until measurement, a particle's spin exists only as a probability, not as a reality. Second, the act of measuring spin determines its actual value.

Now to Einstein's punch line. Consider the creation of a two-particle system of net spin zero, comprised of two photons, each traveling horizontally at velocity c in opposite directions. Suppose further that the two photons are now separated by an enormous expanse of space. Suppose finally that we measure the spin of one photon, which fixes its spin. According to quantum theory, the other must simultaneously register a spin value exactly opposite that of its twin in order to preserve angular momentum. But how does the distant twin know instantly what spin value to assume?

Einstein believed that he held QM by the soft parts. He closed by springing the trap:
"One can escape from this conclusion [that quantum theory is incomplete] only by either assuming the measurement of [particle 1 telepathically] changes the real situation at [particle 2] or by denying independent real situations as such to things which are spatially separated from each other. Both alternatives appear to me entirely unacceptable."
Einstein euphemistically termed such mysterious (and presumably impossible) "telepathic" or "superluminal" communication by the colorful phrase "spooky action at a distance." And so the EPR paradox rested for 30 years, a full-fledged, unresolved, perplexing conundrum about the nature of reality.

*****
 
Then, in 1964, John Stewart Bell, an Irish physicist associated with CERN (the European Organization for Nuclear Research), proved a theorem directly related to EPR. Bell's theorem is extraordinary for several reasons. Foremost, it's a theorem, not a theory. Theorems, which are mathematical, rest on the solid foundation of formal logic and carry 100 percent certitude. In particular, Bell's theorem established a mathematical inequality by which to experimentally test whether the principle of local causes is valid or the statistical predictions of QM are valid. There had to be a winner, because the two outcomes are mutually exclusive.

In the decade following the publication of Bell's theorem, numerous physicists reformulated and sharpened the argument. By the early 1970s, experimentalists had posed a version involving photon polarization that could be tested in the laboratory. In 1972, Stuart Freedman and John Clauser of Lawrence Berkeley Laboratory performed the long-awaited experiment to resolve the EPR paradox. By generalizing Bell's inequality, they directly tested Einstein's assertion that hidden variables could preserve local causes. The results were unequivocal: "Our data, in agreement with quantum mechanics, ... provid[e] strong evidence against local hidden-variable theories." Spooky action at a distance prevailed.
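Bell's original inequality has several experimentally convenient descendants; the CHSH form is perhaps the easiest to sketch. The toy hidden-variable model below -- a shared polarization angle with deterministic plus-or-minus-one responses -- is an illustration of my own, not the Freedman-Clauser apparatus, but it shows the gap Bell's theorem exposes: the quantum prediction exceeds the bound of 2 that no local model can break.

```python
import math
import random

# Quantum prediction for polarization-entangled photons: E(a, b) = cos(2(a - b))
def E_quantum(a, b):
    return math.cos(2 * (a - b))

# Toy local-hidden-variable model: each photon pair carries a shared hidden
# polarization angle lam, and each polarizer answers +1/-1 deterministically.
def E_local(a, b, trials=200_000, seed=1):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        lam = rng.uniform(0, math.pi)  # the shared hidden variable
        A = 1 if math.cos(2 * (a - lam)) >= 0 else -1
        B = 1 if math.cos(2 * (b - lam)) >= 0 else -1
        total += A * B
    return total / trials

def chsh(E):
    # Standard CHSH polarizer settings (radians) that maximize the quantum value
    a, a2, b, b2 = 0.0, math.pi / 4, math.pi / 8, 3 * math.pi / 8
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

print("Quantum CHSH value:        ", chsh(E_quantum))  # 2*sqrt(2) ≈ 2.828
print("Hidden-variable CHSH value:", chsh(E_local))    # never exceeds 2
```

Any local hidden-variable model, not just this one, is capped at 2; experiment sides with the quantum value.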

*****
 
The modern term for spooky action is quantum entanglement. The term suggests that quantum particles entangled "at birth" remain entangled forever despite the intervening distance. One is reminded of the psychic connections widely reported by human twins. Once connected, always connected, it seems.

In 1975, shortly after the first experimental test of Bell's theorem, a prescient physicist, Henry Stapp, went out on a limb to write in a governmental report: "Bell's theorem is the most profound discovery of science."

At the time, it is doubtful that many concurred. Today, Stapp's words ring true. Quantum entanglement, now well established, is quickly making its way into a variety of applications, from cryptography to quantum computing. Bell's theorem may be the Cullinan diamond of physics' many gems. By revealing quantum entanglement, Bell's theorem exposes subtle and mysterious interconnections that may lie outside the universe's spacetime fabric.

Had Einstein been buried rather than cremated, he would be turning in his grave.

(This essay was adapted from Chapter 11 of Reason and Wonder.)

Science's Sacred Cows (Part 4): Dualism



Posted: 02/06/2013 2:18 pm

Dave Pruett




"The very act of observing alters the thing being observed."
--Werner Heisenberg

Since Jan. 2's post, we've been discussing assumptions that science initially embraced either explicitly or tacitly, later to abandon them as invalid or unnecessary. In the last post we rang the death knell for determinism. Today let's ring it for dualism.

The flavor of dualism most germane to science is Cartesian dualism, in reference to the 17th-century French philosopher and mathematician René Descartes, who partitioned the cosmos into two domains: the res extensa (matter) and the res cogitans (mind). Foundational for classical physics, the Cartesian partition presupposed the independence of subject (the observer) and object (the thing observed). Implicit in the subject-object dichotomy was the presumption of an independent reality "out there" that remains undisturbed during scientific observation.

Science proceeded along this objective course until 1900, when German physicist Max Planck stumbled onto the subatomic quantum, the daintiest morsel of the material world. Matter and energy, it turns out, are quantized into discrete parcels that defy further subdivision. A little thought experiment helps expose the crack that the quantum opened in the bedrock of physics.

Imagine being ticketed for speeding, say for driving at 65 miles per hour on a road posted at 55. You didn't spot the patrol car until well past it. The trooper's radar gun got you from behind. In traffic court you attempt a quantum-mechanical defense: "Your Honor, I was traveling at the posted speed limit. When the officer fired his radar gun, the colliding radio-frequency photons transferred their combined momentum to my car, bumping up its speed to 65 mph. Had the officer not fired the radar gun, I would not have been speeding."
Of course, you must now pay the speeding fine and a fine for contempt of court, because your argument is patently absurd. However, it contains a germ of truth. Momentum was exchanged between the photons and your car. Moreover, the momentum lost by the reflected photons shifted their frequency -- similar to the familiar auditory Doppler shift -- allowing the patrolman to infer your speed. But the effect on your vehicle was immeasurably small, because a car's momentum so vastly exceeds the combined momentum of a few photons.
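Rough numbers make both halves of the courtroom argument concrete. The figures below (a 24-GHz K-band radar gun, a 1500-kg car) are hypothetical choices of mine, not the essay's, but the two formulas -- the two-way Doppler shift 2vf/c and the reflected-photon momentum kick 2hf/c -- show why the trooper's measurement works and why the "defense" fails:

```python
c = 2.998e8        # speed of light, m/s
h = 6.626e-34      # Planck's constant, J*s
f = 24.15e9        # radar frequency, Hz (hypothetical K-band gun)
m = 1500.0         # car mass, kg (hypothetical)
v = 65 * 0.44704   # 65 mph converted to m/s

# Two-way Doppler shift the radar gun measures: delta_f ≈ 2 v f / c
delta_f = 2 * v * f / c
print(f"Doppler shift: {delta_f:.0f} Hz")  # a few kHz, easily detected

# Momentum kick from one reflected photon (2 h f / c) vs. the car's momentum
p_photon = 2 * h * f / c
p_car = m * v
print(f"photon/car momentum ratio: {p_photon / p_car:.1e}")  # utterly negligible
```

A kilohertz shift is trivial to detect electronically, while the recoil from the photons is some thirty-six orders of magnitude too small to change the verdict.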

But suppose we wish to similarly detect the position and velocity of a moving electron rather than an automobile. An electron is too small to observe directly, so we infer its presence by beaming energy at it and observing the pattern of reflected energy. The essential difference from the previous case is that electrons are exceedingly small, and each therefore carries only a tiny amount of momentum. When a light photon, for example, collides with an electron, the impact occurs between virtual equals.

In the quantum world we have some choice in the color (frequency) of the photon projectiles. We can use "blue" photons, by which we mean highly energetic ones, or wimpy "red" ones. Suppose we first try wimpy ones, like the radio-frequency photons of a radar gun. An advantage of low-frequency photons is that they carry little momentum and, upon collision, scarcely disturb the electron's velocity, which can be inferred accurately. But what can be said of the electron's position? Very little. The wavelength of the photon provides the natural "tick marks" of our distance-measuring "yardstick." For low-frequency, long-wavelength photons, the tick marks are exceedingly sparse. Therefore, we obtain only a crude measurement of position.

Next let's try highly energetic "blue" photons. In this case we determine the electron's position accurately, but its trajectory is so utterly altered that its velocity cannot be inferred.

There must be a way out of this corner. Let's try subdividing high-frequency photons into smaller parcels that won't disturb the electron so violently. That is, let's use one half, or one fourth, or one millionth of a photon. However, the central tenet of quantum mechanics precludes this scheme: Quanta are not subdivisible. Try as we might, the electron's position and velocity cannot simultaneously be determined precisely. This is the essence of Heisenberg's uncertainty principle.
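The trade-off can be put in numbers via the usual statement of the principle, Δx·Δp ≥ ħ/2, which gives a minimum velocity uncertainty Δv ≥ ħ/(2mΔx). The comparison below (electron versus a hypothetical 1500-kg car, with position pinned down to about an atom's width) is an illustrative calculation of mine, not the essay's:

```python
hbar = 1.055e-34   # reduced Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
m_car = 1500.0     # car mass, kg (hypothetical)

dx = 1e-10         # position uncertainty: about one atomic diameter, 0.1 nm

# Minimum velocity uncertainty implied by dx * dp >= hbar / 2
dv_electron = hbar / (2 * m_e * dx)
dv_car = hbar / (2 * m_car * dx)
print(f"electron: dv >= {dv_electron:.2e} m/s")  # hundreds of km/s
print(f"car:      dv >= {dv_car:.2e} m/s")       # immeasurably small
```

For the electron, fixing the position to atomic precision blurs the velocity by hundreds of kilometers per second; for the car, the same constraint is some twenty-seven orders of magnitude below anything measurable, which is why classical physics never noticed it.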

* * * * *
 
The discussion above hints at another type of dualism, better termed duality, for it refers to differing aspects of a single entity. We have caricatured the photon as if it were a particle, like a small billiard ball. But we have also spoken of its frequency and wavelength, attributes of a wave. Quantum mechanics reveals that all material objects -- not just photons and electrons -- manifest in two fundamentally different ways: as waves or as particles. Waves are distributed in space; particles are localized. Two waves may occupy the same place at the same time through superposition. Two particles cannot. The forms are mutually exclusive. Which facet one observes depends upon the design of the experiment. Some experiments reveal an electron's wave nature, for example, and others its particle nature. No experiment reveals both aspects concurrently.

The oracle of quantum mechanics, Niels Bohr, spoke of complementarity rather than duality. Matter has two faces. Both faces reveal something about the object. By analogy, I, despite being a unitary human being, am both a party animal and a contemplative person. If you want to truly know me, observe me both at a party, interacting with others, and while meditating alone on my yoga mat.

Decades of attempts to resolve wave-particle duality failed. Every material object possesses an inherent wavelike nature expressed mathematically by its wavefunction. What does a quantum object's wavefunction reveal? The startling consensus of physicists is that the wavefunction encodes the probability of the object being detected at a given location when observed. Apparently, until observed, quantum objects manifest only a tendency to exist.

Again, an analogy helps (thanks to Charles Peskin of Courant Institute). Imagine skiing down a steep slope. A tree grows in the middle of the run near the bottom, presenting a hazard. The tree lies immediately ahead if you maintain your current course. Several possible scenarios exist, each associated with a probability. There exists a nonzero probability of striking the tree and perishing or sustaining injury. You could veer to the right of the tree, an option with a probability of slightly less than 50 percent. With equal probability you could veer left. The possibilities can be more finely graded, as in the very small probability of missing the tree by one mile to the right or the relatively high probability of missing it by six inches to the left. At the instant of your awareness of impending danger, all possibilities exist as potentia, each characterized by a probability.
Time is of the essence. You decide to adjust course to the left. At the moment of conscious intent -- that is, decision -- all potentia but one dissolve, and there remains a single reality: You miss the tree to the left. In quantum mechanics, this is known as collapse of the wavefunction.

In his delightful book Uncertainty, David Lindley summarizes: "Measurements are not passive accountings of an objective world, but interactions in which the thing measured and the way it is measured contribute inseparably to the outcome."

Heisenberg's uncertainty principle removes the partition separating mind from matter, rendering a fatal blow to Cartesian dualism. But complementarity remains. Possibly foreshadowing a paradigm shift for physics, Nobel laureate Wolfgang Pauli envisioned: "It would be most satisfactory of all if psyche and matter could be seen as complementary aspects of the same reality."

This blog post was adapted from Chapter 11 of my recent book Reason and Wonder.

Science's Sacred Cows (Part 3): Determinism





Posted: 01/23/2013 10:29 pm

Dave Pruett



"God does not play dice." -- Albert Einstein

 
Since my Jan. 2 post, we've been discussing assumptions that science initially embraced, either explicitly or tacitly, only to abandon later as invalid or unnecessary. These include most of the following: dualism, determinism, reductionism, absolute time, absolute space, the principle of locality, materialism and realism. The last post addressed absolute time and space. Today we ring the death knell for determinism, the collateral damage of two revolutionary scientific developments of the 20th century. Let's review them in reverse historical order.

* * * * *
 
"If I have seen further than others, it is by standing on the shoulders of giants," acknowledged Newton. He didn't name the giants, but it's clear that Galileo and Kepler were among them. Galileo Galilei, the father of experimental physics, laid foundations for the science of motion. Johannes Kepler, the "Protestant Galileo," laid the foundations of celestial mechanics. Both men developed descriptive laws. Galileo described the now-familiar parabolic trajectory of a thrown object; Kepler teased from Tycho Brahe's astronomical data the fact that the planets travel in elliptical orbits. Neither explained why.

It took the peculiar genius of Newton to move from descriptions of natural phenomena to explanations. By combining 1) his three laws of motion, 2) the calculus, which he termed "fluxions," and 3) an inverse-square law of gravitation, Newton proved that the planetary orbits naturally and exactly satisfy all three of Kepler's descriptive laws.

Newton's achievement was monumental. To those of his era, it seemed that Newton had illuminated every scientific corner, leaving nature bereft of secrets. The French mathematician Pierre Simon de Laplace, the "Newton of France," extrapolated beyond Newton to envision the day when science could predict the future "movements of the greatest bodies of the universe and those of the tiniest atom." What was necessary to preordain the future? "An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed." The difficulties with Laplace's daemon, as this hypothesized intelligence became known, were practical rather than conceptual. By the 1950s Laplace's daemon was incarnate in the digital computer. Two decades later men stood on the moon, navigated there not by compass but by a digital daemon that could solve Newton's equations of celestial motion in real time.

* * * * *
 
Just as it seemed that the computer might fulfill the dream of scientific determinism, it turned against Laplace. The year was 1961, the place was MIT, and the person was Edward Lorenz, a meteorologist. While playing with a "toy" weather model on a primitive, vintage, vacuum-tube Royal McBee computer, Lorenz stumbled onto something unexpected: chaos.

Lorenz's system consisted of just three simple differential equations. However, the equations were nonlinear, a mathematical term indicating that the system's output is not directly proportional to its input. On impulse, Lorenz restarted an old computation at its midpoint in time. The old and new solutions followed nearly the same trajectories for a while, then they diverged dramatically, eventually losing all resemblance. Suspecting a computer malfunction, Lorenz checked and rechecked the Royal McBee. It was in perfect working order.
Pausing to ponder, Lorenz was stunned by the truth. For the original calculation, he had entered the initial data to six significant digits; in the repeat calculation, he had done so only to three. Tiny differences in the input were magnified by the system to yield enormous differences in output, a phenomenon known to mathematicians as sensitive dependence on initial conditions, and to the general public as the butterfly effect, thanks to James Gleick's bestseller Chaos (1987). In a nutshell, nonlinear systems can be extraordinarily sensitive to their initial states. Change the initial data a smidgen and you'll get a wholly different solution.
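Lorenz's three nonlinear equations are simple enough to reproduce his accident on any modern machine. The sketch below uses a crude forward-Euler integration; the step size, parameters, and the size of the initial perturbation are illustrative choices of mine, meant to mimic, not reproduce, the six-versus-three-digit episode:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of Lorenz's three coupled nonlinear ODEs
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

# Two runs whose initial data differ only in the fourth decimal place
a = (1.0001, 1.0, 1.0)
b = (1.0,    1.0, 1.0)

max_gap = 0.0
for _ in range(3000):  # integrate to t = 30
    a, b = lorenz_step(a), lorenz_step(b)
    gap = max(abs(p - q) for p, q in zip(a, b))
    max_gap = max(max_gap, gap)

print(f"initial difference: 1e-4; largest later difference: {max_gap:.1f}")
```

The tiny initial discrepancy is amplified until the two trajectories wander the attractor independently, differing by tens of units, which is exactly the divergence that startled Lorenz.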

* * * * *
 
Quantum mechanics rankled Einstein, steeped as he was in the determinism of the philosopher Spinoza. Quanta (subatomic particles) are defiantly probabilistic and therefore non-deterministic.

With the discovery of radioactivity in 1896, probability reared its head in physics. The moment at which a radioactive element decays is inherently unpredictable. One can ascertain the statistics of decay events from a lump of radium with sufficient precision to pin down its half-life exactly. But as to when an individual atom will shed its alpha particle, one is powerless to say.
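The contrast between individual unpredictability and aggregate lawfulness is easy to simulate. In the sketch below each atom's decay moment is drawn from an exponential distribution, with half-life T½ = ln 2 / λ; the decay constant is a hypothetical value of mine, not radium's:

```python
import math
import random

rng = random.Random(42)
lam = 0.1                      # decay constant, per second (hypothetical)
half_life = math.log(2) / lam  # T½ = ln 2 / λ ≈ 6.93 s

# Each atom's decay time is exponentially distributed and individually
# unpredictable; only the aggregate obeys a precise law.
n = 100_000
decay_times = [rng.expovariate(lam) for _ in range(n)]
surviving = sum(1 for t in decay_times if t > half_life)

print(f"half-life: {half_life:.2f} s")
print(f"fraction surviving one half-life: {surviving / n:.3f}")  # ≈ 0.500
```

No individual entry in `decay_times` can be predicted, yet almost exactly half the sample outlives the half-life, just as the statistics demand.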

With the advent of quantum mechanics in the early 1900s, probability was here to stay. Each of the nearly 120 elements that make up the universe emits a unique fingerprint, its visual spectrum, when heated or burned. No one knew why. Niels Bohr's adaptation of Ernest Rutherford's atomic model explained the hitherto mysterious spectral lines, but at the expense of quantizing the orbits of electrons. Each spectral line originates when an electron jumps from one orbit to another, Bohr reasoned. But Bohr's model troubled Rutherford: The leaping electrons seemed "to choose not only the timing of their leap but the destination [orbit] too."

The harder physicists tried to explain away the probabilistic nature of the quantum the more resilient it became. Erwin Schroedinger, a titan of quantum mechanics, grumbled, "If we're going to put up with these damn quantum jumps, I am sorry that I ever had anything to do with quantum theory." Einstein fought amicably with Bohr over the issue for more than two decades, and lost.

Dominating the landscape of quantum mechanics is Werner Heisenberg's uncertainty principle, which quantifies what can and cannot be known about quantum objects. Specifically, one can never know precisely and simultaneously the position and velocity of, say, an electron. You can know where it is but not where it's going. Or you can know where it's going, but not where it is. You can even split the difference, accepting some fuzziness in each. But you can never precisely know its complete initial state.

* * * * *
 
On the shoal of uncertainty, determinism founders. On the reef of chaos, it sinks. By and large, nature has a strong predilection for the nonlinear. Newton's law of gravitation is nonlinear. Maxwell's equations of electrodynamics are not, but they're the exception. General relativity is strongly nonlinear. So are the strong and weak nuclear forces. As a result, so is most chemistry. And because life involves chemistry, life's basic mechanisms are inherently nonlinear.

To accurately predict the evolution of events in a nonlinear universe -- in which the flapping of a butterfly's wing in Brazil literally affects Peoria's weather days later -- Laplace's daemon would need absolute precision in its initial data: the position and velocity of every molecule of air and butterfly. Heisenberg's uncertainty principle forbids just that. In the wonderfully pithy summary of Lorenz: "The present predicts the future, but the approximate present does not predict the approximate future."

The universe is contingent, not deterministic. For sentient beings this implies that the cosmos is also participatory. Individual intentions and actions, however seemingly insignificant, shape the cosmic future.

This blog post is adapted from Chapters 1 and 11 -- "The Clockmaker and the Clockwork" and ''Through the Looking Glass," respectively -- of my recent book Reason and Wonder.

Science's Sacred Cows (Part 2): Absolute Space and Time



Posted: 01/09/2013 2:08 pm

Dave Pruett




"We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances." -- Isaac Newton
 
In January 2nd's post, I asserted:
Science remains most true to itself and of greatest value to humanity when it assiduously avoids unnecessary assumptions. Over the long arc of history, science has initially embraced -- then discarded -- most of the following tacit assumptions: dualism, determinism, reductionism, absolute time, absolute space, the principle of locality, materialism, and most recently, realism. In subsequent posts, we'll examine each ...
Today, let's discuss the notions of absolute space and time.

The publication of Newton's Principia Mathematica in 1687 paved the way for the Age of Reason. Prior to Newton, there were isolated scientists -- Archimedes, Da Vinci, and Galileo, for example -- but there was not yet science. In a single stroke, Newton laid solid foundations for scientific methodology by combining inductive reasoning to infer general laws from experimental observations, mathematical formalism to state those laws concisely, logic to deduce new laws, and deductive reasoning to make predictions based upon those laws.

Principia's first volume formalizes the science of motion. At the outset, Newton assumes time and space to be absolutes. Regarding time, he writes: "Absolute, true, and mathematical time, of itself ... flows equably without relation to anything external." Similarly for space. In today's lingo, we might say that Newton viewed Euclidean space as a fixed stage upon which physical events occur, that time flows the same for every observer, and that time and space are independent.

And so it remained until 1905, when a lowly patent official in Bern, Switzerland, examined patent applications by day and plotted the overthrow of Newtonian mechanics by night. Albert Einstein, then 26 years of age, noticed something about the nature of light that the titans of physics had overlooked.

In 1862, the principles of electromagnetism had joined Newtonian mechanics in the Pantheon of classical physics. The brainchild of Scottish physicist James Clerk Maxwell, Maxwell's equations describe the interaction of electricity and magnetism. Because all manner of phenomena -- light, electricity, and radio waves among them -- are electromagnetic in origin, Maxwell's equations are astoundingly practical.

Maxwell's equations take many forms, all equivalent. When they are expressed in so-called Gaussian units, the velocity of light in vacuo, symbolized by c, appears as a universal constant. In this ostensibly innocuous fact, Einstein sensed that the world is not as it seems.

The long-accepted principle of relativity (not to be confused with the theory of relativity) held that the mathematical expressions of physical laws must retain the same form in all inertial frames of reference (that is, in all non-accelerating coordinate systems). However, the appearance of c as a constant in the equations of electrodynamics suggested that the velocity of light must be independent of the reference frame in which that velocity is measured.

Imagine having a peripatetic friend whom you frequently encounter during travels. No matter where you meet your friend, he always passes you by at a velocity of, say, 7 mph. Whether you are flying at 600 mph, walking at 3 mph, or biking at 20 mph, you always measure your friend's velocity at exactly 7 mph in the coordinate system that travels with you. How strange! But that's how light behaves, albeit at the blazing velocity of 186,000 miles per second.

Familiar moving objects -- baseballs, trains, planes, etc. -- don't behave this way. For example, moving sidewalks expedite pedestrian traffic along airport concourses because the velocity of the sidewalk relative to the concourse, say 2 mph, and the velocity of the traveler relative to the sidewalk, say 3 mph, add to yield 5 mph relative to the concourse, a combined rate at which kiosks and sports bars whiz by. But light's velocity does not add to that of its reference frame.
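The contrast between everyday velocity addition and light's behavior can be sketched numerically. Einstein's composition law, w = (u + v) / (1 + uv/c&#178;), is standard special relativity; the particular speeds below are illustrative values echoing the examples in the text, converted to miles per second.

```python
# Galilean (classical) velocity addition versus Einstein's
# relativistic composition law. All velocities in miles per second.

C = 186000.0  # speed of light, miles per second (approximate)

def galilean_add(u, v):
    """Classical rule: velocities simply add."""
    return u + v

def relativistic_add(u, v, c=C):
    """Einstein's rule: w = (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1.0 + u * v / c**2)

# Everyday speeds (the 2 mph sidewalk plus the 3 mph walker,
# converted to mi/s): the two rules agree to many decimal places.
sidewalk, walker = 2.0 / 3600.0, 3.0 / 3600.0
print(galilean_add(sidewalk, walker), relativistic_add(sidewalk, walker))

# But light composed with any frame velocity still comes out at c:
print(relativistic_add(C, 0.5 * C))  # exactly C
```

At pedestrian speeds the correction term uv/c&#178; is vanishingly small, which is why classical addition works so well in daily life; only at velocities approaching c does the denominator matter, and for light itself the composed velocity is always exactly c.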

That the subtler implications of Maxwell's equations had escaped the notice of virtually all physicists explains their rapt attention to the Michelson-Morley experiment of 1887. Believing that light -- like sound -- needed a medium in which to propagate, physicists hypothesized the existence of the aether, a weightless, frictionless substance filling the void of space. It was further presumed that the aether remained stationary in an absolute frame of reference, Newton's absolute space still in vogue. Physicists believed that Michelson and Morley would detect slight differences in the velocity of light measured from different directions (i.e. frames), allowing them to extract from these differences the "aether drift" of the earth, the absolute velocity of the earth relative to the stationary aether.

The experiment failed abjectly. The velocity of light was maddeningly consistent. Measurements taken at differing times of day or year and differing orientations of the apparatus showed no appreciable differences in c. Michelson and Morley concluded tersely, "... the result of the hypothesis of stationary aether is thus shown to be incorrect."

Through clever thought experiments, Einstein reasoned that the independence of c from its reference frame must imply -- astonishingly -- the relativity of time: two observers in different frames see one another's clocks ticking at different rates. The "moving" clock is observed to tick more slowly, and the greater the velocity difference of the frames, the greater the discrepancy in the flow of time.

Relativistic time dilation is ordinarily minuscule, and so it escaped notice until the 20th century when the advent of the cesium clock made possible the measurement of time to 14 digits of precision. Using two such clocks in 1971 -- one on earth and one on a round-the-world flight -- two physicists, Joseph Hafele and Richard Keating, confirmed the time dilation predicted by Einstein's theory of relativity.
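The size of the effect can be estimated from the Lorentz factor, gamma = 1 / sqrt(1 - v&#178;/c&#178;), which gives the ratio of the two clock rates. The airliner speed and flight time below are illustrative round numbers, not the actual Hafele-Keating flight profile, and the sketch ignores the gravitational (general-relativistic) contribution that their full analysis also had to include.

```python
# Order-of-magnitude estimate of velocity time dilation for a clock
# flown around the world on a jet airliner (illustrative numbers).

import math

C = 299_792_458.0  # speed of light, m/s

def gamma(v, c=C):
    """Lorentz factor: ratio of stationary to moving clock rates."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

v_plane = 250.0               # m/s, roughly a jet airliner's cruise speed
flight_time = 48 * 3600.0     # a two-day round-the-world trip, in seconds

# Accumulated lag of the moving clock relative to the ground clock:
lag = flight_time * (gamma(v_plane) - 1.0)
print(f"moving clock lags by about {lag * 1e9:.1f} nanoseconds")
```

The lag works out to tens of nanoseconds over the whole trip, which is precisely why nothing short of a cesium clock, good to 14 digits, could detect it.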

Time's relativity implies the relativity of space as well, and the interdependence of space and time. Although Hermann Minkowski, Einstein's mathematics professor, once characterized his wayward student as a "lazy dog" for cutting classes, he was smitten by the student's theory. "Henceforth space by itself and time by itself are doomed to fade away into mere shadows," Minkowski enthused, "and only a kind of union of the two will preserve an independent reality."

Newton's assumptions of absolute space and time were reasonable in his era and necessary for the development of classical physics, but relativity forced their abandonment. In the next post, we'll examine the demise of determinism.

(This article is adapted from Chapter 6 -- "A Wrinkle in Time" -- of the author's recent book Reason and Wonder.)

Science's Sacred Cows (Part 1)


Posted: 01/02/2013 6:07 pm

Dave Pruett



In a 1983 address to an international symposium on Galileo, Pope John Paul II issued a stunning pronouncement:
The Church is convinced that there can be no real contradiction between science and faith. ... It is certain that science and faith represent two different orders of knowledge, autonomous in their processes, but finally converging upon the discovery of reality in all its aspects...
Given centuries of animosity between science and religion, the pontiff's admission astounds for several reasons. First, it stresses the complementarity rather than the antagonism of rational and intuitive modes of knowing. Second, it grants autonomy to both revelatory processes, implying that neither should seek to manipulate or triumph over the other. And third, it suggests that ultimate truth -- so far as we can know it -- emerges from the concerted efforts of external and internal explorations.

But the devil is in the details. Autonomy among those in relationship is best preserved when each party maintains a clear and robust boundary and a high degree of integrity. I'll defer to the philosophers to painstakingly demarcate the domains of science and religion, but one thing is certain: Most of the historic animosity between them is due to boundary infractions. And both parties are guilty.

The violations of science's domain by religion are numerous, well known and egregious. Particularly odious was the church's burning of Giordano Bruno at the stake in 1600 for multiple "heresies" that included the promotion of Copernicanism (the idea that the Earth orbits the Sun rather than vice versa), a suspicion that the stars are suns like our own and a belief in the plurality of worlds. Close on the heels of Bruno's demise came the trial of Galileo in 1632-33, in which the Inquisition convicted the world's most eminent scientist of heresies "more scandalous, more detestable, and more pernicious to Christianity than any contained in the books of Calvin, of Luther, and of all other heretics put together." Galileo's life was spared when he signed a confession recanting the "heresy" of Copernicanism; however, he remained under house arrest for the duration of his life.

Skirmishes between science and religion persist. Today's religious fundamentalists periodically attempt to force the teaching of creationism (or one of its many guises) in public schools, in violation both of science's domain and the constitutional separation of church and state. For a short summary of the most recent major skirmish, the 2005 federal court case Kitzmiller v. Dover Area School District, see pages 89-90 of Jason Rosenhouse's Among the Creationists (Oxford, 2012).

Science's infractions are subtler but equally damaging to the human spirit. During an enlightening lecture in 2000 by religion scholar Huston Smith, I began to appreciate how science infringes on religion's domain. Smith thoughtfully distinguished science from scientism. The former is an investigative protocol; the latter is a religion, complete with dogma. Science is a formalized procedure for making sense of the world by studying its material properties, perceived through the awareness of the senses, albeit senses heightened by modern marvels such as the electron microscope, the Hubble Space Telescope or the Chandra X-Ray Observatory. Scientism (or scientific materialism), on the other hand, adds to science a statement of faith: The universe is only material. Moreover, given the spectacular successes of science over the past three centuries, it is more than fair to acknowledge that science represents a powerful way to learn about the world. But scientism ups the ante: Science is the best (or only) way to make sense of the world. In short, scientism is to science what fundamentalism is to religion: cocksure and inflexible.

Science remains most true to itself and of greatest value to humanity when it assiduously avoids unnecessary assumptions. Over the long arc of history, science has initially embraced -- then discarded -- most of the following tacit assumptions: dualism, determinism, reductionism, absolute time, absolute space, the principle of locality, materialism and, most recently, realism. In subsequent posts, we'll examine each of these in some detail. For now, let's summarize.

Despite the demise of most of its once-sacred cows, science remains alive and well, implying that the assumptions abandoned were never essential.

Unwarranted assumptions -- blinders, really -- may have been necessary to the methodical progress of science, but ultimately they squelch open inquiry. Indeed, all of science may rest upon a single inviolate assumption: The same physical laws apply throughout the cosmos. Why not leave it there (at least for now)?

Ultimately, science and religion should serve rather than dominate the human societies from which they emerged. Each, I believe, serves best from a stance of awe and humility that assumes as little as possible. The best from both worlds -- the greatest scientists and the most profound religious thinkers and teachers -- have always practiced these two qualities. Childlike awe motivated Einstein. "All our knowledge is but the knowledge of schoolchildren," he conceded. "The real nature of things, that we shall never know, never." Similarly, the German Jesuit theologian Karl Rahner invoked both humility and awe when he asked, "Which do we love more, the small island of our so-called knowledge or the sea of infinite mystery?"

This essay is adapted from the author's recent book Reason and Wonder (Praeger, 2012).