FAIR USE NOTICE

A BEAR MARKET ECONOMICS BLOG

OCCUPY THE SCIENTIFIC METHOD


This site may contain copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in an effort to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in section 107 of the US Copyright Law.

In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml

If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.


All Blogs licensed under Creative Commons Attribution 3.0

Thursday, March 31, 2011

U.S. finds radioactive iodine in milk

Reuters


U.S. finds tiny amount of radiation in milk

WASHINGTON | Thu Mar 31, 2011 11:23am EDT

WASHINGTON (Reuters) - A trace amount of radioactive iodine, well below levels of public health concern, has been detected in milk from the state of Washington as the U.S. monitors radiation levels amid the nuclear crisis in Japan, U.S. regulators said on Wednesday.

"These types of findings are to be expected in the coming days and are far below levels of public health concern, including for infants and children," the Food and Drug Administration and the Environmental Protection Agency said in a joint statement.

Testing of the milk sample showed 0.8 picocuries per liter (pCi/L) of iodine-131, a radioactive form of iodine.

Although milk contains naturally occurring radiation, iodine-131 is not normally found in it. The agencies stressed that the detected level was 5,000 times lower than the FDA's standard, known as the "defined intervention level."

"These findings are a minuscule amount compared to what people experience every day," FDA scientist Patricia Hansen said in a statement.

The EPA said it has increased radiation monitoring in U.S. milk, precipitation and drinking water in response to radiation leaks at Japan's Fukushima Daiichi nuclear plant, which was damaged by the massive 9.0 earthquake on March 11 and the huge tsunami that followed.

The agencies said iodine-131 has a very short half-life of approximately eight days, and the level detected in milk and milk products was therefore expected to drop relatively quickly.
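
As a back-of-the-envelope illustration (mine, not part of the Reuters report), the sketch below applies the standard half-life decay formula to the 0.8 pCi/L figure cited above, showing why an eight-day half-life means the level fades quickly:

```python
# Simple exponential decay: activity halves every half-life (~8 days for iodine-131).
# The 0.8 pCi/L starting value is the level reported above; this assumes no further
# deposition of fresh iodine-131.

HALF_LIFE_DAYS = 8.0

def remaining_activity(initial_pci_per_l: float, days: float) -> float:
    """Activity left after `days` days of radioactive decay."""
    return initial_pci_per_l * 0.5 ** (days / HALF_LIFE_DAYS)

for d in (0, 8, 16, 32, 80):
    print(f"day {d:3d}: {remaining_activity(0.8, d):.4f} pCi/L")
# After ten half-lives (~80 days), less than 0.1% of the original activity remains.
```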

Contaminated milk is a worry after a nuclear accident because toxic levels of radioactive iodine can get into rainwater and feed that is ingested by cows and taken up in their milk. Contaminated milk was one of the biggest causes of thyroid cancers after the nuclear accident in Chernobyl because people near the plant kept drinking milk from local cows.

Iodine-131 is a threat to human health because it goes immediately to the thyroid gland, where it can cause cancer. Experts say thyroid cancer is generally considered non-fatal because treatments are so effective.

Wednesday, March 30, 2011

WHAT SCIENTIFIC CONCEPTS WOULD IMPROVE EVERYBODY'S COGNITIVE TOOLKIT?

March 29, 2011, 1:40 PM

More Tools For Thinking

In Tuesday’s column I describe a symposium over at Edge.org on what scientific concepts everyone’s cognitive toolbox should hold. There were many superb entries in that symposium, and I only had space to highlight a few, so I’d like to mention a few more here.

Before I do, let me just recommend that symposium for the following reasons. First, it will give you a good survey of what many leading scientists, especially those who study the mind and society, are thinking about right now. You’ll also be struck by the tone. There is an acute awareness, in entry after entry, of how little we know and how complicated things are. You’ll come away with a favorable impression of the epistemological climate in this subculture.

__________

Here, though, are a few more concepts worth using in everyday life:

Clay Shirky nominates the Pareto Principle. We have the idea in our heads that most distributions fall along a bell curve (most people are in the middle). But this is not how the world is organized in sphere after sphere. The top 1 percent of the population control 35 percent of the wealth. The top 2 percent of Twitter users send 60 percent of the messages. The top 20 percent of workers in any company will produce a disproportionate share of the value. Shirky points out that these distributions are regarded as anomalies. They are not.
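
As an illustration of Shirky's point (my addition, not from Brooks's post), the short sketch below samples a heavy-tailed Pareto distribution and measures how much of the total the top 1 percent and top 20 percent hold:

```python
# Illustrative only: draw samples from a heavy-tailed Pareto distribution and see
# how concentrated the totals are, in contrast to a bell curve where most values
# cluster in the middle.

import random

def top_share(values, fraction):
    """Fraction of the total held by the top `fraction` of values."""
    ranked = sorted(values, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

random.seed(0)
wealth = [random.paretovariate(1.16) for _ in range(100_000)]  # alpha ~1.16 gives a rough 80/20 split

print(f"top 1%  holds {top_share(wealth, 0.01):.0%} of the total")
print(f"top 20% holds {top_share(wealth, 0.20):.0%} of the total")
```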

Jonathan Haidt writes that "humans are the giraffes of altruism." We think of evolution as a contest for survival among the fittest. Too often, "any human or animal act that appears altruistic has been explained away as selfishness in disguise." But evolution operates on multiple levels. We survive because we struggle to be the fittest and also because we are really good at cooperation.

A few of the physicists mention the concept of duality, the idea that it is possible to describe the same phenomenon truthfully from two different perspectives. The most famous duality in physics is the wave-particle duality, which states that matter has both wave-like and particle-like properties. Stephon Alexander of Haverford says that these sorts of dualities are more common than you think, beyond, say, the world of quantum physics.

Douglas T. Kenrick nominates "subselves." This is the idea that we are not just one personality, but we have many subselves that get aroused by different cues. We use very different mental processes to learn different things and, I’d add, we have many different learning styles that change minute by minute.

Helen Fisher, the great researcher into love and romance, has a provocative entry on "temperament dimensions." She writes that we have four broad temperament constellations. One, built around the dopamine system, regulates enthusiasm for risk. A second, structured around the serotonin system, regulates sociability. A third, organized around the prenatal testosterone system, regulates attention to detail and aggressiveness. A fourth, organized around the estrogen and oxytocin systems, regulates empathy and verbal fluency.

This is an interesting schema to explain temperament. It would be interesting to see others in the field evaluate whether this is the best way to organize our thinking about our permanent natures.

Finally, Paul Kedrosky of the Kauffman Foundation nominates "Shifting Baseline Syndrome." This one hit home for me because I was just at a McDonald’s and guiltily ordered a Quarter Pounder With Cheese. I remember when these sandwiches were first introduced and they looked huge at the time. A quarter pound of meat on one sandwich seemed gargantuan. But when my burger arrived and I opened the box, the thing looked puny. That’s because all the other sandwiches on the menu were things like double quarter pounders. My baseline of a normal burger had shifted. Kedrosky shows how these shifts distort our perceptions in all sorts of spheres.

There are interesting stray sentences throughout the Edge symposium. For example, one writer notes, "Who would be crazy enough to forecast in 2000 that by 2010 almost twice as many people in India would have access to cell phones than latrines?"



THE NEW YORK TIMES
March 29, 2011

OP-ED COLUMNIST

TOOLS FOR THINKING
By David Brooks

Science offers some help in the everyday as we navigate the currents of this world.

A few months ago, Steven Pinker of Harvard asked a smart question: What scientific concept would improve everybody's cognitive toolkit?

The good folks at Edge.org organized a symposium, and 164 thinkers contributed suggestions. John McWhorter, a linguist at Columbia University, wrote that people should be more aware of path dependence. This refers to the notion that often "something that seems normal or inevitable today began with a choice that made sense at a particular time in the past, but survived despite the eclipse of the justification for that choice. ...

... Daniel Kahneman of Princeton University writes about the Focusing Illusion, which holds that "nothing in life is as important as you think it is while you are thinking about it." He continues: "Education is an important determinant of income — one of the most important — but it is less important than most people think. If everyone had the same education, the inequality of income would be reduced by less than 10 percent. When you focus on education you neglect the myriad of other factors that determine income. The differences of income among people who have the same education are huge." ...

... Public life would be vastly improved if people relied more on the concept of emergence. Many contributors to the Edge symposium hit on this point.

We often try to understand problems by taking apart and studying their constituent parts. But emergent problems can't be understood this way. Emergent systems are ones in which many different elements interact. The pattern of interaction then produces a new element that is greater than the sum of the parts, which then exercises a top-down influence on the constituent elements.

Culture is an emergent system. A group of people establishes a pattern of interaction. And once that culture exists, it influences how the individuals in it behave. An economy is an emergent system. So is political polarization, rising health care costs and a bad marriage.

Emergent systems are bottom-up and top-down simultaneously. They have to be studied differently, as wholes and as nested networks of relationships. We still try to address problems like poverty and Islamic extremism by trying to tease out individual causes. We might make more headway if we thought emergently.

We'd certainly be better off if everyone sampled the fabulous Edge symposium, which, like the best in science, is modest and daring all at once. 



[continue to David Brooks's New York Times Op-Ed Column...]

Tuesday, March 29, 2011

Nuclear Power — Not Now, Not Ever

Dissident Voice: a radical newsletter in the struggle for peace and social justice

Any citizen with even a casual awareness of the public debate over nuclear power is familiar with the usual talking points, pro and con, regarding this issue: safety, costs, environmental impacts, etc. I will not burden the reader with a rehash of these familiar issues.

Instead, I propose to enrich the debate with some issues the general public might be less familiar with, all of which lead strongly to the conclusion that electric power generation from nuclear reactors should be phased out with deliberate speed and the technology abandoned — permanently.

This essay consists of three sections: First of all, the recent disaster at the Fukushima nuclear plant in Japan urgently brings the science of plate tectonics into the debate, and raises the question of whether the promoters of nuclear power are willing and able to take the long-term implications of that technology into consideration as they select sites for these facilities.

In the second section, we ask whether it is possible to accurately and reliably assess the safety of nuclear reactors. A failed attempt to do so thirty years ago suggests that such an assessment is impossible, not simply because of a lack of scientific knowledge and technological capacity, but more fundamentally, because of the insurmountable inability to anticipate all possible circumstances that might occur in the operation of the plant.

Finally, these and other considerations lead to the conclusion that nuclear power is not economically viable and sustainable without massive government subsidies that are unavailable to its competing technologies.

Fukushima: A Disaster Waiting to Happen

What were Tokyo Electric Power Co. (TEPCO) and General Electric thinking when they decided to site the world’s largest nuclear power complex at Fukushima, on the eastern coast of Northern Japan?

Perhaps they weren’t thinking at all, or at least they were thinking only about the short term. Myopia is endemic to the corporate mind, which is dedicated to an early return on investment. “In the long run,” John Maynard Keynes famously remarked, “we are all dead.”

Nonetheless, a disastrous earthquake followed by a tsunami was certain to happen along the eastern coast of Japan. Not a question of if, but of when. That certainty was ordained by the science of plate tectonics and validated in the geological record.

The sword of Damocles hanging over Fukushima is the Japan Trench, located about 100 miles due east of, and parallel to, the coastline where the plant is located.

The trench is a subduction zone, where the Pacific plate dives down under the Okhotsk plate and into the mantle. The Japanese islands, like the Marianas and the Aleutians, owe their very existence to subduction which, as it grinds along, produces great earthquakes and tsunamis.

Tsunamis can be produced by volcanoes and landslides. But they most reliably occur along subduction zones, as the ocean floor is suddenly and violently jolted during an earthquake, causing a pulse of water to move outward and perpendicular to the fault line. The Indonesian tsunami of December 26, 2004, which killed almost a quarter of a million people, was caused by a magnitude 9.1 earthquake along a subduction zone about 100 miles west of Sumatra. Among other noteworthy subduction quakes/tsunamis are the “Good Friday” Alaska earthquake in 1964 (magnitude 9.2), and the Chilean earthquake of 2010 (magnitude 8.8).

And so, because the Japan Trench is parallel to the coast of northern Japan, the tsunami was aimed directly at that coast.

Because of the dynamics of plate tectonics, earthquake/tsunamis are endemic to Japan. For example, in 1923 a magnitude eight earthquake struck central Japan, leveling the city of Yokohama and destroying more than half of Tokyo, at the cost of about 100,000 lives.

The investors in the Fukushima plant knew all this, and yet they went ahead and built a facility that was designed to withstand a magnitude seven earthquake. (The Richter magnitude scale is not linear; it is logarithmic. Accordingly, the energy released in a magnitude nine quake is not two-ninths greater than that of a magnitude seven; it is about a thousand times greater.) TEPCO continued to operate the facility, despite warnings from the International Atomic Energy Agency.
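
To check the author's "about a thousand times greater" figure, here is a minimal sketch using the standard Gutenberg-Richter energy-magnitude relation (textbook seismology, not a formula from the essay):

```python
# Seismic energy grows roughly as log10(E) = 1.5 * M + const, so only the
# magnitude difference matters when comparing two quakes.

def energy_ratio(m_large: float, m_small: float) -> float:
    """How many times more energy a quake of m_large releases than one of m_small."""
    return 10 ** (1.5 * (m_large - m_small))

print(energy_ratio(9.0, 7.0))  # 1000.0 -- a magnitude 9 releases ~1,000x the energy of a magnitude 7
print(energy_ratio(9.0, 8.0))  # ~31.6  -- each whole step is roughly a 32-fold jump in energy
```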

To put the matter bluntly, the investors and designers of Fukushima gambled that during the operational lifetime of the plant, there would be no earthquake greater than magnitude seven. They gambled, and the people of northern Japan lost. Economists call this loss an “externality.”

In California, two commercial nuclear power facilities, at San Onofre between San Diego and Los Angeles and at Diablo Canyon near San Luis Obispo, are located along the Pacific coast and near seismically active faults. As a resident of southern California, I must wonder if the operator of the San Onofre plant, Southern California Edison, like TEPCO in Japan, is likewise gambling with my life and the lives of my neighbors. Heads they win, tails we lose.

And earthquakes and tsunamis are not the only, or even the greatest, threat posed by nuclear power reactors. The Three Mile Island accident was caused by a mechanical failure, and the Chernobyl disaster was caused by human error.

Building a nuclear power complex along a shoreline opposite a subduction zone is risky. That fact is a “known known.” How risky? That is an unknowable unknown. Any attempt to assess the risk, or for that matter the risk associated with any and all nuclear power plants, is almost certain to underestimate that risk. A reliable and accurate assessment of the risk of a failure of a nuclear power reactor is unobtainable, now and forever.

These are bold assertions that I will endeavor to demonstrate below. To do so, we will examine an ambitious and massive attempt, some thirty years ago, to assess the safety of nuclear power plants, and its subsequent spectacular failure to achieve that objective. Because the reasons for that failure remain valid today, this is a tale well worth retelling in the light of the disaster at Fukushima and in the face of the determination of the Obama Administration, despite that disaster, to proceed with the construction of the first new nuclear power plants in thirty years.

Reactor Safety: The Rasmussen Report Revisited

Concerned about public criticism of their nuclear energy ambitions, the promoters of commercial atomic energy at the Atomic Energy Commission (AEC) initiated the “Reactor Safety Study” in 1972, which was to become known as “The Rasmussen Report,” after its director, Norman Rasmussen of the Massachusetts Institute of Technology. In August 1974, the draft Report was released with much fanfare in a public-relations extravaganza that prompted one newspaper to proclaim: “Campaigners Against Nuclear Power Stations Put to Rout.” Following this triumphant entrance, scrupulous scientific assessment began behind the facade, after which it was all downhill for the Report. The AEC’s successor organization, the Nuclear Regulatory Commission (NRC), quietly withdrew its endorsement of the Rasmussen Report in January 1979.

Rushed into print to provide support for a renewal of the Price-Anderson Act (a federally mandated limit on industry liability following a nuclear reactor failure), an eighteen-page “Executive Summary” of the final Report was distributed to Congress and the press in October 1975, in advance of the release of the full, 2,300-page Report.

Perhaps the most famous item of the Executive Summary was the claim that the chances of being killed by a nuclear power plant “transient” are about equal to those of being killed by a meteorite. This mind-catching statistic has proven to have a longevity far exceeding that of the Report which spawned it. In general, the Summary concluded that:

…The likelihood of reactor accidents is much smaller than that of many non-nuclear accidents having similar consequences. All non-nuclear accidents examined in this study including fires, explosions, toxic chemical releases, dam failures, airplane crashes, earthquakes, hurricanes and tornadoes, are much more likely to occur and can have consequences comparable to, or larger than, those of nuclear accidents.

Closer examination revealed a startling discrepancy between the cheerful reassurances of the Executive Summary and the nine volumes of technical information. In his splendid book, The Cult of the Atom (Simon and Schuster, 1982), based upon tens of thousands of pages of AEC documents pried loose by the Freedom of Information Act, Daniel Ford observes that:

As one moves from the very technical material … to the Executive Summary … a change of tone as well as of technical content is evident. In the “back” of the study, there are cautionary notes, discussion of uncertainties in the data, and some sense that there may be important limitations to the results. The qualifications successively drop away as one moves toward the parts of the study that the public was intended to see. In the months following the study’s completion, the honesty of the official summary … became the most controversial issue.

The reassuring conclusions of the Rasmussen Report were based upon numerous highly questionable assumptions and methodologies. Among them:

  • By definition, the report estimated damage and casualties due to anticipated events. There is no clear acknowledgment that all possible significant events were not, and could not be, covered by the study. As it turned out, the near-disaster at Three Mile Island was just one of several “unanticipated” events. And as noted above, a magnitude nine earthquake was not anticipated by the designers of the Fukushima plant.
  • In fact, whole categories of failures were excluded from the risk estimates. For example, it was assumed that back-up safety systems would always operate in case of the failure of a primary system. Given this assumption, the risk of a catastrophic accident would be the product of the probabilities of the independent failure of both systems, and thus highly unlikely (see the sketch after this list). However, this discounted the possibility of a “common-mode failure,” such as that at Browns Ferry, Alabama, in 1975 (soon after the release of the Report), where, due to faulty design, an accidental fire disabled both systems at once — yet another event excluded by the Rasmussen rules. Similarly, the Japanese earthquake and tsunami of March 11, 2011 disabled both the primary and backup safety systems at the Fukushima facility.
  • The Report focused on mechanical and equipment failures, and discounted design flaws and “human error,” as if these were in some sense insignificant. Also overlooked was the possibility of sabotage and terrorism.
  • The report adopted the so-called “fault-tree” method of analysis, described by the Report as “developed by the Department of Defense and NASA … [and] coming into increasing use in recent years.” Not so. As Daniel Ford reports, “long before [Rasmussen] adopt the fault-tree methods … the Apollo program engineers had discarded them.” [146] As a retired professor of engineering recently explained to me: “the simulation or probability tree … analyses … are used to locate the weak links in your design, given the possible sources of failure that you know of or can specify… [However, the analyses] are not meant to yield a credible probability of failure, but instead yield at best a lower bound for that probability.” (EP emphasis)
  • The “probabilities” assigned to the component “events” in the “fault tree,” leading to a hypothetical failure, were based upon almost pure speculation, since the technology was new and the evaluators lacked any precedents upon which to base probability assessments. (Both Rasmussen himself and his Report admitted as much; Ford, 138, 141.) Thus, because the Report was fundamentally an advocacy document, its pro-nuclear investigators had license to concoct unrealistically low risk assessments.
  • These “low risk estimates” in the Executive Summary were startling, to say the least: “non-nuclear events,” it claimed, “are about 10,000 times more likely to produce large numbers of fatalities than nuclear plants.” But the footnote to this statement gave it away, when it added that such “fatalities … are those that would be predicted to occur within a short period of time” after the accident. However, few fatalities due to radiation exposure are “short-term.” In fact, as physicist Frank von Hippel pointed out, a careful reading of the voluminous technical material would disclose that for every ten “early deaths” conceded in the Summary, the same accident would cause an additional seven thousand cancer deaths. (Ford, 170) This was only one of several scandalous discrepancies between the “public” Executive Summary and the technical material in the Report, which led Morris Udall, then Chair of the Subcommittee on Energy and the Environment, to demand a new Executive Summary. The NRC refused.
  • The “peer review” of the Report was perfunctory at best. The reviewers were given eleven days to assess an incomplete 3,000-page draft report — a schedule virtually designed to yield invalid assessments. Even so, many of the referees returned withering criticisms, especially of the statistical methods employed by the studies. The findings of this review group were not released by the AEC or the NRC, and the published Report was unaltered by these criticisms.
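
To make the independence assumption criticized in the second bullet concrete, here is a minimal sketch; the probabilities are purely illustrative, not figures from the Rasmussen Report:

```python
# Under the Rasmussen-style assumption, primary and backup safety systems fail
# independently, so the chance of losing both is the product of their individual
# failure probabilities. A common-mode event (fire, flood, earthquake) can disable
# both at once, and that single term can then dominate the real risk.

p_primary = 1e-3       # illustrative per-demand failure probabilities only
p_backup = 1e-3
p_common_mode = 1e-4   # chance of one event taking out both systems together

p_both_independent = p_primary * p_backup               # 1e-6: looks negligible
p_both_realistic = p_both_independent + p_common_mode   # ~1e-4: roughly 100x higher

print(f"independent-failure estimate: {p_both_independent:.1e}")
print(f"with a common-mode term:      {p_both_realistic:.1e}")
```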

These and numerous other flaws in the study led one critic to wryly comment that “the chance of the Rasmussen Report being substantially wrong is somewhat more than the chance of your being hit by a meteorite.”

Though the general public was much impressed by the public relations show orchestrated by the AEC, informed professional investigators immediately began to erode the Report's credibility. Among these were the Bulletin of the Atomic Scientists, the Union of Concerned Scientists, and, most significantly, an independent panel set up by the American Physical Society and chaired by Harold Lewis of the University of California, Santa Barbara. Each of these returned severe criticisms of the Report.

All this bad news eventually led the Reactor Safety Study into the halls of Congress. Daniel Ford describes what followed:

In some cases [congressional] members and staff probed the issues [of reactor safety] carefully, prepared detailed follow-up reports, and tried to bring about needed reforms. Congressman Morris Udall’s Subcommittee on Energy and the Environment, for example, held extensive hearings on the validity of the Reactor Safety Study. His protests about the misleading manner in which the report’s findings were presented to the public forced the NRC, in January 1979, to repudiate the results of the study. (p. 226)

And so, at length, the relentless discipline of science and scholarship, combined with a rare display of uncompromising congressional oversight investigation, brought about the downfall of the AEC/NRC “Reactor Safety Study.”

The NRC’s “withdrawal of endorsement” stood in stark contrast to its release, scarcely four years earlier. This time there were no publicity releases, media interviews or press conferences. It was hoped that the announcement would go unnoticed amidst the usual gross output of news out of Washington. Given the widespread public opposition to nuclear power, this expectation was bound to be frustrated.

In the end, the Rasmussen Report was yet another attempt at justification of “the peaceful atom” which backfired on the proponents. Historians looking back on this technological extravaganza may note, with some bewilderment, that however severe the attacks by the critics, commercial nuclear power was, in this case at least, inadvertently done in by its defenders.

Nuclear Power Fails the Free Market Test

Still more substantial objections to nuclear power have been raised by scientists and engineers much more qualified than I am. So I will not repeat them here. (To read these objections, google “Physicians for Social Responsibility,” “Union of Concerned Scientists,” “Natural Resources Defense Council,” and “The Rocky Mountain Institute.”) However, in closing, a few additional concerns are worthy of mention.

(1) First of all, every source of electric power, with the exception of nuclear power, “fails safe.” A failure at a coal-fired plant would, at worst, destroy the plant. But the damage would be localized and short-term. Failures at a wind-farm or solar facility are trivial. However, the damage caused by a nuclear meltdown and radiation release endures for millennia and can render huge areas permanently uninhabitable, as they have in Ukraine and Belarus due to the Chernobyl disaster, and as they likely will in Japan following the Fukushima catastrophe.

(2) Nuclear industry assurances as to the safety of their facilities are flatly refuted by their unwillingness to fully indemnify the casualty and property losses that would result from a catastrophic release of radiation from a nuclear accident. Since 1957, the Price-Anderson Act has set a limit on the liability that private industry must pay in the event of an accident. The amount of that limit, originally $560 million for each plant, has been routinely revised, so that as of 2005 the limit is $10.8 billion for each incident. Clearly, the Fukushima disaster will exact a cost far exceeding that amount. Were such an event to occur in the United States, any cost in excess of ten billion dollars would be borne by the victims and by the taxpayers. The contradiction is stark: the nuclear industry and its enablers in the NRC tell the public that nuclear energy is safe. And yet, at the same time, they are unwilling to back up these assurances with a full indemnification of their facilities.

(3) The public has not been adequately informed of the ongoing hazards of nuclear power. For example, the Union of Concerned Scientists reports that in the past year there were fourteen “near misses” among the 104 nuclear plants operating in the United States. And according to the Washington Post (March 24), the Nuclear Regulatory Commission has disclosed that “A quarter of U.S. nuclear plants [are] not reporting equipment defects.”

(4) The widely-heard claim that “nobody in the United States has ever died due to commercial nuclear power” utilizes “the fallacy of the statistical casualty.” Specific cancer deaths due to artificial nuclear radiation are, of course, indistinguishable from cancer deaths due to other causes. Yet epidemiological studies show, beyond reasonable doubt, that some deaths are attributable to artificial radiation. The inference from “no identifiable specific deaths” to “no deaths whatever” is a fallacy made infamous by the tobacco industry’s successful defense against suits filed by injured smokers or their surviving families.

(5) The claim that nuclear power is the “safest” source of energy commits the “fallacy of suppressed evidence.” Such a claim pretends that the risk of nuclear power is confined to the radiation risks adjacent to a normally operating plant and immediately following each “event.” Usually excluded from such assessments are deaths and injuries involved in the mining, milling, processing, shipment, reprocessing, storage and disposal of fuel — in short, the entire “fuel cycle.”

(6) Similarly, the claim that nuclear power is the “cheapest” power available is likewise based upon “the fallacy of suppressed evidence.” Specifically, nuclear proponents arrive at this conclusion by “externalizing” (i.e., failing to include) such costs as government subsidies for research and development, the costs of disposing of wastes, the cost of decommissioning facilities, and, again, the cost of risks to human life, health and property. As noted above, the risk factor is excluded due to the Price-Anderson Act and the failure to acknowledge “statistical casualties.” Once all these “externalized costs” are included, nuclear power adds up to the most expensive energy source, hands down. Over fifty years of industry research, development and operation have not altered this fact. Meanwhile, as R & D of alternative energy sources progresses and economies of scale kick in, the costs of solar, wind, tide, geothermal and biomass energy continue to fall. (See UCS, “Nuclear Power Subsidies: The Gift that Keeps on Taking,” and Amory Lovins, “With Nuclear Power, ‘No Acts of God Can Be Permitted.’”)

Because of considerations such as these, no nuclear plants have been commissioned since the completion in 1985 of the Diablo Canyon facility along the central coast of California. The Obama Administration is prepared to change all this, as the President has announced $8 billion in federal loan guarantees to allow the building of the first nuclear power plant since Diablo Canyon.

Without this “federal intervention,” along with the Price-Anderson liability cap, no new nuclear plants would be built. The “free market” would not allow it. And yet there are no conspicuous complaints from the market fundamentalists on the right.

Why am I not surprised?

PostScript: My involvement in the Diablo Canyon controversy goes way back. In 1981, a group of local citizens blockaded the Diablo Canyon construction site in an act of civil disobedience, for which, predictably, they were arrested. At the time, I was a Visiting Associate Professor of Environmental Studies at the University of California, Santa Barbara. The defense team asked me to testify as to the “reasonableness” of the protesters’ belief that the Diablo Canyon nuclear reactors posed a significant danger to their community and to themselves. The prosecution objected on the grounds that the defense was asking me to “do the jury’s work.” The judge concurred, and so I was not permitted to testify. My account of this experience and critique of the ruling may be found in my unpublished paper, “A Philosopher’s Day in Court,” at my website, The Online Gadfly. The discussion above of the Rasmussen Report is a revision of my unpublished 1980 class discussion paper, “The Strange Saga of the Rasmussen Report.”

Ernest Partridge is the co-editor of The Crisis Papers. Read other articles by Ernest, or visit Ernest's website.

This article was posted on Tuesday, March 29th, 2011 at 8:01am and is filed under Disasters, Energy, Japan.

Monday, March 28, 2011

Unsafe at Any Exposure: There's no safe level of radiation exposure.

CommonDreams.Org


As the radioactive contamination of food, water, and soil in Fukushima, Japan worsens, the media is continuously reassuring us that these levels are "safe." But there is no safe level of radiation.

Yes, at lower levels the risk is smaller, but the National Research Council of the National Academies of Science has concluded that any exposure to radiation makes it more likely that an individual will get cancer.

The press is reporting that 100 millisieverts (mSv) is the lowest dose that increases cancer risks. This simply isn't true. According to the NAS, if you are exposed to a dose of 100 mSv, you have a one in 100 chance of getting cancer, but a dose of 10 mSv still gives you a one in 1,000 chance of getting cancer, and a dose of 1 mSv gives you a one in 10,000 risk.

Those odds sound fairly low for one individual, but if you expose 10,000 people to a one in 10,000 risk, one of them will get cancer. If you expose 10 million people to that dose, 1,000 will get cancer. There are more than 30 million people in the Tokyo metropolitan area.
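
A minimal sketch of that arithmetic, using only the per-dose risk figures cited above (a linear simplification of the NAS numbers, for illustration):

```python
# Linear scaling of the cited figures: ~1 in 100 at 100 mSv, so the per-person
# risk is roughly dose_msv / 100 * 0.01 (i.e., 1 mSv -> about 1 in 10,000).

def expected_excess_cancers(population: int, dose_msv: float) -> float:
    """Expected number of excess cancers under a simple linear dose-risk model."""
    risk_per_person = (dose_msv / 100.0) * 0.01
    return population * risk_per_person

print(expected_excess_cancers(10_000, 1.0))        # ~1 excess cancer
print(expected_excess_cancers(10_000_000, 1.0))    # ~1,000 excess cancers
print(expected_excess_cancers(30_000_000, 1.0))    # Tokyo-metro-scale population at 1 mSv
```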

To understand the danger of low levels of radiation exposure, consider several factors.

First, the total dose is the most important factor, not the dose per hour. When you get an X-ray, you're exposed to a one-time burst of radiation. If you work for 10 hours in a spot where the radiation level is 1 millisievert per hour, your dose is 10 millisieverts, and the dose goes up the longer you stand there.

Second, there's a big difference between external and internal radiation. If you're standing in a spot where you're exposed to external radiation, that exposure ends as soon as you move away. But if you ingest or inhale a radioactive particle, it continues to irradiate your body as long as it remains radioactive and stays in your body.

Further, if you ingest radioactive particles, the dose isn't spread evenly over your entire body. It concentrates where the particles lodge. The average total body dose may be relatively low, but the dose at the site may be large enough to damage that tissue and cause cancer.

That's why the radiation being found in Japan in spinach, milk, and other food--as well as water--is so worrisome. If consumed, it will create ongoing radiation exposure and increase the risk of cancer.

A large majority of the hundreds of thousands of cancer cases that have occurred in the former Soviet Union because of the Chernobyl catastrophe were caused by people eating radioactively contaminated food.

Finally, it makes a big difference who gets irradiated. Children are much more vulnerable than adults. If a fetus is exposed to only 10 mSv in utero, his or her risk of getting cancer by age 15 doubles. So it's particularly dangerous when children or pregnant women consume radioactive food or water.

Reports indicate that the total radioactive releases in Fukushima have been relatively small so far. If this is the case, then the health effects will be correspondingly small. But it's not "safe" to release this much radiation. Some people will get cancer as a result. Most importantly, we don't know at this point how much more radiation there will be.

That’s why the U.S. government has said that people shouldn't be allowed within 50 miles of the plant.

If a comparable accident were to occur at the Indian Point nuclear reactors 24 miles north of New York City, 17 million people would need to evacuate. That's something to think about when we're told everything is OK at our nuclear plants.

Ira Helfand

Dr. Ira Helfand is an internist and a member of the board of Physicians for Social Responsibility. www.psr.org

Sunday, March 27, 2011

The Dark Side of Technology



March 26, 2011 at 20:38:54

5 Reasons Why Technology Can Never Be Neutral

By Mickey Z.



It's repeated so often that few of us even stop to question its validity: "Technology is neutral. It's only as good or as bad as those using it."

Here are 5 reasons why this is far from true:

1. Technology Devours Nature

Thanks to the automobile culture, for example, in the 20th century an area equal to all the arable land in Ohio, Indiana, and Pennsylvania was paved in the US. This means highways, off-ramps, parking lots, etc.--each replacing countless ecosystems.

2. Technology Leaves Behind Lots of Toxic Waste

Americans tossed three million tons of household electronics in 2006. There are 300 million obsolete computers in the US today, and only 50% of a computer is recycled. The non-recyclable components of a single computer may contain almost 2 kilograms of lead. Seventy percent of the entire toxic waste stream of landfills is e-waste.

3. Technology Spawns Alienation

We have social media but we're sacrificing social skills. "With the present means of long distance mass communication, sprawling isolation has proved an even more effective method of keeping a population under control, henceforth a one-way world," writes Lewis Mumford. To green anarchists, technology is "more than wires, silicon, plastic, and steel. It is a complex system involving division of labor, resource extraction, and exploitation for the benefit of those who implement its process. The interface with and result of technology is always an alienated, mediated, and distorted reality."

4. Technology is Not Available to Everyone

In Australia, 60.4% of the population has access to the Internet. In Asia, that number is 19.4%. Pretty stark difference, huh? Get ready for this one: In North America, 74.2% of the population has access to the Internet. In Africa, that number is 6.8%. If you think it can't get worse than that, try this on for size: In six African nations--Burundi, Chad, Central African Republic, Liberia, Rwanda, and Sierra Leone--only 3 to 5% of people can access electricity. In fact, 79% of the Third World (1.5 billion people) have no access to electricity.

5. Technology Results in Environmental Racism

While the developed world quenches its insatiable thirst for the newest and latest gizmo, much of the subsequent e-waste is exported to countries like India, China, Pakistan, Nigeria, and Ghana. "The pollution and related health problems in countries where e-waste is dumped will increase massively as the amount of electronics used worldwide is growing exponentially and the number of countries used as dump sites will grow," says Kim Schoppink, Toxics Campaigner at Greenpeace.

Thanks to our myriad techno-fetishes, we civilized humans happily delegate such tedious tasks as learning how to spell, remembering phone numbers, doing math in our heads, memorizing directions, or even walking up a single flight of stairs to technology so we can have time to focus on the truly important stuff, like...um...well...uh...removing 90% of the large fish from the ocean, perhaps?

Take-home message: Technology can never be neutral and industrial civilization can never be sustainable.

Until the laws are changed or the power runs out, Mickey Z. can be found on this crazy new website called Facebook. His eleventh book (and third novel), A Darker Shade of Green, can be pre-ordered now.

http://www.mickeyz.net

Mickey Z. can be found on the Web at http://www.mickeyz.net.

The views expressed in this article are the sole responsibility of the author
and do not necessarily reflect those of this website or its editors.

Nuclear Radiation: 'The Greatest Public Health Hazard'

CommonDreams.org

Helen Caldicott says it is impossible to have a safe nuclear power plant

When she was an adolescent, Helen Caldicott says, she read the nuclear apocalypse novel "On the Beach." The story is set in the aftermath of an atomic war; its protagonists must await the arrival of a deadly fallout cloud.

It was a formative event, she says, and later, in medical school, the connection between health and nuclear energy would galvanize her. "I learned about genetics and radiation in first-year medicine and became acutely aware of nuclear weapons, nuclear war and the damage radiation does to genes and all life forms."

Caldicott went on to become one of the most vocal, ubiquitous and controversial opponents of nuclear power during the anti-nuclear movement of the 1970s and 1980s.

The crisis at the Fukushima Daiichi Nuclear Power Station, severely damaged after the earthquake and tsunami in Japan, has given a fresh urgency, she says, to a "medical problem of vast dimensions," highlighted by reports that emerge daily on the spread of radiation.

A pediatrician, Caldicott came from her native Australia to become an instructor on the faculty of Harvard Medical School, where she specialized in the treatment of cystic fibrosis at the Children's Hospital Medical Center. She soon helped revive the moribund Physicians for Social Responsibility, a health organization dedicated to halting the proliferation and use of nuclear weapons and nuclear power.

While she was president, from 1978 through 1984, the group grew to 23,000 physician members and in 1985 shared in a Nobel Peace Prize with International Physicians for the Prevention of Nuclear War. "We led the nuclear weapons freeze movement with many other professional groups," she said. "I think we helped to end the Cold War."

Caldicott, who lives in Australia and the U.S., remains engrossed in the anti-nuclear issue, heads the Helen Caldicott Foundation for a Nuclear-Free Planet and regularly lectures around the world on its dangers. She's written seven books, including "If You Love This Planet: A Plan to Heal the Earth" and "Nuclear Power Is Not the Answer."

CNN spoke with Caldicott ahead of a talk she was giving in Camden, New Jersey.

CNN: What is the health risk for people living near the Fukushima Daiichi plant?

Helen Caldicott: The risk cannot be determined with any accuracy yet, because it is not clear how much radiation has or is escaping. NPR reported last week that 17 workers have suffered what the Japanese government called "deposition of radioactive material" to their faces. And some plant workers have already been hospitalized for exposure to radiation, which means they received a huge dose of radiation.

High levels of exposure can cause acute radiation sickness, a syndrome first recognized by the medical profession after Hiroshima and Nagasaki. It can have terrible effects. In two weeks, victims' hair begins to drop out. They develop hemorrhaging under the skin, severe nausea and diarrhea and may eventually die from bleeding or infection.

If a meltdown occurred at the plant, a large number of people could be exposed to high doses of radiation in this region, one of the most heavily populated in Japan. (After the March 11 earthquake, the Japanese government evacuated people living within a 20-kilometer radius to mitigate the possibility.)

Men exposed to such a dose would be rendered sterile, women would stop menstruating, and spontaneous abortions would likely occur. Babies could be born with microcephaly, with tiny heads and mental disabilities. Many people would develop acute shortness of breath from lung damage. In five years, there would be an epidemic of leukemia, and in 15 years, solid cancers would start appearing in many organs: lung, breast, thyroid, brain and bone.

Even if the release is not huge, the incidence of cancer and leukemia will increase in the population. Children are 10 to 20 times more sensitive to the carcinogenic effects of radiation than adults, and fetuses thousands of times more so because their cells are rapidly dividing and are thus vulnerable to genetic mutations. Genetic diseases, like cystic fibrosis, diabetes, dwarfism and metabolic disorders, will be passed on to future generations.

There is no way to decontaminate exposed people once they inhale or ingest radioactive elements, which are dispersed throughout the body to many different organs.

CNN: How is this disaster comparable to the accidents at Chernobyl and Three Mile Island?

Caldicott: The radiation monitors at Three Mile Island went off the scale within minutes of the accident, so releases were only guesstimates by physicists. But almost certainly, radioactive elements like strontium 90, cesium 137 and tritium escaped. Chernobyl had a full meltdown and rupture of the containment vessel, and fallout contaminated 40% of Europe and England.

There are six reactors at the Fukushima Daiichi Plant No. 1 in Japan, and their spent fuel pools, which contain highly radioactive fuel rods, are also at risk of melting down. These pools contain two to 10 times more radiation than in the reactor core, which itself contains as much long-lived radiation as 1,000 Hiroshima bombs.

CNN: Is it possible to have a safe nuclear power plant?

Caldicott: No. They are very complicated machines containing the energy released when an atom is split: Einstein's formula E = mc², the mass of the atom times the speed of light squared. Anything can go wrong: natural disasters, failure of cooling systems, human and computer error, terrorism, sabotage. Radioactive waste must be isolated from the ecosphere for half a million years or longer, a physical and scientific impossibility, and as it leaks it will concentrate in food chains, inducing epidemics of genetic diseases, leukemia and cancer in all future generations, the greatest public health hazard the world will ever see.

Einstein said, "The splitting of the atom changed everything save man's mode of thinking; thus we drift towards unparalleled catastrophe." He also said, "Nuclear power is a hell of a way to boil water."

CNN: Doesn't every form of energy production involve some risk, as we saw with the oil spill in the Gulf?

Caldicott: Well, that was dreadful. But to leave a legacy of huge vats of leaking radioactive waste around the world, inducing epidemics of malignancy and random compulsory genetic engineering, is a legacy for which future generations will be distinctly ungrateful.

CNN: Is there any other aspect of this event that we should be paying attention to and are not?

Caldicott: No, except that the media keep interviewing nuclear engineers and physicists, but in truth this is a medical problem of vast dimensions.

Iodine-131 in Seawater 'Off Chart'; Contamination Spreading

CommonDreams.org

1,250 Times Higher than Normal; Contamination Spreading

by Kanako Takahara and Kazuaki Nagata

TOKYO -- The level of radioactive iodine detected in seawater near the Fukushima No. 1 nuclear power plant was 1,250 times above the maximum level allowable, the Nuclear and Industrial Safety Agency said Saturday, suggesting contamination from the reactors is spreading.

OFF THE CHART -- Picture released by Tokyo Electric Power Company (TEPCO) via Jiji Press shows black smoke rising from reactor number three at the Fukushima dai-ichi nuclear power plant. Radiation levels in seawater near the tsunami-hit nuclear plant have soared over the past several days, officials said.


Meanwhile, plant operator Tokyo Electric Power Co. turned on the lights in the control room of the No. 2 reactor the same day, and was analyzing and trying to remove pools of water containing radioactive materials in the turbine buildings of reactors 1 to 3.

The iodine-131 in the seawater was detected at 8:30 a.m. Friday, about 330 meters south of the plant's drain outlets. Previously, the highest amount recorded was about 100 times above the permitted level.

If a person drank 500 ml of water containing the newly detected level of contamination, it would be the equivalent of 1 millisievert of radiation, or the average dosage one is exposed to annually, the NISA said.

"It is a substantial amount," NISA spokesman Hidehiko Nishiyama told a news conference.

But he also stressed there is "no immediate risk to public health," as the changing tides will dilute the iodine-131, and its half-life, or the amount of time it takes for it to lose half its radioactivity, is only eight days.

Nishiyama said the high concentration was perhaps caused by airborne radiation that contaminated the seawater, or contaminated water from the plant that flowed out to sea.

Tepco said early Saturday that it had detected a radiation reading of 200 millisieverts per hour in a pool of water in the No. 1 reactor's turbine building on March 18 and failed to notify workers, but later denied that a radiation level that high was found.

"If we had warned them, we may have been able to avoid having workers (at the No. 3 reactor) exposed to radiation," a Tepco official said.

Chief Cabinet Secretary Yukio Edano said the government had not been informed about the high radiation reading at the No. 1 plant and he will order Tepco to thoroughly report information. "If (Tepco) doesn't report various information with speed and accuracy, the government can't give proper instructions," Edano said. "It will only trigger distrust from the public and from the workers at the site."

On Thursday, three workers in the basement of reactor 3's turbine building were exposed to a high dosage of radiation when they stepped into about 15 cm of contaminated water.

Two of the workers were not wearing high boots and received beta ray burns when the water soaked their legs. All three were sent to the National Institute of Radiological Sciences in Chiba Prefecture.

Tests revealed that while the two received 170 to 180 millisieverts of radiation, within the maximum allowable dose of 250 millisieverts, their feet were exposed to between 2 and 6 sieverts. One sievert is equivalent to 1,000 millisieverts. But their injuries are not thought to be life-threatening and will be treated the same way as regular burns, the institute said, adding that the workers are able to walk unassisted.

On Saturday, Tepco also started pumping fresh water rather than seawater into the No. 2 reactor.

Thursday, March 24, 2011

Denmark, Finland and Belgium Have Best Democracies, Experts Say



Science News



ScienceDaily (Jan. 27, 2011) — A new democracy barometer from the University of Zurich and the Social Science Research Center Berlin (WZB) shows the development of the thirty best democracies in the world. Denmark, Finland and Belgium have the highest quality of democracy, whereas Great Britain, France, Poland, South Africa and Costa Rica have the lowest. Moreover, the barometer shows no evidence of a crisis of democracy.

Diagnoses of a crisis of democracy are as old as democracy itself; they are a common theme in the political discourse of the Western world. However, until now there was no instrument that allowed a systematic measurement of the quality and stability of democracy in highly developed industrialized countries across national borders and over long periods of time. A democracy barometer that has analyzed the development of the most important aspects of the world's thirty foremost democracies since 1990 has now been presented at the University of Zurich.

The barometer uses 100 empirical indicators to measure how well a country complies with the three democratic principles of freedom, equality and control, as well as the nine basic functions of democracy. The comparison of thirty established democracies between 1995 and 2005 has revealed that Denmark is leading the way, followed by Finland and Belgium. "In the comparison, the lowest quality is exhibited by the democracies in Poland, South Africa and Costa Rica," says Marc Bühlmann from the University of Zurich. While Italy, as might be expected, finds itself towards the bottom end of the scale, it is surprising that Great Britain (26th) and France (27th) are also so far down the ranking. Equally surprising is the fact that Switzerland (14th) is only mediocre and lags behind 11th-placed Germany. The USA ranks 10th, behind Canada in 7th place.
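
The article does not spell out how the 100 indicators are weighted, so the following is only a rough sketch of the hierarchical idea it describes (indicators rolled up into functions, functions into the three principles of freedom, equality and control, and principles into a 0-100 score), assuming simple averaging at every level:

```python
# Hedged illustration only: the real barometer's weights are not given in the article,
# so each level is averaged equally. Input: {principle: {function: [indicator scores 0-100]}}.

from statistics import mean

def barometer_score(indicators):
    principle_scores = []
    for functions in indicators.values():
        function_scores = [mean(scores) for scores in functions.values()]
        principle_scores.append(mean(function_scores))
    return mean(principle_scores)  # overall 0-100 quality-of-democracy score

# Toy example with made-up numbers (not real barometer data):
example = {
    "freedom":  {"individual liberties": [90, 85], "rule of law": [80, 75]},
    "equality": {"participation": [70, 72], "representation": [78, 81]},
    "control":  {"transparency": [88, 84], "accountability": [76, 79]},
}
print(round(barometer_score(example), 1))  # a single aggregate score, ~79.8 for this toy input
```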

Quality of democracy on the rise

The democracy barometer can also be used to measure the quality of democratic systems over time. "There was, however, no evidence of an overall crisis or a decline in the quality of democracy," according to Bühlmann. Quite the contrary: if the quality of democracy in all thirty countries is seen as a whole, an increase in the quality of democracy from 1995 to 2000 can be observed and, despite a slight dip again between 2000 and 2005, it is still at a higher level in 2005 than in 1995. Consideration of the individual countries reveals that nine democracies exhibit a lower quality than in 1995 (ITA, CZE, POR, USA, CRC, FRA, IRL, AUS and GER), whereas the quality of democracy has risen in the remaining twenty-one countries.

The democracy barometer registers the differences in the quality of political participation, representation and transparency as well as those concerning the rule of law, individual liberties or the ability of a government to actually implement democratic decisions. If the countries are viewed as a whole, an increase in the quality of transparency and representation becomes apparent, but so does a slight decline in the rule of law. The positive trend can be attributed -- among other things -- to the ever-better integration of women in the political process and the increase in transparency virtually forced into being by citizens, audit divisions, ombudsmen, NGOs and the media. On the other hand, the rule of law is losing ground due to increasingly unequal treatment of minorities. Here, too, there are major differences between the individual countries. Positive developments are apparent in younger democracies such as South Africa and Cyprus, which are making up a lot of ground in terms of developing and protecting personal liberties, whilst a decline was evident in George W. Bush's America and Silvio Berlusconi's Italy.

"Democracy is still a work in progress," say the two project leaders Marc Bühlmann (Zurich) and Prof. Wolfgang Merkel (Berlin). Sustainable democratization is needed, even in established democracies." Our democracy barometer shows the strengths and weaknesses of the democracies in the individual countries. But it also reveals where progress and success have been achieved and where it is worth studying the best practices of successful democracies more closely," say Merkel und Bühlmann.

Country comparison: Average quality of democracy 1995-2005 with highest ranking countries at the top.

  1. Denmark ... 88.3
  2. Finland ... 87.7
  3. Belgium ... 85.1
  4. Iceland ... 83.5
  5. Sweden ... 82.9
  6. Norway ... 82.1
  7. Canada ... 79.4
  8. Netherlands ... 79.0
  9. Luxembourg ... 75.2
  10. USA ... 74.9
  11. Germany ... 73.2
  12. New Zealand ... 72.1
  13. Slovenia ... 69.6
  14. Switzerland ... 67.8
  15. Ireland ... 67.0
  16. Portugal ... 66.7
  17. Spain ... 66.6
  18. Australia ... 65.5
  19. Hungary ... 63.2
  20. Austria ... 63.1
  21. Czech Republic ... 58.2
  22. Italy ... 57.0
  23. Cyprus ... 55.5
  24. Malta ... 54.2
  25. Japan ... 45.8
  26. Great Britain ... 44.6
  27. France ... 42.8
  28. Poland ... 42.0
  29. South Africa ... 39.8
  30. Costa Rica ... 32.7

Radioecologists Developing Japan-Response Recommendations

Science/AAAS

on 24 March 2011, 11:40 AM

Two months ago, to little fanfare, the U.S. Department of Energy launched a new research center at its Savannah River National Laboratory (SRNL) in South Carolina. Now, thanks to the ongoing troubles at the Fukushima nuclear plant, the fledgling National Center for Radioecology (NCoRE) suddenly finds itself with a lot more work to do.

Radioecology is the science of how radiation affects ecosystems, and on Tuesday, NCoRE’s group of experts in this field, drawn from six U.S. universities, France, Ukraine, and SRNL, held a teleconference to discuss how best to respond to the emerging radiation situation in Japan. They decided to begin drafting recommendations for a plan to aid Japan in collecting ecological samples and testing current models of radiation’s effects on ecosystems. After the situation in Japan stabilizes, NCoRE hopes to bring Japanese radioecology experts on board as well and ultimately craft a white paper to present to the U.S. government and other relevant groups.

Inspired by the 25th anniversary of the Chernobyl accident, NCoRE was founded in January 2011 as a way to collect various types of expertise in radioecology and study the environmental impacts of a growing number of nuclear power plants. The group also plans to test models that would assist in the preparation for nuclear attacks, as well as offer training for potential radioecologists. The latter task is seen as crucial because no formal graduate program in radioecology currently exists. “It’s very clear there’s a paucity of expertise in this area,” says radioecologist Timothy Mousseau of the University of South Carolina, Columbia.

The end of the nuclear testing era brought a gradual decline in funding for radioecology; investment in the field following Chernobyl proved to be only temporary. But Mousseau, who researches the long-term ecological effects of Chernobyl, expects that Japan’s current nuclear troubles will trigger “renewed interest in what we’ve learned.”

Timothy Jannik of SRNL, who specializes in risk modeling, says that because Japan is still beset with earthquake and tsunami damage, it will be difficult to quickly and accurately take current samples of any potential radiation across the country’s many ecosystems. Much of what radioecologists will be able to learn about exposure doses and how the radiation was dispersed will come from retroactive modeling, which will be more difficult but could result in improved dispersion models applicable to future radiation leaks.

Ideally, NCoRE will work with groups in Japan to track the future impact of any released radiation. “We would like to help develop an intensive monitoring mechanism to track deposition of radionuclides and how they are transported,” Mousseau says. One of NCoRE’s recommendations will be to test and reassess models of how radiation is spread through the food chain: in addition to direct fallout onto crops and agricultural soil, radionuclides in the water could have widespread effects. Cesium-137 in particular is known to accumulate in freshwater fish—a mainstay of the Japanese diet. “I’m assuming if [radiation has] spread on the ground, it also dispersed into lakes,” Jannik says. “The key is getting samples in the emerging situation.”


Radiation: Nothing to See Here?

by: Brian Moench, MD, t r u t h o u t | News Analysis

Children are screened for radiation at a public health center in Yamagata Prefecture of Japan, March 17, 2011. (Photo: Ko Sasaki / The New York Times)

Administration spokespeople continuously claim "no threat" from the radiation reaching the US from Japan, just as they did with oil hemorrhaging into the Gulf. Perhaps we should all whistle "Don't worry, be happy" in unison. A thorough review of the science, however, begs a second opinion.

That the radiation is being released 5,000 miles away isn't as comforting as it seems. The Japanese reactors hold about 1,000 times more radiation than the bomb dropped on Hiroshima.(1) Every day, the jet stream carries pollution from Asian smokestacks and dust from the Gobi Desert to our West Coast, contributing 10 to 60 percent of the total pollution breathed by Californians, depending on the time of year. Mercury is probably the second most toxic substance known after plutonium. Half the mercury in the atmosphere over the entire US originates in China. It, too, is 5,000 miles away. A week after a nuclear weapons test in China, iodine-131 could be detected in the thyroid glands of deer in Colorado, although it could not be detected in the air or in nearby vegetation.(2)

The idea that a threshold exists or there is a safe level of radiation for human exposure began unraveling in the 1950s when research showed one pelvic x-ray in a pregnant woman could double the rate of childhood leukemia in an exposed baby.(3) Furthermore, the risk was ten times higher if it occurred in the first three months of pregnancy than near the end. This became the stepping-stone to the understanding that the timing of exposure was even more critical than the dose. The earlier in embryonic development it occurred, the greater the risk.

A new medical concept, increasingly supported by the latest research, has emerged, called "fetal origins of disease." It centers on evidence that a multitude of chronic diseases, including cancer, often originate in environmental insults that disturb normal embryonic development in the first few weeks after conception. It is now established medical advice that pregnant women should avoid any exposure to x-rays, medicines or chemicals that is not absolutely necessary, no matter how small the dose, especially in the first three months.

"Epigenetics" is a term integral to fetal origins of disease, referring to chemical attachments to genes that turn them on or off inappropriately and have impacts functionally similar to broken genetic bonds. Epigenetic changes can be caused by unimaginably small doses - parts per trillion - whether the exposure is to chemicals, air pollution, cigarette smoke or radiation. Furthermore, these epigenetic changes can occur within minutes after exposure and may be passed on to subsequent generations.(4)(5)(6)

The Endocrine Society, which represents 14,000 researchers and medical specialists in more than 100 countries, warned that "even infinitesimally low levels of exposure to endocrine-disrupting chemicals, indeed, any level of exposure at all, may cause endocrine or reproductive abnormalities, particularly if exposure occurs during a critical developmental window. Surprisingly, low doses may even exert more potent effects than higher doses."(7) If hormone-mimicking chemicals at any level are not safe for a fetus, then the same is likely to be true of the even more intensely toxic radioactive elements drifting over from Japan, some of which may also act as endocrine disruptors.

Many epidemiologic studies show that extremely low doses of radiation increase the incidence of childhood cancers, low birth-weight babies, premature births, infant mortality, birth defects and even diminished intelligence.(8) Just two abdominal x-rays delivered to a male can slightly increase the chance of his future children developing leukemia.(9) By damaging proteins anywhere in a living cell, radiation can accelerate the aging process and diminish the function of any organ. Cells can repair themselves, but the rapidly growing cells in a fetus may divide before repair can occur, negating the body's defense mechanism and replicating the damage.


Comforting statements about the safety of low radiation are not even accurate for adults.(10) Small increases in risk per individual have immense consequences in the aggregate. When low risk is accepted for billions of people, there will still be millions of victims. New research on the risks of x-rays illustrates the point.

Radiation from CT coronary scans is considered low, but, statistically, it causes cancer in one of every 270 40-year-old women who receive the scan. Twenty-year-olds will have double that rate. Annually, 29,000 cancers are caused by the 70 million CT scans done in the US.(11)(12) Common, low-dose dental x-rays more than double the rate of thyroid cancer, and those exposed to repeated dental x-rays have an even higher risk.(13)
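
To make the earlier point about aggregate consequences concrete, here is a rough back-of-the-envelope sketch; the one-million-scan figure is a hypothetical assumption, and the risks are the ones quoted in the text.

    # Back-of-the-envelope illustration of how small per-person risks add up.
    # The scan count below is hypothetical; the 1-in-270 risk is the figure
    # quoted in the text for CT coronary scans in 40-year-old women.
    risk_per_scan = 1 / 270          # lifetime cancer risk per scan for that group
    hypothetical_scans = 1_000_000   # an assumed one million such scans

    print(f"{hypothetical_scans * risk_per_scan:,.0f} expected cancers")   # about 3,704

    # The same logic behind the article's national figure:
    # 29,000 cancers from 70 million scans implies an average per-scan risk of about
    print(f"1 in {70_000_000 / 29_000:,.0f}")                              # 1 in 2,414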

Even properly functioning nuclear plants emit a steady stream of radiation into nearby water and the atmosphere, which can be inhaled directly or ingested through soil contact, plants or cow's milk. Many studies confirm higher rates of cancers like childhood leukemia, and breast and thyroid cancer, among people who live in the same counties as nuclear plants, and among nuclear workers.(3)

Beginning with Madame Curie, the story of nuclear power is one in which key players have consistently miscalculated or misrepresented the risks of radiation. The victims include many of those who worked on the original Manhattan Project, the 200,000 soldiers who were ordered to witness our nuclear tests at close range, the residents of the Western US who absorbed the lion's share of fallout from our nuclear testing in Nevada, the thousands of forgotten victims of Three Mile Island, and the likely hundreds of thousands of casualties of Chernobyl. This could be the latest chapter in that long and tragic story, in which, once again, we are told not to worry.

Footnotes:

1. "Fukushima Daiichi reactors contain radiation equal to a thousand Hiroshima bombs," Vancouver Observer, March 14, 2011; Ira Helfand, Robert Alvarez, Ken Bergeron and Peter Bradford (former member of the US Nuclear Regulatory Commission), on behalf of Physicians for Social Responsibility.

2. Rosenthal E., "Radiation, Once Free, Can Follow Tricky Path," The New York Times, March 21, 2011.

3. International Commission on Radiological Protection.

4. Huang YC, Schmitt M, Yang Z, Que LG, Stewart JC, Frampton MW, Devlin RB, "Gene expression profile in circulating mononuclear cells after exposure to ultrafine carbon particles," Inhal Toxicol, 2010 May 27. (Epub ahead of print.)

5. Baccarelli A, Wright R, Bollati V, et al, "Rapid DNA Methylation Changes after Exposure to Traffic Particles." Am. J. Respir. Crit. Care Med., April 2009; 179: 572 - 578.

6. Zhong Y, Carmella S, Upadhyaya P, Hochalter JB, et al, "Immediate Consequences of Cigarette Smoking: Rapid Formation of Polycyclic Aromatic Hydrocarbon Diol Epoxides," Chem. Res. Toxicol., Article ASAP, DOI: 10.1021/tx100345x, published online (web) December 27, 2010.

7. "Endocrine-Disrupting Chemicals: An Endocrine Society Scientific Statement," 2009.

8. Bartley K, Metayer C, Selvin S, et al, "Diagnostic X-rays and risk of childhood leukaemia," Int. J. Epidemiol. (2010) 39(6): 1628-1637, first published online October 1, 2010, doi:10.1093/ije/dyq162.

9. Bailey H, Armstrong B, de Klerk N, et al, "Exposure to Diagnostic Radiological Procedures and the Risk of Childhood Acute Lymphoblastic Leukemia," Cancer Epidemiol Biomarkers Prev, November 2010, 19:2897-2909; Published online first, September 22, 2010.

10. Shuryak I, Sachs R, Brenner D, "Cancer Risks After Radiation Exposure in Middle Age," J Natl Cancer Inst, Volume 102, Issue 21, pp. 1628-1636.

11. Berrington de González A, Mahesh M, Kim K, et al, "Projected Cancer Risks From Computed Tomographic Scans Performed in the United States in 2007," Arch Intern Med, December 14/28, 2009; 169: 2071 - 2077.

12. Smith-Bindman R, Lipson J, Marcus R, et al, "Radiation Dose Associated With Common Computed Tomography Examinations and the Associated Lifetime Attributable Risk of Cancer," Arch Intern Med., 2009; 169(22): 2078-2086.

13. Memon A, Godward S, Williams D, et al, "Dental x-rays and the risk of thyroid cancer: A case-control study," Acta Oncologica, May 2010, Vol. 49, No. 4: 447–453.

Creative Commons License
This work by Truthout is licensed under a Creative Commons Attribution-Noncommercial 3.0 United States License.

Sunday, March 20, 2011

FYI: How Does Nuclear Radiation Do Its Damage?

POPSCI

Or, why everyone is stocking up on iodine tablets

By Molika Ashford Posted 03.16.2011 at 3:32 pm


A boy is screened for radiation in Koriyama City, Fukushima Prefecture, Japan, March 16, 2011. (Photo: Tayama Tatsuyuki / Gamma-Rapho via Getty Images)

Ionizing radiation—the kind that minerals, atom bombs and nuclear reactors emit—does one main thing to the human body: it weakens and breaks up DNA, either damaging cells enough to kill them or causing them to mutate in ways that may eventually lead to cancer.

After last week’s earthquake and tsunami in Japan, four nuclear reactors at the Fukushima Daiichi plant are now damaged and releasing radiation. Workers trying to keep the reactors from getting worse are themselves being exposed, while the Japanese government has called for anyone within 20 kilometers of the plant to evacuate.

Nuclear radiation, unlike the radiation from a light bulb or a microwave, is energetic enough to ionize atoms by knocking off their electrons. This ionizing radiation can damage DNA molecules directly, by breaking the bonds between atoms, or it can ionize water molecules and form free radicals, which are highly reactive and also disrupt the bonds of surrounding molecules, including DNA.

Peter Dedon, a member of the Radiation Protection Committee at MIT, explains: “What happens is that the nucleus of radioactive elements undergoes decay and emits high-energy particles. If you stand in the way of those particles, they are going to interact with the cells of your body. You literally get a particle, an energy packet, moving through your cells and tissues.”

If radiation changes DNA molecules enough, cells can’t replicate and begin to die, which causes the immediate effects of radiation sickness -- nausea, swelling, hair loss. Cells that are damaged less severely may survive and replicate, but the structural changes in their DNA can disrupt normal cell processes -- like the mechanisms that control how and when cells divide. Cells that can’t control their division grow out of control, becoming cancerous.

With ingested particles, some may pass through the body before they do much damage, but others linger, Dedon says. Radioactive iodine-131 poses a particularly significant risk, because it is absorbed rapidly by the thyroid gland and held there. That is why it is recommended that those who may be exposed to radioactivity in the air pre-dose themselves with iodine pills: the non-radioactive iodine is absorbed by the thyroid, which then does not absorb radioactive iodine if it comes along.

Radiation exposure risk is measured in units called sieverts, which take into account the type and amount of radiation, and which parts of the body are exposed, allowing us to compare different kinds of exposures in one scale.

In a typical year, a person might receive a total dose of two or three millisieverts from things like ambient radioactivity, plane flights and medical procedures. In the U.S., the exposure limit for nuclear plant workers is 0.05 sieverts (50 millisieverts) per year. At or below these levels, the enzymes that repair DNA keep up with the damage well enough to keep the risk of cancer low. Above them, the body's repair systems can't keep pace. According to the World Nuclear Association, 100 millisieverts a year is the threshold above which cancer risk starts to increase.

According to reports, radiation levels have fluctuated at Fukushima, rising at one reading to 400 millisieverts per hour. At that level, Dedon says, seven minutes would bring you to the U.S. yearly limit. Over an hour could be a lethal dose. The 400-millisievert reading was not a sustained measurement, and levels have since fluctuated at much lower values.
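
Dedon's seven-minute figure follows directly from the numbers above; a quick check, taking 400 mSv per hour against the 0.05-sievert (50 mSv) annual worker limit:

    # Quick check of the time-to-limit arithmetic at the peak reported reading.
    dose_rate_msv_per_hr = 400        # peak reading at Fukushima (mSv/h, per the text)
    annual_worker_limit_msv = 50      # U.S. annual limit for nuclear plant workers (0.05 Sv)

    minutes_to_limit = annual_worker_limit_msv / dose_rate_msv_per_hr * 60
    print(f"{minutes_to_limit:.1f} minutes to reach the annual worker limit")  # 7.5 minutes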

Dedon stresses that because direct radiation dissipates, like light, with the square of distance, even if levels are high inside the plant, just a few miles away they would be minuscule. The greater danger for people living in the area is the release of radioactive particles into the air, which can accumulate in the body, damaging tissue over time and causing cancer.
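
The inverse-square falloff Dedon describes is easy to illustrate; a sketch with made-up numbers, ignoring shielding, weather and airborne transport, which is exactly why released particles rather than direct radiation are the larger worry.

    # Inverse-square falloff of direct radiation with distance,
    # using illustrative numbers (no shielding or atmospheric transport modeled).
    def dose_rate_at(distance_m: float, rate_at_ref: float, ref_distance_m: float = 100.0) -> float:
        """Scale a reference dose rate by the inverse square of distance."""
        return rate_at_ref * (ref_distance_m / distance_m) ** 2

    rate_100m = 400.0  # mSv/h at a 100 m reference point (illustrative)
    for km in (1, 5, 10):
        print(f"{km:>2} km: {dose_rate_at(km * 1000, rate_100m):.4f} mSv/h")
    #  1 km: 4.0000 mSv/h
    #  5 km: 0.1600 mSv/h
    # 10 km: 0.0400 mSv/h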

Receiving a one-sievert dose of radiation in a day is enough to make you feel ill, according to Dedon. “At one to three, you have damaged bone marrow and organs, and you’ll really be sick. At three to six you add hemorrhaging, and more infection,” he says. “From six to ten, at that level death is something like 90 percent. And above ten, they just call that incapacitation and death.”

Friday, March 18, 2011

Nuclear Nightmare


Posted by nimda in In the Public Interest

The unfolding multiple nuclear reactor catastrophe in Japan is prompting overdue attention to the 104 nuclear plants in the United States—many of them aging, many of them near earthquake faults, some on the west coast exposed to potential tsunamis.

Nuclear power plants boil water to produce steam to turn turbines that generate electricity. Nuclear power’s overly complex fuel cycle begins with uranium mines and ends with deadly radioactive wastes for which there are still no permanent storage facilities capable of containment for tens of thousands of years.

Atomic power plants generate 20 percent of the nation’s electricity. Over forty years ago, the industry’s promoter and regulator, the Atomic Energy Commission, estimated that a full nuclear meltdown could contaminate an area “the size of Pennsylvania” and cause massive casualties. You, the taxpayers, have heavily subsidized nuclear power research, development, and promotion from day one with tens of billions of dollars.

Because of high costs, perils, close calls at various reactors, and the partial meltdown at the Three Mile Island plant in Pennsylvania in 1979, no new nuclear power plant has been ordered in the United States since 1974.

Now the industry is coming back, “on your back,” claiming it will help reduce global warming from the greenhouse gases emitted by fossil fuels.

Pushed aggressively by President Obama and Energy Secretary Chu, who refuses to meet with longtime nuclear industry critics, here is what “on your back” means:

1. Wall Street will not finance new nuclear plants without a 100% taxpayer loan guarantee. Too risky. That’s a lot of guarantee given that new nukes cost $12 billion each, assuming no mishaps. Obama and the Congress are OK with that arrangement.

2. Nuclear power is uninsurable in the private insurance market—too risky. Under the Price-Anderson Act, taxpayers pay the greatest cost of a meltdown’s devastation.

3. Nuclear power plants and transports of radioactive wastes are a national security nightmare for the Department of Homeland Security. Imagine the target that thousands of vulnerable spent fuel rods present for sabotage.

4. Guess who pays for whatever final waste repositories are licensed? You the taxpayer and your descendants as far as your gene line persists. Huge decommissioning costs at the end of a nuclear plant’s operating life come out of ratepayers’ pockets.

5. Nuclear plant disasters present impossible evacuation burdens for those living anywhere near a plant, especially if time is short.

Imagine evacuating the long-troubled Indian Point plants 26 miles north of New York City. Workers in that region have a hard enough time evacuating their places of employment during the 5 pm rush hour. That’s one reason Secretary of State Clinton (in her time as a Senator from New York) and Governor Andrew Cuomo called for the shutdown of Indian Point.

6. Nuclear power is both uneconomical and unnecessary. It can’t compete against energy conservation, including cogeneration, wind power and ever more efficient, quicker, safer, renewable forms of providing electricity. Amory Lovins argues this point convincingly (see RMI.org). Physicist Lovins asserts that nuclear power “will reduce and retard climate protection.” His reasoning: shifting the tens of billions invested in nuclear power to efficiency and renewables would reduce far more carbon per dollar (http://www.nirs.org/factsheets/whynewnukesareriskyfcts.pdf). The country should move deliberately to shut down nuclear plants, starting with the aging and seismically threatened reactors. Peter Bradford, a former Nuclear Regulatory Commission (NRC) commissioner, has also made a compelling case against nuclear power on economic and safety grounds (http://www.nirs.org/factsheets/whynewnukesareriskyfcts.pdf).

There is far more for ratepayers, taxpayers and families near nuclear plants to find out. Here’s how you can start:

1. Demand public hearings in your communities where there is a nuke, sponsored either by your member of Congress or the NRC, to put the facts, risks and evacuation plans on the table. Insist that the critics as well as the proponents testify and cross-examine each other in front of you and the media.

2. If you call yourself conservative, ask why nuclear power requires such huge amounts of your tax dollars and guarantees and can’t buy adequate private insurance. If you have a small business that can’t buy insurance because what you do is too risky, you don’t stay in business.

3. If you are an environmentalist, ask why nuclear power isn’t required to meet a cost-efficient market test against investments in energy conservation and renewables.

4. If you understand traffic congestion, ask for an actual real life evacuation drill for those living and working 10 miles around the plant (some scientists think it should be at least 25 miles) and watch the hemming and hawing from proponents of nuclear power.

The people in northern Japan may lose their land, homes, relatives, and friends as a result of a dangerous technology designed simply to boil water. There are better ways to generate steam.

Like the troubled Japanese nuclear plants, the Indian Point plants and the four reactors at San Onofre and Diablo Canyon in California rest near earthquake faults. Seismologists concur that there is a 94% chance of a big earthquake in California within the next thirty years. Obama, Chu and the powerful nuke industry must not be allowed to force the American people to play Russian roulette!