FAIR USE NOTICE

A BEAR MARKET ECONOMICS BLOG

OCCUPY THE SCIENTIFIC METHOD


This site may contain copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in an effort to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in Section 107 of the US Copyright Law.

In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml

If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.


All Blogs licensed under Creative Commons Attribution 3.0

Thursday, November 19, 2009

Science Cracks Corn and No One Cares

ScienceDaily

Scientists Crack Corn Code: Reference Genome of Maize, Most Important US Crop


An ear of corn on the stalk in a field ready for harvesting. (Credit: iStockphoto/Jim Parkin)

ScienceDaily (Nov. 19, 2009) — A four-year, multi-institutional effort co-led by three Cold Spring Harbor Laboratory (CSHL) scientists culminated today in publication of a landmark series of papers in the journal Science revealing in unprecedented detail the DNA sequence of maize (Zea mays). Maize, or corn, as it is commonly called by North American consumers, is one of the world's most important plants and the most valuable agricultural crop grown in the United States, representing $47 billion in annual value.

The sequence spans 2.3 billion DNA base-pairs and contains some 32,500 genes, or about one-third more than the human genome, according to the team that assembled it over the last four years. This version of the maize genome -- taken from a variant called B73 -- is important, in part, because it is regarded by the scientific and agricultural communities as a "reference" version. It represents a significant filling-in of gaps in a draft maize sequence announced a year and a half ago, but more importantly, comes with what amounts to a detailed reference manual, a set of comprehensive annotations.

A scientific and practical landmark

"Both the sequence itself and the annotations are a landmark," says Doreen Ware, Ph.D., a co-principal investigator of the project whose CSHL lab focused primarily on the annotation and evolutionary analysis. Principal investigator of the Maize Genome Project of the National Science Foundation, which provided funding with the U.S. Department of Agriculture (USDA) and the U.S. Department of Energy, is Richard Wilson, Ph.D., of Washington University, St. Louis. Other co-principal investigators include scientists from the University of Arizona and Iowa State University.

In a parallel effort, Ware's CSHL team also helped generate the first so-called "HapMap" of maize in collaboration with Edward Buckler, a USDA scientist. The HapMap, a shorthand for haplotype map, gauges diversity in the maize genome by comparing 27 distinct genetic lines of the plant with the reference version. A human HapMap, prepared in conjunction with the Human Genome Project, has revealed important linkages between genetic variations and risk for major diseases in ethnically and geographically distinct human populations.

"What's important about the maize project," says W. Richard McCombie, Ph.D., CSHL Professor, co-principal investigator on the maize genome project, and a pioneer in genome sequencing efforts, "is that it provides a reference DNA sequence for the most important agricultural crop in the U.S., making it much easier for people to look at the many variants of different strains or 'accessions' of maize." New sequencing technologies, just now becoming commercially viable, will now "analyze other maize strains by comparing them to this one -- albeit at dramatically lower costs and accelerated speeds," McCombie notes.

Another of the CSHL co-project leaders, Professor Robert Martienssen, Ph.D., puts the maize sequencing project into historical perspective. Martienssen, a world leader in research on transposons -- bits of DNA that copy and insert themselves randomly across the chromosomes -- noted that transposable elements are found in all organisms, "but were discovered in maize more than 60 years ago," by CSHL's Barbara McClintock, who was honored with a Nobel Prize for the discovery in 1983. "It is a remarkable achievement to now be able to visualize transposons in such detail in the maize genome sequence," Martienssen says.

"Wonderful diversity" and its evolutionary implications

Transposons play a particularly dramatic role in the maize genome, as the sequence clearly shows. Nearly 85 percent of the genome is composed of hundreds of families of transposable elements, distributed unevenly across the 10 maize chromosomes. This is one aspect of the maize genome's complexity; another is its variability between different "individuals." Maize plants from two different strains differ more from each other, genetically, than humans differ from chimpanzees.

"The wonderful diversity that we see in maize today is the product of many things," Ware explains. "Several million years ago, the genome of the maize effectively doubled in size, to 20 chromosomes, and then, subsequently, returned to its current size of 10 chromosomes. In the detailed sequence that we now have obtained, we can begin to study the impact of that 'doubling event.'"

Ware notes that genome doubling is not uncommon in the plant kingdom, and hypothesizes that "it may be a very successful way for speciation to occur." Once an organism can draw on two full sets of essentially the same genes, it can begin "to de-evolve certain of the genes in one set and adapt them to some other function -- importantly, without compromising the gene's original function," she says. In this way, hypothetically, the plant could become more "capable," in genetic terms, over long periods of time.

The maize reference genome will also provide a basis for close investigation of the impact of human breeding and trait selection, in the much more proximate historical era since the plant's domestication, some 10,000 years ago. Maize is known to have evolved from a common grass found in Mexico and Central America called teosinte. It was human intervention -- breeding -- that led to full domestication and unimagined value and utility. As maize became ever more useful to people -- as food and animal feed -- it was carried beyond the volcanic soils of central Mexican valleys and the indigenous peoples of North and South America to the far reaches of the planet, at first by European mercantile and imperial powers of the sixteenth and seventeenth centuries. It has long been a central cultural element in the region of its initial domestication, but has since worked its way into the sinews of many other cultures. This week, for example, in schoolrooms across America, tales are being told about how "Indian corn" was served at the very first Thanksgiving meals in early-seventeenth century New England.

The genome as starting point for improving an indispensable crop

Today maize is an important, if controversial, source of biomass for a wide range of industrial applications, and, very recently, a prime source of biofuel. CSHL's Ware, who is also a scientist at the USDA, has a keen interest in thinking about maize in terms of its identity as agricultural germplasm. "What we're trying to do is identify what is best -- and keep the best in the germplasm," she says. "The 'best' will vary, depending on what the environment is. What's best in Missouri is not necessarily best in Washington state. That helps explain why having a HapMap of maize will be useful for breeders in producing improved corn plants.

"We're trying to use the genome to understand not only the differences between individual lines, but also to identify what differences, in genetic terms, are still available within maize. Ideally, we'd like to understand the function of every gene. In comparing different lines, we want to find genes associated with what we call quantitative traits -- genes that affect traits of importance to agriculture, everything from the size of the seeds to when the plant flowers to whether it can tolerate drought or dampness.

"With climate change upon us, there is great need in the years ahead to adapt existing germplasm to future needs," Ware suggests. "Will we be able to grow maize 20 years from now in the same places we do today? What will we need to do to improve this extremely valuable plant?" With the reference version of the maize genome and tools like the maize HapMap now in the public domain, the search will proceed with a new intensity, made possible by a treasure trove of new data.

The maize reference genome and the haplotype map are published online today ahead of print in the journal Science, together with a supplementary poster on the maize genome placing the plant and the sequencing project in historical and cultural perspective. In addition to CSHL's Ware, the co-lead author of the reference genome paper is Patrick S. Schnable, Ph.D., of Iowa State University. The paper is entitled "The B73 Maize Genome: Complexity, Diversity, and Dynamics." The corresponding author is Richard K. Wilson. The HapMap paper, appearing simultaneously online ahead of print in Science, is entitled "A First-Generation Haplotype Map of Maize"; the corresponding author is Edward S. Buckler. Doreen Ware and her CSHL colleagues have also played an important role in authoring a series of papers probing some of the biology underlying the maize reference genome that are being published concurrently in the journal Public Library of Science (PLoS) Genetics. Please visit: http://www.plosgenetics.org/home.action.

In addition, The DNA Learning Center of Cold Spring Harbor Laboratory has just launched a series of podcasts and short videos explaining the scientific importance of the maize genome as well as the cultural and historical significance of the maize plant itself. These can be accessed at: www.weedtowonder.org. For additional educational information about maize and genetics, please visit: www.dnalc.org

Cold Spring Harbor Laboratory (CSHL) is a private, not-for-profit research and education institution at the forefront of efforts in molecular biology and genetics to generate knowledge that will yield better diagnostics and treatments for cancer, neurological diseases and other major causes of human suffering. For more information, visit www.cshl.edu and www.maizesequence.org

One word: bioplastics

At a new plant in Iowa, MIT-rooted technology will use bacteria to turn corn into biodegradable plastics.

Anne Trafton, MIT News Office

Graphic: Christine Daniloff

Every year, more than 540 billion pounds of plastic are produced worldwide. Much of it ends up in the world’s oceans, a fact that troubles MIT biology professor Anthony Sinskey.

“Plastic does not degrade in the ocean. It just gets ground up into tiny particles,” he says. In the Pacific Ocean, a vast swath twice the size of Texas teems with tiny bits of oil-based plastic that can poison ocean life.

Sinskey can’t do much about the plastic that’s already polluting the Earth’s oceans, but he is trying to help keep the problem from getting worse. Next month, a company he founded with his former postdoc, Oliver Peoples, will open a new factory that uses MIT-patented technology to build plastic from corn. The plant aims to produce annually 110 million pounds of the new bioplastic, which biodegrades in soil or the ocean.

That’s a fraction of one percent of the United States’ overall plastic production, which totaled 101.5 billion pounds in 2008. Though it will take bioplastics a long time before they can start making a dent in that figure, the industry has significant growth potential, says Melissa Hockstad, vice president for science, technology and regulatory affairs for SPI: The Plastics Industry Trade Association.

“Bioplastics are making inroads into new markets and are an important area to watch for the future of the plastics industry,” says Hockstad, who noted that the current global market for biodegradable polymers is estimated at about 570 million pounds per year but is expected to more than double by 2012.

‘Timing is everything’

For Sinskey and Peoples, the road started 25 years ago. Peoples, who had just earned his PhD in molecular biology from the University of Aberdeen, arrived in Sinskey’s lab in 1984 and set out to sequence a bacterial gene. Today, high-speed sequencing machines could do the job in about a week. Back then, it took three years.

That gene, from the bacterium R. eutropha, turned out to code for an enzyme that allows bacteria to produce polyhydroxyalkanoate (PHA) — a naturally occurring form of polyester — starting with only sunlight, water, and a carbon source. (Bacteria normally manufacture PHA as a way to store carbon and energy.)

Sinskey and Peoples realized that if they could ramp up the bacteria's plastic-producing abilities, they could harness the organisms for industrial use. In 1994, they started a company called Metabolix and obtained exclusive licenses from MIT to the patents on the gene work they had done on PHA-synthesizing bacteria.

Thus began a 15-year effort to develop the technology into a robust, large-scale process, and to win support for such an approach.

On the scientific side, Peoples and the scientists at Metabolix developed a method to incorporate several genes from different bacteria into a strain of E. coli. Using this process, now called metabolic engineering, they eventually created a strain that produces PHA at levels several-fold higher than naturally occurring bacteria.

However, they had some difficulty generating support (and funding) for the idea. In the early 1990s, the public was not very receptive to the idea of alternative plastics. “Oil was $20 a barrel, and people didn’t believe in global warming,” Peoples recalls.

“Timing is everything,” says Sinskey. “There has to be a market for these materials” for them to be successful.

‘Growing interest’

The scientists believe that consumers are now ready for bioplastics. Such plastics have been commercially available for about a decade, mostly in the form of plastic cups, bottles and food packaging. Most of those products are made from a type of plastic called polylactic acid (PLA), which is also produced from corn. PLA is similar to PHA, but PHA has higher heat resistance, according to Peoples.

Possible uses for the Metabolix bioplastics include packaging, agricultural film, compost bags, business equipment and consumer products such as personal care products, gift cards and pens. Products like these, along with existing bioplastic products, tap into a “growing interest in materials that can be made from renewable resources or disposed of through practices such as composting,” says Hockstad.

The new Metabolix plant, located in Clinton, Iowa, is a joint venture with Archer Daniels Midland. Metabolix is also working to engineer crops — including switchgrass — that will grow the plastic directly within the plant.

Turning to those agricultural starting materials could help reduce the amount of petroleum needed to manufacture traditional plastics, which currently requires about 2 million barrels of oil per day (10 percent of total U.S. daily oil consumption). “It’s important to develop alternative ways to make these chemicals,” says Peoples.

Monday, November 16, 2009

In SUSY we trust: What the LHC is really looking for

Physics & Math



Forget the God particle - the rebooted Large Hadron Collider will give us much greater revelations


This simulation depicts the decay of a Higgs particle following a collision of two protons in the CMS experiment (Image: CMS)


AS DAMP squibs go, it was quite a spectacular one. Amid great pomp and ceremony - not to mention dark offstage rumblings that the end of the world was nigh - the Large Hadron Collider (LHC), the world's mightiest particle smasher, fired up in September last year. Nine days later a short circuit and a catastrophic leak of liquid helium ignominiously shut the machine down.

Now for take two. Any day now, if all goes to plan, proton beams will start racing all the way round the ring deep beneath CERN, the LHC's home on the outskirts of Geneva, Switzerland.

Nobel laureate Steven Weinberg is worried. It's not that he thinks the LHC will create a black hole that will engulf the planet, or even that the restart will end in a technical debacle like last year's. No: he's actually worried that the LHC will find what some call the "God particle", the popular and embarrassingly grandiose moniker for the hitherto undetected Higgs boson.

"I'm terrified," he says. "Discovering just the Higgs would really be a crisis."

Why so? Evidence for the Higgs would be the capstone of an edifice that particle physicists have been building for half a century - the phenomenally successful theory known simply as the standard model. It describes all known particles, as well as three of the four forces that act on them: electromagnetism and the weak and strong nuclear forces.

It is also manifestly incomplete. We know from what the theory doesn't explain that it must be just part of something much bigger. So if the LHC finds the Higgs and nothing but the Higgs, the standard model will be sewn up. But then particle physics will be at a dead end, with no clues where to turn next.

Hence Weinberg's fears. However, if the theorists are right, before it ever finds the Higgs, the LHC will see the first outline of something far bigger: the grand, overarching theory known as supersymmetry. SUSY, as it is endearingly called, is a daring theory that doubles the number of particles needed to explain the world. And it could be just what particle physicists need to set them on the path to fresh enlightenment.

So what's so wrong with the standard model? First off, there are some obvious sins of omission. It has nothing whatsoever to say about the fourth fundamental force of nature, gravity, and it is also silent on the nature of dark matter. Dark matter is no trivial matter: if our interpretation of certain astronomical observations is correct, the stuff outweighs conventional matter in the cosmos by more than 4 to 1.

Ironically enough, though, the real trouble begins with the Higgs. The Higgs came about to solve a truly massive problem: the fact that the basic building blocks of ordinary matter (things such as electrons and quarks, collectively known as fermions) and the particles that carry forces (collectively called bosons) all have a property we call mass. Theories could see no rhyme or reason in particles' masses and could not predict them; they had to be measured in experiments and added into the theory by hand.

These "free parameters" were embarrassing loose threads in the theories that were being woven together to form what eventually became the standard model. In 1964,Peter Higgs of the University of Edinburgh, UK, and François Englert and Robert Brout of the Free University of Brussels (ULB) in Belgium independently hit upon a way to tie them up.

That mechanism was an unseen quantum field that suffuses the entire cosmos. Later dubbed the Higgs field, it imparts mass to all particles. The mass an elementary particle such as an electron or quark acquires depends on the strength of its interactions with the Higgs field, whose "quanta" are Higgs bosons.

Fields like this are key to the standard model as they describe how the electromagnetic and the weak and strong nuclear forces act on particles through the exchange of various bosons - the W and Z particles, gluons and photons. But the Higgs theory, though elegant, comes with a nasty sting in its tail: what is the mass of the Higgs itself? It should consist of a core mass plus contributions from its interactions with all the other elementary particles. When you tot up those contributions, the Higgs mass balloons out of control.

The experimental clues we already have suggest that the Higgs's mass should lie somewhere between 114 and 180 gigaelectronvolts - between 120 and 190 times the mass of a proton or neutron, and easily the sort of energy the LHC can reach. Theory, however, comes up with values 17 or 18 orders of magnitude greater - a catastrophic discrepancy dubbed "the hierarchy problem". The only way to get rid of it in the standard model is to fine-tune certain parameters with an accuracy of 1 part in 10^34, something that physicists find unnatural and abhorrent.
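To see why the tuning is so severe, here is a schematic, textbook-style sketch (not from the article; numerical coefficients omitted). Each fermion f that couples to the Higgs with strength $\lambda_f$ shifts the squared Higgs mass by a loop contribution that grows with the cutoff scale $\Lambda$, the energy up to which the standard model is assumed to hold:

$$m_H^2 \;\approx\; m_{H,\mathrm{bare}}^2 \;-\; \frac{\lambda_f^2}{16\pi^2}\,\Lambda^2 \;+\; \cdots$$

If $\Lambda$ sits near the Planck scale of about $10^{19}$ gigaelectronvolts while $m_H$ must come out near $10^2$ gigaelectronvolts, the bare term has to cancel the correction to roughly one part in $(10^{19}/10^{2})^2 = 10^{34}$ - the fine-tuning described above. Supersymmetry, discussed below, pairs each fermion loop with a boson loop of the opposite sign, so the $\Lambda^2$ pieces cancel.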

Three into one

The hierarchy problem is not the only defect in the standard model. There is also the problem of how to reunite all the forces. In today's universe, the three forces dealt with by the standard model have very different strengths and ranges. At a subatomic level, the strong force is the strongest, the weak the weakest and the electromagnetic force somewhere in between.

Towards the end of the 1960s, though, Weinberg, then at Harvard University, showed with Abdus Salam and Sheldon Glashow that this hadn't always been the case. At the kind of high energies prevalent in the early universe, the weak and electromagnetic forces have one and the same strength; in fact they unify into one force. The expectation was that if you extrapolated back far enough towards the big bang, the strong force would also succumb, and be unified with the electromagnetic and weak force in one single super-force (see graph).

In 1974 Weinberg and his colleagues Helen Quinn and Howard Georgi showed that the standard model could indeed make that happen - but only approximately. Hailed initially as a great success, this not-so-exact reunification soon began to bug physicists working on "grand unified theories" of nature's interactions.

It was around this time that supersymmetry made its appearance, debuting in the work of Soviet physicists Yuri Golfand and Evgeny Likhtman that never quite made it to the west. It was left to Julius Wess of Karlsruhe University in Germany and Bruno Zumino of the University of California, Berkeley, to bring its radical prescriptions to wider attention a few years later.

Wess and Zumino were trying to apply physicists' favourite simplifying principle, symmetry, to the zoo of subatomic particles. Their aim was to show that the division of the particle domain into fermions and bosons is the result of a lost symmetry that existed in the early universe.

According to supersymmetry, each fermion is paired with a more massive supersymmetric boson, and each boson with a fermionic super-sibling. For example, the electron has the selectron (a boson) as its supersymmetric partner, while the photon is partnered with the photino (a fermion). In essence, the particles we know now are merely the runts of a litter double the size (see diagram).

The key to the theory is that in the high-energy soup of the early universe, particles and their super-partners were indistinguishable. Each pair co-existed as single massless entities. As the universe expanded and cooled, though, this supersymmetry broke down. Partners and super-partners went their separate ways, becoming individual particles with a distinctive mass all their own.

Supersymmetry was a bold idea, but one with seemingly little to commend it other than its appeal to the symmetry fetishists. Until, that is, you apply it to the hierarchy problem. It turned out that supersymmetry could tame all the pesky contributions from the Higgs's interactions with elementary particles, the ones that cause its mass to run out of control. They are simply cancelled out by contributions from their supersymmetric partners. "Supersymmetry makes the cancellation very natural," says Nathan Seiberg of the Institute for Advanced Study in Princeton.

That wasn't all. In 1981 Georgi, together with Savas Dimopoulos of Stanford University, redid the force reunification calculations that he had done with Weinberg and Quinn, but with supersymmetry added to the mix. They found that the curves representing the strengths of all three forces could be made to come together with stunning accuracy in the early universe. "If you have two curves, it's not surprising that they intersect somewhere," says Weinberg. "But if you have three curves that intersect at the same point, then that's not trivial."

This second strike for supersymmetry was enough to convert many physicists into true believers. But it was when they began studying some of the questions raised by the new theory that things became really interesting.

One pressing question concerned the present-day whereabouts of supersymmetric particles. Electrons, photons and the like are all around us, but of selectrons and photinos there is no sign, either in nature or in any high-energy accelerator experiments so far. If such particles exist, they must be extremely massive indeed, requiring huge amounts of energy to fabricate.

Such huge particles would long since have decayed into a residue of the lightest, stable supersymmetric particles, dubbed neutralinos. Still massive, the neutralino has no electric charge and interacts with normal matter extremely timorously by means of the weak nuclear force. No surprise, then, that it has eluded detection so far.

When physicists calculated exactly how much of the neutralino residue there should be, they were taken aback. It was a huge amount - far more than all the normal matter in the universe.

Beginning to sound familiar? Yes, indeed: it seemed that neutralinos fulfilled all the requirements for the dark matter that astronomical observations persuade us must dominate the cosmos. A third strike for supersymmetry.

Each of the three questions that supersymmetry purports to solve - the hierarchy problem, the reunification problem and the dark-matter problem - might have its own unique answer. But physicists are always inclined to favour an all-purpose theory if they can find one. "It's really reassuring that there is one idea that solves these three logically independent things," says Seiberg.


Supersymmetry's scope does not end there. As Seiberg and his Princeton colleague Edward Witten have shown, the theory can also explain why quarks are never seen on their own, but are always corralled together by the strong force into larger particles such as protons and neutrons. In the standard model, there is no mathematical indication why that should be; with supersymmetry, it drops out of the equations naturally. Similarly, mathematics derived from supersymmetry can tell you how many ways you can fold a four-dimensional surface, an otherwise intractable problem in topology.

All this seems to point to some fundamental truth locked up within the theory. "When something has applications beyond those that you designed it for, then you say, 'well this looks deep'," says Seiberg. "The beauty of supersymmetry is really overwhelming."

Sadly, neither mathematical beauty nor promise is enough on its own. You also need experimental evidence. "It is embarrassing," says Michael Dine of the University of California, Santa Cruz. "It is a lot of paper expended on something that is holding on by these threads."

Circumstantial evidence for supersymmetry might be found in various experiments designed to find and characterise dark matter in cosmic rays passing through Earth. These include the Cryogenic Dark Matter Search experiment inside the Soudan Mine in northern Minnesota and the Xenon experiment beneath the Gran Sasso mountain in central Italy. Space probes like NASA's Fermi satellite are also scouring the Milky Way for the telltale signs expected to be produced when two neutralinos meet and annihilate.

The best proof would come, however, if we could produce neutralinos directly through collisions in an accelerator. The trouble is that we are not entirely sure how muscular that accelerator would need to be. The mass of the super-partners depends on precisely when supersymmetry broke apart as the universe cooled and the standard particles and their super-partners parted company. Various versions of the theory have not come up with a consistent timing. Some variants even suggest that certain super-partners are light enough to have already turned up in accelerators such as the Large Electron-Positron collider - the LHC's predecessor at CERN - or the Tevatron collider in Batavia, Illinois. Yet neither accelerator found anything.

The reason physicists are so excited about the LHC, though, is that the kind of supersymmetry that best solves the hierarchy problem will become visible at the higher energies the LHC will explore. Similarly, if neutralinos have the right mass to make up dark matter, they should be produced in great numbers at the LHC.

Since the accident during the accelerator's commissioning last year, CERN has adopted a softly-softly approach to the LHC's restart. For the first year it will smash together two beams of protons with a total energy of 7 teraelectronvolts (TeV), half its design energy. Even that is quite a step up from the 1.96 TeV that the Tevatron, the previous record holder, could manage. "If the heaviest supersymmetric particles weigh less than a teraelectronvolt, then they could be produced quite copiously in the early stages of LHC's running," says CERN theorist John Ellis.

If that is so, events after the accelerator is fired up again could take a paradoxical turn. The protons that the LHC smashes together are composite particles made up of quarks and gluons, and produce extremely messy debris. It could take rather a long time to dig the Higgs out of the rubble, says Ellis.

Any supersymmetric particles, on the other hand, will decay in as little as 10^-16 seconds into a slew of secondary particles, culminating in a cascade of neutralinos. Because neutralinos barely interact with other particles, they will evade the LHC's detectors. Paradoxically, this may make them relatively easy to find as the energy and momentum they carry will appear to be missing. "This, in principle, is something quite distinctive," says Ellis.

So if evidence for supersymmetry does exist in the form most theorists expect, it could be discovered well before the Higgs particle, whose problems SUSY purports to solve. Any sighting of something that looks like a neutralino would be very big news indeed. At the very least it would be the best sighting yet of a dark-matter particle. Even better, it would tell us that nature is fundamentally supersymmetric.

There is a palpable sense of excitement about what the LHC might find in the coming years. "I'll be delighted if it is supersymmetry," says Seiberg. "But I'll also be delighted if it is something else. We need more clues from nature. The LHC will give us these clues."

Blood brothers?

String theory and supersymmetry are two as-yet unproved theories about the make-up of the universe. But they are not necessarily related.

It is true that most popular variants of string theory take a supersymmetric universe as their starting point. String theorists, who have taken considerable flak for advocating a theory that has consistently struggled to make testable predictions, will breathe a huge sigh of relief if supersymmetry is found.

That might be premature: the universe could still be supersymmetric without string theory being correct. Conversely, at the kind of energies probed by the LHC, it is not clear that supersymmetry is a precondition for string theory. "It is easier to understand string theory if there is supersymmetry at the LHC," says Edward Witten, a theorist at the Institute for Advanced Study in Princeton, "but it is not clear that it is a logical requirement."

If supersymmetry does smooth the way for string theory, however, that could be a decisive step towards a theory that solves the greatest unsolved problem of physics: why gravity seems so different to all the rest of the forces in nature. If so, supersymmetry really could have all the answers.

Anil Ananthaswamy is a consulting editor for New Scientist

Issue 2734 of New Scientist magazine


Common cold may hold off swine flu



Keeping swine flu at bay  (Image: Leander Baerenz/Getty)



A VIRUS that causes the common cold may be saving people from swine flu. If this intriguing idea turns out to be true, it would explain why swine flu's autumn wave has been slow to take off in some countries and point to new ways to fight flu.

"It is really surprising that there has not been more pandemic flu activity in many European countries," says Arnold Monto, an epidemiologist at the University of Michigan, Ann Arbor.


In France, flu cases rose in early September, then stayed at about 160 per 100,000 people until late October, when numbers started rising again. The delayed rise was puzzling, says Jean-Sebastien Casalegno of the French national flu lab at the University of Lyon.

He reports that the percentage of throat swabs from French respiratory illnesses that tested positive for swine flu fell in September, while at the same time rhinovirus, which causes colds, rose (Eurosurveillance, vol 14, p 19390). He told New Scientist that in late October, rhinovirus fell - at the same time as flu rose. He suspects rhinovirus may have blocked the spread of swine flu via a process called viral interference.

This is thought to occur when one virus blocks another. "We think that when you get one infection, it turns on your antiviral defences, and excludes the other viruses," says Ab Osterhaus at the University of Rotterdam in the Netherlands.

How important such interference is in viral epidemics is unclear, however: there are also cases in which there is no interference, and people catch two viruses at the same time. Normally, we don't get a chance to see how rhinovirus affects flu, as flu epidemics usually strike in winter, whereas rhinovirus hits when schools start (late summer in the northern hemisphere).

But this year the pandemic meant flu came early - and France isn't the only country in which rhinovirus seems to have held it at bay. In Eurosurveillance last month, Mia Brytting of the Swedish Institute for Infectious Disease Control in Solna reported a rise in rhinovirus coupled with a swine flu lull just after school resumed in Sweden at the end of August (see graph). She too says rhinovirus has now fallen, as flu has climbed. Researchers in Norway report rhinovirus rose there as flu fell in August, while Ian Mackay at the University of Queensland found the same trend in Australia.

What's more, in March, Mackay reported that people with rhinovirus are less likely to be infected with a second virus than people with other viruses, and are just one-third as likely to have simultaneous seasonal flu (Journal of Clinical Virology, DOI: 10.1016/j.jcv.2009.03.008).

So why hasn't the US, for example, seen a dip in pandemic cases during a back-to-school rhinovirus outbreak? Mackay speculates that interference from rhinovirus may not be enough to fend off flu if someone is exposed repeatedly. There were far more cases of swine flu in the US in September than in Europe.

The effects of rhinovirus, often dismissed as "only" a cold, are still poorly understood, the researchers say. Its seeming ability to block swine flu may already have saved lives in France by buying the nation time before the vaccine arrived. It may even lead to a drug that induces the antiviral state, but without the sniffles.

Issue 2734 of New Scientist magazine

Time-travelling browsers navigate the web's past




15:41 16 November 2009 by Paul Marks

Time-travel computing past (Image: Flying Colours Ltd/Getty)



Finding old versions of web pages could become far simpler thanks to a "time-travelling" web browsing technology being pioneered at the Los Alamos National Laboratory in New Mexico.

Bookmarking a page takes you to its current version – but earlier ones are harder to find (to see an award-winning 1990s incarnation of newscientist.com, see our gallery of web pages past). One option is to visit a resource like the Internet Archive's Wayback Machine. There, you key in the URL of the site you want and are confronted with a matrix of years and dates for old pages that have been cached. Or, if you want to check how a Wikipedia page has evolved, you can hit the "history" tab on a page of interest and scroll through in an attempt to find the version of the page on the day you're interested in.

It's a lot of hassle. But it shouldn't be, says Herbert Van de Sompel, a computer scientist at Los Alamos. "Today we treat the web like a library in which you have to know how to go and search for things. We've a better way."

That "better way" is a system that gives browsers a "time-travel" mode, allowing users to find web pages from particular dates and times without having to navigate through archives.

Total recall

Called Memento, the system Van de Sompel is developing alongside colleagues from Old Dominion University in Norfolk, Virginia, harnesses a function of the hypertext transfer protocol (HTTP) – the system which underpins the world wide web by defining how web pages are formatted and transmitted from servers to browsers.

One of HTTP's standard functions is called content negotiation. This allows one URL to send multiple types of data, depending on the settings of the browser that contacts the URL: for instance, a browser in France accessing a URL may retrieve an HTML page in French, while accessing the same URL from the US may deliver an English version.

"Your browser does this negotiation all the time, but you don't notice it," says Van de Sompel. But HTTP content negotiation is not limited to arbitrating between media formats and languages – it can cope with any data type. So the team are adding another dimension to page requests: date and time.

"In addition to language and media type, we negotiate in time. So Memento asks the server not for today's version of this page, but how it looked one year ago, for instance," says Van de Sompel.

Browsing the past

Memento comprises both server and browser software. On a server running the open-source Apache web system, just four lines of extra code are needed to build in date-and-time negotiation. On the browser, a drop-down menu will let users enter the date and time for which they want to view a page.

So far, the team has developed a Memento plug-in for the open-source Firefox browser, plus a "hacked" version of Firefox with built-in Memento capability. Web pages need no extra features: the web server just needs to intercept the date-time requests of users. A demonstration of what Memento can do is available for any browser.

Of course, the whole idea requires website owners to store many more time-stamped versions of their pages than they do now, but the team think Memento will encourage them to do this.

"I would love to see Memento supported," says Van de Sompel. "It would be such fun to set our browsers back in time and just browse the past."

Dig deep

Jakob Voss, a developer with the Common Library Network in Göttingen, Germany, is an early Memento user – and he is already advocating use of Memento for sites with frequently updated pages like Wikipedia.

"Memento is only a proof of concept but it looks very promising and could be a great enhancement to the web. There is little support in today's browsers for digging into archives, especially those with dynamic content management systems like wikis and weblogs," Voss says.

"Tracking versions, and the provenance of web information, is becoming more and more important and Memento could help manage this complex task."

He's not alone in that view. Ian Jacobs, a spokesman for the World Wide Web Consortium in Boston, Massachusetts, agrees that "URL persistence" is a valuable aim – and that users should be able to browse the latest version of a page or one on a given date.

"The browser should allow the user to choose," says Jacobs.

Van de Sompel is presenting the Memento technology today at a meeting of the National Digital Information Infrastructure and Preservation Program at the Library of Congress in Washington DC.

Journal reference: arXiv:0911.1112v2

'Doomsday' 2012 Prediction Explained: Mayan Calendar Was Cyclical

ScienceDaily




ScienceDaily (Nov. 14, 2009) — Contrary to what the latest Hollywood blockbuster movie would suggest, the world will NOT end on Dec. 21, 2012, according to Ann Martin, a doctoral candidate in Cornell University's department of astronomy. Her research focuses on the hydrogen content of galaxies in the nearby universe.

The Mayan calendar was designed to be cyclical, so the fact that the long count comes to an end in December 2012 is really of no consequence, according to Martin. It is simply the end of a great calendar cycle in Mayan society, much as our modern society celebrated the new millennium. It does not mean that the "world will end." In fact, the Mayan calendar does not end then, and there is no evidence to suggest that the Mayans -- or anyone, for that matter -- had knowledge of the world's demise.

For the past three years, Martin has been a volunteer with Cornell's "Curious? Ask an Astronomer" service, a Web site founded by astronomy graduate students in 1997.

"Curious? Ask an Astronomer" features the answers to over 750 frequently asked astronomy questions, and readers who can't find their answers there can submit a new question and receive a personal answer from a graduate student volunteer.

For further information see: http://curious.astro.cornell.edu/question.php?number=686

What's going to happen on December 21st 2012?

Will the world end on 21st December 2012 because of the end of the Mayan calendar or because the winter solstice is "aligned" with the Milky Way?

It appears that Mayan ideas about timekeeping and calendars were very cyclical. This is actually easy to understand, because it's quite like our own calendar, which has cycles of various sizes very familiar to us. For example, there is a:

  • 1 Jan every year
  • day 1 every month
  • Monday every week
  • 1am every day

The Mayan cycles were a bit more complex, such that every day in a 52-year period had a unique name formed from a combination of various different cycles (similar to the idea that there is a Monday 1st January only every 7 years or so). This 52-year cycle is called a Calendar Round. To keep track of dates on longer time scales the Mayans then had what's known as the Long Count, which provides a unique numerical indicator for each day. The Mayans did not count in base ten like we do, but usually in base 20 (although not always). The Mayan Long Count can be summarized as:

#days    Mayan count
1        1 kin
20       20 kin = 1 uinal
360      360 kin = 18 uinal = 1 tun
7200     7200 kin = 360 uinal = 20 tun = 1 kactun
144000   144000 kin = 7200 uinal = 400 tun = 20 kactun = 1 bactun
The name for a Mayan epoch apparently translates as 13 bactuns, which you can see is 13*144000 days or 5125.26 years (roughly). There is actually some minor disagreement over when the current Mayan long cycle started, but it was probably either August 11th or 13th 3114 BC, which means it comes to an end on either Dec 21st or 23rd 2012.
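As a quick check of that arithmetic, here is a minimal sketch (mine, not from the original answer) that converts a raw day count into Long Count units using the place values from the table above:

```python
# Place values from the table above: 20 kin = 1 uinal, 18 uinal = 1 tun,
# 20 tun = 1 kactun, 20 kactun = 1 bactun.
UNITS = [("bactun", 144000), ("kactun", 7200), ("tun", 360), ("uinal", 20), ("kin", 1)]

def long_count(days):
    """Break a day count into Long Count units, largest first."""
    parts = []
    for name, size in UNITS:
        count, days = divmod(days, size)
        parts.append("%d %s" % (count, name))
    return ", ".join(parts)

print(long_count(13 * 144000))  # the full epoch: "13 bactun, 0 kactun, ..."
print(13 * 144000 / 365.25)     # 1,872,000 days, or about 5125.26 years, as above
```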

As I mentioned above the Mayan calendar was designed to be cyclical, so the fact that the long count comes to an end in Dec 2012, while having some significance for the Maya as the end of a great cycle (much like we celebrated the millennium (incorrectly as it happens) on Dec 31st 1999), does not mean that the "world will come to an end". It's actually true that there are Mayan names for periods of time longer than 13 bactuns, so that their calendar doesn't even end then, and even if it did there is no evidence to suggest that they (or anyone for that matter) have any special knowledge about the end of the world.

There are, however, a lot of theories knocking around the Internet which use the end of this calendar cycle to predict the end of the world. They often also mention the fact that Dec 21st is the winter solstice, and that the Sun on the solstice that year is "aligned" with the plane of the galaxy.

On the winter solstice, the Sun always has a Declination of -23.5 degrees and a Right Ascension of 18 hours, but exactly where this is on the sky relative to more distant stars changes very slowly due to the "precession of the equinoxes". We have a posted answer explaining this effect; what matters here (and how the effect was first noticed) is that it moves the positions of the equinoxes and solstices westward along the ecliptic, carrying them in a complete circle around the sky with a period of 26,000 years. So the position of the winter solstice moves 360 degrees in 26,000 years, which works out to 360/26000, or about 0.014 degrees a year. Defining an exact boundary for the plane of the Milky Way is tough, but it's at least 10-20 degrees wide across much of the sky, meaning that the solstice can be described as being "in the plane of the Milky Way" for 700-1400 years!

To put it another way, the winter solstice that just passed (2005) was only 0.1 degrees away from where it will be in 2012, a distance smaller than the size of the Sun itself (which is about 0.5 degrees in diameter). In any case, the Sun crosses the plane of the Galaxy twice every year as we orbit around it, with no ill effect on Earth.
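Those figures are easy to verify; a quick sketch (the 10- and 20-degree values are the plane-width range quoted above):

```python
# Checking the precession arithmetic quoted above.
period = 26000             # years for one full 360-degree precession cycle
rate = 360.0 / period      # about 0.014 degrees per year
print(rate)

# If the Milky Way's plane is 10-20 degrees wide on the sky, the solstice
# point stays "in the plane" for roughly width/rate years:
for width in (10, 20):
    print(width, "degrees wide ->", round(width / rate), "years")
# prints about 722 and 1444 years, i.e. the 700-1400-year range given above
```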

To conclude:

  • The Mayan calendar does not predict the end of the world on Dec 21st 2012.
    1. The exact date of the end of the current Mayan Long Count is still a matter of debate amongst Mayan scholars, although it is likely to be around Dec 21 2012.
    2. The Mayan calendar is cyclical, and there are names for cycles longer than the 13 bactuns of the Long Count which are coming to an end in 2012.
    3. Even if the Mayans did believe that the world would come to an end at the end of the Long Count (which I don't believe is true), there is no reason to assume that they had any special knowledge which would allow them to make this prediction correctly. You are free to believe the Sun won't come up tomorrow, but it will anyway....

  • The fact that the winter solstice in 2012 is "aligned" with the plane of the Galaxy has no significance.
    1. It takes the winter solstice 700-1400 years to cross the plane of the Galaxy.
    2. The solstice in 2005 was within 0.1 degrees (or 1/5th the size of the Sun) of where it will be in 2012.
    3. The Sun crosses the plane of the Milky Way twice every year with no ill effect.

Much of the information about Mayan calendars I got from the Mayan Calendars article from Wikipedia.

For another detailed answer covering these and other points relating to Mayan calendar predictions and Astronomical events see here.

Alzheimer's Researchers Find High Protein Diet Shrinks Brain

ScienceDaily





Researchers studying Alzheimer's disease found that, unexpectedly, a high protein diet apparently led to a smaller brain. (Credit: iStockphoto/Kelly Cline)

ScienceDaily (Oct. 21, 2009) — One of the many reasons to pick a low-calorie, low-fat diet rich in vegetables, fruits, and fish is that a host of epidemiological studies have suggested that such a diet may delay the onset or slow the progression of Alzheimer's disease (AD). Now a study published in BioMed Central's open access journal Molecular Neurodegeneration tests the effects of several diets, head-to-head, for their effects on AD pathology in a mouse model of the disease. Although the researchers were focused on triggers for brain plaque formation, they also found that, unexpectedly, a high protein diet apparently led to a smaller brain.

A research team from the US, Canada, and the UK tested four different diets on a transgenic mouse model of AD, which expresses a mutant form of the human amyloid precursor protein (APP). APP's role in the brain is not fully understood; however, it is of great interest to AD researchers because the body uses it to generate the amyloid plaques typical of Alzheimer's. These mice were fed either

  1. a regular diet,
  2. a high fat/low carbohydrate custom diet,
  3. a high protein/low carb version or
  4. a high carbohydrate/low fat option.

The researchers then looked at the brain and body weight of the mice, as well as plaque build-up and differences in the structure of several brain regions that are involved in the memory defect underlying AD.

Unexpectedly, mice fed a high protein/low carbohydrate diet had brains five percent lighter than all the others, and regions of their hippocampus were less developed. This result was a surprise, and, until researchers test this effect on non-transgenic mice, it is unclear whether the loss of brain mass is associated with AD-type plaque. But some studies in the published literature led the authors to put forward a tentative theory that a high protein diet may leave neurones more vulnerable to AD plaque. Mice on a high fat diet had raised levels of plaque proteins, but this had no effect on plaque burden.

Aside from transgenic mice, the pressing question is whether these data have implications for the human brain. "Given the previously reported association of high protein diet with aging-related neurotoxicity, one wonders whether particular diets, if ingested at particular ages, might increase susceptibility to incidence or progression of AD," says lead author Sam Gandy, a professor at The Mount Sinai School of Medicine in New York City and a neurologist at the James J. Peters Veterans Affairs Medical Center in the Bronx, NY. The only way to know for sure would be prospective randomised double-blind clinical diet trials. According to Gandy, "This would be a challenging undertaking but potentially worthwhile, if there is a real chance that the ravages of AD might be slowed or avoided through healthy eating." Such trials will be required if scientists are ever to make specific recommendations about dietary risks for AD.

Story Source:

Adapted from materials provided by BioMed Central, via EurekAlert!, a service of AAAS.

Cinderella fruit: Wild delicacies become cash crops





IF YOU had come here 10 years ago, says Thaddeus Salah as he shows us round his tree nursery in north-west Cameroon, you would have seen real hunger and poverty. "In those times," he says, "we didn't have enough chop to eat." It wasn't just food - "chop" in the local dialect - that his family lacked. They couldn't afford school fees, healthcare or even chairs for their dilapidated grass-thatch house.

There is a big future for some of Africa's native fruit (Image: Charlie Pye-Smith)


Salah's fortunes changed in 2000 when he and his neighbours learned how to identify the best wild fruit trees and propagate them in a nursery. "Domesticating wild fruit like bush mango has changed our lives," he says. His family now has "plenty chop", as he puts it. He is also earning enough from the sale of indigenous fruit trees to pay school fees for four of his children. He has been able to re-roof his house with zinc sheets and buy goods he could only dream of owning before. He even has a mobile phone.

From Salah's farm we gaze across the intensively cultivated hills which roll away towards the Nigerian border. "Ten years ago, you'd hardly see any safou [African plum, Dacryodes edulis] in this area," says Zachary Tchoundjeu, a botanist at the World Agroforestry Centre's regional office in the Cameroonian capital Yaoundé. "Now you see them growing everywhere."

The spread of African plum through these hills is one small part of a bigger movement that could change the lives of millions of Africans. The continent is home to some 3000 species of wild fruit tree, many of which are ripe for domestication. Chocolate berries, gingerbread plums, monkey oranges, gumvines, tree grapes and a host of others could soon play a role in ensuring dependable food supplies in areas now plagued by malnutrition (see "Future fruits of the forest").

One of the architects of the programme is Roger Leakey, a former director of research at the World Agroforestry Centre. He calls these fruit trees "Cinderella species": their attributes may have gone unrecognised by science and big business, but the time has come for them to step into the limelight.

"The last great round of crop domestication took place during the green revolution [in the mid-20th century], which developed high-yielding varieties of starchy staples such as rice, maize and wheat," says Leakey. "This new round could scarcely be more different." Sparsely funded and largely ignored by agribusiness, high-tech labs and policy-makers, it is a peasant revolution taking place in the fields of Africa's smallholders.

The revolution has its roots in the mid-1990s, when researchers from the World Agroforestry Centre conducted a series of surveys in west Africa, southern Africa and the Sahel to establish which indigenous trees were most valued by local people. "We were startled by the results," says Tchoundjeu. "We were expecting people to point to commercially important timber species, but what they valued most were indigenous fruit trees."

In response to this unexpected finding, the World Agroforestry Centre launched a fruit tree domestication programme in 1998. It began by focusing on a handful of species, including bush mango (Irvingia gabonensis), an indigenous African species unrelated to the Indian mango, African plum - not actually a plum but a savoury, avocado-like fruit sometimes called an afrocado - and a nut tree known locally as njansan (Ricinodendron heudelotii). Though common in the forests and as wild trees on farms, they were almost unknown to science. "We knew their biological names, but that was about all," says Ebenezar Asaah, a tree specialist at the World Agroforestry Centre. "We had no idea how long it took for them to reach maturity and produce fruit, and we knew nothing about their reproductive behaviour." Local people, in contrast, knew a good deal about them, as the trees' fruits have long been part of their diet.

Rural Africans consume an enormous variety of wild foodstuffs. In Cameroon, fruits and seeds from around 300 indigenous trees are eaten, according to a study by researchers at Cameroon's University of Dschang. A similar survey in Malawi and Zambia found that up to 40 per cent of rural households rely on indigenous fruits to sustain them during the "hungry months", particularly January and February, when supplies in their granaries are exhausted and they are waiting for their next harvest (Acta Horticulturae, vol 632, p 15).

Some of these so-called "famine foods" have already been domesticated by accident, says ethnoecologist Anthony Cunningham of People and Plants International, an NGO based in Essex Junction, Vermont. He cites the example of marula (Sclerocarya birrea), a southern African tree in the cashew family with edible nutty seeds encased in a tart, turpentine-flavoured fruit. "Long before the development of agricultural crops, hunter-gatherers were eating marula fruit," he says. "They'd pick the best fruit, then scatter the seeds around their camps." These would eventually germinate and mature into fruit-bearing trees, ensuring, in evolutionary terms, the survival of the tastiest. Marula is now fully domesticated and the fruit is used to make juice, a liqueur called Amarula Cream and cosmetic oils.

Hard graft

Likewise, generations of farmers in west Africa have selected and eaten the tastiest varieties of African plum and bush mango, planted their seeds and traded the seedlings - to such an extent that these trees are now widely grown. However, this is a haphazard and unscientific way to domesticate plants.

The planned domestication programme in Cameroon, initially led by Leakey and Kate Schreckenberg of the Overseas Development Institute in London, began with an analysis of the traits most appreciated in the villages. Unsurprisingly, farmers wanted trees that produce lots of large, sweet fruit as quickly as possible. So the researchers asked the farmers to show them their favourite wild trees, and took samples so they could propagate their own. Farmers also received training in horticultural techniques, such as grafting, used to propagate superior varieties.

Initially, many villagers viewed the techniques with suspicion. "People said this was white man's witchcraft, and at first they didn't want anything to do with it," says Florence Ayire, a member of a women's group in Widikum, Northwest Province. They changed their tune, however, once they saw how her grafted fruit trees - created by splicing material from a superior tree onto one which lacks the desired traits - flourished. Now they all want to learn, she says.

This isn't the only technique farmers are learning. They are also being trained how to clone superior trees by taking cuttings - one of the best ways of producing large numbers of genetically identical plants - and how to do marcotting, which involves peeling away bark from a branch and tricking it into producing roots while it is still attached to the parent plant. Once the roots appear, the branch can be cut down and planted in the soil.

Marcotting overcomes an important barrier to domestication for many species: the time it takes a tree to reach maturity and bear fruit. "There's a saying round here that if you plant the nut of a kola tree, you'll die before the first harvest," says Kuh Emmanuel, who helped to establish the centre where Salah was trained. It is still not known how long it takes for a wild kola tree to reach maturity - probably 20 years or more. Using marcots, however, farmers can raise kola trees that fruit after just four years. What's more, says Emmanuel, marcotting produces a dwarf tree, which matters when you consider how many people fall to their deaths while harvesting fruit.

There's nothing new about the horticultural techniques being used to develop superior varieties of fruit tree in Africa. "What distinguishes this from most crop development programmes is the way it's being implemented," says Leakey. The traditional model involves the development by agribusiness companies of new varieties which can be grown as monocultures in vast plantations. "What's happening with the domestication programme in Cameroon is completely different," he says. "Local farmers play a key role in developing and testing new varieties, and they're the ones who stand to benefit most."

The programme has been a huge success: in 1998, there were just two farmer-run nurseries in Cameroon; now there are several hundred. Many are independent businesses, making significant profits and providing enough trees to transform the lives of tens of thousands of rural families.

Many farmers have increased their income by a factor of three or more, and their spending priorities are nearly always the same: more and better-quality food, school fees, decent healthcare, and zinc sheets to replace leaking thatch. Many also use their new-found wealth to buy land or livestock. One of the most exciting things about the domestication programme, says Tchoundjeu, is the way it is encouraging young people to stay in their villages rather than head to the cities to look for work.

Priorities are always the same: better food, school fees, decent healthcare and zinc sheets for the roof

Some projects are evolving into big business. Leakey is particularly impressed by Project Novella, a public-private partnership involving, among others, Unilever, the World Agroforestry Centre and the International Union for Conservation of Nature (IUCN). The project is promoting the domestication of Allanblackia, a group of trees whose seeds contain oil perfect for making margarine. Some 10,000 farmers in Ghana and Tanzania already grow the trees. If all goes to plan, this will rise to 200,000 farmers growing 25 million Allanblackia trees in a decade's time, earning them a total of $2 billion a year - half the annual value of west Africa's most important agricultural export, cocoa.

All of this chimes well with the findings of a recent analysis of the problems facing agriculture worldwide. The latest report by the International Assessment of Agricultural Knowledge, Science and Technology for Development suggests that business as usual is not an option. Instead, it argues, agriculture must do far more than simply produce food: it should focus on issues of social, economic and environmental sustainability, concentrating on the needs of the world's smallholders. It also recommends paying more attention to the use of wild species.

Better than cocoa

A glimpse of how such a future could look can be seen at Christophe Misse's smallholding, an hour's drive north of Yaoundé. In the 1990s, his main crop, cocoa, yielded an income for just three months a year; even with the extra cash he earned as a part-time teacher he struggled to make ends meet. Then, in 1999, he attended a training session held by the World Agroforestry Centre.

Two years later he set up a fruit tree nursery with three neighbours, and they now sell over 7000 trees a year. Their own farms are also much more profitable since they began growing indigenous trees. Some of Misse's most fruitful African plum trees earn 10,000 CFA francs (about $20) a year each, five times as much as his individual cocoa bushes. "I've built a new house," he says proudly, "and I'm making enough money to pay for two of my children to go to private school."

Misse still has some old timber trees shading his cocoa, but these are gradually being replaced by fruit trees, which will provide not just shade but a significant income and a habitat for wildlife. There is another benefit, too. Trees are much more capable of resisting droughts and other climatic shifts than annual crops such as cassava and maize. By planting a range of different tree species, farmers like Misse are taking out an insurance policy for the future.

As he sips a glass of Misse's home-made palm wine, Tchoundjeu muses on the changing landscape. "If you come back here in 10 years' time, I hope - I'm sure - you'll see improved varieties of indigenous fruit tree on every smallholding," he says. "I think you'll see a great diversity of different tree crops and a much more complex, more sustainable environment. And the people will be healthier and better off." It's a story, he believes, that could be repeated across Africa.

Future fruits of the forest

Last year, the US National Research Council published an exhaustive study of the wild fruits of Africa. Drawing on the knowledge of hundreds of scientists, the authors selected 24 species that could improve nutrition and food supplies.

Ten of these species have undergone a degree of domestication, including African plum, tamarind and marula. Of the 14 completely wild species - "essentially untouched by the almost magic hand of modern horticulture" - they identified seven with outstanding potential for domestication.

Given how many tropical fruits have already made their way into western supermarkets, here are some African staples that may soon follow them into shoppers' carts.

Chocolate berries (Vitex spp)

Scattered across tropical Africa, these trees produce an abundance of blackish fruit with a chocolate flavour.

Aizen (Boscia senegalensis)

A scrawny shrub of the hottest, driest regions, the aizen provides fruits, seeds, roots and leaves that desert-dwellers eat. The yellow, cherry-sized berries are sweet and pulpy when ripe, and harden into a caramel-like substance when dried.

Ebony fruit (Diospyros spp)

Best known for their valuable, jet-black wood, ebony trees also produce large, succulent persimmon-like fruit with a delicate sweet taste.

Gingerbread plums (several genera of the family Chrysobalanaceae)

These trees are distributed throughout sub-Saharan Africa; their plums have the crunch of an apple and the flavour of a strawberry.

Medlars (Vangueria spp)

These trees grow well in arid areas and produce fruits that, when dried, have the flavour and smell of dried apples.

Sugar plums (Uapaca spp)

Found in woodlands, these trees bear juicy fruit with a honey-like taste.

Sweet detar (Detarium senegalense)

A leguminous savannah tree whose pods contain a sweet-and-sour pulp that can be eaten fresh or dried.

Charlie Pye-Smith is a freelance writer.

How reputation could save the Earth

Opinion

HAVE you ever noticed a friend or neighbour driving a new hybrid car and felt pressure to trade in your gas guzzler? Or worried about what people might think when you drive up to the office in an SUV? If so, then you have experienced the power of reputation for encouraging good public behaviour. In fact, reputation is such an effective motivator that it could help us solve the most pressing issue we face - protecting our planet.

Motivated to care (Image: Andrzej Krauze)

Environmental problems are difficult to solve because Earth is a "public good". Even though we would all be better off if everyone reduced their environmental impact, it is not in anyone's individual interest to do so. This leads to the famous "tragedy of the commons", in which public resources are overexploited and everyone suffers.

Public goods situations crop up all over the place, including decisions on maintaining roads, funding the police and whether or not to shirk at work. This leads us to an important question: is it possible to make people care enough about such problems to do their bit? To help answer this, researchers have developed a representation of such situations called the public goods game. The results give cause to believe that the tragedy of the commons can be overcome.

In the public goods game, each player is given a sum of money, say $10. They then choose how much to keep and how much to contribute anonymously to a common pool. Contributions are multiplied by some factor (greater than 1 but less than the number of players) and then split equally among all players. If everyone contributes, everyone ends up with a higher payout. But because the multiplier is smaller than the number of players, each dollar you contribute returns less than a dollar to you personally, so whatever the others do, contributing leaves you worse off than keeping your money.

Imagine, for example, four people playing a game in which contributions are doubled. If everyone contributes their $10, they all end up with $20. But a player who refuses to contribute while the others put in the full amount ends up with $25 while the rest get $15 each. If only one player contributes their $10, they end up with just $5 and everybody else $15. The self-interested thing to do, therefore, is never to contribute.
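
These payoffs all follow from one rule: each player keeps whatever they do not contribute, plus an equal share of the multiplied pool. A minimal Python sketch (the function name and defaults are ours, purely for illustration) reproduces the numbers above:

def payoffs(contributions, endowment=10, multiplier=2):
    # Each player keeps (endowment - contribution); the pooled
    # contributions are multiplied and split equally among everyone.
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

# Four players, contributions doubled, as in the example above:
print(payoffs([10, 10, 10, 10]))  # [20.0, 20.0, 20.0, 20.0]: all contribute
print(payoffs([0, 10, 10, 10]))   # [25.0, 15.0, 15.0, 15.0]: one free rider
print(payoffs([10, 0, 0, 0]))     # [5.0, 15.0, 15.0, 15.0]: lone contributor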

When the public goods game is played in the lab, most people usually begin by contributing a large amount, trying to do their part towards maximising the group's earnings. Some people, however, decide to take a slice of the profits without contributing. Over time this free-riding undermines the others' willingness to pay and the average contribution decreases. This results in significantly lower earnings all round, recreating the tragedy of the commons.

The public goods game gives us an opportunity to explore interventions that encourage cooperation. Experiments have shown, for example, that making each player's contribution public can sustain contributions at a high level. It appears that the benefit of earning a good name outweighs the costs of doing your part for the greater good, and even selfish people can be motivated to care. It is worth contributing in order to protect your standing in the community.
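
Both effects can be sketched in a toy simulation. The behavioural rules here (conditional cooperators who match the previous round's average, plus one free rider who pays in only when contributions are visible) are our illustrative assumptions, not the protocol of the experiments described above:

def simulate(rounds=10, n_players=4, endowment=10.0, public=False):
    # Toy model: cooperators match the previous round's average
    # contribution; the free rider contributes nothing when anonymous,
    # but matches the group when contributions are made public.
    avg = endowment  # cooperators start by contributing in full
    averages = []
    for _ in range(rounds):
        free_rider = avg if public else 0.0
        contributions = [free_rider] + [avg] * (n_players - 1)
        avg = sum(contributions) / n_players  # matched in the next round
        averages.append(round(avg, 2))
    return averages

print(simulate(public=False))  # average contribution decays round by round
print(simulate(public=True))   # contributions hold steady at the full $10

Under anonymity this toy group drifts towards the tragedy of the commons; once contributions are visible, concern for reputation keeps everyone paying in.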

Out in the real world, these experiments suggest a way to encourage people to reduce their impact on the environment. If information about each of our environmental footprints were made public, concern for maintaining a good reputation could change behaviour. Would you want your neighbours, friends or colleagues to think of you as a free rider, harming the environment while benefiting from the restraint of others?

The power of reputation is already being harnessed to protect the environment. Hybrid cars such as the Toyota Prius have recognisable designs, advertising their driver's commitment to cleaner energy for all to see. Some energy companies give green flags to customers who choose to pay extra for energy from a more environmentally friendly source, allowing people to openly display their green credentials. Similarly, individuals who volunteer in environmental clean-up days receive T-shirts advertising their participation.

Tokens such as these serve a dual purpose. First, they allow those who contribute to reap reputational benefits, helping to compensate them for the costs they incur. Second, when people display their commitment to conservation, it reinforces the norm of participation and increases the pressure on free riders. If you know that all of your neighbours are paying extra for green energy or volunteering on a conservation project, you are all the more inclined to do so yourself.

When people display their commitment to conservation, it ups the pressure on free riders

Even better than voluntary displays would be laws enforcing disclosure. For example, governments could require energy companies to publish the amount of electricity used by each home and business in a searchable database. Likewise, gasoline use could be calculated if, at yearly inspections, mechanics were required to report the number of kilometres driven. Cars could be forced to display large stickers indicating average distance travelled, with inefficient cars labelled similarly to cigarettes: "Environmentalist's warning: this car is highly inefficient. Its emissions contribute to climate change and cause lung cancer and other diseases." Judging from our laboratory research, such policies would motivate people to reduce their carbon footprint.

Although laws of this kind raise possible privacy issues, the potential gains could be great. In a world where each of us was accountable to everybody else for the environmental damage we cause, there would be strong incentives to reduce the energy we use, the carbon dioxide we emit and the pollution we create. In such a world, we might be able to avert a global tragedy of the commons.

David Rand is a postdoctoral fellow in mathematical biology at Harvard University.

Martin Nowak is professor of biology and mathematics at Harvard University.