How eating less might make you live longer
Press Release from PLoS Medicine
Caloric restriction in non-obese people translates into less oxidative damage in muscle cells, according to a new
study by Anthony Civitarese, Eric Ravussin, and colleagues (Pennington Biomedical Research Center). As oxidative
damage has been linked to aging, this could explain how limiting calorie intake without malnutrition extends life.
A calorie-restricted diet provides all the nutrients necessary for a healthy life but minimizes the energy (calories)
supplied in the diet. This type of diet increases the life span of mice and delays the onset of age-related chronic
diseases such as cancers, heart disease, and stroke in rodents. There are also hints that people who eat a calorie-
restricted diet might live longer than those who overeat. In addition, calorie-restricted diets beneficially affect
several biomarkers of aging, including improved insulin sensitivity (reduced sensitivity is a precursor to diabetes). But how might caloric
restriction slow aging? A major factor in the age-related decline of bodily functions is the accumulation of "oxidative
damage" in the body's proteins, fats, and DNA. Oxidants—in particular, chemicals called "free radicals"—are
produced when food is converted to energy by cellular structures called mitochondria. One theory for how caloric
restriction slows aging is that it lowers free-radical production by inducing the formation of efficient mitochondria.
Civitarese and colleagues enrolled 36 healthy overweight but non-obese young people into their study. A third of
them received 100% of their energy requirements in their diet; the caloric restriction (CR) group had their calorie
intake reduced by 25%; and the caloric restriction plus exercise (CREX) group had their calorie intake reduced by
12.5% and their energy expenditure increased by 12.5%. The researchers found that a 25% caloric deficit for 6
months, achieved by diet alone or by diet plus exercise, decreased 24-hour whole-body energy expenditure (i.e., overall
calories burned), which suggests improved mitochondrial efficiency. Their analysis of genes involved in mitochondria
formation indicated that CR and CREX both increased the number of mitochondria in muscle. Both interventions also
reduced the amount of DNA damage—a marker of oxidative stress—in the participants' muscles.
The researchers also examined gene expression in the study participants. In yeast, worms, and flies the
activation of the Sir2 gene increases life span and regulates cellular metabolism. An important question is whether
caloric restriction can regulate SIRT1 (the mammalian equivalent of Sir2) in humans. Civitarese and colleagues
found that indeed fewer calories can improve whole body metabolism in conjunction with an increase in SIRT1 gene
expression in skeletal muscle. These results raise the possibility that SIRT1 may contribute to more efficient
metabolism, less oxidative stress, and increased longevity in humans, as it does in lower organisms.
The results suggest that even short-term caloric restriction can produce beneficial physiological changes leading
to improved health. Whether caloric restriction and its associated health benefits can be sustained over the longer
term remains to be established in humans.
Whole body regeneration from a blood vessel
For a lucky subset of vertebrates, losing an appendage is no big deal. As many an inquisitive child knows,
salamanders can regenerate lost limbs or tails; and as lab investigators know, zebrafish can regrow lost fins. Of
course, humans and other "higher" vertebrates must make do with repairing rather than regenerating damaged
tissues. Though whole body regeneration (WBR) does occur, it's typically restricted to a subset of morphologically less
complex invertebrates, such as sponges, flatworms, and jellyfish. In a new study, Yuval Rinkevich et al. discovered
an unusual mode of WBR in our closest invertebrate relative, the sea squirt Botrylloides leachi.
Sea squirts (also called "tunicates" after their tough outer tunic) are widely distributed in shallow coastal waters
as colonies of genetically identical individuals called "zooids." To investigate WBR in B. leachi, Rinkevich et al.
collected colonies from the Mediterranean coast of Israel and analyzed the morphological, cellular, and molecular
characteristics of the process. The researchers removed fragments of blood vessels with ampullae from the colonies,
and placed the fragments on slides for regeneration. Of 95 fragments, 80 underwent WBR. Aggregating cells formed
a hollow sphere, then reorganized into a thin and a thick layer on opposite sides, very similar to early stages
of embryonic development. As cells proliferated, buds grew and the thick cell layer folded inward, forming double-
walled folds and chambers. Organ development continued and an adult zooid, capable of sexual reproduction,
appeared within two weeks.
For molecular insights into regeneration, the researchers focused on retinoic acid (RA) signaling by examining the
temporal expression of its receptor (RAR). In addition to its role in chordate body patterning, RA (a vitamin A
metabolite) induces the regeneration of several tissues and organs. Only regenerating vessels and ampullae
expressed RAR, and this expression continued through each phase of regeneration.
The researchers confirmed RA's vital role in regeneration by inhibiting RA synthesis with chemicals and
destroying RA transcripts with RNA interference. In both cases, malformed buds failed to generate zooids from
dissected fragments. Similar problems occurred when RAR function was disrupted. In contrast, RA overexpression
led to accelerated regeneration, with multiple buds reaching the fully developed zooid stage. RAR regulates
developmental elements of the normal budding process in a sister colonial tunicate species, suggesting that
organisms recruit the same signals for development and regeneration.
In regenerating fully functional adult tunicates from "minute vascular fragments," the researchers identified
several features of this system that differ from those of established regeneration model systems. In contrast to limb
or fin regeneration, which arises from local signals emanating from a "regeneration center," B. leachi WBR arises
from systemically induced signals in multiple "regeneration niches." These niches arise from the vascular network
(rather than from proliferating balls of cells), and regeneration appears to be regulated by systemic (rather than
local) cues. These systemic cues, the researchers propose, may travel through the circulation, thereby supporting
multiple regeneration foci. The researchers plan to investigate the cellular source of the tunicate's remarkable
regenerative power in future studies.
As invertebrate members of the phylum Chordata, sea squirts share
several fundamental biological pathways with vertebrates; consequently,
using them as a model system to study WBR could illuminate not only the
evolutionary origin of regeneration, but also its subsequent attenuation in vertebrates.
[Image caption: Recognized as one of the closest relatives of vertebrates, the colonial urochordate Botrylloides leachi possesses unique modes of propagation and regeneration that make this species a robust model organism for studying diverse biological issues related to development, immunology, stem cells, and regeneration. Credit: Reshef et al.]
Dietary copper may ease heart disease
Including more copper in your everyday diet could be good for your heart, according to scientists at the
University of Louisville Medical Center and the USDA Human Nutrition Research Center. Their studies show that
giving copper supplements to mice eased the stress on their over-worked hearts by preventing heart enlargement.
The study will be published online on March 5th in The Journal of Experimental Medicine.
Insufficient copper intake is associated with increases in cholesterol levels, clot formation, and heart disease. The
new study found that feeding mice copper relieved heart disease and restored proper heart function, even when the
animals' hearts were continually stressed. Stressed mice that were not given copper supplements suffered heart
failure. The copper-rich diet increased the production of a protein that promotes the growth of new blood vessels,
although exactly how this protein might aid heart recovery is not yet clear.
The human equivalent of the beneficial dose of copper used in this study is about 3.0 mg/day. The current
recommended daily intake for humans, however, is only 0.9 mg/day. Increasing copper intake, especially in those
predisposed to heart disease, might thus be an easy way to reduce the mortality rate associated with this condition.
Spiders: Chastity belts stop cuckoos in the nest
Males plug their partner's sexual orifice
The fact that female wasp spiders have numerous sexual contacts is something which their male partners cannot
prevent. What they can do, however, is ensure that no offspring ensue from these tête-à-têtes with their rivals: the
male spiders simply place a chastity belt on their partner while copulating. The tip of the male's copulatory organ breaks off during
intercourse, blocking the sexual orifice of the female spider. Biologists from the universities of Bonn and Hamburg
report on this amazing mechanism in the journal 'Behavioral Ecology' (vol. 18, pages 174-181, 2007).
When a male wasp spider discovers a potential partner, he attracts her by shaking her web. The female
thereupon supports herself on her long legs on the web so that the male, who is much smaller, can then creep
under her body. The rest works hydraulically: the tip of a transformed leg filled with sperm is inserted into the
female's sexual orifice – like a ski boot in its binding.
The female usually puts an end to the affair after a few seconds by attacking her partner and killing him if he
does not escape in time. 'When the male detaches himself from the female, in more than 80 per cent of cases the
tip of his genital organ breaks off,' the Bonn lecturer Dr. Gabriele Uhl says. 'The tip then remains in the sexual orifice like a
cork, blocking it.'
Together with her colleague Professor Jutta Schneider and the behavioural biologist Stefan Nessler (both now at
the University of Hamburg), Dr. Uhl has been looking for a reason for this genital mutilation. 'There are basically
two hypotheses,' she explains. 'On the one hand, detaching part of the genital organ could help the male to escape
from the female's murderous attack. On the other hand, it might be a mechanism ensuring that paternity is
maintained, preventing or impeding further copulation by the female.'
Genital tip as a 'cork'
Even during the briefest of reproductive encounters, a male transfers enough sperm to fertilise all his partner's
eggs. However, if a rival male then gets a look in, his sperm cells compete during copulation with those of his
predecessor. 'The detached tip might prevent subsequent intercourse, like a chastity belt,' Jutta Schneider explains.
'The first male would thereby ensure that all the egg cells were fertilised by him rather than by his rival.'
In his diploma thesis at the University of Bonn, Stefan Nessler has been examining more closely which of the two
hypotheses is correct. The result: whether or not the tip broke off had no significant effect on the male's
survival rate – but it did affect any subsequent copulation with another male. When the sexual orifice was blocked,
the encounter was over after only eight seconds; normally male spiders copulate for twice as long. 'The results
show that the blockage at least impedes copulation,' Stefan Nessler emphasises. 'Initial morphological studies show
that the detached tip plugs the orifice so securely that the transfer of semen is probably largely excluded.'
The researchers have now been able to show that other species of wasp spiders also have this 'plugging
mechanism'. What they all have in common is that the female attempts to kill her partner during intercourse. 'We
presume that genital mutilation only makes sense if there is hardly any chance of further copulation anyway,'
Gabriele Uhl explains. 'The males show maximum investment.'
Dr. Uhl's diploma student Martin Busch is currently investigating a completely different strategy for preventing
competitors from having intercourse: the dwarf spider produces a viscous secretion in its genital organ, which it ejects
after the sperm. This mucus plug blocks the female genital aperture so effectively that rivals can no longer copulate.
Whether a mucus plug or the tip of the genital organ is used, egg laying is unaffected by the 'chastity belt': 'for
that the females have a separate aperture,' Dr. Uhl stresses. 'In spider species with only one aperture for copulation
and oviposition, no such contraceptive strategies exist.'
UF study first to document evidence of 'mafia' behavior in cowbirds
"The Sopranos" have some competition -- brown-headed cowbirds.
GAINESVILLE, Fla. --
Cowbirds have long been known to lay eggs in the nests of other birds, which then raise the cowbirds' young as
their own.
Sneaky, perhaps, but not Scarface.
Now, however, a University of Florida study finds that cowbirds actually ransack and destroy the nests of
warblers that don't buy into the ruse and raise their young.
Jeff Hoover, an avian ecologist at the Florida Museum of Natural History, is the lead author on the first study to
document experimental evidence of this peeper payback -- retaliation to encourage acceptance of parasitic eggs.
Findings will be published online in the Proceedings of the National Academy of Sciences March 5.
"It's the female cowbirds who are running the mafia racket at our study site," said Hoover, who has a joint
appointment with the Illinois Natural History Survey. "Our study shows many of them returned and ransacked the
nest when we removed the parasitic egg."
So-called "brood parasitic birds" lay eggs in the nests of host birds that raise the parasite's offspring, usually at
the expense of some of their own. The brown-headed cowbird parasitizes more than 100 host species, including
many Neotropical migratory birds such as warblers, tanagers and vireos. Prothonotary warblers were used for this
study because they almost always accept cowbird eggs, Hoover said.
Hosts that use their beaks to grasp or puncture parasitic eggs and remove them from the nest are called
"ejecters." "Accepter" hosts raise parasitic eggs.
"Retaliatory mafia behavior in cowbirds makes hosts' acceptance of cowbird eggs a better proposition than
ejection," Hoover said. "The accepting warblers in our study produced more of their own offspring, on average, than
those where we ejected cowbird eggs."
Hosts may lose some, but not all, of their biological offspring by accepting parasitism. The retaliatory behavior of
ransacking nests encourages warblers to raise the cowbirds‘ offspring.
"We wanted to determine if the cowbirds were responsible for nest predation after we removed cowbird eggs
from parasitized warbler nests," Hoover said. To test for this, Hoover collaborated with Scott Robinson, Florida
Museum Ordway eminent scholar and natural history chair, to manipulate cowbird access to warbler nests in the
Cache River watershed of southern Illinois. The researchers monitored 182 predator-proofed nests over four
breeding seasons.
Hoover and Robinson found that warbler nests were ransacked 56 percent of the time when researchers
experimentally removed the parasitic eggs and cowbirds were allowed nest access, versus only 6 percent when the
cowbird eggs were accepted and cowbirds had nest access. No nests were ransacked when researchers removed
cowbird eggs and cowbirds were denied nest access. Together, these results implicate cowbirds and provide
evidence of mafia behavior.
"We also found evidence for 'farming' behavior," Hoover said. "Cowbirds 'farm' a non-parasitized nest by
destroying its contents so that the host will build another. The cowbird then syncs its egg laying with the host's
renesting attempt."
Hoover found that warbler nests that were never parasitized, but to which cowbirds had access, were ransacked 20
percent of the time. "Cowbirds parasitized 85 percent of the renests, which is strong supporting evidence for both
farming and mafia behavior," he said.
Hoover and Robinson's results imply that cowbirds actively monitor nests they parasitize -- which supports the
idea that cowbirds continue to visit nests they have parasitized to see the results of their handiwork.
Stephen Rothstein, a zoology professor at the University of California in Santa Barbara, said other studies have
shown evidence contrary to mafia and farming behaviors.
"Video surveillance would show the proportion of nest predation attributable to cowbirds," Rothstein said. "The
phenomenon may be perfectly true for these warblers, but that doesn't mean it holds true for other species,
especially those that aren‘t nesting in special circumstances. Nevertheless, this new study may extend our
knowledge of the extent to which parasitic cowbirds may have evolved tactics to facilitate their parasitism."
Hoover said his future research includes video surveillance of individually banded female cowbirds and warbler
nests.
Drug may help alcoholics cut down
Scientists have devised a treatment which could stop alcoholics drinking too much.
The Scripps Research Institute and the Eli Lilly drug company study also found the chemical could prevent
relapses and reduce the effects of hangovers.
The animal study in the Journal of Neuroscience used a synthetic compound to block a chemical response in the
brain which triggers relapse.
Alcohol related deaths have doubled in the UK since 1991.
The compound, called MTIP, blocks the action of the brain chemical corticotropin-releasing factor (CRF).
CRF levels in the brain rise in the short term after drinking, but normally return to normal within a day or so.
The study found the CRF system becomes overactive in animals with a long term history of alcohol dependence,
increasing the risk of relapse.
The researchers used MTIP to block CRF activity in stressful situations, but found it had no effect on CRF activity
under ordinary circumstances.
They looked at rats that had been put through several cycles of heavy alcohol consumption and withdrawal to
create dependency, as well as animals selectively bred to consume more alcohol.
Injections of MTIP stopped both groups of animals consuming excessive amounts of alcohol and eliminated their
susceptibility to relapse when they were stressed.
However the compound did not affect the animals' natural curiosity, or lower levels of alcohol consumption in
rats that were not alcohol-dependent.
The researchers say MTIP can be given orally and reaches the brain in sufficient amounts to block more than
90% of CRF receptors.
But MTIP does not accumulate in other organs, including the liver, in ways that would cause concerns about
potential side effects, they add.
Dr George Koob, of the Scripps Research Institute, said: "This study shows the activity of a compound that
potentially could be used in human subjects.
"It moves the field closer to the day when the dark side of addiction is treated."
The researchers say that, in addition to alcoholism, MTIP might be useful in the treatment of depression or
anxiety disorders, in which CRF levels can also be high.
Bob Patton, a health psychologist at the National Addiction Centre, King's College London said: "This study
certainly shows promise.
"Alcoholism is a chronic relapsing condition, with up to nine out of 10 patients returning to drinking after treatment,
and any intervention that can reduce this would be welcomed.
"There are many pharmacological and psychological approaches that can help patients remain sober; however
none have proved to be definitive.
"The worth of MTIP needs to be explored in a properly conducted clinical trial. If successful, this could be a useful
addition to our existing treatment arsenal."
Studies force new view on biology of flavonoids
CORVALLIS, Ore. – Flavonoids, a group of compounds found in fruits and vegetables that had been thought to be
nutritionally important for their antioxidant activity, actually have little or no value in that role, according to an
analysis by scientists in the Linus Pauling Institute at Oregon State University.
However, these same compounds may indeed benefit human health, but for reasons that are quite different –
the body sees them as foreign compounds, researchers say, and through different mechanisms, they could play a
role in preventing cancer or heart disease.
Based on this new view of how flavonoids work, a relatively modest intake of them – the amount you might find
in a healthy diet with five to nine servings of fruits and vegetables – is sufficient. Large doses taken via dietary
supplements might do no additional good; an apple a day may still be the best bet.
A research survey, and updated analysis of how flavonoids work and function in the human body, were recently
published in Free Radical Biology and Medicine, a professional journal.
"What we now know is that flavonoids are highly metabolized, which alters their chemical structure and
diminishes their ability to function as an antioxidant," said Balz Frei, professor and director of the Linus Pauling
Institute. "The body sees them as foreign compounds and modifies them for rapid excretion in the urine and bile."
Flavonoids are polyphenolic compounds with some common characteristics that are widely found in fruits and
vegetables and often give them their color – they make lemons yellow and certain apples red. They are also found
in some other foods, such as coffee, tea, wine, beer and chocolate, and studies in recent years had indicated that
they had strong antioxidant activity – and because of that, they might be important to biological function and health.
"If you measure the activity of flavonoids in a test tube, they are indeed strong antioxidants," Frei said. "Based
on laboratory tests of their ability to scavenge free radicals, it appears they have 3-5 times more antioxidant
capacity than vitamins C or E. But with flavonoids in particular, what goes on in a test tube is not what's happening
in the human body."
Research has now shown that flavonoids are poorly absorbed by the body, usually less than five percent, and
most of what does get absorbed into the blood stream is rapidly metabolized in the intestines and liver and excreted
from the body. By contrast, vitamin C is absorbed 100 percent by the body up to a certain level. And vitamin C
accumulates in cells where it is 1,000 to 3,000 times more active as an antioxidant than flavonoids.
The large increase in total antioxidant capacity of blood observed after the consumption of flavonoid-rich foods is
not caused by the flavonoids themselves, Frei said, but most likely is the result of increased uric acid levels.
But just because flavonoids have been found to be ineffectual as antioxidants in the human body does not mean
they are without value, Frei said. They appear to strongly influence cell signaling pathways and gene expression,
with relevance to both cancer and heart disease.
"We can now follow the activity of flavonoids in the body, and one thing that is clear is that the body sees them
as foreign compounds and is trying to get rid of them," Frei said. "But this process of gearing up to get rid of
unwanted compounds is inducing so-called Phase II enzymes that also help eliminate mutagens and carcinogens,
and therefore may be of value in cancer prevention.
"Flavonoids could also induce mechanisms that help kill cancer cells and inhibit tumor invasion," Frei added.
It also appears that flavonoids increase the activation of existing nitric oxide synthase, which has the effect of
keeping blood vessels healthy and relaxed, preventing inflammation, and lowering blood pressure – all key goals in
prevention of heart disease.
Both of these protective mechanisms could be long-lasting compared to antioxidants, which are more readily
used up during their free radical scavenging activity and require constant replenishment through diet, scientists say.
However, Frei said, it's also true that such mechanisms require only relatively small amounts of flavonoids to
trigger them – conceptually, it's a little like a vaccine, in which only a very small amount of an offending substance is
required to trigger a much larger metabolic response. Because of this, there would be no benefit – and possibly
some risk – to taking dietary supplements that might inject large amounts of substances the body essentially sees
as undesirable foreign compounds.
Numerous studies in the United States and Europe have documented a relationship between adequate dietary
intake of flavonoid-rich foods, mostly fruits and vegetables, and protection against heart disease, cancer and
neurodegenerative disease, Frei said.
Stanford diet study tips scale in favor of Atkins plan
STANFORD, Calif. -- The case for low-carbohydrate diets is gaining weight. Researchers at the Stanford University
School of Medicine have completed the largest and longest-ever comparison of four popular diets, and the lowest-
carbohydrate Atkins diet came out on top.
Of the more than 300 women in the study, those randomly assigned to follow the Atkins diet for a year not only
lost more weight than the other participants, but also experienced the most benefits in terms of cholesterol and
blood pressure.
"Many health professionals, including us, have either dismissed the value of very-low-carbohydrate diets for
weight loss or been very skeptical of them," said lead researcher Christopher Gardner, PhD, assistant professor of
medicine at the Stanford Prevention Research Center. "But it seems to be a viable alternative for dieters."
The results will be published in the March 7 issue of the Journal of the American Medical Association.
The 311 pre-menopausal, non-diabetic, overweight women in the study were randomly assigned to follow either
the Atkins, Zone, LEARN or Ornish diet. Researchers chose the four diets to represent the full spectrum of low- to
high-carbohydrate approaches.
The Atkins diet, popularized by the 2001 republication of Dr. Atkins' New Diet Revolution, represents the lowest
carbohydrate diet. The Zone diet, also low-carbohydrate, focuses on a 40:30:30 ratio of carbohydrates to protein to
fat, a balance said to minimize fat storage and hunger. The LEARN (Lifestyle, Exercise, Attitudes, Relationships and
Nutrition) diet follows national guidelines reflected in the U.S. Department of Agriculture's food pyramid - low in fat
and high in carbohydrates. The Ornish diet, based on bestseller Eat More, Weigh Less by Dean Ornish, is very high
in carbohydrates and extremely low in fat.
Study participants in all four groups attended weekly diet classes for the first eight weeks of the study and each
received a book outlining the specific diet to which they were assigned. For the remaining 10 months of the study,
the women's weight and metabolism were regularly checked, and random phone calls monitored what they were
eating.
One of the strengths of the $2 million study was that this setup mimicked real-world conditions, Gardner said.
Women in the study had to prepare or buy all their own meals, and not everyone followed the diets exactly as the
books laid out, just as in real life.
At the end of a year, the 77 women assigned to the Atkins group had lost an average of 10.4 pounds. Those
assigned to LEARN lost 5.7 pounds, the Ornish followers lost 4.8 pounds and women on the Zone lost 3.5 pounds,
on average. In all four groups, however, some participants lost up to 30 pounds.
After 12 months, women following the Atkins diet, relative to at least one of the other groups, had larger
decreases in body mass index, triglycerides and blood pressure; their high-density lipoprotein, the good kind of
cholesterol, increased more than the women on the other diets.
Gardner has several ideas for why the Atkins diet had the overall best results. The first is the simplicity of the diet.
"It's a very simple message," he said. "Get rid of all refined carbohydrates to lose weight." This message directly
targets a major concern with the American diet right now - the increasing consumption of refined sugars in many
forms, such as high-fructose corn syrup.
Beyond pinpointing this high sugar intake, the Atkins diet does the best at encouraging people to drink more
water, said Gardner. And when people replace sweetened beverages with water, they don't generally eat more
food; they simply consume fewer calories over the course of the day.
The third theory Gardner offered as to why the Atkins diet was more successful is that it is not just a low-
carbohydrate diet, but also a higher protein diet. "Protein is more satiating than carbohydrates or fats, which may
have helped those in the Atkins group to eat less without feeling hungry," he said.
Although the Atkins group led in terms of the average number of pounds lost, this group also gained back more
weight in the second half of the study than those in the three other groups. Gardner also noted that the women in
the Atkins group had lost an average of almost 13 pounds after six months, but ended the one-year period with a
final overall average loss of 10 pounds.
Though critics of low-carbohydrate diets say that such diets can lead to health problems, none of the factors
measured in this study was worse for the Atkins group. Gardner cautions, however, that there are potential long-
term health problems that could not have been identified in a 12-month study. Also, several basic vitamins and
minerals can be difficult to get in adequate amounts from a very-low-carbohydrate diet.
In the long run, Gardner hopes to use the large data set generated in this study to investigate why different diets
might work better for different people. "We're trying to see if we can learn more about the factors that predict
success and failure with weight loss," he said.
Regardless of what new insights are revealed, Gardner said the message he hopes people take from the study is
the importance of eliminating from their diet, as much as possible, refined carbohydrates such as white bread and
sweetened beverages.
Particle physics on the cancer ward
By Professor Bleddyn Jones Consultant, University Hospital, Birmingham
Techniques developed by atomic physicists are being used to develop the first of what promises to be a new
generation of cancer treatments in place of conventional radiotherapy. One day doctors could even be using anti-
matter. Recent announcements from the CERN laboratory in Geneva have aroused considerable interest: cancer
cells were successfully targeted with anti-matter subatomic particles, causing intense biological damage leading to
cell death.
These pilot experiments may have future potential. But applications borrowed from particle physics are already
being used in cancer treatment to help avoid the major side effects of radiotherapy.
In conventional radiotherapy, X-ray beams pass through the entire thickness of the body, so that many organs
and tissues receive unnecessary radiation.
Proton and ion therapy
Although beams of charged atomic particles act in the same way as X-rays - interacting with cellular DNA and
causing chromosome breaks and cell death - most of their energy can be delivered at a precise depth within the
cancer (the so-called Bragg peak). Little or no energy is deposited beyond the cancer target.
[Image: Treatment of a spinal cancer using proton beam therapy. Concentrations of radiation are shown in colour codes from high (red) to low (blue); most radiation is concentrated on the tumour. Pictures courtesy of Francis H. Burr Proton Therapy Center, Boston, Massachusetts.]
The amount of radiation affecting normal tissues can be reduced to half or even one tenth at the same time
as delivering the same or higher dose to the cancer.
The more efficient treatments bring numerous benefits.
Reducing collateral radiation is particularly beneficial for patients whose cancers
are close to the spinal cord, brain, heart, eye or ear.
Irradiation of the lungs can cause shortness of breath, while in the bowels and kidneys severe
complications can require surgery.
Younger patients would have reduced risk of damage to their development and
lower risks of future cancers.
Types of cancer which are resistant to existing treatments could be treated with
higher doses than X-rays permit.
And complications from metallic hip replacements could be avoided, while reduced
bone marrow irradiation would make subsequent cancer chemotherapy safer.
In Japan, doctors have already been using particle beams (of carbon ions) for cancers of the eye, lung, prostate
and liver. They report needing only one to four carbon ion treatments compared to six to seven weeks with X-rays.
The UK currently has only one treatment centre, a proton (hydrogen ion) facility at Clatterbridge Hospital,
Liverpool, which was the first hospital-based facility in the world.
[Image: treatment of a spinal cancer using X-rays. While most radiation is concentrated on the tumour, there are significant doses affecting the nearby kidneys (outlined) and other organs.]
It has treated over 1,000 patients and produced excellent results, but it can only treat eye tumours.
Two young patients from Birmingham with special treatment needs were referred
for proton therapy in Boston, USA, for tumours in the spinal column.
They received a higher tumour dose than would otherwise have been possible,
both with reduced risks of paralysis, along with a reduced lung dose in the first patient
and a reduced risk of further deafness in the second.
Other proton and ion facilities are being built around the world, notably in Japan,
the United States, Italy, Germany, Austria, Sweden and France.
Each centre requires a cyclotron or synchrotron, a particle accelerator from which
the particle beam can be delivered to multiple treatment rooms.
In the UK, they would cost £70-100m, and each could treat around 2,000 patients a year. There is a strong case
for providing them in all major UK cities.
Anti-matter treatment may still be some way off, but esoteric physics could be bringing major benefits
to many more cancer patients today.
Professor Bleddyn Jones is a Consultant in Clinical Oncology & Applied Radiobiology at University Hospital, Birmingham.
Many gene mutations drive cancer
As well as the "driver" mutations that cause disease, each type of cancer cell carries many more "passenger"
mutations which play no role in causing it, scientists have found.
The findings, published in Nature, come from the most extensive analysis yet of the human genome and cancer.
Sanger Institute scientists looked at more than 500 human genes and 200 types of cancer.
The research suggests that cancer biologists will face a big challenge in distinguishing between mutations that
cause cancer and those that do not.
Professor Mike Stratton, co-leader of the Cancer Genome Project, said: "We have found a much larger number of
mutated driver genes produced by a wider range of forces than we expected."
The researchers studied genes of a type called kinases, some of which have been previously implicated in cancer.
Kinases act as switches within cells, controlling behaviour such as cell division. In
mutated forms this process can go awry, leading to the uncontrolled division of cells
characteristic of cancer.
One example is the BRAF gene, which is mutated in more than 60% of cases of
the skin cancer malignant melanoma.
The Sanger team identified possible driver mutations in 120 genes, most of which
had not been seen before.
However, most mutations were found to be of the harmless "passenger" type.
The team also found the type of mutation varied markedly between individual
cancers - and that some of the processes that generated them were active decades
before the cancer showed itself.
Kinase proteins control cell division
The number of mutated genes that drive the development of cancer is greater than had been thought, the research suggests.
The latest analysis was made possible by Sanger's work in decoding around a third of the entire human genome.
Dr Mark Walport, director of the Wellcome Trust, which funds the Sanger Institute, said the study showed how
important the Human Genome Project had been.
He said: "Understanding the mutations that cause cancer is crucial in order to develop accurately targeted treatments."
Dr Francis Collins, director of the National Human Genome Research Institute in the US, said powerful analytical
tools were now available to probe the mysteries of cancer.
He said: "The important and interesting data on protein kinases in this report further encourages the conclusion
that a full assault on the cancer genome will yield many opportunities to
revolutionise diagnosis and treatment."
China confirms Moon probe in 2007
China will launch its first lunar probe this year, and expects to be able to land a
man on the Moon within 15 years, a senior space official has confirmed.
The Chang'e-1 lunar probe will be launched later this year aboard a Long March rocket.
The probe will provide 3D images of the Moon, survey the lunar landscape, study lunar microwaves and estimate
the thickness of the Moon's soil.
China became the third nation to place a human in space in October 2003.
The Moon exploration programme includes a planned lunar fly-by in 2007, a "soft landing" in 2012, return of
lunar samples by 2017, and landing an astronaut on the Moon within 15 years.
[Image: reaching the Moon has long been a goal of China's space programme (AP).]
"The goal to land an astronaut on the Moon can surely be achieved in 15 years," said Huang Chunping, a senior
space official, while attending the annual full session of the National Committee of the Chinese People's Political
Consultative Conference (CPPCC), the country's top political advisory body.
Mr Huang gave no date for the launch of the lunar probe, but confirmed it would be later this year.
For the full Moon exploration programme, he admitted that unexpected difficulties could affect the timetable, but
said he had "full confidence" in the development of the country's rocket industry.
Mr Huang said that China's next generation carrier rocket, likely to be named Long March 5, would be ready for
launch in "seven or eight years," and the vehicle's engines had already been successfully tested.
The long-awaited new rocket would "use non-toxic fuels" and increase the payload capacity of the Long March
series from nine tonnes to 25 tonnes, he said.
Mr Huang also said China would launch its third manned spaceflight, Shenzhou VII, next year, with three
astronauts on board. They will attempt the Chinese programme's first space walk.
The Shenzhou VII launch was planned for this year, but work is still being done on the suits that astronauts will
wear during the space walk.
Two of the three astronauts are expected to venture outside the capsule, but no decision has been made on how
long the walks would be.
Mr Huang said the timing of the flight will depend on progress on finishing the space suits, and would not be
timed for the Summer Olympics, which start in August 2008 in Beijing.
Prescriptions for health advice online
When searching for health advice online, consumers often reject websites with high quality medical information
in favour of those with a human touch, according to new research funded by the Economic and Social Research Council.
Faced with a minefield of information of variable quality, health consumers subject websites to an initial
weeding-out process that will eliminate most NHS and drug company websites from the search within a matter of seconds.
The study, carried out by Professor Pamela Briggs at Northumbria University, together with colleagues at both
Northumbria and Sheffield Universities, explored how health consumers decide whether or not to trust the
information and advice they find online.
The researchers observed the search strategies of people who wanted to find specific health information and
advice (about hypertension, menopause and HRT, the MMR vaccine, or generally improving their health and fitness)
and found that many websites were dismissed at quite amazing speeds.
"One thing that really put people off was advertising, so people clicked off drug company websites straight away",
explains Professor Briggs. "Generally, the medical information on drug company sites is very accurate but people
question the authors' motivation and agenda. The issue of impartiality is quite crucial in building trust."
The NHS websites fared little better. Often these were rejected because the first page participants were directed
to was a portal or had too much background or generic content. "People don't have the patience to scroll through
pages in order to find something useful. Ease of access is so important", says Professor Briggs.
Even if a site makes a favourable first impression, it is unlikely to keep our attention if there are no personal
stories that we can relate to. People are looking for advice from like-minded people and are drawn to sites such as
the charity based DIPEx and ProjectAWARE where they can read about the experiences of other people who have
the same problems and concerns.
Despite rejecting many of the more 'reputable' sites, participants in this study did manage to find information of
reasonable quality. But Professor Briggs warns that our searching strategy has the potential to let us down.
The tendency to particularly trust sites that contain contributions from like-minded peers could have dangerous
effects on some groups of consumers, such as those with anorexia, by reinforcing unhealthy behaviour patterns, she says.
The researchers have developed a set of guidelines for designing engaging and trustworthy sites and have shown
that trustworthy sites have more influence on consumer behaviour.
They found that moderate to heavy drinkers who viewed trustworthy websites describing the health risks
involved with alcohol consumption reduced their alcohol intake more than those who viewed the same information
on a site with untrustworthy features such as adverts or links to pharmaceutical companies.
But the most important advice for those trying to promote health information on-line is to use engaging stories
about people with similar experiences. Professor Briggs concludes:
"The great strength of the internet is that you can find people who have had the same problem that you have
and see how they have coped with it – to forget about that, or to act as if it's not happening, is missing the point."
Light puts asteroids into a spin
Jonathan Fildes Science and technology reporter, BBC News
The constant bombardment of billions of tiny particles from the Sun is shaping the Solar System, studies have shown.
As the fine solar shower rains down on objects, such as asteroids, it can steadily alter their orbit and spin.
Although the mechanism that describes the effect has been known for many years, it has never been seen.
Now, separate studies published in the journals Nature and Science have observed and measured the tiny stellar
shoves on two spinning asteroids.
They reveal that both are gradually starting to spin faster and faster, which could eventually create new Solar System objects.
"If we can spin up an asteroid so fast, there's a really good chance that these things will fly apart," said Dr
Stephen Lowry, a planetary astronomer at Queen's University Belfast and one of the authors of the Science paper.
In this case, the fragments could form a binary asteroid where two objects orbit each other, he said.
"This is a phenomenon that gradually affects the evolution of the Solar System," said Dr Mikko Kaasalainen of the
University of Helsinki, who is an author of the Nature paper.
The Yarkovsky-O'Keefe-Radzievskii-Paddack (Yorp) effect is named after the astronomers who made key
observations that led to the theory.
It describes the torque, or rotational force, created when light particles hit the surface of an object, causing it to heat up.
"As the heat is re-emitted it exerts a gentle recoil effect," said Dr Lowry.
"An analogy would be if there was a child and they threw a ball forward, there
would be a slight recoil effect."
The theory was proposed in order to explain various Solar System phenomena
which show peculiar regular behaviour.
For example, astronomers looking at a cluster of asteroids known as the Koronis
family, which is located in the main belt between Mars and Jupiter, noticed that the
larger fragments spun in two distinct alignments.
[Image: asteroid Mathilde (Nasa). The shower of solar particles can make asteroids spin faster.]
The cluster was thought to have been the result of a catastrophic collision more than two billion years ago which
would have blown erratically spinning fragments throughout the surrounding area.
Without the Yorp effect realigning the objects, this random motion would still be seen today.
The existence of other phenomena such as binary asteroids is also explained by Yorp.
But although astronomers had several clues to the effect's existence, no direct evidence had ever been seen.
Now, two teams have analysed two separate spinning asteroids and have been able to quantify how much their
spin is changing. They have also been able to predict the possible future fate of the rocky lumps.
[Image: 3D computer model of asteroid 1862 Apollo, created by the team from Finland.]
A team led by Dr Lowry used optical and radar telescopes to monitor an asteroid known as 2000 PH5, which
makes regular near-Earth passes.
At its closest, the 114m-diameter (374ft) lump comes within 1.8 million km (1.1 million miles) of Earth.
"As the asteroid rotates, its brightness increases and decreases, which is directly related to how fast it's
spinning," said Dr Lowry.
By combining four years of optical information with radar to work out the size and shape of the object, the team
was able to measure the increasing spin of the asteroid and therefore the size of the Yorp effect acting on PH5.
"It currently rotates every 12 minutes and we detected a change of one millisecond per year," said Dr Lowry.
"It's a tiny, tiny effect but it's acting over millions of years."
The team predicts that over the next 15 million to 40 million years the asteroid will gradually speed up until it is
turning over every 20 seconds.
At this point, the rocky mass may fly apart forming a cluster of smaller asteroids or a new binary system.
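The article's figures can be sanity-checked with a rough calculation. If the Yorp torque, and hence the angular acceleration, is taken as roughly constant, a period change of one millisecond per year at today's 12-minute period implies tens of millions of years to reach a 20-second spin, consistent with the 15 to 40 million year prediction. This is only a back-of-the-envelope sketch under that constant-torque assumption, not the researchers' actual model:

```python
import math

P0 = 12 * 60.0        # current rotation period: 12 minutes, in seconds
dP_dt = -1e-3         # measured period change: -1 millisecond per year
P_final = 20.0        # period at which the asteroid may fly apart, in seconds

# Constant torque means constant angular acceleration d(omega)/dt,
# where omega = 2*pi/P, so d(omega)/dt = -2*pi * (dP/dt) / P**2.
domega_dt = -2 * math.pi * dP_dt / P0**2          # rad per year

omega0 = 2 * math.pi / P0
omega_final = 2 * math.pi / P_final
years = (omega_final - omega0) / domega_dt

print(f"{years / 1e6:.0f} million years")          # ~25 million years
```

The result falls inside the 15 to 40 million year window quoted by the team, which brackets the uncertainty in the measured spin-up rate.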
A team from Finland has calculated the changes to another asteroid, 1862 Apollo, a 1.4km-wide (0.9 miles) near-
Earth object that is approximately 340 million km (210 million miles) away.
Instead of making new observations, the researchers looked at historical snapshots of the object dating back to
1982. From these they were able to extract light information that could be checked against a theoretical model to
discount other effects.
The result was also suitably minute.
"The current rotation period is about three hours and the change is only four thousandths of a second per year,"
said Dr Kaasalainen.
Although both results were almost intangibly small, the implications are much larger, especially for models of the
evolution of the Solar System, he said.
"We must include this radiation effect because it can transport asteroids between different orbital states and
affect their rotation," he said.
"We now know the Solar System doesn't just evolve due to gravitation."
Dr Lowry also believes it is a key finding for looking back through history.
"Asteroids are the leftovers from the start of the Solar System, so by understanding these asteroids, we may get
an idea of what the Solar System was like before the planets formed," he said.
"I don't want to call it a dawn of a new age of astronomical sciences but it will certainly spark a whole range of new studies."
People see pets through rose-tinted glasses
We always knew it but now it's official -- pet owners have rose-tinted views of their animals. People even become
defensive on behalf of a triangle if told it's "theirs".
Pet owners notoriously make excuses for their own animal's bad behaviour while condemning that of others.
They are also more likely to anthropomorphise their own animal's behaviour, saying "my dog wants to cheer me up".
To explore this, a team led by social psychologist Sara Kiesler from Carnegie Mellon University in Pittsburgh,
Pennsylvania, gave 82 university staff and students a Siamese fighting fish to look after for two weeks. Some were
told they temporarily "owned" the fish, while others ("caretakers") were told it belonged to someone else.
After a fortnight, 95% of owners opted to keep their fish, compared with 75% of caretakers. The owners also
gave significantly higher scores reflecting their affection for the fish. Those most fond of their fish were also most
likely to say it was smart and liked them too.
"People who own and care about a pet are much more likely than those who just know an animal to
anthropomorphise the pet and feel that it has reasons for its choices," says Kiesler.
In another experiment, Kiesler's team asked 36 students to watch a film in which two triangles and a circle seem
to have a skirmish. Half were told beforehand that they owned the smaller triangle. On average, this group rated
the larger triangle as less "likeable" and most forcefully vilified it for being aggressive.
The experimental results suggest to Kiesler that pet rescue centres might benefit from running bonding projects
for abandoned animals and their potential new owners -- just like companies send their employees to
"team-building" Outward Bound courses, for instance.
"Owning an animal carries with it some effort and often some unpleasant experiences -- accidents on the rug, for
example," says Kiesler. "Bonding experiences would help create a feeling for the personality and individuality of the
animal, and might reduce failed adoptions." Journal reference: Anthrozoös (vol 19, p 335)
Algae skeletons made into silicon components
* 18:00 07 March 2007
* NewScientist.com news service
* Mason Inman
The microscopic glass skeletons of algae could be transformed into silicon for
novel electronic applications, a new study suggests. The relatively simple
method creates a near-exact silicon replica of each shell, preserving its intricate structure.
Since each replica is converted from silica to semiconducting silicon, and
since the algae, known as diatoms, come in a huge variety of forms, the
converted shells could have various potential applications. These could range
from making microscopic gas sensors to creating new kinds of batteries, the researchers say.
A silicon dioxide-based shell of an Aulacoseira diatom (Image: Nature)
Diatom shells are about 10 micrometres across and come in a variety of shapes -- resembling barrels, donuts,
triangles, and stars -- with regularly sized features of 10 nanometres or smaller. The new procedure replicates all of
these features accurately.
Converting a shell to silicon involves cooking the silica (silicon dioxide) with magnesium gas at 650°C. The
magnesium combines with oxygen in the silica, turning it into an interconnected mesh of magnesium oxide and
silicon. Acid can then be used to etch away the magnesium oxide, leaving pure silicon behind. "You're getting rid of most
of the material," says Kenneth Sandhage of Georgia Tech University, US, who led the study. "But it's all
interconnected so it doesn't fall apart."
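The conversion being described is the standard magnesiothermic reduction of silica. As a sketch (the use of hydrochloric acid in the etching step is my assumption; the article says only that acid removes the magnesium oxide):

```latex
\mathrm{SiO_2} + 2\,\mathrm{Mg} \;\xrightarrow{\;650^{\circ}\mathrm{C}\;}\; \mathrm{Si} + 2\,\mathrm{MgO}
\qquad\text{then}\qquad
\mathrm{MgO} + 2\,\mathrm{HCl} \;\rightarrow\; \mathrm{MgCl_2} + \mathrm{H_2O}
```

The magnesium strips oxygen from the silica, and dissolving away the magnesium oxide leaves the interconnected silicon mesh that Sandhage describes.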
Since there are more than 100,000 species of diatoms, "some shapes might be better than others" for various
electronic applications, Sandhage says.
The process also leaves behind nanometre-sized holes in the silicon, meaning the shells may be particularly well
suited to certain applications, Sandhage says. For example, the study showed that they could work as microscopic
gas sensors, capable of detecting minute amounts of gas.
By passing a current through the silicon shells the researchers were able to sense a few parts per million of nitric
oxide gas. As the gas is absorbed by the porous silicon, it changes the way the material conducts electricity.
Sandhage also believes the porous shells could be used to make components in compact batteries that hold more
energy and recharge faster, for use in devices such as computers or cameras. For both applications, the high
surface area of the shells is crucial, as it allows them to absorb gas or ions more quickly.
It may be possible to control the shape of the diatom shells genetically, according to Mark Hildebrand, a molecular
biologist at the Scripps Oceanography Institute in San Diego, California, US. "There's such a variety [across species],
it implies it is very easy to change the shape, and that there are few genes involved," he says.
Hildebrand is researching the genetic basis of diatom shapes, with the aim of creating custom shapes through
genetic engineering. The new method for creating silicon replicas "has really broadened the possibilities" of what
these custom diatoms could be used for, he says.
Journal reference: Nature (vol 446, p 172)
Blood tests may be possible for mental health conditions
Blood tests for panic disorder and other mental health conditions are potentially around the corner, based on
results from a University of Iowa study.
The findings, which were based on analysis of genetic information in immature white blood cells, appear online
March 6 in the American Journal of Medical Genetics.
"The ability to test for panic disorder is a quantum leap in psychiatry," said the study's lead author, Robert
Philibert, M.D., Ph.D., professor of psychiatry in the UI Roy J. and Lucille A. Carver College of Medicine.
"Panic disorder will no longer be a purely descriptive diagnosis, but, as with cystic fibrosis, Down syndrome and
other conditions, a diagnosis based on genetic information," he said. "In addition, the finding could help us better
understand the pathways that initiate, promote and maintain panic disorder."
The team compared gene expression in lymphoblasts (immature white blood cells) culled from 16 participants
with panic disorder and 17 participants without the disorder. The study found many genes were more expressed in
people with panic disorder than in people without the condition. Similarly, the study found many genes were less
expressed in people with panic disorder. There were also sex-related differences.
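A gene-by-gene comparison like the one described, asking which genes are more or less expressed in the 16 panic-disorder participants than in the 17 controls, is commonly done with a per-gene statistical test. The sketch below applies Welch's two-sample t-test to made-up expression values for a single gene; it illustrates the general approach, not the study's actual analysis pipeline:

```python
from statistics import mean, stdev
import math

def welch_t(a, b):
    """Welch's two-sample t statistic (allows unequal variances)."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# Hypothetical expression levels for one gene (arbitrary units)
panic    = [5.1, 4.8, 5.5, 5.0, 4.9, 5.3]   # participants with panic disorder
controls = [4.1, 4.0, 4.3, 3.9, 4.2, 4.0]   # participants without

t = welch_t(panic, controls)
# A large |t| flags the gene as differentially expressed; real studies
# then correct for the fact that thousands of genes are tested at once.
print(f"t = {t:.1f}")
```

Repeating this for every gene on the array, and adjusting for multiple testing, yields the lists of over- and under-expressed genes the researchers report.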
Overall, people with panic disorder had noticeably different patterns of gene expression than people without the
disorder. Although panic disorder is a disease of brain cells, the study used lymphoblasts as "stand-ins" for the
genetic testing because brain cells are not accessible or easily tested.
Approximately 3 percent of people in the United States have panic disorder, which involves having at least one
panic attack every four weeks. Panic attacks can involve up to 10 symptoms, including palpitations, shortness of
breath, sweating and a feeling of loss of control or dying -- symptoms that are very similar to heart attack symptoms.
"People with panic disorder often end up in the emergency room for heart tests when in fact they have panic
disorder. This is just one of the reasons that it would be helpful to have a blood test for panic disorder," Philibert said.
A blood test for commercial use is now being developed by the UI, which raises larger questions about how
information revealed by such tests will be used. The issue of patient medical records and how they can potentially
be used by employers, insurers, government agencies and other institutions is a concern, Philibert said.
"Science is like a hammer. You can use it to build a house or break a window," Philibert said. "We certainly
intend for this finding to help people manage their disease, and when possible, to prevent it from affecting their lives."
"If we can, it could help us identify systems that interact with the environment and possibly lead the way to new,
even non-drug, therapies to prevent illness," he added.
'Petrified lightning bolts' give peek into ancient climates
When lightning strikes sand or soil, it melts and fuses grains into features that some have called petrified
lightning bolts. Their scientific name is fulgurites, after fulgur, the Latin word for lightning.
Fulgurites are branched, thin, hollow tubes usually 1 or 2 inches in diameter and a few feet to tens of feet long.
They are rough on the outside (where sand grains and other material stick to the molten material) but glassy
smooth on the inside, with many bubble holes produced by vaporized gases.
They usually are considered mere curiosities, but a recent bit of research reported in the February issue of the
journal Geology put fulgurites to a scientific use, to obtain 15,000-year-old climate data.
About 65 lightning flashes occur per second worldwide, but lightning is not randomly distributed — some areas
get more than others.
One area that sees few storms is the Libyan Desert in southwestern Egypt. Lightning does occur to the north,
around the Mediterranean, and to the south, in a region called the Sahel.
Fulgurites found in the Libyan desert suggest that storms were more common there in the past. Analyses of gas
bubbles from one of the glassy tubes confirmed it.
The main gases found were carbon dioxide, carbon monoxide and nitric oxide. There also were traces of oxygen,
methane, argon and others.
For a time, the source of the gases was unclear — they could have come from either the atmosphere or from the
ground. Low amounts of argon showed they probably came from vaporized organic material in the soil; if the gases
were from the atmosphere, the amount of argon should have been hundreds of times greater.
Further analyses of gases indicated a much different climate than occurs there today. Rainfall was typical of the
Sahel, about 370 miles south, and the ratio of carbon-12 to carbon-13 indicated that what is now desert was
covered by grasses and low shrubs.
The fulgurites were dated by a technique called thermoluminescence dating. Radiation from both the ground and
from space (cosmic rays) produces changes in the crystal structure of quartz grains that accumulate with time.
When carefully heated, the crystal structure returns to normal, but as it does, it gives off light. The longer the
grains were radiated, the more light was given off.
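The dating principle described above boils down to a simple ratio: the age is the total radiation dose stored in the quartz (measured from the emitted light) divided by the annual dose rate from the ground and cosmic rays. A schematic calculation, using hypothetical round numbers rather than the study's actual measurements:

```python
# Thermoluminescence age = accumulated (palaeo)dose / annual dose rate.
# Both values below are hypothetical, chosen to give roughly 15,000 years.
palaeodose_gray = 45.0              # total dose recorded by the quartz grains (Gy)
dose_rate_gray_per_year = 0.003     # annual dose from ground + cosmic rays (Gy/yr)

age_years = palaeodose_gray / dose_rate_gray_per_year
print(f"about {age_years:,.0f} years")
```

Measuring the emitted light at two wavelengths, as the researchers did, gives two independent estimates of the accumulated dose, and hence a cross-check on the age.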
Measurements taken at two different wavelengths — ultraviolet and blue — gave close ages. Both indicated the
fulgurites formed 15,000 years ago.
Dale Gnidovec is curator of the Orton Geological Museum at Ohio State University.
Unbrushed Teeth Reveal Ancient Diets
Jennifer Viegas, Discovery News
Ick factor aside, ancient tartar-encrusted teeth may be a biological gold mine for scientists, thanks to a new
technique for extracting food particles from teeth that once belonged to prehistoric humans.
The method already has solved a mystery surrounding what early coastal Brazilians ate.
In the future, similar studies may reveal clues about other ancient diets, particularly in areas with little plant
preservation from earlier times.
"There is great potential of dental calculus (old tooth tartar) analysis in past populations that inhabited tropical
regions," said Sabine Eggers, co-author on a new study detailing the method.
Eggers is a researcher in the Biological Anthropology Laboratory at the University of Sao Paulo, Brazil.
After being awarded a Fulbright Commission grant, she and colleagues Celia Boyadjian and Karl Reinhard created
the new tartar extraction method, which involves a "dental wash" containing four percent hydrochloric acid as the
main active ingredient.
Their findings have been accepted for publication in the Journal of Archaeological Science.
Eggers explained that ancient tartar could reveal what an individual ate in the days or weeks before death.
Evidence suggests some prehistoric populations cleaned their teeth — using fibrous foods and shell fragments as
natural abrasives — but many groups simply let nature take its course.
To test the dental wash, the scientists gathered several teeth from Brazilian burials dating from 2,800 to 1,805
years ago. All were excavated at a southeastern coastal site called Sambaqui Jabuticabeira II.
The site has yielded several large piles of mollusk shells mixed with other debris, which are associated with prehistoric human occupation.
The researchers swirled recovered teeth in the solution to loosen the tartar. To isolate the particles in the tartar,
the scientists strained and spun the solution in a centrifuge.
They found three types of microfossils. Most common were starch grains from tubers. The researchers also found
diatoms, microscopic algae used as a food source by marine organisms, and phytoliths, tiny mineral particles
produced by plants.
The first two microfossils suggest the individual's last meals likely consisted of shellfish accompanied by some
sort of tuber.
"At the moment, we do not know where the phytoliths came from," Eggers said. "They might have been part of a
plant eaten or a plant element, such as palm leaves, chewed for the production of ropes, baskets, hammocks or
some kind of clothing."
Although the dental wash was successful, it made some of the ancient teeth brittle and changed the colour of others.
Since scientists hope to leave specimens in a condition as close to the original as possible, the researchers
suggested the dental wash recipe requires further tinkering. They also said particles might instead be removed using sound waves.
Sheila Souza, a scientist at Brazil's National School of Public Health, told Discovery News that she and colleague
Veronica Wesolowski have also been recovering particulate matter from ancient teeth.
"It is really new to try the washing technique proposed by the researchers," Souza said, adding that the new
dental wash technique is "interesting," but problematic for the reasons the inventors cited themselves.
Souza agreed that it is important to analyze ancient teeth, however, particularly in Brazil, where early plant
evidence has been sparse in some areas.
"We are really opening a big, new field to improve prehistoric reconstructions about the Sambaquis diet and
lifestyle with calculus and microfossils," she said.
Performance: Test of Pilots Shows Age May Be Advantageous
By ERIC NAGOURNEY
Airline passengers who feel reassured when they glimpse a shock of white hair in the cockpit may be on to something.
A new study finds that in pilots, the declines in physical and thinking skills that come with age may be
outweighed by their years of experience.
Writing in the February issue of Neurology, researchers described the results of a study in which 118 pilots, ages
40 to 69, were repeatedly tested in a flight simulator over three years. The study was led by Joy L. Taylor of the
Stanford/VA Aging Clinical Research Center.
In general, the researchers found, older pilots did not do as well the first time they used the simulators, which
tested skills in communicating with air traffic controllers, avoiding traffic, keeping track of cockpit instruments and
landing. The tests involved piloting a single-engine aircraft over flat terrain near mountains.
But as the tests were repeated over the period of the study, the older pilots' skills declined less than the younger
ones' did. In traffic avoidance, all the pilots' skills improved over time, but the older pilots improved more than the
younger ones. The study also found that pilots with more training scored better on the tests at the beginning, and
showed less decline over time.
Under current regulations, commercial airline pilots must retire at age 60, but some pilots are pushing to work
longer, and the results of this study may add to the debate. The researchers suggest that it may not be enough to
assess pilots' ability to stay in the job by using just the ordinary array of generalized skills tests.
The issue, they say, is that these tests do not measure the specific skills pilots have attained over the years.
Studies have shown that in many fields requiring expertise, from typing to music, those skills remain even as general abilities decline.
A Diagnosis for One, but an Impact Shared
By HARRIET BROWN
It was the first time my daughter's sixth-grade teacher had ever called, and clearly the news was not good.
Breathless and upset, she phoned me at work to say that 11-year-old Lulu had been "out of control" for several
hours that morning, yelling, crumpling papers, saying rude things and, eventually, running out of the building.
The class had been acting out a scenario about hunter-gatherers during a famine, she said, and Lulu was one of
the children chosen to ―starve to death.‖
“No one else got upset,‖ the teacher said. ―I don‘t understand why she did.‖
But I understood. From the moment I heard the words ―starve to death,‖ I knew exactly what had happened.
Fifteen months earlier, Lulu's older sister had received a diagnosis of anorexia, and our family had been consumed since then with helping her recover. Though we had tried to shield Lulu as much as possible, she had suffered along with the rest of us.
She'd seen her sister lie in a bed in the intensive care unit, listless and wasted. She'd sat at the table and listened to hysterical yelling and crying, suicide threats; she'd heard her beloved older sister say incomprehensible and scary things. She'd watched her sister eat her way back from the brink, bite after agonizing bite, and no doubt had seen more than we thought of our frustration, confusion and grief.
Now, finally, her sister was more or less recovered: weight restored, in school, back to her healthy and happy self. And I had been waiting for something like this to happen, ever since life in our house started to feel normal again.
In the nightmarish chaos of dealing with anorexia, my husband, Lulu and I had focused on helping her sister recover. Now that she had, it was safe for Lulu to feel all the terror, grief and rage of the last 15 months.
It was pure bad luck that it all came flooding out at school. The classroom role-play was the kind of flukish thing we couldn't have anticipated.
Being the other sibling is always hard. When one child grapples with a life-changing diagnosis or accident, parents have to focus on that child. And there are ways in which it's good for children to learn that other people have needs, and sometimes those needs come first.
The thing about eating disorders is that the reminders are everywhere. We all have to eat, after all. And there may be no time in a child's life when she feels more self-conscious and vulnerable about body image and weight than middle school. Add to that our school district's recent emphasis on nutrition, and I knew Lulu felt she couldn't get away from her sister's illness.
The classroom role-play was the most extreme example, but there were plenty of other triggers: posters in the sixth-grade hallway urging exercise and "healthy eating" to lose weight, a gym unit on fitness in which children were asked to record each other's weight, a movie that mentioned anorexia. All of these brought up the painful feelings again and again.
Then, too, most people don't truly understand how serious and pervasive an eating disorder can be. Lulu's teacher was shocked that the classroom role-play had caused such a reaction; I'm sure she thought Lulu had some kind of underlying emotional problem. When my husband and I sat down with her and shared some of what Lulu had gone through, I watched her face change as she began to understand why our daughter had reacted the way she did.
From other parents whose children had recovered from anorexia, I heard the same kinds of concerns. Predictably, the younger the child, the more acting out the parents saw, including clinginess, tantrums, mimicking dangerous behaviors like not eating, and depression. There seems to be a kind of symmetry to the siblings' recovery; it can take about a year to fully refeed an anorexic child at home, and another year for the sisters and brothers to get back to normal behavior.
Part of Lulu's recovery is talking about it. Recently she came home from school upset after an argument with another sixth grader, a friend who insisted that it's better to be thin than fat. With the image of her starving sister in mind, Lulu had replied, "I'd rather be too fat than too skinny." Her friend's response: "Thin is good — the thinner the better."
Lulu was furious when she told me about this conversation. Didn't her friend get it? Didn't she know what anorexia was like?
No, I told her, she doesn't. You said all the right things, but it will take more than a comment or two to change your friend's opinion. I rubbed her back and sat with her while she cried.
I knew the pain would lessen but never go away completely. For better or worse, what had happened to her sister had happened to her, and to all of us. None of us would ever be the same again.
On the (sound) track of anesthetics
Danish scientists challenge the accepted scientific views of how nerves function and of how anesthetics work. Their research suggests that the action of nerves is based on sound pulses and that anesthetics inhibit their propagation.
Every medical and biological textbook says that nerves function by sending electrical impulses along their length.
"But for us as physicists, this cannot be the explanation. The physical laws of thermodynamics tell us that electrical
impulses must produce heat as they travel along the nerve, but experiments find that no such heat is produced,"
says associate professor Thomas Heimburg from the Niels Bohr Institute at Copenhagen University. He received his
Ph.D. from the Max Planck Institute in Göttingen, Germany, where biologists and physicists often work together – at
most institutions these disciplines are worlds apart. Thomas Heimburg is an expert in biophysics, and when he came
to Copenhagen, he met professor Andrew D. Jackson, who is an expert in theoretical physics. They decided to work
together in order to study the basic mechanisms which govern the way nerves work.
Physics explains biology
Nerves are 'wrapped' in a membrane composed of lipids and proteins. According to the traditional explanation of
molecular biology, a pulse is sent from one end of the nerve to the other with the help of electrically charged salts
that pass through ion channels in the membrane. It has taken many years to understand this complicated process,
and a number of the scientists involved in the task have been awarded the Nobel Prize for their efforts. But –
according to the physicists – the fact that the nerve pulse does not produce heat contradicts the molecular biological
theory of an electrical impulse produced by chemical processes. Instead, nerve pulses can be explained much more simply as a mechanical pulse, according to the two physicists. And such a pulse could be sound.
Normally, sound propagates as a wave that spreads out and becomes weaker and weaker. If, however, the medium in which the sound propagates has the right properties, it is possible to create localized sound pulses, known as "solitons", which propagate without spreading and without changing their shape or losing strength.
[Figure: A biological membrane at its melting point. The green molecules are liquid, and the red are solid. Molecules of anesthetic reduce the number of red areas so that the sound pulse can no longer transport its signal. The nerve is anesthetized. Credit: Heiko Seeger, Ph.D., Niels Bohr Institute]
The membrane of the nerve is composed of lipids, a material that is similar to olive oil. This material can change
its state from liquid to solid with temperature. The freezing point of water can be lowered by the addition of salt.
Likewise, molecules that dissolve in membranes can lower the freezing point of membranes. The scientists found
that the nerve membrane has a freezing point, which is precisely suited to the propagation of these concentrated
sound pulses. Their theoretical calculations led them to the same conclusion: Nerve pulses are sound pulses.
Anesthetized by sound
How can one anesthetize a nerve so that feel ceases and it is possible to operate on a patient without pain? It
has been known for more than 100 years that substances like ether, laughing gas, chloroform, procaine and the
noble gas xenon can serve as anesthetics. The molecules of these substances have very different sizes and chemical
properties, but experience shows that their doses are strictly determined by their solubility in olive oil. Current
expertise is so advanced that it is possible to calculate precisely how much of a given material is required for the
patient. In spite of this, no one knows precisely how anesthetics work. How are the nerves "turned off"? Starting
from their theory that nerve signals are sound pulses, Thomas Heimburg and Andrew D.
Jackson turned their attention to anesthesia. The chemical properties of anesthetics are all so different, but their
effects are all the same - curious!
But the curious turned out to be simple. If a nerve is to be able to transport sound pulses and send signals along
the nerve, its membrane must have the property that its melting point is sufficiently close to body temperature and
responds appropriately to changes in pressure. The effect of anesthetics is simply to change the melting point – and
when the melting point has been changed, sound pulses cannot propagate. The nerve is put on stand-by, and
neither nerve pulses nor sensations are transmitted. The patient is anesthetized and feels nothing.
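The dosing rule mentioned above, that anesthetic requirement is set by solubility in olive oil, is the classic Meyer-Overton correlation. A minimal numerical sketch can illustrate it: using approximate textbook values (illustrative assumptions, not data from this study) for MAC, the standard inhaled-anesthetic potency measure, and for oil/gas partition coefficients, the product of the two stays roughly constant across chemically unrelated agents.

```python
# Meyer-Overton sketch: potency (1/MAC) tracks lipid solubility, so
# MAC * (oil/gas partition coefficient) should be roughly constant.
# Values below are approximate textbook figures, for illustration only.
agents = {
    # name: (MAC as % of 1 atm, oil/gas partition coefficient)
    "halothane":     (0.75, 224.0),
    "isoflurane":    (1.15,  98.0),
    "sevoflurane":   (2.0,   53.0),
    "nitrous oxide": (104.0,  1.4),
}

def meyer_overton_products(table):
    """Return {agent: MAC * oil/gas partition coefficient}."""
    return {name: mac * lam for name, (mac, lam) in table.items()}

products = meyer_overton_products(agents)
for name, p in sorted(products.items()):
    print(f"{name:14s} MAC x partition = {p:6.1f}")
# MAC spans two orders of magnitude across these agents, yet every
# product lands in the same narrow band (roughly 100-170 here).
```

That near-constancy is why, as the article notes, molecules of very different sizes and chemistries can all be dosed by one solubility rule; the soliton theory reinterprets why lipid solubility is the relevant quantity (dissolved molecules shift the membrane's melting point).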
Tough Question to Answer, Tough Answer to Hear
By JANE E. BRODY
Upon receiving a diagnosis of a fatal illness like metastatic cancer, Alzheimer's disease or congestive heart failure, many patients ask, "Doc, how much time have I got?" It's a reasonable question, given that there is often much to plan for and accomplish before a progressive illness robs patients of their physical or mental abilities.
Yet prognosticating is one of the most challenging tasks doctors face. Unless patients are within days or weeks of dying, it is often impossible to provide an accurate prognosis. And studies have shown that when doctors do try to gauge a patient's remaining life expectancy, more often than not they overestimate it. Out of fear, ignorance or concern for their patients' emotional well-being, they tend to be overly optimistic.
"Accurately predicting life expectancy in terminally ill patients is challenging and imperfect," a medical team wrote in the journal Mayo Clinic Proceedings in November 2005. "Physicians are typically optimistic in their estimates of patient survival."
The team, from the Mayo Clinic, cited a study in which survival predictions were made for 468 patients in hospice
programs, meaning they had stopped treatment to prolong their lives. Only 20 percent of the predictions were
accurate; 63 percent were too optimistic.
In a second study, also among patients receiving hospice care, the median survival was 24 days, but the median
survival estimate doctors offered patients was 90 days. In general, researchers have found that doctors tend to
overestimate patient survival by a factor of three to five.
As one physician interviewed anonymously by The Journal of the American Medical Association put it, "When we prognosticate and it turns out that the patient lives a longer life, then we can be joyous with them, but when we prognosticate and the patient ends up living a far shorter time, that's when we really do harm."
The Value of Candor
Patients use information about the expected course of their illness, including how long they are likely to survive,
in a variety of ways. It can help them decide whether to take a long-awaited trip, which therapies are worth
pursuing, what kind of support system they may need as their condition worsens, and how much time they will have
to put their affairs in order.
Patients often have things they want to accomplish before they die, and knowing that their time is short may
prompt them to attend to such matters. Receiving a terminal prognosis may also open up conversations about death
and dying that may be painful at first but can bring considerable relief to patients and family members alike.
Doctors do best in providing accurate prognoses for patients with advanced cancer, because the disease follows
a more predictable course and the medical literature provides a range of survival times for most cancers. For
example, when my brother-in-law was found to have mesothelioma, an asbestos-related cancer, he was told he
could expect to live 8 to 12 months. He used that time to get his financial and personal affairs in order, share
meaningful goodbyes with his family and friends, and pass along a rich legacy of good music, memories and wisdom.
Arlene Wysong, a New York businesswoman, was 65 years old, ostensibly very healthy and leading a rich, active
and fulfilling life when she was surprised by a diagnosis of Stage 4 lung cancer. She said she had quit smoking 22
years earlier after being "a marginal smoker for about 20 years."
"It was a total shock," she said, "and I knew what it meant. It meant I had an incurable cancer. I asked for a prognosis. The doctor said 3 to 12 months."
Ms. Wysong made a choice. Knowing she could not be cured, she chose "no chemotherapy, only palliative care" — treatment for pain and any other symptoms that might impede her ability to live out her last months as fully as possible.
"I didn't want toxic chemo," she said in an interview. "I didn't want to lose my hair and be sick. I felt I had a very short time left, and I didn't want to spend it being sick. So I rented a house in the country large enough for people to visit and stay overnight, and I enjoyed the summer.
"My goal was to make sure I saw all the friends and family I wanted to see and to spend quality time with them. I made out a new will and transferred my business, but I stayed involved with it for as long as I could."
Ms. Wysong also used the time to work on serious issues that had caused a rift with her daughter, and they succeeded in restoring a loving relationship. She also outlived her doctor's prognosis, and 16 months after learning of her cancer, she called in hospice care, remaining at home until she died last October with her family at her side.
Prognosis is helpful, not just for patients, but also for their families, who may need to know, for instance, how
much time they may have to take off from work, whether they should arrange for an extended leave, what might be
involved in caring for a dying person at home and whether other arrangements should be explored.
The Doctor’s Dilemma
"Quite separate from the challenge of estimating survival accurately, physicians may also find the process of disclosing the prognosis to their patients difficult," wrote Dr. Elizabeth B. Lamont and Dr. Nicholas A. Christakis in The Journal of the American Medical Association in July 2003.
In some cases, patients make it clear that they simply do not want to know.
More often, however, the family wants to keep difficult facts from the patient. A study in Ireland, for example,
found that while 83 percent of patients wanted to be told the truth, only 55 percent of their relatives wanted the
patient to be truthfully informed. The lesson here, the researchers concluded, is for doctors to ask patients, not
family members, how much they want to know about their disease.
A common fear among doctors is that providing a terminal prognosis will strip patients of hope. Indeed, it will
dash hopes of long-term survival. But the doctor can convey other sources of hope. For example, patients may be
relieved to learn that they will remain well enough to attend an important family event, or that palliative care is
available for distressing symptoms like pain, nausea and shortness of breath.
Most important, patients say, is for doctors to stay with them until the end. Fear of abandonment (some
terminally ill patients are in fact abandoned by their doctors) is extremely common. Doctors see themselves as
healers, trained to cure or ameliorate illness, and typically view the impending death of a patient as a personal
failure. Rather than face failure, they abandon the patient.
Patients may be able to help themselves in this respect by reassuring the doctor. "I know you tried very hard and I appreciate all you did for me," they might say. "It's not your fault that I won't survive this disease. It would help a lot, though, if you stay with me for the long haul."
In Lice, Clues to Human Origin and Attire
By NICHOLAS WADE
Published: March 8, 2007
One of the more embarrassing mysteries of human evolution is that people are host to no fewer than three kinds
of louse while most species have just one.
Even bleaker for the human reputation, the pubic louse, which gets its dates and residence-swapping
opportunities when its hosts are locked in intimate embrace, does not seem to be a true native of the human body.
Its closest relative is the gorilla louse. (Don't even think about it.)
Louse specialists now seem at last to have solved the question of how people came by their superabundance of
fellow travelers. And in doing so they have shed light on the two major turning points in the history of fashion:
when people lost their body hair, and when they first made clothing.
Three kinds of louse call Homo sapiens their home, but each occupies a different niche on the human body. The head louse, Pediculus humanus, lives in the forest of fine hairs on the scalp. Its cousin, the body louse, lives not on the skin but in clothes. And the exclusive territory of the pubic louse, Phthirus pubis, is the coarser hairs of the pubic region.
Lice are intimately adapted to their hosts and cannot long survive away from the body's blood and warmth. If their host evolves into two species, the lice will do likewise. So biologists have long been puzzled over the fact that the human head louse is a sister species to the chimpanzee louse, but the pubic louse is closely related to the gorilla louse.
By comparing louse DNA, a team led by David L. Reed of the University of Florida has now reconstructed how this strange situation probably came about. Dr. Reed's team collected pubic lice from a public health clinic in Salt Lake City. Samples of gorilla lice were obtained by members of the Mountain Gorilla Veterinary Project, which provides free health care to gorillas.
[Photo: Scientists believe they have figured out how and why the human pubic louse, right, and the gorilla louse, left, diverged 3.3 million years ago. Left, The Natural History Museum, London; right, Mona Lisa Productions/Photo Researchers]
The number of DNA differences between the gorilla louse and the pubic louse indicates that they diverged some 3.3 million years ago, Dr. Reed and colleagues report in today's issue of the journal BMC Biology. Among people, the pubic louse is usually spread by sexual contact, but the gorilla louse could have been contracted in some other way.
"We'll never know if it was sex or something more tame," Dr. Reed said. What can be said about the transfer, he believes, is that it signals human ancestors had already lost their body hair by 3.3 million years ago, confining the human louse to the head and leaving the groin open to invasion by the gorilla louse.
Archaeologists contend that human ancestors lost their standard ape body hair when they left the shade of the forests for the hot, open savanna and needed bare skin for efficient sweating. Adaptation to the savanna was well in place by 1.7 million years ago. But loss of body hair could have begun earlier, and Dr. Reed's result suggests a time for when people first shed their body hair.
[Graphic: Divergences in Lice Species and Milestones in Human Evolution]
If people first became nudists 3.3 million years ago, when did they start to wear clothes? Surprisingly, lice once
again furnish the answer. Though humans may long have worn loose garments like animal skin cloaks, the first
tailored clothing would have been close-fitting enough to tempt the head louse to expand its territory. It evolved a
new variety, the body louse, with claws adapted for clinging to fabric, not hairs.
In 2003, Mark Stoneking, a geneticist at the Max Planck Institute in Leipzig, Germany, estimated from DNA
differences that the body louse evolved from the head louse about 107,000 years ago. The first sewn clothes were
presumably made shortly before this time.
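Estimates like these come from a molecular-clock calculation: DNA differences accumulate at a roughly constant rate on both diverging lineages, so the split time is the observed per-site difference divided by twice the per-year substitution rate. A minimal sketch follows; the rate used is a generic illustrative value, not the one used by Dr. Stoneking or Dr. Reed.

```python
# Molecular-clock sketch: two lineages both accumulate substitutions
# after splitting, so observed divergence d ~ 2 * rate * time.
def divergence_time(per_site_differences: float, rate_per_site_per_year: float) -> float:
    """Years since two lineages split, under a strict molecular clock."""
    return per_site_differences / (2.0 * rate_per_site_per_year)

# Example: 2% of sites differing, at 1e-8 substitutions per site per year,
# puts the split about a million years back.
t = divergence_time(0.02, 1e-8)
print(f"Estimated divergence: {t:,.0f} years ago")
```

In practice such clocks are calibrated against independently dated events (here, host divergences known from fossils) rather than assumed outright.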
Probing back even earlier in louse evolution, Dr. Reed and his colleagues report that the two species of primate
lice, Pediculus and Phthirus, probably diverged from each other on an ape host 13 million years ago. The divergence
may have happened after the lice started to specialize in different parts of the body.
Some seven million years ago, this ancient ape species split into gorillas and the ancestors of humans and chimps,
with both lineages infected by both species of lice. But Pediculus then fell extinct in its gorilla hosts, according to Dr.
Reed‘s reconstruction, and Phthirus vanished from the chimp-human ancestor. Next, chimps and humans diverged,
and their joint louse diverged with them into Pediculus humanus and Pediculus schaeffi.
The last event in this history of human-louse cohabitation was the transfer of the gorilla's Phthirus louse to humans.
Dr. Stoneking said Dr. Reed's reconstruction was "pretty reasonable" and said he agreed that acquisition of the gorilla's louse indicated people had lost their body hair by then. "The transfer doesn't have to be sexual," he said, "but presumably it does require reasonably close contact."
Paper challenges 1491 Amazonian population theories
Much of Amazonian Basin not well-populated -- No 'built landscape'
There's a scholarly debate brewing about whether pre-Columbian Amazonian populations settled in large numbers across Amazonia and created the modern forest setting that many conservationists take to be 'natural.' This view has become fashionable among many archaeologists and anthropologists, and is challenged in a recent paper from Dr. Mark Bush of the Florida Institute of Technology. The findings of Bush's research may rekindle a debate that has major implications for land use and policy-setting in the rain forest.
"We don't contradict that there were major settlements in key areas flanking the Amazon Channel -- there could
have been millions of people living there," says Mark Bush, a British-born paleo-ecologist who travels to extremely remote rain forest locations to collect core samples from ancient lakes. He then analyzes those samples for pollen and charcoal and thus is able to determine with a high degree of accuracy the extent of human settlement in that area.
"What we do say is that when you start to look away from known settlements, you may see very long-term local
use," he says. "These people didn't stray very far from home, or from local bodies of water for several thousands of
years. We looked at clusters of lakes and landscapes where people lived, and asked, did they leave their homesite
to farm around other nearby lakes? No they didn't. These findings argue for a very localized use of Amazonian
forest resources outside the main, known, archaeological areas."
Bush says the evidence comes from a geographically diverse area: three districts, each with 3 (in two cases) or 4 lakes.
"In each we have one lake occupied and used, and the others little used or not used at all," he says. "So this is a
total of 10 lakes that provide three separate instances -- one in Brazil, one in Ecuador and one in Peru, where there
is evidence of long, continuous occupation of more than 5,000 years that did not spread to the adjacent, 8 to 10
kilometer distant lakes."
The findings are published in a paper titled "Holocene fire and occupation in Amazonia: records from two lake
districts" that appears in a recent issue of Philosophical Transactions of the Royal Society of London B: Biological
Sciences. Bush says this paper, and another forthcoming in the journal Frontiers in Ecology and the Environment,
have important policy implications.
That's because the hypothesis of human-manufactured landscapes has been made popular by Charles Mann's book, 1491: New Revelations of the Americas Before Columbus, and could influence conservation policy in the Americas. That millions of people once populated the Americas, and that in Amazonia, at least, the rainforest is the product of long-term human use, has been used by farmers and loggers as justification for clearcutting rainforests. Their argument, that the ecosystem already experienced vast landscape disturbance and proved resilient, relies on the ubiquitous influence of pre-Columbian people, a suggestion that Bush's work rejects.
"These data are directly relevant to the resilience of Amazonian conservation, as they do not support the
contention that all of Amazonia is a 'built landscape' and therefore a product of past human land use," Bush says.
"Most archaeologists are buying into the argument that you had big populations that transformed the landscape en
masse. Another group of archaeologists say that transformation was very much limited to river corridors, and if you
went away from the river corridors there wasn't that much impact. That's what our findings tend to support."
Bush doesn't expect that his new findings will settle the debate, however.
"There's just too much passion on this issue. People who are inclined to believe what we're talking about will say
this is very strong evidence, and say 'let's have more.' The archaeologists will say this study only examines two
Bush himself calls the paper, co-authored with Claudia Listopad, William D. Gosling, and Christopher Williams of
Florida Tech, Paulo E. de Oliveira of Universidade do Guarulhos in Brazil, Miles R. Silman and Carolyn Krisel of Wake
Forest and Mauro B. de Toledo of Florida Tech and Universidade Federal Fluminense in Brazil, an important first
step in making the case, through core sampling and pollen and charcoal analysis of sediment from seven lake
bottoms, three in one district, four in the other, that much of Amazonia has not been transformed by human actions,
and ideally should be kept that way, to preserve species biodiversity.
"The way to see this is as a sneak peak," he says. "It's a new way to look at landscapes and it's a new tool. The
study needs to be replicated in more places before people will be persuaded, but it's certainly a warning shot across
"While the majority of archaeologists argue the rivers were the major conduit for populations," he adds, "there is
an increasing vocalization that there was much more widespread habitat transformation; that you still had a bulk of
people along the river but their influence extended deep into the forest. It's still nebulous, and difficult to get people
to map stuff, or put hard numbers on it, but there is a sentiment that the Amazonia has been disturbed and that the
view of the Amazonian rainforest as a built landscape is gaining momentum. There are extremes at either ends, and
the majority of people are in middle but there's a tendency of drifting toward the high end."
For example, he says, population estimates in the 1950s were 1 million; in the '70s the estimate drifted up to 4 million; and in the 1990s it drifted up to 10 million.
"We've now got a polarized community," he says.
At one end, he says, is Anna Roosevelt of the Field Museum in Chicago (she argues for large populations dispersed throughout Amazonia); at the other is Betty Meggers at the Smithsonian (she argues these were very primitive people with low population).
Bush's studies are the first to apply core-sampling methodology to determine, through charcoal and pollen levels, how much human activity was going on.
A Toast to Evolvability and Its Promise of Surprise
By NATALIE ANGIER
Late last month, the day after my birthday, I was feeling punch drunk on my favorite glogg of sullenness, self-pity and panic. My life was passing by at relativistic speed, not one of my rotten siblings had called to wish me a happy birthday, my husband hadn't bothered to arrange so much as a waiter-serenaded slice of cake at the restaurant the night before, and did he really think that his gift to me of an "amazing squirrel-proof bird feeder" would excite anybody but the squirrels?
My post-birthday gloom was so rich, so satisfyingly glutinous, that I forgot to be suspicious, and when we headed over to a neighbor's house later that evening, I opened the door like a cartoon buffoon onto a huge throng of friends and relations, gathered from across the nation and athwart my entire curriculum vitae, bellowing out in fractured synchrony that magic word "Surprise!" I gasped. I squealed. I felt like I'd died and gone to a TV game show. I'd gotten the surprise party of my admittedly oft-expressed fantasies, and I was thrilled, moved and profoundly grateful. Yet as I stumbled in a stupor from one friend who'd spent hundreds of dollars on airfare just to be there to the next, I couldn't help wondering why I'd wanted such a shock to my system in the first place.
I'm not much of a thrill seeker or adventurer. I like libraries, museums and speed bumps. I am, nevertheless, a multicellular organism of reasonably complex structure, and we complex bioforms can't help but appreciate novelty. We are the fruits of it. If not for evolutionary novelty — that is, the periodic and often radical overhauling of an existing cell type, body plan, limb shape or brain design into something new and useful, or at least entertaining — we might still be so many daubs of blue-green algae decorating an Australian rock. And while I mean no offense to algae and recognize that my ancestors looked very much like them, an algal colony has yet to throw me a surprise party or make a passable stab at saying "G'day."
A tip of the paper-cone hat, then, to biological novelty. Under its tutelage, early groups of cells made the leap
from the sleepy expulsion of oxygen as waste to the aerobic consumption of oxygen to grow at a hastier pace; and
groups of single cells learned to pool their talents into multicellular collectives of specialized body compartments that
could then go out and hunt other multicellular collectives; and fishy fins became amphibious feet and crept onto the
beach, and some land-weary feet changed their mind and flippered back to the sea, while still other limb bones
lengthened and found skin flaps for flying, and, hey, this airborne business is pretty handy, let's rearticulate the forelimbs of three separate lineages and take wing as a pterodactyl, a bird, a bat.
As scientists see it, these and others of nature's fancy feats forward are clearly the result of large-scale evolutionary forces, but the precise mechanisms behind any given innovation remain piquantly opaque. For some researchers, the conventional gradualist narrative, in which organisms evolve over time through the steady accretion of many mincing genetic mutations, feels unsatisfying when it comes to understanding true biological novelty.
"The standard Darwinian view always sounds like a better theory for making improvements than for making inventions," said Dr. Marc W. Kirschner, a professor of systems biology at Harvard Medical School. If incremental, additive genetic changes were responsible for all the boggling biodiversity we see around us, he said, how can it be that humans have hardly more genes than a microscopic nematode, and that many of those genes are nearly identical in roundworms and humans besides?
In their recently published book, "The Plausibility of Life," Dr. Kirschner and Dr. John C. Gerhart of the University of California, Berkeley, offer a fresh look at the origins of novelty. They argue that many of the basic components and systems of the body possess the quality of what they call "evolvability" — that is, the components can be altered without wreaking havoc on the parts and systems that connect to them, and can even produce a reasonably functional organ or body part in their modified configuration. For example, if a genetic mutation ends up lengthening a limb bone, said Dr. Kirschner, the other parts that attach to and interact with that bone needn't also be genetically altered in order to yield a perfectly serviceable limb. The nerves, muscles, blood vessels, ligaments and skin are all inherently plastic and adaptable enough to stretch and accommodate the longer bone during embryogenesis and thus, as a team, develop into a notably, even globally, transformed limb with just a single mutation at its base. And if, with that lengthened leg, the lucky recipient gets a jump on its competitors, well, g'day to you, baby kangaroo.
Dr. Kirschner also observes that cells and bodies are extremely modular, and parts can be moved around with
ease. A relatively simple molecular switch that in one setting allows a cell to respond to sugar can, in a different
context, help guide the maturation of a nerve cell. In each case, the activation of the switch initiates a tumbling
cascade of complex events with a very distinctive outcome, yet the switch itself is just your basic on-off protein
device. By all appearances, evolution has flipped and shuffled and retrofitted and duct-taped together a
comparatively small set of starter parts to build a dazzling variety of botanic and bestial bodies.
The combined modularity and bounciness of body parts suggest that life is spring-loaded for change, for
outrageous commixtures, the wildest fusion cuisine. And who knows whether our organismic suppleness, our deep
evolvability, isn't related to our mental thirst for the new, and our hope that behind the door lies the best surprise.
Study Uncovers Memory Aid: A Scent During Sleep
By BENEDICT CAREY
Science Published: March 9, 2007
Scientists studying how sleep affects memory have found that the whiff of a familiar scent can help a slumbering
brain better remember things that it learned the evening before. The smell of roses — delivered to people's nostrils
as they studied and, later, as they slept — improved their performance on a memory test by about 13 percent.
The new study, appearing today in the journal Science, is the first rigorous test of the effect of odor on human
memory during sleep. The results, whether or not they can help students cram for tests, clarify the picture of what
the sleeping brain does with newly learned material and help illuminate what it takes for this process to succeed.
Researchers have long known that sleep is crucial to laying down new memories, and studies in the 1980s
and ‘90s showed that exposing the sleeping brain to certain cues — the sound of clicking, for instance — could
enhance the process. But it is only in recent years that scientists have begun to understand how this is possible.
"The idea didn't get any traction with scientists back then, because it didn't make sense," said Dr. Robert
Stickgold, an associate professor of psychiatry at Harvard, who was not involved in the research. The new study, Dr.
Stickgold added, "shows not only that sleep is important for declarative memory, but also allows us to look at
exactly when and how this process might happen."
In the study, neuroscientists from two German institutions, the University of Lübeck and the University Medical
Center Hamburg-Eppendorf, had groups of medical students play a version of concentration, memorizing the
location of card pairs on a computer screen. Upon learning the location of each pair, the students received a burst
of rose scent in their noses through masks they wore. The researchers delivered the fragrance in bursts because the
brain quickly adjusts to strong smells in the air and begins to ignore them.
The students went to sleep about a half-hour later, with electrodes on their heads tracking the depth of their
slumber. Neuroscientists divide sleep into stages, including deep (or slow wave) sleep and the shallow, dream-rich
state called rapid eye movement (or REM) sleep.
The brain is thought to process newly acquired facts, figures and locations most efficiently in deep sleep. This
restful state usually descends within the first 20 minutes or so after head meets pillow and may last an hour or
longer, then recur once or more later in the night. The researchers delivered pulses of rose bouquet during this
slow-wave state; the odor did not interrupt sleep, and the students said they had no memory of it.
But their brains noticed, and retained an almost perfect memory of card locations. The students scored an
average of 97 percent on the card game, compared with 86 percent when they played the game and slept without
being perfumed by nighttime neuroscience fairies.
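As a quick arithmetic aside, the "about 13 percent" improvement quoted at the top of the article is just the relative gain between these two average scores:

```python
# The two average scores reported in the article: 97% on the card game with
# the rose cue during slow-wave sleep, 86% without it.
with_cue = 97.0
without_cue = 86.0

# Relative improvement of the cued condition over the uncued one.
relative_gain = (with_cue - without_cue) / without_cue * 100
print(f"{relative_gain:.1f}%")  # prints 12.8%, i.e. "about 13 percent"
```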
The students did not get the same boost when they received bursts of the fragrance just before sleep or in REM
sleep rather than in deep slumber, and their improvements were not due to practice, the study found.
The study‘s results could eventually help doctors improve patients‘ memory by devising treatments directed at
deep sleep. As they age, people spend less and less time each night in such sleep, and existing sleep medications
do not generally increase it. But pharmaceutical companies are investigating compounds that do so.
Previous research has shown that regions of the cortex, the thinking and planning part of the brain, communicate
during deep sleep with a sliver of tissue deeper in the brain called the hippocampus, which records each day's
memories. What is most likely happening in that communication, the study‘s authors argue, is that the cortex is
telling the hippocampus to reactivate the same neurons that fired when a particular fact was noticed or learned. The
hippocampus does so, encoding the firing sequence in the cortex and thereby consolidating the memory.
"We would expect spontaneous reactivation driven by the slow-wave sleep, but by presenting the rose odor
cues we intensified this activation and enhanced the transfer of these memories," said Dr. Jan Born, a neuroscientist
at Lübeck who undertook the study with Björn Rasch, Christian Büchel and Steffen Gais.
Olfactory sensing pathways in the brain lead more directly to the hippocampus than do visual and auditory ones.
That may be why smells can so vividly revive things past, like forgotten joys or humiliations.
To check their reasoning, the researchers took M.R.I. images of some of the students‘ brains during their rose-
scented slumber. As expected, regions of the cortex became noticeably more active, as did the hippocampus.
The findings suggest that distinct sleep states may be specialized to integrate different kinds of information. For
example, the researchers found that the rose scent did not enhance memories of a learned finger-tapping sequence
— a rhythmic memory that does not appear to be consolidated by the hippocampus.
Likewise, given that the rose fragrance during REM sleep made no difference to the students' scores, it may be
that the hues, horrors and hilarity of dreams during REM reflect the brain‘s efforts to integrate emotional, rather
than factual, memories, said Dr. Stickgold, of Harvard.
"Extracting patterns and rules and what we call the gist of a memory might turn out to be antithetical to the
process of nailing down the facts themselves," Dr. Stickgold said. "So, for instance, you might use REM to integrate
one, and slow-wave sleep for the other."
The new findings hardly close the book on how memories are formed and consolidated during sleep. Other
scientists have found evidence that rather than reactivation, the brain's slow-wave state induces an overall
weakening of neuron-to-neuron signaling, making recently recorded memories look bolder by reducing the
background neural "noise." And it may be, Dr. Born said, that both processes are occurring during sleep: a pruning
away of the noise of the day‘s irrelevant observations, and a replaying of its important ones.
Either way, the researchers said, the new findings are likely to prompt some creative thinking on the part of
students facing the terror of final exams. (The German research group has preliminary evidence that acrid smells
may be even better at enhancing memory.)
"We use an apparatus to sense the onset of slow-wave sleep and deliver the odor" in short, alternating bursts,
Dr. Born said, adding, "I suppose for some students it would not be too difficult to develop something like this."
That‘s what engineering departments are for.
A United Kingdom? Maybe
By NICHOLAS WADE
Britain and Ireland are so thoroughly divided in their histories that there is no single word to refer to the
inhabitants of both islands. Historians teach that they are mostly descended from different peoples: the Irish from
the Celts and the English from the Anglo-Saxons who invaded from northern Europe and drove the Celts to the
country's western and northern fringes.
But geneticists who have tested DNA throughout the British Isles are edging toward a different conclusion. Many
are struck by the overall genetic similarities, leading some to claim that both Britain and Ireland have been inhabited
for thousands of years by a single people that has remained in the majority, with only minor additions from later
invaders like Celts, Romans, Angles, Saxons, Vikings and Normans. The implication that the Irish, English, Scottish
and Welsh have a great deal in common with each other, at least from the geneticist's point of view, seems likely to
please no one. The genetic evidence is still under development, however, and because only very rough dates can be
derived from it, it is hard to weave evidence from DNA, archaeology, history and linguistics into a coherent picture
of British and Irish origins.
That has not stopped the attempt. Stephen Oppenheimer, a medical geneticist at the University of Oxford, says
the historians' account is wrong in almost every detail. In Dr. Oppenheimer's reconstruction of events, the principal
ancestors of today's British and Irish populations arrived from Spain about 16,000 years ago, speaking a language
related to Basque.
The British Isles were unpopulated then, wiped clean of people by glaciers that had smothered northern Europe
for about 4,000 years and forced the former inhabitants into southern refuges in Spain and Italy. When the climate
warmed and the glaciers retreated, people moved back north. The new arrivals in the British Isles would have found
an empty territory, which they could have reached just by walking along the Atlantic coastline, since the English
Channel and the Irish Sea were still land.
This new population, who lived by hunting and gathering, survived a sharp cold spell called the Younger Dryas
that lasted from 12,300 to 11,000 years ago. Much later, some 6,000 years ago, agriculture finally reached the
British Isles from its birthplace in the Near East. Agriculture may have been introduced by people speaking Celtic, in
Dr. Oppenheimer‘s view. Although the Celtic immigrants may have been few in number, they spread their farming
techniques and their language throughout Ireland and the western coast of Britain. Later immigrants from
northern Europe had more influence on the eastern and southern coasts. They too spread their language, a branch
of Germanic, but these invaders' numbers were also small compared with the local population.
In all, about three-quarters of the ancestors of today's British and Irish populations arrived between 15,000 and
7,500 years ago, when rising sea levels split Britain and Ireland from the Continent and from each other, Dr.
Oppenheimer calculates in a new book, "The Origins of the British: A Genetic Detective Story" (Carroll & Graf, 2006).
Ireland received the fewest of the subsequent invaders; their DNA makes up about 12 percent of the Irish gene
pool, Dr. Oppenheimer estimates. DNA from invaders accounts for 20 percent of the gene pool in Wales, 30 percent
in Scotland, and about a third in eastern and southern England.
But no single group of invaders is responsible for more than 5 percent of the current gene pool, Dr. Oppenheimer
says on the basis of genetic data. He cites figures from the archaeologist Heinrich Haerke that the Anglo-Saxon
invasions that began in the fourth century A.D. added about 250,000 people to a British population of one to two
million, an estimate that Dr. Oppenheimer notes is larger than his but considerably less than the substantial
replacement of the English population assumed by others. The Norman invasion of 1066 brought not many more
than 10,000 people, according to Dr. Haerke.
Other geneticists say Dr. Oppenheimer‘s reconstruction is plausible, though some disagree with details. Several
said genetic methods did not give precise enough dates to be confident of certain aspects, like when the first
settlers arrived.
"Once you have an established population, it is quite difficult to change it very radically," said Daniel G. Bradley,
a geneticist at Trinity College, Dublin. But he said he was "quite agnostic" as to whether the original population
became established in Britain and Ireland immediately after the glaciers retreated 16,000 years ago, as Dr.
Oppenheimer argues, or more recently, in the Neolithic Age, which began 10,000 years ago.
Bryan Sykes, another Oxford geneticist, said he agreed with Dr. Oppenheimer that the ancestors of "by far the
majority of people" were present in the British Isles before the Roman conquest of A.D. 43. "The Saxons, Vikings
and Normans had a minor effect, and much less than some of the medieval historical texts would indicate," he said.
His conclusions, based on his own genetic survey and information in his genealogical testing service, Oxford
Ancestors, are reported in his new book, "Saxons, Vikings and Celts: The Genetic Roots of Britain and Ireland."
A different view of the Anglo-Saxon invasions has been developed by Mark Thomas of University College, London.
Dr. Thomas and colleagues say the invaders wiped out substantial numbers of the indigenous population, replacing
50 percent to 100 percent of those in central England. Their argument is that the Y chromosomes of English men
seem identical to those of people in Norway and the Friesland area of the Netherlands, two regions from which the
invaders may have originated.
Dr. Oppenheimer disputes this, saying the similarity between the English and northern European Y chromosomes
arises because both regions were repopulated by people from the Iberian refuges after the glaciers retreated.
Dr. Sykes said he agreed with Dr. Oppenheimer on this point, but another geneticist, Christopher Tyler-Smith of
the Sanger Centre near Cambridge, said the jury was still out. "There is not yet a consensus view among geneticists,
so the genetic story may well change," he said. As to the identity of the first postglacial settlers, Dr. Tyler-Smith said
he "would favor a Neolithic origin for the Y chromosomes, although the evidence is still quite sketchy."
Dr. Oppenheimer‘s population history of the British Isles relies not only on genetic data but also on the dating of
language changes by methods developed by geneticists. These are not generally accepted by historical linguists,
who long ago developed but largely rejected a dating method known as glottochronology. Geneticists have recently
plunged into the field, arguing that linguists have been too pessimistic and that advanced statistical methods
developed for dating genes can also be applied to languages.
Dr. Oppenheimer has relied on work by Peter Forster, a geneticist at Anglia Ruskin University, to argue that Celtic
is a much more ancient language than supposed, and that Celtic speakers
could have brought knowledge of agriculture to Ireland, where it first
appeared. He also adopts Dr. Forster's argument, based on a statistical
analysis of vocabulary, that English is an ancient, fourth branch of the
Germanic language tree, and was spoken in England before the Roman
invasion.
English is usually assumed to have developed in England, from the
language of the Angles and Saxons, about 1,500 years ago. But Dr.
Forster argues that the Angles and the Saxons were both really Viking
peoples who began raiding Britain ahead of the accepted historical
schedule. They did not bring their language to England because English, in
his view, was already spoken there, probably introduced before the arrival
of the Romans by tribes such as the Belgae, whom Caesar describes as
being present on both sides of the Channel.
The Belgae perhaps introduced some socially transforming technique,
such as iron-working, which led to their language replacing that of the
indigenous inhabitants, but Dr. Forster said he had not yet identified any
specific innovation from the archaeological record.
Germanic is usually assumed to have split into three branches: West
Germanic, which includes German and Dutch; East Germanic, the
language of the Goths and Vandals; and North Germanic, consisting of the
Scandinavian languages. Dr. Forster's analysis shows English is not an
offshoot of West Germanic, as usually assumed, but is a branch
independent of the other three, which also implies a greater antiquity.
Germanic split into its four branches some 2,000 to 6,000 years ago, Dr. Forster estimates.
Historians have usually assumed that Celtic was spoken throughout Britain when the Romans arrived. But Dr.
Oppenheimer argues that the absence of Celtic place names in England — words for places are particularly durable
— makes this unlikely.
If the people of the British Isles hold most of their genetic heritage in common, with their differences consisting
only of a regional flavoring of Celtic in the west and of northern European in the east, might that perception draw
them together? Geneticists see little prospect that their findings will reduce cultural and political differences. The
Celtic cultural myth "is very entrenched and has a lot to do with the Scottish, Welsh and Irish identity; their main
identifying feature is that they are not English," said Dr. Sykes, an Englishman who has traced his Y chromosome
and surname to an ancestor who lived in the village of Flockton in Yorkshire in 1286.
Dr. Oppenheimer said genes "have no bearing on cultural history." There is no significant genetic difference
between the two communities of Northern Ireland, yet they have been fighting with each other for 400 years, he said.
As for his thesis that the British and Irish are genetically much alike, "It would be wonderful if it improved
relations, but I somehow think it won't."
Despite their heft, many dinosaurs had surprisingly tiny genomes
Two major classes of dinosaurs show genomes distinctly aligned with modern birds, reptiles
CAMBRIDGE, Mass. -- They might be giants, but many dinosaurs apparently had genomes no larger than that of a
modern bird.
So say scientists who've linked bone cell and genome size among living species and then used that new
understanding to gauge the genome sizes of 31 species of extinct dinosaurs and birds, whose bone cells can be
measured from the fossil record.
The researchers, at Harvard University and the University of Reading, were led by Chris Organ and Scott V.
Edwards, both at Harvard. They report their findings this week in the journal Nature.
"We see distinct differences between two major lineages of dinosaurs," says Organ, a postdoctoral fellow in
organismic and evolutionary biology supported by the National Institutes of Health. "The theropods -- carnivores
such as Tyrannosaurus rex and Velociraptor -- had very small genomes, in the range of modern birds. Ornithischians
-- which include Stegosaurus and Triceratops -- had more moderately sized genomes, akin to those of living lizards
and crocodilians. We aren't sure about the genomes of the long-necked sauropods yet."
Organ and Edwards say the clear-cut dichotomy in dinosaur genomes is likely due to different amounts of
repetitive and non-coding DNA in the two groups' genetic material, a factor largely responsible for variation in
genome size across animal species. They estimate that active repetitive DNA might have comprised an average 12
percent of the ornithischian genome but just 8.4 percent of theropod genetic constitution.
The work indicates that the small genomes typically associated with birds -- whose genetic composition is
noticeably sparer than that of other vertebrates -- evolved in dinosaurs some 230 to 250 million years ago, rather
than with the emergence of modern living birds just 110 million years ago. Organ and Edwards suggest that after this
shrinking, theropod genomes then stabilized in size for hundreds of millions of years, a process that continues in
birds today.
"Our work debunks the theory that the small, repeat-poor genomes typical of birds may have co-evolved with
flight as a means of conserving energy," says Edwards, professor of organismic and evolutionary biology in
Harvard's Faculty of Arts and Sciences and Alexander Agassiz Professor of Zoology and curator of ornithology in
Harvard's Museum of Comparative Zoology. "In fact, our work shows these streamlined genomes arose long before
the first birds and flight, and can be added to the list of dinosaur traits previously thought to be found only in
modern birds, including feathers, pulmonary innovations, and parental care and nesting."
Other researchers had previously determined that the sizes of various cell types, across species, tend to reflect
the size of an organism's genome. Analyzing 26 living species, Organ and Edwards are the first to show that the
same applies to the bone cells called osteocytes.
These cells reside in individual lacunae, small pockets inside bone tissue. This uniquely durable cellular housing
allowed the scientists to look back in time at the size of 31 extinct species' genomes: By measuring lacunae in
dinosaur and extinct bird specimens housed at Harvard's Museum of Comparative Zoology and at the Museum of
the Rockies in Bozeman, Mont., they were able to determine just how big the various extinct species' osteocytes had
been.
"These fossils let us sample species through evolutionary time," Edwards says, "providing genomic information
that's often unavailable for long-extinct ancestors."
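The inference chain described above, calibrating genome size against osteocyte size in living species and then applying that fit to fossil lacunae, can be sketched as a simple least-squares regression. The study itself used more sophisticated phylogenetically informed methods, and every number below is an invented placeholder rather than a measurement from the paper:

```python
# Sketch of the comparative approach: fit genome size (picograms) against
# osteocyte lacuna size in living species, then predict for fossils.
# All numbers here are made-up placeholders, NOT the study's data.

living = [  # (lacuna volume in cubic micrometers, genome size in pg)
    (200.0, 1.4),  # e.g. a small bird-like genome
    (300.0, 2.1),
    (420.0, 3.0),
    (550.0, 3.9),  # e.g. a lizard-like genome
]

# Ordinary least-squares fit, computed by hand.
n = len(living)
mean_x = sum(x for x, _ in living) / n
mean_y = sum(y for _, y in living) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in living)
         / sum((x - mean_x) ** 2 for x, _ in living))
intercept = mean_y - slope * mean_x

def predict_genome(lacuna_volume):
    """Estimate genome size from a fossil's measured lacuna volume."""
    return intercept + slope * lacuna_volume

# A fossil with small lacunae predicts a small, bird-like genome.
print(round(predict_genome(250.0), 2))  # prints 1.76
```

Because lacunae survive fossilization intact, the fitted line is all that is needed to carry the living-species relationship back to extinct ones.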
A gatekeeper for the US drug industry
IN THE land of the free market, the idea of the government influencing the choice and cost of medicines is
heresy. But that's exactly what's in store for the US, as it tries to rein in its healthcare costs, which threaten to
cripple the economy if left unchecked.
Other countries have already taken steps to deal with such problems. The UK set up the National Institute for
Health and Clinical Excellence (NICE) in 1999 to decide which drugs the country's National Health Service could use
(see "UK's gatekeeper has the final say"). Similar organisations operate in Australia and Canada, and all claim to be
working successfully, allowing governments to just say no to ineffective drugs and haggle with pharmaceutical
companies when prices are too high.
Now, at long last, the US is considering a similar proposal in the shape of a proposed Comparative Effectiveness
Board (CEB), which would review the evidence on how well drugs work and whether they are cost-effective. If
necessary, the CEB would carry out its own clinical trials. The idea is to break the pharmaceutical industry's
stranglehold on drug prices and stop it peddling marginally effective medicines. The drug industry is already
expressing its displeasure at the idea of a government body judging a drug's value for money.
Support for such a body is growing in both the public and private healthcare arenas. "There are cultural
differences about how the role of government is viewed, and most Americans tend to be on the side of 'less
government'," says Steve Pearson, of Harvard Medical School and a key proponent of the CEB. "But that's starting
to change, as people have problems affording healthcare, and something has to give."
On a diet? You'll spend more on impulse purchases
People who exercise self control in some way, such as dieting or trying not to look at or think about something,
will tend to make more impulse purchases if given the opportunity, explains a study from the March issue of the
Journal of Consumer Research.
Kathleen D. Vohs and Ronald J. Faber (University of Minnesota) point out that opportunities for impulse
purchasing have increased with the proliferation of ATMs, shopping on the Internet, and shop-at-home television
programs. Instead of gauging desire for things, they are the first to measure actual spending patterns after the
exertion of self-control, expanding the literature to include the impulse to buy.
For example, the researchers explore the effects of mental self control in an experiment that asked a group of
participants to write down all their thoughts for six minutes. Another group was told that they should also write
down all their thoughts – with one exception. The participants were told that if they thought of "a white bear" they
were NOT to write it down, but instead to place a check mark at the side of their paper.
The participants were then told that they were taking part in an unrelated study and given $10 to spend on
items from the college bookstore. They were told that whatever money they did not spend was theirs to keep.
Even a small regulatory exercise – the attempt to suppress an innocuous thought about a white bear – caused
the participants who had exercised mental self-control to spend and buy more. Those who had just tried to control
their thoughts spent an average of $4.05. Those who had been free to write whatever they wanted spent an
average of $1.21. Participants who had previously been asked to exercise self-control also bought twice as many
items on average as members of the unregulated group.
"Overall, the research shows that people need self-regulatory resources to resist impulse buying temptations, and
that these resources can be depleted by prior self-control efforts," write the researchers. "As a result, people should
avoid shopping on days when they have earlier exercised great self-control or when starting a new self-
improvement program, such as a new diet."
Stroke patients admitted to hospitals on weekends may be more likely to die
American Heart Association rapid access journal report
Patients admitted to hospitals for ischemic stroke on weekends had a higher risk of dying than patients admitted
during the week, in a Canadian study published in Stroke: Journal of the American Heart Association.
A "weekend effect" has been previously documented when looking at other conditions such as cancer and
pulmonary embolism; however, little is known of its impact on stroke death.
"What is really novel in our work beyond the discovery of the 'weekend effect' on ischemic stroke is the subgroup
analysis in other settings/characteristics and the identification of variables associated with the 'weekend effect,'"
said Gustavo Saposnik, M.D., M.Sc., lead author of the study. "This is a large, population-based study across
Canada including different facilities: rural/urban, teaching/non-teaching, and small/large community."
Researchers evaluated the impact of weekend admission on in-hospital stroke deaths in different settings based
on data from all ischemic stroke hospital admissions in Canada from April 2003 to March 2004. An ischemic stroke is
caused by a blood clot that blocks blood flow in an artery in or leading to the brain. It is the most common type of
stroke.
Of 26,676 patients admitted to 606 hospitals, 24.8 percent were admitted on Saturdays and Sundays. Patients
admitted on the weekend were on average age 75, while those admitted during the week were average age 74.
After adjusting for age, gender and other medical complications, researchers found that patients admitted on the
weekend had a 14 percent higher risk of dying within seven days of admission compared to patients admitted
during the week.
The "weekend effect" on deaths in the seven-day period was even greater when patients were admitted to a
rural hospital versus an urban hospital and when the physician in charge was a general practitioner compared to a
specialist. The analysis didn't define what type of specialist.
"Although the 'weekend effect' affected patients admitted to both rural and urban hospitals and those treated by
general practitioners and specialists, the effect may be larger in patients admitted to rural hospitals and when the
most responsible physician is a general practitioner," said Saposnik, an assistant professor of medicine and director
of the Stroke Research Unit Division of Neurology at the University of Toronto and a staff neurologist at St. Michael's
Hospital in Toronto.
Other factors that influenced the death rate for weekend admissions were whether the admission occurred in a
non-teaching hospital and whether the patient required care in the intensive care unit. Further analysis revealed
that patients admitted on the weekend were also less likely to be discharged to go home.
"This appears be an 'unmodifiable' risk from the patient's side," Saposnik said. "This seems to be a 'natural'
phenomenon in health care, even in Canada with universal, government-funded health insurance with no co-
payments. If the 'weekend effect' occurs in a socialized health care system, it is likely that the effect may be larger
in other settings."
Researchers said disparities in resources, expertise and health care providers working during the weekend may
explain the difference.
However, Saposnik stressed that patients, relatives, health care professionals and policy makers must understand
that "time is brain, so the sooner the patient seeks medical attention, the higher the chance of better outcome, no
matter the day, time or living area."
Larry B. Goldstein, M.D., chair of the Stroke Council of the American Heart Association, also emphasized the
importance of seeking immediate treatment whenever a person experiences stroke symptoms.
"Although the differences in weekend admission found in this study may be real, the potential benefits of
obtaining early treatment would well outweigh the risk of waiting," Goldstein said. "Patients developing symptoms of
stroke such as abrupt difficulty speaking or understanding, weakness or numbness affecting an arm or leg, and
unexplained difficulty walking or with coordination need to get to a hospital organized to provide stroke care as soon
as possible, regardless of the day of the week."
UGA research shows rats are capable of reflecting on mental processes
First time ever shown for a nonprimate species, opens new areas of study
Athens, Ga. -- Let's say a college student enters a classroom to take a test. She probably already has an idea how she
will do—knowledge available before she actually takes out a pencil. But do animals possess the same ability to think
about what they know or don't know?
A new study by researchers from the University of Georgia, just published in the journal Current Biology, shows
that laboratory rats do. It's the first demonstration that any non-primate knows when it doesn't know something,
and it could open the way to more in-depth studies about how animals—and humans—think.
"This kind of research may change how we think about cognition and memory in animals," said Jonathon Crystal,
an associate professor of psychology in UGA's Franklin College of Arts and Sciences.
Crystal's co-author on the paper is Allison Foote, a graduate student in the department of psychology at UGA.
Researchers have believed for some time that people and non-human primates are capable of "metacognition"—
reasoning or thinking about one's own thinking. There have been studies on birds about this kind of thinking
process, but results thus far have been inconclusive. The new study is the first that shows a non-primate species
has metacognition—a proposal that may well be controversial.
The study involved what is called a "duration-discrimination" test—offering rats rewards for classifying a signal as
either short or long. As in most such tests, the "right" answer led to a large food reward, while a "wrong" answer
led to no reward at all. The twist, however, is that before taking the duration test,
the rats were given the chance to decline the test completely. If they made that
choice, they got a small reward anyway.
"If rats have knowledge about whether they know or don‘t know the answer to
the test, we would expect them to decline most frequently on difficult tests," said
Crystal. "They would also show the lowest accuracy on difficult tests that they can‘t
decline. Our data showed both to be true, suggesting the rats have knowledge of
their own cognitive states."
It's easy to find out when humans believe they know or don't know the answer
to a task or test. You just ask them. With non-verbal animals, it is necessary to
use experimental conditions in which a subject can demonstrate knowledge of a
cognitive state through its behavior.
The tests asked the rats to discriminate among a number of responses.
Sometimes, the choices were relatively easy, and the rats were able to make a
choice that generated a large reward. But often, the choices were quite difficult,
and the animals faced a dilemma: Should they continue and take a chance on the
test with the risk of no food reward, or should they just bail out and take the small,
but guaranteed reward?
Photo: Dot Paul, University of Georgia
One part of the test, for example, was presenting the rats with a sound and asking them to determine if it
was "short" or "long." When the sounds were near the extremes of either end, discriminating was easy. But for
sounds with durations in the mid-range, the rats found it extremely hard to know if they were "short" or "long." So
what should they do: Guess and possibly be wrong, or simply refuse to take the test and get a small reward?
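The economics of that choice can be sketched as a simple expected-value rule (a hypothetical illustration with invented payoff sizes, not the authors' analysis): a rat with any sense of its own likely accuracy should decline exactly when guessing is expected to pay less than the sure small reward.

```python
# Hypothetical expected-value sketch of the decline option.
# The payoff sizes below are invented for illustration; the study only
# specifies a "large" reward for correct answers, nothing for errors,
# and a "small" guaranteed reward for declining.

LARGE_REWARD = 6  # pellets for a correct classification (assumed)
SMALL_REWARD = 4  # pellets for declining the test (assumed)

def should_decline(p_correct: float) -> bool:
    """Decline when the expected value of testing drops below the sure thing."""
    expected_if_tested = p_correct * LARGE_REWARD  # a wrong answer pays nothing
    return expected_if_tested < SMALL_REWARD

# Easy trial: duration near an extreme, the rat is ~90% likely to be right.
print(should_decline(0.90))  # False -- take the test
# Hard trial: mid-range duration, accuracy near chance (~55%).
print(should_decline(0.55))  # True -- bail out for the small reward
```

On this toy model the break-even accuracy is SMALL_REWARD / LARGE_REWARD (two-thirds here); a rat that declines mostly on hard trials behaves as if it can tell which side of that threshold it is on, which is the pattern Crystal and Foote report.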
"Our research showed that the rats know when they don't know the answer to a question," said Crystal.
The results of the just-published study present a dilemma for those who had previously believed that only
primates could achieve metacognition. But it also presents a rodent model that should allow researchers to
better understand which animals are "cognitively sophisticated" and why.
The research will also open new lines of inquiry about the underlying neural mechanisms of this ability. Reflecting
on one's own mental experiences is a defining feature of human existence, and the demonstration of metacognition
in rats suggests that this type of cognition may be widespread among animals. Does it mean, for example, that rats
are "conscious," and could that also be true of other non-primates?
Genes and groups of genes commonly shared between species, studies show
Sharing genes likely helps organisms adapt more quickly to new environments
Berkeley -- Two new studies by University of California, Berkeley, scientists highlight the amazing promiscuity of genes,
which appear to shuttle frequently between organisms, especially more primitive organisms, and often in packs.
Such gene flow, dubbed horizontal gene transfer, has been seen frequently in bacteria, allowing pathogenic
bacteria, for example, to share genes conferring resistance to a drug. Recently, two different species of plants were
shown to share genes as well. The questions have been: How common is it, and how does it occur?
In a report appearing this week in the Proceedings of the National Academy of Sciences (PNAS), UC Berkeley and
Lawrence Berkeley National Laboratory (LBNL) researchers analyzed more than 8,000 different families of genes
coding for proteins - families that represent the millions of proteins in all living creatures - to assess the prevalence
of horizontal gene transfer.
They found that more than half of all the most primitive organisms, Archaea, have one or more protein genes
acquired by horizontal gene transfer, as compared to 30 to 50 percent of bacteria that have acquired genes this way.
Fewer than 10 percent of eukaryotes - plants and animals - have genes acquired via horizontal gene transfer.
In a second report published online by Nature on March 7, two species of bacteria living together in the pink
slime of an acidic California mine were found to share large groups of genes. These genes code for proteins that
work together, so by acquiring the entire block from another organism, bacteria can gain a new function that helps
them adapt more quickly to the same type of environment - in this case, a hot, highly acidic, metal-rich broth.
This is the first observation of exchange of very large genomic blocks between organisms in a natural microbial
community, according to UC Berkeley's Jill Banfield, who led the team of researchers from LBNL, Oak Ridge National
Laboratory (ORNL), Lawrence Livermore National Laboratory and the U.S. Department of Energy's Joint Genome Institute.
"One of the key questions being debated was, 'Is horizontal gene transfer extensive and rampant, or is it a
relatively rare event?'" said Sung-Hou Kim, professor of chemistry at UC Berkeley and coauthor of the PNAS paper.
"This becomes important in classifying organisms and comparing whole genomes to find their relationships.
"Our study shows that gene transfer is fairly common, but the extent in a given organism is fairly low - that is,
most organisms have received one or more genes from a closely related organism. And while it's very likely that
genes are transferred in chunks that are linked metabolically, I bet it's not always true. If a group of genes doesn't
have value in a new environment for a new organism, it's not going to stick around."
"This provides important information about the conservation of genetic resources to enable life to survive and
thrive," said ORNL's Bob Hettich, a co-author of the Nature paper. "Ultimately, the basic knowledge gained from this
research will lead to a greater understanding of genetic diversity in related organisms and should lead to
developments in human health and bioremediation."
Though the Nature findings about mine slime bear on the issue of horizontal gene transfer, the study's main goal
was to detect, with high resolution, which organism is able to carry out what function within a natural, uncultivated
microbial community, according to Banfield.
"In addition to revealing a history of genetic exchange between two dominant organism types in the mine, we
show that it is possible to identify a large fraction of the proteins from coexisting organisms and determine which
organism most of the proteins come from, even if the organisms are quite closely related," said Banfield, a
professor of earth and planetary science and of environmental science, policy and management at UC Berkeley and
also an LBNL researcher.
Banfield leads a long-term study of the community of organisms in mine slime obtained from the Richmond Mine
near Redding, Calif. This microbial biofilm has turned out to be an ideal research subject, Banfield said, because the
simple community contains few enough organisms that they can be used as a model system to uncover aspects of
how microbes interact with each other and their surroundings in ways that are difficult or impossible in other, more complex communities.
Banfield contrasts her strategy of ever more detailed studies of a single site to that of Craig Venter, who has
been sailing the world's oceans aboard his boat, Sorcerer II, sampling large communities of organisms to survey
global diversity. After four years collecting vast amounts of genomic information, he plans to publish some of his
analyses next week in the Public Library of Science, or PLoS.
In 2002, the mine was the source of samples for the first fairly comprehensive community genomic, or
metagenomic, characterization of a natural microbial consortium. In 2005, Banfield and colleagues presented the
first relatively large-scale analysis of the proteins that consortia members make to carry out the various metabolic
tasks needed for life underground - work that revealed information about the machinery used to adapt to the
extreme conditions in which they live. More recently, in 2006, research scientist Brett Baker, Banfield and colleagues
reported that the biofilms harbor novel archaeal organisms that appear to be extremely small compared to other life forms.
"Analysis of how microorganisms respond to their environments and the role of exchange of genetic material in
adaptation and evolution is important if we are to understand important environmental processes such as acid mine
drainage, or even degradation of cellulose for ethanol production by microbial communities," added Banfield.
In their new paper, the researchers combine metagenomics with strain-resolved shotgun proteomics to show that
different organisms are exchanging large blocks of their genes.
"Who's there and what are they doing are key questions in microbial ecology," said Banfield's colleague Vincent
Denef, a post-doctoral researcher in UC Berkeley's Department of Earth and Planetary Science. "Our high-resolution,
mass spectrometry-based community proteomics approach answers both at the same time. We can now tell apart
closely related organisms, which we previously would have grouped as one species, and we can monitor and
discriminate their behavior within the same natural community. These abilities will allow us to understand the
implications of small differences in genome sequence and content on ecological performance, one of the key goals
of the current microbial genomic sequencing efforts."
Hospital equipment unaffected by cell phone use, study finds
ROCHESTER, Minn. -- Calls made on cellular phones have no negative impact on hospital medical devices, dispelling the
long-held notion that they are unsafe to use in health care facilities, according to Mayo Clinic researchers.
In a study published in the March issue of Mayo Clinic Proceedings, researchers say normal use of cell phones
results in no noticeable interference with patient care equipment. Three hundred tests were performed over a five-
month period in 2006, without a single problem.
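Zero observed problems does not mean zero risk, only a demonstrably small one. As a statistical aside (not a calculation from the Mayo paper), the "rule of three" gives an approximate 95 percent upper confidence bound of 3/n on the per-test failure rate when no failures are seen in n trials:

```python
# "Rule of three": with 0 events observed in n independent trials, an
# approximate 95% upper confidence bound on the event probability is 3/n.
# (A standard statistical rule of thumb -- not part of the Mayo Clinic study.)

def rule_of_three(n_trials: int) -> float:
    return 3 / n_trials

upper_bound = rule_of_three(300)  # 300 interference tests, 0 problems
print(f"{upper_bound:.1%}")       # 1.0%
```

So the data are consistent with a true per-test interference rate anywhere below roughly one percent, which is why the authors argue for relaxing, rather than declaring pointless, the bans.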
The study involved two cellular phones, which used different technologies from different carriers, and 192
medical devices. Tests were performed at the Mayo Clinic campus in Rochester.
The study's authors say the findings should prompt hospitals to alter or abandon their bans on cell phone use.
Mayo Clinic leaders are reviewing the facility's cell phone ban because of the study's findings, says David Hayes,
M.D., of the Division of Cardiovascular Diseases and a study author.
Cell phone bans inconvenience patients and their families who must exit hospitals to place calls, the study's
authors note.
The latest study revisits two earlier studies that were done "in vitro" (i.e., the equipment wasn't connected to the
patients), which also found minimal interaction from cell phones used in health care facilities. Dr. Hayes says the
latest study bolsters the notion that cell phones are safe to use in hospitals.
Other Technology-Related Proceedings Articles Explore Concerns for Patients
Two other pieces in the March issue of Mayo Clinic Proceedings also address whether technological devices
interfere with patient care equipment. Unlike the cellular phone study, the other reports detail technological devices
that caused patient care equipment to malfunction.
A letter to the editor published in the journal details the first known case of a portable CD player causing an
abnormal electrocardiographic (ECG) recording within a hospital setting. The recording returned to normal when the
CD player, which the patient was holding close to the ECG lead, was turned off.
Technology also can threaten implantable rhythm devices such as pacemakers and defibrillators outside the
hospital setting, according to a journal report. The report outlines two cases of retail stores' anti-theft devices
causing people‘s heart devices to malfunction.
The anti-theft devices are commonly placed near store exits and entrances, triggering an alarm if customers
leave with merchandise that was not purchased. In two instances in Tennessee, customers with a pacemaker and
an implantable cardiac defibrillator experienced adverse reactions after nearing anti-theft devices.
The devices triggered the adverse reactions, sending both patients to emergency rooms for evaluation. The
report's authors recommend that the anti-theft devices be placed in areas of stores where customers won't linger --
away from vending machines or displays of sale merchandise, for instance -- to help avoid future episodes.
Store employees also should be trained to move a customer who has collapsed near an anti-theft device when
medically advisable, says J. Rod Gimbel, M.D., of East Tennessee Heart Consultants, an author of the report. If
they aren't moved, they could experience recurring life-threatening malfunctions of their implantable devices, as did
one patient described in the report.
"Simply moving the person away from the anti-theft device may save their life," Dr. Gimbel says.
Though Gimbel's report outlines only two cases of anti-theft devices causing implantable heart devices to
malfunction, he asserts that similar instances are likely underreported, qualifying the problem as a potentially
widespread public safety issue.
"Many times with public safety issues we wait until something bad occurs before we act," Dr. Gimbel says.
"Here's an opportunity where we can make our knowledge public and head off future problems."
In an accompanying editorial, John Abenstein, M.D., of Mayo Clinic's Department of Anesthesiology, addresses
the journal reports relating to the impact of technological devices on patient care equipment.
Dr. Abenstein says the risk of some technological devices upsetting the function of patient care equipment in
hospitals appears to be small. The Food and Drug Administration (FDA) should take a more explicit stand on the
matter, he says, so that health care facility policies can be altered when appropriate.
Across the Universe
Trying to Meet the Neighbors
By DAVE ITZKOFF
Is there anybody out there? Give the question some thought before you answer, because it's more perilous than
it seems. Deny the possibility of a universe populated with intelligent extraterrestrials that can speak and mate and
battle with humanity, and the science-fiction canon collapses; more than a century's worth of novels, from "The War
of the Worlds" to "Old Man's War," would find their speculative foundations swept out from underneath them. But
admit to a sincere belief in the remotest potential for alien life, and prepare to be fitted for a straitjacket; a recent
survey conducted by Baylor University found that more Americans believe in ancient civilizations like the lost
continent of Atlantis than in U.F.O.'s.
If intelligent humans think the existence of aliens is a perplexing notion, the idea that a small group of
astronomers and physicists are attempting to determine if aliens exist is even more so. To the uninitiated and the
ignorant (and those of us who fall into both camps), the science of SETI, the Search for Extraterrestrial Intelligence,
is as inscrutable as the phenomenon it seeks, conjuring up images of mad scientists out of old EC comic books, their
ears trained to intricate listening devices as they scan the interstellar static for unearthly broadcasts — or worse, of
hapless crackpots chasing down every report of alien abduction they read in Weekly World News.
Surprisingly, the science-fiction community (which knows a thing or two about being misunderstood and
dismissed) is not unequivocally supportive of SETI's work. In a 2003 lecture entitled "Aliens Cause Global Warming,"
Michael Crichton declared, "SETI is unquestionably a religion." And authors free of Crichton's political baggage do
not cast SETI's mission in particularly upbeat terms, either: in his short story "The Puzzle," the Serbian author Zoran
Zivkovic writes of a scientist pursuing a SETI-like experiment, whose "gloomy exultation" can end only with
irrefutable evidence of extraterrestrial intelligence — only "with contact made would he be able to say that his life's
work had meaning."
The truth of SETI is more nuanced and more complicated, but at least it's easier to find. Since 1991, Seth
Shostak has been engaged in this ambitious, arduous and often frustrating endeavor, as the senior astronomer of
the SETI Institute, a foundation that studies the nature and origins of life wherever it might turn up. While his years
of observation have yet to detect a blip of alien communication, Shostak is more interested in the repercussions that
would follow if the slightest evidence of extraterrestrial life — or the faintest arrow pointing to it — were someday
discovered. "If you could find even dead pond scum on Mars, it tells you biology's not a miracle," he said during a
recent interview at his Mountain View, Calif., office. "It means there's got to be life all over the place."
Conspiracy theorists would be disappointed by the SETI Institute's sedate headquarters, in a corporate park it
shares with the software maker Symantec, a few miles from the campus of the Internet giant Google. I found no
evidence of any alien autopsies during my visit, and saw no little green men wandering the halls, aside from the
occasional action figure or inflatable toy decorating a desk or cubicle.
Established in 1984, the institute has been almost entirely privately financed since the 1990s (though its sister
organization, the Carl Sagan Center, still receives NASA grants for its astrobiology research). The small fraction of its
staff involved in actual SETI experiments knows that the work does not necessarily inspire the awe and admiration
of the general public, a stigma that has been around at least since the publication of Sagan's novel "Contact" in
1985: to be a scientist who believed in the possibility of alien intelligence, he wrote, was to possess a flaw, the way
your other colleagues "had topless bars, or carnivorous plants, or something called Transcendental Meditation."
Shostak is no social deviant (as far as I know): a gregarious 63-year-old with a Ph.D. in astronomy from the
California Institute of Technology, he taught for 13 years at the Rijksuniversiteit Groningen in the Netherlands and
conducted SETI experiments there in the 1980s, using the nearby Westerbork Synthesis Radio Telescope, before
joining the institute. But he can understand how his research might appear to outside observers like an empty
endeavor. "It's like pulling the lever on a slot machine," he said. "The next quarter might be it."
There is some persuasive data to suggest that ours isn't the only world capable of beating the odds. While
astronomers once thought planets were rare, they now believe that anywhere from 5 percent to 90 percent of all
stars are orbited by them, meaning there could be as many as 10,000 billion billion other solar systems in the
universe.
Meanwhile, experiments on Earth have found micro-organisms in harsh conditions — extreme temperatures,
pressures and altitudes — thought to be adverse to biology, suggesting that bodies as nearby as Mars and some
moons of Jupiter could have the right stuff to cook up alien life.
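The planet numbers quoted above can be sanity-checked with quick arithmetic (the roughly 1e22 star count is an assumption often cited for the observable universe, not a figure from the article):

```python
# Sanity check of "10,000 billion billion" solar systems.
billion = 10**9
quoted_systems = 10_000 * billion * billion  # "10,000 billion billion"
assert quoted_systems == 10**22

# If a fraction f of an assumed ~1e22 stars host planets, the count of
# planet-hosting systems spans the article's 5-to-90-percent range:
stars = 10**22
for f in (0.05, 0.90):
    print(f"{f:.0%}: {f * stars:.1e} planetary systems")
```

The article's high-end figure thus corresponds to taking nearly all of an assumed ~1e22 stars as planet hosts.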
Using massive radio telescopes, SETI hopes to detect electromagnetic transmissions coming from these potential
neighborhoods. In the past decade, Shostak estimates, the institute has looked at 750 star systems, but within the
next 25 years it should be able to scan one million to two million more. And he is fond of betting his interrogators a
cup of Starbucks that the question of whether mankind is alone in the universe will be settled by the year 2025.
Though he occasionally sits on panels at science-fiction conventions, where authors and readers quiz him about
whose depictions of imaginary beings are the most realistic, Shostak tries not to weigh in on questions about what
extraterrestrial life looks like or how it behaves. "This is all alien sociology," he said with a grin, "and I have to tell
you the data set is small."
To his mind, the sci-fi traditions of hominid aliens with insectoid features (think of the buglike Formics of Orson
Scott Card's "Ender's Game" novels) or serpentine qualities (say, the reptilian members of the Race in Harry
Turtledove's "Worldwar" books) probably say more about earthbound writers than actual extraterrestrials. "We're
hard-wired not to like insects very much, and the same is true for snakes," Shostak explained. "You turn on Animal
Planet, you don't see many shows on gerbil behavior. Anything that can hurt you, you're much more interested."
Yet some of the wilder ideas about alien life that Shostak's fellow scientists have contemplated in their own
fiction could be within the realm of possibility. From his own reading, he describes as fanciful but impossible to
dismiss completely the physicist Robert L. Forward's novel "Dragon's Egg" — which imagines tiny creatures on the
surface of a neutron star, whose lives elapse a million times faster than our own — and the title character of the
astrophysicist Fred Hoyle's novel "The Black Cloud," a diffuse organism as wide as the orbit of Venus.
Underlying all these conjectures is a single, basic assumption (proponents of intelligent design should stop
reading here): no matter where life is found, it will obey Darwinian tenets of evolution. "It's hard to avoid
evolution," Shostak said, "because all it says is that if you survive, then you must be good, so whatever traits you
have tend to propagate."
To the extent that SETI is modest about its expectations, it has found philosophical support in unexpected
quarters. When Richard Dawkins sniffs around the underlying science of SETI in "The God Delusion," he writes that
he doesn't "immediately scent extreme improbability." And Dawkins concedes that "there are very probably alien
civilizations that are superhuman, to the point of being godlike in ways that exceed anything a theologian could
possibly imagine" — an idea he acknowledges cribbing from Arthur C. Clarke, whose so-called Third Law of
prediction states, "Any sufficiently advanced technology is indistinguishable from magic."
Shostak has as much use for magic as Dawkins has for the Almighty, but the two are in agreement on this point.
"The universe is three times as old as the earth, so there's opportunity for civilizations that are billions of years
ahead of us," Shostak said. "Our radio transmitters are a million times more powerful than Marconi's were, and
we've done that in a hundred years. What could they do in a billion?"
One thing he believes they could do is build thinking machines to rival human intelligence. "The trouble with
Darwinian evolution is, we don't change very fast," Shostak said. "But machines are on time scales of decades, not
hundreds of thousands of years." Should he ever intercept a text message from E.T., Shostak has his own guess
about who will be transmitting it on the other end: "My bet is, they're not soft and squishy."
In his own sporadic attempts at science fiction, Shostak seems even less confident about what SETI might
someday find. In an imaginative, somewhat cynical story entitled "In Touch at Last," originally published in a 1999
issue of Science, he writes of a scientist some 50 years in the future who is the first to identify the communications
of an extraterrestrial intelligence — neither a benign message of peace nor a bellicose threat, but simply an
interstellar buzz repeating itself at regular intervals. Self-deprecating on a cosmic scale, the narrator refuses to
believe he has tapped the mother lode of scientific discoveries. "I just stumbled on a loose nugget," he says.
The story left me slightly depressed: given Shostak's scientific expertise, another half-century of progress and the
infinite canvas of a work of fiction, this was the most hopeful scenario he could think of? Then again, he told me,
the situation could be worse: maybe the aliens know we're here, and they just don't care. "It's sort of like your
attitude toward mollusks," he said. "Are you hostile to them? Are you in favor of them? Do you try and support their
work? Maybe intelligent life is so common that, aside from people on Phi-2 Orionis making a catalog of all the
intelligent critters around, it's of little consequence."
I left our meeting unnerved by these lonely forecasts, and it took an unlikely source to cheer me up. Robyn
Asimov, the daughter of the science-fiction master Isaac Asimov, has become a friend of the SETI Institute despite
her upbringing: Dad was more of a robot guy than an alien guy, and agnostic on the subject of extraterrestrial
intelligence, except for one instance when father and daughter mistook the Goodyear blimp for a U.F.O. ("He nearly
had a heart attack," she told me in a telephone interview. "He thought he saw his career going down the drain.")
In our conversation, she explained to me why SETI research is still valuable, even if scientists like Shostak never
find any Daleks, Vulcans or Wookiees. "My father knew he wouldn't live to see space travel as he wrote about it, or
robots acting in the way he wrote about them, and he was fine with that," Asimov said. "His major thrust, and I
think Seth's and SETI's as well, is to get people interested in science, and doing something about it, and then
handing the baton over to the next generation. It's an almost egoless outlook, because the intellectual curiosity is
what takes priority."
Insert the name of your favorite science-fiction author in place of Shostak's in the previous quotation, and I think
her formulation still holds true. Like the science-fiction community whose support, suspicion and occasional
provocation it can always expect, the SETI field inspires the same intense loyalty, from followers who regard such
pursuits less like a hobby than a calling. "I suppose I could have been a C.P.A.," Shostak said, "but here you get to
work on a really big-picture question. I don't feel cursed — I feel privileged."
And I don't think Shostak, like a good science-fiction author, can truly accept a universe devoid of the possibility
for transcendence — where extraterrestrials know humanity exists, but are merely indifferent to it. At the end of "In
Touch at Last," his fictional scientist decodes the mysterious alien transmission and is humbled to discover that the
repetitious buzz is a kind of universal clock. "I didn't expect them to give us the time of day," the narrator says. "But
Penn study finds inhaled anesthetics accelerate the appearance of brain plaque in animals
Could lead to early onset of Alzheimer's disease
PHILADELPHIA – Researchers at the University of Pennsylvania's School of Medicine have discovered that common
inhaled anesthetics increase the number of amyloid plaques in the brains of animals, which might accelerate the
onset of neurodegenerative diseases like Alzheimer's. Roderic Eckenhoff, MD, Vice Chair of Research in the
University of Pennsylvania's Department of Anesthesia and Critical Care, and his co-authors, report their findings in
the March 7th online edition of Neurobiology of Aging.
Every year over 100 million people undergo surgery worldwide, most under general anesthesia with an inhaled
drug. These drugs clearly affect cognitive ability at least in the short term, but the growing concern is that inhaled
anesthetics may affect a person well beyond the perioperative period, even permanently. Several factors appear to
play a role in this subtle loss of cognitive ability, most notably age.
A specific effect of these drugs on dementias like Alzheimer's disease, though suspected for many years, has only
been recently supported by data. In 2003, Eckenhoff's group showed that the inhaled anesthetics enhance the
aggregation and cytotoxicity of the amyloid beta peptide. Just last month, a study reported that these drugs also
enhance the production of amyloid beta in isolated cells. But these protein and cell culture studies are a long way
from showing that an effect occurs in vivo. This new study provides the first evidence that the predicted effect
occurs in animals.
"This animal study data suggests that we have to at least consider the possibility that anesthetics accelerate
certain neurodegenerative disorders," said Eckenhoff. "In the field of Alzheimer's research, most effort is focused on
delaying, not curing the disease. A delay in the onset of Alzheimer's disease of only three to five years would be
considered a success. Therefore, if commonly used drugs, like anesthetics, are accelerating this disorder, even by a
few years, then a similar success might follow even small changes in the care of the operative patient."
Mice don't naturally get Alzheimer's, so the animals in this study were genetically engineered to express the
human protein responsible, called amyloid beta. "These mice develop a syndrome with many features of the human
disease," explains Eckenhoff. Post-doctoral fellow and first author Shannon Bianchi, MD, exposed "middle-aged"
Alzheimer mice to anesthetics at low to moderate concentrations for two hours a day over a total of five days, not
unusual for a clinical scenario. The cognitive abilities of the mice were then analyzed using standard behavioral tests,
and their brains were examined for plaque and cell death.
"Compared to controls, the anesthesia did not appear to worsen cognitive ability, which was already considerably
compromised at this age, but it did accelerate amyloid beta aggregation and plaque appearance," said
corresponding author Maryellen Eckenhoff, PhD. "We need to test whether anesthetic at earlier, presymptomatic
stages, might accelerate both cognitive loss and plaque." This is the main cause of concern because a large fraction
of clinical patients receiving inhaled anesthetics during surgery are older but presymptomatic individuals.
Are there anesthetics that do not accelerate plaque? "We think so, but far more research is necessary to show
this with any confidence. We have to take this one step at a time – a problem has still not been demonstrated in
humans." It is important to remember that this effect is likely to be subtle, especially with brief surgical procedures,
so the risk of not having needed surgery may exceed any potential risk from this still unproven effect. But this latest
study adds a little urgency to the effort to find out. "If inhaled anesthetics are contributing to the rise and early
onset of this devastating disease then we need to know, and soon," concludes Eckenhoff.
To access the full article click on link below: http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6T09-4N6NH93-
Cocoa 'vitamin' health benefits could outshine penicillin
The health benefits of epicatechin, a compound found in cocoa, are so striking that it may rival penicillin and
anaesthesia in terms of importance to public health, reports Marina Murphy in Chemistry & Industry, the magazine
of the SCI. Norman Hollenberg, professor of medicine at Harvard Medical School, told C&I that epicatechin is so
important that it should be considered a vitamin.
Hollenberg has spent years studying the benefits of cocoa drinking on the Kuna people in Panama. He found that
the risk of four of the five most common killer diseases (stroke, heart failure, cancer and diabetes) is reduced to less
than 10% in the Kuna, who can drink up to 40 cups of cocoa a week. Natural cocoa has high levels of epicatechin.
'If these observations predict the future, then we can say without blushing that they are among the most
important observations in the history of medicine,' Hollenberg says. 'We all agree that penicillin and anaesthesia are
enormously important. But epicatechin could potentially get rid of 4 of the 5 most common diseases in the western
world. How important does that make epicatechin? ... I would say very important.'
Nutrition expert Daniel Fabricant says that Hollenberg's results, although observational, are so impressive that
they may even warrant a rethink of how vitamins are defined. Epicatechin does not currently meet the criteria:
vitamins are defined as essential to the normal functioning, metabolism, regulation and growth of cells, and
deficiency is usually linked to disease. At the moment, the science does not support epicatechin having an essential
role. But Fabricant, who is vice president of scientific affairs at the Natural Products Association, says: 'The link
between high epicatechin consumption and a decreased risk of killer disease is so striking, it should be investigated
further. It may be that these diseases are the result of epicatechin deficiency.'
Currently, there are only 13 essential vitamins. An increase in the number of vitamins would provide significant
opportunity for nutritional companies to expand their range of products. Flavanols like epicatechin are removed
from commercial cocoas because they tend to have a bitter taste, so there is huge scope for nutritional companies
to develop epicatechin supplements or capsules.
Epicatechin is also found in teas, wine, chocolate and some fruit and vegetables.
These legs were made for fighting
Human ancestors had short legs for combat, not just climbing
Ape-like human ancestors known as australopiths maintained short legs for 2 million years because a squat
physique and stance helped the males fight over access to females, a University of Utah study concludes.
"The old argument was that they retained short legs to help them climb trees that still were an important part of
their habitat," says David Carrier, a professor of biology. "My argument is that they retained short legs because
short legs helped them fight."
The study analyzed leg lengths and indicators of aggression in nine primate species, including humans (represented by Australian Aborigines).
It is in the March issue of the journal Evolution.
Creatures in the genus Australopithecus – immediate predecessors of the human genus Homo – had heights of
about 3 feet 9 inches for females and 4 feet 6 inches for males. They lived from 4 million to 2 million years ago.
"For that entire period, they had relatively short legs – longer than chimps' legs but shorter than the legs of
humans that came later," Carrier says.
"So the question is, why did australopiths retain short legs for 2 million years? Among experts on primates, the
climbing hypothesis is the explanation. Mechanically, it makes sense. If you are walking on a branch high above the
ground, stability is important because if you fall and you're big, you are going to die. Short legs would lower your
center of mass and make you more stable."
Yet Carrier says his research suggests short legs helped australopiths fight because "with short legs, your center
of mass is closer to the ground. It's going to make you more stable so that you can't be knocked off your feet as
easily. And with short legs, you have greater leverage as you grapple with your opponent."
Carrier says his aggression hypothesis does not rule out the possibility that short legs aided climbing, but the
"evidence is poor because the apes that have the shortest legs for their body size
spend the least time in trees – male gorillas and orangutans."
He also notes that short legs must have made it harder for australopiths "to
bridge gaps between possible sites of support" when climbing and traveling through the trees.
Nevertheless, he writes, "The two hypotheses for the evolution of relatively
short legs in larger primates, specialization for climbing and specialization for
aggression, are not mutually exclusive. Indeed, selection for climbing performance
may result in the evolution of a body configuration that improves fighting
performance and vice versa."
Great Apes' Short Legs Provide Evidence for Australopith Aggression
All modern great apes – humans, chimps, orangutans, gorillas and bonobos –
engage in at least some aggression as males compete for females, Carrier says.
Carrier set out to find how aggression related to leg length. He compared
Australian aborigines with eight primate species: gorillas, chimpanzees, bonobos,
orangutans, black gibbons, siamang gibbons, olive baboons and dwarf guenon
monkeys. Carrier used data on aborigines because they are a relatively natural human population.
For the aborigines and each primate species, Carrier used the scientific literature
to obtain typical hindlimb lengths and data on two physical features that previously
have been shown to correlate with male-male competition and aggressiveness in primates:
* The weight difference between males and females in a species. Earlier studies
found males fight more in species with larger male-female body size ratios.
* The male-female difference in the length of canine teeth, which are next to the
incisors and are used for biting during fights.
[Figure] This drawing of a male gorilla skeleton illustrates the species' very short legs. Male gorillas fight to gain
access to reproductively mature females. Relatively short legs increase the stability and strength of great apes, and
should therefore increase fighting performance. A new University of Utah study suggests human ancestors
known as australopiths had short legs for the same reason, not just for climbing trees. Public domain, from
Alfred Brehm, "Brehms Tierleben" ("Brehm's Life of Animals"), small edition, 1927.
Carrier used male-female body size ratios and canine tooth size ratios as numerical indicators for aggressiveness
because field studies of primates have used varying criteria to rate aggression. Relying on such ratings, he says,
would be like having a different set of judges for each competitor in subjective Olympic events like diving or ice dancing.
The study found that hindlimb length correlated inversely with both indicators of aggressiveness: primate species
with greater male-female differences in body weight and canine tooth length – that is, species with more
male-male combat – had shorter legs.
There was no correlation between arm length and the indicators of aggression. Carrier says arms are used for
fighting, but "for other things as well: climbing, handling food, grooming. Thus, arm length is not related to
aggression in any simple way."
Verifying the Findings
Carrier conducted various statistical analyses to verify his findings. First, he corrected each species' limb
lengths for body size, because primates with larger body sizes tend to have shorter legs, humans excepted.
Without taking that into account, the correlation between leg length and the aggression indicators might be spurious.
Another analysis corrected for the fact different primate species are related. For example, if three closely related
species all have short legs, it might be due to the relationship – an ancestor with short legs – and not aggression.
Even with the corrections, short legs still correlated significantly with the two indicators of aggressiveness.
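The size-correction step can be sketched in a few lines of Python. All species values below are hypothetical stand-ins invented for illustration – they are not Carrier's data – and the regression-residual approach is one common way to remove a body-size effect, assumed here for demonstration rather than taken from the study itself.

```python
import numpy as np

# Hypothetical species data (NOT from Carrier's study): body mass,
# hindlimb length, and male/female body mass ratio (dimorphism).
body_mass_kg = np.array([5.0, 10.0, 35.0, 45.0, 80.0, 120.0])
hindlimb_cm  = np.array([29.0, 41.0, 48.0, 64.0, 71.0, 82.0])
dimorphism   = np.array([1.60, 1.05, 1.90, 1.10, 1.40, 1.30])

# Step 1: regress log leg length on log body mass; the residuals measure
# leg length relative to what body size alone would predict.
slope, intercept = np.polyfit(np.log(body_mass_kg), np.log(hindlimb_cm), 1)
relative_leg = np.log(hindlimb_cm) - (slope * np.log(body_mass_kg) + intercept)

# Step 2: correlate size-corrected leg length with the aggression indicator.
# A negative r means more dimorphic (more combative) species have
# relatively shorter legs.
r = np.corrcoef(relative_leg, dimorphism)[0, 1]
print(f"size-corrected correlation: r = {r:.2f}")
```

With these made-up numbers the correlation comes out strongly negative, mirroring the direction of the study's finding; the phylogenetic correction mentioned above would require additional ancestry data and is not shown.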
The study also found that females in each primate species except humans have relatively longer legs than males.
"If it is mainly the males that need to be adapted for fighting, then you'd expect them to have shorter legs for their
body size," Carrier says.
He notes there are exceptions to that rule. Bonobos have shorter legs than chimps, yet they are less aggressive.
Carrier says the correlation between short legs and aggression may be imperfect because legs serve many
purposes other than fighting.
Humans "are a special case" and are not less aggressive because they have longer legs, Carrier says. There is a
physical tradeoff between aggression and economical walking and running. Short, squat australopiths were strong
and able to stand their ground when shoved, but their short legs made them ill-suited for distance running. Slender,
long-legged humans excel at running. Yet, they also excel at fighting. In a 2004 study, Carrier made a case that
australopiths evolved into lithe, long-legged early humans only when they learned to make weapons and fight with them.
Now he argues that even though australopiths walked upright on the ground, the reason they retained short legs
for 2 million years was not so much that they spent time in trees, but "the same thing that selected for short legs in
the other great apes: male-male aggression and competition over access to reproductively active females."
In other words, shorter legs increased the odds of victory when males fought over access to females – access
that meant passing their genetic traits to offspring.
Yet, "we don't really know how aggressive australopiths were," Carrier says. "If they were more aggressive than
modern humans, they were exceptionally nasty animals."
Why Should We Care that Australopiths Were Short and Nasty?
"Given the aggressive behavior of modern humans and apes, we should not be surprised to find fossil evidence
of aggressive behavior in the ancestors of modern humans," Carrier says. "This is important because we have a real
problem with violence in modern society. Part of the problem is that we don't recognize we are relatively violent
animals. Many people argue we are not violent. But we are violent. If we want to prevent future violence we have
to understand why we are violent."
"To some extent, our evolutionary past may help us to understand the circumstances in which humans behave
violently," he adds. "There are a number of independent lines of evidence suggesting that much of human violence
is related to male-male competition, and this study is consistent with that."
Nevertheless, male-male competition doesn't fully explain human violence, Carrier says, noting other factors such
as hunting, competing with other species, defending territory and other resources, and feeding and protecting