#11 A Drink or Two a Day May Lower the Risk of Alzheimer's
By Meredith Melnick Friday, August 19, 2011
1 A new review finds that moderate drinking is associated with a lower risk of memory problems and Alzheimer's disease or
other forms of dementia.
2 The authors analyzed results from 143 studies, dating back to 1977, which included 365,000 participants in 19 countries. The
studies compared non-drinkers to drinkers: 74 of the studies looked at the risk of dementia, while the other 69 focused on memory problems.
3 The review found that moderate drinkers were 23% less likely than teetotalers to develop signs of memory problems or
Alzheimer's. That effect was significant in 14 of the 19 countries, including the U.S. Heavy drinkers, on the other hand, tended to
have a higher risk of memory problems and dementia than non-drinkers, but that association was not statistically significant.
4 If you're keeping score, moderate drinking means one drink a day for women and two drinks a day for men. Heavy drinking
means three to five drinks or more a day. One drink is defined as 1.5 oz. of spirits, 5 oz. of wine, or 12 oz. of beer. Wine appeared to
have more of a protective effect than beer or spirits, but that finding was based on a small number of studies, so there's not enough
data to make a distinction between types of alcohol, the authors said.
5 While the analysis didn't offer an explanation for why drinking may lower the risk of cognitive decline, the researchers
theorized that alcohol may have anti-inflammatory properties. Inflammation is thought to play a role in Alzheimer's disease
(along with other conditions like heart disease and stroke), and moderate amounts of alcohol may suppress inflammation in the
brain; too much alcohol could stimulate it, the authors suggested.
6 Further, reported HealthDay:
One premise suggests that alcohol might improve blood flow in the brain and thus brain metabolism, the researchers said. And
they offered up another theory, that small amounts of alcohol may make brain cells more fit by slightly stressing them and
increasing their ability to cope with major levels of stress that can eventually cause dementia.
7 The findings don't suggest that nondrinkers start chugging alcohol to stave off Alzheimer's. The study showed only an
association, not cause and effect. It's possible, for example, that moderate alcohol consumption was a marker for an overall
healthier lifestyle — like eating a heart-healthy diet, exercising and maintaining positive social relationships — all of which may
also help lower the risk of dementia.
8 But for those who do consume alcohol, the authors say moderation is key. "Social drinking can be a very positive thing as long
as it is not excessive and doesn't exceed a drink per day for women or two drinks for men," Christy Tangney, an associate professor
of clinical nutrition at Rush University Medical Center in Chicago, told WebMD. "Light-to-moderate drinking appears to benefit
cognitive performance." (475 words)
The analysis was published in the journal Neuropsychiatric Disease and Treatment.
Meredith Melnick is a reporter at TIME. Find her on Twitter at @MeredithCM.
#12 Want to Live Longer? Turn Off Your TV
By Alice Park Wednesday, August 17, 2011
1 Sitting in front of the television may be a relaxing way to pass an evening, but spending too much time in front of the tube may
take years off your life. That's what Australian researchers found when they generated life-expectancy tables for people based on
mortality information from the Australian Bureau of Statistics as well as participants' survey responses about how much TV they
had watched in the past week.
2 The TV-viewing data from more than 11,000 participants older than 25 years showed that Australian adults watched an
estimated 9.8 billion hours of television in 2008. People who watched an average of six hours of TV a day lived an average of 4.8 years
fewer than those who didn't watch any television, the study found. Even more humbling: every hour of TV that participants
watched after age 25 was associated with a 22-minute reduction in their life expectancy.
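The two figures above can be cross-checked with simple arithmetic. The sketch below assumes a viewing span of roughly 52 years (from age 25 to about 77); that span is an illustrative assumption, not a number from the study.

```python
# Rough consistency check of the article's numbers: 22 minutes of life
# expectancy lost per hour of TV watched after age 25, versus 4.8 years
# lost for a six-hour-a-day viewer. The ~52-year viewing span is an
# assumption for illustration, not a figure from the study.
MINUTES_LOST_PER_TV_HOUR = 22
MINUTES_PER_YEAR = 60 * 24 * 365

hours_watched = 6 * 365 * 52                 # six hours a day for ~52 years
years_lost = hours_watched * MINUTES_LOST_PER_TV_HOUR / MINUTES_PER_YEAR
print(f"{years_lost:.1f} years lost")        # close to the reported 4.8
```

Under that assumption the per-hour and per-viewer figures agree to within rounding.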
3 The findings suggest that watching too much TV is as detrimental to longevity as smoking and lack of exercise. Previous
research has shown that smoking is associated with a four-year reduction in life expectancy after the age of 50. That works out to
an average 11 minutes of life lost for every cigarette smoked — the equivalent of 30 minutes of TV time, according to the current study.
4 The study also notes that people who report low levels of physical activity lose nearly 1.5 years in life expectancy compared
with those who exercise a moderate amount, an effect similar to that of watching just over two hours of television a day. "The
strong correlation is a bit of a surprise," said lead author Lennert Veerman of the University of Queensland in an e-mail response
to questions about his research, which was published in the British Journal of Sports Medicine. "It suggests that going from
inactive to slightly active is as important as exercise."
5 It's no mystery that sitting in front of the tube isn't exactly a healthy pursuit. The more TV you watch, the less physically
active you are. And the less exercise you get, the more likely you are to develop diseases such as diabetes or heart problems.
6 But while previous studies have hinted at the potentially deadly impact of too much TV watching — in June, a Harvard study
found that for every two hours of TV watched, people's risk of dying from any cause increased 13% over a seven-year period — the
new analysis was the first to translate the effect of TV viewing to life expectancy at birth. Were it not for TV, researchers
estimated that life expectancy for men would be 1.8 years longer and for women, 1.5 years longer.
7 Veerman acknowledges that it may not just be the sedentary nature of watching TV that lowers life expectancy but also the
poor diet that onscreen junk-food advertising can promote. Still, the association between excess TV viewing and lower life
expectancy persisted, even after adjusting for diet, Veerman says.
8 He says it might make sense for doctors to start asking their patients about how much time they spend in front of the TV and to
treat TV time as they would other risk factors for poor health, such as lack of exercise and an unhealthy diet. Also, Veerman notes,
the dangers of TV viewing can easily be neutralized by simply turning off the TV and getting off the couch. "Exercise is good," he
says, "but even light physical activity also improves health." (569 words)
#13 The Math Gender Gap: Nurture Trumps Nature (Part 1)
By Maia Szalavitz Tuesday, August 30, 2011
1 Rural India might not seem a likely place to study the roots of gender differences in math performance. But a new study of two
tribes living in the northeast of the country offers intriguing evidence that biology alone does not determine women's math
aptitude (or lack thereof, as former Harvard President Lawrence Summers once infamously suggested) and that culture has a lot
to do with the differences between the genders.
2 Prior research has found that fewer than 10% of tenured math professors at Ph.D.-granting institutions are women (only 7% are
full professors at top 100 universities), so understanding the reasons for the disparity could help address it. The new study of
members of the Khasi and Karbi tribes of India suggests that the influence of culture can virtually eliminate at least some of the gender gap.
3 Researchers led by Moshe Hoffman, a postdoctoral fellow at the University of California, San Diego, studied villagers from both
tribes. Genetically, the Khasi and Karbi are highly similar: the groups only became separate a few hundred years ago and some
intermarriage continues. Both groups are also subsistence farmers, living mainly on rice in a hilly region that gets world-record
levels of rainfall.
4 Culturally, however, they are quite distinct. The Karbi are patrilineal. Women are only rarely allowed to own land and the
eldest son in each family inherits the property. Political and religious leadership is male-dominated and girls leave school nearly
four years earlier than boys.
5 Among the Khasi, though, women are the landowners, with no exceptions. Inheritance goes to the youngest daughter and men
are not supposed to handle money. Even cash earned by men working outside the family farm is typically given to their wives.
Both genders are equally educated.
6 The Khasi are not completely matriarchal, however. Men do make up the political and religious leadership. (These variant land
practices are permitted in India, as the tribal regions are semi-autonomous, similar to American Indian reservations.)
7 Hoffman and his colleagues studied 1,279 people, from four Khasi and four Karbi villages, paying them for their time to test
their ability to solve block puzzles. Each block was divided into four parts and tests were scored by how fast people could
accurately assemble the pictures painted on them. The puzzles were designed to test participants' spatial abilities, which are
linked to math and science aptitude.
8 Among the male-dominated Karbi, men were 36% faster at solving the block puzzles than women. But about a third of the
overall difference was attributable to the greater education received by the boys among the Karbi, and the rest seemed to be linked
to other cultural differences.
9 Among the Khasi, the difference between men and women was so small that it was not statistically significant. "This study
tells us that culture does matter," says Hoffman. "What makes it unique is that we can control for biology."
10 Hoffman describes conversations he had with villagers that typify the differences. Among the Karbi, he spoke with an
18-year-old girl who had recently married. She had left school at age 8. "I asked her, 'Why didn't you keep going?'" She replied,
"That would be a complete waste. Women are not smart enough to understand and I would never use it anyway."
11 Among the Khasi, however, it is male abilities that are the subject of negative stereotypes. Speaking to a Khasi woman,
Hoffman confirmed that she handled the finances in her marriage. When asked why, she replied, "If you give a man money, he's
just going to waste it on booze."
12 While the Karbi seemed typical on levels of trust and hospitality, the Khasi were exceptional in both, according to Hoffman.
"They are some of the nicest people I've ever met," he says, describing how people welcomed and trusted him, even when he first
arrived. For example, he once needed to buy almost all of the food in the town's lone store. When he didn't have appropriate change,
the storekeeper gave Hoffman the food anyway, even though the two had never met before, saying to pay the next day.
(674 / 1159 words, to be continued)
#14 The Math Gender Gap: Nurture Trumps Nature (Part 2)
By Maia Szalavitz Tuesday, August 30, 2011
13 The current study is not without its limitations, namely that the puzzle used to test villagers' spatial skills did not include the
rotation of figures — similar to that seen in the computer game Tetris — which is used in traditional spatial-ability tests. Such
tests were not used in this case, however, because the abstract objects would have been too unfamiliar to the Indian tribes.
14 Critics of the findings are bound to point to the lack of spatial-rotation testing, says Rebecca Goldin, director of research at
science-media watchdog Stats.org and associate professor of mathematics at George Mason University. She was not connected
with the study.
15 "I think that is a valid concern," she says. "But I do think the study certainly does suggest that some spatial abilities have a
cultural influence. This fits into the large amount of literature that suggests that cultural differences have a large impact on math performance."
16 Indeed, culture is not limited simply to encouragement of young girls in grade-school math. Studies that have looked at
gender gaps in math performance have found that the more equitably a country treats its people, the smaller that gap is. In
Scandinavian countries, for example, where men and women share paid family leave and high quality day care is affordable, the
gap is much narrower.
17 Girls in those countries see in their mothers' lives that child-rearing and math careers are not incompatible; the mothers also
don't have to give up high-powered jobs to have kids, so they reach higher levels of equality with men at work. Even in the U.S., the
ranks of female math and science professors — including those in tenure-track positions — are growing appreciably.
18 "These questions of biology could be possibly relevant if we had solved all of the social problems," says Goldin. "It could be that
there's a difference, but it doesn't matter when you have such gaping cultural differences."
19 Goldin has a family history that provides unique insight on the issue. Her father is a physicist, her sister is also a math
professor and all three were educated at Harvard. "In my opinion, it has a lot to do with self-definition," Goldin says. "For boys,
math is validated and opportunities abound to identify yourself as being mathematically strong and liking math — at least for
white and Asian boys. It's really not there as much for girls."
20 Goldin notes that a huge proportion of her female math professor colleagues also had fathers who were scientists or
mathematicians. "That could be genetic, who knows?" she says. "In my own personal life, my father intervened in many subtle and
explicit ways." For instance, when principals or teachers tried to steer her away from math, her father objected and stopped them.
21 "It would be wrong to conclude from the new study that nature doesn't play a role. But nurture plays a substantial role, large
enough that we can even see a gender difference wiped out," Hoffman says. (485 /1159 words)
#15 Using a Big Fork May Help You Eat Less
By Laura Blue Friday, July 15, 2011
1 Here's a well-known weight-loss tip: use a smaller plate, and you'll be satisfied with a smaller portion. The tip works —
provided you're not genuinely very hungry — because a large part of our satisfaction at the end of a meal is determined by
expectations about what a decent meal looks like. If we feel like we've eaten a proper dinner, we're not likely to eat another one an hour later.
2 If the sight of our meals matters, then how about the sight of each bite? Business-school researchers at the University of Utah,
Salt Lake City, conducted a clever experiment — published recently in the Journal of Consumer Research — to find out.
3 The study authors enlisted help from a local Italian restaurant. Over the course of two days — serving two lunches and two
dinners — the researchers randomly selected tables to receive either unusually large forks (20% larger than the restaurant's
normal fork) or unusually small forks (20% smaller than normal). They then weighed each plate of food before it went out to a
customer and once again when it came back, in order to calculate how much each person had eaten.
4 Overall, the results showed, the customers given bigger forks ate less, leaving more on their plates at the end of each meal.
That left the study authors with a puzzle to explain: why might people eat more when they're given bigger portions, but less when
they're given bigger forks? The authors suggest that both phenomena can be explained by the same logic.
5 In their paper they write:
Diners focus on the visual cue of whether they are making any dent in the amount of food on their plates…. The smaller fork
(compared to the larger fork) appears to provide less satisfactory goal progress; that is, diners feel they are not making much of a
dent in consuming their food and, hence, satisfying their hunger. This, in turn, focuses diners to put in more effort (e.g., more
forkfuls) toward satiating their hunger. As a result, diners with smaller forks consume more food than those using larger forks.
6 By this same logic, if the food portion is very large to begin with, diners will eat more of it because they don't notice themselves
making a dent in the meal until a lot has been consumed.
7 The argument also suggests an interplay between bite size and portion size. In the experiment, restaurant-goers who received
both small forks and large portions ate disproportionately more than either one of those factors alone would predict.
8 Importantly, however, the bigger fork may encourage people to eat less only when their goal is to eat a full meal and satisfy
their hunger — precisely the goal of most restaurant-goers. The study authors also tested the effect of fork size on food
consumption among people who were not necessarily hungry, but who instead were merely snacking.
9 They gave university students some pasta salad and the same large forks and small forks that were used in the restaurant
experiment. They found that, when people were presented with food outside of a mealtime, larger forks led people to consume
more. The students were, perhaps, less concerned about making a dent in any food they were given, so that they simply took a few
bites out of habit. In that scenario, the authors write, people may "become more willing to anchor on the fork size as the
appropriate bite size." (572 words)
#16 The Half-Baked Teen Brain: A Hazard or a Virtue?
By Maia Szalavitz Friday, September 16, 2011
1 Teenagers have a bad reputation. They're moody, they thrive on drama. They take risks that terrify their parents and seem
blithely unaware of the potential consequences of their actions. The reason for this, as scientists have discovered through modern
brain-scanning technology, is that the teen brain isn't fully cooked — it's still in the process of rewiring and remodeling itself and
maturing toward adulthood.
2 But here's an intriguing question: Why would the human brain pass through such a seemingly senseless and dangerous —
and protracted — phase on its way to maturity? Is there some utility to the vulnerable adolescent brain, or should it be seen only
as a hazard?
3 These are the questions raised by science writer David Dobbs in a fascinating feature in National Geographic. Although most
scientific research has focused on adolescents' deficits and self-destructive impulses, Dobbs highlights work that looks at the
upside of the teen brain and the evolutionary needs that drive adolescence.
4 The teenage brain isn't just a "work in progress," Dobbs argues. Rather, it's adapted to meet specific challenges. He writes:
Over the past five years or so, even as the work-in-progress story spread into our culture, the discipline of adolescent brain
studies learned to do some more-complex thinking of its own. A few researchers began to view recent brain and genetic
findings in a brighter, more flattering light, one distinctly colored by evolutionary theory. The resulting account of the
adolescent brain — call it the adaptive-adolescent story — casts the teen less as a rough draft than as an exquisitely
sensitive, highly adaptable creature wired almost perfectly for the job of moving from the safety of home into the complicated world outside.
5 This view will likely sit better with teens. More important, it sits better with biology's most fundamental principle, that of
natural selection. Selection is hell on dysfunctional traits. If adolescence is essentially a collection of them — angst, idiocy, and
haste; impulsiveness, selfishness, and reckless bumbling — then how did those traits survive selection? They couldn't — not if
they were the period's most fundamental or consequential features.
6 Dobbs explores the research on changes in the brain's dopamine system, for example, which is what drives teens to novelty
and thrill-seeking, and part of what makes socializing with their peers so overwhelmingly attractive. It's maddening to the parent,
who focuses on the risks while his teen seeks rewards, but evolutionarily speaking, this is exactly what sexually maturing humans
should be doing if they are to successfully find mates and create a social network to support child-raising.
7 Of course, the problem is that, these days, becoming a teen parent is not the typical road to success. Like many hard-wired
human imperatives — the genes that spur our bodies to seek and store fat, for example — it is more adapted to our past than to
our present. But when humans first evolved, early reproduction was probably advantageous. And in small, tribal groups, the
ability to fit in with one's peers was actually a life-or-death matter — much the way modern teens still tend to view it.
8 Dobbs' story covers a lot of other fascinating research; check it out here. And let's hope this new perspective engenders a little
more compassion for teens: they face big challenges, and being treated dismissively (or worse) by adults doesn't make it easier.
Maia Szalavitz is a health writer at TIME.com. Find her on Twitter at @maiasz.
#17 In 40 Years of Cancer Research, How Far Have We Come?
By Alice Park Wednesday, September 21, 2011
1 I don't normally write about anniversaries, but this one seems worth noting. It's been 40 years since President Richard Nixon
signed the National Cancer Act in 1971, the historic legislation that focused attention — and perhaps more importantly,
government funding — on the need to research and find treatments for cancer.
2 A lot has changed in the past four decades. The disease that doctors thought they knew then is very different from the cancer
they're studying today. For one thing, scientists have a much better understanding that cancer isn't simply one disease in which
cells suddenly start to grow out of control, but rather hundreds of different diseases. In fact, according to the American Association
for Cancer Research Cancer Progress Report, cancer is actually more like 200 distinct diseases, each spurred on by slightly
different causes and requiring different treatments.
3 And instead of focusing so slavishly on the tumors themselves, as experts did initially, researchers have enlarged the window
through which they study cancer, allowing the consideration of other critical features, such as how the patient's own makeup
might affect the disease. Scientists also look at how tumors tend to co-opt their environment for their own pathological needs,
turning healthy tissues into diseased ones in a process that makes cancer increasingly difficult to control.
4 "In the haste to continue research and fund it, you sometimes need to stop and turn around and look back at what we've
accomplished," notes Dr. William Dalton, a co-chair of the AACR committee writing the report. "The reduction in death rates of
many common cancers that has occurred over the last 40 years is incredible. That's important because that's huge progress
against something that is probably the biggest health scare for any society."
5 Indeed, death rates for cancer in the U.S. dropped by 22% for men and 14% for women between 1990 and 2007. And in
1975, only 50% of people diagnosed with cancer could expect to live for another five years; now nearly 70% do. Among children, the
gains are even greater: 80% of youngsters can expect to survive their childhood cancer today, compared with 52% in 1975.
6 Much of that success can be attributed to two key milestones in cancer research: understanding the simple lifestyle factors that
contribute to cancer and, on the opposite end of the technological spectrum, the mapping of the human genome in 2001. Behavioral
changes such as quitting smoking and avoiding exposure to UV rays, for example, have played a significant role in preventing lung
and skin cancers, while the Human Genome Project continues to yield new and useful information on the genetic drivers of cancer.
7 "If I were to rank developments over the past 40 years that have had the most impact on our understanding of cancer, I would
say genetics is No. 1, and lifestyle factors are No. 2," says Dalton.
8 Advances in genetics are making it possible to shift into the next phase of cancer care, a more personalized approach in which
every patient's cancer will be treated in the way that best suits his or her case. Already, in the past decade scientists have
developed more tailored therapies that target cancer cells specifically, and as these approaches become more routine and refined,
physicians will be better able to match the right therapies with the right cancers.
9 "When I think of the next 40 years, the ultimate goal is personalized medicine," says Dalton. "I think personalized medicine
will have the greatest impact on prevention." Matching an individual's cancer biology to the best treatments for that tumor will go
a long way toward controlling illness and death from cancer, and personalized approaches can help identify people at highest risk
of developing cancer as well.
10 Preventing cancer is just as important as treating it, he says, especially as the U.S. population ages. "By and large, cancer
affects older people; the longer we live, the more likely we will be to develop cancer," he says. That means that stopping the disease
before it even starts will have a huge impact on controlling costs and deaths in coming decades.
11 So while the AACR report highlights how far researchers have come in understanding cancer, it's clear that we're not close to
conquering cancer — at least not yet. Last year, more than 570,000 people died of cancer, still a sobering number that experts hope
to shrink in coming years. (727 words)
#18 What Do Gut Bugs Have to Do With High Cholesterol? A Lot
By Alice Park Friday, October 14, 2011
1 It's hard to think of bacteria as the good guys when it comes to health, but research increasingly shows that the bugs that live
in our gut may have a big influence on our well-being. In a new study, scientists find that these bugs may affect how patients
respond to potentially life-saving drugs like statins.
2 Statin drugs can help lower cholesterol levels and reduce the risk of heart disease and stroke in people who take them. But not
every patient benefits from the medications, and doctors didn't have any good explanations for why.
3 Now, researchers report in the journal PLoS One that the answer may lie with our gut bacteria. These bugs normally live in
our intestines and work to digest and break down the food we eat. But scientists have recently learned that their impact goes even
further; they also produce vitamins, assist the immune system and may affect everything from our weight to the way we respond to medications.
4 In the new study of 100 people, researchers found that those whose LDL, or bad cholesterol, dropped the most after taking the
statin drug simvastatin (Zocor) for six weeks also had higher levels of three bile acids made by the gut bacteria, compared with
those whose LDL didn't decline as much. The people who did not respond well to simvastatin showed higher levels of five other bile
acids made by intestinal flora.
5 Why the difference in drug response? The researchers believe that the five compounds made by the poor responders mimic statins,
and therefore compete with the drug in binding to the appropriate cells. So, having too many of these bile acids means that statin
molecules can't get to the liver cells where they would regulate production of cholesterol.
6 "We found that the benefit of statins could be partly related to the type of bacteria that lives in our guts," the study's co-author
Rima Kaddurah-Daouk, a professor of psychiatry at Duke University, said in a statement. "The reason we respond differently is
not only our genetic makeup, but also our gut microbiome."
7 That means that it might be possible to test people before they start taking certain statins to know who will respond and who
won't. Blood tests can detect the bile acids that may compete with simvastatin, so people who show high levels could be steered
toward different statins that may better control their cholesterol levels.
8 There's also the possibility of altering the landscape of the gut flora by consuming probiotics, in order to promote colonies of
bacteria that don't counteract the effects of statins. Some yogurts and other probiotic foods already attempt to take advantage of
this beneficial gut flora to boost well-being in other ways. "We're at a very early stage of understanding this relationship, but
[there's] no doubt that metabolites from bacteria are playing an important role in regulating our systems," said Kaddurah-Daouk.
And possibly helping us to be healthier. (483 words)
#19 Study: Chocolate Lovers Have Lower Risk of Stroke
By Sora Song Tuesday, October 11, 2011
1 The news keeps getting sweeter: eating chocolate has been linked to lower blood pressure, a reduced risk of heart disease and
now, in a new study, a lower risk of stroke in women.
2 Even better, the more chocolate women indulged in, the lower their stroke risk, Swedish researchers found. For every 50-gram
(1.8-oz.) increase in chocolate consumption per week, participants' overall stroke risk dropped 14%. The protective effect appeared
to kick in at 45 g (1.6 oz.) of chocolate a week, with women in the highest consumption group — who ate a median of 66.5 g (2.4 oz.),
or between one and two chocolate bars a week — enjoying a 20% lower risk of stroke than those who ate the least.
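Reading the per-50-gram figure as a simple multiplicative dose-response (a naive assumption of mine, not the study's actual statistical model), the reported numbers can be extrapolated to the top group's median intake:

```python
# Naive multiplicative reading of the dose-response: a 14% relative drop in
# stroke risk per 50 g of chocolate per week. The compounding assumption is
# mine for illustration; the study reports only the per-50-g figure.
def relative_risk(grams_per_week, drop_per_50g=0.14):
    """Relative stroke risk versus eating no chocolate, under the naive model."""
    return (1 - drop_per_50g) ** (grams_per_week / 50)

rr = relative_risk(66.5)   # median intake of the highest-consumption group
print(f"about {(1 - rr) * 100:.0f}% lower risk")
```

That back-of-the-envelope figure lands in the same ballpark as the 20% reduction reported for the highest-consumption group, which suggests the reported numbers are at least internally consistent.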
3 When broken down by type of stroke — ischemic, which occurs when a blood vessel supplying blood to the brain is blocked by a
clot, versus hemorrhagic, which occurs when a blood vessel in the brain weakens and bursts — the protective benefits of chocolate
varied. Each 50-gram-per-week increase in chocolate consumption was associated with a 27% drop in hemorrhagic stroke risk,
compared with a 12% lower risk of strokes caused by clots. Why the effect was greater with one type of stroke wasn't clear, the researchers said.
4 Led by Susanna Larsson, an associate professor in the division of nutritional epidemiology at the Karolinska Institute in
Stockholm, researchers tracked 33,372 women aged 49 to 83 for about 10 years, until 2008. At the start of the study, the women
filled out lengthy questionnaires about their diet and lifestyle, including how often they had eaten chocolate and about 95 other
foods in the previous year.
5 Over the next decade, researchers recorded 1,549 strokes, including 1,200 ischemic strokes, 224 hemorrhagic strokes and 125
that weren't specified. The protective effect of chocolate consumption on women's stroke risk persisted, even after researchers
adjusted for other major stroke risk factors. The findings fall in line with past research on the topic.
6 The potential health benefits of chocolate, especially dark chocolate, have been widely attributed to its flavonoids, antioxidant
compounds in cocoa that may boost the cardiovascular system. In other studies, researchers have shown that flavonoids can
enhance blood flow by relaxing blood vessels and lowering blood pressure. They may also inhibit clumping of platelets and reduce
inflammation, both of which contribute to cardiovascular health.
7 The question is, should women start gorging on chocolate to protect themselves from stroke? Not exactly. For one thing, chocolate
is decadent and is meant to be eaten in moderation. "Consuming too much chocolate is probably not good, as chocolate is rich in
sugar, fat and calories, and may lead to weight gain, which increases the risk of chronic diseases," says Larsson.
8 Chocolate isn't the only food that contains antioxidants, of course. "It's important to keep findings like these in context. These
findings don't mean that people need to exchange chocolate for broccoli in their diet," Dr. Nieca Goldberg, a cardiologist at NYU
Langone Medical Center in New York City, told HealthDay. "Chocolate does have antioxidants, and antioxidants are beneficial for
your health. ... But, what if they had tried this study with apple skins or grapes?"
9 While the study adds to the evidence that chocolate may be good for cardiovascular health, its observational nature can't prove
a direct effect. Its reliance on women's self-reports of diet and lifestyle further limits its findings.
10 The authors also note that 90% of the chocolate consumed in Sweden at the time of the questionnaire was Swedish milk
chocolate, which contains about 30% cocoa solids — a much higher concentration than what Americans are used to eating. So if
you're going to pick up a chocolate bar, the author suggests choosing dark chocolate, at least 70% cocoa, which has more
antioxidants and less sugar than milk chocolate. "Chocolate consumption in moderation, and preferably dark chocolate, along with
high consumption of other antioxidant-rich foods such as fruits and vegetables may help reduce the risk of stroke," Larsson says.
The study was published as a research letter in the Journal of the American College of Cardiology.