Happy Hiatus!
Hi, friends. I'm going to be away from this space for about a week and a half, so I'm leaving some nibbles to tide you over till I get back.
I thought about listing some "most popular" posts, but, well, you may have read those already. So instead, here are some less-loved posts from the past. Why less loved? Funny story: at the beginning of the summer, there were one-fifth as many of you reading as there are now.* While I'm away, why not share your favorite Inkfish story with a friend, or follow me here or on Twitter?
- - - - - - - - -
Science's Biggest Cancer Questions: What does the National Cancer Institute think are the most pressing unanswered questions about cancer? Featuring obesity, Alzheimer's disease, and sea turtles.
The One Funny Thing: You'd be surprised what shows up in the Methods section.
Little People, Big World: Tricky sensory illusions make you feel like a giant--or a Barbie.
Depression and the Loss of Old Friends (and Worms): An intriguing hypothesis links mental illness to a lack of dirt in our lives.
Woo Hoo, Witchy Woman: I love the New York Times, but sometimes they publish stupid stories about ovulating women in ponytails.
- - - - - - - - -
*Thank you.
Photo: Wikipedia/albert kok
Even Monkeys Can Tell Red from Grue
Unsure of the difference between ochre and ecru? Mauve and maize? Don't feel bad, because there's at least one color distinction you can handle: warm versus cool colors. You may have thought it was made up by your art teacher to torment you, but the concept is biologically based and universal to cultures around the world. Even a monkey knows the difference.
Researchers led by Youping Xiao at Mount Sinai School of Medicine based their study, in part, on data from the World Color Survey. That project, begun in the late 1970s, asked people from 110 cultures with unwritten languages to look at hundreds of color chips and name them.
Previous analysis of this survey found that no matter how many or how few color terms a language has--even if, say, speakers group together several English-language colors under one term--there's a certain "fault line" that these groupings almost never cross. That's the boundary between warm colors (such as orange or red) and cool ones (such as green or blue--or "grue," as some cultures combine them).
Xiao and colleagues analyzed the wavelengths of light reflected by each color chip used in the World Color Survey and calculated which cone cells in the human eye those colors correspond to. We see color using three types of cone cells: S, M, and L, for short, medium, and long wavelengths of light. The researchers found that the warm-cool boundary corresponded to how much a color stimulated L versus M cone cells. So the distinction isn't one that only artsy people can see--there's a real difference in how warm and cool colors hit our eyes.
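To get a feel for the kind of calculation involved, here is a minimal sketch that turns two cone responses into a warm-or-cool call. It is not the study's code: the contrast formula, the zero threshold, and the cone response numbers are all illustrative assumptions.

```python
# Minimal sketch (not the study's analysis code): classify a color as "warm" or "cool"
# from its L- and M-cone responses, using an L-versus-M contrast like the one described above.
# The specific formula, threshold, and cone response values are illustrative assumptions.

def lm_contrast(l_response: float, m_response: float) -> float:
    """Normalized L-versus-M cone contrast: positive leans warm, negative leans cool."""
    return (l_response - m_response) / (l_response + m_response)

def warm_or_cool(l_response: float, m_response: float) -> str:
    return "warm" if lm_contrast(l_response, m_response) > 0 else "cool"

# Hypothetical cone responses for two color chips
print(warm_or_cool(0.62, 0.48))  # drives L cones harder -> "warm"
print(warm_or_cool(0.41, 0.55))  # drives M cones harder -> "cool"
```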
The researchers also studied brain images from macaque monkeys as they viewed various colors. Scans of the visual cortex in each macaque showed the same kind of polarity: two different clusters of neurons lit up, depending on whether a color was warm or cool.
Earlier studies had shown that colors are spatially arrayed in a "hue map" in the brain, with different patches of neurons corresponding to separate colors. No matter how many colors our brains or our languages have taught us to distinguish between, though, the anatomy of our eyes seems to make certain distinctions universal. This gives at least a partial answer to the question of whether everyone else is seeing the same colors you are. Whether you carefully sort your sweaters into chartreuse and lime or just call the whole thing "grue," you would never confuse a warm hue with a cool one.
If you want to test exactly how accurate your color perception is, try arranging the color chips on this hue test. It might make your brain feel muddled, like your hue map is missing its compass. But keep in mind that the macaques were only looking at nine different colors.
Image: from xrite.com online color challenge
Xiao, Y., Kavanau, C., Bertin, L., & Kaplan, E. (2011). The Biological Basis of a Universal Constraint on Color Naming: Cone Contrasts and the Two-Way Categorization of Colors. PLoS ONE, 6(9). DOI: 10.1371/journal.pone.0024994
Eat Your Grains (They're Controlling Your Genes)
Scientists made the startling assertion this week that RNA from our food can survive digestion, sneak into our cells, and control our genes. Tiny molecular messengers made inside other species--even other kingdoms of species--work just fine in our bodies, latching onto our genetic material and causing system-wide change. Our understanding of diet and nutrition may be in for a shake-up.
A group of researchers in China has been studying microRNAs (abbreviated miRNAs). These stunted nucleotide chains don't code for proteins themselves; instead, they regulate how genes are expressed. MicroRNAs latch onto the messenger RNAs that genes produce and affect their activity, usually dialing it down. Unlike most DNA and RNA, miRNAs can also leave the cell and circulate through the body in the blood.
Looking at a group of healthy Chinese men and women, the researchers identified about 30 miRNAs circulating in their blood that weren't human, or even mammalian, in origin: They came from plants. A structural difference confirmed that the tiny molecules were made in plants, and weren't just animal mimics. Checking the blood of other mammals including mice, horses, and cows, the researchers found more plant miRNAs.
Since a couple of the plant miRNAs that cropped up most often in subjects' bodies are also present in rice, the researchers guessed they were coming from the rice the subjects ate. They used mice to test their theory. The rice miRNA was already present in laboratory mouse chow--explaining its presence in mouse circulatory systems--but there's much more of it in rice. So the scientists fed fresh rice to mice and measured the miRNAs in their systems several hours afterward. Sure enough, higher levels of rice miRNAs appeared throughout the digestive systems of the mice.
The miRNAs seemed to be able to survive both cooking and digestion. Researchers tried leaving them in acid for six hours, simulating the effect of stomach acid, but the miRNAs still didn't break down.
Once intact plant miRNAs left the digestive system and passed into body tissues, were they doing anything? Looking at human cells, the researchers found that the miRNAs from rice latched onto the messenger RNA of a certain gene that's active in the liver. That gene is responsible for removing LDL ("bad cholesterol") from the circulatory system. When the rice miRNAs bound to its messenger RNA, the gene was less active. In mice that had eaten rice, the same liver gene was less active--and a few days later, the mice had higher levels of LDL cholesterol in their systems. The liver gene had been dialed down by plant miRNAs, and circulating cholesterol had risen as a result.
The authors think cells in the small intestine take up miRNAs from our digestive tracts, package them into little bubbles called microvesicles, and send them into our circulatory systems. From there, miRNAs find the tissues and cell types they fit best with.
What's incredible is that RNA molecules from an entirely different kingdom of life can affect our genes. The last time we shared a common ancestor with a rice plant, it was single-celled. Almost nothing about us is the same. But their keys still fit in our locks. Plant miRNAs may do a different job in our bodies than in the plants they come from, but we've been evolving with these visitors all along. Our bodies must expect them, and even need them, to enter with our food.
If miRNAs from plants can function in our body, then any and every other food source could be passing us miRNAs that tweak the activity of our genes. "Food-derived miRNAs may serve as a novel essential nutrient," the authors say, as important to our diet as vitamins and minerals. MicroRNAs could be added to foods as fortification. Illnesses could be tied to miRNA deficiencies in our diets. We could take Flintstones chewable miRNAs to stay healthy.
MicroRNAs seem nearly indestructible--they apparently handle being cooked and digested with no problem. But it's possible that our treatment of certain foods destroys their miRNAs. When we eat highly processed foods, are we depriving our bodies of nutrients we never knew existed? And as genetically modified crops become more widespread, we'll want to consider whether we're modifying those crops' miRNAs as well, and how those changes might help or harm us. Staple crops such as rice and corn aren't just foods on our plates; they're also old acquaintances that share responsibility for regulating our genes. Whatever we do to our food, it'll be best if we can still recognize each other.
Zhang, L., Hou, D., Chen, X., Li, D., Zhu, L., Zhang, Y., Li, J., Bian, Z., Liang, X., Cai, X., Yin, Y., Wang, C., Zhang, T., Zhu, D., Zhang, D., Xu, J., Chen, Q., Ba, Y., Liu, J., Wang, Q., Chen, J., Wang, J., Wang, M., Zhang, Q., Zhang, J., Zen, K., & Zhang, C. (2011). Exogenous plant MIR168a specifically targets mammalian LDLRAP1: evidence of cross-kingdom regulation by microRNA. Cell Research. DOI: 10.1038/cr.2011.158
Are You Yawning Because Your Brain's Hot?
Everyone knows yawning is the pinkeye of social cues: powerfully contagious and not that attractive. Yet scientists aren't sure what the point of it is. Is yawning a form of communication that evolved to send some message to our companions? Or is the basis of yawning physiological, and its social contagiousness unrelated? A new paper suggests that yawning--even when triggered by seeing another person yawn--is meant to cool down overheated brains.
We're not the only species that feels compelled to yawn when we see others doing it. Other primates, and possibly dogs, have been observed catching a case of the yawns. But Princeton researcher Andrew Gallup thinks the root cause of yawning is in the body, not the mind. After all, we yawn when we're alone, not just when we're with other people.
Previously, Gallup worked on a study that involved sticking tiny thermometers into the brains of rats and waiting for them to yawn. The researchers observed that yawning and stretching came after a rapid temperature rise in the frontal cortex. After the yawn and the stretch, rats' brain temperatures dropped back to normal. The authors speculated that yawning cools the blood off (by taking in a large amount of air from outside the body) and increases blood flow, thereby bringing cooler blood to the brain.
If yawning's function is to cool the brain, Gallup reasoned, then people should yawn less often when they're in a hot environment. If the air outside you is the same temperature as your body, it won't make you less hot.
To test that theory, researchers went out into the field--namely, the sidewalks of Tucson, Arizona--in both the winter and the summer. They recruited subjects walking down the street (80 people in each season) and asked them to look at pictures of people yawning. Then the subjects answered questions about whether they yawned while looking at the pictures, how much sleep they'd gotten the night before, and how long they'd been outside.
The researchers found that the main variable affecting whether people yawned was the season. It's worth noting that "winter" in Tucson was a balmy 22 degrees Celsius (71 degrees Fahrenheit), while summer was right around body temperature. In the summer, 24% of subjects reported yawning while they looked at the pictures. In the winter, that number went up to 45%.
Additionally, the longer people had been outside in the summer heat, the less likely they were to yawn. But in the winter, the opposite was true: People were more likely to yawn after spending more time outside. Gallup speculates that because the testing took place in direct sunlight, subjects' bodies were heating up, even though the air around them remained cooler. So a yawn became more refreshing to the brain the longer subjects stood outside in the winter, but only got less refreshing as they sweltered in the summer.
The study used contagious yawning rather than spontaneous yawning, presumably because it's easier to hand subjects pictures of yawning people than to aggressively bore them. Gallup notes that contagious and spontaneous yawning are physically identical ("a stretching of the jaw and a deep inhalation of air," if you were wondering), so one can stand in for the other. Still, it would be informative to study people in a more controlled setting--in a lab rather than on the street, and preferably not aware that they're part of a yawning study.
A lab experiment would also allow researchers to directly observe whether their subjects yawned, rather than just asking them. In the field, researchers walked away while subjects were looking at the pictures, since people who know they're being watched are less likely to yawn. But self-reported results might not be accurate. The paper points out that "four participants in the winter condition did not report yawning during the experiment but yawned while handing in the survey to the experimenter."
Still, it seems there's a real connection between brain temperature and yawning. It will take more research (and more helplessly yawning subjects) to elucidate exactly what the connection is. Even if brain temperatures always rise right before a yawn and fall afterward, cooling the brain might not be the point of the yawn--another factor could be causing the impulse to yawn, and the temperature changes could be a side effect. Studying subjects in a truly cold environment, and showing that they are once again less likely to yawn (because outside air would cool their brains too much), would provide another piece of evidence that temperature triggers the yawn in the first place.
None of this tells us why yawning is so catching, though. Personally, I think I yawned at least a thousand times while reading and writing about this paper. Maybe I should have taken some advice from an older study by Andrew Gallup, which found that you can inhibit yawning by breathing through your nose or putting something chilly on your forehead.
Exercise and Your Immune System Revisited
It's not every day I get an email from someone in Taiwan about exercise, white blood cells, and menstruation. But in response to my post How Much Exercise Harms Your Immune System?, Guan-Da Syu from National Cheng Kung University Medical College dropped me a friendly note (if you can call an email with its own bibliography a "note") a few days ago. Syu is the lead author of the paper I'd discussed in that post, and he wanted to respond to some questions I raised.
The paper reported that after out-of-shape individuals engaged in sudden and intense exercise, their white blood cells died at an accelerated rate. An increase in reactive, oxygen-containing molecules seemed to be the culprit. But when those same people got consistent and moderate exercise--five days a week for 30 minutes--their white blood cells lived for longer. Furthermore, consistent exercise buffered the harmful effects of more strenuous exercise sessions on white blood cells.
I had asked whether we could be sure that shortening or increasing the life span of white blood cells (specifically, neutrophils) had a net negative or positive effect on individuals' immune systems. Might the body compensate somehow? Syu says that it's hard to quantify a person's immunity, but his findings fit with other research that linked extreme exercise with higher infection risk. Additionally, he says that after severe exercise, it takes about half a day for the proportion of healthy white blood cells in the body to return to normal.
The consistent exercisers, Syu says, adapt to the oxidizing molecules, and begin to produce neutrophils that live for longer. I had asked whether prolonging the lifespan of short-lived cells might be a burden on the body somehow, but Syu points out that the number of neutrophils living in the body at one time remains the same throughout subjects' exercising or sedentary weeks. We don't have more white blood cells when we're in shape; we are able to produce fewer because they live for longer.
Finally, I'd pointed out that since the study only used male subjects, it's hard to generalize the results for women, whose bodies don't necessarily react to exercise in the same way. Syu acknowledges that it's unknown how exercise affects women's white blood cells. But it's possible that the effect might depend on the time of the month. In a previous study, the same research group looked at women's platelets (the blood cells responsible for clotting). In the first half of the menstrual cycle, extreme exercise had a notable effect on platelet function. But in the second half of the cycle (from ovulation to menstruation), women's platelets didn't respond to severe exercise in any way.
So the effect of exercise on your immune system might depend on many factors: how hard you work out; how consistently you work out; whether you're a woman and what time of the month it is. It seems that consistent exercise protects your immune system, but going from zero to 60 when you start a workout routine is harmful. And women, for now, will remain a mystery.
Syu, G., Chen, H., & Jen, C. (2011). Severe Exercise and Exercise Training Exert Opposite Effects on Human Neutrophil Apoptosis via Altering the Redox Status. PLoS ONE, 6(9). DOI: 10.1371/journal.pone.0024385
Evolved for Arrogance
Why does nature allow us to lie to ourselves? Humans are consistently and bafflingly overconfident. We consider ourselves more skilled, more in control, and less vulnerable to danger than we really are. You might expect evolution to have weeded out the brawl-starters and the X-Gamers from the gene pool and left our species with a firmer grasp of our own abilities. Yet our arrogance persists.
In a new paper published in Nature, two political scientists say they've figured out the reason. There's no mystery, they say; it's simple math.
The researchers created an evolutionary model in which individuals compete for resources. Every individual has an inherent capability, or strength, that simply represents how likely he or she is to win in a conflict. If an individual seizes a resource, the individual gains fitness. If two individuals try to claim the same resource, they will both pay a cost for fighting, but the stronger individual will win and get the resource.
Of course, if everyone knew exactly how likely they were to win in a fight, there would be no point in fighting. The weaker individual would always hand over the lunch money or drop out of the race, and everyone would go peacefully on their way. But in the model, as in life, there is uncertainty. Individuals decide whether a resource is worth fighting for based on their perception of their opponents' strength, as well as their perception of their own strength. Both are subject to error. Some individuals in the model are consistently overconfident, overestimating their capability, while others are underconfident, and a few are actually correct.
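To make the payoff structure concrete, here is a minimal sketch of a single contest under that kind of decision rule. It is not the authors' published model; the specific forms of the bias and error terms, and every parameter value, are illustrative assumptions.

```python
import random

# Minimal sketch (not the authors' code) of the contest described above: each individual
# perceives its own strength with a confidence bias and its rival's strength with random
# error, claims the resource if it thinks it is stronger, and fights only if both claim.
# Parameter values are arbitrary illustrations.

def contest(strength_a, strength_b, bias_a, bias_b, benefit=2.0, cost=1.0, noise=0.5):
    perceived_self_a = strength_a + bias_a                   # own strength, inflated or deflated by bias
    perceived_self_b = strength_b + bias_b
    perceived_rival_a = strength_b + random.gauss(0, noise)  # rival's strength, seen with error
    perceived_rival_b = strength_a + random.gauss(0, noise)

    a_claims = perceived_self_a >= perceived_rival_a
    b_claims = perceived_self_b >= perceived_rival_b

    if a_claims and b_claims:                                # both claim: fight, both pay the cost
        a_wins = strength_a > strength_b
        return (benefit if a_wins else 0.0) - cost, (0.0 if a_wins else benefit) - cost
    if a_claims:                                             # unopposed claim: take the resource for free
        return benefit, 0.0
    if b_claims:
        return 0.0, benefit
    return 0.0, 0.0                                          # neither claims, no one gains or loses

# One illustrative contest: an overconfident weakling (bias +1) vs. an underconfident rival (bias -1)
print(contest(strength_a=1.0, strength_b=1.5, bias_a=1.0, bias_b=-1.0))
```

Selection acting on the confidence bias over many such contests, generation after generation, is what produces the population-level patterns described next.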
Using their model, the researchers ran many thousands of computer simulations that showed populations evolving over time. They found that the simulated populations, beginning with individuals of various confidence levels, eventually reached a balance. What that balance was, though, depended on their circumstances.
When the ratio of benefits to costs was high--that is, when resources were very valuable and conflict was not too costly--the entire population became overconfident. As long as there was any degree of uncertainty in how individuals perceived each other's strength, it was beneficial for everyone to overvalue themselves.
At intermediate cost-to-benefit ratios, where neither costs nor benefits strongly outweighed the other, the computerized populations reached a stable mix of overconfident and underconfident individuals. Neither strategy won out; both types of people persisted. In general, the more uncertainty was built into the model, the more extreme individuals' overconfidence or underconfidence became.
When the cost of conflict was high compared with the benefit of gaining a resource, the entire population became underconfident. Without having much to gain from conflict, individuals opted to avoid it.
The authors speculate that humans' tendency toward overconfidence may have evolved because of a high benefit-to-cost ratio in our past. If the resources available to us were valuable enough, and the price of conflict was low enough, our ancestors would have been predicted to evolve a bias toward overconfidence.
Additionally, our level of confidence doesn't need to wait for evolution to change; we can learn from each other and spread attitudes rapidly through a region or culture. The researchers call out bankers and traders, sports teams, armies, and entire nations for learned overconfidence.
Though our species' arrogance may have been evolutionarily helpful, the authors say, the stakes are higher today. We're not throwing stones and spears at each other; we have large-scale conflicts and large-scale weapons. In areas where we feel especially uncertain, we may be even more prone to grandiosity, like the overconfident individuals in the model who gained more confidence when they had less information to go on. When it comes to negotiating with foreign leaders, anticipating natural disasters, or taking a stand on climate change, brains that have evolved for self-confidence could get us in over our heads.
Johnson, D., & Fowler, J. (2011). The evolution of overconfidence. Nature, 477(7364), 317-320. DOI: 10.1038/nature10384
How Much Exercise Harms Your Immune System?
I'm looking at you, marathoners and triathletes. While you're out there building superhuman endurance and making the rest of us feel bad, are you also beefing up your immune systems? Or does becoming an Ironwoman actually weaken your body's defenses?
It may depend on how you're exercising. Researchers in Taiwan compared two types of exercise, the names of which might reveal the researchers' own feelings toward hitting the gym: "Acute Severe Exercise" (ASE) and "Chronic Moderate Exercise" (CME). In medicine, "acute" is something that comes on quickly and is over soon, as opposed to a chronic illness. The flu, say, as opposed to mono.
The subjects were 13 males between the ages of 20 and 24. Though young and otherwise healthy, they weren't in shape; the subjects had been getting less than one hour a week of exercise for at least the past six months. At the beginning of the study, all 13 subjects underwent "acute" exercise, cycling at increasing levels of difficulty until they reached exhaustion.
Afterward, five subjects became controls. They were told to continue not exercising for the next four months. Twice during that period, they showed up for another bout of ASE, so researchers could make sure that their bodies and their exercise abilities were staying the same. Meanwhile, the other eight subjects began two months of "chronic" exercise. They worked out five days a week for 30 minutes. The moderate intensity of their workout was defined as a percentage of the work they'd been able to do during ASE. After two months, the exercisers were also instructed to stop exercising. They spent two more months getting no exercise at all. In each month of the study, they also did an ASE test so researchers could see how their bodies' response to severe exercise was changing.
Outwardly, the effect of consistent (excuse me, chronic) exercise on the bodies of formerly sedentary people was unsurprising. After two months of training, the CME subjects had lost weight, lowered their resting heart rates, and increased their endurance. Then they stopped exercising. After the two-month "detraining" period, subjects' weights and heart rates had returned to their original levels, though the work they could do in the ASE task was still elevated, showing a lasting effect on their fitness. The control subjects did their job well, staying the same during the four months.
But what the researchers were interested in was the inner changes in their subjects; namely, changes to white blood cells called neutrophils. These are key players in the immune system, responding to the site of infection in the body and attacking any invaders they find. Neutrophils are short-lived cells, committing cell suicide (called apoptosis) after only a few days in the bloodstream. If these white blood cells are too enthusiastic about offing themselves, it can weaken the immune system.
Neutrophil death may be linked to the abundance of oxygen-containing molecules that react with everything around them, harming structures inside the cell. Since extreme exercise can increase the amount of these harmful "reactive oxygen species" in the body's tissues, the researchers wanted to know how exercise affected neutrophils. They drew blood from their subjects periodically, both at rest and after their ASE trials, and removed the neutrophils for analysis.
They found that "acute severe exercise" did, in fact, accelerate neutrophil suicide. It also increased the amount of reactive, oxygen-containing molecules in the cells.
"Chronic moderate exercise," on the other hand, appeared to slow down the death of neutrophils. After two months of regular exercise, subjects' white blood cells were showing less oxidative stress and slower apoptosis. Even after subjects spent the following two months not exercising, the effect lingered.
In a final twist, the positive effects of consistent exercise seemed to counteract the harmful effects of extreme exercise. After the acute exercise task, subjects who'd been exercising regularly did not show the same damage to their neutrophils that they had at first. But after two sedentary months, the protective effect had begun to fade.
What does all this mean for the marathoner or the Ironwoman? Unfortunately, since the subjects were all men, the study says very little about women of any kind. But for the young, previously sedentary males involved, the study suggests that sudden, exhausting exercise accelerates the death of certain immune cells. Consistent and moderate exercise, on the other hand, prolongs these cells' lives. It also buffers the damaging effect of occasional extreme exercise. And when you stop exercising, the positive effects of your old routine linger, at least for a little while.
The researchers point to other studies that have shown a connection between sudden, extreme exercise and upper respiratory tract infections. In this study, we can't see the effect that various rates of neutrophil death had on subjects' immune systems as a whole. When neutrophil death was accelerated after acute exercise, were subjects truly more vulnerable to infection, or did the immune system compensate somehow for neutrophil loss? In subjects who got regular exercise and prolonged the lives of their neutrophils, was the immune system strengthened? Does keeping these short-lived cells alive for longer necessarily help prevent infection, or could it create a burden for the body?
Overall, the authors think the evidence is in favor of consistent and moderate exercise. For patients whose immune systems are impaired by HIV or chemotherapy, regular exercise might provide a boost. This study suggested that consistent exercise counteracts the negative effects of extreme exercise--at least some of the effects. But to stay on the safe side, the authors recommend that you avoid "acute severe exercise" like, well, the plague.
Syu, G., Chen, H., & Jen, C. (2011). Severe Exercise and Exercise Training Exert Opposite Effects on Human Neutrophil Apoptosis via Altering the Redox Status. PLoS ONE, 6(9). DOI: 10.1371/journal.pone.0024385
Non-Aging Plant Gets Better Every Century
Clinging to rock piles high in the Pyrenees, the plant Borderea pyrenaica has a modest lifestyle: It grows a new shoot every summer, flowers and fruits, then sheds its aboveground growth to survive the winter as a tuber. What's remarkable is how long this life lasts. Individual plants have been known to live 300 years or more. Scientists headed up into the mountains to find out whether these plants, in all their years of living, ever actually get old.
"Senescence" is what we usually call aging--getting weaker and closer to death as we get on in years. To us humans, it seems like a fact of life. But some other animals are thought to be "negligibly senescent." Certain fish, turtles, and other sea creatures seem to be perfectly healthy and fertile at 100 or 200 years old; they're no more likely to die at that age than at any other. Some plants, and especially some trees, may have nearly unlimited lifespans.
Scientists--not to mention cosmetics companies--would love to know exactly why humans are stuck with senescence while organisms like the bristlecone pine just get more fabulous with age. Unfortunately, it's difficult for those of us with limited lifespans to study those without. To squeeze some secrets out of Borderea pyrenaica, scientists from Spain and Sweden studied two populations of the plant over the course of five years.
Because Borderea pyrenaica is left with a scar on its tuber when each year's growth dies back, researchers could count the scars to calculate an individual tuber's age. Each year, they counted and measured the leaves on each plant. They also counted the plants' flowers, fruits and seeds. Since the plants come in male and female versions, the researchers would be able to compare aging in both--would the metabolic effort of making fruits and seeds take a toll on female plants' lifespans? At the end of the study, the researchers dug up all the tubers, dried them and weighed them. (Aesop says: Don't be jealous of negligibly senescent organisms. If old age doesn't kill you, science will!)
The researchers were able to calculate the age of almost 750 plants that were up to 260 years old. They found that tubers grew in size each year, reaching their maximum size after 50 or 100 years (depending on the population). As the tubers grew, the shoots that they put out each year got bigger too. After they reached about 60 years old, the plants didn't seem any more likely to die with the passing years. If anything, survivorship seemed to increase in old age. There was no difference between male and female plants.
As they got bigger, both types of plants put out more flowers, giving them greater potential to contribute to the next generation. This meant that the plants' "reproductive value"--an individual's expected fertility from its current age onward--actually increased over their entire lifespan.
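For readers who want the textbook definition: in life-history theory, an individual's reproductive value at age $x$ is often written (in a simplified form that ignores population growth, which may not match the paper's exact formulation) as

$$v_x = \sum_{y \ge x} \frac{l_y}{l_x}\, m_y,$$

where $l_y$ is the probability of surviving from birth to age $y$ and $m_y$ is the expected number of offspring produced at age $y$. For these tubers, survival holds steady or improves with age and flower output keeps climbing, so the sum grows instead of shrinking.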
It seems unlikely that we'll one day tap into some biological secret that enables us to live forever. But further research into the plants and animals that don't deteriorate with age might help us solve the mysteries of our own mortality. We may not ever become ageless, but we could learn to age with some of the grace of a lobster, or a mountain tuber.
Garcia, M., Dahlgren, J., & Ehrlén, J. (2011). No evidence of senescence in a 300-year-old mountain herb. Journal of Ecology. DOI: 10.1111/j.1365-2745.2011.01871.x
"Senescence" is what we usually call aging--getting weaker and closer to death as we get on in years. To us humans, it seems like a fact of life. But some other animals are thought to be "negligibly senescent." Certain fish, turtles, and other sea creatures seem to be perfectly healthy and fertile at 100 or 200 years old; they're no more likely to die at that age than at any other. Some plants, and especially some trees, may have nearly unlimited lifespans.
Scientists--not to mention cosmetics companies--would love to know exactly why humans are stuck with senescence while organisms like the bristlecone pine just get more fabulous with age. Unfortunately, it's difficult for those of us with limited lifespans to study those without. To squeeze some secrets out of Borderea pyrenaica, scientists from Spain and Sweden studied two populations of the plant over the course of five years.
Because Borderea pyrenaica is left with a scar on its tuber when each year's growth dies back, researchers could count the scars to calculate an individual tuber's age. Each year, they counted and measured the leaves on each plant. They also counted the plants' flowers, fruits and seeds. Since the plants come in male and female versions, the researchers would be able to compare aging in both--would the metabolic effort of making fruits and seeds take a toll on female plants' lifespans? At the end of the study, the researchers dug up all the tubers, dried them and weighed them. (Aesop says: Don't be jealous of negligibly senescent organisms. If old age doesn't kill you, science will!)
The researchers were able to calculate the age of almost 750 plants that were up to 260 years old. They found that tubers grew in size each year, reaching their maximum size after 50 or 100 years (depending on the population). As the tubers grew, the shoots that they put out each year got bigger too. After they reached about 60 years old, the plants didn't seem any more likely to die with the passing years. If anything, survivorship seemed to increase in old age. There was no difference between male and female plants.
As they got bigger, both types of plants put out more flowers, giving them greater potential to contribute to the next generation. This meant that the plants' "reproductive value"--an individual's expected fertility from its current age onward--actually increased over their entire lifespan.
It seems unlikely that we'll one day tap into some biological secret that enables us to live forever. But further research into the plants and animals that don't deteriorate with age might help us solve the mysteries of our own mortality. We may not ever become ageless, but we could learn to age with some of the grace of a lobster, or a mountain tuber.
Garcia, M., Dahlgren, J., & Ehrlén, J. (2011). No evidence of senescence in a 300-year-old mountain herb Journal of Ecology DOI: 10.1111/j.1365-2745.2011.01871.x
Meet the Brain's Timekeepers
There are minutes and hours of our lives in which nothing happens, and these don't seem on the surface to be very challenging for our memories. At least, they make for succinct stories: "I waited 20 minutes for the doctor to come in." "I tossed and turned for hours last night." But how do we know it's been hours? How do we represent these chunks of lost time in our memories, accounting for all the empty minutes without actually losing them? Researchers at Boston University think they've found the answer. Buried in the brain's memory center, "time cells" tick away the moments like the second hand on a clock.
Deep inside the brain, the hippocampus helps us to remember sequences of events and form new memories. Howard Eichenbaum and his colleagues implanted electrodes into the hippocampi of four rats. They wanted to observe which neurons were active at different points during a task that involved a delay and challenged the rats' sequential memories.
The rats were trained to complete several steps: First, they entered a corridor and saw (and sniffed) an object, either a green wooden block or half a green rubber ball. Then a door was opened, releasing the rats into the next part of the corridor. When the door shut behind them, the rats were trapped in the blank hallway for 10 seconds. After the delay, another door opened, leading the rats to a flowerpot filled with sand. The rats sniffed at the sand, which had been mixed with either cinnamon or basil. In training, the rats had learned to match each smell with one of the two green objects. If the smell was the correct match for the object they'd seen 10 seconds earlier, the rats could dig in the flowerpot to get a reward (a third of a Froot Loop, in case you wondered). If the smell didn't match, the rats could earn their reward by leaving the flowerpot undisturbed and going around the corner.
In trials, the rats repeated this mini-maze 100 or so times in a row. In order to succeed, they had to keep the order of recent events straight in their memories. (Did I see the green ball before the most recent delay, or before I saw the last flowerpot?) The researchers recorded the activity of a few hundred hippocampal neurons during the whole trial. About half of the neurons they looked at fired during the 10-second delay.
What was interesting about these cells was that they fired one after another throughout the delay. With their successive firings, the neurons covered the whole empty time from start to finish, like a team of runners in a very short relay race.
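If you're curious how that relay race shows up in the data, here is a minimal sketch of the standard sorting trick: bin each cell's spikes across the delay and order the cells by when they fire most. It is not the authors' analysis code, and the spike times below are simulated placeholders rather than recorded data.

```python
import numpy as np

# Minimal sketch (not the authors' analysis code): reveal sequential "time cell" firing by
# binning each cell's spikes across the 10-second delay and sorting cells by their peak bin.
# The spike times below are simulated placeholders, not recordings.

rng = np.random.default_rng(0)
delay, n_cells, n_bins = 10.0, 20, 50

# Fake data: each cell prefers a different moment in the delay and spikes around it
preferred_times = rng.uniform(0, delay, n_cells)
spike_trains = [np.clip(rng.normal(t, 0.5, size=30), 0, delay) for t in preferred_times]

# Bin spikes into a firing-rate vector per cell, then find each cell's busiest bin
edges = np.linspace(0, delay, n_bins + 1)
rates = np.array([np.histogram(spikes, bins=edges)[0] for spikes in spike_trains])
order = np.argsort(rates.argmax(axis=1))

# With rows sorted by peak time, activity marches diagonally through the delay --
# each cell handing off to the next, like the relay race described above.
print(rates[order])
```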
The researchers dubbed these neurons "time cells" because they seem to keep track of time. Similarly, "place cells" are neurons that are known to fire when a rat is in a specific place. They keep up their activity as a rat moves through an open space, pacing off an otherwise unremarkable landscape just as time cells appear to keep track of empty time.
When the researchers redid the experiment and doubled the delay in one block of trials, they saw some of the time cells adjusting their firing frequency, while other teams of cells kept up their regular tick-tick-tick. They think this means time cells can monitor relative time as well as absolute time.
Could the firing of similar time cells in human brains define how we understand time? Some research has suggested that we experience life in chunks of about three seconds. Each of those chunks makes up a single moment, the theory goes, separating "right now" from everything before and after.
It's an appealing idea: Even when nothing of note is happening, our brains are steadily observing and recording so that we can sort out events in recollection. Maybe other areas of the brain measure time in their own ways. Or maybe the only timekeepers are in the hippocampus--the memory center--meaning that to our brains, time is only important in its passing.
MacDonald, C., Lepage, K., Eden, U., & Eichenbaum, H. (2011). Hippocampal “Time Cells” Bridge the Gap in Memory for Discontiguous Events. Neuron, 71(4), 737-749. DOI: 10.1016/j.neuron.2011.07.012
Are Your Gut Bacteria Vegetarian?
This spring, scientists announced that each person seems to have a signature set of gut bacteria, like a blood type for the microbiome. Their human subjects fell into three separate "enterotypes," each one representing a distinct microbial ecosystem. The enterotypes didn't correlate with subjects' age, gender, or nationality. But a new study has found something that does predict which enterotype you'll host: whether your diet is plant- or animal-based.
Researchers surveyed 98 subjects about their recent and long-term diets, and collected stool samples from each of them. The scientists gave this portion of their study the very belabored acronym COMBO, for "Cross-sectional study Of diet and stool MicroBiOme composition." (It's not clear whether COMBO is meant to reference the combination of survey data and fecal data, or the resemblance a cheese-filled pretzel tube has to a cross-section of the colon.)
The researchers used these diet surveys to create profiles of subjects' nutrient intake. They also sequenced the DNA in subjects' stool samples to create profiles of their bacterial communities.
What they found was that subjects clustered into two bacterial ecosystems: one group's guts were dominated by bacteria in the genus Bacteroides, the other group's by Prevotella bacteria. (These match up with two of the three enterotypes described in the earlier study. In the current study there was some evidence for the third enterotype the other authors had described--characterized by Ruminococcus bacteria--but in most statistical analyses it was merged with Bacteroides.)
Comparing enterotype data to dietary data, they saw a strong correlation between a person's bacterial type and his or her preferred nutrient sources. The Bacteroides enterotype showed up in people who consumed more animal proteins and fats. The Prevotella type, on the other hand, went along with higher consumption of carbohydrates and fiber--in other words, plants.
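If you're curious what "cluster subjects into enterotypes, then compare the clusters to diet" could look like in practice, here's a rough sketch. The abundance and diet numbers are entirely made up, and the simple 2-means clustering is much cruder than the community-level methods the paper uses; it's meant only to show the shape of the analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data, not from the paper: each subject's relative abundance of
# two genera, plus reported fat intake (g/day). Fat intake is built to track
# Bacteroides here so the sketch has something to find.
n = 98
bacteroides = np.clip(rng.normal(0.5, 0.25, n), 0.01, 0.99)
prevotella = 1 - bacteroides                          # pretend these two sum to 1
fat = 40 + 60 * bacteroides + rng.normal(0, 10, n)

abund = np.column_stack([bacteroides, prevotella])

# Crude 2-means clustering of the abundance profiles, starting from the two
# most extreme subjects so both clusters begin non-empty.
centers = abund[[np.argmin(bacteroides), np.argmax(bacteroides)]]
for _ in range(20):
    labels = np.argmin(((abund[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
    centers = np.array([abund[labels == k].mean(axis=0) for k in (0, 1)])

# Call the cluster whose center has more Bacteroides the "Bacteroides enterotype,"
# then compare average fat intake between the two groups.
b_cluster = int(np.argmax(centers[:, 0]))
print("mean fat intake, Bacteroides-type subjects:", round(float(fat[labels == b_cluster].mean()), 1))
print("mean fat intake, Prevotella-type subjects: ", round(float(fat[labels != b_cluster].mean()), 1))
```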
The second portion of the study was called CAFE, short for Controlled Feeding Experiment. You might notice that there's no A in the experiment's actual title--but that's fitting, given that the experiment was basically the opposite of a café. A patron in a café usually expects to choose food from a menu, whereas the 10 subjects in the CAFE experiment had to stay in the research center for 10 days and eat exactly what was put in front of them, no more and no less. (This sounds more like a CAFO, or Concentrated Animal Feeding Operation, also called a factory farm.) Half the subjects were put on a high-fat/low-fiber diet, and the other half ate a low-fat/high-fiber diet. Both groups were served the same foods; the only difference between the two diets was the proportion of each food on the plate.
Also unlike in a café, subjects had to provide daily stool samples to the researchers. All 10 subjects started out with the Bacteroides (high protein and fat) enterotype. Over the course of their 10-day diet, none of the high-fiber subjects switched over to the opposite enterotype. Even though their bacterial types seemed to be based on their long-term diets, a 10-day change in diet wasn't enough to seriously affect subjects' bacterial ecosystems.
The different diet did affect the members within each ecosystem, though. Within 24 hours of starting their new diet, subjects' bacterial populations shifted noticeably. So while a temporary change in diet may not permanently alter your enterotype, it could significantly change the balance of different microbes within that enterotype.
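A toy version of that distinction, the enterotype label staying put while the community underneath it drifts, might look like this. The numbers are invented; the study tracked full sequence-based community profiles, not a single fraction.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented numbers: a subject who starts strongly Bacteroides-dominated, then
# begins a high-fiber diet. The Bacteroides fraction drifts down day by day
# but never drops below 0.5, so the dominant-genus label never flips.
days = np.arange(11)                                   # day 0 (baseline) through day 10
bacteroides_frac = 0.85 - 0.02 * days + rng.normal(0, 0.01, days.size)

enterotype = np.where(bacteroides_frac > 0.5, "Bacteroides", "Prevotella")
shift_from_baseline = np.abs(bacteroides_frac - bacteroides_frac[0])

print("Enterotype ever flipped?", len(set(enterotype)) > 1)              # stays False
print("Composition shift by day 10:", round(float(shift_from_baseline[-1]), 2))  # but the numbers moved
```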
In their dietary surveys, the researchers found a few other factors that were tied to the balance of microbes in subjects' guts, but not to a particular enterotype. BMI was one unsurprising example. Others included red wine and aspartame consumption--suggesting either that aspartame kills off gut bacteria, or that something has actually evolved to live on it.
The earlier study, which used a smaller sample size, found no link between enterotype and nationality. But a larger sample size might be able to capture connections between gut bacteria and nationality that reflect different countries' dietary traditions. The current paper references a study that compared European children to children in the African nation of Burkina Faso. While European children tended to have the Bacteroides (animal nutrients) enterotype, the African children were more likely to have the plant-related Prevotella enterotype, reflecting their high-carbohydrate diet.
Increasingly, connections are cropping up between human health--both mental and physical--and the microbiome. We know very little about our relationship with our gut bacteria, but untangling more connections like this one will help us understand how to stay on good terms with them. In the future, keeping ourselves healthy may mean keeping our internal ecosystems well-fed.
Wu, G., Chen, J., Hoffmann, C., Bittinger, K., Chen, Y., Keilbaugh, S., Bewtra, M., Knights, D., Walters, W., Knight, R., Sinha, R., Gilroy, E., Gupta, K., Baldassano, R., Nessel, L., Li, H., Bushman, F., & Lewis, J. (2011). Linking Long-Term Dietary Patterns with Gut Microbial Enterotypes. Science. DOI: 10.1126/science.1208344