Math Shows Today's Writers Are Less Influenced by the Past


When Charles Dickens wrote "It was the best of times, it was the worst of times," the immortal first words in A Tale of Two Cities, he can't have imagined that 21st-century computer scientists would parse his prepositions and pronouns as part of vast literary data sets. But today's researchers are studying the unimportant words in books to find important literary trends. With the meaty words taken out, language becomes a numbers game.

To see how literary styles evolve over time--a science dubbed "stylometry"--researchers led by James Hughes at Dartmouth College turned to Project Gutenberg. The site contains the full text of more than 38,000 out-of-copyright books. Researchers began their mining expedition by digging out every author who wrote after 1550, had a known date of birth and (when relevant) death, and had at least 5 English-language books digitized.

These criteria gave the researchers a set of 537 authors with 7,733 published works. But they weren't interested in every word of those books. Nouns and adjectives were out: No Kareninas or Lolitas, nothing nice or bad or beautiful, no roads or homes or people. Most verbs were out, except for forms of the utilitarian "to be." No one could speak or walk or "Fly, good Fleance!"


It may seem that the researchers were stripping all the information-containing words out of the sentences, and in fact that was their goal: "Content-free" words were all they wanted. The 307-word vocabulary that remained from the books was mostly prepositions, conjunctions, and articles.

This linguistic filler, the little stitches that hold together the good stuff, is known to contain a kind of authorial fingerprint. We may not think much about these words when we're writing or speaking, but scientists can use them to define our style.

Hughes and his team used computer analysis to score each author's similarity to every other author. They found that before the late 18th century, authors' stylistic similarity didn't depend on how close in time they lived to one another. (Each author was represented by a single year, the midpoint between his or her birth and death.) During this time period, authors who lived in the same generation didn't influence each other's styles much more than authors who lived hundreds of years apart.
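
To get a feel for what such a score could look like, here is a minimal sketch in Python. It is an illustration only: the tiny word list stands in for the paper's 307-word vocabulary, and cosine similarity over function-word frequencies is an assumed stand-in, not the statistic Hughes's team actually used.

```python
# Minimal sketch: score two texts' stylistic similarity from the
# relative frequencies of "content-free" words. The short word list and
# the use of cosine similarity are assumptions for illustration, not
# the measure from the PNAS paper.
from collections import Counter
import math

FUNCTION_WORDS = ["the", "of", "and", "a", "to", "in", "it", "was"]

def style_vector(text: str) -> list[float]:
    """Relative frequency of each function word in a text."""
    words = text.lower().split()
    counts = Counter(w for w in words if w in FUNCTION_WORDS)
    total = sum(counts.values()) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine_similarity(u: list[float], v: list[float]) -> float:
    """1.0 means identical function-word profiles; 0.0 means no overlap."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Example with two tiny stand-in "books":
a = style_vector("it was the best of times it was the worst of times")
b = style_vector("the clock struck and a figure appeared in the doorway")
print(cosine_similarity(a, b))
```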

But from the late 18th century to today, it was a different story. Stylistically, authors were more similar to their contemporaries than to other writers. By the late 19th century, writers closely matched the style of other writers who lived at the same time (at least according to the computers tallying up their non-content words). This influence dropped off beyond a window of about 30 years. In other words, authors who lived more than three decades apart from each other may as well have lived centuries apart, for all the similarity between their writing.

Looking at more recent books, that window of influence seems to become even tighter. Among authors from the first half of the 20th century, the similarity of style drops off beyond just 23 years.

Over time, authors have become more and more influenced by the other authors writing at the same time. The researchers say this may simply be due to the number of books published. In the early part of their dataset, there were few enough books around that a studious person could read, well, most of them. But as more and more books were published, contemporary books made up a larger share of what was available to read. Authors have filled more and more shelves in their libraries with books by their peers--and this has made them more likely to echo each other's styles.

Because Project Gutenberg relies on public-domain material, there weren't very many authors after the mid-20th century included in this study. Looking forward, "You would expect a continued diminishing of influence," says Daniel Rockmore, the paper's senior author. Contemporary books take up an ever greater portion of what's available to read. In addition to the huge number of books published each year (more than 288,000 in the United States in 2009), there are now e-books and e-readers and Japanese Twitter novels.

A century from now, we may be able to look back and see that today's authors had an ever-condensing frame of influence. Of course, by then literary styles might only last a week. Most books will be forgotten, but every author will be a revolutionary.

Hughes, J. M., Foti, N. J., Krakauer, D. C., & Rockmore, D. N. (2012). Quantitative patterns of stylistic influence in the evolution of literature. PNAS. DOI: 10.1073/pnas.1115407109


Image: Library of Congress from ep_jhu/Flickr

Baby Corn Plants Recruit Helpful Bacteria Posse


When you're a newly sprouted corn seedling, all alone in the dirt, you need any advantage you can get. After all, you can't pick up your roots and travel to find resources or avoid pests. That's why corn plants emit toxic chemicals that keep away hungry insects aboveground and harmful microbes below. But to at least one kind of bacteria, this poison is more of a beacon. They follow the toxic trail back to the corn plant, set up camp in its roots, and help the vulnerable seedling grow.

A plant's roots are the center of a miniature ecosystem called the rhizosphere. Local bacteria feed on sugars and proteins that trickle out of the roots, like antelopes at a watering hole. Symbiotic fungi enmesh themselves in the plant's roots. Helpless as it may appear, the plant can even release chemicals that encourage certain microbes to live there and discourage others, or prevent competing plant species from growing nearby.

Researchers in the United Kingdom studied one of the toxic chemicals released by the roots of corn plants. The compound is a benzoxazinoid, mercifully abbreviated as BX. Seedlings of corn and other grasses secrete BX molecules to protect themselves from pests and harmful microbes.

But the team, led by Andrew Neal at Rothamsted Research, suspected that certain bacteria weren't bothered by the toxin at all. Neal says this was a bit of a "leap of faith." Many bacteria that are used to clean pollutants from soil are closely related to bacteria that colonize plant roots. And some of the toxins that plant roots produce are similar to these pollutants. So the team asked whether Pseudomonas putida--"one of the best root colonizers we know of," Neal says--might be resistant to plant toxins.

The researchers first took both the plants and bacteria out of the soil to see what was going on. They found that corn seedlings produce the most poison at one week old, protecting themselves at their most vulnerable stage of growth. Over the next couple of weeks, production drops off.

Testing P. putida bacteria, they saw that the concentration of BX molecules around a seedling's roots didn't hurt the bacteria at all. But another common soil bacterium had serious trouble growing, even at a much lower concentration of the toxin. The chemical also broke down more quickly in the presence of P. putida, suggesting that the bacteria might not only tolerate the poison, but eat it.

Next Neal and his coauthors turned to the genes of P. putida to see which ones are most active when the toxic chemical is around. A few dozen genes popped up. Some of these had to do with "chemotaxis," a trick in which bacteria use their wiggly arms to travel toward a high concentration of a chemical they like. Were P. putida bacteria actively seeking out the toxin and the corn roots that released it?

Further experiments showed that the bacteria do, in fact, travel toward areas with more BX molecules. And in the soil, corn seedlings making the toxin attract more P. putida to their roots. (Genetic mutants that can't make BX molecules attract fewer bacteria.) The effect fades by the time the plant is three weeks old.

This is the first time scientists have seen an otherwise toxic root chemical attracting helpful bacteria. A corn plant that has successfully recruited P. putida has a leg up--or a root up--in its development. These bacteria and other friendly microbes keep harmful bacteria away by crowding them out and producing antibiotics against them. They also help the plant reach nutrients such as iron and phosphorus in the soil. The bacteria, too, have an advantage over other microorganisms in the area because they can tolerate the plant's toxin and may even eat it.

Neal says that through breeding, some crops have lost their ability to generate this chemical. "Modern varieties of cereals such as corn, wheat, barley, etc., now produce widely varying amounts of the benzoxazinones we studied," he writes. "Some produce quite a lot, others produce none." Neal hopes this research has shown why BX production is a helpful trait for plants to have.

Today's breeders, better informed about what goes on beneath the soil than their predecessors, may want to create new crop varieties that can once again make their own toxins. Plants that generate BX molecules can inhibit pests and diseases--and call friendly bacteria to their aid. We might be able to use fewer pesticides and fertilizers if we let our crops' bacterial helpers help us, too.

Neal, A., Ahmad, S., Gordon-Weeks, R., & Ton, J. (2012). Benzoxazinoids in Root Exudates of Maize Attract Pseudomonas putida to the Rhizosphere. PLoS ONE, 7(4). DOI: 10.1371/journal.pone.0035498


Image: Noël Zia Lee/Flickr

Life Advice: Think More about Death


The other kids on the school bus used to shriek when we stopped at my house. Or hold their breath. I lived directly across the street from a cemetery, and until I started riding the bus I had no idea this was supposed to be scary.

My parents claim that the realtor who sold them the house, perhaps out of desperation, told them people were "dying to move in!" I was aware of our deceased neighbors but unbothered by them. My dad explained how visitors to the Jewish graveyard put stones on top of the grave markers to show respect, so I pocketed small rocks on our walks there and left them on lonely-looking headstones. Friends came over for the best sledding in the neighborhood. (My baby sister, though, didn't immediately grasp the concept. One day while a funeral went on outside, my parents asked her what she was watching out the window. She answered, "All those people are lined up waiting for their turn to die!")

Psychologists have put a great deal of study into how reminders of mortality affect people. They call it "terror management," assuming that most people view death like those kids on the bus viewed my street: It's gross and we want to be far away from it. So we build up psychological protections for ourselves, which of course are not any more useful than holding our breath but make us feel better.

A new paper published in Personality and Social Psychology Review looks over the accumulated evidence and concludes that thinking about death can make your life better. Previous terror management research has focused on the dark side of our psychological protections: Psychologists say that reminders of death can make us more hostile toward people we see as outside our own group. But researchers led by Kenneth E. Vail III at the University of Missouri, Columbia, say the perks of morbid thinking are too great to ignore.

Conscious reminders of death can encourage people to stay healthy and pursue their goals. In various studies, subjects smoked less, planned to exercise more, and were more conscientious about sunscreen after being made to think about death.

When people were asked to list their goals immediately after answering questions about death, they placed more importance on what psychologists call "intrinsic" goals--those related to relationships and personal growth, for example, rather than wealth or attractiveness. But after a delay, they went back to those "extrinsic" (shallower) goals. People who were asked to do daily contemplations of mortality for a week also put greater importance on intrinsic goals.

Other studies have looked at what happens when people are primed subtly to think about death. In an experiment with an elaborate setup designed to seem accidental, subjects walked through a cemetery (or not) and overheard a person talking on a cell phone about "the value of helping." A little while later, subjects saw a second person drop a notebook. Those people walking through the cemetery seemed to be more receptive to the helpfulness hint, and were much more likely to stop and help the struggling passerby. Similar experiments got subjects to feed the homeless, donate to sick children, or help disabled people by reminding them of death.

Thoughts about dying may strengthen our bonds to others, too. Studies have found that after reminders of mortality, people feel more committed to their romantic relationships and strive more for intimacy. They're also more inclined to have children.

Tapping into the benefits of our fear of death, the authors say, could make people "more inclusive, cooperative, and peaceful." The downside of our psychological response to death is hostility toward outsiders. But as long as people view themselves as part of a larger community, thinking about our mortality can encourage us to clean up our acts. We may be more helpful to others, more committed to our relationships, more focused on healthy habits, and more thoughtful about our long-term goals.

The healthy side effects of dwelling on death inspired game designer Jane McGonigal to create a game called Tombstone Hold 'Em.* She's organized large-scale events in cemeteries around the world. During a game, competitors pair off and race around a graveyard to find the best "hand" made out of two tombstones. The game is built to tap into the psychological plus-sides of working with a group, running around outside, and--yes--thinking about dead people.

Not everyone believes turning a graveyard into a giant poker game is a good idea. Tombstone Hold 'Em stirs some controversy among people who think frolicking among one's deceased neighbors is disrespectful. But McGonigal thinks respect for the dead can come from positive experiences we have while we're around them.

Consciously or not, we took the same tack in my neighborhood, where families used to gather on top of the cemetery hill to watch Fourth-of-July fireworks. Parents stood between the tombstones while kids sat on top of them for a better view. The youngest ones got hoisted up to stand on tall stones, their parents' hands beneath their armpits for balance, so they could see the distant bursts. No one had to hold their breath.

Vail, K., Juhl, J., Arndt, J., Vess, M., Routledge, C., & Rutjens, B. (2012). When Death is Good for Life: Considering the Positive Trajectories of Terror Management. Personality and Social Psychology Review. DOI: 10.1177/1088868312440046


*I read about Tombstone Hold 'Em in Jane McGonigal's very interesting book about gaming, Reality Is Broken.


Image: Necropolis in Glasgow, Scotland, by me.

Dueling Birds Evolve New Egg Colors in Decades


"Arms race" might seem like too dire a phrase for what's essentially an egg-dying contest. But for the two bird species hurrying to outwit each other, it really is a matter of survival. The stakes in this colorful competition are so high, in fact, that they drive evolution at a pace that's rarely seen.

The two birds in question reside in Zambia. One is the tawny-flanked prinia, a petite warbler. The other is the cuckoo finch, a species that's not a cuckoo at all but shares a habit with that family of birds. Some cuckoos are known for sneaking their own eggs into other birds' nests for their unwitting adoption, a tactic called brood parasitism. The cuckoo finch is a brood parasite too. Its target: the prinia.

How difficult it is for an interloping bird to drop off its egg for permanent babysitting depends on its host. Some brood parasites leave giant, obviously mismatched eggs in other birds' nests. Their only precaution is to toss one of the host bird's eggs away in exchange. The hosts, perhaps not bright enough to notice the difference, tend the egg and eventually feed the oversized baby bird alongside (or instead of) their own.


The cuckoo finch faces a bit more of a challenge, though. Prinias are adept at spotting mismatched eggs and booting them out of the nest. Their species has evolved to lay eggs with a wide range of colors and markings, but each individual female lays just one kind of egg. This makes it hard for a brood parasite to match her distinctive eggshell style, and easy for her to notice any mismatched eggs in the nest. Raising a cuckoo finch would be a costly mistake, taking resources away from her own young. (And the adopted bird might evict its siblings from the nest altogether.)

Cuckoo finches, in turn, lay eggs in a variety of colors and patterns similar to the prinia's. They have to spread these eggs around and rely on the odds that at least some of them will match their borrowed nests. This is the only way for their offspring to survive.

Claire N. Spottiswoode and Martin Stevens at the University of Cambridge asked whether competition between the tawny-flanked prinia and its parasite has created an evolutionary arms race. In each generation, prinias with unusual-looking eggs should be better able to avoid brood parasitism. And only cuckoo finches that can make new eggs to match will reproduce.

The researchers gathered eggs from both species in the wild. They also studied preserved eggs from a museum, which had been collected 20 to 30 years earlier. Using spectrophotometry and computer modeling, they analyzed how egg colors and patterns had changed over the decades--not just to human eyes, but as a bird would see them.

Even in the span of those few decades, both birds' eggs had changed color. For example, Spottiswoode said, most of the cuckoo finch eggs from 30 years ago are reddish (at least to our eyes), but are bluish today. The prinia today more often lays olive-green eggs, apparently staying one step ahead of its parasite. (At the top of this page, you can see the range of prinia egg colors in the outside circle and the range of cuckoo finch eggs inside.)

Cuckoo finch eggs from the past are a better color match to prinia eggs from the past, and cuckoo finch eggs from the present day match modern prinia eggs. This tells us the color shift over the past few decades was almost certainly not random. Instead, the two species are evolving in response to each other at an amazingly brisk pace.

The color shift hasn't just been in one direction, though. Both host and parasite eggs have become significantly more varied over the past 30 or so years, as if each species expanded the Crayola box it was working with.

But there's no way the birds could have kept up this expansion over thousands or millions of years of evolutionary history. They would quickly run out of colors. The authors speculate that instead, the two species are locked in an unending cycle.

First the color palette expands, as is happening today. Host birds with bigger Crayola boxes have an advantage at escaping parasites, and parasites develop more egg colors to keep up. But once colors get sufficiently wild, host birds that lay average-colored eggs have more of an advantage. When all the parasitic birds are laying eggs at the edges of the color spectrum, they can't match host eggs that are in the center of that spectrum. As the cuckoo finch catches up and starts laying boring-colored eggs of its own, extreme colors once again become beneficial for the prinia, and the cycle starts over.

In another few decades, maybe the prinia and cuckoo finch will be narrowing their palettes again. Forget fossil records: A cold war between host and parasite can drive evolution that's fast enough for us to watch in real time.

Spottiswoode, C., & Stevens, M. (2012). Host-Parasite Arms Races and Rapid Changes in Bird Egg Appearance. The American Naturalist, 179(5), 633-648. DOI: 10.1086/665031


Images: Egg rainbow, Spottiswoode and Stevens. Brood parasitism by a cuckoo, Per Harald Olsen/Wikimedia Commons.

Human Dung Wins Interspecies Taste Test


A panel of experts in Nebraska has declared human dung more appealing than that of several other species. These experts didn't so much announce their decision as fall headfirst into baited poop traps while looking for a meal. Still, you won't find a more discerning group of judges than nine thousand dung beetles.

The various species of dung beetle that live together in the Great Plains region have evolved to consume, and share peacefully, its turd piles.* Some species are specialists, preferring one animal's feces to any others. Others will eat anything that falls their way. Two entomologists--Sean Whipple at the University of Nebraska, Lincoln, and W. Wyatt Hoback at the University of Nebraska, Kearney--set out to mess with the balance between these dung beetle species.

The researchers set traps all over a large organic cattle ranch. Each trap consisted of a large bucket sunken into the ground with a pile of dung at the bottom. The bait came from 11 different species that included carnivores, herbivores, and omnivores. Some of these dung flavors were ones the beetles might encounter regularly: bison, moose, cougar. Others were "exotic," from animals that don't normally leave their excrement around Nebraska: zebra, lion, human.

(The animal feces came fresh from a local zoo. As for the human specimens? "I will say this much," Sean Whipple said when I asked. "It is difficult to find volunteers for a study such as this.")

Human and chimpanzee feces were the clear winners of the popularity contest. Almost 9,100 dung beetles stumbled into the authors' traps, belonging to 15 different species. The beetles in the human and chimp dung buckets far outnumbered any of the rest.

Different beetle species preferred different types of dung, which explains how they can all share resources normally. But overall, omnivore dung was the favorite. Pig dung, while not as wildly popular as human or chimpanzee, still attracted a lot of beetles.

Whipple says the scent of omnivore dung is especially alluring. "Previous research has shown that the dung of omnivores is generally more attractive than that of herbivores, likely as a result of odor," he wrote in an email.

Incidentally, if the dung beetles had gotten a chance to eat all that sweet-smelling human dung, it would have been a good choice nutritionally. Chemical analysis showed that the human dung had the highest nitrogen content, a measure of "dung quality," Whipple says.

Herbivore dung wasn't only less popular than omnivore dung. It was also beaten out by carnivore dung. And the most widely ignored droppings came from bison--a species that local dung beetles evolved alongside, and that would have provided much of their diet just a century and a half ago.

Like five-year-olds who are bored with their green beans and would like some dessert already, please, most dung beetle species in this study were eager to switch from their usual plant-based poops to something new and exciting. They're not likely to start encountering a lot of human or chimpanzee feces on their Nebraska ranch. But a non-native animal that's introduced to an area where dung beetles live (and they live all over the world) could upset the balance between its native inhabitants. Beetles might start competing for the exotic food source, for example, or ignoring piles of poop they would ordinarily clean up.

If you're wondering what makes our own species' dung so appealing, the authors say diet doesn't seem to be a factor. Among the zoo animals whose dung they used, all the carnivores were fed the same diet, and so were the herbivores. But the dung beetles preferred some carnivore or herbivore dung to others, suggesting there's more to poop flavor than the food it started out as.

Though the human subjects may have eaten different diets, their specimens were "thoroughly mixed to ensure homogeneity," Whipple says. Now that's appetizing.

Whipple, S., & Hoback, W. (2012). A Comparison of Dung Beetle (Coleoptera: Scarabaeidae) Attraction to Native and Exotic Mammal Dung. Environmental Entomology, 41(2), 238-244. DOI: 10.1603/EN11285


Image: icadrews/Flickr


*Yes, I realize that three out of the last four posts here have involved poop (hyena, penguin, and that eaten by dung beetles). Apparently I'm in a dung rut. At least it's not a dung bucket.

Space Census Finds Extra Penguins, Poop


Playing what might have been the world's most tedious game of Where's Waldo?, scientists used photos taken from space to count all the emperor penguins in Antarctica. They found more than a hundred thousand birds that hadn't been spotted before. The news may affect the penguins' fate in a warming world. Besides, what's a better surprise than extra penguins?

Researchers from several institutions, including the British Antarctic Survey in Cambridge, undertook the emperor penguin space census. They thought previous penguin counts might not be accurate. For one thing, the last estimate of the Antarctic penguin population is almost 20 years old. For another, humans can't easily travel very far from their Antarctic research bases to seek out half-frozen bird huddles. So penguin colonies that are farther out in no man's land might have never been spotted by people.

Thanks to emperor penguins' habit of clumping together in giant colonies during breeding season--and their convenient lack of camouflage against the snow--the researchers knew high-resolution satellite photos should reveal the penguins. They used images from all around Antarctica's coastline, where penguin colonies camp out. Forty-six colonies appeared, including several that hadn't been counted before.

A penguin colony on the Antarctic coastline, spotted from above.

After zooming in on each colony and sharpening the images, the researchers used computers to count the penguins one by one. The challenge was for the computer to decide which dark pixels represent penguins, rather than shadows on the snow--or penguin poop. Author Peter Fretwell explains that in this method, "you 'train' the computer to recognize the pixels that are penguin, guano, snow or shadow by giving it sample pixels. The computer then goes away and splits the whole image into each pixel type."
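
The paper doesn't include code, but what Fretwell describes is standard supervised pixel classification. Here's a minimal sketch in Python, assuming scikit-learn is available; the sample pixel values, the random test image, and the choice of a random-forest classifier are all invented for illustration and are not the authors' actual pipeline.

```python
# Toy sketch of supervised pixel classification in the spirit of the
# method Fretwell describes: hand-label a few sample pixels, train a
# classifier, then let it sort every pixel in the image. All data here
# is made up for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

CLASSES = ["penguin", "guano", "snow", "shadow"]

# Hand-labeled sample pixels: rows of (red, green, blue) values.
train_pixels = np.array([
    [30, 30, 35],     # penguin (dark plumage)
    [120, 90, 60],    # guano (brownish stain)
    [240, 245, 250],  # snow (bright white)
    [90, 95, 110],    # shadow (dim blue-grey)
])
train_labels = np.array([0, 1, 2, 3])  # indices into CLASSES

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(train_pixels, train_labels)

# "The computer then goes away and splits the whole image into each
# pixel type": classify every pixel of an H x W x 3 image at once.
image = np.random.randint(0, 256, size=(64, 64, 3))
predicted = clf.predict(image.reshape(-1, 3)).reshape(64, 64)
penguin_pixels = int((predicted == 0).sum())
print(f"{penguin_pixels} pixels classified as penguin")
```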

Zooming in on a penguin colony and sharpening the image. I think I found the guano.

As long as the images have a high enough quality, Fretwell says, this technique is "usually quite accurate." Where the satellite pictures were more shadowy, penguin counts would be a little less certain. For some of the colonies, though, researchers were able to check their numbers against estimates others had made from the ground or from aerial photography.

And then there were the missing penguins. All the satellite images were taken during the breeding season, when emperor penguins congregate to create adorable new baby penguins. The new parents take turns babysitting: While one penguin takes care of the chick, the other goes out to sea and swallows lots of fish to regurgitate later. While the chicks are small, they spend most of their time balancing on top of their parents' feet to keep warm. Once the chicks are old enough to walk around on their own, both parents may leave to forage.

So for every individual counted in a satellite photo, the authors assumed there was a hidden chick and a second adult at sea hunting for food. (They were only interested in counting breeding adults, not the chicks, most of which will die.) Later in the season, some of the penguin pixels may have been kids instead of adults. But since a young penguin standing on the ice probably has two parents away foraging, the researchers figured that pixel still stood for two adult penguins.
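
Put as arithmetic, the assumption is simple: every bird visible in an image, whether an adult minding a chick or a chick whose parents are both at sea, stands for two breeding adults. A back-of-the-envelope sketch, with an invented colony size:

```python
# Sketch of the counting assumption described above. The colony size
# is made up for illustration; this is not the authors' full model.
def breeding_adults(visible_birds: int) -> int:
    """Each visible bird (adult with a hidden chick, or a lone chick
    with both parents foraging) is assumed to represent two adults."""
    return visible_birds * 2

# e.g., a colony with 1,500 birds visible from space:
print(breeding_adults(1500))  # 3000 estimated breeding adults
```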

The final count was about 595,000 adult emperor penguins in all of Antarctica. That's roughly the (human) population of Milwaukee. It's also substantially higher than the last estimate, which put the population between 270,000 and 350,000 adult birds.

The census could easily have overestimated or underestimated the true number of penguins. But, Peter Fretwell says, "The main thing is that this gives us an initial benchmark from which we can monitor emperor penguin numbers in the long term."

As climate change tightens its grip on every part of the globe--all the way to the poles--penguins will certainly see some changes around them. The sea ice along the coastlines they inhabit will disappear; shifting food webs may make their prey scarcer; and severe storms might become more frequent. Knowing how many emperor penguins are there now, and where to find their colonies, will help scientists monitor how the species is coping with the changes. We might even be able to keep them from becoming harder to find than Waldo.

Fretwell, P., LaRue, M., Morin, P., Kooyman, G., Wienecke, B., Ratcliffe, N., Fox, A., Fleming, A., Porter, C., & Trathan, P. (2012). An Emperor Penguin Population Estimate: The First Global, Synoptic Survey of a Species from Space. PLoS ONE, 7(4). DOI: 10.1371/journal.pone.0033751


Image: Close-up penguins from Hannes Grobe/AWI/Wikimedia Commons; satellite images from Fretwell et al.


Note for British readers: You may know Waldo as Wally.

Google Searches Give Away a Country's GDP


Anytime we travel through the Internet we leave piles of data behind us, like Pigpen shedding his cloud of filth. It's too bad if you're concerned about privacy. But if you're a mathematician, that heap of dirt is more like a goldmine, and digging into it can turn up unexpected nuggets. A study of worldwide Google searches, for one thing, reveals that people in wealthier nations think less about the past.

Google collects data on what search terms people around the world are using. Researchers who want to use this data to compare search terms across different countries are usually restricted to places that share a language. But the authors of a new paper in Scientific Reports got around that problem by looking only at numerical search terms.

"We realized...that years represented in Arabic numerals are an almost universal written representation," author Helen Susannah Moat wrote in an email. By looking only at search terms such as 2011 or 2010, she and her coauthors could compare search data from nearly the whole globe.

"It seemed a logical first step to consider to what extent Internet users were searching for dates in the future compared to dates in the past," Moat says. For example, looking at data from 2010, the researchers compared searches including 2011 to those including 2009. The ratio of forward-looking to backward-looking searches in each country became its "future orientation" score.

The authors culled data from 45 countries with substantial Internet-using populations. Then they sorted those 45 countries by GDP ("also the most obvious variable," Moat says). A clear pattern popped out of the numbers: Countries with lower GDPs had lower future orientation scores, and vice versa. People in poorer countries did more searches concerning the previous year; those in wealthier nations searched more for the next year. The trend was strong, and it held up in data from 2009 and 2008 as well.

Countries with the lowest future orientation scores included Pakistan and Vietnam, where previous-year searches outnumbered next-year searches by a factor of three or four to one. In the United States and Canada, countries toward the higher end in future orientation, searches for the last year and the next year were roughly equal. Switzerland, Australia, and the United Kingdom were among the most forward-looking countries of all.

"One of the possible interpretations of our results," Moat writes, "is that a focus on the future supports economic success." In other words, populations that are more forward-thinking become wealthier. This up-by-the-bootstraps explanation doesn't seem like the simplest one, though.

Another possibility is that populations with more money and leisure time can afford to spend it thinking about the future. A person in a wealthier nation might search online for next year's concert tickets, dates of work holidays, or when the new iPad is coming out. Someone without disposable income, though, might not have many such events to look forward to.

Here's some good news for people in all nations: Google Trends is available online for aspiring data analysts to play with. Panning for gold in its graphs won't cost anything except your free time.


Preis, T., Moat, H.S., Stanley, H.E., & Bishop, S.R. (2012). Quantifying the Advantage of Looking Forward. Scientific Reports, 2, 350. DOI: 10.1038/srep00350

Hyenas Fast During Lent Too



Carnivores that shape their lives around humans may find themselves following human calendars. And that includes our religious observances. In Ethiopia, spotted hyenas eat meat scraps left by humans for most of the year. But when those humans go vegan for Lent, the hyenas become hunters.

You might not expect dietary discretion from spotted hyenas, as they're possibly the world's least picky eaters. They're just as happy to scavenge on found carcasses as to kill their own meat. They've been observed devouring all kinds of mammals, birds, fish, and reptiles--not to mention garbage, dung, and carcasses infected with anthrax spores. The hyena's digestive system even handles bones without a problem. Hair and hooves are the only remains left after a thorough hyena meal.

But in northern Ethiopia, where populations of their natural prey are severely depleted, spotted hyenas rely on humans for food. Not that they eat the humans themselves. Hyenas scavenge animal remains that Ethiopians dump outside their compounds, and since they stay away from the people supplying those carcasses, human and hyena tolerate each other's company. (At a veterinary school, humans even count on hyenas to keep the campus clean.)

There's just one hitch for the hyenas. The population in this part of the country is primarily Orthodox Christian. The Ethiopian Orthodox Tewahedo Church dictates fast periods throughout the year, the most prolonged of which is the eight-week Lent. During this period, Christians give up all meat, dairy, and eggs to follow a vegan diet.

Researchers led by Gidey Yirga at Ethiopia's Mekelle University set out to see how the Lenten fast affected local hyenas. At sites around Mekelle, they collected hyena scat on the first day of Lent (representing what hyenas ate beforehand); the last day of Lent (for what they ate during the fast); and 55 days later (after a return to their normal diet). By digging animal hairs out of the hyenas' feces, researchers could identify the species that had made up their meals.

A wide variety of animals were represented in the hyenas' dung, including livestock such as sheep, goats, donkeys, cattle, and horses. (In the study areas, livestock outnumber humans.) And the hyenas' diet had significantly changed during Lent. With their usual supply of leftovers lacking, hyenas' scat showed that they had feasted on donkey meat.

The authors explain that live donkeys are an easy target for hungry hyenas because, unlike other livestock, their owners leave them outside the compound at night. Additionally, "weak donkeys are abandoned altogether, which makes them a relatively easy food source."

With plenty of hapless donkeys around, the hyenas weren't exactly facing hardship during Lent. But they adjusted their behavior with impressive ease. During Lent, the hyenas became active hunters, taking down live animals to feed on. After Lent, they returned to scavenging trash.

The flexibility of predators such as hyenas means that removing a food source might be a good way for humans to change animals' behaviors. The hyenas in this study don't harass people too much. But when the predators hanging around your city (or your livestock) are lions, it can help to know how to get them on a different diet.

Hyenas' adaptability also means humans aren't the only ones that can follow a different diet from one calendar period to the next. But outside of the fasting seasons, hyenas are probably relieved to get back to the comfort of a pre-killed meal. And when Easter arrives, the donkeys might be the happiest of all.

Yirga, G., De Iongh, H., Leirs, H., Gebrihiwot, K., Deckers, J., & Bauer, H. (2012). Adaptability of large carnivores to changing anthropogenic food sources: Diet change of spotted hyena (Crocuta crocuta) during Christian fasting period in northern Ethiopia. Journal of Animal Ecology. DOI: 10.1111/j.1365-2656.2012.01977.x


Image: Rob Willock/Flickr

Dinosaur Age Not Dramatic Enough? Add Fire




As if a world dominated by hungry, house-sized lizards weren't sufficiently exciting, scientists have added another set piece to our image of the Cretaceous: raging wildfires.

The Cretaceous period, which ended about 65 million years ago with the extinction of the dinosaurs, was hot. That's thanks to volcanoes that pumped carbon dioxide into the atmosphere and created a greenhouse effect. Researchers from London and Chicago now say it was also a "high-fire" world. Frequent blazes may have kept animals on the run, created some of the fossil beds we study today, and helped determine which plant species survived into the next era.

Led by graduate student Sarah Brown from the Royal Holloway University of London, the researchers tracked the appearance of charcoal in ancient sediments. Like a set of sooty footprints right through the fossil record, the charcoal evidence showed when and where fires had occurred.

The team saw that wildfires had increased during the Cretaceous period. These fires were probably sparked by lightning, and their flames were fanned by the high concentration of oxygen in the ancient atmosphere. Today, oxygen makes up about 21% of our air. But during the Cretaceous, it may have risen as high as 25% or more.

This high oxygen content, the authors say, would have allowed plants to burn without being bone dry. A spark in a green forest, instead of dying out as it would today, might become a full-blown fire.

Brown and her coauthors did not find any evidence that these fires contributed to killing off the dinosaurs. But they note that after a fire burns through a piece of land, erosion is likely. There may be rapid flooding or mudslides. In the Cretaceous, these events might have trapped and killed dinosaurs and other animal life--and helped preserve their bones.

The authors point to certain fossil beds that lie in floodplains and contain charcoal, as well as plant and animal remains. These could be sites where wildfires triggered flooding, conveniently sweeping lots of informative fossils into one place for future scientists to find.

Charred plant remains in these fossil beds provide another clue about the effect of fire. As the Cretaceous went on, the types of plants being fossilized gradually changed. Flowering plants, called angiosperms, became more and more common. Gymnosperms--the more ancient, flowerless species such as cone-bearing trees, cycads, and ginkgos--faded into the background.

A charred flower fossil from the Late Cretaceous.

Frequent fires may have given an added edge to the angiosperms. The new types of plumbing these plants had invented let them grow faster and more efficiently. Rather than trees, the flowering plants growing during the Cretaceous seem to have been weedy and shrubby types. After a fire, they could regrow faster than the gymnosperms. And their new growth provided fresh fuel for wildfires, creating a cycle that encouraged the growth of flowering plants and left older models in the dust.

Though fire didn't do in the dinosaurs, then, it may have helped set the stage for the dominant plants of the modern age. (As if we needed any more drama.)

Brown, S., Scott, A., Glasspool, I., & Collinson, M. (2012). Cretaceous wildfires and their impact on the Earth system. Cretaceous Research. DOI: 10.1016/j.cretres.2012.02.008


Images: Gorgosaurus from Nobu Tamura/Wikimedia Commons; flower fossil from Brown et al.