The One Funny Thing

Most science is not hilarious. Science is slow, it involves repeating the same action tens or hundreds or thousands of times, and it often investigates deeply unfunny subjects such as cancer or polymer synthesis. That's why the rule of the One Funny Thing is such a relief.

The rule goes like this: If you read a paper all the way through, top to bottom, you have a good chance of finding something funny. It's usually hiding in the Methods section. Now, I can't vouch for the aforementioned cancer and materials science studies, and I admit the humor in most astronomy papers eludes me. But I always feel reassured to find something silly tucked away in an otherwise serious, peer-reviewed paper.

What follows is a collection of funny things. Some of them are from studies I've written about on this blog, in which case you can follow the link to get to the original story. Others are studies I've read for work. Going forward, when I cover a story that follows the rule, I'll label it with the "One Funny Thing" tag. Clicking that tag will take you back to this page, where I'll add the new tidbit at the top of the list.

Good to Know
On recruiting volunteers for a study about diet and gut bacteria:
To be eligible, participants were required to be free from any chronic gastrointestinal disease, cardiac disease, diabetes mellitus or immunodeficiency diseases, to have a normal bowel frequency (minimum once every 2 days, maximum 3 times per day)...

Sniffin' Sticks
To search for a correlation between people's olfactory abilities and the functioning of a certain brain area, researchers in Belgium started by giving their subjects a sniff quiz.
Orthonasal olfactory function was assessed by means of the standardized "Sniffin' Sticks" test. In this evaluation, odors are presented using felt-tip pens containing a tampon filled with four milliliters of liquid odorants. 
I wonder if the liquid odorants were blue, as in every Kotex commercial.

Mouse Refusal
In a study of people's intuitive geometric knowledge, researchers gave a group of five- and six-year-olds a test on a computer. 
Two additional children were excluded from the final sample in the lines intuition task, one because he was attending second grade despite his young age and the other because he refused to use the mouse of the computer.

Un-amenable Ostriches
Scientists studied the locomotion of ostriches by sticking motion-capture balls all over the birds' bodies. They started with five baby ostriches, reared them by hand, and spent eight months training the birds three to four times a week. After all this, only
[T]wo animals were amenable to the procedures required for full three-dimensional gait analysis.
Three out of five birds, that is, were totally untrainable. This might explain why more ostrich research isn't done. I've watched Animal Cops Houston. An ostrich that is not amenable to something will let you know by kicking you in the head.

Adverse Events
This is almost cheating because the whole paper is so funny. The authors asked whether wearing socks over your shoes made it safer to walk down an icy hill (and won an Ig Nobel award for their efforts). First, the researchers recruited passing university students:
In light of the observed behavior of pedestrians (often young men) at these sites on previous mornings, participants were asked to refrain from deliberately skidding or sliding.
As participants made their way down the hill, with or without external socks,
Assessors were also asked to document any falls and to comment on the demeanor of the participants during their descent (for example, "walked confidently," "clung to fences or parked cars," "crawled").
Finally, the authors conclude that the socks did help.
The only adverse events were short periods of embarrassment for the image-conscious.

Expelling the Plug
It's believed that homing pigeons navigate partly by smell. To find out whether the birds depended more on one nostril than the other (they did; it was the right), researchers took on the task of plugging pigeon nostrils. 
The evening before the experimental releases, one nostril of each of the...pigeons was plugged. The plugs were made with a small amount of paste (Xantopren®, Heraues Kulzer, Hanau, Germany), which turns into a solid rubbery plug after insertion into the nostril. The plugs were removed once the pigeons homed.
The pigeons may have objected.
From our preliminary observations, the pigeons are able to expel the plug within a few days.


Cautious Conclusion
A study of high-schoolers found that daylight saving time might negatively affect SAT scores. (Some test dates were immediately after clock changes, and teenagers like to sleep a lot.)
The cautious conclusion is that the daylight-saving time policy should possibly be even more controversial.
Yes, I'll say that's a cautious conclusion.

Rapid Scissoring
From a paper on whether lobsters can recognize each other before deciding to fight comes a handy scale of lobster aggression. On one end of the scale is "Retreat." Levels of aggression progress through
Rapid and direct head-first advance towards opponent(s) without hesitation, often with claws outstretched
to
Rapid scissoring motion with both claws at opponent
and finally
Contraction of the abdomen to propel animal backwards in an attempt to rip off opponent's appendage.
The authors reassure us:
The experiments comply with the current laws of Italy, the country in which they were done...We intended to separate the lobsters and consider the observation over if fights appeared to escalate to potentially damaging levels.
After the experiment, they froze all the lobsters to death anyway.

No Objective Way
A story about skunks and other stink-spraying animals also contained a helpful scale, this time measuring the degree to which an animal uses its anal gland secretions for defense, but I'll spare you the details. The authors quantified every possible aspect of each animal's coloration, habitat, and behavior. Some things couldn't be measured, though.
It is worth noting that several boldly colored species are quite pugnacious...however, we were unable to create an objective metric for ferocity. 
Similarly,
Although species that use anal gland defenses also vary in the noxiousness of the secretion, there was no objective way to score this.
I wish they had tried.


Jungle Geometry: Who Needs Euclid?

At some point in your teenage years, you probably kept a compass and straightedge in your backpack, learned the ways to prove two triangles are congruent, and knew what a secant was. It all had a taste of the classical about it: Euclid, Archimedes and Pythagoras had figured everything out and passed it down to us. But geometry may be more democratic than it seems. As a group of native Amazonians showed, you don't need to go to school to understand Euclid.

French researcher Veronique Izard and her colleagues wanted to know if an understanding of Euclidean geometry is intuitive. It makes sense for humans and other animals to have a basic sense of shapes and distances, so we can find reachable fruits and flee approaching predators. But our eyes often deceive us. So do children, or remote tribespeople, instinctively understand that two parallel lines never cross? Or how many points define a line?

The researchers traveled to the Amazon and recruited children (ages 7 to 13) and adults from a group called the Mundurucu. They had no education in geometry, and their language doesn't include any words to describe concepts such as parallel lines or right angles. But the Mundurucu face challenging navigational tasks every day, just moving around their environment. The researchers quizzed them on basic Euclidean tenets.

Instead of points and lines, researchers described villages and straight paths. They asked two sets of questions, one concerning the geometry of a plane (described as a flat world that extends forever) and the other about a sphere (a "very round world"). For a visual aid, they used either a tabletop or half a calabash.


Participants were also shown two corners of a triangle and asked to demonstrate, with their hands, what the missing corner would look like and where it would be.

The Mundurucu did great on their geometry quiz. The children performed just as well as the adults, and overall the Mundurucu did almost as well as American adults and French children who took the same quiz. All groups did better on questions about a flat plane than questions about the surface of a sphere, maybe because the former is more similar to what we observe in our daily lives.

To find out whether this kind of knowledge is truly innate, or something that develops over time, the researchers repeated the quiz with American kids just 5 and 6 years old. The kids did OK, but not as well as older children or adults. They especially had difficulty completing the triangles.

The results suggest that we're not born with an understanding of geometry. Rather, we learn as we grow how angles and lines work in the world. It would be interesting to see how another untrained group, one with less navigational experience than the Mundurucu, would handle the same questions. If a person grows up in a static and unchallenging environment, does he or she have a less intuitive grasp of distances and perspectives? Might the laws of the world be a little more mysterious?

Some of the questions the Mundurucu correctly answered had to do with abstract ideas, such as infinitely extending lines. This showed that they weren't just describing basic physical relationships they'd observed, but extending their knowledge of the world to larger mathematical concepts. Euclid may have come up with the terms and the postulates, but the Mundurucu show that anyone at all, using their eyes and their understanding, could have invented geometry.


Izard, V., Pica, P., Spelke, E., & Dehaene, S. (2011). Flexible intuitions of Euclidean geometry in an Amazonian indigene group. Proceedings of the National Academy of Sciences, 108(24), 9782-9787. DOI: 10.1073/pnas.1016686108

This post was chosen as an Editor's Selection for ResearchBlogging.org

The Stink Wars

It's not just skunks. Several other scrappy, medium-sized mammals can spray you with bad-smelling liquids from their anal glands. But they're not keeping it a secret: These animals have evolved certain signals that warn you and other potential predators to stay away (especially from the back end). If you know the signs, you can make sure to keep on the good side of any furry creatures you meet.

Striped skunk.

Providing more fodder for the theory that people are drawn to subjects resembling their own names, Theodore Stankowich of the University of Massachusetts, Amherst, led a study on skunks and similar creatures. He wanted to know whether a skunk's bold black-and-white pattern, which certainly doesn't provide camouflage, is actually a warning signal to bigger animals that might eat the skunk. 

Poisonous and nasty-tasting insects, frogs, and snakes often use vivid colors to advertise their unpalatability. Don't taste me, the monarch's black-and-orange wings say, and no one gets hurt. Even though we furry mammals have fewer colors to work with, do some of us warn off predators in the same way?

Spotted skunk.

Stankowich and his colleagues gathered data from 188 species of land-dwelling carnivores. They found that boldly patterned or two-toned creatures are, in fact, more likely to use anal gland secretions to defend themselves. (Though they don't all spray, many carnivores use these secretions for communication or for marking their territory. You may have yelled at your cat or dog for this sort of behavior.) Some animals merely dribble out their foul-smelling defense, while others are able to aim and spray it. Beyond bold patterning or two-toned fur, horizontal stripes are a good indicator that an animal not only can spray you but also has good aim.

"Although species that use anal gland defenses also vary in the noxiousness of the secretion," the authors note regretfully, "there was no objective way to [measure] this."

Badger.

Other signals turned up that weren't directly about anal spraying. Carnivores with stripes on their faces (though the rest of the body may be plain) tend to live in burrows. A hungry predator that sticks its head into a burrow and sees a boldly striped face staring back would do well to try a different hole. Many of these creatures are also pretty feisty, though the researchers were, again, "unable to create an objective metric for ferocity." A badger is both mean and able to spray you with stink.

Genet.

The researchers speculate that bold stripes might draw a predator's attention to the location of the anal glands, in case it's not getting the hint already. White markings down the back are often found in nocturnal sprayers, which gives predators approaching from above in the dark a chance to rethink their actions.

Grison.

Most of the animals that use this warning coloration are stocky and live in exposed habitats. When threatened, they can't spring into a tree or river for safety. But their coloration is what's known as an "honest signal": a code that accurately represents a trait (in this case, nasty spraying).

Why don't other animals copy this warning coloration? Sometimes they do. For example, the harmless king snake has red, black and yellow stripes that strongly resemble the venomous coral snake's. We humans use a rhyme to tell them apart ("If red touches yellow, you're a dead fellow!") but other animals presumably have to guess.

This is a dangerous game for a mimic, though. Potential predators need to learn by experience how the coloration code works. That means that an animal's bright colors or stripes won't always prevent a predator from taking a bite out of it. Stink-spraying mammals often have tough, loose skin to help them squirm out of a predator's mouth, or are scrappy fighters. If you don't have any defenses to back you up, you'd be better off hiding from predators altogether.

Leave me out of this. I'm a fluke.

The stripes-and-stink pattern isn't just a quirk of skunk relatives. Instead, it seems to have evolved separately several times within the carnivorous mammals. That means it must be a winning strategy in the wars between predator and prey. As for humans, our best strategy is to leave a wide buffer zone.


Images: Striped skunk Arlington Animal Services; spotted skunk NPS; badger USFWS, genet Wikipedia/Xesko; grison Wikipedia/Tony Hisgett; panda Wikipedia/J. Patrick Fischer.

Stankowich, T., Caro, T., & Cox, M. (2011). Bold coloration and the evolution of aposematism in terrestrial carnivores. Evolution. DOI: 10.1111/j.1558-5646.2011.01334.x

How Farming Made Us Shorter

We usually think of farmers as sturdy, Midwestern types who raise their ruddy-cheeked children on a balanced diet of eggs, potatoes, and chores. A study from researchers at Emory University, though, suggests that our farming ancestors weren't the picture of health. When humans transitioned from hunting and gathering to farming and living in cities, the authors say, they became malnourished and more prone to disease. Oh, and they were shorter.

Scientists use height as a rough yardstick of a population's health and nutrition. As an individual, your potential height comes from your parents' genes. But whether you reach that potential has to do with how healthy you are as a child--are you getting the right nutrients? Fighting diseases? And the average height of a population tells scientists roughly how healthy that population is. The tallest people on Earth today live in the Netherlands.

To assess the health of various prehistoric populations, the Emory researchers pooled data from several previous studies of ancient bones, then examined how people's heights changed as their populations transitioned to agriculture. Farming first appeared around 10,000 years ago in the Middle East, then spread around the globe, sometimes cropping up (ahem) independently. The populations included in this study ranged from Chinese groups living 9,000 years ago to North Americans of the past thousand years.

In general, the authors say, populations tended to get shorter as they transitioned from hunting and gathering to agriculture. Some bones provided evidence of malnutrition, anemia, and poor dental health. Why would farming make people sick? For one thing, relying on a smaller variety of food sources could lead to malnutrition, if crucial nutrients were missing from a farmed diet. Food supply depended on the seasons, and groups had to store enough food to last through the winter. A drought or infestation meant that the whole community went hungry. And since people were living in bigger, denser communities, infectious diseases could spread more easily.

The researchers acknowledge that several studies within the larger group they looked at did not find a short-farmer effect. Those studies found that height stayed the same, or even increased, when populations made the move to agriculture. The effect may have depended on the resources available in an area; maybe populations that could grow a greater variety of foods avoided a health decline. In some areas, height initially decreased but then increased over subsequent generations.

If farming were really a worse survival strategy than hunting and gathering, it couldn't have persisted. Rogue groups of humans who lived outside of the community and gathered their own food would have outcompeted their city-dwelling, farmer neighbors. Instead, farming became the norm. So this lifestyle--organizing ourselves into communities, sharing resources, dividing labor, domesticating crops and animals--must have provided a net gain in our well-being. Even if it initially made us a little more sickly, it allowed our populations to grow and spread.

You might interpret these findings as evidence that you should take up a "caveman diet." If so, there are plenty of books and websites out there to help you; they generally recommend starving yourself and eating a lot of nuts and meat. I'd recommend looking into local hunting laws before you start shooting your own squirrels and pigeons. (Of course, if you're on a true caveman diet, shooting is cheating.) This book even comes with a measurement conversion table, in case you're not sure how many ounces are in a skull cup.

Farming may be a relatively new development in human history, but that doesn't mean we're not built for it. For example, if your ancestors came from a dairying culture, such as those of northern Europe or eastern Africa, you probably drink milk and eat ice cream with no problem. This isn't the caveman way: in our ancient ancestors, as in humans in most parts of the world today, the enzyme that breaks down the sugar in milk (lactose is the sugar, lactase is the enzyme) faded away as humans grew out of early childhood. But the tendency to hang on to lactase has evolved at least twice since we started keeping dairy animals. Drinking a domestic animal's milk must have given those populations a serious evolutionary advantage for the lactase-keeping trait to spread so well. So is it "unnatural" to drink milk as an adult? My genes say no, though yours might say something different. It may not be the caveman way, but we're not cavemen anymore; we're farmers.


Mummert, A., Esche, E., Robinson, J., & Armelagos, G. (2011). Stature and robusticity during the agricultural transition: Evidence from the bioarchaeological record. Economics & Human Biology, 9(3), 284-301. DOI: 10.1016/j.ehb.2011.03.004

This post was chosen as an Editor's Selection for ResearchBlogging.org

Dogs Defeat DNA


Planning on committing a crime anytime soon? You'd better be careful not to leave your DNA behind. If crime scene investigators can collect any hair, skin cells, blood, or other bits of you from the crime scene, they'll have a pretty convincing case against you once you're in custody. Unless, of course, you have an identical twin. If that's the case, commit all the crimes you want, because there is absolutely no way for scientists to tell the difference between your DNA and your twin's.

According to a new study from the Czech Republic, though, a dog can go one better than a DNA technician. A group of trained German shepherds were able to reliably tell apart the scents of identical twins.

The study used four sets of twins, two identical and two non-identical. The identical twins were 5 and 7 years old, and the others were 8 and 13. The researchers used children because they wanted to give the dogs the greatest challenge possible: telling apart two people with identical genes who live in the same environment and eat the same food. As twins get older and develop different eating habits, move into different homes, and perhaps face different health problems, their personal scents will presumably grow apart. But identical twin children are just about as similar as two humans can get.

The German shepherds were police dogs trained in scent matching. This is a forensic technique used in a few European countries such as Russia, Poland, and Denmark, but not used in the United States. A dog is given a scent sample from the crime scene to sniff. Then the dog is led to a scent line-up: a row of seven identical glass jars, each holding a piece of cotton. The dog sniffs every jar, and if it finds one that matches the crime scene scent, it alerts its handler by lying down next to that jar.

Ten dogs participated in the study. Each trial was like the normal scent line-ups the dogs were used to. A dog sniffed a piece of cotton, then checked for a match among seven possibilities. Every dog consistently matched the humans to their scents, regardless of whether the human was an identical twin. If a dog sniffed Twin A and Twin A was in the lineup, the dog chose that jar. But if a dog sniffed Twin A and only Twin B was in the lineup, it walked right past. They never made a mistake.

(In case you're wondering how scientists bottle a person's smell, the answer is: belly skin. The kids held scent-absorbing cotton pads against their bellies for 20 minutes to create a kind of smell swatch.)

In earlier studies, researchers had found that dogs have trouble smelling the difference between identical twins. The dogs in this study succeeded because they've been highly trained by the police. It's not an easy task--and that means our genes must have a lot of responsibility for our personal scents. The differences in the children's scents must have come from very small variations in their lifestyles. Maybe one twin has a greater preference for peanut butter, or one twin likes to exercise more and has a different metabolism.

Not much is known about why people smell the way they do. It would be interesting to see a follow-up study using a larger group of twins at different ages. Do three-year-old twins smell different? Do some ten-year-old twins smell exactly the same? Is there any way for newborn twins to not smell identical? If we weak-nosed humans had a better understanding of the scents that come from our bodies, it might give us new tools for medical diagnoses. (Here, for example, is a story about a woman who detected her husband's hidden disease based on a change in how his breath smelled.)

In this country, perhaps we should consider giving dogs a larger role in forensic investigations. As long as identical-twin criminals don't strike, DNA is still a reliable and (usually) convincing form of evidence. But it's easy for us to forget that other animals have access to a whole layer of information we can't begin to decode.


Ludvík Pinc (2011). Dogs Discriminate Identical Twins. PLoS ONE. DOI: 10.1371/journal.pone.0020704

How a Bubble Form Is Like a Fingerprint


Ah, the bubble form. It mostly brings back memories of elementary school: flustered teachers passing out sharp number-two pencils while I sadly bubbled "ELIZABET" into the eight spaces allotted for my first name. Though bubble forms become less frequent as we age, their stakes get higher. We bubble in SAT answers and vote in presidential elections. It's assumed that bubbles are anonymous; once separated from our (truncated) names, they're like ones and zeros that say nothing about the person who filled them in. But researchers at Princeton say that unfortunately--or fortunately--that's not true.

In a paper that will be presented at a security research conference this summer, the computer scientists describe how they extracted identifying information from bubble forms. They created a system that could identify who had filled in a bubble, and detect when a person had fraudulently filled in another person's form.

To pull all the information they could out of each bubble, the researchers scanned a group of questionnaires by high schoolers at high resolution. A computer examined each pencil blob and calculated its center, average radius, how much its radius varied (because most of us don't create perfect circles), and the blob's "center of mass." Then each filled-in circle was divided into 24 pie slices, and the shapes of those slices were evaluated in the same way. Finally, the coloring of each pie slice--how darkly or lightly it had been penciled in--was analyzed.
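As a rough sketch of that feature extraction, here's what the measurements for a single bubble might look like. The function name, data layout, and darkness weighting are my own illustrative assumptions, not the authors' actual code:

```python
import math

def bubble_features(pixels, n_slices=24):
    """Extract bubble-style features from one filled-in mark.

    `pixels` is a list of (x, y, darkness) tuples for the penciled-in
    region, with darkness in [0, 1] and at least one nonzero value.
    A hypothetical sketch of the kinds of measurements described in
    the paper: center of mass, mean radius, radius variation, and
    per-slice shading.
    """
    total = sum(d for _, _, d in pixels)
    # Darkness-weighted center of mass of the blob
    cx = sum(x * d for x, _, d in pixels) / total
    cy = sum(y * d for _, y, d in pixels) / total

    # Mean radius and its variance (most of us don't draw perfect circles)
    radii = [math.hypot(x - cx, y - cy) for x, y, _ in pixels]
    mean_r = sum(radii) / len(radii)
    var_r = sum((r - mean_r) ** 2 for r in radii) / len(radii)

    # Divide the blob into angular pie slices; average the shading in each
    slice_dark = [0.0] * n_slices
    slice_count = [0] * n_slices
    for x, y, d in pixels:
        angle = math.atan2(y - cy, x - cx) % (2 * math.pi)
        idx = min(int(angle / (2 * math.pi) * n_slices), n_slices - 1)
        slice_dark[idx] += d
        slice_count[idx] += 1
    slices = [slice_dark[i] / slice_count[i] if slice_count[i] else 0.0
              for i in range(n_slices)]

    return {"center": (cx, cy), "mean_radius": mean_r,
            "radius_variance": var_r, "slices": slices}
```

Concatenating these values gives one feature vector per bubble, which is what the identification step would compare.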

The researchers were now armed with hundreds of pieces of information about each bubble, and the identities of the 92 students who had made them. They trained their computer with 12 bubbles from each student, then showed it groups of 8 bubbles it hadn't seen before. For each new group of bubbles, the computer guessed which of the 92 students had filled them in. It picked the right student just over 50% of the time. And 75% of the time, the correct student was in the computer's top three guesses.
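The matching step can be pictured as a nearest-centroid comparison; this is my own toy stand-in, not necessarily the authors' classifier. Average each student's training bubbles into one reference vector, then pick the student whose reference is closest, on average, to the new group of bubbles:

```python
import math

def match_student(group, centroids):
    """Guess which student produced a group of bubbles.

    `group` is a list of feature vectors (one per new bubble), and
    `centroids` maps each student to the mean feature vector of that
    student's training bubbles. A hypothetical nearest-centroid rule:
    summing distances over the whole group lets a student's consistent
    small habits outweigh the noise in any single bubble.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids,
               key=lambda s: sum(dist(v, centroids[s]) for v in group))
```

This also illustrates why a group of eight bubbles identifies a student far better than one bubble does: averaging over more samples smooths out per-bubble variation.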

So a bubble isn't as good as a fingerprint. This method couldn't match bubbles one-to-one; it requires a bunch of bubbles to compare. The researchers did try testing their system on one bubble at a time (after training it on the large group), and it didn't totally fail: it picked the right student 5% of the time, which is better than the 1% it would get from guessing randomly. But to guess the identity of a bubble maker with any confidence, the computer would need a larger group of bubbles.

Still, most of the scenarios in which we fill out bubble forms involve just that: a large group of bubbles. This means that someone with the right programming (and determination) might be able to identify who filled out a standardized test form. The authors point out that a computer program could automatically scan standardized test forms--SATs, MCATs, LSATs--and compare the bubbles to a known set of bubbles for each test-taker. By flagging the tests that don't seem to match, such a program could identify cheaters who have filled out a test under someone else's name.

The findings have negative implications, though, for voting. Here in Chicago, we vote by connecting two sides of an arrow. It's bizarre and frankly a little confusing, but probably doesn't leave a lot of clues for identifying a voter. But in some other parts of the country, voters fill in bubble forms. The authors note that counties such as Humboldt County, California, release scanned images of their ballots after an election. A devious potential employer, or someone who was coercing you to vote a certain way, could potentially find out how you voted, given that this person had the right programming (and, ideally, a copy of your SAT). To make ballots more secure, the authors suggest printing forms on a gray background or having voters fill in their bubbles with ink stampers.

Since this study only used one set of bubble-form surveys from high schoolers at one point in time, it's possible that we aren't that consistent, after all. Our bubbles may all look alike on one day, while we're using one pencil and sitting at one desk, but they might vary from test to test. Or our bubbling-in style might gradually change as we age. If anyone wants to do this study, I'll be the first to volunteer my elementary-school standardized tests. It would be nice to think they served some purpose.

Image: J. Calandrino, W. Clarkson and E. Felten, Princeton University.

Little People, Big World

Does the size of your body fundamentally impact how you see the world around you? Are your perceptions inextricably tied to your dimensions? Researchers in Sweden say the answer is yes. Reaching this conclusion was as simple as swapping people into different-sized bodies.

Before you get too concerned, let me clarify that the bottommost figure in this photo is not a headless child.
The researchers used a "body-swap illusion" that requires a somewhat tricky setup, as you can see above. The subject (the one with a head, on the left) lies down and puts on a headset. Instead of seeing her own body, when she looks down she sees the images coming from the cameras to her right. The cameras are looking down at a mannequin's body, which can be the same size as the subject's body or not. So far, no illusion: the subject knows this isn't her body. But the experimenter then starts to touch the subject's and mannequin's legs simultaneously and in the same places. After a few minutes of seeing touches on the mannequin's legs and feet that correspond to the touches she feels on her own body, the subject experiences the strange sensation that the mannequin's body is her own.

The touches have to happen in the same place and at the same time for the illusion to work. If the experimenter's pokes at the mannequin's and subject's legs are out of sync, the illusion fails. This was the control condition for all the experiments.

One of the first versions of this study, back in the late 1990s, involved a rubber hand. Subjects rested one of their hands on a table, but their view of that hand was blocked by a small barrier, while a rubber hand was in full view. The experimenter stroked the backs of both hands with a paintbrush for several minutes, and the subjects eventually came to feel that the rubber hand was their own. Some subjects even reported seeing the hand start to look like their own. 

As an aside, I have a special affection for this area of research because I actually used the rubber-hand experiment for my eighth-grade science fair project. I had read about the study in Discover magazine and I wrote to the author, Matthew Botvinick, to ask for more information so I could replicate what he'd done. Not having access to whatever supply stores psychology researchers use, I resorted to using a rubber hand from a Halloween shop. It had a gory, sawn-off wrist that my mom covered with an old shirt sleeve of my dad's. We named the hand Larry, and my ten subjects (it's hard to recruit subjects when you're paying in Reese's peanut butter cups) reported satisfyingly eerie results. One friend got freaked out by the illusion and threw poor Larry across the room in the middle of the experiment. I was a big hit at the county science fair, which made up for some of the successes I was obviously not having in other areas, like sports or being cool. Afterward, Larry was relegated to living under my bed, where he occasionally gave me heart attacks when I went searching for things.

Since the Swedish researchers wanted to know how the size of your body affects your perceptions, they decided to "swap" their subjects into outlandishly sized bodies. In addition to a normal adult-sized mannequin and a toddler-sized mannequin, they created a giant set of legs that would have fit on a 13-foot-tall person. They also did the experiment with a Barbie doll, which they expected to be an extra challenge because not only is Barbie just under a foot tall, but she's recognizably a toy.

Equipped with their cast of Gulliver's Travels characters, the researchers performed a series of experiments. First, they showed that they could make the illusion work even with a giant body or a Barbie body. They conclude that the illusion is "potentially unlimited," which makes me eager for their follow-up study.

This all seems creepy enough, but it was worse for subjects in the stabbed-in-the-abdomen portion of the study. To prove that subjects were really feeling ownership of the fake body, and not just experiencing a visual illusion, researchers measured subjects' skin conductance (an indicator of high emotions) while sticking a knife into the mannequins' stomachs. Unsurprisingly, subjects were scared by this.

Here's what you might have seen if you were a subject in the Barbie-body illusion. Subjects reported feeling like they were in a "giant world":
But the researchers wanted to test what being in a giant world--or a tiny world, in the case of subjects who had been swapped into giant bodies--really meant. They hung cubes of different sizes in front of subjects who had been body-swapped, and asked them to estimate the objects' sizes. Subjects who were inhabiting a tiny body thought the cubes were huge, just like the giant hand that had been poking them in the legs. Subjects who felt like they were inhabiting a huge body, conversely, thought the cubes were tiny, just like the puny researchers tickling their feet. Control subjects, who were looking at the same set of legs but not experiencing the illusion, didn't misjudge the cubes in the same way. This showed that the subjects weren't just comparing the cubes to the size of their legs--they really felt that everything in their world had grown or shrunk.

Finally, the researchers put objects at various distances from the subjects, then asked the subjects to stand up and walk (with their eyes closed) to where they thought the object was. The giant-body subjects thought things were closer to them, while the tiny-body subjects thought everything was farther away. Even when standing and walking--physically inhabiting their real bodies--they were still mentally inhabiting their fake bodies.

Aside from creating crazy props to fill some lab's store closet, the study suggests that our sense of our body size is tied to how we perceive the world. (Other studies have shown that your height affects how you experience time.) When we look around us and judge how far away other people are, how tall a building is, or whether we have enough time to dash across a busy street, our mental calculations rely heavily on the body that our brain is inside. This makes me wonder whether people who have just lost half their body weight due to gastric bypass surgery, or lost both their legs in combat, feel as though they're inhabiting an entirely new world.

The authors suggest a few fascinating applications for their research in the field of virtual-reality-meets-robotics. Perhaps surgeons who need to perform a microscopic procedure could be made to feel as though they're inhabiting a very tiny surgical robot. Or an engineer could inhabit a gigantic oil-drilling robot at the bottom of the sea. They'd have to watch out for enormous squid, though, because those things are not an illusion.

Images: PLoS ONE (doi: 10.1371/journal.pone.0020195.g001)

Will Powdered Rhino Horn Cure My E. Coli? (a quiz)

It's Inkfish's 100th post! I considered celebrating by listing my top 100 stories, but instead I'm bringing you this review of recent science news.

By the way, if you've eaten any European produce lately and are feeling unwell, please stay off of airplanes. The TSA is after you, anyway.


1. At an undisclosed location in the northeast, the U.S. Department of Homeland Security is currently testing a system that will:
a. employ explosives-sniffing ferrets
b. identify people who are thinking about committing a terrorist act
c. scan travelers in 3D, so screeners (wearing 3D glasses, naturally) can get an even more accurate look at your body
d. scan your shoes without requiring you to take them off


2. As of Thursday night, the E. coli outbreak in Europe had killed 30 people and sickened more than 2,800. The culprit has now been identified as:
a. sprouts
b. cucumbers
c. broccoli
d. lettuce


3. In the Czech Republic, the world's eighth-to-last northern white rhinoceros has died of old age (she was 39). All species of rhino have been unfortunately favored by humans for their horns. Which of these is NOT a traditional use of rhinoceros horn?
a. dagger handle
b. aphrodisiac
c. treatment for fever
d. treatment for gout


4. After a delicate joint effort, scientists in China and Scotland are anxiously waiting to find out whether they've successfully mated a pair of:
a. white rhinos
b. pandas
c. cloned cats
d. giant corpse flowers


5. Seven scientists in Italy will be tried on manslaughter charges because they:
a. accidentally released a dangerous virus
b. approved defective pacemakers for implantation in humans
c. erroneously predicted a flood, causing a riot
d. failed to predict an earthquake


Bonus: Do cell phones cause brain cancer?
a. obviously
b. obviously not
c. I don't know, but I'd sure like to read about it!

Answers are in the comments.

Hello, 911? My Phone Is Killing Me!

With the announcement last week from a UN working group that cell phones might cause cancer, there has been a lot of buzz. Headlines in response to the news have included "Top 10 Low-Radiation Cell Phones," "Cell Phones Deemed Dangerous, Radiation Leads to Cancer" and the seemingly helpful "How Far Should You Sleep from Your Cell Phone?" (The question is, disappointingly, not answered in the article.)

I've written before about cell phones and the persistent fear that they damage your brain. The subject isn't a new one. But it's come up again because of a meeting last month of a working group convened by the World Health Organization's International Agency for Research on Cancer (IARC). After reviewing the current research, including the results of a study called Interphone, the group announced that it was classifying cell phones as "possibly carcinogenic to humans."

To be clear, the conclusion of that Interphone study was the following: "Overall, no increase in risk of [brain cancer] was observed with use of mobile phones."

But the authors do report "suggestions" that one type of brain tumor was more common among the group of people who reported spending the most hours on their cell phones. They point out, though, that there are problems with this finding.

The study relied on subjects' own recollections of how much time they spent on the phone--over a ten-year period. That would be enough of a challenge for a healthy person to recall, but brain tumor patients can have added problems with cognition and memory. And the greatest impediment to getting accurate results in a study like this is the question itself: Excuse me, brain tumor patient, but might you have used your cell phone a lot before you got that brain tumor? Surely every cancer patient is eager to find an explanation for his or her disease. The authors acknowledge that, in a previous study, brain cancer patients were more prone to overestimate their time on the phone.

So why is the IARC taking this result seriously?

To answer that, let's look at how serious this new classification actually is. The IARC classifies chemicals and other agents under a 5-category system: carcinogenic to humans, probably carcinogenic, possibly carcinogenic, not classifiable, and "probably not" carcinogenic. Of the 941 agents it has assessed, 266 are in the same "possibly carcinogenic" category that cell phones now join. The classification means that research has been done on the subject, and the possibility of cancer has not been definitively ruled out. But while it's pretty easy to show that, say, tobacco causes lung cancer or solar radiation causes skin cancer, it's very, very hard to show definitively that something does not cause cancer. In fact, out of those 941 agents, only 1 is listed as "probably not carcinogenic." It's a chemical called caprolactam. Go ahead, let your kids roll around in it. It's safe. Probably.

Other agents in the "possibly carcinogenic" category, by the way, include pickled vegetables, magnetic fields, and being a carpenter.

So it's technically possible that your cell phone, or almost any other object or substance in your environment, is giving you cancer. But in the case of cell phones, there's no known way that they could cause cancer. Not all radiation is the same. High-energy UV rays, for example, knock electrons off of atoms and damage your DNA, which can cause problems when your cells multiply, which can lead to cancerous growths. But cell phones emit low-energy, "non-ionizing" radio waves. Science knows of no way that this type of radiation could cause cancer.

If you still want to go back to your landline phone, go ahead. Just try not to worry too much. That could kill you.

Why You Shouldn't Sing in a Pakistani Library

How uptight is your home country? In a new study in Science magazine, researchers posed this question to almost 7,000 people from 33 nations. They found that the answer was tied to factors ranging from population density and the availability of clean water to church attendance and the death penalty.

The researchers, led by Michele Gelfand at the University of Maryland, define a spectrum called "tightness-looseness." A nation that's "tight" is restrictive, with strong social norms and "a low tolerance of deviant behavior." But in a "loose" country, anything goes. To figure out where various countries sit on this spectrum, the authors used a survey. Subjects responded to statements such as, "In this country, if someone acts in an inappropriate way, others will strongly disapprove" and "In this country, there are very clear expectations for how people should act in most situations."

The subjects' answers were compiled to create tightness scores for their 33 countries. Especially loose countries included Brazil, Ukraine, and the Netherlands. Some of the tightest countries were Pakistan, Malaysia, and India. The United States scored medium-low: less uptight than the UK, but not quite as relaxed as Australia.

But the authors were after more than a simple international report card of prudery. They wanted to know whether a country's tightness could be tied to historical and ecological factors. Do countries become restrictive after repeated struggles with disease, hunger, warfare, or natural disasters? When populations must band together to survive, do countries have a greater need for rules and structure?

The answer appears to be yes. The authors looked at many environmental and historical factors (and adjusted for countries' per-capita gross national product, since many of these factors are also tied to the wealth of a country). They found strong correlations between a country's restrictiveness and its food deprivation, lack of safe water, vulnerability to natural disasters, tuberculosis infection rate, and childhood mortality.

A country's tightness or looseness manifests in the actions of its government, as well as its populace. The authors found that tight countries' governments tend to be autocratic, with restrictions on the media and civil liberties. Tight countries are much more likely to still use the death penalty. People in tight countries tend not to attend public demonstrations or sign petitions, but they do attend church regularly.

The study's subjects were also asked about day-to-day situations. They rated the restrictiveness of different settings: Can you act however you like while walking down the sidewalk? What about at work or at the doctor's office? They also rated the appropriateness of various behaviors in these settings: Is it OK to eat in an elevator? What about on a bus? Should you flirt at a funeral? (Is this OK anywhere?) You can see more settings and situations, plus the historical and environmental factors studied, here.

A person's culture will shape his or her psychology, for obvious reasons. Someone who lives in Jerusalem and sees buses blowing up on a regular basis must have a pretty different worldview from a sheep farmer in New Zealand. To study the mindsets of people in tight and loose nations, the authors also gave their subjects psychological surveys. They found that people living in restrictive countries restrict themselves, too: They are more cautious and more dutiful, they prefer structure in their lives, and they have better impulse control.

The study does not give a comprehensive picture of all 33 nations, since samples usually came from one city or region in each country. But it offers a snapshot of lives and attitudes across the world. The authors hope that their work can help promote understanding and cooperation between cultures.

Aside from being fodder for a decently entertaining guidebook ("Chapter 9: Funerals Not to Flirt At"), this sort of information could help governments to handle sensitive negotiations. Or it might help aid workers to put locals at ease. The study is a reminder that people in different countries don't just have different views out their windows; they have deeply different perspectives, too.