Not So Happily Ever After
Male and female animals have some inevitable conflicts of interest. A male can theoretically have thousands of offspring in his lifetime, while a female has far fewer opportunities and has to use more energy for each one. Our Pleistocene grandmothers and grandfathers would have experienced the same evolutionary conflict. While the men didn’t necessarily have to pay a heavy cost for a child, the women faced some inescapable burdens. They had to be pregnant for nine months, risked death from various complications along the way, and burned tens of thousands of extra calories as they nursed their child.
As a result of that conflict, men and women evolved an attraction to different kinds of qualities in their mates–qualities we still look for today. David Buss, a psychologist at the University of Texas, has conducted a long‑term survey of thousands of men and women from 37 different cultures, from Hawaii to Nigeria, asking them to rank the qualities that are most important in choosing someone to date or marry. Buss finds that in general, women tend to prefer older men and men tend to prefer younger women. Men ranked physical attractiveness higher than women did, while women placed more value on good earning prospects in a husband. Buss argues that these universal patterns expose adaptations that evolved in our Pleistocene ancestors: women were attracted to men who would be able to give support for their children, while men were more interested in finding a fertile, healthy mate.
Evolutionary psychologists argue that these conflicts of interest make men and women behave differently. Surveys confirm what many people already suspect: that men are far more willing to have sex than women. Men express a desire for four times as many sex partners in their lifetime; they have more than twice as many sexual fantasies. Men let less time elapse before seeking sex with a new partner, and they’re more willing to consent to sex with a total stranger.
But just because Pleistocene women evolved to be choosy about their mates doesn’t mean that they were perfectly faithful. As we saw in the last chapter, the animal world is rife with females who cheat on their partners. Among monogamous birds, for example, a female will sometimes mate with a visiting male, and her cuckolded partner has to raise the chicks. A female bird puts herself at risk when she cheats on her partner, because he may abandon the nest. But the rewards may justify the danger, since she may be able to find a male with better genes than her partner to father her chicks. Pleistocene women might have evolved some similar decision rules about infidelity, which women today have inherited.
Ian Penton‑Voak of St. Andrews University has surveyed women about what kind of face they find attractive on a man. He used a computer to generate “feminized” and “masculinized” male faces, and found that when women are ovulating they prefer masculine faces. The features that make a face masculine–a brow ridge, a jutting jaw, strong cheekbones, for example–may act like a peacock’s tail, advertising a man’s good genes. They are produced by testosterone, and testosterone depresses a man’s immune system. That trade‑off may make a masculine face a costly display–one that can be used only by a man with a strong immune system.
Penton‑Voak suggests that a woman’s keener attraction to a masculine man when she’s most likely to conceive may be an adaptation for snagging good genes for her children. When she is ovulating, a woman may be more likely to have an affair with such a man, but during the rest of her menstrual cycle she may become more interested in the man who is helping to raise her children.
With these sorts of conflicts of interest driving the evolution of our ancestors, some of our uglier emotions may actually have originated as useful adaptations. David Buss has proposed that jealousy, far from being a pathology, was one such mechanism. There’s no obvious signal that lets people know that their mate is cheating on them. Men cannot even tell when women are ovulating; unlike other primates, women do not get genital swellings. Under these uncertain conditions, jealousy makes ample evolutionary sense, according to Buss. A “jealousy module” in the brain could let a person stay alert to the subtlest clues of betrayal that a purely rational mind might dismiss. (“Is that a new cologne I smell?”) If those signs build up to a certain threshold, the jealousy module would trigger a reaction that could avert the threat–or cut the person’s losses.
Buss marshals a number of experiments and surveys as evidence for the adaptiveness of jealousy. If you hook electrodes to a man’s forehead, you can measure the stress he experiences when you ask him to think of his romantic partner cheating on him. Men experience more stress thinking about their partners having sex with another man than thinking about them becoming emotionally attached to someone else. (The thought of sexual betrayal makes a man’s heart beat five extra beats a minute, on a par with the effect of drinking three cups of coffee.)
Women, on the other hand, tend to show the opposite reaction, experiencing more stress at the thought of emotional abandonment. Surveys have shown a similar pattern not only in the United States but in Europe, Korea, and Japan. For men, Buss concludes, sexual betrayal is a bigger threat to their reproductive success; for women, emotional betrayal may signal that a man will abandon her and no longer provide for her children.
If Buss is right, evolutionary psychology offers a better way to cope with jealousy than conventional psychology. Therapists typically treat jealousy as something unnatural, something that can be eliminated by boosting self‑esteem or desensitizing patients to the thought of their spouse cheating on them. Buss does not condone the ugly side of jealousy–the violence of wife beaters and stalkers–but he argues that pretending that jealousy can simply go away is pointless. Instead, he suggests, people should use jealousy to strengthen relationships rather than to destroy them. A flash of jealousy can prevent us from taking a relationship for granted.
The psychological adaptations of our ancestors do not doom us to unhappiness, Buss and other evolutionary psychologists argue; we simply have to acknowledge their reality and work around them. Today, for example, step‑parents are expected to treat their stepchildren no differently from biological children. That’s an unrealistic expectation, according to evolutionary psychologists. They argue that parental love, which encourages us to make enormous sacrifices for our children, is yet another adaptation designed to ensure the survival of our genes. If that’s true, then step‑parents should have a much harder time feeling the full depth of parental love for children who are not their own.
There are some pretty chilling statistics to back up this hypothesis. In a conflict between step‑parents and stepchildren, there isn’t a biological bond to reduce the tension, and conflicts are thus more likely to spiral out of control. It turns out that being a stepchild is the strongest risk factor for child abuse yet found. And a child is 40 to 100 times more likely to be killed by a step‑parent than by a biological parent. Step‑parents are not inherently evil; they simply haven’t developed the same depths of patience and tolerance that biological parents have. And this, evolutionary psychologists argue, points to a way to reduce the risks of conflict: step‑parents need to be aware that they have to overcome obstacles to a happy family that a biological parent doesn’t encounter.
Module or Mirage?
The new generation of sociobiologists is attracting critics of its own–including a number of evolutionary biologists. They argue that the sociobiologists are too eager to draw sweeping conclusions from their data, and that in some cases they misunderstand how evolution actually works.
Take, for example, a troublesome book published in 2000 called A Natural History of Rape. Its authors, Randy Thornhill and Craig Palmer, proposed that rape is an adaptation–a way for men who would otherwise have little access to women to increase their reproductive success. Forced sex is not unique to humans; it has been documented among certain species of mammals, birds, insects, and other animals. Thornhill himself has shown that it is a regular part of the scorpion fly’s mating strategies. Some scorpion fly males woo females by hoarding dead insects that females like to eat, driving away other males that try to steal the carcasses. Other males secrete saliva onto leaves and wait for females to come by and eat it. And others simply grab females and force them to copulate.
Thornhill found that the biggest males were the ones that hoarded dead insects, and they attracted the most females. Medium‑sized males made do with their salivary gifts, which won them fewer mates, and the smallest males simply attacked. But each male can use any of these strategies if the conditions are right. If the biggest males disappear, the medium‑sized males can hoard insects and the small ones start drooling.
Thornhill and Palmer argue that our ancestors might have incorporated rape into their sexual strategies as well, as a tactic to use when other means fail. They point to evidence that rape victims tend to be in their prime reproductive years–suggesting that reproduction is at the top of the rapist’s unconscious agenda. Female victims of reproductive age fight back harder against their attackers than women of other ages because, Thornhill and Palmer claim, they have more to lose in reproductive terms. Thornhill and Palmer also claim that surveys show women of reproductive age are more traumatized by the experience than other women: they are “mourning” the loss of their ability to choose their mate through normal courtship.
A Natural History of Rape was the subject of a scathing review in the journal Nature. Two evolutionary biologists–Jerry Coyne of the University of Chicago and Andrew Berry of Harvard–picked apart the evidence in the book. Girls under 11–too young to reproduce–made up only 15 percent of the population but 29 percent of the victims in a 1992 survey, a percentage far higher than you’d expect according to the book’s hypothesis. The authors claimed that this number was so high because American girls are experiencing their first menstrual period at earlier ages than in previous generations, which “contributes to the enhanced sexual attractiveness of some females under 12.” Coyne and Berry weren’t impressed. “In the end,” they charged, “the hopelessness of this special pleading merely draws attention to the failure of the data to support the authors’ hypothesis.”
And the fact that reproductive‑age women fight back says nothing about evolution: they are also much stronger than little girls and old women. “In exclusively championing their preferred explanation of a phenomenon, even when it is less plausible than alternatives, the authors reveal their true colours. A Natural History of Rape is advocacy, not science,” Coyne and Berry wrote. “In keeping with the traditions established early in the evolution of sociobiology, Thornhill and Palmer’s evidence comes down to a series of untestable ‘just‑so stories.’ ”
Coyne and Berry were referring to the title of Rudyard Kipling’s 1902 book of children’s tales, which tell how the leopard got its spots, the camel its hump, and the rhinoceros its skin. The irritation that Coyne and Berry express about evolutionary psychology is common among biologists. They know just how easy it is to make up a story about the evolution of adaptations, and how hard it is to figure out what anything in nature is really for.
To document real adaptations, biologists use every tool they can possibly find, testing for every possible alternative explanation they can think of. If they can run experiments, they will. When an adaptation–say, deep tubes for holding nectar in flowers–is found on many different species, scientists construct their evolutionary tree and trace the rise of the adaptation from species to species.
The human brain is far more complex than a flower, and researchers have fewer ways to study its evolution. Chimps and other apes can offer a glimpse of what our ancestors may have been like 5 million years ago, but after that, we evolved in a unique direction. We cannot put 100 Homo erectus in some fenced‑off compound and run experiments to see who will be attracted to whom.
Instead, evolutionary psychologists often rely on surveys. But their samples, usually a few dozen American undergraduates–mostly white, mostly affluent–can hardly be expected to represent the universal human condition. Some evolutionary psychologists appreciate this problem and try to replicate their results in other countries. But even then they may be too eager to jump to cosmic conclusions. In his book The Dangerous Passion, David Buss writes: “People from the United States and Germany give roughly equivalent responses, revealing a large sex difference in the desire for love to accompany sex–a desire that transcends cultures.” Compared to the Binumarien of New Guinea or African pygmies, the differences between Americans and Germans are hardly transcendent.
Any particular human behavior may be created or shaped by culture, and even if it has some genetic basis, it may not actually be an adaptation at all. This has been the chief complaint of Stephen Jay Gould, who has been a critic of sociobiology ever since the publication of Wilson’s book. Like Coyne and Berry, he sees evolutionary psychologists falling victim to a trap that all biologists have to take care to avoid. An eager search for an adaptational explanation, Gould argues, may blind biologists to the fact that they are dealing with an exaptation–a trait that evolved for one function and was later co‑opted for another. Birds now use feathers for flight, but feathers first appeared on dinosaurs that couldn’t fly. They probably used them for insulation or as sexual displays to other dinosaurs.
Gould even claims that some things that seem like adaptations may have come into existence for no particular function at all. In a classic 1979 paper, Gould and a fellow Harvard biologist, Richard Lewontin, explained how this could happen with an analogy: the domed roof of the basilica of Saint Mark’s in Venice. The dome sits on four arches that are joined at right angles. Because the tops of the arches are rounded, there is a triangular space at each corner. Three centuries after the dome was built, these spaces–known as spandrels or pendentives–were covered with mosaics.
It would be absurd to say that the architects designed the spandrels so that they could contain triangular mosaics. It would be absurd to say that the spaces were designed for anything at all. If you want to put a dome on four arches, spandrels are automatically part of the deal. You may later put the spandrels to some use, but that use has nothing to do with their original design.
Evolution deals in spandrels as well, Gould and Lewontin argued. For a simple example, consider snail shells. All snails grow their shells around an axis, with the result that an empty column forms in the middle. In some snail species, the column fills with minerals, but in many the column remains empty. A few species use their open column as a chamber for brooding their eggs. Now, if a biologist were in high storytelling spirits, he or she might say that this chamber is an adaptation that evolved for brooding eggs, and might praise the cleverness of its design by pointing to how it is lodged at the center of the shell. But the fact is that the column has no adaptive function at all. It’s just a matter of geometry.
Gould accuses evolutionary psychologists of mistaking spandrels in the human brain for precise adaptations. He is perfectly happy to grant that the human brain got larger as it adapted to life on the African savannas. But that increased size and complexity gave our ancestors a general flexibility that allowed them to figure out how to kill a Cape buffalo or determine when a tuber was in season. The same flexibility could later be put to work reading, writing, or flying planes. But we do not carry any of these particular abilities in some hardwired form in our brains. “The human brain must be bursting with spandrels that are essential to human nature and vital to our self‑understanding but that arose as nonadaptations, and are therefore outside the compass of evolutionary psychology,” Gould declares.
The debate over evolutionary psychology won’t be resolved any time soon. It is a vital matter that gets at the very heart of human nature, and at just how powerful an effect natural selection can have on it. But the debate can be rancorous and sometimes mean‑spirited. Evolutionary psychologists sometimes insinuate that their critics are dewy‑eyed utopians, and their critics attack evolutionary psychologists as rabid conservatives who want to pretend that capitalism and sexism are hardwired into our brains. These sorts of insults are not just beside the point; very often they’re flat‑out wrong. Robert Trivers, who first came up with the idea of reciprocal altruism, is not a conservative; he has described himself as a liberal who was happy to find that his research suggested a biological basis for fairness and justice. And the anthropologist Sarah Blaffer Hrdy, who first showed how significant infanticide can be in animal societies, uses her work in sociobiology to offer a feminist perspective on evolution: females are not the coy, passive creatures they were once thought to be, but active contestants in the evolutionary arena.
As hard as it may sometimes get, it’s important to stay focused on the science, or the lack thereof, in evolutionary psychology. The weight of the scientific evidence will ultimately determine whether it stands or falls.
Toward Language
The glacial monotony of hominid life began to break up about half a million years ago. The tools that humans left behind started to show signs of change. Instead of hacking a stone into a single axe, humans learned how to make a number of blades from a single rock. A hand axe made in Kenya 700,000 years ago didn’t look much different from one made in China or Europe. But starting 500,000 years ago, regional styles emerged. New sorts of technology became more common. Humans learned how to make javelin‑like spears, and they learned how to make reliable fires. And as in the past, the rise of new tools was reflected in the expansion of human brains. For about the next 400,000 years, human brains would grow at an extraordinary rate, until 100,000 years ago, when they reached their present size.
According to Robin Dunbar’s work on primate brains, this expansion must have occurred as humans lived in bigger and bigger social groups. Judging from the size of fossil skulls, Dunbar estimates that the earliest hominids, such as Australopithecus afarensis 3 million years ago, formed groups of around 55. Early species of Homo living 2 million years ago would have hung together in bands of 80 individuals. By a million years ago, Homo erectus groups had cracked 100, and by 100,000 years ago, when human brains had reached our own neocortex size, they were congregating in gaggles of 150.
The average size of the human neocortex hasn’t changed since then, and Dunbar sees a lot of evidence that our biggest significant social groups have remained at 150 people. Clans in hunter‑gatherer tribes in New Guinea average 150 people. The Hutterites, a group of fundamentalist Christians who live communally on farms, limit the size of their farming communities to 150, forming new ones if the group gets too big. Around the world, the average size of an army company is 150. “I think on average there are 150 people that each of us knows well and knows warmly,” Dunbar claims. “We understand how they tick. We know about their history and how they relate to us.”
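Dunbar’s estimates come from a regression: plot the log of typical group size against the log of neocortex ratio for living primates, fit a line, and read off the prediction for a human‑sized neocortex. The sketch below, in Python, only illustrates that logic; the primate data points, the human neocortex ratio of 4, and the resulting coefficients are made‑up placeholders, not Dunbar’s published figures.

```python
# A minimal sketch of a Dunbar-style extrapolation, with made-up illustrative
# numbers (NOT Dunbar's published data or fit). The idea: regress the log of
# observed group size on the log of neocortex ratio across primate species,
# then read off the predicted group size for the human ratio.
import math

# Hypothetical (species, neocortex ratio, mean group size) data points.
primate_data = [
    ("species_a", 2.0, 15),
    ("species_b", 2.6, 40),
    ("species_c", 3.2, 70),
]

def fit_log_log(data):
    """Ordinary least squares on (log neocortex ratio, log group size)."""
    xs = [math.log10(ratio) for _, ratio, _ in data]
    ys = [math.log10(group) for _, _, group in data]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = fit_log_log(primate_data)

def predicted_group_size(neocortex_ratio):
    return 10 ** (intercept + slope * math.log10(neocortex_ratio))

# With a hypothetical human neocortex ratio of about 4, these illustrative
# numbers happen to yield a prediction near Dunbar's famous 150.
print(round(predicted_group_size(4.0)))
```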
As hominid bands expanded, their complexity grew as well. And once they crossed a certain threshold, Dunbar argues, the old ways in which primates interacted no longer worked. One of the most important ways that primate allies show their affection for each other is by grooming. Grooming not only gets rid of lice and other skin parasites; it is also soothing. Primates turn grooming into a social currency that they can use to buy the favor of other primates. But grooming takes a lot of time, and the larger the group, the more time primates spend grooming one another. Gelada baboons, for example, live in the highland grasslands of Ethiopia in groups that average 110, and they have to spend 20 percent of their day grooming one another.
The size of hominid brains suggests that their group size reached 150 by 100,000 years ago, and at that point grooming became an impractical tool. “You simply cannot fit enough grooming into the working day,” says Dunbar. “If we had to bond our groups of 150 the way primates do, by grooming alone, we would have to spend about 40 or 45 percent of our total daytime in grooming. It would be just wonderful, because grooming makes you feel very warm and friendly toward the world, but it’s impractical. If you have to get out there and find food on the savannas, you don’t have that amount of time available.”
Hominids needed a better way to bond. Dunbar thinks that better way was language.
Working out the origin of language remains one of the biggest challenges in evolutionary biology. Speech cannot turn to stone, so it leaves no direct record of its existence. Before the 1960s, most linguists didn’t even think that language was, strictly speaking, a product of evolution. They thought that it was just a cultural artifact that humans invented at some point in their history, just as they invented canoes or square dances.
One of the reasons for this notion was the way linguists thought the brain produced language. Assuming that the brain was a general‑purpose information processor, they concluded that babies learned to speak simply by using their brains to find the meaning of the words they heard. But Noam Chomsky, a linguist at the Massachusetts Institute of Technology, took the opposite stance: babies were born with the underlying rules of grammar already hard‑wired into their brains. How else, Chomsky asked, could one explain the fact that all languages on Earth share certain grammatical patterns, such as subjects and verbs? How else could a baby master the complexities of language in only three years? Words are as arbitrary as dates in history, and yet no one would expect a 3‑year‑old to memorize a time line of the battles of the Peloponnesian War. Not only do children learn individual words, but they quickly use the words they hear to discover grammatical rules. Their brains, Chomsky argued, must already be primed for learning language.
Research since the 1960s suggests that the human brain has special language modules, just as it has modules for seeing edges or for social intelligence. The human brain uses them for storing rules of grammar, syntax, and semantics–all the ingredients that give language its sense and complexity. Linguists can see these language modules at work in the mistakes that children make as they learn how to speak. They may use the standard rules for making plurals or past‑tense verbs to create words that don’t exist, such as tooths, mouses, and holded. Young children have the capacity to set up rules of grammar in their brains, but they still have trouble overriding those rules with the rote memorization of irregular words.
More evidence comes from certain types of brain damage that rob people of the ability to use language, or even just components of it. Some people have trouble only with proper names or words for animals. A team of British scientists studied a man with a rich vocabulary of nouns, including words like sextant, centaur, and King Canute, but who could use only three verbs: have, make, and be. In each case, a particular language module has been damaged, leaving the rest of the brain intact.
Obviously, 3‑year‑old children do not automatically burst into Shakespearean verse. They need to be immersed in a sea of words as their brains develop so that their inner rules of grammar can wrap themselves around a particular language. But the “language instinct” (a phrase coined by linguist Steven Pinker of MIT) is so strong that children can create new languages on their own. In 1986 Judy Kegl, a linguist at the University of Southern Maine, had the chance to watch one come into existence.
That year Kegl had gone to Nicaragua to visit schools for deaf children. The Nicaraguan government had organized several schools in the early 1980s, but the students were struggling. The children arrived knowing only a few crude gestures that they had developed on their own with their parents. Their teachers didn’t teach them full‑blown sign language, but only tried to use “finger spelling,” in which different hand shapes stand for the letters of a word. The finger spelling was supposed to help the students make the transition to speaking the words, but since the children had no idea what the teachers were trying to teach them, the entire project failed miserably.
The teachers noticed that even as the children struggled to communicate with them, they had no trouble communicating with one another. They were no longer using the paltry collection of gestures that they had brought from their homes, but a rich, new system that the teachers could not understand. Kegl was invited to come to the schools and help the teachers understand what was happening.
The teenagers at the secondary school, she discovered, used a pidgin cobbled together out of makeshift gestures that they all shared. But the younger children at the primary school were doing something far more sophisticated. Kegl was shocked to see them signing to one another rapid‑fire, with the sort of rhythm and consistency that reflected a real sign language, complete with rules of grammar. The younger the children, the more fluent they were. “You could see just by the way the signs were orchestrated and structured that something more was going on there,” says Kegl. “It became clear that I was there in the early stages of the emergence of a language.”
For the first few years, Kegl struggled to decode the language, sometimes eliciting signs or sentences from the children, and sometimes simply watching them tell long narratives. In 1990 she and the children began to watch cartoons, and she would have them tell her what was happening. The cartoons became her Rosetta stone.
Kegl discovered that the children’s signs were elegant, clever, and evocative. In the teenage pidgin, the word for speak was made by opening and closing four fingers and a thumb in front of the mouth. The children had taken this mimicry and enhanced it: they opened their fingers at the position of the speaker and closed them at the position of the person being spoken to. They had also invented a way to use prepositions like verbs. Where an English speaker would say, “The cup is on the table,” a Nicaraguan signer would say something like, “Table cup ons.” While this may seem weird to an English speaker, other languages, such as Navajo, make regular use of the same construction.
In the years since she first came to Nicaragua, Kegl has worked with the deaf community to put together a dictionary of their language, which now contains more than 1,600 words. In the meantime she has also put together a theory for its origin. Children came to the schools with nothing more than their simple gestures. They pooled them together into a common set and then crafted that into the pidgin that the teenagers used. Younger children then came to the school with brains primed for learning language, seized on the gestures of the older children, and endowed them with grammar. These young children abruptly produced a language that from the beginning was as complex and complete as any spoken language. And once this true language had emerged, new experiences led to the creation of new words.
“What happens,” Kegl says, “is that these gestures become gradually richer and richer and more varied. But we can’t see the leap between them and the first signs of language because the grammar is inside the child.”
If the grammar is indeed inside the child–if, in other words, its rules are hardwired into our brains–then evolution must have had a hand in its wiring. But that raises a difficult question: How could natural selection shape language in all its complexity? Scientists can’t go back in time to watch language come into being, but recently they’ve discovered some tantalizing clues to its evolution by modeling it on a computer. They’ve found that just as legs or eyes could have evolved incrementally, language may have built up its complexity step by step.
Martin Nowak and his colleagues at the Institute for Advanced Study in Princeton designed a mathematical model of language evolution based on a few reasonable assumptions. One is that mutations that let an animal communicate more clearly may raise its reproductive fitness. Vervet monkeys, for example, have a set of distinct calls that they use to warn their fellow vervets about birds of prey, snakes, and other threats. Being able to tell the difference between those calls can make the difference between life and death. If a vervet mistakes a snake call for a bird call, it may rush to the ground, only to be devoured by a waiting python. Another assumption Nowak makes is that a bigger vocabulary–if it can be properly communicated–also brings an evolutionary advantage. A vervet that can understand both a bird alarm and a snake alarm will have better odds of survival than a vervet that has room in its brain for only one.
In Nowak’s model, individuals were endowed with a simple, vervet‑like communication system. Their vocabulary consisted of a collection of sounds, each of which corresponded to some particular thing out in the real world. As the individuals reproduced, mutations cropped up in their offspring that changed the way they spoke. Some of these mutations let the individuals handle a bigger vocabulary than their ancestors; in Nowak’s model, these individuals were awarded more reproductive success.
Nowak found that his model consistently converged on the same results. Initially, the individuals communicated with one another with a few distinct calls. Their language gradually became more complicated as new calls were added. But as their vocabulary grew, it became harder to distinguish new calls from the old ones. The closer the sounds became, the easier it became to confuse them. (Think of the long a in bake and the short a in back, for example.)
While a bigger collection of calls may bring an evolutionary edge, the confusion that comes with it can cancel out its benefits. In trial after trial, Nowak found that his simulated vocabulary expanded to a certain size and then stopped growing. His results may explain why most animals aside from humans can communicate only with a small number of signals: they have no way of overcoming the inevitable confusion that comes with a big repertoire of sounds.
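Nowak’s actual analysis is mathematical, but the “error limit” it describes can be illustrated with a toy simulation. In the sketch below–a simplification of my own, not Nowak’s model–sounds are points crowded into a fixed acoustic space, the chance of mishearing one sound as another falls off with the distance between them, and the payoff is the expected number of messages that arrive intact. Adding ever more sounds stops paying off: the payoff climbs at first and then levels off.

```python
import math

def payoff(num_signals, noise=0.1):
    """Expected number of correctly received messages when `num_signals`
    distinct sounds are packed evenly into a fixed acoustic space [0, 1].

    The chance of hearing sound j when sound i was sent is proportional to
    exp(-(distance / noise)**2), so crowding the space makes neighbouring
    sounds easier to confuse with one another.
    """
    if num_signals == 1:
        return 1.0
    positions = [i / (num_signals - 1) for i in range(num_signals)]
    total_correct = 0.0
    for i, pi in enumerate(positions):
        weights = [math.exp(-((pi - pj) / noise) ** 2) for pj in positions]
        total_correct += weights[i] / sum(weights)  # P(heard i | sent i)
    return total_correct  # expected count of messages that get through

# The payoff rises with the first few sounds, then flattens out: past a
# certain point, new sounds add as much confusion as they add information.
for n in (2, 5, 10, 20, 40, 80):
    print(n, round(payoff(n), 2))
```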
But what if our ancestors evolved a way out of this trap? To explore this possibility, Nowak changed his model. He allowed some of the individuals to start stringing together their simple sounds into sequences–to combine sounds into words. Now Nowak pitted a few of these word‑speaking individuals against the original sound‑speakers. He found that if the individuals had only a few messages that they needed to convey to one another, they could get by with a system of sounds. But if their environment was more complex and they needed to use more messages, word‑speaking eventually won out. By combining a small number of sounds into a vast number of unique words, Nowak’s individuals could avoid the confusion that similar sounds create.
But Nowak discovered that word‑speaking has its limits, too. In order for a word to survive in a language, people have to use it. If they forget it, the word sinks into oblivion. These days books and videotapes can help keep old words in circulation, but among our hominid ancestors, there were only spoken words, which had to be stored in hominid brains. Since brains don’t have an infinite memory capacity, they limited the size of the vocabulary hominids could use. Hominids might invent new words, but only if old words were forgotten.
Nowak created a second twist on his language model to study this limit. Instead of simply using a single word for any given concept, some individuals could now combine words to describe events. Some words could represent actions, others the people or things involved in those actions, and still others the relationships between them. In other words, Nowak gave these individuals syntax. Syntax allows a person to give a few hundred words millions of different meanings, depending on how the words are arranged. But syntax can create a confusion of its own if speakers aren’t careful. The headlines “Dewey defeats Truman” and “Truman defeats Dewey” use exactly the same words, but their meanings are quite different.
When Nowak and his colleagues pitted syntax against simple word‑concept communication, they discovered that syntax is not always best. A syntax‑free language beats out syntax when there are only a few events that have to be described. But above a certain threshold of complexity, syntax becomes more successful. When a lot of things are happening, and a lot of people or animals are involved, speaking in sentences wins.
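The threshold Nowak found can be caricatured with simple arithmetic. Suppose a speaker can hold only a limited number of words in memory. A syntax‑free language needs a separate word for every event, so its coverage is capped by memory; a syntactic language needs only words for the actors and the actions, and gets every combination by arranging them in sentences. The sketch below is a back‑of‑the‑envelope illustration, not Nowak’s model: the memory limit of 50 words is an arbitrary assumption, and it ignores the extra learning cost that, in the real model, lets the syntax‑free strategy win outright when events are few.

```python
def events_expressible(num_actors, num_actions, memory_limit=50):
    """Compare how many distinct 'who did what' events each strategy can
    express when a speaker can hold only `memory_limit` words in memory.

    Holistic: one unanalyzable word per event, so coverage is capped by memory.
    Syntactic: separate words for actors and actions, combined into sentences.
    """
    total_events = num_actors * num_actions

    holistic = min(total_events, memory_limit)

    # Syntactic speakers spend memory on actor words and action words,
    # then get every combination "for free" through word order.
    if num_actors + num_actions <= memory_limit:
        syntactic = total_events
    else:
        # Not enough memory even for the building blocks: split it evenly.
        actors = min(num_actors, memory_limit // 2)
        actions = min(num_actions, memory_limit - actors)
        syntactic = actors * actions

    return holistic, syntactic

# With few events the holistic strategy keeps pace; past a threshold,
# composing words into sentences pulls far ahead.
for actors, actions in [(5, 5), (10, 10), (20, 20)]:
    h, s = events_expressible(actors, actions)
    print(f"{actors * actions:4d} events: holistic={h:4d}  syntactic={s:4d}")
```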
While Nowak’s models are simple, they capture some of the crucial aspects of how language could have gradually evolved from a simple set of signals. The children who invented Nicaraguan Sign Language may have recapitulated its evolution from signs to words to syntax. Nowak’s results also suggest how our ancestors got out of the communication trap that most other animals are stuck in. Something about the lives of our ancestors became complex enough to create a demand for a correspondingly complex way of expressing themselves.
A strong candidate for that complexity, as Dunbar and others have shown, was the evolving social life of hominids. But even if hominids a million years ago had something to say, they might not have had the anatomy for saying it. We modern humans use a very peculiar sort of anatomy in order to speak, an anatomy unlike that of any other living mammal. Other mammals–including chimpanzees–have a voice box that rides high in their throats. This arrangement lets them breathe while they drink or eat, because the air passageway and the esophagus are divided. But it also creates a very small vocal tract between the voice box and the mouth. With so little room, the tongue cannot move around enough to make complex noises.
At some point in hominid evolution, the larynx must have dropped down to the low position that it takes in the human throat. This sort of anatomy comes with risks, because food or water can slip into our windpipes more easily than in other mammals and make us choke. But it also created enough room for our tongues to flick around and create the repertoire of sounds that a spoken language demands.
That’s not to say that language couldn’t have gotten its start before the voice box was in place. Hominids might have made signs with their hands–which were already capable of fine movements, judging from the tools they were making 2.5 million years ago. They might have combined these signs with simple sounds and movements to create a protolanguage. With such a system in place, evolution might have favored a bigger brain to handle more complex symbol processing and a more human‑like throat to make more sophisticated speech possible.
No one knows the exact chronology of this evolution, because language leaves precious few traces on the human skeleton. The voice box is a flimsy piece of cartilage that rots away. It is suspended from a slender, U‑shaped bone called the hyoid, but the ravages of time usually destroy the hyoid too. Many researchers have turned instead to less direct clues that are left on hominid skeletons. They have looked at the angle at the base of the skull, in the hopes of calculating the length of the vocal tract. They have measured the width of the hole where a tongue‑controlling nerve enters the skull. They have looked at the impression of the brain on its case, searching for language‑related regions. In each case researchers have claimed to have found a clue to the first signs of speech. But skeptics have shown that none of these clues is actually a reliable guide to the presence of speech.
With all of this debate over a few shreds of hard evidence, it’s not surprising that experts are divided over when language reached its modern form. Leslie Aiello of University College London, for example, maintains that the acceleration in brain growth that started 500,000 years ago must have brought speech with it. Robin Dunbar, on the other hand, has proposed that language started only 150,000 years ago. He argues that only then were our ancestors living in groups too large for grooming to work as a social tool. People would have had to substitute language for grooming and other more primitive ways of interacting in order for hominid society to hold together.
With language, for example, you can keep tabs on what other people are doing and on what they’re saying about you. You can manipulate other people with words as well and hold on to your place in a large society. Even today language still functions mainly as gossip. Dunbar has eavesdropped on people on trains and in cafeterias, and he consistently finds that two‑thirds of their conversations are about other people. Language is, Dunbar argues, grooming by other means.
Yet other researchers think that even Dunbar’s figure of 150,000 years is too old for the origin of language. They are convinced that full‑blown language may have appeared only as recently as 50,000 years ago. It is only then that the human fossil record documents a spectacular mental explosion, in which people understood themselves and the world around them in ways that their ancestors never could have imagined. It was then that the modern mind was born, and language could well have been a crucial ingredient in its birth.