

The word according to Adam
The role of gesture in language evolution

Michael C. Corballis

Adam Kendon and I disagree on just one aspect of the role of gesture in language evolution. Where I propose that manual gesture preceded speech, he argues that gesture and vocal language evolved as equal partners. Some arguments for the gesture-first theory no longer seem to carry force, but the main support comes from evidence that vocal production in nonhuman primates is largely involuntary and inflexible, whereas manual action is flexible, intentional and learnable. This suggests that language evolved from manual grasping and manipulation in primates to pantomime in our hominin forebears, and was gradually conventionalized toward arbitrary symbols. Speech was one outcome of this process, although gesture is an option, as in signed languages, and also accompanies spoken discourse.

Introduction

I met Adam Kendon at the Fifth Conference of the International Society for Gesture Studies (ISGS), held in July 2012 in Lund, Sweden. Although we had never previously met, he somehow picked me out, apparently because I looked like a New Zealander (as indeed I am), and said we needed to talk. I was to give an invited plenary address, and it turned out that he was to introduce me – a task I would not wish on anyone. Within minutes of talking, however, we both stumbled onto Edward Lear’s poem:

How pleasant to know Mr Lear!
Who’s written such volumes of stuff!
Some think him ill-tempered and queer,
But a few think him pleasant enough.

(Lear 1894: viii)

This seemed to solve the problem, and he introduced me with a Lear-like poem, and I opened my address with a Leary poem to Adam.


Emboldened by the discovery that we seemed to have something in common, I later asked him if he was addicted to cryptic crossword puzzles, as I am. It turned out that he was not. Crosswords, he said, are for Lewis Carroll people, not for Edward Lear people. That seemed to define a difference between us. His kind of nonsense is Learian, mine Carrollian.

I had also discovered years ago that we shared an interest in gesture and its role in language. He was, and is, the guru of gesture, the editor of the journal Gesture, the founder of the ISGS, and well-known for having written volumes of stuff on gesture, on signed languages, and on the way gesture is incorporated into normal discourse. I am really an interloper, a sort of Corballis1 in Wonderland, a dilettante who thinks he knows something about it but probably just makes it all up.

It turns out, though, that we are in remarkable agreement, except on one particular issue. We are both agreed that gesture is a critical component of language. Kendon (1980) wrote that “speech and gesture were two aspects of the process of utterance,” and more recently entitled a book Gesture: Visible Action as Utterance (Kendon 2004). These insights have provided one of the platforms for my own thinking. And we agree that signed languages are the equal of speech, at least in linguistic terms. I suspect we would also agree to disagree with authors such as Chomsky (2010), who maintains that language emerged as a sudden saltatory event, perhaps a mutation, in a single individual within the past 100,000 years, and radiated to the human population prior to the dispersal from Africa. That theory seems to owe less to science than to the biblical view that language was a gift from God to that other Adam.

Where Kendon and I differ is over the evolutionary timing and manner in which gesture was incorporated into language. I have espoused what Kendon (2011), in an extended review of a book on vocalization in primates (Abry et al. 2009), calls the “gesture-first” theory, which is that language originated in manual gestures, with vocalizations later introduced to the point that speech became the dominant mode, at least among those not confined to signed languages (Corballis 2002, 2009). Many have contrasted this theory with the view that language evolved from vocal calls. In the introduction to his review, Kendon suggests that the evidence “favours neither position” (2011: 1). Rather, he suggests that vocalization and gestures were equal partners in the evolution of language; the concluding section of the review is headed “Speech and gesture together in language origins” (ibid: 366).

1. Actually I don’t pronounce my name to rhyme with Alice, but many others do. Anagrammatically, though, I am a mere letter away from being a Carroll sib.




Our point of disagreement, then, is a fine one. I suspect that manual gesture and vocalization have long been partners in the evolution of language, but would still argue that the balance shifted, probably gradually, from one to the other. In this chapter I focus primarily on points raised in Kendon’s (2011) article, which is an extended review of a book on vocalization, with several chapters dedicated to the precursors of speech in our primate forebears (Abry et al. 2009). Kendon takes issue with points raised in the book, and the essence of many of his arguments is that one can find precursors of speech in primate vocalizations, suggesting that there is no need to appeal to manual gesture as the precursor to language. In nonhuman primates, as in modern humans, according to Kendon, manual gesture was an accompaniment, not a precursor, to speech.

A number of authors, including Chomsky (2010), have argued that language evolved de novo in our own species, which if true would imply that it is futile to seek precursors in primates, or even in our hominin forebears, such as the Neanderthals. This is not my view, nor is it Kendon’s. Language is complex, yet fundamentally biological, and according to Darwinian principles must have evolved incrementally, with precursors going perhaps far back in primate evolution. In a critique of the Chomskyan notion that language evolved de novo in our species, Pinker and Bloom (1990: 708) write as follows:

If a current theory of language is truly incompatible with the neo-Darwinian theory of evolution, one could hardly blame someone for concluding that it is not the theory of evolution that must be questioned, but the theory of language.

In what follows, then, I consider evidence from nonhuman primates to be pertinent to our understanding of language evolution, and discuss some of the claims that have led me and others to argue for the gesture-first position. As we shall see, some of these arguments do not clearly distinguish this position from Kendon’s equal partners position. I begin with some arguments that once seemed compelling evidence for the gesture-first argument but now seem at best equivocal.

Equal partners?

Laterality of function

My own views were prompted initially not by the study of language itself, but by a Carrollian interest in mirror images and left- and right-handedness, and then by the fact that most of us are right-handed and left-cerebrally dominant for speech (Corballis & Beale 1976). From there came the idea that language itself originated in manual gesture. It turned out that this idea was not especially original.


Gordon Hewes, in a landmark article published in 1973, incorporated the coincidence of right-handedness and left-brained dominance for language into a more general argument for the gestural origin of language. But as Kendon (2011) points out, it is equally supportive of the idea that gesture and vocalization evolved together. Evidence of asymmetries in nonhuman primates might even be taken to support a priority for vocalization, since there is evidence for a left-hemispheric bias in vocal control going far back in evolution (Corballis 2003). Even frogs show this bias (Bauer 1993). This of course need not imply a priority for vocalization in the evolution of language itself – frogs don’t speak, and neither do apes. Indeed, I argued that the asymmetry entered the language networks when vocalization itself was introduced, and not during an earlier gestural phase. However, this now seems to be contradicted by evidence from primates.

Some 65 to 70 percent of chimpanzees are right-handed, both in captivity (Hopkins & Leavens 1998) and in the wild (Biro et al. 2006; Boesch 1991; Lonsdorf & Hopkins 2005). Moreover, in the majority of chimpanzee brains the temporal planum, which is homologous to Wernicke’s area in humans, is larger on the left than on the right (Gannon et al. 1998; Hopkins et al. 1998) – an asymmetry absent in rhesus monkeys and baboons (Wada et al. 1975) but well documented in humans (e.g. Foundas et al. 1996). This leftward asymmetry in the chimpanzee is correlated with a right-handed bias in gestural communication (Hopkins & Nir 2010). Baboons have also been shown to gesture predominantly with the right hand (Meguerditchian & Vauclair 2006), and even rats have been reported to be right-pawed (Güven et al. 2003).

Taken overall, then, the evidence for manual and vocal asymmetries in apes implies that both preceded the evolution of language itself, and does not unequivocally favor either the gesture-first or the equal partners argument.

Comprehension of speech and gesture

One argument against the gesture-first scenario is that some nonhuman animals show a remarkable capacity to understand human speech. Savage-Rumbaugh et al. (1998) reported that Kanzi, a bonobo, was able to follow instructions made up of several spoken words at a level comparable to that of a two-and-a-half-year-old child. Kanzi is now said to understand some 3,000 spoken words (Raffaele 2006). The gorilla Koko, too, can respond meaningfully to simple spoken requests (Patterson & Gordon 2001). And it’s not only apes. A border collie known as Rico responds accurately to spoken requests to fetch different objects from another room, and then either to place the designated object in a box or to bring it to a particular person (Kaminski et al. 2004).




In what is shaping up as a linguistic dog fight, another border collie called Chaser is said to know the spoken names of 1,022 objects (Pilley & Reid 2011).

The ability of apes and dogs to understand spoken words and simple spoken instructions suggests a precursor to language that may go back at least to our common ancestry with apes, and perhaps even with dogs – although dogs may have acquired the facility from humans through selective breeding. However, the ability may have little to do with language evolution per se, but may reflect rather a more general capacity to interpret sounds and to take appropriate action. Most animals live in a noisy environment, where danger or sustenance is signaled in multiple ways – a thunderclap, the fall of a tree, the roar of a lion, the hiss of a snake, or strange noises uttered by humans. Some primates have added their own vocal signals. For instance, Cheney and Seyfarth (1990) famously noted that vervet monkeys produce a variety of different calls to indicate different predators, such as a snake, leopard or eagle, suggesting that the calls have different meanings. A number of other primates, including chimpanzees, produce different calls to express different meanings (see Cheney & Seyfarth 2005, for review).

This of course does not preclude the role of manual gesture as well. Hobaiter and Byrne (2011) spent 266 days recording gestures made by chimpanzees in the Budongo National Park in Uganda. Gestures included movements of the body, limbs and head, but excluded facial expressions. They were intentional in that they were directed to another chimp, with the apparent aim of influencing the receiver’s behavior, and they were also described as “mechanically ineffective,” presumably to exclude acts like fighting, eating or manipulation of objects. They recorded a total of 4,397 gestures, made up of at least 66 identifiably different gestures. These included actions like a directed push, a handshake, an embrace, a sexual display, and what has been called the “directed scratch,” in which the chimp scratches the part of the body where he or she wants to be groomed by another chimp (Pika & Mitani 2009). It has been suggested that grooming itself is a precursor to language (Dunbar 1998).

Again, these various observations suggest that nonhuman animals understand the meanings of both vocal and gestural signals, and the Carrollian gesture-first theory remains on an equal footing with the Learian equal partners. Where gesture begins to take precedence, though, is not in the understanding of signals, but in the actual production of signals that are intentional and subject to learning. Apes and dogs, for all their ability to learn the meanings of spoken words and commands, have no ability to produce anything resembling speech.


Did the Neanderthals speak?

A critical question in the evolution of language is whether our closest nonhuman relatives, the now extinct Neanderthals and Denisovans, would have been capable of articulate speech. These large-brained hominins all shared a common ancestor with Homo sapiens dating from some 500,000 years ago, and were separated from around 400,000 years ago, with the lineages leading to Neanderthals and Denisovans in Europe and Russia, and that leading to modern humans confined to East Africa until the dispersal from around 60,000 years ago. The wandering humans eventually made contact with their long-lost cousins in Europe and Russia from around 40,000 years ago. Sequencing of both the Neanderthal (Green et al. 2010) and Denisovan (Meyer et al. 2012) genomes reveals a degree of interbreeding both with each other, and with anatomically modern humans. This suggests in turn that the three species had common cognitive capacities, and indeed raises questions as to whether they were actually different species. At worst, the Neanderthals and Denisovans might have had some linguistic deficiencies relative to Homo sapiens, but it seems unlikely that they would have been denied any language capacity at all.

Some have claimed, though, that the Neanderthals, at least, must have been incapable of articulate speech, which might be taken as strong evidence for the gesture-first argument. This would imply in turn that the Neanderthals and Denisovans communicated through gesture rather than speech – an idea contained fictionally in Jean Auel’s novel The Clan of the Cave Bear. From a more scientific perspective, though, the claim that the Neanderthals would have been unable to talk articulately rests on two pieces of evidence. First is a long-standing claim that the Neanderthal vocal tract, unlike that of modern humans, was not fully adapted to the production of speech (e.g. P. Lieberman 2007; P. Lieberman et al. 1972; D. E. Lieberman 1998; D. E. Lieberman et al. 2002). This conclusion, though, has been strongly questioned (e.g. Barney et al. 2012; Boë et al. 2007). The second has to do with the FOXP2 gene. A mutation of this gene has resulted in a severe speech impediment in about half the members of an extended English family, known as the KE family (Fisher et al. 1998), leading to suggestions that a (different) mutation of this otherwise highly conserved gene may have occurred in the human lineage to enable speech (Enard et al. 2002). This appears to have been refuted by the discovery that the region of the human version of the gene thought to be critical was present in Neanderthal DNA (Krause et al. 2007) – although the human FOXP2 gene may still differ in other ways (Ptak et al. 2012).

Two recent reviews have also strongly questioned the notion that Neanderthals were incapable of speech or language.




Barceló-Coblijn (2011: 286) concludes that “Neanderthals were probably able of vocalizing voluntarily, with communicative intentions and in a sophisticated way,” a conclusion endorsed in a comprehensive review of anatomical, genetic and archaeological evidence by Johansson (2013). This conclusion is also in defiance of Chomsky’s (2010) claim that only humans were, and are, capable of language itself. The notion that the Neanderthals were cognitively or linguistically inferior to Homo sapiens may be driven more by the desire to assert human superiority than by the scientific evidence. For present purposes, though, it seems fair to conclude that the Neanderthal evidence no longer strongly supports the “gesture-first” argument.

Evidence for gesture-first

Intentionality and learning

Although virtually all animals and birds emit vocal signals, these signals are for the most part driven by instinct or emotion, and lack voluntary control. Human language, in contrast, is intentional. For the most part, we can choose what we want to say and when to say it, implying voluntary control over vocalization. Even chimpanzees appear to have little if any intentional control over their vocalizations. Jane Goodall once wrote that “(t)he production of sound in the absence of the appropriate emotional state seems to be an almost impossible task for a chimpanzee” (Goodall 1986: 125). David Premack, another pioneer in the study of chimpanzee behavior, suggests that even chimpanzees, our closest nonhuman relatives, “lack voluntary control of their voice” (Premack 2007: 13,866). He goes on to write that they therefore “could not have speech. But sign language is a possibility, for they do have voluntary control of their hands.”

These conclusions do need some qualification. Cheney and Seyfarth (2005) draw attention, as does Kendon, to examples of primates modifying their vocalizations, sometimes even suppressing them, depending on the audience. Vervet monkeys, for example, seldom give alarm calls when they are alone, and are more likely to do so in the presence of kin than of non-kin. Kendon also cites a chapter by Zuberbühler et al. (2011) suggesting that primate vocalization is more complex and variable than previously supposed. Chimpanzees modify their screams when under attack, depending on the severity of the attack and their status relative to that of nearby chimps (Slocombe et al. 2010); when encountering food, chimps emit different kinds of grunts depending on the type of food (Slocombe & Zuberbühler 2005). Such examples, though, suggest subtle changes within call types rather than the generation of new call types (Egnor & Hauser 2004).


Some modifications involve the face and mouth rather than voicing itself. For instance, chimpanzees can modify vocal sounds to attract attention by vibrating their lips, as in the “raspberry” sound (Hopkins et al. 2007), and this call can be imitated by naïve animals in captivity (Marshall et al. 1999). Reviewing these and other examples, Petkov and Jarvis (2012: 5) write that:

… we would interpret the evidence for vocal plasticity and flexibility in some non-human primates as limited-vocal learning, albeit with greater flexibility via non-laryngeal than laryngeal control. But they do not have the considerable levels of laryngeal (mammalian) or syringeal (avian) control as seen in complex vocal learners.

Among complex vocal learners they include some birds, such as parrots, but only humans among the primates. Certainly, the ability of humans to intentionally produce arrays of vocal sounds vastly exceeds that of any other primate. There are said to be well over 1,500 different phonemes (basic units of speech) in the languages of the world (Evans 2009). Although any given language makes use of only a small proportion of these, any child has the capacity to learn any of these phonemes.

In defence of equal partners, Kendon notes that the vocal apparatus in nonhuman primates is much the same as that which produces speech in humans. The critical difference, though, lies not so much in the vocal apparatus as in the mechanism of control. It is not surprising that speech, when it did emerge in the human repertoire, should incorporate peripheral and subcortical mechanisms already specialized for vocal calls. Both primate vocalizations and human speech depend on a subcortical structure known as the nucleus ambiguus, but the input in nonhuman primates comes primarily from the limbic system. In humans, perhaps uniquely, additional input comes from the motor cortex, providing a degree of intentionality (Jürgens 2002). Of course, human vocalizations are also sometimes under limbic control, as in laughing or crying, and these involuntary emotional sounds can also interfere with speech. But speech itself critically requires cortical input.

Although there is evidence of some limited modifiability and intentionality in vocal production in nonhuman primates, in these respects the evidence still clearly favors the gesture-first argument. This is further strengthened by a consideration of manual control, especially in our closest nonhuman relatives.

Manual control in apes

Nonhuman primates, then, have at best limited intentional control over vocalization. In marked contrast, their arboreal heritage has produced excellent intentional control over the hands, along with a capacity to learn manual acts.




Indeed, early attempts to teach chimpanzees to speak proved fruitless (Hayes 1952; Kellogg & Kellogg 1933; Ladygina-Kohts 2002), but much greater success has been attained with forms of sign language and other manual gestures. The pioneer was the chimpanzee Washoe, who learned a simplified form of American Sign Language, and is said to have mastered several hundred signs (Gardner & Gardner 1969). The bonobo Kanzi communicates by pointing to signs on a keyboard; the signs are especially designed to be non-pictorial, representing objects and actions in abstract fashion (Savage-Rumbaugh et al. 1998). Kanzi’s keyboard has over 300 signs, and he supplements these by inventing gestures of his own. The gorilla Koko is said to use and understand over 1,000 signs (Patterson & Gordon 2001). These examples demonstrate little in the way of grammatical competence, but at least show intentional use of gesture to represent objects and actions, and some limited competence at combining a few gestures to create simple requests.

Observations of apes in natural settings also suggest a dominance of bodily over vocal communication, especially where communication is intentional rather than emotional or instinctive. Pollick and de Waal (2007) compared manual gestures directly with orofacial movements and vocalizations in the natural communications of chimpanzees and bonobos, and found manual gestures to be much less tied to context, and more variable between groups, both implying intentionality. The distinction was nevertheless blurred, because the vocalizations were lumped together with orofacial movements, and many such movements in chimpanzees and bonobos, such as lip-smacks, are not vocalized, but may well be under intentional control. This study nevertheless confirms the dominance of manual gesture in the natural communications of our closest nonhuman relatives.

There are well-documented examples of chimpanzees making and using tools, and tool-making itself may be a precursor to language (Stout & Chaminade 2011). Chimpanzees fashion sticks for fishing termites out of holes (Bogart & Pruetz 2008) and make spears for jabbing into the hollow trunks of trees to extract bush babies (Pruetz & Bertolani 2007). Chimpanzees in the Loango National Park in Gabon use tool sets comprising up to five different stick and bark tools to extract honey from hives (Boesch et al. 2009). Some 25 different chimpanzee tools have been documented.

The use and construction of tools lend themselves to pantomime as a means to communicate and teach. It has been suggested, in fact, that the origins of language as intentional communication lie in pantomime (Donald 1991), and sign languages themselves have a strong element of pantomime (Emmorey 2002). Pantomime, though, can be unwieldy and inefficient, and pantomimic gestures are typically simplified into a more symbolic form, a process known as “conventionalization” (Burling 1999). On this view, speech can be considered the end point of a conventionalization process in which pantomimic representations are replaced by arbitrary vocal signals.


The mirror system

Perhaps the most prominent case for the gesture-first theory has come from the discovery of mirror neurons in the monkey brain. First identified in area F5 of the ventral premotor cortex of macaques, mirror neurons respond both when the animal reaches to make a grasping movement with the hand, and when it observes another individual making the same movement. Later research revealed a more general mirror system, encompassing parietal and temporal regions as well as frontal ones (Rizzolatti & Sinigaglia 2010). This system is largely homologous with language areas in humans, including Broca’s and Wernicke’s areas, except that in most people the language circuit is largely restricted to the left cerebral hemisphere. This suggests a scenario in which the mirror system was extended in the course of evolution to incorporate vocal control, and lateralized to the left. A possible role for the mirror system in language evolution has been elaborated by Arbib (2005, 2006), and taken as strong evidence that language evolved from manual actions (see also Corballis 2010).

As Kendon (2011) notes, however, the role of the mirror system is often predicated on the understanding that it mediates imitation, seen as a prerequisite to the development of language, whether in ontogeny or phylogeny. But macaque monkeys, in which mirror neurons were first discovered, seem to be incapable of imitation – although Kendon does refer to more recent evidence from the volume by Abry et al. (2009) that infant macaques can imitate facial gestures such as tongue protrusion and lip-smacking. While this may seem to support the vocal origins of language, other evidence shows that mirror neurons in the macaque do not respond to vocal calls, although they do respond to the sounds produced by manual actions, such as tearing paper or cracking nuts (Kohler et al. 2002). The deafness of mirror neurons to vocalization, and their evident specialization for manual action, again suggest a priority for gestural origin, perhaps with manual and facial gestures both preceding vocalization.

Nevertheless, Hickok (2009) raises no fewer than eight objections to the idea that mirror neurons mediate action understanding, or that they bear on language evolution. While Rizzolatti and Sinigaglia (2010) mount a spirited defence of the role of mirror neurons, it seems clear that the mirror neuron story is far from established.

A more limited role for mirror neurons, however, is suggested by Hickok and Poeppel (2007). They write specifically in relation to the motor theory of speech perception, which holds that speech sounds are perceived in terms of how they are produced, rather than in terms of their acoustic characteristics (e.g. Galantucci et al. 2006) – a theory that effectively anticipated the discovery of mirror neurons, but in the context of vocal utterance rather than manual gesture.




As Hickok and Poeppel point out, speech perception can occur in the absence of speech production, as in human infants or people with expressive aphasia. As mentioned earlier, even chimpanzees, bonobos and dogs seem able to understand spoken words. Extending Hickok and Poeppel’s argument, then, the main role of the mirror system may lie not in imitation, but rather in the coordination of production with perception, whether manually or vocally. Mirror neurons are involved in the visuo-manual coordination required for reaching and grasping, and also, at least in humans, in accurate “reaching” of auditory targets in the production of speech. Their primary role, then, may lie in the mapping of one’s own perceived movements onto production, with the mapping of the perceived movements of others playing a lesser role. In nonhuman primates the role seems to be largely if not exclusively manual, with vocal production and perception relegated to lower-order mechanisms with at best limited intentional control, as discussed earlier. Insofar as language is intentional and learnable, this can be taken as evidence that language itself grew out of a system that evolved initially for manual control.

The argument from incredulity

In his 1996 book River Out of Eden, Richard Dawkins suggested that opposition to Darwin’s theory of natural selection arose from incredulity. People simply could not believe that humans could have evolved from earlier forms – although one might argue that the belief that humans were created by a deity is equally unbelievable. In discussing the issue of language evolution with Adam Kendon, my sense was that he had difficulty believing that language could ever have taken a different form, at least with respect to the relative contributions of gesture and vocalization. A similar argument is implicit in an objection to gestural theory raised by Burling (2005: 123):

… the gestural theory has one nearly fatal flaw. Its sticking point has always been the switch that would have been needed to move from a visual language to an audible one.

Of course, to Burling the issue was more stark, a question of whether language could have shifted from a purely visual form to a purely vocal one. To Kendon, the issue is more subtle, since in his view gesture was always part of the system. Even so, he seemed unwilling to concede that language could ever have lacked the vocal component (even though sign languages are entirely gestural). The argument from incredulity is perhaps also supported by considerations of parsimony: why posit a change in modality if the evidence does not strongly support it?


In my view the argument is alleviated if speech itself is considered a gestural system, rather than an acoustically based one. This idea is also conveyed by the motor theory of speech perception, considered earlier, in which speech is even perceived in terms of how it is produced rather than in terms of its acoustic structure. Although this has been questioned by Hickok and Poeppel (2007), it is still the case that speech comprises gestural movements of the lips, the larynx, the velum and the blade, body and root of the tongue (Studdert-Kennedy 2005). Consequently, the switch can be considered a gradual one, in which the gestures shift from manual to vocal, with both still involved in everyday communication.

Facing facts

The notion of a switch from manual to vocal can be further softened by supposing that the face itself played an intermediary role. In primates, hand and mouth are linked through the act of bringing food to the mouth, involving synchronization of mouth movements with movement of the hand. Indeed, Kendon cites the chapter by MacNeilage, as well as MacNeilage (2008), to the effect that speech evolved from the masticatory movements of the mouth when engaged in chewing, perhaps forming the basis for the sharp acoustic contrasts that make up speech. To MacNeilage, this theory gives priority to speech over gesture.

This theory, though, seems to neglect the concomitant use of manual gesture, and indeed hand and mouth movements are closely coordinated in primates, as well as in some other mammals such as squirrels and raccoons, in the use of the forelimbs to bring food to the mouth. The work of Maurizio Gentilucci on humans shows a close correspondence between hand and mouth movements during speech itself (Gentilucci 2003; Gentilucci et al. 2006). If language did indeed derive from the mechanisms of eating, then there is a case for supposing that movements of hand and mouth were equal partners. Movements of the mouth, though, do not involve vocalization, so their possible role suggests a precedence of gestures over vocalizations, albeit facial gestures rather than manual ones. Of course we can communicate quite well, albeit only at close range, by whispering, and it is conceivable that language evolved from movements of the hands and mouth in whispered combination, with voicing gradually introduced. That scenario, though, still suggests a priority for gesture, as perceived visually rather than through hearing.

Facial gestures play an important role in sign languages (Emmorey 2002), and even normal speech retains a visible component. This is illustrated by the McGurk effect: a syllable (such as da) is dubbed onto a mouth saying another syllable (such as ba), and people tend to “hear” what they see rather than what was actually voiced (McGurk & MacDonald 1976).




Other studies show that the parts of the brain involved in producing speech are activated when people simply watch silent videos of people speaking (Calvert & Campbell 2003; Watkins et al. 2003). Ventriloquists know the power of vision over what we hear when they project their own voices onto the face of a dummy by synchronizing the mouth movements of the dummy with their own tight-lipped utterances.

Speech itself can be considered an extension of facial gestures, in which the gestures are partly contained within the mouth, and are for the most part invisible, although deaf people can demonstrate an impressive ability to lip-read. Voicing can then be understood as an added device to make mouth gestures accessible through sound rather than sight. Speech is effectively facial gesture half swallowed, with added sound. To some extent, speech can be heard without voicing itself, not only through whispering but also through a capacity to produce sound without vocalization. MacNeilage (1998) drew attention to the similarity between human speech and the ability of primates to make sound-producing facial gestures such as lip smacks, tongue smacks, and teeth chatters. Even though vocalizations do not seem to activate the mirror system in nonhuman primates, nonvocal facial movements may well do so; Ferrari et al. (2003) recorded discharge both from mirror neurons in monkeys during the lip smack, which is the most common facial gesture in monkeys, and from other mirror neurons in the same area during mouth movements related to eating.

It is conceivable that the earliest audible languages were click languages, perhaps representing a transition between unvoiced and voiced features. Aside from a now extinct click language in Australia, click languages are confined to Africa. Two of the many groups that make extensive use of click sounds are the Hadzabe and the San, who are separated geographically by some 2,000 kilometers, and genetic evidence suggests that the most recent common ancestor of these groups goes back to the root of present-day mitochondrial DNA lineages, perhaps at least as early as 100,000 years ago (Knight et al. 2003).

An evolutionary scenario

Intentional manual action probably goes far back in primate evolution, and is evident in great apes in the use and manufacture of tools, and to some extent in gestural communication. It is likely that the emergence of bipedalism in hominins enhanced the complexity of gestures as well as of the use and manufacture of tools. The critical phase was probably the Pleistocene, dating from around 2.6 million to about 12,000 years ago. The Pleistocene saw the beginning of a tripling of brain size, the emergence of stone tools, and obligate bipedalism replacing the more facultative bipedalism of the earlier hominins.


The Pleistocene is also widely recognized as the era in which hominins came to occupy what has been termed the “cognitive niche” (Tooby & DeVore 1987), depending on social bonding and enhanced communication for survival in the more exposed and dangerous environment of the African savanna. It seems highly likely, then, that grammatical language evolved gradually during this era, rather than late and abruptly, as implied by Chomsky (2010) and others.

As Donald (1991) argued, language probably emerged from pantomime, using the body to mimic space-time activities and so convey them to a watching audience. Pantomime, though, is inefficient, and over the course of the Pleistocene the pressure toward a more efficient and compact system may have driven the process of conventionalization. Iconic or pantomimic gestures were replaced by simpler signals whose meanings were acquired through association rather than through pictorial representation. Meaning is then carried through cultural transmission, rather than in the signal itself. Such signals might be described as arbitrary symbols, but their arbitrary nature arose from practicality rather than from some newfound cognitive capacity – recall that chimpanzees and border collies can learn to associate spoken words with objects and actions.

Speech itself might be regarded as an end result of progressive conventionalization, and carries advantages over manual gesture. These advantages include the ability to communicate in the dark, the freeing of the hands for simultaneous actions such as carrying objects and demonstrating skills, the greatly reduced energy cost, and enhanced attention-seeking (Corballis 2009). From a strictly linguistic perspective, though, signed languages are as effective as speech, and manual gestures are effective accompaniments to spoken discourse. In these respects, at least, speech and gesture are indeed equal partners.

Conclusions

In this chapter, with Carrollian obsession, I have corralled the evidence to favor, albeit fairly marginally, the gesture-first scenario. In the course of doing so, however, I discovered that the issue is not as stark as it once seemed. This is partly because language itself is complex. Insofar as it is a system of communication through vocal sounds, there are of course precedents in animal calls throughout the animal kingdom, including birds and insects. Apes and dogs can even understand human speech, albeit not to the extent that we can share stories or gossip about the neighbors. In these respects, vocal communication was indeed a precursor, even shaping the vocal tract and subcortical components of vocal control well before language was invented. In these respects, too, vocalization was at least an equal partner, if not a precursor also, of bodily gesture.




The force of the gesture-first argument, where we understand gesture to mean visible manual gesture, lies in the productive aspect of language. In nonhuman primates, vocalization is very largely involuntary, with at best only limited intentional control. Human speech, in contrast, is a voluntary system, with a grammatical and vocal complexity that seems to take it well beyond the realms of animal calls. It is the limbs, not the voice, that evolved for intentional action, first enabling movement through the environment; later, especially through our primate heritage, the hands in particular also evolved for manipulation. This development probably occurred largely as a consequence of adaptation to life in the trees. The primate hand is adapted to clutching branches, plucking fruit, catching insects, bringing food to the mouth and making interindividual contact through grooming and touching. In the more terrestrial great apes, these adaptations later led to making and using tools, and even to gestural communication. Our manual heritage provided a natural platform for the subsequent emergence of intentional two-way communication.

Yet great apes do not possess the capacity for language in the human sense; they do not tell stories, gossip or use language to instil knowledge in others. As noted earlier, Chomsky (2010) has asserted that true language evolved only in Homo sapiens, and only within the past 100,000 years. In this he is supported by a number of archaeologists, such as Tattersall, who recently wrote as follows:

Our ancestors made an almost unimaginable transition from a non-symbolic, nonlinguistic way of processing information and communicating information about the world to the symbolic and linguistic condition we enjoy today. It is a qualitative leap in cognitive state unparalleled in history. Indeed, as I’ve said, the only reason we have for believing that such a leap could ever have been made, is that it was made. And it seems to have been made well after the acquisition by our species of its distinctive modern form. (Tattersall 2012: 199)

In this surprisingly prevalent view, any precursors to language were trivial and uninteresting – it was the emergence of symbolic communication and what has been termed universal grammar that created an entirely new form of communication, in which neither gesture nor voicing played a critical role. They merely provided alternative means through which our new-found symbolic prowess could be communicated, and indeed may have been equal partners.

I suggested above that the more likely scenario is that language evolved during the Pleistocene, and that the balance between manual and vocal expression may well have shifted during that era. But it was most likely also during that era that language gained the complexity that we do not see in great apes. In this view, grammatical language was the preserve of the genus Homo, but with the extinction of our Homo cousins it remains a uniquely human treasure.


Among extant apes, only humans have the capacity to tell stories, gossip and use language to instil knowledge. One of the properties of language is what linguists call displacement; language seems exquisitely and perhaps uniquely shaped to enable us to communicate about the nonpresent. Most of our utterances have to do with other times (such as the middle of next week) or other places (such as Lund, Sweden), or ideas that exist in our heads and not in the sentient environment (such as the ideas I am groping for in this chapter). On that note, then, I leave the final word to Lewis Carroll:

He thought he saw a Rattlesnake
That questioned him in Greek:
He looked again, and found it was
The Middle of Next Week.
“The one thing I regret,” he said,
“Is that it cannot speak!”

from The Gardener’s Song (Carroll 1889: 83)

References

Abry, Christian, Vilain, Anne, and Schwartz, Jean-Luc (eds). 2009. Vocalize to Localize. Amsterdam: John Benjamins. DOI: 10.1075/bct.13
Arbib, Michael A. 2005. “From monkey-like action recognition to human language: An evolutionary framework for neurolinguistics.” Behavioral and Brain Sciences 28: 105–168.
Arbib, Michael A. 2006. “The mirror system and the linkage of action and language.” In Action to Language via the Mirror Neuron System, Michael A. Arbib (ed.), 3–47. Cambridge: Cambridge University Press. DOI: 10.1017/CBO9780511541599.002
Barceló-Coblijn, Lluis. 2011. “A biolinguistic approach to vocalizations of H. neanderthalensis and the genus Homo.” Biolinguistics 5: 286–334.
Barney, Anna, Martelli, Sandra, Serrurier, Antoine, and Steele, James. 2012. “Articulatory capacity of Neanderthals, a very recent and human-like fossil hominin.” Philosophical Transactions of the Royal Society B 367: 88–102. DOI: 10.1098/rstb.2011.0259
Bauer, Richard H. 1993. “Lateralization of neural control for vocalization by the frog (Rana pipiens).” Psychobiology 21: 243–248.
Biro, Dora, Sousa, Claudia, and Matsuzawa, Tetsuro. 2006. “Ontogeny and cultural propagation of tool use by wild chimpanzees at Bossou, Guinea: Case studies in nut cracking and leaf folding.” In Cognitive Development in Chimpanzees, Tetsuro Matsuzawa, Masaki Tomonaga, and Masayuki Tanaka (eds), 476–508. Tokyo: Springer-Verlag. DOI: 10.1007/4-431-30248-4_28
Boë, Louis-Jean, Heim, Jean-Louis, Honda, Kiyoshi, Maeda, Shinji, Badin, Pierre, and Abry, Christian. 2007. “The vocal tract of newborn humans and Neanderthals: Acoustic capabilities and consequences for the debate on the origin of language. A reply to Lieberman (2007a).” Journal of Phonetics 35: 564–581. DOI: 10.1016/j.wocn.2007.06.006
Boesch, Christophe. 1991. “Handedness in wild chimpanzees.” International Journal of Primatology 12: 541–558. DOI: 10.1007/BF02547669


Boesch, Christophe, Head, Josephine, and Robbins, Martha M. 2009. “Complex tool sets for honey extraction among chimpanzees in Loango National Park, Gabon.” Journal of Human Evolution 56: 560–569. DOI: 10.1016/j.jhevol.2009.04.001
Bogart, Stephanie L., and Pruetz, Jill D. 2008. “Ecological context of savanna chimpanzee (Pan troglodytes verus) termite fishing at Fongoli, Senegal.” American Journal of Primatology 70: 605–612. DOI: 10.1002/ajp.20530
Burling, Robbins. 1999. “Motivation, conventionalization, and arbitrariness in the origin of language.” In The Origins of Language: What Nonhuman Primates Can Tell Us, Barbara J. King (ed.), 307–350. Santa Fe, NM: School of American Research Press.
Burling, Robbins. 2005. The Talking Ape. New York: Oxford University Press.
Calvert, Gemma A., and Campbell, Ruth. 2003. “Reading speech from still and moving faces: The neural substrates of visible speech.” Journal of Cognitive Neuroscience 15: 57–70. DOI: 10.1162/089892903321107828
Carroll, Lewis. 1889. Sylvie and Bruno. London: Macmillan.
Cheney, Dorothy L., and Seyfarth, Robert M. 1990. How Monkeys See the World. Chicago: University of Chicago Press.
Cheney, Dorothy L., and Seyfarth, Robert M. 2005. “Constraints and preadaptations in the earliest stages of language evolution.” The Linguistic Review 22: 135–159. DOI: 10.1515/tlir.2005.22.2-4.135
Chomsky, Noam. 2010. “Some simple evo devo theses: How true might they be for language?” In The Evolution of Human Language, Richard K. Larson, Viviane Déprez, and Hiroko Yamakido (eds), 45–62. Cambridge: Cambridge University Press. DOI: 10.1017/CBO9780511817755.003
Corballis, Michael C. 2002. From Hand to Mouth: The Origins of Language. Princeton, NJ: Princeton University Press.
Corballis, Michael C. 2003. “From mouth to hand: Gesture, speech, and the evolution of right-handedness.” Behavioral and Brain Sciences 26: 198–208.
Corballis, Michael C. 2009. “The evolution of language.” Annals of the New York Academy of Sciences 1156: 19–43. DOI: 10.1111/j.1749-6632.2009.04423.x
Corballis, Michael C. 2010. “Mirror neurons and the evolution of language.” Brain and Language 112: 25–35. DOI: 10.1016/j.bandl.2009.02.002
Corballis, Michael C., and Beale, Ivan L. 1976. The Psychology of Left and Right. Hillsdale, NJ: Lawrence Erlbaum.
Dawkins, Richard. 1996. River Out of Eden: A Darwinian View of Life. New York: Basic Books.
Donald, Merlin. 1991. Origins of the Modern Mind. Cambridge, MA: Harvard University Press.
Dunbar, Robin I. M. 1998. Grooming, Gossip, and the Evolution of Language. Cambridge, MA: Harvard University Press.
Egnor, S. E., and Hauser, Mark D. 2004. “A paradox in the evolution of primate vocal learning.” Trends in Neurosciences 27: 649–654. DOI: 10.1016/j.tins.2004.08.009
Emmorey, Karen. 2002. Language, Cognition, and Brain. Hillsdale, NJ: Lawrence Erlbaum.
Enard, Wolfgang, Przeworski, Molly, Fisher, Simon E., Lai, Cecilia S., Wiebe, Victor, Kitano, Takashi et al. 2002. “Molecular evolution of FOXP2, a gene involved in speech and language.” Nature 418: 869–872. DOI: 10.1038/nature01025
Evans, Nicholas. 2009. Dying Words: Endangered Languages and What They Have to Tell Us. Oxford: Wiley-Blackwell.


Ferrari, Pier F., Gallese, Vittorio, Rizzolatti, Giacomo, and Fogassi, Leonardo. 2003. “Mirror neurons responding to the observation of ingestive and communicative mouth actions in the monkey ventral premotor cortex.” European Journal of Neuroscience 17: 1703–1714. DOI: 10.1046/j.1460-9568.2003.02601.x
Fisher, Simon E., Vargha-Khadem, Faraneh, Watkins, Katie E., Monaco, Anthony P., and Pembrey, Marcus E. 1998. “Localization of a gene implicated in a severe speech and language disorder.” Nature Genetics 18: 168–170. DOI: 10.1038/ng0298-168
Foundas, Anne L., Leonard, Christiana M., Gilmore, Robin L., Fennell, Eileen B., and Heilman, Kenneth M. 1996. “Pars triangularis asymmetry and language dominance.” Proceedings of the National Academy of Sciences, USA 93: 719–722. DOI: 10.1073/pnas.93.2.719
Galantucci, Bruno, Fowler, Carol A., and Turvey, Michael T. 2006. “The motor theory of speech perception reviewed.” Psychonomic Bulletin and Review 13: 361–377. DOI: 10.3758/BF03193857
Gannon, Patrick J., Holloway, Ralph L., Broadfield, Douglas C., and Braun, Allen R. 1998. “Asymmetry of chimpanzee planum temporale: Human-like brain pattern of Wernicke’s area homolog.” Science 279: 220–221. DOI: 10.1126/science.279.5348.220
Gardner, R. Allen, and Gardner, Beatrice T. 1969. “Teaching sign language to a chimpanzee.” Science 165: 664–672. DOI: 10.1126/science.165.3894.664
Gentilucci, Maurizio. 2003. “Grasp observation influences speech production.” European Journal of Neuroscience 17: 179–184. DOI: 10.1046/j.1460-9568.2003.02438.x
Gentilucci, Maurizio, Bernardis, Paolo, Crisi, Girolamo, and Dalla Volta, Ricardo. 2006. “Repetitive transcranial stimulation of Broca’s area affects verbal responses to gesture observation.” Journal of Cognitive Neuroscience 18: 1059–1074. DOI: 10.1162/jocn.2006.18.7.1059
Goodall, Jane. 1986. The Chimpanzees of Gombe: Patterns of Behaviour. Cambridge, MA: Harvard University Press.
Green, Richard E., Krause, Johannes, Briggs, Adrian W., Maricic, Tomislav, Stenzel, Udo, Kircher, Martin et al. 2010. “A draft sequence of the Neanderthal genome.” Science 328: 710–722. DOI: 10.1126/science.1188021
Güven, Mehmet, Elalmis, Derya D., Binokay, Secil, and Tan, Uner. 2003. “Population right-paw preference in rats assessed by a new computerised food-reaching test.” International Journal of Neuroscience 113: 1691–1705. DOI: 10.1080/00207450390249258
Hayes, Catherine. 1952. The Ape in Our House. London: Gollancz.
Hewes, Gordon W. 1973. “Primate communication and the gestural origins of language.” Current Anthropology 14: 5–24. DOI: 10.1086/201401
Hickok, Gregory S. 2009. “Eight problems for the mirror neuron theory of action understanding in monkeys and humans.” Journal of Cognitive Neuroscience 21: 1229–1243. DOI: 10.1162/jocn.2009.21189
Hickok, Gregory S., and Poeppel, David. 2007. “The cortical organization of speech processing.” Nature Reviews Neuroscience 8: 395–402. DOI: 10.1038/nrn2113
Hobaiter, Catherine, and Byrne, Richard W. 2011. “Serial gesturing by wild chimpanzees: Its nature and function for communication.” Animal Cognition 14: 827–838. DOI: 10.1007/s10071-011-0416-3
Hopkins, William D., and Leavens, David A. 1998. “Hand use and gestural communication in chimpanzees (Pan troglodytes).” Journal of Comparative Psychology 112: 95–99. DOI: 10.1037/0735-7036.112.1.95


Hopkins, William D., Marino, Lori, Rilling, James K., and MacGregor, Leslie A. 1998. “Planum temporale asymmetries in great apes as revealed by magnetic resonance imaging (MRI).” NeuroReport 9: 2913–2918. DOI: 10.1097/00001756-199808240-00043
Hopkins, William D., and Nir, Talia M. 2010. “Planum temporale surface area and grey matter asymmetries in chimpanzees (Pan troglodytes): The effect of handedness and comparison with findings in humans.” Behavioural Brain Research 208: 436–443. DOI: 10.1016/j.bbr.2009.12.012
Hopkins, William D., Taglialatela, Jared P., and Leavens, David A. 2007. “Chimpanzees differentially produce novel vocalizations to capture the attention of a human.” Animal Behaviour 73: 281–286. DOI: 10.1016/j.anbehav.2006.08.004
Johansson, Sverker. 2013. “The talking Neanderthals: What do fossils, genetics, and archeology say?” Biolinguistics 7: 35–74.
Jürgens, Uwe. 2002. “Neural pathways underlying vocal control.” Neuroscience and Biobehavioral Reviews 26: 235–258. DOI: 10.1016/S0149-7634(01)00068-9
Kaminski, Juliane, Call, Josep, and Fischer, Julia. 2004. “Word learning in a domestic dog: Evidence for ‘fast mapping’.” Science 304: 1682–1683. DOI: 10.1126/science.1097859
Kellogg, Winthrop N., and Kellogg, Luella A. 1933. The Ape and the Child: A Study of Early Environmental Influence upon Early Behaviour. New York: McGraw-Hill.
Kendon, Adam. 1980. “Gesticulation and speech: Two aspects of the process of utterance.” In The Relationship of Verbal and Nonverbal Communication, Mary Ritchie Key (ed.), 207–228. The Hague: Mouton.
Kendon, Adam. 2004. Gesture: Visible Action as Utterance. Cambridge: Cambridge University Press.
Kendon, Adam. 2011. “Vocalisation, speech, gesture, and the language origins debate.” Gesture 11: 349–370. DOI: 10.1075/gest.11.3.05ken
Knight, Alec, Underhill, Peter A., Mortensen, Holly M., Zhivotovsky, Lev A., Lin, Alice A., Henn, Brenna M. et al. 2003. “African Y chromosome and mtDNA divergence provides insight into the history of click languages.” Current Biology 13: 464–473. DOI: 10.1016/S0960-9822(03)00130-1
Kohler, Evelyne, Keysers, Christiane, Umiltà, M. Alessandra, Fogassi, Leonardo, Gallese, Vittorio, and Rizzolatti, Giacomo. 2002. “Hearing sounds, understanding actions: Action representation in mirror neurons.” Science 297: 846–848. DOI: 10.1126/science.1070311
Krause, Johannes, Lalueza-Fox, Carles, Orlando, Ludovic, Enard, Wolfgang, Green, Richard E., Burbano, Hernàn A. et al. 2007. “The derived FOXP2 variant of modern humans was shared with Neanderthals.” Current Biology 17: 1908–1912. DOI: 10.1016/j.cub.2007.10.008
Ladygina-Kohts, Nadezhda N. 2002. Infant Chimpanzee and Human Child. Oxford: Oxford University Press. (Translated from the 1935 Russian version by Boris Vekker).
Lear, Edward. 1894. A Book of Nonsense. Boston: Roberts Brothers. (First published 1842).
Lieberman, Philip. 2007. “The evolution of human speech.” Current Anthropology 48: 39–46. DOI: 10.1086/509092
Lieberman, Philip, Crelin, Edmund S., and Klatt, Dennis H. 1972. “Phonetic ability and related anatomy of the new-born, adult human, Neanderthal man, and the chimpanzee.” American Anthropologist 74: 287–307. DOI: 10.1525/aa.1972.74.3.02a00020
Lieberman, Daniel E. 1998. “Sphenoid shortening and the evolution of modern cranial shape.” Nature 393: 158–162. DOI: 10.1038/30227


Lieberman, Daniel E., McBratney, Brandeis M., and Krovitz, Gail. 2002. “The evolution and development of cranial form in Homo sapiens.” Proceedings of the National Academy of Sciences 99: 1134–1139. DOI: 10.1073/pnas.022440799
Lonsdorf, Elizabeth V., and Hopkins, William D. 2005. “Wild chimpanzees show population-level handedness for tool use.” Proceedings of the National Academy of Sciences (USA) 102: 12634–12638. DOI: 10.1073/pnas.0505806102
MacNeilage, Peter F. 1998. “The frame/content theory of evolution of speech.” Behavioral and Brain Sciences 21: 499–546.
MacNeilage, Peter F. 2008. The Origin of Speech. Oxford: Oxford University Press.
Marshall, Andrew J., Wrangham, Richard W., and Arcadi, Adam C. 1999. “Does learning affect the structure of vocalizations in chimpanzees?” Animal Behaviour 58: 825–830. DOI: 10.1006/anbe.1999.1219
McGurk, Harry, and MacDonald, John. 1976. “Hearing lips and seeing voices.” Nature 264: 746–748. DOI: 10.1038/264746a0
Meguerditchian, Adrien, and Vauclair, Jacques. 2006. “Baboons communicate with their right hand.” Behavioural Brain Research 171: 170–174. DOI: 10.1016/j.bbr.2006.03.018
Meyer, Matthias, Kircher, Martin, Gansauge, Marie-Theres, Li, Heng, Racimo, Fernando, Mallick, Swapan et al. 2012. “A high-coverage genome sequence from an archaic Denisovan individual.” Science 338: 222–226. DOI: 10.1126/science.1224344
Patterson, Francine G. P., and Gordon, Wendy. 2001. “Twenty-seven years of project Koko and Michael.” In All Apes Great and Small, Vol. 1: African Apes, Biruté M. F. Galdikas, Nancy E. Briggs, Lori K. Sheeran, Gary L. Shapiro, and Jane Goodall (eds), 165–176. New York: Kluwer.
Petkov, Christopher I., and Jarvis, Erich D. 2012. “Birds, primates, and spoken language origins: Behavioral phenotypes and neurobiological substrates.” Frontiers in Evolutionary Neuroscience 4: article 12. DOI: 10.3389/fnevo.2012.00012
Pika, Simone, and Mitani, John C. 2009. “The directed scratch: Evidence for a referential gesture in chimpanzees?” In The Prehistory of Language, Rudolf Botha and Chris Knight (eds), 167–177. Oxford: Oxford University Press.
Pilley, John W., and Reid, Alliston K. 2011. “Border collie comprehends object names as verbal referents.” Behavioural Processes 86: 184–195. DOI: 10.1016/j.beproc.2010.11.007
Pinker, Steven, and Bloom, Paul. 1990. “Natural language and natural selection.” Behavioral and Brain Sciences 13: 707–784.
Pollick, Amy S., and de Waal, Frans B. M. 2007. “Ape gestures and language evolution.” Proceedings of the National Academy of Sciences 104: 8184–8189. DOI: 10.1073/pnas.0702624104
Premack, David. 2007. “Human and animal cognition: Continuity and discontinuity.” Proceedings of the National Academy of Sciences (USA) 104: 13861–13867. DOI: 10.1073/pnas.0706147104
Pruetz, Jill D., and Bertolani, Paco. 2007. “Savanna chimpanzees, Pan troglodytes verus, hunt with tools.” Current Biology 17: 412–417. DOI: 10.1016/j.cub.2006.12.042
Ptak, Susan E., Enard, Wolfgang, Wiebe, Victor, Hellmann, Ines, Krause, Johannes, Lachmann, Michael et al. 2012. “Linkage disequilibrium extends across putative selected sites in FOXP2.” Molecular Biology and Evolution 26: 2181–2184. DOI: 10.1093/molbev/msp143
Raffaele, P. 2006. “Speaking bonobo.” Smithsonian Magazine, November 2006. Online at: http://www.smithsonianmag.com/science-nature/speakingbonobo.html.


Rizzolatti, Giacomo, and Sinigaglia, Corrado. 2010. “The functional role of the parieto-frontal mirror circuit: Interpretations and misinterpretations.” Nature Reviews Neuroscience 11: 264–274. DOI: 10.1038/nrn2805
Savage-Rumbaugh, Sue, Shanker, Stuart G., and Taylor, Talbot J. 1998. Apes, Language, and the Human Mind. New York: Oxford University Press.
Slocombe, Katie E., Kaller, Tanya, Call, Josep, and Zuberbühler, Klaus. 2010. “Chimpanzees extract social information from agonistic screams.” PLoS ONE 5: e11473. DOI: 10.1371/journal.pone.0011473
Slocombe, Katie E., and Zuberbühler, Klaus. 2005. “Functionally referential communication in a chimpanzee.” Current Biology 15: 1779–1784. DOI: 10.1016/j.cub.2005.08.068
Stout, Dietrich, and Chaminade, Thierry. 2011. “Stone tools, language and the brain in human evolution.” Philosophical Transactions of the Royal Society B 367: 75–87. DOI: 10.1098/rstb.2011.0099
Studdert-Kennedy, Michael. 2005. “How did language go discrete?” In Language Origins: Perspectives on Evolution, Maggie Tallerman (ed.), 48–67. Oxford: Oxford University Press.
Tattersall, Ian. 2012. Masters of the Planet: The Search for Human Origins. New York: Palgrave Macmillan.
Tooby, John, and DeVore, Irven. 1987. “The reconstruction of hominid evolution through strategic modeling.” In The Evolution of Human Behavior: Primate Models, Warren G. Kinzey (ed.), 183–238. Albany, NY: SUNY Press.
Wada, Juhn A., Clarke, Robert, and Hamm, Anne. 1975. “Cerebral hemispheric asymmetry in humans.” Archives of Neurology 32: 239–246. DOI: 10.1001/archneur.1975.00490460055007
Watkins, Katie E., Strafella, Antonio P., and Paus, Tomas. 2003. “Seeing and hearing speech excites the motor system involved in speech production.” Neuropsychologia 41: 989–994. DOI: 10.1016/S0028-3932(02)00316-0
Zuberbühler, Klaus, Arnold, Kate, and Slocombe, Katie. 2011. “Living links to human language.” In Primate Communication and Human Language, Anne Vilain, Jean-Luc Schwartz, Christian Abry, and Jacques Vauclair (eds), 13–38. Amsterdam: John Benjamins.
