The Critical Period in Language Acquisition: Nature vs. Nurture Redux

Among those who veer toward the nature end of the nature vs. nurture debates, it is axiomatic that the acquisition of language must be achieved during the “critical period” of a child’s development and that a child’s innate ability to learn not only his native language but one or more foreign ones as well (to native fluency) is essentially shut down at puberty.  For adults, on this view, learning a foreign language is extremely difficult largely because the portions of the brain involved in language become closed off, fixed in the patterns programmed before puberty’s onset.  Many authors, including Steven Pinker in his well-known book The Language Instinct (1994), suggest that this early facility for learning language, and its subsequent loss, can be explained in evolutionary, and particularly in selective, terms.

That these ideas are still widely held is demonstrated by an article by Matt Ridley recently published in the Wall Street Journal (“Where the Wild Things Learn Language,” 24 September 2011).  Ridley points to two famous “wild children,” Kaspar Hauser and a girl named Genie in California, who, having been isolated during the critical period, failed to learn language adequately once they were provided a normal social environment.  Genie was able to acquire a fairly good vocabulary, but both children were unable to master grammar and syntax, despite extensive efforts on their behalf by well-intentioned caretakers.  Ridley also draws a comparison to young songbirds that, if deprived of exposure to their species’ song, fail to sing properly as adults (Pinker also cites this example, as do many others).  Ridley offers some suggestions as to a molecular basis for the critical period, though in language indicating that these do not yet amount to a sufficient explanation.

As mentioned above, the critical period idea is firmly entrenched in conventional wisdom—which is exactly why I wish to question it.  Conventional wisdom is all too often partially or totally wrong.

First, the cases of Kaspar Hauser and Genie leave much to be desired as examples.  Kaspar was discovered, as Ridley notes, in Nuremberg, Germany, in 1828, at age 16, claiming to have been kept isolated in a dark room his entire life.  However, there was no evidence corroborating his stories (which he elaborated further over time), no evidence that he could not speak reasonably well, and some evidence that he was a con artist.  Whatever the truth of Hauser’s life, he clearly is not evidence for a critical period for language acquisition.  Genie, the California girl kept locked in her room for her first 13 years by a shockingly abusive father (who also abused his wife and son), was nearly catatonic when her mother, who had finally left the house, took her to social services.  In subsequent years she was closely observed by a team of linguists, who found that she was able to acquire many words but not grammar.  They also noted that she was backward in many other ways, both cognitively and physically.  Genie, too, is not helpful evidence of a critical period for language acquisition, both because it is not known whether she was a normal baby at birth and because her deprivations were so complete that her brain could not have developed normal cognitive functions even if she had begun life as a normal infant.  As evidence of a critical period, both Hauser and Genie are of dubious value.

That leaves only the example of young songbirds, but this, too, is not a particularly good one, because bird song, while communicative, is not language (birds do not discuss ideas, for example) and is stereotypical rather than generative.  Even birds that are good at imitating others, such as mockingbirds or parrots, do so by rote, repeating what they hear; they do not generate novel sentences or create neologisms to express novel ideas.

In order, therefore, to understand human language acquisition, we must turn to an examination of human beings, beginning with the historical record.  Although I do not know of anyone who has written a book or an extended article on the history of human beings’ acquisition of language(s), I have been struck, as I read European history, by the number of people in that history who spoke multiple languages (as in fact many Europeans continue to do today) and by how readily adults seem to have learned new languages that were, to them, quite strange.  For example, long before Columbus sailed to the New World, Italian and especially Venetian merchants had travelled far from home into the Near East and in the process became fluent in Arabic, Persian, and other languages of the peoples they traded with.  Marco Polo (c. 1254–1324) left Venice with his father and uncle when he was 17 (i.e., post-pubescent) and travelled as far as China, where he served in the royal court.  In his memoir (dictated to someone else during a year in a Genoese prison), he is said to have mastered four languages during his sojourn (most likely Mongolian, Persian, Arabic, and Turkic).  Employees of the East India Company frequently learned the languages of India and of other ports of call as a necessity of doing business, and none of them were children.  So also, early explorers and conquerors of the New World learned the Native languages, and the Natives often learned Spanish, French, or English, depending on which European nation was currently encroaching on their territories.  Let us not forget the example of Doña Marina, Cortés’s mistress and interpreter, who knew more than one Native language as well as Spanish.

How do we account for the apparent ease with which adults of the past learned new languages, languages often very different from their own not only in vocabulary but in grammar?  (How different Navajo is from English, for example!)  Certainly part of the answer lies in necessity, which provides considerable motivational force.  But I think part of the answer also lies in the way people normally lived in past times.  Their social, and therefore their linguistic, experience was almost entirely (and for many, exclusively) oral and face-to-face, much more so than it is for us today.  Even reading, interestingly, was in many cases actually oral; our words “lecture” and “lecturer” have the root meaning of reading aloud.  Because books were expensive and rare until recent historical times, lecturers read from them to audiences, whether listeners engaged in other activities, such as dining, or rooms full of students (thus the lectern, which is designed to hold a book open in front of the lecturer).  Even after the invention of the moveable-type printing press and the availability of cheap mass-produced books (and before the days of radio and television), families gathered by the hearth, one member reading from a book while the others sewed, smoked a pipe, or engaged in other routine tasks.

It should also be noted that in Medieval and Renaissance times, education was largely in languages, with a heavy emphasis on rhetoric, grammar, and speaking.  Imagine an American elementary school student today reciting his Latin homework to his teacher on a daily basis!  Yet that is what many schoolboys once did.  Even for the uneducated, exposure to different languages, and certainly to different dialects of their own language, would have been a routine experience; “Italy” and “France,” etc., were not nation-states with an agreed-upon national language until historically recent times but were instead assortments of duchies, kingdoms, city-states, and autonomous or conquered regions, each with its own distinct dialect, sometimes so distinct as to be almost mutually incomprehensible.  (Even today, an English man or woman who’s been around England a bit can tell where a fellow citizen grew up by his or her dialect.)  Thus, people in the past were aurally attuned to subtle differences in vocal sounds, and their ears (and brains) would have been continually primed for additional new sounds, so that what may sound like gibberish to us might well have resolved for them rather quickly into identifiable sounds and words.

It appears that today’s conventional wisdom that the critical period, or window of opportunity, closes around the time of puberty may be a contemporary anomaly having more to do with Americans’ lack of interest in (and, in most instances, lack of practical need for) foreign language learning than with biological inevitability, making it a cultural, or nurture, phenomenon rather than the innate, or nature, one that conventional wisdom has assumed.  Learning languages may be in the same class as learning how to play the piano or violin, program a computer, or indeed any other skill that is clearly cultural rather than natural (invented and learned rather than selected and innate).  One will search in vain for the genes that program us to invent pianos, violins, computers, keyed locks, air conditioners, and microscopes, and equally in vain for the genes that program the skills to use these devices.  Yet we learn all of them as easily (or with equal difficulty) as we learn languages.  That language learning is far older than these other skills does not make it innate.

Furthermore, that many adults find learning any new skill difficult does not mean that they would have learned that skill more easily, or to a greater degree, had they started as children.  Yes, we can all cite the child prodigies in music and sports, but if we’re honest we can also cite the many more children who, despite years of lessons and support from parents, never become more than mediocre piano or tennis players.  Less often noted (perhaps because they aren’t so unexpected as the critical period idea would suggest) are the teenagers and adults who become adept at a particular skill, especially one that requires higher cognitive abilities for full development.  Think of Henry James’s late novels.  Think also of the grannies and gramps who earn their college degrees at the same time as their grandchildren.

Even if it were true that children learn new skills more easily than adults, that may be because they have no skills to begin with; they must learn something, and they must learn it quickly, in order to survive and thrive.  Adults already have a repertoire of skills that they have found useful, and they may out of custom or habit be disinclined to acquire new ones, even when it’s obvious that new ones would enhance their lives, especially if they have been convinced by the critical period advocates that they are no longer able to do so.

Conventional wisdom tends to be wrong or, worse, outdated.  Such is the case for the critical period idea of language acquisition.  When in doubt, go to the experts: those whose profession it is to research, know, and teach a particular discipline or field of knowledge.  As it happens, there are people whose profession is to research the process of language learning, particularly second language learning (SLL).  According to a website for such professionals, Language Learning Advisor, adults do not have a harder time learning a new language; rather, they learn in a different way than children do.  It’s worth quoting the first paragraph from that website:

What exactly is the relationship between age and language learning? There are numerous myths and misconceptions about the relative abilities or inabilities of language learners of different ages. Do children learn language faster? Is it impossible for adults to achieve fluency? In a word – no. These and other common beliefs are simply not true. Children do not necessarily learn faster than adults and, in fact, adults may learn more efficiently. Furthermore, there is no loss of language ability or language learning ability over time. Age is not a detriment to language learning, and by all accounts, learning a second (or third etc) language actually keeps the older language learner’s mind active. People of all ages can benefit from learning languages.

The website also makes an interesting point about the notion of a critical period:  “The ‘critical period’ hypothesis that was put forth in the 1960’s was based on then-current theories of brain development, and argued that the brain lost ‘cerebral plasticity’ after puberty, making second language acquisition more difficult as an adult than as a child.”  The 1960’s!  That suggests that anyone clinging to the outmoded critical period notion is not only way behind the times but apparently unaware of the tidal wave of research into neural plasticity since then, especially the realization that the human mind retains its ability to “reprogram” itself throughout the lifespan (barring organic brain disease, such as Alzheimer’s).  Even people who have suffered strokes can often train new areas of their brains to take over lost functions.  Read in light of these discoveries, the last sentence of Ridley’s article is truly astonishing:  “The brain is innately designed to be open to experience, but only during a certain period” [italics added].

Given the dubiousness of the critical period idea, one wonders why it continues to be so widely repeated without question.  Perhaps because it serves the purpose of supporting a larger myth: that humans are as subject to instinct as other creatures, even in our most vaunted trait, our ability to think.  But at an even deeper level, what is behind this desire to reduce humans to the animal level?  I’m not sure I know, but: cherchez l’idéologie!

Addendum:  Since posting this article, I re-discovered on my shelves a book by Professor John Edwards titled “Multilingualism.”  Although it is not a history of language acquisition per se, it does contain extensive passages on multilingualism in the past, as well as extensive discussion of SLL in both children and adults.  He points out that the notion of a critical period is questionable and implies that it may be related to an Anglophone bias in many writers; interestingly, he also mentions that many contemporary linguists are monolingual, and certainly, in my view, that is a serious lack in anyone who makes claims about the origin of language and about how language works.  Monolingual English readers can get a sense of the diversity and complexity of languages by reading John McWhorter’s “What Language Is.”

For a New York Times article on polyglots, click here.

For a more recent article on adults and immersion, click here.

Comments

  • T Hill  On November 13, 2011 at 7:45 PM

    Is Henry Kissinger fluent in English? Of course. Does everyone whose native language is English instantly know that Kissinger’s native language is not? Of course.

    I think the ability to learn a language with no accent is what disappears around puberty. To claim the argument is that adults can’t learn a second language is setting up a straw man that is trivial to knock down.

    • William L. Scurrah  On November 13, 2011 at 8:21 PM

      I’m not sure what you’re trying to say.  Nowhere in my article do I say that anyone claims that adults cannot learn a second language.  What I do argue is that the claim that adults cannot learn a second language as easily as children is untrue; the people who hold that view do so because believing in the “critical period” reinforces their view that language is an instinct, a view with which I disagree.
