Colorless Green Ideas Sleep Furiously. Or Maybe Not.

Scientists find physical evidence that the brain understands language hierarchically.
Image: Two older men talking on a city street with a cyclist in the background.
Joel Shurkin, Contributor

(Inside Science) -- A group of neuroscientists using high-tech imaging devices has found physical evidence that the brain arranges words in a hierarchy -- clumped together into phrases and sentences -- so they can be understood even if they make no sense. The work seems to contradict theories that brains just collect one word at a time.

The finding, reported in Nature Neuroscience, plunges the scientists, all of whom are affiliated with New York University, into a long-running and often heated battle among linguists, neuroscientists and psychologists about how humans learn language. They believe they have physical evidence for an internal grammar in the brain. Linguists and neuroscientists do not always agree.

The researchers, none of whom are linguists, studied 34 native Mandarin Chinese speakers and 13 native American English speakers who heard sentences in both languages. The subjects were given words, phrases and sentences spoken flatly, in a steady rhythm and without intonation cues. Their brain activity was monitored by magnetoencephalography, which measures tiny magnetic fields, and by electrocorticography, which is often used to measure brain activity in patients before brain surgery.

"Neurophysiological brain activity tracks properties of speech very faithfully and precisely ... with brain signals," said David Poeppel, a professor of psychology and neuroscience at NYU, and director of the Max Planck Institute for Empirical Aesthetics in Frankfurt, Germany.

The dispute is over whether people pick up meaning from words sequentially (sometimes guessing at what will come next), or whether they have the innate ability to organize words into hierarchies as they hear them.

The most widely accepted theory among neuroscientists is that we use a statistical or probabilistic structure to understand sentences, sometimes anticipating a word. For instance, if you hear: "I take my coffee with cream and," you assume the next word is "sugar." In this theory, you hear the separate words: I. Take. My. Coffee. With. Cream. And. 

"There is a probabilistic relationship between words," Poeppel said. Word anticipation happens anyway, he said, but the question is whether you need that information.

If the next word is unanticipated, such as "socks" ("I take my coffee with cream and socks") there is an error message in the brain.

"The brain goes ding-ding-ding," he said, and that can actually be tracked on the detectors.

Others say language is understood hierarchically, with the brain not only hearing each word but also collecting words into phrases and sentences simultaneously.

That sentence would come out "I take...my coffee...with cream and." We can do that because we have an innate grammar, a theory propounded most famously by the MIT linguist and philosopher Noam Chomsky.
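One way to picture the hierarchical view is as a nested bracketing of the sentence rather than a flat list of words. Below is a minimal, hand-built illustration in Python, not the study's analysis: each word sits inside a phrase, which sits inside a larger phrase, up to the whole sentence.

# Hand-built phrase structure for the example sentence; the labels
# (S, NP, VP, ...) are standard phrase types, but this particular
# bracketing is only illustrative.
sentence = (
    "S",
    ("NP", "I"),
    ("VP",
        ("V", "take"),
        ("NP", ("Det", "my"), ("N", "coffee")),
        ("PP",
            ("P", "with"),
            ("NP", ("N", "cream"), ("Conj", "and"), ("N", "sugar")))),
)

def show(node, depth=0):
    """Print the tree with indentation so the nesting of phrases is visible."""
    if isinstance(node, tuple):
        label, *children = node
        print("  " * depth + label)
        for child in children:
            show(child, depth + 1)
    else:
        print("  " * depth + node)

show(sentence)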

To Chomsky, this grammar is unique to the human brain and comes from a structure in the brain that does only syntax, applying the rules of this internal grammar.

Grammar to a linguist in this area does not refer to rules such as never splitting an infinitive ("to boldly go"), a constraint introduced externally, but to rules innate to the brain that structure language.

Chomsky famously said that, confronted with a sentence such as "colorless green ideas sleep furiously," people know immediately that it makes no sense, but the sentence is grammatically correct in its structure.

"That's a very unpopular view right now," Poeppel said, "and the argument is rather just how much grammar we are born with. Some say very little."

The NYU experiment was designed to test whether the hierarchical view is true.

"We basically created artificial sentences where we shuffle things around just to see if I can manipulate it so you are also tracking the phrases and the sentences the same time," said Poeppel. "What we tried to test in this experiment using these fancy brain scanners is can you actually find evidence that you track on line as you listening to speech, different levels of abstraction, as your understanding goes. The answer in our hands is that the data says yes." 

Shlomo Argamon, a professor of computer science at the Illinois Institute of Technology in Chicago and a linguist who is an expert on teaching computers to process human language, said, "Most linguists would agree that there has to be some sort of hierarchical representation, and that language cannot be processed using a purely sequential method. The only question is the nature of this hierarchical representation, and whether it is processed by a special part of the brain dedicated just to language."

The paper does not identify any part of the brain that does just that and only that, as Chomsky has claimed.

Their evidence does support a hierarchical structure, which argues in favor of there being some sort of abstract categories, Argamon said: "not just structuring single words or single sounds in a sequence, but multi-level structures of sounds comprising words, comprising phrases, comprising clauses, etcetera."

The structure is not just about meaning, Argamon said. Think of the hierarchy as scaffolding and the building as the meaning.

"Sequentially if you lay down a bunch of bricks in sequence and you'll get a building. If you look just at the bricks as a sequence, you won't understand the structure of the building. But if you can infer the scaffolding, which they cannot now see, you can understand that the building has multiple floors, which have multiple rooms and hallways – this is the hidden hierarchical structure. And then this structure allows you to infer how the building is used — analogous to the meaning hidden inside language.

"All you will see is the building, not the scaffolding. That gives you the meaning."

Chomsky called the work a "major contribution."

"What the work shows is that perception of language is based on detection of hierarchical structure, in ways that cannot be reduced to acoustic cues or statistical properties," Chomsky wrote in an email to Inside Science

"It therefore supports conclusions about the nature of language that are strongly supported by linguistic considerations, but that have been challenged in recent approaches (in quite unpersuasive ways, I think, but that is a separate matter)," he wrote.

"Poeppel's work provides quite significant independent confirmation for the results of linguistic research that seem to me to be quite well-established on independent grounds."


Joel Shurkin is a freelance writer in Baltimore who has also taught journalism and science writing.