Computer Deciphers the Brain Signals of Imagined Writing

New approach to brain-computer interfaces more than doubles the speed at which messages can be relayed.
Image: A hand writing with a pen
Charles Q. Choi, Contributor

(Inside Science) -- A man paralyzed below the neck can imagine writing by hand and, with the help of artificial intelligence software, use electronics hooked up to his brain to translate his mental handwriting into words at speeds comparable to typing on a smartphone, a new study finds.

By helping convert thoughts into actions, brain-computer interfaces can help people move or speak. Recently, scientists have sought to help people with disabilities communicate by using these mind-machine interfaces to move a cursor on a screen to point and click on letters on a keyboard. The previous speed record for typing with such devices was about 40 characters per minute.

Now, for the first time, scientists have deciphered the brain signals associated with handwriting. They found that with this "mindwriting," a 65-year-old volunteer equipped with a brain-computer interface could generate up to 90 characters per minute, rivaling the roughly 115 characters per minute typically achieved by nondisabled people the volunteer's age texting on a smartphone.

The volunteer was paralyzed below the neck by a spinal cord injury in 2007. In 2016, doctors implanted two brain-computer interface chips on the surface of the left side of his brain, each about 4 millimeters by 4 millimeters, roughly the size of a baby aspirin. Each chip has 100 electrodes that pick up neural signals from the part of his brain that controls the hand and arm, making it possible for him to move a robotic arm, or a cursor on a screen, by attempting to move his own paralyzed arm.

In the new study, researchers analyzed the volunteer's intended hand and finger motions while he attempted to write sentences as if he were not paralyzed, imagining that he was holding a pen on a legal pad. The electronic interpretation of whatever letter he was attempting to write appeared on the computer screen after a roughly half-second delay.

"We had no idea if the brain could preserve its ability for fine dexterous hand motions in someone paralyzed 10 years," said study lead author Francis Willett, a neuroscientist at Stanford University in California. "It was amazing to see this work."

First the volunteer concentrated on writing each letter of the alphabet 10 times so artificial intelligence algorithms could learn to recognize the specific patterns his brain produced with each character. Later he was shown sentences he had to imagine copying by hand, all in lowercase letters, such as "i interrupted, unable to keep silent." 
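To picture that calibration step, here is a minimal Python sketch of the general idea: collect labeled snippets of neural activity for each attempted character and learn an average pattern, or template, per letter. The array shapes, the random stand-in data and the nearest-template classifier are illustrative assumptions, not the study's actual decoding algorithm.

```python
import numpy as np
from sklearn.neighbors import NearestCentroid

# Hypothetical calibration data: 10 attempted writes of each of the 26
# lowercase letters, each recorded as a snippet of activity from 200
# electrodes (two 100-electrode arrays) over 50 time bins. Random numbers
# stand in for real recordings.
rng = np.random.default_rng(0)
n_chars, n_repeats, n_electrodes, n_bins = 26, 10, 200, 50

snippets = rng.normal(size=(n_chars * n_repeats, n_electrodes, n_bins))
labels = np.repeat(np.arange(n_chars), n_repeats)   # 0 = 'a', 1 = 'b', ...

# Flatten each snippet into a single feature vector and learn one average
# activity "template" per character.
features = snippets.reshape(len(snippets), -1)
decoder = NearestCentroid().fit(features, labels)

# Decoding a new attempted letter: report the closest character template.
new_attempt = rng.normal(size=(1, n_electrodes * n_bins))
print(chr(ord('a') + int(decoder.predict(new_attempt)[0])))
```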

Over time, the volunteer could freely mindwrite about 15 words per minute, with an error every 11 or 12 attempted letters. When the scientists employed an after-the-fact autocorrect function, similar to ones often used in smartphone keyboards, those error rates shrank to a little over 2%.
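To get a rough feel for what such an autocorrect layer does, the toy Python sketch below rescores a noisy decoded word against a small word list. The vocabulary, the string-similarity measure and the example misspellings are invented for illustration; the researchers' predictive language model is far more sophisticated.

```python
from difflib import SequenceMatcher

# Tiny stand-in vocabulary taken from the article's example sentence;
# a real system would use a full predictive language model instead.
vocabulary = ["i", "interrupted", "unable", "to", "keep", "silent"]

def autocorrect(decoded_word: str) -> str:
    # Replace the raw decoder output with the most similar known word.
    return max(vocabulary,
               key=lambda word: SequenceMatcher(None, decoded_word, word).ratio())

print(autocorrect("silert"))       # one-letter decoding slip -> 'silent'
print(autocorrect("interupted"))   # -> 'interrupted'
```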

"This work highlights how tools like predictive language models -- your phone's autocorrect -- can be used to improve brain-computer interfaces," said neural engineer Amy Orsborn at the University of Washington in Seattle, who did not take part in this research. "This is the most significant demonstration to date of leveraging established tools from machine learning to advance brain-computer interfaces."

Previous brain-computer interfaces likely focused on point-and-click approaches rather than handwriting because of the field's roots in animal research. "You can't make monkeys handwrite, but you can ask them to move a cursor or make reaching movements," Willett said. "When this research was first translated to people, it hadn't moved on yet to exploring things only people can do, like handwriting."

This new interface likely works better than previous ones because the brain signals for each letter are very different from one another, as each character requires a pen to travel its own distinct trajectory, Willett said. In contrast, "with a point and click system, each key you are moving a cursor to requires a very similar motion with a slightly different angle from each other, so they have very similar patterns of neural activity that are harder to tell apart," he explained. "This is especially true given how we're only looking at a handful of neurons, not all of which are doing the same thing every time, so it helps if their pattern of activity is very distinctive, to try and distinguish the signals we're looking for from the noise."
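Willett's intuition can be illustrated with a back-of-the-envelope comparison. The Python sketch below builds two made-up sets of movement patterns, straight reaches that differ only in angle, as in point-and-click typing, versus curvier, letter-like strokes, and measures how similar the patterns in each set are to one another. Every number here is invented for illustration; the point is only that trajectories with distinct shapes overlap less, and so are easier to separate from noise.

```python
import numpy as np

def mean_pairwise_similarity(patterns):
    # Average cosine similarity between all pairs of patterns; values near 1
    # mean the patterns look nearly identical and are harder to tell apart.
    normed = patterns / np.linalg.norm(patterns, axis=1, keepdims=True)
    sims = normed @ normed.T
    n = len(patterns)
    return (sims.sum() - n) / (n * (n - 1))

t = np.linspace(0, 1, 50)

# Point-and-click stand-in: straight 2D reaches toward keys in nearby directions.
angles = np.linspace(0, np.pi / 4, 8)
reaches = np.stack([np.concatenate([np.cos(a) * t, np.sin(a) * t]) for a in angles])

# Handwriting stand-in: wiggly trajectories with stroke-specific curvature.
strokes = np.stack([
    np.concatenate([np.cos((k + 1) * 2 * np.pi * t), np.sin((k + 2) * 2 * np.pi * t)])
    for k in range(8)
])

print("reach similarity: ", round(mean_pairwise_similarity(reaches), 2))  # close to 1
print("stroke similarity:", round(mean_pairwise_similarity(strokes), 2))  # much lower
```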

In the future, the researchers aim to streamline their system to help it learn a person’s handwriting more quickly, and to make it easier for people to use. They also intend to work with a person who is paralyzed and cannot speak, in the hopes that such research could help millions of people worldwide with similar conditions. They detailed their findings in the May 13 issue of the journal Nature.
 

Author Bio

Charles Q. Choi is a science reporter who has written for Scientific American, The New York Times, Wired, Science, Nature, and National Geographic News, among others.