A computer connected to a brain device reads thoughts and spells them out letter by letter
One of the great tragedies of anarthria, the loss of speech, is that many people who suffer from it – as a result of a stroke, for example – can still think clearly. They just can’t express themselves the way most of us do: with words. Especially if they’re also paralyzed and can’t type their thoughts on a tablet.
For years, scientists have been trying to help anarthric people, especially paralyzed people, talk through technology. The latest approach involves implanting devices in or near the brain that can literally read the electrical impulses that make up a person’s thoughts and transmit text to a device that displays or speaks it.
These brain-computer interfaces, or BCIs, are becoming increasingly sophisticated. But they still have a big problem with accuracy. In a major experiment three years ago, a leading BCI prototype mistranslated trial participants’ thoughts about a quarter of the time.
In other words, one in four sentences users thought of… ended up wrong on the screen. It’s as if every fourth line of text you typed in an email conversation automatically turned into gibberish.
The same team that oversaw that 2019 trial, the Chang Lab at the University of California, San Francisco, is now trying a different approach. The lab, led by top neuroscientist Edward Chang, has developed a new BCI that translates individual letters instead of whole words or phrases. Users spell out their thoughts one letter at a time.
The first results are encouraging: the BCI correctly translated and displayed approximately 94% of the letters participants spelled. Chang Lab’s new spelling BCI could help advance brain implant technology, bringing it closer to daily use by large numbers of people. Giving voice to the voiceless.
The Chang Lab made headlines three years ago when it introduced its BrainNet BCI. In the experiment, two volunteers wore electroencephalogram electrodes on their heads, the kind neurologists use to detect epilepsy. Unlike older, more rudimentary BCIs, BrainNet did not require invasive surgery to implant sensors directly into the brain.
The volunteers silently focused on some simple thoughts. The EEG headsets detected their brainwaves through their skulls, and an algorithm matched those waves to a “dictionary” of phrases the lab had built by asking volunteers to speak phrases aloud and recording the resulting neural activity.
The fact that BrainNet worked was impressive. But its maximum accuracy of 76% left plenty of room for improvement. “A major challenge for these approaches is achieving high accuracy rates in a single trial,” Chang and his team conceded.
Spelling out thoughts one letter at a time would certainly be slower than feeding entire thoughts into a BCI, but could it be more accurate? To find out, Chang Lab recruited a volunteer who, in 2019, had an electrocorticography array — a postcard-sized patch of 16 electrodes — implanted under his skull. The volunteer suffers from “severe paralysis of the limbs and vocal tract,” according to the lab.
Chang and his teammates, including UCSF neuroscientists Sean Metzger and David Moses, taught the volunteer the NATO phonetic alphabet: “Alpha” for A, “Bravo” for B, “Charlie” for C, and so on. They asked him to spell out his thoughts by silently thinking of the NATO code word for each letter.
The BCI read his brain waves. An algorithm did its best to match the waves to a dictionary of 1,152 words. His thoughts – at least, the algorithm’s best translation of them – scrolled across a computer screen at a rate of 29 letters per minute.
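The two steps described above — classify the code word for each silently attempted letter, then constrain the spelled string to a known vocabulary — can be sketched roughly as follows. This is a minimal illustration, not the lab’s actual decoder: the “classifier output” is hand-made, and the five-word vocabulary stands in for the real 1,152-word one.

```python
# Sketch of a spelling-BCI pipeline: per-letter NATO code-word guesses
# are joined into a string, then snapped to the closest vocabulary word.

NATO = ["alpha", "bravo", "charlie", "delta", "echo", "foxtrot", "golf",
        "hotel", "india", "juliett", "kilo", "lima", "mike", "november",
        "oscar", "papa", "quebec", "romeo", "sierra", "tango", "uniform",
        "victor", "whiskey", "xray", "yankee", "zulu"]

def levenshtein(a: str, b: str) -> int:
    """Classic edit distance, used to snap a noisy spelling to the vocabulary."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,          # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def decode(code_word_guesses, vocabulary):
    """Turn per-letter code-word guesses into the closest vocabulary word."""
    spelled = "".join(chr(ord("a") + NATO.index(w)) for w in code_word_guesses)
    return min(vocabulary, key=lambda v: levenshtein(spelled, v))

vocab = ["hello", "thank", "you", "legs", "going"]
# Decoder hears "hotel, echo, lima, lima, oscar" -> spells "hello".
print(decode(["hotel", "echo", "lima", "lima", "oscar"], vocab))  # hello
# Even with one misclassified code word ("kilo" instead of "lima"),
# the vocabulary constraint recovers the intended word.
print(decode(["hotel", "echo", "kilo", "lima", "oscar"], vocab))  # hello
```

The vocabulary constraint is what makes spelling tolerant of occasional misread letters: a one-letter slip usually still lands closest to the intended word.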
The system was quite accurate. Both times the subject thought “Thank you,” the translated text appeared on the screen as “Thank you.”
But it wasn’t perfect. “Hello” came out as “hello” on the first try and “good for the legs” on the second. And “you are not going to believe this” totally confused the BCI and its algorithm, yielding garbled translations: “you plan to kinda go like this” on the first try and “ypuaranpdggingloavlinesoeb” on the second.
Overall, the system demonstrated a “median character error rate” of six percent. Extrapolating to a hypothetical vocabulary of 9,000 words, Chang’s team concluded that the error rate would be only slightly higher: around eight percent.
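Character error rate is conventionally the number of single-character edits needed to turn the decoded text into the intended text, divided by the intended text’s length. A minimal sketch of the metric, using illustrative sentence pairs (one borrowed from the garbled attempt above) rather than the study’s data:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def char_error_rate(reference: str, hypothesis: str) -> float:
    """Edits needed to fix the decoded text, per character of intended text."""
    return levenshtein(reference, hypothesis) / len(reference)

# Illustrative (intended, decoded) pairs, not the study's actual trials:
pairs = [
    ("thank you", "thank you"),   # perfect decode -> 0% error
    ("hello", "hellp"),           # one wrong letter -> 20% error
    ("you are not going to believe this", "ypuaranpdggingloavlinesoeb"),
]
rates = sorted(char_error_rate(ref, hyp) for ref, hyp in pairs)
median = rates[len(rates) // 2]
print(f"median character error rate: {median:.0%}")  # -> 20%
```

A median, unlike a mean, is not dragged up by a single catastrophic decode like the garbled third pair, which is one reason it is a common headline figure for this kind of system.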
“These results illustrate the clinical viability of a silently controlled speech neuroprosthesis for generating sentences from a large vocabulary through a spelling-based approach,” Chang, Metzger, Moses and their co-authors wrote in a peer-reviewed study published in Nature Communications on Tuesday.
Samuel Andrew Hires, a University of Southern California neurobiologist who was not involved in the study, told The Daily Beast he was impressed. “A typical human is around 30-35 wpm with modern text prediction, maybe faster if you’re a teenager,” he said. “Here, the subjects were only about six times slower, which is quite impressive given that they couldn’t move or talk. I don’t know what my word error rate is on my phone, but it looks like it’s about one in 10 words, on par with brain-decoding performance.”
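Hires’ “six times slower” figure is easy to sanity-check from the numbers in the article, using the standard typing convention that one “word” is five characters. The 32.5 wpm midpoint below is an assumed value taken from his 30-35 wpm range:

```python
chars_per_minute = 29      # the spelling BCI's rate, reported above
chars_per_word = 5         # standard typing-speed convention
typical_phone_wpm = 32.5   # assumed midpoint of Hires' 30-35 wpm estimate

bci_wpm = chars_per_minute / chars_per_word   # about 5.8 wpm
slowdown = typical_phone_wpm / bci_wpm        # about 5.6x
print(f"BCI: {bci_wpm:.1f} wpm, roughly {slowdown:.1f}x slower than phone typing")
```

That works out to roughly six words per minute, consistent with the “six or seven words per minute” figure later in the article.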
But don’t expect the spelling approach to change the world overnight. We are still a long way from a robust, fast, accurate, and affordable version of a thought-to-text system that a wide variety of people with speech disabilities can use in public.
Durability is an issue. Implanting a device under the skull is traumatic and risky. Ideally, a device will work for many years before it needs to be repaired or replaced. To that end, it’s good news that the volunteer’s electrocorticography array is still functioning quite well after two and a half years, Moses told The Daily Beast.
But much more experimentation is needed to prove that the system is broadly effective. “We think the main thing to confirm is that our BCI can work with a variety of users with a variety of disabilities,” Moses said.
Only after extensive additional testing can any lab, Chang’s or another, consider licensing the technology for use by the general public. At this point, the challenge will be to shrink it, strengthen it, and make it portable and affordable. Moses said he envisions “a fully implantable neural interface” that can “communicate wirelessly with a phone, tablet, or laptop to enable portable use.”
As for the price… who knows? “It is too early to accurately estimate the cost of such a system,” Moses said. But whatever the cost, a thought-to-text device, even one that produces just six or seven words per minute, could help many people in settings where total speech impairment is currently a major obstacle.
Offices. Classrooms. Even bars and restaurants. “Brain-computer interfaces have the potential to restore communication,” Chang’s team wrote. All those pent-up thoughts, swirling in the brains of anarthric people who can think clearly but say nothing, could finally come pouring out. One letter at a time.