Imagine being able to communicate just by thinking the words you want to say. This possibility, which seemed like science fiction until recently, is becoming a reality thanks to advances in neural prosthetics developed by scientists at Stanford University.
In the past, people with severe paralysis relied on extremely slow methods to express themselves. A famous example is French journalist Jean-Dominique Bauby who, after a 1995 stroke left him almost completely paralyzed, dictated his memoir The Diving Bell and the Butterfly by blinking his left eyelid to spell out letters one at a time. Today, modern technologies allow users to select words on a screen using only eye movements or small muscle gestures.
Recently, researchers have gone further, creating brain implants capable of converting neural signals directly into whole words. These devices, known as brain-computer interfaces (BCIs), typically require users to attempt physical speech—a tiring and slow process. But the latest advancement allows communication purely through thought.

The system uses sensors implanted in the motor cortex, the region of the brain that sends movement commands to the vocal tract. A machine-learning model interprets the neural signals and predicts which words the user intends to express. The key insight is that the motor cortex is activated not only by actual movement but also by imagined speech, that is, when the individual thinks the words without speaking them aloud.
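To make the pipeline concrete, here is a minimal sketch of that kind of decoder: binned neural activity goes into a recurrent network that emits per-timestep phoneme probabilities, which a language model would then turn into words. The channel counts, layer sizes, phoneme inventory, and the use of PyTorch are illustrative assumptions, not details taken from the study.

```python
# Sketch of a neural-to-phoneme decoder (assumed architecture, not the
# authors' published model). Input: binned spike counts per electrode.
import torch
import torch.nn as nn

N_CHANNELS = 256   # assumed number of electrode features per time bin
N_PHONEMES = 40    # ~39 English phonemes plus a "blank" token

class InnerSpeechDecoder(nn.Module):
    def __init__(self, hidden=512):
        super().__init__()
        self.rnn = nn.GRU(N_CHANNELS, hidden, num_layers=3, batch_first=True)
        self.head = nn.Linear(hidden, N_PHONEMES)

    def forward(self, x):                      # x: (batch, time, channels)
        h, _ = self.rnn(x)
        return self.head(h).log_softmax(-1)    # per-bin phoneme log-probs

decoder = InnerSpeechDecoder()
fake_bins = torch.randn(1, 100, N_CHANNELS)    # simulated neural activity
log_probs = decoder(fake_bins)                 # shape: (1, 100, 40)
# In a full system, a vocabulary-constrained language model would search
# these phoneme probabilities for the most likely word sequence.
```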
The research, published in Cell, involved four participants, three with amyotrophic lateral sclerosis (ALS) and one with a brainstem stroke, all of whom already had the sensors implanted. With the new system, participants only needed to think of the sentence they wanted to say, and it appeared on a screen in real time. While previous inner-speech decoders were limited to a handful of words, this device supports a vocabulary of up to 125,000 words.
Erin Kunz, a postdoctoral researcher and the study’s lead author, explains that the goal is to create a system that is comfortable and capable of providing natural communication. “Physically attempting to speak is tiring and slow, and in some cases requires multiple breaths,” she says. The new implant allows a comfortable conversational speed of 120 to 150 words per minute, with no extra effort beyond thinking the words.
According to Alexander Huth, a BCI specialist at the University of California, Berkeley, the technology works only when the user can convert an idea into a speech plan, even if they cannot physically execute the movements. It is therefore particularly beneficial in conditions such as dysarthria, where the connection between the plan and the movement is impaired.

Participants in the study expressed excitement about the new technology. Some celebrated the ability to communicate quickly, and one highlighted the unprecedented ability to interrupt a conversation, something impossible with slower methods. To protect the privacy of their thoughts, the researchers implemented a command phrase, "chitty chitty bang bang," which starts or pauses the BCI's transcription.
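The article does not describe how the phrase gates the decoder internally, so the following is an assumed implementation: decoding runs continuously, but text is only released while the gate is "unlocked," and the same phrase toggles it on and off.

```python
# Toy illustration of the privacy gate described above (assumed logic).
from typing import Optional

WAKE_PHRASE = "chitty chitty bang bang"

class TranscriptionGate:
    def __init__(self):
        self.active = False

    def process(self, decoded_sentence: str) -> Optional[str]:
        """Return text to display, or None while the gate is closed."""
        if WAKE_PHRASE in decoded_sentence.lower():
            self.active = not self.active  # same phrase starts and pauses
            return None                    # never display the phrase itself
        return decoded_sentence if self.active else None

gate = TranscriptionGate()
for thought in ["what time is it", "chitty chitty bang bang",
                "i would like some water", "chitty chitty bang bang"]:
    shown = gate.process(thought)
    print(shown if shown is not None else "[not transcribed]")
```

One design consequence of a toggle like this is that anything imagined before the wake phrase stays private by default, which is presumably the point of the safeguard.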
Although brain implants raise questions about mental privacy, experts like Huth emphasize the integrity of the research teams. “The focus is always on the patient, developing solutions to improve the lives of people with paralysis,” he says.
For Kunz, the research has personal significance. Her father had ALS and lost his ability to speak, and she acted as his translator during his final years. "I know firsthand the importance of being able to communicate," she says. She also praises the study's volunteers, who participated to help others in the future rather than to benefit personally.
New Brain Implant Can Read Thoughts with 74% Accuracy

Brain-computer interfaces (BCIs) are not new, but a recent breakthrough from Stanford University could transform how people with severe paralysis communicate. These technologies allow direct interaction between the brain and external devices and have already been used to control prosthetic limbs by decoding brain signals related to movement.
Previous research has shown that BCIs can decode attempted speech in paralyzed individuals, turning brain activity associated with trying to speak into understandable words. While faster than older methods like eye-tracking systems, these interfaces can still be physically demanding and slow for people with limited muscle control.
To overcome this limitation, the Stanford team explored the possibility of decoding inner speech—the silent thoughts we have in our minds. “If you only have to think about words instead of actually trying to speak, it’s potentially easier and faster for people,” said Benyamin Meschede-Krasa, co-first author of the study.
The study involved four participants with severe paralysis due to conditions such as amyotrophic lateral sclerosis (ALS) or brainstem stroke. Microelectrodes were implanted in the motor cortex, the part of the brain responsible for controlling speech, to record neural activity during the experiment.
Participants were asked to either attempt to speak or imagine words. Researchers found that both actions activated similar brain regions and produced comparable neural patterns. However, the brain activity associated with inner speech was noticeably weaker.
Even so, the patterns were distinct enough for machine-learning algorithms to pick up: the researchers used the inner-speech recordings to train AI models capable of decoding these silent thoughts.
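A hedged sketch of that training step, using synthetic data in place of real recordings: imagined words are simulated as a faint, word-specific signal over noise, mirroring the paper's observation that inner-speech activity is a weaker version of attempted-speech activity, and a simple classifier is fit to it. The sizes and the choice of scikit-learn are assumptions for illustration; the study's actual models were far more sophisticated.

```python
# Fit a word classifier on (simulated) inner-speech neural features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_features, n_words = 400, 256, 5   # illustrative sizes
labels = rng.integers(0, n_words, n_trials)

# Weak word-specific templates buried in noise stand in for the faint
# inner-speech patterns reported in the study.
word_templates = rng.normal(0, 1.0, (n_words, n_features))
X = 0.3 * word_templates[labels] + rng.normal(0, 1.0, (n_trials, n_features))

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.0%}")
```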
In a demonstration, the BCI successfully interpreted imagined sentences from a vocabulary of up to 125,000 words, achieving up to 74% accuracy. The system could also detect unplanned thoughts, such as numbers, when participants counted objects on a screen.
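The article does not spell out how that accuracy figure is scored. Speech BCIs conventionally report word error rate, the fraction of insertions, deletions, and substitutions relative to the cued sentence, with accuracy as its complement; assuming that convention, the computation looks like this:

```python
# Standard word error rate via edit distance between cued and decoded text.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

wer = word_error_rate("i would like a glass of water",
                      "i would like a cup of water")
print(f"word error rate: {wer:.0%}, accuracy: {1 - wer:.0%}")
```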
This breakthrough marks a significant step forward in communication for people with paralysis, paving the way for faster, more natural interaction using only the power of thought.