Researchers at the University of Texas claim to have built a "decoder" that can reconstruct what a person is thinking just by monitoring their brain activity.
The research could pave the way for more capable brain-computer interfaces designed to help people who can't speak or type.
In the experiment, the researchers used functional magnetic resonance imaging (fMRI) machines to measure changes in blood flow rather than the firing of individual neurons, which is notoriously "noisy" and difficult to decode. They used that data to train a program to associate the changes in blood flow with what participants were listening to.
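In broad strokes, this is an "encoding model" approach: learn a mapping from features of the language a person heard to the evoked blood-flow signal, then decode new brain activity by asking which candidate stimulus the model explains best. Below is a minimal, hypothetical Python sketch of that idea on synthetic data; the ridge regression, array sizes, and scoring scheme are illustrative assumptions, not details taken from the study.

```python
# Hypothetical sketch of the general idea (not the study's actual pipeline):
# fit an encoding model that predicts fMRI blood-flow responses from
# features of the language a participant heard, then identify which of
# several candidate stimuli best explains new brain activity.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_trs, n_voxels, n_feats = 500, 200, 64  # synthetic sizes, chosen arbitrarily

# Synthetic stand-ins: stimulus features (e.g., embeddings of the audio
# transcript, resampled to the fMRI sampling rate) and voxel responses.
stim_features = rng.standard_normal((n_trs, n_feats))
true_weights = rng.standard_normal((n_feats, n_voxels))
bold = stim_features @ true_weights + 0.5 * rng.standard_normal((n_trs, n_voxels))

# Encoding model: ridge regression from stimulus features to voxel activity.
encoder = Ridge(alpha=10.0).fit(stim_features, bold)

def score(candidate_feats, observed_bold):
    """Correlation between the model's predicted response and what was measured."""
    predicted = encoder.predict(candidate_feats)
    return np.corrcoef(predicted.ravel(), observed_bold.ravel())[0, 1]

# "Decoding" by model comparison: features of the stimulus the person
# actually heard should explain a new scan better than an unrelated one.
heard_feats = rng.standard_normal((20, n_feats))
observed = heard_feats @ true_weights + 0.5 * rng.standard_normal((20, n_voxels))
distractor_feats = rng.standard_normal((20, n_feats))

print("score for the heard stimulus:", round(score(heard_feats, observed), 3))
print("score for a distractor:      ", round(score(distractor_feats, observed), 3))
```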
Alexander Huth, a University of Texas neuroscientist and co-author of the study, told The Scientist that the results were good.
The system had its shortcomings. It often mixed up details across the radio and podcast recordings: Huth said the algorithm could tell what was happening, but not who was doing it.
Nor could the algorithm apply what it had learned from one person's brain scans to another person's.
The decoder was not limited to spoken language, either: when participants watched a silent film, it was still able to deduce a story from their brain activity. It's possible that these findings will help us understand the functions of different parts of the brain and how they overlap in making sense of the world.
Other neuroscientists were impressed. "If you have a smart enough modeling framework, you can actually pull out a surprising amount of information from these kinds of recordings," Sam Nastase told The Scientist.
The study lays a solid foundation for brain-computer interface applications, according to a computational neuroscientist at Kyoto University.