AI Brain Activity Decoder Can Translate Thoughts Into Written Words
Think of a story and – at least some of the time – it will appear.
Translating someone’s brain activity into written words may sound like a science fiction dream, but a new artificial intelligence (AI) model developed at the University of Texas at Austin has been able to achieve just that. Using only noninvasive scanning methods, the model can be trained to decode complex language from someone’s thoughts for extended periods of time.
“For a noninvasive method, this is a real leap forward compared to what’s been done before, which is typically single words or short sentences,” said study co-lead Alex Huth, an assistant professor of neuroscience and computer science, in a statement.
Other similar systems are in development elsewhere, but what sets this one apart is that participants don’t need to undergo surgery to have implants fitted, nor are they restricted to a list of words they can use.
The decoder can’t reproduce the person’s thoughts word for word, but it can often capture the gist of what they’re thinking. After extensive training, it produces text that is a good, and occasionally exact, representation of the person’s thoughts about half of the time.