Researchers Have Created a Mind-Reading AI System That Can Decode Your Thoughts

A group of researchers from the University of Texas has created a mind-reading AI system that can decode perceived or imagined speech from a person’s brain activity.

It’s not the first time researchers have attempted to harness the language of the brain and translate it into understandable terms, but this system is notable for being noninvasive, with no surgery required (looking at you, Neuralink). It uses an fMRI scanner to gather brain data from a person and then translates that data into comprehensible language.

During testing, the researchers recorded data from three participants as each listened to 16 hours of stories. As the participants listened, the University of Texas researchers monitored their brain activity and fed it through a non-invasive language decoder – the mind-reading system in question.

The results weren’t perfectly word-for-word; however, the ‘semantic decoder’ AI system the team created could capture the meaning of the stories and even recognise some exact words and phrases. The decoder was also able to predict the meaning of a story a participant merely imagined.
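To get an intuition for how a decoder can recover meaning without matching exact words, here is a minimal, purely illustrative sketch. It assumes (as the published study describes) an encoding model that predicts brain responses from a sentence’s semantic features; candidate sentences are then ranked by how well their predicted response matches the observed one. The feature vectors and the identity “encoding model” below are made up for the toy example – the real system learns these from the hours of fMRI training data.

```python
import numpy as np

# Hypothetical stand-in for a trained encoding model, which maps a
# sentence's semantic features to a predicted brain-response vector.
# In the real system this mapping is learned from fMRI training data;
# here it is simply the identity function.
def predict_response(features: np.ndarray) -> np.ndarray:
    return features

# Toy semantic features for three candidate decodings. In practice,
# candidates are proposed by a language model, not hand-written.
candidates = {
    "she has not started to learn to drive yet": np.array([0.9, 0.1, 0.0]),
    "the weather was lovely on tuesday": np.array([0.0, 0.8, 0.2]),
    "he parked the car outside": np.array([0.5, 0.2, 0.4]),
}

# Made-up "observed" brain activity while a participant heard
# "I don't have my driver's licence yet".
observed = np.array([0.85, 0.15, 0.05])

def score(features: np.ndarray) -> float:
    # Cosine similarity between predicted and observed responses.
    pred = predict_response(features)
    return float(pred @ observed / (np.linalg.norm(pred) * np.linalg.norm(observed)))

best = max(candidates, key=lambda s: score(candidates[s]))
print(best)  # the driving-related candidate scores highest
```

The point of the sketch: the winning candidate shares almost no words with the original sentence, yet its semantic features best explain the observed activity – which is why the decoder’s output preserves gist rather than exact wording.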

“For a noninvasive method, this is a real leap forward compared to what’s been done before, which is typically single words or short sentences,” said Alex Huth, Assistant Professor of Neuroscience and Computer Science at the University of Texas.

“We’re getting the model to decode continuous language for extended periods of time with complicated ideas.”

An example provided by the University of Texas goes like this: a participant might listen to a speaker say “I don’t have my driver’s licence yet”. From the fMRI data, the language decoder translates the brain’s activity to “She has not even started to learn to drive yet” – not an exact transcription, but the meaning is preserved.

When a person listened to “I didn’t know whether to scream, cry or run away. Instead, I said, ‘Leave me alone!’”, the language decoder read the brain activity and translated it to “Started to scream and cry, and then she just said, ‘I told you to leave me alone’”.

The decoder was also able to describe specific moments from silent movies, using only the brain activity of the participants watching them.

“We take very seriously the concerns that it could be used for bad purposes and have worked to avoid that,” added Jerry Tang, a doctoral student in computer science.

“We want to make sure people only use these types of technologies when they want to and that it helps them.”

It’s hoped that this research could translate to the real world through functional near-infrared spectroscopy (fNIRS) machines, which measure similar brain signals using smaller, more portable (though less powerful) hardware. The system could one day help people who are physically unable to speak – such as stroke survivors – express themselves again.

You can read about the study on the University of Texas website or in Nature Neuroscience.
