Mind reading? New AI system can translate a person’s brain activity into a continuous stream of text

by Christina Maher
May 2, 2023
in Artificial Intelligence, Neuroimaging

The technology to decode our thoughts is drawing ever closer. Neuroscientists at the University of Texas have for the first time decoded data from non-invasive brain scans and used them to reconstruct language and meaning from stories that people hear, see or even imagine.

In a new study published in Nature Neuroscience, Alexander Huth and colleagues successfully recovered the gist of language and sometimes exact phrases from functional magnetic resonance imaging (fMRI) brain recordings of three participants.

Technology that can create language from brain signals could be enormously useful for people who cannot speak due to conditions such as motor neurone disease. At the same time, it raises concerns for the future privacy of our thoughts.

Language decoded

Language decoding models, also called “speech decoders”, aim to use recordings of a person’s brain activity to discover the words they hear, imagine or say.

Until now, speech decoders have only been used with data from devices surgically implanted in the brain, which limits their usefulness. Decoders that work from non-invasive recordings of brain activity have so far managed only single words or short phrases, not continuous language.

The new research used the blood oxygen level dependent signal from fMRI scans, which shows changes in blood flow and oxygenation levels in different parts of the brain. By focusing on patterns of activity in brain regions and networks that process language, the researchers found their decoder could be trained to reconstruct continuous language (including some specific words and the general meaning of sentences).

Specifically, the decoder took the brain responses of three participants as they listened to stories, and generated sequences of words that were likely to have produced those brain responses. These word sequences did well at capturing the general gist of the stories, and in some cases included exact words and phrases.

The researchers also had the participants watch silent movies and imagine stories while being scanned. In both cases, the decoder often managed to predict the gist of the stories.

For example, one user thought “I don’t have my driver’s licence yet”, and the decoder predicted “she has not even started to learn to drive yet”.

Further, when participants actively listened to one story while ignoring another story played simultaneously, the decoder could identify the meaning of the story being actively listened to.

How does it work?

The researchers started out by having each participant lie inside an fMRI scanner and listen to 16 hours of narrated stories while their brain responses were recorded.

These brain responses were then used to train an encoder – a computational model that tries to predict how the brain will respond to words a user hears. After training, the encoder could quite accurately predict how each participant’s brain signals would respond to hearing a given string of words.
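The encoder can be sketched as a regularized linear regression that maps semantic features of the heard words to the response of each voxel. Everything below is a synthetic placeholder, assuming a simple ridge-regression formulation; the dimensions, random data, and the `fit_encoder` helper are illustrative, not the study's actual pipeline:

```python
import numpy as np

# Toy dimensions: real experiments use ~16 hours of recordings
# and tens of thousands of voxels.
n_timepoints, n_features, n_voxels = 1000, 300, 500
rng = np.random.default_rng(0)

# X: semantic features of the words heard at each fMRI timepoint
# Y: recorded BOLD response of each voxel at each timepoint
X = rng.standard_normal((n_timepoints, n_features))
Y = rng.standard_normal((n_timepoints, n_voxels))

def fit_encoder(X, Y, alpha=1.0):
    """Ridge regression: one linear model per voxel, predicting
    BOLD activity from semantic features of the stimulus."""
    n = X.shape[1]
    # Closed-form ridge solution: (X'X + aI)^-1 X'Y
    W = np.linalg.solve(X.T @ X + alpha * np.eye(n), X.T @ Y)
    return W  # shape (n_features, n_voxels)

W = fit_encoder(X, Y)

def predict_response(W, features):
    """Predicted brain response for a candidate stimulus's features."""
    return features @ W

pred = predict_response(W, X[:10])  # predicted responses for 10 timepoints
```

Once fitted, the encoder runs only in the "forward" direction, words to brain activity, which is why decoding requires the search procedure described next.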

However, going in the opposite direction – from recorded brain responses to words – is trickier.

The encoder model is designed to link brain responses with “semantic features” or the broad meanings of words and sentences. To do this, the system uses the original GPT language model, which is the precursor of today’s GPT-4 model. The decoder then generates sequences of words that might have produced the observed brain responses.

[Figure: stills from an animated film shown alongside descriptions of the action decoded from fMRI scans.]
The decoder could also describe the action when participants watched silent movies. (Tang et al. / Nature Neuroscience)

The accuracy of each “guess” is then checked by using it to predict previously recorded brain activity, with the prediction then compared to the actual recorded activity.

During this resource-intensive process, multiple guesses are generated at a time, and ranked in order of accuracy. Poor guesses are discarded and good ones kept. The process continues by guessing the next word in the sequence, and so on until the most accurate sequence is determined.
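This guess-and-check loop resembles a beam search: a language model proposes continuations, the encoder predicts the brain activity each candidate would evoke, and candidates are ranked by how well that prediction matches the recording. In the sketch below, `propose_continuations` and `encoder_predict` are stand-ins for GPT and the trained encoder, and the "recorded" activity is random; only the search structure reflects the description above:

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)

VOCAB = ["the", "story", "began", "slowly", "she", "drove", "home"]

def propose_continuations(sequence, k=3):
    """Stand-in for a GPT-style model: return k candidate next words."""
    return rng.choice(VOCAB, size=k, replace=False)

def encoder_predict(sequence):
    """Stand-in for the trained encoder: map a word sequence to a
    predicted voxel-response vector (deterministic per sequence)."""
    seed = zlib.crc32(" ".join(sequence).encode())
    return np.random.default_rng(seed).standard_normal(50)

def score(sequence, recorded):
    """Correlation between predicted and recorded brain activity."""
    pred = encoder_predict(sequence)
    return float(np.corrcoef(pred, recorded)[0, 1])

def decode(recorded, beam_width=3, length=5):
    beams = [[]]
    for _ in range(length):
        candidates = []
        for seq in beams:
            for word in propose_continuations(seq):
                candidates.append(seq + [word])
        # Keep the word sequences whose predicted activity best
        # matches what was actually recorded; discard the rest.
        candidates.sort(key=lambda s: score(s, recorded), reverse=True)
        beams = candidates[:beam_width]
    return beams[0]

recorded = rng.standard_normal(50)
best = decode(recorded)  # most plausible 5-word sequence under the toy model
```

The real system is far more expensive because each scoring step involves a full encoder prediction over fMRI data, but the keep-the-best, extend, repeat structure is the same.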

Words and meanings

The study found data from multiple, specific brain regions – including the speech network, the parietal-temporal-occipital association region, and prefrontal cortex – were needed for the most accurate predictions.

One key difference between this work and earlier efforts is the data being decoded. Most decoding systems link brain data to motor features or activity recorded from brain regions involved in the last step of speech output, the movement of the mouth and tongue. This decoder works instead at the level of ideas and meanings.

One limitation of using fMRI data is its low “temporal resolution”. The blood oxygen level dependent signal rises and falls over approximately a 10-second period, during which time a person might have heard 20 or more words. As a result, this technique cannot detect individual words, but only the potential meanings of sequences of words.
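The blurring can be illustrated by convolving a train of word onsets with a toy haemodynamic response function: at roughly conversational pace, around two dozen words fall inside a single ~10-second BOLD rise and fall, so their individual contributions overlap in the measured signal. The HRF shape and timings below are illustrative assumptions, not values from the study:

```python
import numpy as np

dt = 0.1                 # simulation resolution in seconds
t = np.arange(0, 30, dt)

def hrf(t):
    """A simple gamma-shaped haemodynamic response function (toy)."""
    return (t ** 5) * np.exp(-t) / 120.0

# One word every 0.4 s (about 150 words per minute)
word_times = np.arange(0.5, 20, 0.4)
stimulus = np.zeros_like(t)
stimulus[np.round(word_times / dt).astype(int)] = 1.0

# The measured signal is the word impulses smeared by the slow HRF,
# so neighbouring words become indistinguishable:
bold = np.convolve(stimulus, hrf(t), mode="full")[: len(t)] * dt

# Roughly how many words fall inside one ~10-second BOLD window:
words_per_window = np.sum((word_times >= 0) & (word_times < 10))
```

With these numbers, `words_per_window` comes out at 24, which is why the decoder recovers the meaning of word sequences rather than individual words.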

No need for privacy panic (yet)

The idea of technology that can “read minds” raises concerns over mental privacy. The researchers conducted additional experiments to address some of these concerns.

These experiments showed we don’t need to worry just yet about having our thoughts decoded while we walk down the street, or indeed without our extensive cooperation.

A decoder trained on one person’s thoughts performed poorly when predicting the semantic detail from another participant’s data. What’s more, participants could disrupt the decoding by diverting their attention to a different task such as naming animals or telling a different story.

Movement in the scanner can also disrupt the decoder as fMRI is highly sensitive to motion, so participant cooperation is essential. Considering these requirements, and the need for high-powered computational resources, it is highly unlikely that someone’s thoughts could be decoded against their will at this stage.

Finally, the decoder does not currently work on data other than fMRI, which is an expensive and often impractical procedure. The group plans to test their approach on other non-invasive brain data in the future.
This article is republished from The Conversation under a Creative Commons license. Read the original article.


