
BRAIN ACTIVITY

Computer turns thoughts into sentences

A new decoder from the U.S. is capable of converting thoughts into speech, turning thought content into coherent sentences. People who cannot speak could thus communicate verbally again. However, the researchers also warn of the dangers inherent in devices of this kind.
Online since today, 5 p.m.

For years, scientists have been trying to analyze a person's thoughts with the help of technical devices and to decode their content. By now, "mind reading" is feasible and a reality, even if not always exactly word for word.

Extensive research work
Previous studies have mostly used invasive implants - neuroprostheses - to measure a person's brain activity in real time, as in a U.S. study published in the journal Nature Communications in November 2022. The program used there was trained via "deep learning" in almost 50 sessions in which a paralyzed patient tried to articulate one of 50 given words. In total, this built up a vocabulary of around 1,000 words that correlated with specific thoughts of the patient.

Other approaches do without implanted neuroprostheses, but their effectiveness usually suffers from the smaller amount of brain-activity data. Many such programs can decode a few individual words from a person's thoughts - but they usually cannot manage coherent sentences.

Use in medicine
A research team led by neuroscientist and computer scientist Alexander Huth from the University of Texas (USA) has now also used a non-invasive method to measure the brain activity of three test subjects.

Unlike previous experiments without implants, however, the U.S. researchers actually managed to convert thought content into complete sentences and "continuous speech," Jerry Tang told journalists on Thursday. Tang works in Huth's research lab at the University of Texas and is the lead author of the study, presented in the journal Nature Neuroscience.

The researchers involved hope to use their method to give a voice back to people who can no longer speak, for example because of a stroke or a disease of the nervous system. The fact that no implants are needed makes the method much easier to use in practice. Nevertheless, the program does not work out of the box.

Training indispensable
The three test subjects first had to listen to stories and follow conversations for 16 hours. During this time, their brain activity was scanned using functional magnetic resonance imaging (fMRI). From the word sequences, the researchers then created a language model to predict how the brain would react to certain words.
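The predictive step described above resembles what computational neuroscientists call an "encoding model": a regression that maps features of the words a person hears to the brain responses those words evoke. The following sketch is purely illustrative - the data is random stand-in data, the dimensions are made up, and this is not the study's actual pipeline:

```python
# Illustrative sketch (NOT the study's code): fit a linear "encoding model"
# that predicts fMRI voxel responses from word features, as described in
# the article. All data and dimensions here are invented for the example.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_features, n_voxels = 200, 16, 50

# Stand-in features: embeddings of the word sequences the subject heard.
X = rng.normal(size=(n_samples, n_features))
# Stand-in targets: fMRI voxel activity recorded while listening.
true_w = rng.normal(size=(n_features, n_voxels))
Y = X @ true_w + 0.1 * rng.normal(size=(n_samples, n_voxels))

# Ridge regression in closed form: W = (X'X + alpha*I)^-1 X'Y
alpha = 1.0
W = np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ Y)

# The fitted model can now predict the brain response to new word sequences.
X_new = rng.normal(size=(5, n_features))
Y_pred = X_new @ W
print(Y_pred.shape)  # one predicted response per candidate sequence
```

Ridge regression is a common choice for such models because fMRI data is noisy and the number of voxels is large; the actual study's feature extraction and regularization details are more involved.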

The decoder was trained to distinguish which thought content was associated with which words and brain activities. With the help of further models, some of them also based on artificial intelligence, the research team then trained the decoder to predict words and sentences that would evoke brain responses as similar as possible to those measured in the subjects.

The program was thus soon able to convert the brain activity of the person it was trained on into words and phrases - even when that person was listening to a story that had not appeared in the training data. The subjects did not even have to actively listen to a text; it was enough for them merely to think about the content of the stories.
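One plausible reading of the matching step described above is a scoring loop: propose candidate word sequences, predict the brain activity each would evoke via the trained encoding model, and keep whichever candidate best matches what was actually measured. A minimal, hypothetical sketch with simulated data (the candidate sentences and all numbers are invented):

```python
# Hypothetical, simplified decoding loop: score candidate sentences by how
# well their predicted brain activity matches the measured activity.
import numpy as np

def score(candidate_features, measured, W):
    """Negative squared error between predicted and measured activity."""
    predicted = candidate_features @ W
    return -float(np.sum((predicted - measured) ** 2))

rng = np.random.default_rng(1)
n_features, n_voxels = 16, 50
W = rng.normal(size=(n_features, n_voxels))  # stand-in encoding model

# Made-up feature vectors for three candidate sentences.
candidates = {
    "she hasn't started learning to drive": rng.normal(size=n_features),
    "the weather was nice yesterday": rng.normal(size=n_features),
    "he bought a new bicycle": rng.normal(size=n_features),
}

# Simulate a noisy measurement that actually came from the first candidate.
target = "she hasn't started learning to drive"
measured = candidates[target] @ W + 0.05 * rng.normal(size=n_voxels)

# The decoder keeps the candidate whose prediction fits the measurement best.
best = max(candidates, key=lambda c: score(candidates[c], measured, W))
print(best)
```

In practice such a decoder would search over a vast space of word sequences, typically guided by a language model rather than a fixed candidate list; this sketch only shows the match-predicted-to-measured principle.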

Meaning mostly recognized, not exact wording
In most cases, the decoder did not manage to reproduce word for word the content of the stories heard, imagined, or watched in silent films. It was, however, able to capture the approximate sense of the thought processes. For example, when the subjects heard the sentence "I don't have my driver's license yet," the decoder rendered it as "She hasn't started learning to drive yet" in one of the trials.

According to Huth, the fact that in most cases only the approximate meaning of the content could be reproduced was primarily due to the research team's special method. "Previous approaches have almost always tried to analyze the signals that occur in the brain when a person tries to actually articulate a word," he explained in a press briefing. The research team's method, however, targets other areas of the brain and actually analyzes the person's thoughts rather than their mental attempt to speak, he said.

On the one hand, this means the content is often reproduced only in its gist; on the other hand, it also allows the verbalization of content that is not based on spoken words - for example, silent films or pictures. The method could thus be applied to a variety of problems, especially in medicine.

Mental privacy at risk
The researchers are well aware that "mind reading" also harbors dangers - for example, with regard to privacy. For now, though, the team gives the all-clear: to function, the decoder must first be trained for hours on a specific person, and the program is currently unable to interpret the thoughts of a complete stranger. Without the cooperation of the person in question, their thoughts cannot yet be converted into words and sentences with this technology.

Theoretically, however, that point is not far off. In the coming years, technical progress could make it possible to analyze thoughts without the preceding training - possibly even without the consent of the person concerned. The team is therefore calling for clear laws to safeguard people's mental privacy in the future and to prevent misuse of thought decoders.

Use (still) impractical
In order to apply the method of the U.S. research team in medicine and actually help people with speech problems, more work is needed. Currently, the decoder still relies on extensive data from magnetic resonance imaging. The method is not yet suitable for everyday use due to the size of the equipment required and the high costs involved.

In a next step, the researchers therefore want to find out whether decoding thoughts might also work with smaller and less expensive sensors.
Raphael Krapscha, Ö1 Science

Source (Austria)
 
The end's coming soon, I feel.
Telepathy would save us. Imagine everyone knowing what another's motivation is when they present their ideas. Propaganda would become obsolete, at least for a while.

I had a friend with ALS. He died in 2014. He had a headband with which he could move a cursor and type on the internet. We fundraised the 10k to buy it because we knew it would change his bedridden, paralyzed life and it did. It wasn't his skin, eyebrows or temperature that made the cursor move. It was his prefrontal cortex. We would joke about how we must be close to having one that can actually read your thoughts.
 
needs to be an unchangeable default setting on twitter and every local news site from this day forward.
People already kind of sort of find a flowery way to express some real good vitriol they're actually thinking.
Open the floodgates. Let's see what happens when articles start actually calling shit for what it is.

"Teacher beaten by teens/youths" is the poor man's "Teacher beaten by niggers in yet another chimp out"
 
As dystopian as this sounds, I have a loved one with brain damage that renders him completely nonverbal and his inability to communicate or understand anything has rendered him a confused frustrated violent asshole and makes everybody's life a living hell, so I'm pretty stoked at the prospect of him not being a fucking dickhead if he's still alive when this becomes viable, even if it means Musk beaming Tesla ads into his neuralink.

It's easy to be a hard line libertarian when you don't have cripples and retards of your own.
 