This video still shows a view of one person's cerebral cortex. Pink areas have above-average activity; blue areas have below-average activity.

Scientists have found a way to decode a stream of words in the brain using MRI scans and artificial intelligence.

The system reconstructs the gist of what a person hears or imagines, rather than trying to replicate each word, a team reports in the journal Nature Neuroscience.

"It's getting at the ideas behind the words, the semantics, the meaning," says Alexander Huth, an author of the study and an assistant professor of neuroscience and computer science at The University of Texas at Austin.

This technology can't read minds, though. It only works when a participant is actively cooperating with scientists.

Still, systems that decode language could someday help people who are unable to speak because of a brain injury or disease. They also are helping scientists understand how the brain processes words and thoughts.

Previous efforts to decode language have relied on sensors placed directly on the surface of the brain. The sensors detect signals in areas involved in articulating words.

But the Texas team's approach is an attempt to "decode more freeform thought," says Marcel Just, a professor of psychology at Carnegie Mellon University who was not involved in the new research.

That could mean it has applications beyond communication, he says.

"One of the biggest scientific medical challenges is understanding mental illness, which is a brain dysfunction ultimately," Just says. "I think that this general kind of approach is going to solve that puzzle someday."

Podcasts in the MRI

The new study came about as part of an effort to understand how the brain processes language.

Researchers had three people spend up to 16 hours each in a functional MRI scanner, which detects signs of activity across the brain.

Participants wore headphones that streamed audio from podcasts. "For the most part, they just lay there and listened to stories from The Moth Radio Hour," Huth says.

Those streams of words produced activity all over the brain, not just in areas associated with speech and language.

The system got a lot of help constructing intelligible sentences from artificial intelligence: an early version of the famous natural language processing program ChatGPT.

What emerged from the system was a paraphrased version of what a participant heard. So if a participant heard the phrase, "I didn't even have my driver's license yet," the decoded version might be, "she hadn't even learned to drive yet," Huth says.

In many cases, he says, the decoded version contained errors.

In another experiment, the system was able to paraphrase words a person just imagined saying.

In a third experiment, participants watched videos that told a story without using words. "We didn't tell the subjects to try to describe what's happening," Huth says. "And yet what we got was this kind of language description of what's going on in the video."

A noninvasive window on language

The MRI approach is currently slower and less accurate than an experimental communication system being developed for paralyzed people by a team led by Dr.