Mind-reading devices make headway, researchers report
Posted April 24, 2005
Courtesy Nature Research Journals
and World Science staff
With little fanfare, mind reading has left the pages of science-fiction fantasy and begun tapping on reality’s door.
In new experiments, researchers say they have built devices that decode, from brain scans, simple aspects of mental states.
The machines tell whether people are visualizing one or another of a set of patterns they have
viewed, the scientists say. In some cases the devices know better what has passed through a person’s mind than he or she
does, according to researchers.
Mind-reading, as futuristic and improbable as it sounds, is not completely new.
Lie detectors, which have existed for decades, arguably are crude mind readers. Some humans are excellent lie detectors, for that matter.
More sophisticated technologies have since added to this mind-reading arsenal. Associating brain activity patterns, as they appear on brain scans, with specific emotions has become a routine part of brain research, for instance.
Researchers have also taken stabs at making computers that can decipher the contents of the “mind’s eye.”
As we look around, objects and scenes cause patterns of activity in the part of our brain devoted to vision. In 1999, University of California at Berkeley researchers reported having used a computer to roughly reconstruct the scenes a cat was viewing. The computer read signals from wires recording electrical activity in 177 of the animal’s cells, from a brain area that acts as a first processor of visual information from the eye.
The new findings go a step further: researchers now report having decoded mental imagery without the use of such wires. Instead, they employed a brain scanning technology called functional Magnetic Resonance Imaging, which shows how active different brain regions are based on their oxygen usage.
Moreover, the researchers say brain scans can be decoded to find out not just what people were shown, but which characteristics of the image they were concentrating on, and even whether they saw something too briefly to remember it.
The research, from two separate scientific groups, is published in the May issue of the research journal Nature Neuroscience.
Yukiyasu Kamitani and Frank Tong found that when people were shown stripes tilted in different directions, subtle differences in the pattern of brain activity showed up on the scans. Kamitani is with the ATR Computational Neuroscience Laboratories in Kyoto, Japan, and Tong is at Princeton University in Princeton, New Jersey.
They created a computer program that learned to recognize these patterns by analyzing examples of previous scans and the stripe angles associated with them.
The program learned to discern, with high accuracy, which stripes had been shown during a given scan, Kamitani and Tong wrote.
Furthermore, when subjects were shown a plaid pattern made up of two different sets of stripes but asked to pay attention to only one set, the program was able to tell which one the subjects were thinking about.
“The mind-reading approach presented here provides a potential framework for extending the study of the neural correlates of subjective experience,” that is, of what happens in our brains as we think, wrote the researchers.
“Our approach may be extended to studying the neural basis of many types of mental content, including a person’s awareness, attentional focus, memory,” and intentions and choices, the authors wrote.
Using a similar form of analysis, John-Dylan Haynes and Geraint Rees of University College London, U.K., reported that their program could predict what had been displayed on a computer screen better than the owners of the brains themselves could.
When two images were flashed in quick succession, subjects saw only the second one and were unable to make out the first. The authors’ computer program, however, was still able to distinguish the patterns of brain activity created by the invisible images, even though the people could barely guess at what they had seen.
The findings could also suggest new directions for studies into the still poorly understood difference between conscious and unconscious thought, the researchers wrote. “Whether to be represented in conscious experience information has to cross a threshold level of activity, or perhaps needs to be relayed to another region of the brain, is an intriguing question for further research.”