The research team from the University of Oregon, United States, is said to have developed a system that can read people’s thoughts via brain scans and rebuild the faces they were visualizing in their heads.
Lead researchers Brice Kuhl and Hongmi Lee reportedly used artificial intelligence (AI) to analyze brain activity and reconstruct one of a series of faces that participants saw. Tech Worm reports that the result was not exact, but the AI got close.
To read human thought, the researchers created images directly from memories using a magnetic resonance imaging (MRI) machine, machine-learning software, and a group of human test participants.
The researchers put the test participants in the MRI machine and showed them several hundred faces. The program had access to real-time MRI data from the machine, as well as a set of 300 numbers describing each face, covering everything from skin tone to eye position.
The MRI machine detected the movement of blood around the brain, which was taken as a proxy for brain activity. The program analyzed these blood-flow patterns in response to each face, learning how a particular brain reacts to known stimuli.
The AI, with a few hundred examples incorporated into its algorithm, was then put to the test. The participants were again shown a face, but this time the program was given none of the numbers describing it. The only thing it had to go on was the MRI data describing brain activity as the person viewed the face. Having learned over time to match blood-flow patterns with the facial features the subjects saw, the AI sputtered out rough images of what they were seeing.
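The article does not specify the model the researchers used, but the pipeline it describes, learning a mapping from brain activity to the 300 numbers describing a face, then applying that mapping to brain data alone, can be sketched with a simple linear model. Everything below is a hypothetical illustration on synthetic data: the dimensions, the ridge-regression choice, and the noise level are all assumptions, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

n_train, n_voxels, n_features = 400, 1000, 300  # assumed sizes, not from the study

# Synthetic stand-in for a brain's response: a fixed linear relation between
# the 300 face-describing numbers and voxel activity, plus noise.
# (Real fMRI data is far noisier and not simply linear.)
true_map = rng.normal(size=(n_features, n_voxels))
faces_train = rng.normal(size=(n_train, n_features))        # 300 numbers per face
brain_train = faces_train @ true_map + 0.1 * rng.normal(size=(n_train, n_voxels))

# Training phase: learn to predict face features from brain activity
# (ridge regression in closed form, an illustrative choice).
lam = 1.0
X, Y = brain_train, faces_train
W = np.linalg.solve(X.T @ X + lam * np.eye(n_voxels), X.T @ Y)

# Test phase: a new face the program knows nothing about; it sees only
# the brain response and must reconstruct the face's feature numbers.
face_test = rng.normal(size=(1, n_features))
brain_test = face_test @ true_map + 0.1 * rng.normal(size=(1, n_voxels))
face_reconstructed = brain_test @ W

# How close is the rough reconstruction to the true face features?
corr = np.corrcoef(face_test.ravel(), face_reconstructed.ravel())[0, 1]
print(round(corr, 2))
```

On clean synthetic data like this the reconstruction correlates strongly with the true features; the dream-like blur of the real results reflects how much harder the problem is with actual brain data.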
The resulting images had a dream-like quality, with blurred edges and shifts in skin tone, like the vague renderings a person might see in their own mind. But the machine often got facial expressions right, and other features, such as an upturned nose or an arched eyebrow, seemed to translate faithfully from mind to machine.
To put specific numbers on the program’s success rate, the researchers showed the strange reconstructions to another set of participants and asked them questions relating to skin tone, gender, and emotion. This group described the original faces correctly at a rate higher than random chance, suggesting that the AI’s interpretations are relatively accurate.
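The "better than random chance" claim above is the kind of thing a one-sided binomial test checks: how likely is it that raters would score this well by pure guessing? The counts below are invented for illustration; the article does not report the study's actual figures.

```python
from math import comb

def binom_p_value(correct, trials, chance=0.5):
    """One-sided probability of getting at least `correct` answers right
    by guessing, when each guess succeeds with probability `chance`."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

# Hypothetical example: raters judge the gender of 100 reconstructions
# (a 50/50 guess) and get 65 right. Could that be luck?
p = binom_p_value(65, 100)
print(round(p, 4))
```

A result like 65/100 on a two-way question yields a p-value well under 0.05, which is what "higher than random chance" means in practice; scoring exactly 50/100 would give a p-value near 0.5, i.e. indistinguishable from guessing.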
“We can take someone’s memory which is typically something internal and private and we can pull it out from their brains. Some people use different definitions of mind reading, but certainly, that’s getting close,” one of the lead researchers, Kuhl, told Vox about the invention.
Mr Kuhl further said that the machine cannot read people’s minds without their cooperation. He was quoted as saying: “You need someone to play ball. You can’t extract someone’s memory if they are not remembering it, and people most of the time are in control of their memories.”
The research team wants to take the study further. They are currently working on having participants see a face, hold it in their memory, and then having the AI reconstruct it based on the person’s memory of what the face looked like.
This article (New Study: Researchers Invent Mind-Reading Machine that Reads Thought Process) is a free and open source. You have permission to republish this article under a Creative Commons license with attribution to the author and AnonHQ.com.