University College London (UCL) scientists have made the first steps towards building a mind-reading device. In a study published in the latest issue of Nature Neuroscience, the UCL team discovered that they could use brief recordings of brain activity alone to predict which of two objects volunteers were viewing.
Even when the objects were masked to appear invisible to the volunteer, their brain activity could still be used to predict which of the objects was present, suggesting that unconscious processes in the brain were registering the object. In other words, the volunteers' brain activity told the research team more about what had been shown than the volunteers themselves could report.
In the experiment, UCL researchers Dr Geraint Rees and Dr John-Dylan Haynes measured activity in the brain's visual cortex while volunteers viewed a grating slanted to the left or the right. A single two-second measurement of brain activity was enough for researchers to predict with 80 per cent accuracy which of the two oriented gratings the volunteer was viewing.
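The kind of decoding described above can be illustrated with a toy sketch. The study itself used fMRI pattern classification across visual cortex; the code below is only a hypothetical stand-in, simulating noisy "voxel" activity patterns for two grating orientations and decoding new measurements with a simple nearest-centroid rule. All names and numbers are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each two-second scan yields an activity pattern
# across n_voxels; patterns for left- vs right-slanted gratings differ
# by a small but consistent offset buried in measurement noise.
n_voxels, n_train, n_test = 50, 40, 20
left_mean = rng.normal(0.0, 1.0, n_voxels)
right_mean = left_mean + rng.normal(0.0, 0.6, n_voxels)  # consistent offset

def simulate(mean, n):
    """Draw n noisy scans around a class-mean activity pattern."""
    return mean + rng.normal(0.0, 1.0, (n, n_voxels))

train_left, train_right = simulate(left_mean, n_train), simulate(right_mean, n_train)
test_left, test_right = simulate(left_mean, n_test), simulate(right_mean, n_test)

# Nearest-centroid decoder: label a new scan by whichever class's mean
# training pattern it lies closer to.
c_left, c_right = train_left.mean(axis=0), train_right.mean(axis=0)

def predict(scan):
    d_left = np.linalg.norm(scan - c_left)
    d_right = np.linalg.norm(scan - c_right)
    return "left" if d_left < d_right else "right"

correct = sum(predict(s) == "left" for s in test_left) + \
          sum(predict(s) == "right" for s in test_right)
accuracy = correct / (2 * n_test)
print(f"decoding accuracy: {accuracy:.0%}")
```

Because the class difference is consistent across voxels while the noise is not, even this crude decoder classifies well above the 50 per cent chance level, which is the essential point of pattern-based "mind reading".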
Dr Geraint Rees from UCL's Institute of Cognitive Neuroscience says: “This is the first basic step towards reading somebody's mind. If our approach could be expanded upon, it might be possible to predict what someone was thinking or seeing from their brain activity alone.
“In principle, the technique could be applied to a device such as a lie detector but much more research would be needed. You would need to explore which regions of the brain might predict whether someone was lying. These could be very different to the visual cortex and might not carry strong enough signals. A lie detector would also need to generalize across subjects, whereas we were basing our predictions on the brain activity of each individual.
“Our study also shows that an object can be registered subliminally by the brain even when the individual is not conscious of it. The next stage of our research is to explore whether brain activity can be used to predict how our stream of consciousness changes over time.”