New AI application reads eye movements


Eye movements read by a new AI application can reveal thoughts, memories, goals — and brain diseases.

A new tool, developed at the Kavli Institute for Systems Neuroscience in Norway and described in an article in Nature Neuroscience, predicts gaze direction and eye movement directly from magnetic resonance imaging (MRI) scans. The goal is to make eye-tracking diagnostics a standard in brain imaging research and hospital clinics.

Eye movements reveal thoughts, memories, goals — and brain diseases

Whenever you explore an environment or search for something, you scan the scene using continuous rapid eye movements. Your eyes also make short stops to fixate on certain elements of the scene that you want more detailed information about. The way you move your eyes and the selection of details you fixate your gaze upon can be summed up as your viewing behavior.

Scientists can infer a great deal from analyzing eye movements. For instance, when you recall a memory, your eyes move in a pattern similar to how they did when you first experienced the event. Or, when you enter your kitchen hungry, your eyes will be drawn to other items and follow different patterns than when you enter the kitchen to do the dishes.

DeepMReye is an eye tracker that uses the MRI signal from the eyes. Eliminating the camera promises to overcome many of the obstacles that have prevented eye tracking from being implemented as a standard in the imaging field. It will also open up completely new research and diagnostic possibilities.

"Our viewing behavior is an expression of our thoughts, memories and goals at any given moment in time," says Matthias Nau. In collaboration with Markus Frey and Christian Doeller, he developed theDeepMReyetool at the Kavli Institute for Systems Neuroscience at the Norwegian University of Science and Technology (NTNU) in Trondheim.

Viewing behavior can even be a diagnostic for brain diseases.

"Brain diseases manifest themselves as characteristic eye movement patterns and disturbances in viewing behavior," Nau said. "Almost every cognitive or neural disorder, such as working memory deficits, amnesia, Parkinson's disease and Alzheimer's disease will affect your viewing behavior."

The artificial intelligence that decodes your gaze pattern

The human eye is poorly equipped to extract signal from noise in big datasets. Artificial intelligence, on the other hand, is surprisingly efficient at discerning relevant patterns in haystacks of data. So the researchers set out to train a deep learning network to do what they themselves could not: pay attention to the viewing behavior of subjects in the MR brain scanner and recognize patterns that are shared across people.

A long time and many MRI datasets later, the model had mastered the complicated task of generalizing gaze patterns across people. Through convolutions that extract features from the data via dimensionality reduction, the artificial intelligence had learned to use the vast pool of knowledge it had been fed to extract and interpret the hidden meaning in any person's viewing behavior.
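
To make that general idea concrete, here is a minimal, hypothetical sketch (not the published DeepMReye architecture): a small 3D convolutional network, written with TensorFlow/Keras on synthetic data, that regresses gaze coordinates from the MRI signal in a patch of voxels around the eyeballs. All shapes, layer sizes and variable names are illustrative assumptions.

```python
# Minimal sketch (not the actual DeepMReye model): a 3D CNN mapping an
# eye-region MRI patch to a 2D gaze coordinate. Shapes are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_gaze_decoder(input_shape=(16, 16, 8, 1)):
    """Convolutions extract spatial features from the eye-region voxels;
    dense layers regress the gaze position (x, y) for each scanner volume."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv3D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling3D(2),                    # dimensionality reduction
        layers.Conv3D(32, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling3D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(2),                           # predicted (x, y) gaze
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Training across many subjects is what lets the model generalize to new
# people. X holds eye-region MRI patches (one per fMRI volume), y the
# camera-measured gaze labels used only during training; both are
# synthetic placeholders here.
X = np.random.rand(500, 16, 16, 8, 1).astype("float32")
y = np.random.rand(500, 2).astype("float32")

model = build_gaze_decoder()
model.fit(X, y, epochs=2, batch_size=32, validation_split=0.2, verbose=0)
gaze_estimates = model.predict(X[:10])  # gaze estimates for unseen volumes
```

The essential ingredients match the description above: convolutional feature extraction with dimensionality reduction, trained on data from many subjects so the learned mapping can generalize to people the model has never seen.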

The AI's ability to generalize knowledge to new individuals, combined with the fact that all the data the AI requires for analysis is already found in most MRI scans, brings new life and interest to old brain scanning data.

"Researchers can use this model to retrospectively analyze 20 years of existing fMRI data, from thousands upon thousands of subjects, and ask completely new questions," Nau said. For instance, researchers can ask how ongoing behavior really influences so-called "resting state networks", currently one of the juicier topics in the brain imaging world.

DeepMReye can also be used for patient groups and in categories of research that old camera-based eye tracking did not support. While camera eye tracking could only collect information from subjects when their eyes were open, the MR signal captures eyeball movement patterns even when the eyelids are closed. This makes the DeepMReye model relevant for sleep labs, for instance for studying eye movements to classify sleep stages. Another example is clinical health screenings of congenitally blind people, who until now have been excluded from eye-tracking studies because the camera calibration process requires a seeing, focusing eye. DeepMReye does not discriminate between seeing and blind eyes when it comes to health screening.

Fixing a critical blindness in imaging research and diagnostics

There is general agreement among scientists and clinicians that eye tracking is an important measure for investigating cognition and disease. Combining precise measurements of eye movements with neuroimaging maps of whole-brain activity makes for a powerful tool. However, eye tracking has yet to become a standard in imaging research labs and in the clinical routine of hospitals.

So why hasn't eye tracking caught on? Nau has a clear explanation: "The existing camera eye-tracking procedure is quite expensive, it is tricky to use, and way too time consuming. We don't have the time, the clinical routine, or the expertise to use these cameras for every single patient or research subject."

This is a suboptimal situation well known amongst professionals. Of the MRI studies published in top research journals over the last two years, 90% did not track eye movements. Of the 10% that used eye tracking, 5% reported poor data quality, and the remaining 5% used the data in a way that didn't really require a camera.

"This is a missed opportunity. But it is also alarming," Nau said. "There are many problems and artefacts associated with fMRI brain activity studies that these researchers and clinicians are making themselves blind to, which could be cleaned up by including eye tracking into the procedure."

Several studies currently claim to have identified memory encoding and all sorts of higher-level cognitive processes which, at the end of the day, may just be an eye-movement confound. One candidate for this category is the subsequent memory effect.

"If I show you two images and you later remember only one of them, I may also find that the activity in your hippocampus, which is your brain's memory encoding hub, was higher for the image that you remembered than for the one that you had forgotten," Nau said.

The theory of a subsequent memory effect suggested that the correlation between the two is directly related to memory encoding, and that the bump in activity is like a signature of memory. Thus, higher activity means better recall of that image later.

However, it turns out that the images you remember are also the ones that you looked at more. There is a direct correlation between eye movement, the rise in local brain activity, and successful memory encoding. Your eyes made more fixations on that image; you invested more energy in getting information from it, and that is why you remember it later on. More fixations mean stronger brain activity. So, if you take into account the fact that people move their eyes more during encoding, then this subsequent memory effect disappears.
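
The statistical logic can be illustrated with a toy simulation (synthetic data and assumed variable names, not an analysis from the study): when both hippocampal activity and later memory are driven by the number of fixations an image received, the apparent activity-memory relationship largely disappears once fixation count is included as a covariate.

```python
# Illustrative simulation of a confound, with purely synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
fixations = rng.poisson(5, n)                        # fixations per image
activity = fixations + rng.normal(0, 1, n)           # activity tracks fixations
remembered = (fixations + rng.normal(0, 2, n) > 5).astype(int)  # memory tracks fixations too

# Model 1: activity alone appears to predict memory (the apparent effect).
m1 = sm.Logit(remembered, sm.add_constant(activity)).fit(disp=0)

# Model 2: add fixation count as a covariate; the activity effect shrinks toward zero.
X2 = sm.add_constant(np.column_stack([activity, fixations]))
m2 = sm.Logit(remembered, X2).fit(disp=0)

print("activity coefficient, no covariate:  ", round(m1.params[1], 3))
print("activity coefficient, with fixations:", round(m2.params[1], 3))
```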

"The reason why we encode and remember some images and forget others, may simply be because we find some scenes more interesting, we are drawn to them, and we are willing to spend more time studying them to extract unique features of the scene. What we are beginning to realize, is that we can't fully piece apart these memory effects from our viewing behavior, they are inherently linked," Nau said.

A free, open-source, plug and play tool

Crafting the camera-less eye tracking tool started as a weekend project between the students about three years ago. "We wanted to build the package that we ourselves would like to have," Nau said. "A user-friendly eye tracker that uses the MRI signal from the eyes, a model that could improve interpretability and robustness of neuroimaging research, and that would overcome the obstacles that had prevented eye-tracking from being implemented as a standard in the imaging field."

"This project has been different from the average paper," Nau said. "Usually you work on something for a few years, you write it up, you publish it, and then you close the book and move on to a new project. Whereas this paper will take on a whole new life once we publish it. Once the users start using DeepMReye, they will have questions, they will have requests, they will find bugs."

The researchers have made the tool open-source and fully available from GitHub:

https://github.com/DeepMReye/DeepMReye

"You don't need any money, any equipment, or any trained personnel to use the model. You don't have to spend costly experimental time or medical consultation time for setting up and calibrating a camera. The model can be used post-hoc, after the patient is well at home," Nau said.

The researchers also worked hard to make the tool as user friendly as possible, and created a website with user recommendations and a FAQ: https://deepmreye.slite.com/p/channel/MUgmvViEbaATSrqt3susLZ

"Hopefully this will be plug-and-play for everybody," Nau said.

Journal reference:

Frey, M., et al. (2021). Magnetic resonance-based eye tracking using deep neural networks. Nature Neuroscience. https://doi.org/10.1038/s41593-021-00947-w

