York University professor Doug Crawford, of the university's pioneering Centre for Vision Research, has released a groundbreaking study that will help stroke and head injury victims rehabilitate their sight. For the first time, Crawford and his associates used measurements of eye and head movements, computer simulations and brain recordings to determine how we use vision to guide our movements.
The study appears in its entirety in the December 16 issue of the journal Neuron.
"In this study," says Crawford, "we discovered a function that sets humans and higher primates apart from lower mammals in terms of how vision controls movements." The research was conducted at the York Centre for Vision Research, which Crawford noted was the only lab in the world capable of conducting the experiments. He added that there is a scaled-down lab at Sunnybrook Health Sciences Centre that is used to test the vision and movements of stroke patients.
In previous landmark work, Crawford's team demonstrated that the parietal cortex (located at the back and top of the brain) maps what the eyes are looking at to tell us where things are in space relative to where we are looking. The team observed that this spatial map is 'updated' each time the eyes move and that this function is a basic, primitive mechanism probably common to all mammals.
"With higher mammals like humans," says Crawford, "we have now determined that an area in the frontal cortex called the 'Supplementary Eye Fields' (SEF) contains a much more complex map of space for guiding movements of the eyes and head that shift visual gaze." He added that, by correlating activation of specific regions of the SEF with eye and head movements, and comparing them to the predictions of a computer model, the team was able to show that the SEF contains several separate maps of space for coding targets relative to the eyes, head or body.
He noted that, while his team knew that the SEF governed complex signals related to high-level vision, attention and planning, previously they did not know much about how the signals actually worked. "To put it in laypersons' terms," said Crawford, "The parietal cortex relies on one fairly simple spatial 'language' to guide movement, while the SEF is multilingual."
Crawford likened the SEF to a computer that receives all kinds of processed information from the visual system. “It then lets you link it up to different kinds of behaviour – which is the basis of thought,” he said, adding, “To extend the language analogy, it is like a translator – except that this translator also makes decisions and can learn.”
Crawford noted that several labs are working on ways to link up brain areas like the SEF with prosthetic devices to allow stroke and other patients to move. "In order to do this," he said, "we need to know what these brain areas are coding – what signals they normally send to other parts of the brain."
In addition to Crawford, who is Canada Research Chair in Visuomotor Neuroscience and Associate Director of the York Centre for Vision Research, and who served as team leader, lab director and holder of the CIHR grant that funded this work, the study's authors were: lead author Dr. Julio Martinez-Trujillo, a former York post-doctoral student who is now Canada Research Chair and Assistant Professor of Physiology at McGill; Dr. Pieter Medendorp, a former York post-doctoral student who performed the theoretical simulations and is now a faculty member in the Department of Psychology, U. Nijmegen, Netherlands; and Dr. Hongying Wang, Crawford's research associate, who assisted Dr. Martinez-Trujillo with the experiments.