Researchers create artificial tactile perception in monkeys through direct brain stimulation


Researchers from Duke University and HSE University have succeeded in creating artificial tactile perception in monkeys through direct brain stimulation. This breakthrough could be used to create upper-limb neuroprostheses capable of delivering tactile sensations. The study's results were recently published in the Proceedings of the National Academy of Sciences.

Most of today's prosthetics exchange information with the remaining nerves in an amputated limb, rather than directly with the brain. Neuroprostheses connect to the brain directly and can help restore limb function even when the peripheral nervous system is completely damaged, such as after a spinal cord injury or paralysis. In addition, when a prosthesis user receives tactile feedback, they can control its movements using more than vision alone. This increases the precision of movement and makes control more natural and easier, since in everyday life we don't usually monitor our hand movements visually.

Electrical stimulation of areas of the somatosensory cortex can produce percepts that mimic somatic sensations in the body parts represented by those cortical areas. Tactile perception, however, includes a wide range of sensations, such as the ability to distinguish an object's temperature, weight, pressure or texture. To imitate tactile perception completely, each of these sensations must be studied.

Researchers from Duke University and HSE University set out to determine whether somatosensory cortex stimulation can mimic the sensation of a surface during active tactile exploration.

Two rhesus monkeys were implanted with electrodes in parts of their somatosensory cortex. According to Mikhail Lebedev, Academic Supervisor of the HSE Centre for Bioelectric Interfaces, one monkey had the electrode implanted to stimulate the area responsible for tactile perception in its finger, and the other in the area responsible for its toe.

The animals were seated before displays and given joysticks, which they used to control a cursor that looked like a realistic upper-limb avatar. The display showed two grey rectangles with a 'tactile' texture: vertical ridges that were invisible but could be 'felt' with the cursor. Each time the cursor crossed a ridge, the monkey's somatosensory cortex was stimulated through the implanted electrodes.
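The following is a minimal sketch, not the authors' code, of the feedback loop described above: a virtual rectangle carries invisible vertical ridges, and each time the cursor crosses one, a stimulation pulse would be issued. The ridge spacing, screen units and the stimulate() placeholder are all assumptions introduced for illustration.

```python
RIDGE_SPACING = 20  # hypothetical distance between vertical ridges, in screen units


def ridge_index(x: float) -> int:
    """Index of the last ridge to the left of horizontal cursor position x."""
    return int(x // RIDGE_SPACING)


def stimulate() -> None:
    # Placeholder: in the real experiment, a pulse is delivered through the
    # implanted electrodes by dedicated stimulation hardware.
    print("stimulation pulse")


def run_trial(cursor_positions: list[float]) -> None:
    """Trigger a stimulation pulse whenever the cursor crosses a ridge."""
    last = ridge_index(cursor_positions[0])
    for x in cursor_positions[1:]:
        current = ridge_index(x)
        if current != last:  # the cursor has moved across at least one ridge
            stimulate()
            last = current


# Example: sweeping the cursor from x=0 to x=100 crosses several ridges.
run_trial([0, 5, 15, 25, 40, 60, 85, 100])
```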

At first, the monkeys used the joystick to move the cursor. At the next stage, the cursor-joystick connection was disabled and the animals were connected to the virtual finger via a 'brain-computer-brain' interface: the signals controlling the cursor were decoded directly from their brains. The monkeys were rewarded each time they chose the more 'rugged' rectangle.
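As a rough illustration of the reward contingency, the sketch below assumes that the rectangle with more densely spaced ridges produces more stimulation pulses during a sweep of equal length, and that choosing that rectangle is rewarded. The function names, spacings and sweep length are hypothetical, not taken from the study.

```python
def pulses_for_sweep(sweep_length: float, ridge_spacing: float) -> int:
    """Number of ridge crossings (stimulation pulses) during one sweep."""
    return int(sweep_length // ridge_spacing)


def rewarded(choice: int, spacings: tuple[float, float], sweep_length: float = 100.0) -> bool:
    """Reward the choice of the more 'rugged' (denser) rectangle."""
    pulses = [pulses_for_sweep(sweep_length, s) for s in spacings]
    return pulses[choice] == max(pulses)


# Example: rectangle 0 has ridges every 10 units, rectangle 1 every 25 units.
print(rewarded(choice=0, spacings=(10.0, 25.0)))  # True  -> rewarded
print(rewarded(choice=1, spacings=(10.0, 25.0)))  # False -> not rewarded
```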

The researchers were particularly interested in whether the monkeys would maintain their ability to compare the textures of surfaces at different speeds of exploration: this would mean that their movement control is 'synchronized' with the feedback received from the cursor.

Even after the first experimental session, both monkeys performed the task better than chance. The speed of their exploration (i.e., the speed of the virtual hand's movement over the quasi-textured objects) did not affect their overall performance, which indicates that they were genuinely able to feel the difference between the rectangles' textures.

The researchers are currently carrying out their next experiment with the participation of volunteers. As Mr Lebedev explains,

"Volunteers are now involved in the same experiment as the monkeys were, but now the electrode is placed on their finger and stimulates it directly. People can already tell us what exactly they are feeling."

This allows the researchers to check how well the brain-computer-brain interface conveys differences in the rectangles' texture frequency. The new project is being carried out by Mikhail Lebedev and Alexey Ossadtchi from HSE University in collaboration with Mikhail Sinkin and Vladimir Krylov of the Moscow State University of Medicine and Dentistry.

This relatively simple imitation of the sensation of a tactile texture could allow the technology to be easily integrated into existing neuroprostheses. In the future, researchers will need to explore how to encode tactile input from different artificial limb receptors simultaneously, in order to convey information about more complicated textures.

Journal reference:

O’Doherty, J.E., et al. (2019) Creating a neuroprosthesis for active tactile exploration of textures. Proceedings of the National Academy of Sciences. doi.org/10.1073/pnas.1908008116.
