In a recent study published in Scientific Reports, researchers investigated the dynamic visual-somatosensory cortical interactions in human infants.
Prior studies have shown that infants are sensitive to spatial and temporal correspondences between tactile and visual stimuli. However, whether infants also respond to these dynamic interactions over time remains unclear.
Notably, in sighted humans, visual-somatosensory interactions provide rich, continuous spatiotemporal information about an object's movements with respect to the body. Moreover, perception of these multisensory interactions is a prerequisite for representing peripersonal space, i.e., the ability to perceive the relation between objects in external space and the body.
About the study
The current study included two conditions, approaching and receding, each comprising Touch and No-Touch trials presented in blocks of eight, with a 12-second video break after each block.
Infants sat on a parent's lap with their hands held still in front of a screen. When an infant's attention was fixed on an animal's face, the researchers triggered the experimental stimuli. The cohort comprised two age groups: 20 four-month-olds (nine girls) and 20 eight-month-olds (10 girls).
A red ball (a dynamic visual stimulus) appeared in the lower half of the screen and either approached the infant's hands or receded towards the background over 333 milliseconds (ms). After a short interval, the researchers presented a tactile stimulus on both hands at the expected site of contact along the approaching visual motion trajectory.
Throughout these experiments, the team recorded the infants' brain activity using electroencephalography (EEG) and analyzed the somatosensory evoked potentials (SEPs), the electrical brain responses elicited by the vibrotactile stimuli.
The researchers applied two analysis strategies, sample-point simulations and comparisons of mean individual component amplitudes, to the two study conditions, running the analyses separately for four- and eight-month-old infants.
The simulations compared the SEPs in a window from 100 ms before tactile stimulus onset to 900 ms after it, accounting for the correlation between consecutive sample points. At each time point, they performed one-sample t-tests comparing the difference waves of the SEPs between conditions to zero, with p < 0.05 considered statistically significant.
For each of the 1,000 random datasets generated in the simulation, the researchers computed the longest run of consecutive significant t-test results; the 95th percentile of the resulting run-length distribution then served as the threshold for deeming an observed sequence of significant time points statistically reliable.
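This thresholding procedure can be sketched as follows. The sampling rate, array dimensions, and the use of independent Gaussian noise are illustrative assumptions, not details from the paper (the actual analysis modeled the correlation between consecutive sample points):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dimensions: 20 infants per age group, 250 sample points
# spanning -100 ms to 900 ms around tactile stimulus onset.
N_SUBJECTS, N_SAMPLES, N_SIMULATIONS = 20, 250, 1000
T_CRIT = 2.093  # two-sided critical t value for df = 19 at p < 0.05

def longest_run(mask):
    """Length of the longest run of consecutive True (significant) points."""
    longest = current = 0
    for significant in mask:
        current = current + 1 if significant else 0
        longest = max(longest, current)
    return longest

def pointwise_t(data):
    """One-sample t statistic of each sample point (column) against zero."""
    n = data.shape[0]
    return data.mean(axis=0) / (data.std(axis=0, ddof=1) / np.sqrt(n))

# Null distribution: longest significant run in each of 1000 random
# datasets containing no true condition difference.
null_runs = [
    longest_run(np.abs(pointwise_t(rng.standard_normal((N_SUBJECTS, N_SAMPLES)))) > T_CRIT)
    for _ in range(N_SIMULATIONS)
]

# A run of consecutive significant points in the real SEP difference waves
# counts as reliable only if it exceeds this threshold.
threshold = np.percentile(null_runs, 95)
print(f"Run-length threshold: {threshold:.0f} consecutive significant samples")
```

With uncorrelated noise the threshold comes out short; modeling the autocorrelation of real EEG, as the authors did, lengthens the runs expected by chance and therefore raises the bar.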
Next, the researchers examined the prominent components of the brain's electrical response to the tactile stimulus. Using the collapsed localizers approach, in which waveforms are averaged across conditions before component windows are chosen, they identified the main components within specific time windows and compared their amplitudes (signal strength) between conditions. The latency of each component was likewise determined from the average waveform across conditions.
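A minimal sketch of the collapsed-localizer idea, using simulated waveforms; the Gaussian-bump shape, sampling grid, and component window are hypothetical, chosen only to echo the P286 component reported for four-month-olds:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated grand-average SEP waveforms (arbitrary units), sampled every 2 ms
# from 0 to 700 ms after tactile stimulus onset, with a hypothetical
# positive deflection near 286 ms.
times = np.arange(0, 700, 2)
bump = np.exp(-((times - 286) / 40.0) ** 2)
approach = 5.0 * bump + 0.05 * rng.standard_normal(times.size)
recede = 4.0 * bump + 0.05 * rng.standard_normal(times.size)

# Collapsed localizer: average across conditions first, so the choice of
# component windows and latencies cannot bias the later condition contrast.
collapsed = (approach + recede) / 2

def peak_latency(window_ms, polarity):
    """Latency (ms) of the most extreme collapsed-waveform point in a window."""
    lo, hi = window_ms
    idx = (times >= lo) & (times <= hi)
    segment = collapsed[idx]
    peak = segment.argmax() if polarity == "P" else segment.argmin()
    return int(times[idx][peak])

# Hypothetical positive component window, analogous to P286 (202-354 ms).
latency = peak_latency((202, 354), "P")
print(f"Collapsed peak latency: {latency} ms")

# The condition comparison then uses mean amplitude within the localized window.
idx = (times >= 202) & (times <= 354)
print(f"Approach mean: {approach[idx].mean():.2f}, recede mean: {recede[idx].mean():.2f}")
```

Because the window is fixed from the condition-averaged waveform, any amplitude difference subsequently found between conditions is not an artifact of window selection.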
Among four-month-olds, the researchers identified five main components in the SEPs labeled based on their polarity (positive or negative) and peak latency. These were named P286, N398, P506, N560, and P662, with respective time ranges of occurrence being 202–354 ms, 356–440 ms, 442–548 ms, 550–598 ms, and 600–700 ms.
The average amplitudes of the SEP components were larger when the tactile stimulus followed an approaching (rather than a receding) visual motion, suggesting that in four-month-olds the brain's response to touch is stronger when the touch is preceded by approaching motion.
In eight-month-olds, this pattern was reversed. Their SEPs showed four components, named P240, N362, P470, and N572, with respective time ranges of occurrence of 202–310 ms, 312–418 ms, 420–526 ms, and 528–636 ms after tactile stimulus onset. However, t-tests did not show any consistent, significant differences in the SEP difference waves between the two study conditions.
Overall, the study data confirmed that human infants as young as four months of age can coordinate multisensory information presented across different locations and time points, a capacity that supports perceiving the self in dynamic relation to the surrounding world.
Notably, however, the SEPs of four-month-olds were enhanced following approaching visual stimuli, whereas those of eight-month-olds showed the reverse pattern, with larger SEPs to tactile stimuli that followed receding visual stimuli.
- Orioli, G., Parisi, I., van Velzen, J.L. et al. Visual objects approaching the body modulate subsequent somatosensory processing at 4 months of age. Sci Rep 13, 19300 (2023). https://doi.org/10.1038/s41598-023-45897-4