Autistic children often struggle with essential social and communication skills, and the type and severity of their symptoms vary widely. Because they express their emotions in such diverse ways, it is difficult for therapists to gauge how engaged they are with the therapy.
To track their progress, many therapists film their sessions and spend hours watching the footage and analyzing the child’s reactions to determine how effective a particular approach was. This is laborious and time-consuming work.
In response, the EU-funded ENGAGEME project is developing state-of-the-art software to enable the robots used as assistive tools in autism therapy to ‘perceive’ a child’s emotions and level of engagement. With this information, the robot can respond to the child more naturally and effectively, and also document the therapy progress automatically.
‘The technology being developed can enable personalized interactions with an autistic child, helping the therapist to keep the child’s attention, which improves their ability to learn the content,’ says project researcher Ognjen Rudovic, a postdoctoral Marie Curie Fellow in the Affective Computing group at the Massachusetts Institute of Technology (MIT Media Lab), US. It also allows the therapist to more quickly determine which learning approach works best for each child.
‘We are currently working on a prototype, the goal of which is to enable the use of our personalized machine-learning algorithms within everyday autism therapy,’ says Rudovic. ‘These computing tools can significantly reduce the therapists’ efforts, allowing them to focus on the design of more effective therapy content that will have better learning outcomes for children with autism.’
Personalized deep learning
Capturing an autistic child’s attention, even for a few seconds, is challenging. The non-threatening-looking NAO robot used in this project makes doing so easier. ENGAGEME is equipping the robots with software that draws on several areas of artificial intelligence (AI): machine learning, computer vision, affective computing and human-robot interaction.
More precisely, the researchers use deep-learning algorithms to build a more personalized analysis of the child’s behavior from data recorded by cameras and microphones – facial expressions, head pose, body language and voice.
In addition, a wrist-worn sensor records heart rate and skin conductivity. The algorithm analyzes these data to determine the child’s level of engagement and emotional state.
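The multimodal pipeline described above can be sketched as follows. This is a minimal illustration, not the project's actual model: the feature extractors are placeholders standing in for the real computer-vision, audio and physiological front-ends, and the fused network is a toy with random weights.

```python
import math
import random

random.seed(0)

def extract_features():
    """Placeholder per-frame features; in the real system these would come
    from vision/audio models and the wrist-worn sensor."""
    return {
        "face": [random.gauss(0, 1) for _ in range(16)],   # expression, head pose
        "audio": [random.gauss(0, 1) for _ in range(8)],   # voice features
        "physio": [random.gauss(0, 1) for _ in range(2)],  # heart rate, skin conductance
    }

class EngagementEstimator:
    """Toy fused model: concatenate modalities, one hidden layer, sigmoid output."""
    def __init__(self, in_dim=26, hidden=8):
        self.W1 = [[random.gauss(0, 0.1) for _ in range(in_dim)]
                   for _ in range(hidden)]
        self.w2 = [random.gauss(0, 0.1) for _ in range(hidden)]

    def predict(self, feats):
        # Early fusion: one vector across all modalities.
        x = feats["face"] + feats["audio"] + feats["physio"]
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in self.W1]
        z = sum(w * hi for w, hi in zip(self.w2, h))
        return 1.0 / (1.0 + math.exp(-z))  # engagement score in (0, 1)

score = EngagementEstimator().predict(extract_features())
```

A real deployment would replace the random weights with trained ones and add a second output head for emotional state (e.g. valence and arousal), but the fusion structure is the key idea.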
Rosalind Picard, director of the Affective Computing group at the MIT Media Lab and Rudovic’s research advisor, says the challenge of creating machine learning and AI that works in autism is particularly vexing.
‘This is because the usual AI methods require a lot of data that are similar for each category that is learned. In autism, where heterogeneity reigns, the normal AI approaches fail,’ says Picard.
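One common answer to this heterogeneity is personalization: start from a model trained on the whole group, then fine-tune a copy of it on a handful of examples labeled for one specific child. The sketch below illustrates the idea with a toy logistic model over two hypothetical features; all the numbers are illustrative, not from the project.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x):
    """Logistic prediction: probability the child is engaged."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)))

def personalize(shared_w, child_data, lr=0.5, epochs=200):
    """Fine-tune a *copy* of the shared model on one child's labeled frames
    via gradient descent on the logistic loss."""
    w = list(shared_w)
    for _ in range(epochs):
        for x, y in child_data:
            err = predict(w, x) - y  # log-loss gradient w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

# Hypothetical features: [bias, normalized body-movement level].
shared_w = [0.0, 2.0]           # group model: more movement => engaged
child_data = [([1.0, 0.9], 0),  # but for this child, big movements
              ([1.0, 0.8], 0),  # accompany disengagement...
              ([1.0, 0.1], 1)]  # ...while stillness means focus
w_child = personalize(shared_w, child_data)
```

After fine-tuning, the personalized weights flip the group model's reading of large body movements for this child, which is exactly the kind of per-child adjustment a one-size-fits-all classifier cannot make.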
ENGAGEME is also unique in that it is the first project to analyze behavioral cues from autistic children with different cultural backgrounds – in this case, Japan and Europe.
A group of 35 children from Japan and Serbia, aged 3 to 13, took part in a study as part of the project. The results showed that the robots are more consistent than human therapists at estimating a child’s emotion and engagement states.
The researchers noticed that the children from the two cultural backgrounds communicated different states using similar body movements. The Japanese children communicated low levels of engagement, including disengagement during the therapy, using large body movements. On the other hand, the European children showed more bodily movements when they were highly engaged.
‘This is an important finding for the design of future child-robot interactions as part of the therapy, as it may allow a robot to easily adapt its interpretations of children's behaviors based on their cultural background,’ says Rudovic.
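The cultural finding above could be encoded very simply: the same body-movement cue contributes to the engagement estimate with an opposite sign depending on the child's background. This is a hypothetical sketch of that adaptation, with made-up weights.

```python
# Per the study's finding: large body movements signaled *low* engagement
# in the Japanese group but *high* engagement in the European group.
MOVEMENT_SIGN = {"japan": -1.0, "europe": +1.0}  # illustrative values

def interpret_movement(movement_level, culture, baseline=0.5, weight=0.3):
    """Map a normalized body-movement level (0..1) to an engagement estimate,
    flipping the cue's contribution by cultural background."""
    score = baseline + MOVEMENT_SIGN[culture] * weight * (movement_level - 0.5)
    return min(1.0, max(0.0, score))  # clamp to [0, 1]

# The same large movement reads as disengagement for one group
# and as high engagement for the other.
low = interpret_movement(0.9, "japan")
high = interpret_movement(0.9, "europe")
```

In practice this cue would be one input among many (face, voice, physiology), but the lookup shows how cheaply a robot could switch interpretation rules per background.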
Working with industry
In more recent work, the researchers showed that their image-analysis system enables humanoid robots to successfully read the facial expressions of children with autism.
‘This can help the robots to better gauge facial expressions of autistic children and teach them to recognize facial expressions of their typically developing peers, leading to better social communication, which is one of the main challenges for autistic children,’ he says.
Currently, the researchers are trying to deploy their technology in schools with autistic children in the UK, in collaboration with the DE-ENIGMA project, one of the biggest European projects designing assistive tools for autism therapy.
Another important aspect is the project’s close collaboration with businesses that produce robots for use as assistive tools in autism therapy.
‘We have received lots of interest from institutions, and parents of autistic children, in Japan, Europe and the USA in using our technology,’ Rudovic says. ‘Our next step is to team up with robotics companies to scale up our software and run large clinical trials that would allow us to make a real impact with our technology. It is important for Europe to be at the forefront of autism research and to deliver state-of-the-art therapy for children with autism, reducing the later costs of this condition, which run to billions of euros.’
ENGAGEME received funding through the EU’s Marie Skłodowska-Curie fellowship program. Rudovic, originally from Serbia and a UK national, is the main recipient of the Marie Curie grant, which has enabled him to establish himself as one of the top researchers in affective computing and machine learning.
‘This has advanced my career prospects – currently there is a lot of interest from the academic and industry sectors in my research, and I am very excited about continuing it,’ says Rudovic.