Tele-Operating an Android Robot to Promote the Understanding of Facial Expressions and to Increase Facial Expressivity in Individuals With Autism Spectrum Disorder
To the Editor: Facial expression is crucial for conveying emotions and for communication. Altered facial expressivity may contribute to social isolation and difficulties in emotional regulation (1). Individuals with autism spectrum disorder (ASD) exhibit reduced facial expressivity (2). However, few studies have investigated interventions to improve facial expressivity in these patients. This letter documents the case of a patient with ASD who received an intervention using an android robot. The patient provided informed consent, and the study design was approved by the appropriate ethics review boards.
An 18-year-old man with ASD diagnosed according to DSM-5 criteria had a history of hospitalization because of social impairment, including a suicide attempt. He did not meet criteria for any other DSM-5 diagnosis. His Childhood Autism Rating Scale total score (34) indicated mild to moderate ASD, whereas his full-scale IQ was high (141). Despite extensive treatment, including behavioral therapy to improve communication skills, his aversion to communicating with others persisted. He did not understand the importance of facial expression and required training to convey his feelings through facial expressions. Considering his interest in advanced technology, we decided to use an android robot for the intervention. We selected an android robot rather than an avatar because we believed a three-dimensional learning environment, in which the patient interacted with an android robot, would be more powerful than one involving interaction with an on-screen avatar. We used an ACTROID-F robot (Figure 1) (Kokoro Co. Ltd.), which closely resembles a human being in realism and facial expressions (3). During the intervention, an operator entered words into a computer, which were read aloud by ACTROID-F. The operator could also replicate facial expressions, such as a smile, surprise, or sorrow, through ACTROID-F, and could monitor the expressions made by ACTROID-F and the interlocutor via video.
We encouraged the patient to communicate with a teacher by operating ACTROID-F himself. Each session lasted approximately 30 minutes, and he underwent five sessions. After each session, the patient reviewed what he had learned through the communication with his teacher. By operating the robot, he could voluntarily control its facial expressions and observe the interaction from a panoramic viewpoint.
Through this intervention, the patient came to understand the meaning of facial expressions and to recognize their importance in communication. Afterward, he used facial expressions to convey his subjective emotions in daily life, became interested in having conversations with others, and gained self-confidence. Before the intervention, the patient believed that his lack of confidence in communicating made it difficult to pursue his goals; afterward, he was confident he could overcome his communication issues and went on to gain admission to his preferred university. Although university admission is unlikely to be a common outcome after five treatment sessions with a robot, robotic intervention can serve as a starting point.
Simulating the facial expression of emotions is known to promote the recognition of facial expressions (1). During the intervention, the facial expressions generated with ACTROID-F promoted the patient's understanding of facial expressions and their importance. The productive interaction between the patient and the robot may be attributable to the patient's interest in advanced technology. These case findings suggest that intervention using a robot might foster the understanding of facial expressions and increase facial expressivity in individuals with ASD. An important next step would be a single-case experimental design with regular, repeated measurement of the key outcome variables and other relevant variables.
1. : The Simulation of Smiles (SIMS) model: embodied simulation and the meaning of facial expression. Behav Brain Sci 2010; 33:417–433
2. : Facial expression to emotional stimuli in non-psychotic disorders: a systematic review and meta-analysis. Neurosci Biobehav Rev 2016; 64:252–271
3. Yoshikawa M, Matsumoto Y, Sumitani M, et al: Development of an android robot for psychological support in medical and welfare fields. IEEE International Conference on Robotics and Biomimetics. Karon Beach, Thailand, 2011; 2378–2383