Letters

Tele-Operating an Android Robot to Promote the Understanding of Facial Expressions and to Increase Facial Expressivity in Individuals With Autism Spectrum Disorder

To the Editor: Facial expression is crucial for conveying emotion and communicating with others. Altered facial expressivity can contribute to social isolation and to difficulties with emotional regulation (1). Individuals with autism spectrum disorder (ASD) exhibit reduced facial expressivity (2), yet few studies have investigated interventions for improving facial expressivity in these patients. This letter documents the case of a patient with ASD who received an intervention using an android robot. The patient provided informed consent, and the study design was approved by the appropriate ethics review boards.

An 18-year-old man with ASD diagnosed according to DSM-5 criteria had a history of hospitalization because of social impairment, including a suicide attempt; he did not meet criteria for any other DSM-5 diagnosis. His Childhood Autism Rating Scale total score (34) indicated mild to moderate ASD. Despite extensive treatment, including behavioral therapy to improve communication skills, his aversion to communicating with others persisted. He did not understand the importance of facial expression and required training in describing his feelings through facial expressions, although his full-scale IQ was high (141). Considering his interest in advanced technology, we decided to use an android robot for the intervention. We selected an android robot rather than an avatar because we believed that a three-dimensional learning environment, in which the patient interacted with an android robot, might be more powerful than one involving interaction with an avatar. We used an ACTROID-F robot (Figure 1) (Kokoro Co., Ltd.), which closely resembles a human being in appearance and facial expression (3). During the intervention, an operator typed words into a computer, and ACTROID-F read them aloud. The operator could also have ACTROID-F produce facial expressions, such as a smile, surprise, or sorrow, and could monitor the expressions of both ACTROID-F and the interlocutor via video.
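
The tele-operation workflow described above (typed utterances read aloud by the robot, with preset expressions triggered on demand) can be pictured as a simple operator loop. The Python sketch below is purely illustrative: the letter does not describe ACTROID-F’s actual control software, so RobotLink, say_text, and set_expression are hypothetical names standing in for whatever interface was actually used.

# Illustrative sketch only: the letter does not describe ACTROID-F's control
# interface, so RobotLink, say_text(), and set_expression() are hypothetical
# names, not the real tele-operation API.

class RobotLink:
    """Hypothetical connection to a tele-operated android robot."""

    def say_text(self, text: str) -> None:
        # Stand-in for sending text to the robot's text-to-speech output.
        print(f"[robot speaks] {text}")

    def set_expression(self, expression: str) -> None:
        # Stand-in for triggering one of the robot's preset facial expressions.
        print(f"[robot shows] {expression}")


EXPRESSIONS = {"1": "smile", "2": "surprise", "3": "sorrow"}


def operate(robot: RobotLink) -> None:
    # The operator types an utterance for the robot to read aloud, or a
    # digit to trigger a preset expression, mirroring the workflow above.
    while True:
        entry = input("utterance (1=smile, 2=surprise, 3=sorrow, q=quit): ").strip()
        if entry == "q":
            break
        if entry in EXPRESSIONS:
            robot.set_expression(EXPRESSIONS[entry])
        elif entry:
            robot.say_text(entry)


if __name__ == "__main__":
    operate(RobotLink())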

We encouraged the patient to communicate with a teacher by operating ACTROID-F. Each intervention session lasted approximately 30 minutes, and the patient underwent five sessions. After each session, he reviewed what he had learned through the communication with his teacher. Operating the robot allowed him to voluntarily control facial expressions and to watch the interaction from a panoramic viewpoint.

FIGURE 1. A Photograph of the Android Robot ACTROID-F Used to Treat Individuals With Autism Spectrum Disorder

Through this intervention, the patient learned to understand the meaning of facial expressions and realized their importance in communication. Afterward, he used facial expressions to express his emotions in daily life, became interested in having conversations with others, and gained self-confidence. Before the intervention, he had thought it difficult to pursue his dreams because he lacked confidence in communicating; after the intervention, he was confident that he could overcome his communication issues, and he went on to pursue his dreams by joining his preferred university. Although university admission is unlikely to be a common outcome after five treatment sessions with a robot, a robotic intervention can serve as a starting point.

The simulation of the facial expression of emotions is known to promote the recognition of facial expressions (1). During the intervention, the facial expressions generated by ACTROID-F promoted the patient’s understanding of facial expressions and of their importance. The benefits of this interaction between patient and robot could be attributed to the patient’s interest in advanced technology. These case findings suggest that an intervention using a robot might trigger the understanding of facial expressions and increase facial expressivity in individuals with ASD. An important future study would use a single-case experimental design with repeated measurement of the key outcome variables and other relevant variables.

From the Research Center for Child Mental Development, Kanazawa University, Ishikawa, Japan; the Department of Neuropsychiatry, Keio University School of Medicine, Tokyo; the Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, Osaka, Japan; the Service Robotics Research Group, Intelligent Systems Institute, National Institute of Advanced Industrial Science and Technology, Ibaraki, Japan; and the Department of Psychosocial Medicine, National Center for Child Health and Development, Tokyo.
Address correspondence to Dr. Kumazaki.

Supported by Grants-in-Aid for Scientific Research from the Japan Society for the Promotion of Science (15H01577), by the Erato Ishiguro Symbiotic Human-Robot Interaction Project, and by the Center of Innovation Program from the Japan Science and Technology Agency.

Dr. Yoshikawa and Dr. Ishiguro have advisory roles at Vstone, which manufactures a commercial robot not used in this study. Dr. Ishiguro also has an advisory role at A-Lab, which manufactures a commercial robot not used in this study. Within the past 3 years, Dr. Mimura has received grants and/or speakers honoraria from Asahi Kasei, Astellas, Daiichi Sankyo, Dainippon Sumitomo, Eisai, Eli Lilly, FUJIFILM RI, Janssen, Kracie, Meiji Seika, Mitsubishi Tanabe, Mochida, MSD, Novartis, Ono Yakuhin, Otsuka, Pfizer, Shionogi, Takeda, and Yoshitomi Yakuhin. The other authors report no financial relationships with commercial interests.

References

1 Niedenthal PM, Mermillod M, Maringer M, et al.: The Simulation of Smiles (SIMS) model: embodied simulation and the meaning of facial expression. Behav Brain Sci 2010; 33:417–433

2 Davies H, Wolz I, Leppanen J, et al.: Facial expression to emotional stimuli in non-psychotic disorders: a systematic review and meta-analysis. Neurosci Biobehav Rev 2016; 64:252–271

3 Yoshikawa M, Matsumoto Y, Sumitani M, et al.: Development of an android robot for psychological support in medical and welfare fields, in Proceedings of the IEEE International Conference on Robotics and Biomimetics. Karon Beach, Thailand, 2011, pp 2378–2383