Introduction

Autism spectrum disorders (ASD) are marked by core features including impaired social communication and restrictive and repetitive behaviors and interests (Diagnostic and Statistical Manual of Mental Disorders, 5th Edition: DSM-5). Currently available treatment options are largely based on addressing comorbid psychiatric, neurological, or medical conditions, as core symptoms of autism are often refractory to current pharmacotherapeutic options (Ji and Findling 2015). Thus, many attempts to develop novel treatments have been directed towards ameliorating the core features, with hopes of possibly changing the natural course of the illness (Accordino et al. 2016).

In the past decade, we have seen digital technologies, especially mobile phones and smartphones, transform how people across the world communicate and access information. There are now more phones than there are people in the world. With a continual shortage of child psychiatrists both in the USA and around the world, the potential for technology to deliver or augment ASD services is tremendous. Like the rest of healthcare, the digital revolution has impacted the ASD community, as mobile device-based software and smartphone/tablet apps are constantly being developed and made commercially available for patients and their families (Shic and Goodwin 2015). However, few of the apps for ASD are formally tested or offer literature to help inform their actual use in patients. The limited papers that do exist offer encouraging evidence about the potential and feasibility of apps in ASD (An et al. 2017; Chen et al. 2017; Fletcher-Watson et al. 2016; Gyori et al. 2015), but overall little is known about the efficacy of these apps in real-world settings.

Yet despite this limited evidence, a quick search of the term “autism” in the Apple or Google app stores returns a sea of related apps. With so many apps directly available for parents and even patients to download—yet so little clear evidence—how can clinicians best proceed and navigate this evolving app space? The American Psychiatric Association developed a tool for psychiatrists in 2016 to help guide their patients in mobile technology use (https://www.psychiatry.org/psychiatrists/practice/mental-health-apps). However, this resource is an evaluation rubric that assists with informed decision-making regarding apps; it does not recommend specific ASD apps. Interestingly, the Autism Speaks website does provide a list of ASD apps via its publicly accessible website (https://www.autismspeaks.org/autism-apps). While there is no official curation of ASD apps, this Autism Speaks website appears to be one of the largest curated collections of ASD apps. Thus, it can serve as a useful proxy for surveying the available ASD apps and assessing the evidence supporting them.

Assessing the evidence supporting ASD apps is critical, as, at least in other areas of psychiatry, studies have shown a significant discrepancy between presumed benefit and actual clinical applicability or efficacy of apps (Gajecki et al. 2014; Heffner et al. 2015; Kertz et al. 2017). Despite numerous apps claiming to be evidence-based, this purported evidence often refers only to evidence-based principles, such as cognitive behavioral therapy, which may or may not work equally well when delivered in a new digital format and app design (Fletcher-Watson et al. 2016). For example, in some individuals with ASD, skills learned through computer-based programs do not always translate into daily life (Fletcher-Watson et al. 2016; Whyte et al. 2015). Thus, understanding the evidence for available apps is an essential process for clinicians and families in making an informed decision regarding the use of a mobile device app in ASD.

In this study, we aim to (1) examine the available mobile device-based applications for ASD, (2) consider currently available evidence, and (3) discuss associated issues, with a main goal to guide the process of app selection by clinicians, patients, and families.

Method

We reviewed evidence for commercially available mobile device apps for ASD. As there is no authoritative source of ASD apps, we selected apps listed in the “Autism Apps” section of the official Autism Speaks website (https://www.autismspeaks.org/autism-apps). According to Autism Speaks, apps are assigned research ratings as follows:

  • Anecdotal = No specific or related scientific studies for this type of app

  • Research = There are some related scientific studies, but no direct research support for this type of app or technology

  • Evidence = There is solid or specific scientific evidence that this type of app or technology is helpful.

We first reviewed the articles linked to apps rated by Autism Speaks as having either “research” or “evidence.” Next, we searched for evidence for these apps by searching PubMed, PsycINFO, and the app developers’ websites for any related scientific articles or publications about the app. Lastly, we evaluated each app with our own simple criteria to rate its level of evidence, as follows:

  • Direct evidence = a study looking specifically at a certain app with regard to targeting symptoms of ASD

  • Indirect evidence = a study providing evidence for a certain method that is utilized in an app with regard to targeting symptoms of ASD

To test reliability, two raters reviewed an initial random subset of 10 apps from the Autism Speaks website. Given the simplicity of our criteria of direct or indirect evidence offered by a study, the two raters had perfect agreement.

Results

Autism Speaks Ratings

Based on a search on 3/4/17, we found a total of 695 apps in the Autism Speaks apps section (https://www.autismspeaks.org/autism-apps). Those 695 apps were rated as anecdotal, research, or evidence according to Autism Speaks’ own research rating criteria. Only 40 apps were rated as “evidence” by Autism Speaks, and they had a total of nine supporting articles cited. Fifty-two apps rated by Autism Speaks as “research” had 12 articles cited, of which five articles were duplicates of those cited for apps rated as “evidence.” Two of the cited studies could not be found.

Our Team’s Ratings

Based on a search 1 month later, on 4/4/17, we found zero apps that met our definition of possessing direct evidence among those rated by Autism Speaks as having “evidence.” Of note, three apps were no longer available, which suggests the Autism Speaks list of apps is updated on at least a monthly basis. Among those rated by Autism Speaks as having “research,” we found five apps with direct evidence, which included a thesis and a book chapter, and nine apps were no longer available.

Updates and Availability

Based on our repeat search on 8/2/17, we found that 12 apps were no longer available, and 10 apps were last updated before 2015, among those rated by Autism Speaks as having “evidence.” Among those rated by Autism Speaks as “research,” 15 apps were no longer available, and 14 were last updated before 2015 (Fig. 1 and Table 1).

Fig. 1
figure 1

Flowchart of review process

Table 1 An overview of evidence for autism apps by number as examined on 8/2/17. The total of 603 for anecdotal evidence is calculated as the total of 695 minus the 92 with evidence or research

Discussion

There are nearly 700 mobile device apps listed in the “Autism Apps” section of the Autism Speaks website (Autism Apps 2017). However, only a small portion of these apps labeled as having evidence were found to actually have any clinical evidence (4.9%) supporting their use or benefit. The vast majority of apps targeting autism (95.1%) offered no clear indirect or direct evidence.

A few apps offered indirect evidence in the form of links or references to non-specific clinical research articles on the particular principles of autism care on which the apps were developed and based. For example, Augmentative and Alternative Communication (AAC) has been shown to offer benefits in autism populations (Iacono et al. 2016); however, less is known about how such benefits translate into a smartphone app and what is gained or lost when delivered on a phone. Apps based on principles like this are said to have indirect evidence to support them until direct evidence is obtained. But as previously mentioned, studies have shown a significant discrepancy between evidence for a treatment modality in regular clinical settings and that in a mobile device app setting (Gajecki et al. 2014; Heffner et al. 2015; Kertz et al. 2017). Thus, precautions are warranted for indirect evidence, as such a discrepancy could lead not only to lack of efficacy but also to potential harm to users (Gajecki et al. 2014; Heffner et al. 2015; Kertz et al. 2017).

Yet an even smaller fraction of apps (0.6%) had direct evidence. Most instances of such direct evidence were pilot studies rather than rigorous clinical trials. However, it is important to note that these apps do have evidence that directly addresses their utility for mobile device users, and developers or independent researchers should be encouraged to seek direct evidence for their products. Generating direct evidence can be difficult for app makers, as it involves partnering with researchers and undertaking the often costly and time-consuming process of clinical research. New research methods and protocols can greatly reduce the cost and shorten the length of app-based studies, meaning the barriers to direct app research are often lower than expected (Hekler et al. 2016).

Our finding that research has not kept pace with the volume of ASD apps available is consistent with results from other branches of psychiatry. A recent review paper of smartphone apps for mood disorders identified only 29 papers (Dogan et al. 2017), and one on schizophrenia identified only 11 papers (Torous et al. 2017). Another recent review study, in which the authors looked at the use of mobile touch-screen apps by patients with developmental disorders, also suggested that most of the existing studies had only suggestive evidence, small subject numbers, and likely biases, as they were often funded and performed by the developers (Stephenson and Limbrick 2015). A systematic meta-review published this year also indicates that the benefits of digital health interventions (DHI), which include smartphone apps, are as yet without convincing evidence (Hollis et al. 2017). Further, without any standardized outcome reporting for app usability or engagement, and with numerous studies using varied outcome metrics and clinical scales, the heterogeneity of the mental health app research space has precluded any definitive statements about the effect size or impact of these tools. However, it would be unfair and even counterproductive to require that such technological advances wait for research findings and clinical evidence to catch up (Gyori et al. 2015). That an app for ASD does not currently have evidence does not mean that it is automatically ineffective or unsafe. Still, this current situation of hundreds of apps but limited evidence does present a challenge for both psychiatrists and patients. It means that making an informed decision regarding an app is not a simple process, and one that demands weighing risks and benefits. It also means there is a need to look deeper, as exemplified by our results showing that the purported evidence cited on the Autism Speaks website may not offer the quantity, quality, rigor, or merit that a reasonable person would expect.

Like all studies, ours has several limitations. Our study had two physician raters for app evaluations, and we only assessed inter-rater reliability on a random subset of 10 apps. However, given the simplicity of our classification scheme and the perfect reliability on that sample of 10 apps, we believe our methods are appropriate. Also, we based our search for ASD-related apps on the Autism Speaks website apps section, and this selection may not be comprehensive or exhaustive of all ASD apps. However, there is no authoritative source for ASD apps, and our selection of this source seems at least reasonable.

The rising demand for and supply of educational and medical mobile device apps for families and patients with ASD call for more structured guidelines. As working solutions, it seems reasonable to encourage informed dialogue between patients and providers, or to consider using guided frameworks such as the American Psychiatric Association’s freely available smartphone app evaluation tool. The ASD community will also benefit from new partnerships between patients, families, clinicians, and app makers, with the goal of bringing as many stakeholders as possible together to work as a team and find ways to ensure that apps are effective and safe (Pulier and Daviss 2017).