Implementation science approaches for integrating eHealth research into practice and policy

https://doi.org/10.1016/j.ijmedinf.2013.07.002

Abstract

Purpose

To summarize key issues in the eHealth field from an implementation science perspective and to highlight illustrative processes, examples and key directions to help more rapidly integrate research, policy and practice.

Methods

We present background on implementation science models and emerging principles; discuss implications for eHealth research; provide examples of practical designs, measures and exemplar studies that address key implementation science issues; and make recommendations for ways to more rapidly develop and test eHealth interventions as well as future research, policy and practice.

Results

The pace of eHealth research has generally not kept up with technological advances, and many of our designs, methods and funding mechanisms are incapable of providing the types of rapid and relevant information needed. Although there has been substantial eHealth research conducted with positive short-term results, several key implementation and dissemination issues such as representativeness, cost, unintended consequences, impact on health inequities, and sustainability have not been addressed or reported. Examples of studies in several of these areas are summarized to demonstrate this is possible.

Conclusions

eHealth research that is intended to translate into policy and practice should be more contextual, report more on setting factors, employ more responsive and pragmatic designs and report results more transparently on issues important to potential adopting patients, clinicians and organizational decision makers. We outline an alternative development and assessment model, summarize implementation science findings that can help focus attention, and call for different types of more rapid and relevant research and funding mechanisms.

Section snippets

Background

There is consistent and increasing evidence that eHealth interventions are efficacious [1], [2], [3], [4]. For the purposes of this paper, we define eHealth interventions as “the use of emerging information and communication technology, especially the Internet, to improve or enable health and health care”, including Internet, interactive voice response (IVR), automated and electronic programs, CD-ROMs, mobile applications, and computer-tailored print, but exclude telemedicine and interventions

The need for speed: a Rapid and Relevant Research Paradigm

It is possible to conduct research much more rapidly, thereby increasing relevance and making it more likely that results will translate into policy and practice.

Fig. 1b displays a more rapid and relevant intervention development approach that may be particularly well suited to eHealth research given the fast pace of technological development, large-scale data collection capabilities, and the potential for eHealth to reach a broad range of people. This paradigm would allow for dissemination of

Implementation science models

Over 60 implementation science (IS) frameworks have been developed to address health services across a wide diversity of issues [17] and can be used to help design more rapid and relevant research. Reviews have analyzed these conceptual and theoretical models with mixed findings regarding the development of theoretical concepts, methods, and measures available for direct application [17], [28], [29], [30], [31]. IS frameworks are increasingly applied and widely adopted in health services research. However, to our knowledge

Rapid learning methods for eHealth research

To increase research that is not only rigorous but rapid, relevant, robust, recursive and transparent [47], there is a need for more pragmatic “rapid-learning research systems” that integrate relevant stakeholders (e.g., researchers, funders, health systems, and community partners) with clinically relevant research questions, use efficient and innovative research designs, and leverage rich, longitudinal data sets [20], [48], [49]. Rapid learning methods involve evaluability assessments [23] to

Rapid, adaptive designs

Insights from evaluability assessments inform prototypes and quick refinements of eHealth intervention components as discussed above. Several optimization approaches adapted from industry and engineering may be particularly useful for quickly testing and refining eHealth intervention components [58], including dynamic systems models [59], N-of-1 designs [60], [61], A–B quasi-experimental designs [62], the multiphase optimization strategy (MOST) [63], and sequential multiple assignment randomized trials (SMART) [63]. These trial designs
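To make the sequential multiple assignment (SMART) idea concrete, the following sketch simulates its defining feature: participants are first randomized between two initial eHealth components, and non-responders are re-randomized between two augmentation options. The component names, response probabilities, and sample size are hypothetical illustrations, not drawn from the paper or from any real trial.

```python
import random

random.seed(1)

def first_stage():
    """Stage 1: randomize to one of two initial eHealth components."""
    return random.choice(["text_messages", "web_program"])

def second_stage(responder):
    """Stage 2: responders continue unchanged; non-responders are
    re-randomized between two augmentations (SMART's defining feature)."""
    if responder:
        return "continue"
    return random.choice(["add_coaching", "switch_to_ivr"])

def simulate_participant():
    arm1 = first_stage()
    # Hypothetical response probabilities, for illustration only.
    responder = random.random() < (0.5 if arm1 == "web_program" else 0.4)
    arm2 = second_stage(responder)
    return arm1, responder, arm2

trial = [simulate_participant() for _ in range(1000)]

# Tally how many participants followed each (stage 1, stage 2) sequence.
sequences = {}
for arm1, responder, arm2 in trial:
    sequences[(arm1, arm2)] = sequences.get((arm1, arm2), 0) + 1

for key, n in sorted(sequences.items()):
    print(key, n)
```

Comparing the embedded adaptive strategies (e.g., "web program, then add coaching for non-responders") is what distinguishes a SMART from a conventional parallel-arm trial.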

Comparative effectiveness research (CER)

The Institute of Medicine defines CER as “the generation and synthesis of evidence that compares the benefits and harms of alternative methods to prevent, diagnose, treat and monitor a clinical condition, or to improve the delivery of care”, with the goal of helping consumers, clinicians and policy makers make informed decisions that will improve healthcare [65]. eHealth may be optimally suited for CER because a wealth of relevant data can be collected quickly to compare outcomes between two
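Because eHealth platforms log outcomes continuously, a head-to-head comparison of two delivery modes can be run on routinely collected data. As a minimal sketch of such a comparison (the outcomes, arm labels, and effect sizes below are simulated and invented for illustration), the following computes a difference in mean outcomes with a normal-approximation confidence interval:

```python
import random
import statistics

random.seed(7)

# Simulated weight-change outcomes (kg) for two eHealth delivery modes;
# the means and spread are invented for illustration only.
arm_a = [random.gauss(-2.0, 3.0) for _ in range(400)]  # e.g., text messaging
arm_b = [random.gauss(-1.2, 3.0) for _ in range(400)]  # e.g., static website

def mean_diff_ci(x, y, z=1.96):
    """Difference in means with a normal-approximation 95% CI."""
    diff = statistics.mean(x) - statistics.mean(y)
    se = (statistics.variance(x) / len(x) + statistics.variance(y) / len(y)) ** 0.5
    return diff, (diff - z * se, diff + z * se)

diff, (lo, hi) = mean_diff_ci(arm_a, arm_b)
print(f"difference in means: {diff:.2f} kg (95% CI {lo:.2f} to {hi:.2f})")
```

A real CER analysis would also adjust for confounding and missing data; the point here is only that the comparison itself is cheap once both arms are instrumented.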

Pragmatic trials

Pragmatic trials evaluate the effectiveness of interventions in real-life routine practice conditions in order to maximize applicability and generalizability, and seek to maximize heterogeneity in all aspects (participants, staff, settings) [24], [26], [67]. Pragmatic trials are used to address questions relevant to, and informed by, stakeholders and use comparison conditions that are real-world alternatives. The pragmatic-explanatory continuum indicator tool (PRECIS) [24] can be used to

More relevant RCTs

Dismantling and stepped-wedge [70] designs, as well as modified follow-up periods, can be used to enhance the relevance of traditional RCT designs. Dismantling designs, particularly Minimal Intervention Needed for Change (MINC) comparison conditions [71], [72], can be used to find low-cost, minimally intensive interventions that improve outcomes. Stepped-wedge designs maximize statistical power because the intervention effect is estimated from both between-cluster and within-cluster comparisons and
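The stepped-wedge logic can be sketched in a few lines: clusters cross from control to intervention at staggered steps, every cluster is eventually exposed, and each cluster contributes both unexposed and exposed periods to the estimate. The cluster counts, secular trend, and intervention effect below are invented for illustration, and the crude within-cluster estimator shown deliberately ignores the secular trend that a full analysis [70] would model.

```python
import random
import statistics

random.seed(3)

N_CLUSTERS, N_STEPS = 4, 5  # step 0 = baseline; one cluster crosses per step

def exposed(cluster, step):
    """Stepped wedge: cluster c crosses over at step c + 1 and stays exposed."""
    return step > cluster

# Simulate cluster-period means: cluster effect + small secular trend +
# a hypothetical intervention effect of -1.0 (all values are illustrative).
TRUE_EFFECT = -1.0
cluster_effect = [random.gauss(0, 0.5) for _ in range(N_CLUSTERS)]
data = {}
for c in range(N_CLUSTERS):
    for t in range(N_STEPS):
        mean = 10 + cluster_effect[c] + 0.1 * t
        if exposed(c, t):
            mean += TRUE_EFFECT
        data[(c, t)] = mean + random.gauss(0, 0.2)

# Crude estimate: average each cluster's exposed-minus-unexposed difference.
within = []
for c in range(N_CLUSTERS):
    on = [data[(c, t)] for t in range(N_STEPS) if exposed(c, t)]
    off = [data[(c, t)] for t in range(N_STEPS) if not exposed(c, t)]
    within.append(statistics.mean(on) - statistics.mean(off))

print("design matrix (1 = exposed):")
for c in range(N_CLUSTERS):
    print([int(exposed(c, t)) for t in range(N_STEPS)])
print(f"crude within-cluster effect estimate: {statistics.mean(within):.2f}")
```

The printed design matrix makes the power argument visible: the estimate pools vertical (between-cluster, same period) and horizontal (within-cluster, across periods) contrasts.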

Practical harmonized measures

Two classes of measures are relevant to this discussion: “gold standard” and “practical.” “Gold standard” measures are recommended for primary outcomes in grants and are most useful when substantial resources and staff are available to ensure quality. “Practical” measures are often necessary for applied research in busy, low-resource settings, or when large amounts of data are being collected and brief, feasible measures on multiple issues are the primary goal. Regardless of the type of measure

Discussion and recommendations

In summary, IS provides models and frameworks that focus attention on key translation issues, and methods and designs that help make eHealth research more rapid and relevant. It also has implications for the development and use of practical measures and outcomes that can iteratively inform eHealth development, help with transparent reporting, and direct attention to key issues related to integrating research into policy and practice. Some examples of eHealth studies that have successfully used IS

Authors’ contributions

All authors (a) wrote sections of the manuscript; (b) contributed key ideas; and (c) reviewed, edited and approved the final submission.

Conflict of interest

The authors declare no conflict of interest.

Summary points

What was already known?

  • That eHealth interventions are effective under some conditions.

  • That implementation, long-term engagement, and retention of users in eHealth interventions are challenging.

What this paper adds:

  • Systematic implementation science models, methods and measures for eHealth application.

  • Examples of eHealth studies that demonstrate how research can be much more rapid and relevant to potential adopters.

Acknowledgements

The opinions expressed are those of the authors and do not necessarily represent those of the National Cancer Institute.

References (82)

  • D.E. Rivera et al.

    Using engineering control principles to inform the design of adaptive interventions: a conceptual introduction

    Drug Alcohol Depend.

    (2007)
  • D.R. Zucker et al.

    Individual (N-of-1) trials can be combined to give population comparative treatment effect estimates: methodologic considerations

    J. Clin. Epidemiol.

    (2010)
  • C.L. Backman et al.

    Single-subject research in rehabilitation: a review of studies using AB, withdrawal, multiple baseline, and alternating treatments designs

    Arch. Phys. Med. Rehabil.

    (1997)
  • L. Collins et al.

    The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions

    Am. J. Prev. Med.

    (2007)
  • G.J. Norman

    Answering the “what works?” question in health behavior change

    Am. J. Prev. Med.

    (2008)
  • T. Koppenaal et al.

    Pragmatic vs. explanatory: an adaptation of the PRECIS tool helps to judge the applicability of systematic reviews for daily practice

    J. Clin. Epidemiol.

    (2011)
  • J.M. Gierisch et al.

    Finding the minimal intervention needed for sustained mammography adherence

    Am. J. Prev. Med.

    (2010)
  • M.A. Hussey et al.

    Design and analysis of stepped wedge cluster randomized trials

    Contemp. Clin. Trials.

    (2007)
  • J.R. Shapiro et al.

    Text4Diet: a randomized controlled study using text messaging for weight loss behaviors

    Prev. Med.

    (2012)
  • K. Viswanath

    Cyberinfrastructure: an extraordinary opportunity to bridge health and communication inequalities

    Am. J. Prev. Med.

    (2011)
  • G.G. Bennett et al.

    The delivery of public health interventions via the Internet: actualizing their potential

    Annu. Rev. Public Health

    (2009)
  • S.M. Noar et al.

    Interactive Health Communication Technologies: Promising Strategies for Health Behavior Change

    (2013)
  • V. Strecher

    Internet methods for delivering behavioral and health-related interventions (eHealth)

    Annu. Rev. Clin. Psychol.

    (2007)
  • T. de Jongh et al.

    Mobile phone messaging for facilitating self-management of long-term illnesses

    Cochrane Database Syst. Rev.

    (2012)
  • B.A. Rabin et al.

    Dissemination of interactive health communication programs

  • K. Patrick et al.

    A text message-based intervention for weight loss: randomized controlled trial

    J. Med. Internet Res.

    (2009)
  • D. Stokols

    Social ecology and behavioral medicine: implications for training, practice, and policy

    Behav. Med.

    (2000)
  • L. Catwell et al.

    Evaluating eHealth interventions: the need for continuous systemic evaluation

    PLoS Med.

    (2009)
  • T. Lorenc et al.

    What types of interventions generate inequalities? Evidence from systematic reviews

    J. Epidemiol. Community Health

    (2013)
  • K.C. Stange et al.

    Considering and reporting important contextual factors

  • B.A. Rabin et al.

    Advancing the application, quality and harmonization of implementation science measures

    Implement. Sci.

    (2012)
  • R.E. Glasgow et al.

    Pragmatic measures: what they are and why we need them

    Am. J. Prev. Med.

    (in...
  • B. Gaglio et al.

    Evaluation approaches for dissemination and implementation research

  • R.E. Glasgow

    What does it mean to be pragmatic? Opportunities and challenges for pragmatic approaches

    Health Educ. Behav.

    (2013)
  • W.T. Riley et al.

    Rapid, responsive, relevant (R3) research: a call for a rapid learning health research enterprise

    Clin. Transl. Sci.

    (2013)
  • G.J. Langley et al.

    The Improvement Guide: A Practical Approach to Enhancing Organizational Performance

    (1996)
  • L.C. Leviton et al.

    Evaluability assessment to improve public health policies, programs, and practices

    Annu. Rev. Public Health

    (2010)
  • K.E. Thorpe et al.

    A pragmatic-explanatory continuum indicator summary (PRECIS): a tool to help trial designers

    CMAJ

    (2009)
  • A.H. Krist et al.

    Designing a flexible, pragmatic primary care implementation trial: the My Own Health Report (MOHR) Project

    Implement. Sci.

    (2013)
  • R.E. Glasgow et al.

    Practical clinical trials for translating research to practice: design and measurement recommendations

    Med. Care

    (2005)
  • L.M. Klesges et al.

    Beginning with the application in mind: designing and planning health behavior change interventions to enhance dissemination

    Ann. Behav. Med.

    (2005)