Introduction
Theoretical approaches used in case studies
Implementation framework or theory | Nilsen [24] classification | Constructs influencing implementation | Case studies |
---|---|---|---|
Consolidated Framework for Implementation Research (CFIR) | Determinant framework: categorizes implementation barriers/enablers | Characteristics of intervention or practice (e.g., evidence, complexity, cost) Outer setting (e.g., patient needs, policies) Inner setting (e.g., organization/clinic characteristics, culture, implementation climate) Characteristics of individuals (e.g., clinician knowledge, self-efficacy) Implementation process (e.g., engaging, evaluating) | Ahmed et al. [30]: implementing ePROMs in a chronic pain network van Oers et al. [32]: implementing ePROMs in multiple pediatric and adult health clinics Manalili and Santana [33]: implementing ePREMs for quality improvement in primary care |
Theoretical Domains Framework (TDF) | Determinant framework: categorizes implementation barriers/enablers | Factors influencing clinician behavior change, e.g.: Knowledge, skills Professional role/identity Beliefs about capabilities Beliefs about consequences Reinforcement Intentions/goals Environmental context and resources Social influence Memory, attention, decision influences Behavioral regulation | Ahmed et al. [30]: implementing ePROMs in a chronic pain network |
Integrated framework for Promoting Action on Research Implementation in Health Services (i-PARIHS) | Determinant framework: categorizes implementation barriers/enablers | Successful implementation formula: SI = Fac^n(I + R + C) Fac = facilitation Person or organization assigned to do work of facilitation (implementation support) I = innovation Characteristics of innovation Degree of fit with existing practice and values Usability Relative advantage Trialability/observable results R = recipients Clinical experiences/perceptions Patient experiences, needs, preferences C = context Leadership support Culture, receptivity to change Evaluation capabilities | Roberts et al. [31]: implementing paper and electronic PROMs in a medical oncology outpatient department |
Knowledge to Action (KTA) | Process model: describes practical steps in translating research to practice | Knowledge creation phases: Knowledge inquiry Knowledge synthesis Create knowledge tools Action phases: Determine the know/do gap Adapt knowledge to local context Assess barriers/facilitators to use Select, tailor, implement Monitor knowledge use Evaluate outcomes Sustain knowledge use | Manalili and Santana [33]: implementing ePREMs for quality improvement in primary care |
Normalization Process Theory (NPT) | Implementation theory: specifies causal mechanisms | Coherence/sense-making (what is the work?) Cognitive participation (who does the work?) Collective action (how do people work together to get the work done?) Reflexive monitoring (how are the effects of the work understood?) | Manalili and Santana [33]: implementing ePREMs for quality improvement in primary care |
Barriers and enablers in case studies
Country | Clinical setting | Implemented PROMs or PREMs | IS framework or theory | Implementation barriers identified | Implementation enablers identified | Implementation strategies employed |
---|---|---|---|---|---|---|
Eastern Canada [30] | Chronic pain network including primary care, rehabilitation care, and hospital-based care | ePROMs | Consolidated Framework for Implementation Research; Theoretical Domains Framework | Barriers Primary care: • Well-defined clinical process: barriers at clinician level • Lack of knowledge on how to interpret pain PROMs Tertiary care: • Variability in care process: multilevel barriers • Confidentiality concerns • Low comfort with technology • Perceived increase in workload and time to review PROMs • Perception PROMs may decrease patients’ satisfaction with care • PROMs not integrated in electronic health record • Cost and time to implement | Enablers • Existing PROM system easy for clinicians to use and accessible on all forms of devices • Rapid access to PROM results • Selected PROMs that are easy to complete and interpret • Top-down decision from clinic leadership to implement • Created business plan with health system and moved money to clinic budgets • Opinion leader support | Strategies Pre-implementation: • Identify barriers with clinic • Map observed barriers to evidence-based strategies Implementation: • Training workshop with clinic team (half day) • Local opinion leader with PROM knowledge provided coaching • Educational materials • Onsite tech support • Workflow redesign support • Support to help patients complete PROMs Post-implementation: • Examine potential cost savings by triaging patients more efficiently |
Australia [31] | Medical oncology outpatient department | Paper and electronic PROMs | Integrated Framework for Promoting Action on Research Implementation in Health Services | Barriers • Gaps in infrastructure • Varying workflows • Clinics needed more time than anticipated to implement • Staff felt pressured with competing priorities • Past negative experiences with innovations | Enablers • Dedicated facilitator (implementation support role) • Rapid access to PROM results • Research funding • Peer champions for PROMs emerged naturally | Strategies Pre-implementation: • Stakeholder engagement about barriers and context assessments • Workflow assessment and redesign assistance Implementation: • Training/information resources • Technical support • Rapid cycle testing Post-implementation: • Audit and feedback to clinics |
Netherlands [32] | Multiple pediatric and adult health conditions | ePROMs | Consolidated Framework for Implementation Research | Barriers • Some clinics undergoing too many change initiatives • PROMs not integrated in EHR • Stakeholders did not see relative advantage of PROMs • Compatibility concerns • No organizational incentives | Enablers • Clinicians perceived value • Strong evidence PROMs improve clinical outcomes • Existing online portal is user friendly for patients and clinicians • Existing automated PROM reminders • Existing automatic and direct access to PROM results and visualization for clinicians • Existing ability for multidisciplinary clinic team members to customize PROMs based on patient age, health conditions, etc. • Existing clinician self-efficacy | Strategies Pre-implementation: • Stakeholder engagement • PROM integration in EHR • Provided PROM recommendations based on patients’ age and condition Implementation: • Training • Implementation support team available to all clinics Post-implementation: • Annual evaluation meeting with clinics • Reflecting and evaluating on what worked and did not work |
Western Canada [33] | Primary care | ePREMs | Consolidated Framework for Implementation Research; Knowledge to Action; Normalization Process Theory | Barriers • Unclear stakeholder preferences and barriers • Unclear what optimal implementation strategies will be for PREMs and whether they differ from PROM strategies | Enablers • Research grant support • Collaboration with quality improvement specialists • National policy change: Patient’s Medical Home model encourages patient-centered communication and patient surveys to evaluate effectiveness of practice’s services | Strategies Pre-implementation: • Stakeholder engagement to identify barriers (interviews with clinic teams) • Categorize barriers with theory and map to evidence-based implementation strategies Implementation: • Training clinic teams • Stakeholder engagement • Onsite coaching • Plan-Do-Study-Act rapid testing cycles Post-implementation: • Audit and feedback to clinics • Process evaluation |
Implementation strategies in case studies
Evaluating PROM/PREM implementation initiatives
IS evaluation framework | Construct to evaluate | Construct definition | Similar construct | Case studies |
---|---|---|---|---|
Proctor’s outcomes [29] | Acceptability | Extent to which implementation stakeholders perceive innovation to be agreeable or palatable | Satisfaction | Ahmed et al. [30]: implementing PROMs in a chronic pain network Roberts et al. [31]: implementing PROMs in routine cancer care van Oers et al. [32]: implementing PROMs for pediatric and adult clinics treating chronic conditions |
| Appropriateness | Perceived fit, relevance, or compatibility of innovation for given practice setting | Compatibility, usefulness | |
| Adoption | Intention, initial decision, or action to employ innovation by service settings (proportion and representativeness) | Uptake | |
| Feasibility | Extent to which innovation can be successfully used or carried out within given setting | Practicability | |
| Reach/penetration | Extent to which target population is reached | Service penetration | |
| Fidelity | Degree to which innovation or implementation strategy delivered as intended | Adherence | |
| Costs | Financial impact of innovation, including costs, personnel, and clinic and patient time necessary for treatment delivery, or cost of implementation strategy | Cost–benefit, cost-effectiveness | |
| Sustainability | Extent to which innovation is maintained as intended and/or institutionalized within service setting’s ongoing operations | Maintenance, institutionalization | |
Reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) | Reach | Extent to which target population is reached | Penetration | Manalili and Santana [33]: implementing PREMs for quality improvement in primary care |
| Effectiveness | Impact of innovation on important outcomes, including potential negative effects, quality of life, and economic outcomes | | |
| Adoption | Absolute number, proportion, and representativeness of settings and intervention agents (people who deliver the program) who are willing to initiate a program | Uptake | |
| Implementation | • At setting level: intervention agents’ fidelity to various elements of innovation’s protocol, including consistency of delivery as intended and time and cost of intervention • At individual level: use of intervention strategies | | |
| Maintenance | • At setting level: extent to which an innovation becomes institutionalized/part of routine practices and policies • At individual level: long-term effects of innovation on outcomes 6+ months after most recent contact | Sustainability, institutionalization | |
Implementation science construct | Evaluating perception of the innovation (PROMs) | Evaluating the implementation strategies |
---|---|---|
Acceptability | Patients and clinicians • % willing to recommend PROMs to other patients • % reporting PROMs helpful in discussing symptoms/symptom management • % reporting ease of use and comprehensibility for PROMs and technology systems | • Stakeholder perceptions of acceptability of implementation strategies (e.g., PROM training session is appropriate length) • Barriers and enablers for implementing PROMs • Related contextual factor: organizational readiness for change |
Appropriateness | • PROM fit with patient population (e.g., literacy level, technology comfort, language(s), font size, culturally appropriate, meaningful for clinical condition) • PROM fit for clinic team (e.g., PROM easy to interpret, meaningful for clinical care, integrated in electronic health record system, linked to clinical decision support) • PROM fit with clinic culture and values • Perceived relative advantage of PROMs vs. usual care • Leadership support for PROMs | • Stakeholder perceptions of clinic needs and resources for implementing PROMs • Fit of potential implementation strategies for specific clinics, their needs and resources, clinic team members, and patient population • Leadership support for implementation strategies (e.g., providing space and time for clinic team to receive training) |
Feasibility | • Extent to which technology or electronic health record can be developed or modified to administer PROMs and visualize results in a meaningful way for clinicians • If collecting PROMs from home, feasibility testing considers underserved patient groups’ needs, internet access, and usage habits (or offers alternative data collection methods such as interactive voice response) • Consent rate > 70% (if applicable) • How many and which items are missed or skipped (and identifiable patterns) • Length of time for patients to complete the PROM, comprehensibility • Rates of technical issues • Dropout rate for patients • PROM characteristics (e.g., literacy demand, number of items, preliminary psychometric properties if used in new population, validity and reliability evidence for population) | • “Action, actor, context, target, time (AACTT)” framework [62]: describe who needs to do what differently, and select fit-for-purpose strategies • % clinics completing at least one implementation activity or phase (and/or all activities and implementation phases) • Rates of technical issues for clinics • Stakeholder perceptions of which implementation strategies are possible • Stakeholder perceptions of what to include in PROM training session • Pilot study or rapid cycle testing to determine if implementation strategy is possible (e.g., whether specific workflow change is possible in a clinic) • Which implementation activities were completed vs. skipped |
Adoption | • % of clinics advancing to administering PROMs routinely • Representativeness of clinics willing to initiate PROMs • Underserved patient groups (e.g., older patients) complete PROMs at similar rates to clinic average | • Dropout rate for clinics • Representativeness of clinics completing implementation activities • Stakeholder perceptions and observations on which implementation support strategies were/were not effective in a clinic, and why • How and why clinics operationalized implementation strategies • Minor changes made to implementation strategies to fit local conditions or context (if major changes, see fidelity below) • StaRI reporting guidelines for implementation strategies [61] |
Reach/penetration | • % of patient panel completing ≥ 1 PROM during defined time interval (denominator chosen appropriately: all patients with an in-person visit during time interval, etc.) • % of missing data during defined time interval (with appropriate denominator) • Informed missingness (correlated with patient demographics) • Average # PROMs completed per patient during interval | • % of clinic team participating in implementation strategies • % of clinic team attending training • % of clinic team reporting training helped them understand new role and how to implement in their workflow • Clinicians: % reporting self-efficacy for using PROMs after training |
Fidelity | • Consistency of PROMs completed by patients (e.g., 80% PROM completion rate for clinic) • % of clinicians who review PROMs with patients during visits • How and why clinics adapted the innovation (e.g., changed PROM timeframe for items) • FRAME framework for reporting adaptations to interventions [49] | • FIDELITY framework [50]: report on five implementation fidelity domains (study design, training, delivery, receipt, and enactment) • How and why clinics or support personnel adapted implementation strategies (e.g., changed the PROM training format or content) • % of clinics completing all implementation activities |
Cost | • Financial, personnel, and time costs to administer and review PROMs on routine basis • Technology costs | • Financial, personnel, technology, and time costs to implement PROMs • Cost of Implementing New Strategies (COINS) [64] |
Sustainability | • Extent to which PROMs become normalized and routinized in a clinic’s workflow • Stakeholder perceptions • Periodically assess whether updates to PROMs are needed | • Routine data-informed feedback to clinic on PROM completion rates, missing data, and informed missingness • Provide additional implementation support to identify and overcome new or ongoing barriers (if needed) • Retraining or “booster” training or train new staff (if needed) |
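Several of the indicators in the table above (reach, fidelity, missingness) reduce to simple proportions over visit-level records. A minimal sketch of how such metrics could be computed, assuming a hypothetical list of visit records; the `Visit` record and its field names are illustrative, not from any cited system:

```python
from dataclasses import dataclass

@dataclass
class Visit:
    """One patient visit during the evaluation interval (illustrative)."""
    patient_id: str
    prom_completed: bool   # patient submitted a PROM for this visit
    prom_reviewed: bool    # clinician reviewed the PROM during the visit

def reach(visits):
    """% of distinct patients completing >= 1 PROM in the interval."""
    patients = {v.patient_id for v in visits}
    completers = {v.patient_id for v in visits if v.prom_completed}
    return 100.0 * len(completers) / len(patients) if patients else 0.0

def fidelity(visits):
    """% of completed PROMs that clinicians reviewed with the patient."""
    completed = [v for v in visits if v.prom_completed]
    reviewed = [v for v in completed if v.prom_reviewed]
    return 100.0 * len(reviewed) / len(completed) if completed else 0.0

def missing_rate(visits):
    """% of visits in the interval with no PROM completed."""
    if not visits:
        return 0.0
    missed = [v for v in visits if not v.prom_completed]
    return 100.0 * len(missed) / len(visits)
```

Note the denominator choices: reach is per patient (so repeat visits do not inflate it), while fidelity and missingness are per completed PROM and per visit, respectively, mirroring the "denominator chosen appropriately" caveat in the reach/penetration row.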
Discussion
Relevance of IS approaches for PROMs/PREMs implementation
| 1. Coherence: Assess understanding of PROMs/PREMs in context What are PROMs/PREMs, and why should clinical teams use them? | 2. Cognitive participation: Engage stakeholders in communities of practice Who will do what for routine use of PROMs/PREMs in clinical care? | 3. Collective action: Identify barriers and facilitators What helps or hinders the use of PROMs/PREMs in clinical care? Whom do these factors affect? | 4. Reflexive monitoring: Evaluate understanding of routine PROMs/PREMs use What did we learn about using PROMs/PREMs in clinic? Will we keep doing it? |
---|---|---|---|---|
Overlap with relevant domains from widely used implementation science frameworks | KTA: Identify problem, Select and Review Knowledge i-PARIHS: Innovation (how it is perceived by various stakeholders), Recipient CFIR: Intervention characteristics (e.g., evidence strength and quality, relative advantage, adaptability, complexity), Characteristics of individuals (e.g., knowledge and beliefs about the intervention) TDF: Knowledge, beliefs and capabilities, social/professional role and identity, beliefs about consequences RE-AIM: Effectiveness (longer-term impacts, e.g., quality of life) Proctor’s outcomes: Appropriateness, Cost, Feasibility (stakeholder perceptions) | KTA: Adapt knowledge to local context (involve local stakeholders) i-PARIHS: Recipient (identify key stakeholders including patients), Facilitation (regular meetings with clinic) CFIR: Leadership engagement (under Inner setting), Process (e.g., engaging, opinion leaders, internal implementation leaders, champions, external change agents) TDF: Skills, memory, attention and decision, emotion, behavioral regulation, intentions, goals, optimism RE-AIM: Reach, Adoption (numbers of patients and champions willing to participate in implementation) Proctor’s outcomes: Acceptability, Adoption, Penetration | KTA: Assess barriers to knowledge use i-PARIHS: Innovation (how it is adapted to work in local contexts), Context (inner setting and outer setting) CFIR: Outer setting (e.g., patient needs and resources, external policies and incentives), Inner setting (e.g., networks and communication, culture, relative priority, organizational incentives, available resources, access to knowledge and information) TDF: Reinforcement, environmental context and resources, social influences RE-AIM: Maintenance (normalized 6 months after introduction) Proctor’s outcomes: Feasibility, Cost | KTA: Monitor knowledge use, Evaluate outcomes i-PARIHS: Facilitation, Organizational Readiness to Change assessment CFIR: Reflecting and evaluating (under Process) RE-AIM: Implementation (fidelity) Proctor’s outcomes: Sustainability |
Implementation strategies identified in case studies | • Stakeholder engagement • Provide evidence about clinical validity of PROMs/PREMs | • Training workshops • Workflow redesign • Implementation support team | • Context assessments • Technology support • Practice facilitator | • Annual evaluation meetings with clinics • Audit and feedback |