Aligning implementation and user-centered design strategies to enhance the impact of health services: results from a concept mapping study

Abstract

Background

Innovative approaches are needed to maximize fit between the characteristics of evidence-based practices (EBPs), implementation strategies that support EBP use, and contexts in which EBPs are implemented. Standard approaches to implementation offer few ways to address such issues of fit. We characterized the potential for collaboration with experts from a relevant complementary approach, user-centered design (UCD), to increase successful implementation.

Method

Using purposive and snowball sampling, we recruited 56 experts in implementation (n = 34) or UCD (n = 22). Participants had 5+ years of professional experience (M = 10.31 years), worked across many settings (e.g., healthcare, education, human services), and were mostly female (59%) and white (73%). Each participant completed a structured conceptualization task using web-based concept mapping software: they sorted strategies from established compilations for implementation (36 strategies) and UCD (30 strategies) into distinct clusters, then rated the importance and feasibility of each strategy.

Results

We used multidimensional scaling techniques to examine patterns in the sorting of strategies. Based on conceptual clarity and fit with established implementation frameworks, we selected a final set of 10 clusters (i.e., groups of strategies): five implementation-only clusters, two UCD-only clusters, and three trans-discipline clusters. The highest-priority activities (i.e., above-average importance and feasibility) were the three trans-discipline clusters plus the implementation clusters “facilitate change” and “monitor change.” Implementation and UCD experts sorted strategies into similar clusters, but each group gave higher importance and feasibility ratings to strategies/clusters from their own discipline.

Conclusions

In this concept mapping study, experts in implementation and UCD had perspectives that both converged (e.g., trans-discipline clusters, which were all rated as high-priority) and diverged (e.g., in importance/feasibility ratings). The results provide a shared understanding of the alignment between implementation science and UCD, which can increase the impact and sustainability of EBP implementation efforts. Implications for improved collaboration among implementation and UCD experts are discussed.

Background

Implementation science—an interdisciplinary field in the health sciences concerned with improving the use of research evidence in everyday practice settings—has long focused on promoting the use of evidence-based practices (EBPs) for the assessment of, intervention with, and management of medical and behavioral health conditions. Unfortunately, even when implementation occurs, EBPs typically show reduced impacts in community settings and are rarely sustained once implementation support ends [1, 2]. Numerous characteristics of EBPs—and of the strategies used to support their implementation—can undermine their effectiveness in typical health service settings by producing a mismatch with the real-world needs of providers, patients, and service organizations (see [3] for a review). Examples of design problems include low ease of use (e.g., interventions that lack the flexibility needed for community patient populations), high complexity (e.g., screening tools that are difficult for providers to administer and interpret correctly), and incompatibility with constraints of the delivery setting (e.g., time-intensive training and consultation models for implementation). To maximize the public health benefits of applying research evidence, implementation efforts will require complementary approaches that can enhance EBP fit with the contexts in which they are implemented [4,5,6]. In the present study, we sought to characterize the potential of one such approach, user-centered design, to provide a set of strategies that align with implementation strategies and better support EBPs’ use in community settings.

Ongoing challenges in promoting implementation success

Over the past several decades, experts in implementation research and practice have identified a number of promising strategies for the implementation of EBPs. The most comprehensive review of these strategies is the Expert Recommendations for Implementing Change (ERIC) study, in which a panel of 35 implementation experts defined 73 discrete implementation strategies through a Delphi consensus-building process that expanded on the results of an earlier systematic review [7], and then sorted those strategies into nine conceptually distinct categories while also rating their importance and feasibility [8]. The ERIC study provided a much-needed common language and set of best-practice strategies for implementation research and practice. However, a close examination of the strategies reveals important gaps in the approach currently taken by the field. For example, Dopp and colleagues [9] examined the ERIC compilation using the multilevel domains specified in the Consolidated Framework for Implementation Research (CFIR; [10]) and found that most of the 73 strategies focus on changes in the individuals and systems (inner/outer setting) that will adopt a health services innovation, whereas only three seemed to address the possibility of tailoring the innovation to local contexts (i.e., “develop and implement tools for quality monitoring,” “develop educational materials,” and “promote adaptability”). Given that EBP usability is a key upstream determinant of implementation outcomes such as acceptability, appropriateness, and feasibility [11], as well as findings that context-specific modifications to EBPs are common and influential during implementation efforts [2, 12,13,14], current approaches to the promotion of implementation success are likely to be incomplete.

Recently, researchers have observed that both EBPs and implementation strategies have fundamental design problems that limit their effectiveness in diverse health service settings [3]. Health care providers and other stakeholders (e.g., patients, administrators) often encounter significant usability challenges with EBPs, both in terms of the tasks involved (e.g., clinical techniques, goal setting, practice-specific supervision) and the packaging that structures the tasks (e.g., manuals, worksheets, length and modality of sessions). Although some of these challenges could be addressed through improved attention to design during initial development of EBPs, scholars have increasingly argued that EBPs are frequently “over-designed” in research settings—leading to inclusion of features that are not necessary or useful to end users—and instead recommended that health care practices be optimized within their ultimate implementation setting [11, 15]. Recognizing that the ERIC [7] compilation, while groundbreaking, speaks only sparingly to aspects of EBP design that may improve uptake, we suggest that there is a need for additional strategies that attend directly to those issues of design. To that end, it may be useful to seek innovative strategies from outside the health service fields and deepen our understanding of how multidisciplinary experts might collaborate to apply those strategies.

Potential of user-centered design

The field of user-centered design (UCD) holds considerable potential for increasing the impact and sustainment of EBPs (see [3, 11, 16, 17]). Drawing from research in human–computer interaction, user experience design, service design, and cognitive psychology, UCD and the closely related field of human-centered design offer a set of principles and strategies that guide the design of an innovation from the perspectives of (and with input from) the people who will ultimately use that innovation [18,19,20,21]. Dopp and colleagues recently published a glossary of 30 UCD strategies for implementation researchers [22]; illustrative examples include identification of users and user needs, cycles of rapid prototyping and iterative development, co-creation and usability testing sessions with users, and interpretation sessions with stakeholders. In contrast to the ERIC implementation strategies, a far greater proportion of UCD strategies target the innovation (33%) or the individuals (40%) involved in the implementation effort, although UCD can also be used to modify the inter- or intra-organizational context to better suit an EBP [22]. The ultimate aim of UCD is to make innovations and systems “useable and useful” for specified users, activities, and goals [23]. UCD can be applied to the development and improvement of digital and analog technologies (e.g., [24]), service systems (e.g., [25]), and training processes (e.g., [26]). To date, UCD has most frequently been used to design new health services and technologies (e.g., [17, 27, 28]), whereas applications to the delivery and sustainment of already-designed EBPs (including the design of implementation strategies) remain rare. Health service fields like implementation science have yet to apply UCD extensively, although there are a growing number of examples in both intervention design studies (e.g., [29, 30]) and conceptual models (e.g., [15, 31]). Findings to date suggest that UCD has high relevance to most (if not all) EBPs, implementation strategies, and practice contexts within health care (see [31] in particular).

Despite its potential, it remains unclear how UCD fits within the evolving landscape of implementation research and practice. Implementation is already a highly interdisciplinary field, and new collaborations between implementation experts and UCD experts will be essential to capitalize on the promise of UCD for health services. Experts from these two fields have only recently begun joining together to examine the role of design in implementation, and their efforts have so far taken the form primarily of conceptual frameworks (e.g., [15, 31]). As a step toward better understanding the alignment of implementation and UCD strategies, we used concept mapping [32] to characterize how experts from each discipline conceptualize the relations among the strategies described in these frameworks. Our study offers a novel empirical understanding of the proposed conceptual relationship between these two disciplines.

Method

The method for this study was previously described in a published study protocol [9]. Herein we summarize the method and provide additional details about its actual execution; readers should refer to [9] for a complete description. Additional file 1 contains a checklist of reporting guidelines for mixed-method research (supplemented with specific items for concept mapping) that we completed for the study.

Recruitment and participants

To ensure our participants had appropriate expertise and constituted an internationally representative sample, recruitment used a combination of purposive and snowball sampling [33] in which we sent invitation emails to experts in implementation and/or UCD. Purposive sampling targeted experts from research centers and professional organizations recognized as centers of excellence for research in implementation and/or UCD; snowball sampling involved nominations from participants who completed the study. Interested participants contacted the study coordinator (second author) and were given login information for Concept Systems Global MAX (CSGM; [34]), the web-based software platform that we used to conduct concept mapping. Once they logged into CSGM, the participants read and electronically signed the informed consent form, completed a short demographic questionnaire, and then began the concept mapping exercise.

The 56 participants were implementation experts (n = 34; 61%) and UCD experts (n = 22; 39%). Expertise was self-reported based on 5 or more years of experience in research, practice/industry, and/or education. We did not ask participants to identify specific areas of expertise, but based on our recruitment methods and our interactions with participants during the study, we believe many had both research and applied experience in their discipline. Participants averaged 10.3 years of professional experience (SD = 6.7, range = 5–35). When asked how often their work involved interdisciplinary collaboration, half of the participants indicated 80–100% of the time (the highest response option), with progressively smaller proportions endorsing 61–80% (21%), 41–60% (16%), 21–40% (11%), and 0–20% (2%) of the time. Most participants (88%) reported focusing on health care in their work, but many also reported working in the prevention and health promotion (36%), education (18%), or human services (e.g., justice, child welfare, housing; 16%) sectors. When asked which CFIR domains they seek to improve through their work, most participants endorsed the individual (88%) and intervention/innovation (84%) levels, a smaller majority indicated the inner setting (70%), and the smallest proportion indicated the outer setting (34%). Finally, because the concept mapping software limited the number of demographic questions we could ask, we collected gender and race data in a follow-up Qualtrics survey that 51 participants completed (9% missing). The sample was 59% female (n = 33; another 18 [32%] were male) and 73% white (n = 41; another six [11%] were Asian and the remaining four [8%] were of other races).

We originally aimed to recruit 30 experts from each discipline [9], but more participants self-reported expertise in implementation than anticipated at enrollment (which filled slots originally intended for UCD experts), and several recruited UCD experts did not complete the study. Nevertheless, our sample size was still adequate for concept mapping as it exceeded the recommended sample size of n ≥ 15 per group [35].

Procedures

Concept mapping

We used concept mapping [32] to systematically capture the relationships that participants perceived between different concepts or ideas (i.e., implementation strategies and UCD strategies). This method guides participants through a structured conceptualization process in which they sort ideas into related groups and then rate the ideas on key dimensions. It is a self-contained mixed-method approach (i.e., incorporating both qualitative and quantitative data collection and analysis) consisting of four phases: (1) idea generation, (2) sorting, (3) rating, and (4) analysis.

  • Idea generation. As detailed in [9], our research team generated the ideas/concepts for participants to sort and rate by using existing resources that documented implementation and UCD strategies. For implementation, we selected a subset of 36 strategies from the full list of ERIC [7] strategies, with strategies chosen to maximize representativeness across (i) CFIR domains, (ii) categories of implementation strategies from a previous concept mapping study [8], and (iii) importance ratings (also collected by [8]). For UCD, we included all 30 strategies from our aforementioned compilation [22]. We uploaded each strategy (name and brief definition) into CSGM as a separate “statement” for subsequent sorting and rating by participants.

  • Sorting and rating. The middle two phases of concept mapping, sorting and rating, were completed in tandem through the CSGM platform. CSGM allowed participants to complete the tasks in any order and to stop and start the activities as often as they wished. Our instructions and rating dimensions were adapted from ERIC [8].

For the sorting step, participants sorted each of the 66 implementation and UCD strategies into groups based on their view of the strategies’ meaning or theme. The order of strategy presentation was randomized, with no distinction between implementation versus UCD strategies. For the rating step, participants rated each strategy on its importance and feasibility on a scale ranging from 1 (least important/feasible) to 5 (most important/feasible). Ratings for importance and feasibility were completed separately.
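
Although CSGM performs all data processing automatically, it may help to see how individual sorts are conventionally aggregated in concept mapping [32]: each participant’s piles define a binary co-occurrence matrix over the strategies, and these matrices are summed across participants to yield a similarity matrix. The following is a minimal Python sketch with toy data; all names are illustrative, not part of the study software.

```python
import numpy as np

def cosort_similarity(sorts, n_items):
    """Aggregate pile sorts into an item-by-item similarity matrix:
    entry (i, j) counts how many participants placed items i and j
    in the same pile (the diagonal equals the number of sorters)."""
    sim = np.zeros((n_items, n_items), dtype=int)
    for piles in sorts:          # one sort (a list of piles) per participant
        for pile in piles:       # each pile is a list of item indices
            for i in pile:
                for j in pile:
                    sim[i, j] += 1
    return sim

# Toy example: two participants sorting four strategies
sorts = [
    [[0, 1], [2, 3]],            # participant 1 made two piles
    [[0, 1, 2], [3]],            # participant 2
]
sim = cosort_similarity(sorts, n_items=4)
dissim = len(sorts) - sim        # dissimilarities, as input to multidimensional scaling
```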

Post-survey

After participants completed all steps in CSGM, the system displayed a link to the post-survey in Qualtrics, which collected additional demographic information; questions about challenges in collaboration between implementation experts and UCD experts (not analyzed for this initial study); and snowball sampling nominations. Upon completion, participants received a unique link for a $20 electronic gift card.

Analytic strategy

The final step of concept mapping, data analysis [32], involved using multidimensional scaling techniques (embedded in CSGM [34]) to identify clusters of implementation and UCD strategies that were generated most consistently across participants. We retained and analyzed data provided by all participants, including those who did not complete all study steps, although usable data were available from most participants (98% for sorting; 96% for rating).
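
CSGM implements the scaling and clustering internally; the sketch below shows the analogous steps with common open-source tools (scikit-learn, SciPy) and randomly generated stand-in data, so the details are illustrative rather than a reproduction of the study analysis. Nonmetric multidimensional scaling (MDS) places the 66 strategies on a two-dimensional map, and hierarchical cluster analysis of the map coordinates then generates candidate cluster solutions.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

# Stand-in for the real 66 x 66 co-sorting matrix (symmetric, zero-diagonal
# dissimilarities); in the study this came from participants' sorts
rng = np.random.default_rng(0)
sim = rng.integers(0, 56, size=(66, 66))
sim = (sim + sim.T) // 2
dissim = sim.max() - sim
np.fill_diagonal(dissim, 0)

# Nonmetric MDS projects the strategies onto a 2-D "point map";
# mds.stress_ is a badness-of-fit index (cf. the stress values in the Results)
mds = MDS(n_components=2, metric=False, dissimilarity='precomputed', random_state=0)
coords = mds.fit_transform(dissim)

# Hierarchical clustering (Ward's method) of the 2-D coordinates yields
# candidate solutions of decreasing size for the research team to review
tree = linkage(coords, method='ward')
candidate_solutions = {k: fcluster(tree, t=k, criterion='maxclust')
                       for k in range(13, 7, -1)}   # 13 down to 8 clusters
```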

CSGM can empirically generate any number of clusters, so the research team reviewed the results for conceptual clarity and credibility before selecting which set of clusters to report. To guide our thinking, we examined cluster maps produced by CSGM, which represent the relatedness of concepts within and between clusters in terms of visual distance. We also considered the extent to which clusters were consistent with or expanded upon the (1) clusters of implementation strategies identified in the ERIC study [8]; (2) CFIR domains [10]; and (3) the Integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework [36], which describes the process of facilitating EBP use in practice settings by attending to characteristics of the EBP, recipients, and context (i-PARIHS is a process framework, which complements the determinant-focused nature of CFIR [37]). We began with a 13-cluster solution, which is one SD above the mean number of clusters in a typical concept mapping solution [35], and examined splitting and merging of clusters in a stepwise fashion. Once we selected the final set of clusters, we calculated average importance and feasibility ratings for each cluster and strategy. We used unweighted averages because weighting by subsample size (to account for the different number of implementation vs. UCD experts in the sample) resulted in very small changes to the average values, with no changes to study conclusions. We also examined ladder graphs, which provide a visual representation of the relationship between dimensions (e.g., importance and feasibility) within and across clusters. In addition, we explored the number and types (i.e., by discipline) of strategies in each cluster.
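
To make the rating computations concrete, the short sketch below derives unweighted cluster-level means and the above-average “high-priority” designation used in the Results; the data are toy values, not the study dataset.

```python
import pandas as pd

# Toy ratings: one row per strategy, with its cluster assignment and
# mean importance/feasibility across raters (real data: 66 strategies)
ratings = pd.DataFrame({
    'cluster':     ['A', 'A', 'B', 'B', 'B', 'C'],
    'importance':  [4.2, 4.6, 3.1, 2.4, 3.8, 4.5],
    'feasibility': [3.3, 3.9, 2.0, 1.8, 2.9, 4.1],
})

# Unweighted means per cluster
cluster_means = ratings.groupby('cluster')[['importance', 'feasibility']].mean()

# "High-priority" clusters exceed the grand mean on both dimensions
grand = ratings[['importance', 'feasibility']].mean()
high_priority = cluster_means[(cluster_means['importance'] > grand['importance']) &
                              (cluster_means['feasibility'] > grand['feasibility'])]
```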

Following the initial analyses of concept mapping data from all participants, we also examined results separately by subgroup (i.e., implementation vs. UCD experts). We applied the same analytic approach described previously to the data from each discipline and evaluated whether the number, content, or ratings of the clusters differed. We also used multivariate general linear models to test for differences across disciplines in ratings of each cluster’s perceived importance and feasibility.
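
As a sketch of this subgroup analysis, a multivariate general linear model with discipline as the factor and the ten cluster ratings as outcomes can be fit with statsmodels’ MANOVA; the data below are randomly generated stand-ins, and every name is illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Stand-in data: 54 experts (approximating the usable rating sample),
# each with mean ratings for 10 clusters plus a discipline factor
rng = np.random.default_rng(0)
wide = pd.DataFrame(rng.uniform(1, 5, size=(54, 10)),
                    columns=[f'cluster_{i}' for i in range(1, 11)])
wide['discipline'] = np.repeat(['implementation', 'ucd'], [33, 21])

# One multivariate test across all 10 cluster ratings; in the study,
# significant omnibus tests were followed by per-cluster contrasts
formula = ' + '.join(f'cluster_{i}' for i in range(1, 11)) + ' ~ discipline'
print(MANOVA.from_formula(formula, data=wide).mv_test())
```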

Results

Cluster solution

The stress value for the multidimensional scaling analysis of our data was 0.188, well below the 0.365 cutoff recommended for ensuring adequate consistency among respondents [32], which indicated that we could proceed with identifying a cluster solution. After examining and discussing solutions that ranged from 13 down to 8 clusters over a period of several weeks, we identified a 10-cluster solution. The research team unanimously agreed that this solution offered the greatest conceptual clarity and contained concepts that aligned with the ERIC cluster solution [8] and relevant implementation frameworks [10, 36]. We also followed the process and guidelines outlined by the ERIC team [8] to achieve consensus on labels for the final clusters.
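
For context, the stress value indexes how well the two-dimensional map preserves the aggregated sorting (dis)similarities. The conventional index in nonmetric multidimensional scaling is Kruskal’s stress-1 (we assume CSGM reports this variant):

```latex
\mathrm{Stress} = \sqrt{\frac{\sum_{i<j} \bigl(d_{ij} - \hat{d}_{ij}\bigr)^{2}}{\sum_{i<j} d_{ij}^{2}}}
```

where d_ij is the distance between strategies i and j on the map and d̂_ij is the disparity (a monotone transformation of how rarely the two strategies were co-sorted); lower values indicate better fit.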

Figure 1 presents a cluster map that visually represents the relationships among the 66 strategies, with symbols on the map representing implementation strategies (circles) or UCD strategies (diamonds). Table 1 presents a complete list of strategies, organized by cluster, and summarizes the characteristics of the strategies and clusters. Five clusters consisted entirely of implementation strategies, two consisted entirely of UCD strategies, and the remaining three clusters contained strategies from both disciplines. Average importance ratings ranged from 2.4 to 4.5 for individual strategies and from 2.9 to 4.0 for clusters. Average feasibility ratings ranged from 1.5 to 4.5 and from 1.8 to 4.0 for strategies and clusters, respectively. Importance and feasibility ratings were positively correlated (r = 0.57). Figure 2 presents a ladder graph that visually represents the importance and feasibility ratings of each cluster. We considered clusters that fell above the mean on both sides of the ladder graph to be “high-priority” because they were rated as both highly important and highly feasible. All three of the trans-discipline clusters were high-priority, as were two clusters of implementation strategies.

Fig. 1

Cluster map of implementation and user-centered design (UCD) strategies. The map reflects the product of an expert panel (valid response n = 55) sorting 66 discrete strategies into groupings by similarity. Circles indicate implementation strategies and diamonds indicate UCD strategies. The number accompanying each strategy allows for cross-referencing to the list of strategies in Table 1. Light-colored clusters consist entirely of implementation strategies; dark-colored clusters consist entirely of UCD strategies; and multi-colored clusters contain strategies from both disciplines. Spatial distances reflect how frequently the strategies were sorted together as similar; these relationships are relative to the sorting data obtained in this study, and distances do not reflect an absolute relationship.

Table 1 Summary of strategies and clusters, including key characteristics
Fig. 2
figure 2

Ladder graph of the average importance and feasibility ratings for the cluster solution (see Fig. 1). The graph reflects the product of an expert panel (valid response n = 54) rating 66 discrete implementation and user-centered design (UCD) strategies on a scale from 1 to 5. The values on the y-axis reflect the mean rating obtained for each cluster (as reported in Table 1), with a color-coded line joining the importance and feasibility ratings for each cluster. The cluster names are listed to the right, with a line indicating the respective part of the graph for that cluster’s ratings († = implementation-only cluster, ^ = UCD-only cluster, * = trans-discipline cluster). The gray dotted line indicates the average importance (3.45) and feasibility (2.92) ratings across all strategies; clusters that fall fully above this line on the ladder graph were considered “high-priority.”
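
To illustrate the format, a ladder graph is essentially a two-axis parallel-coordinates plot, with one “rung” per cluster joining its importance and feasibility means. A minimal matplotlib sketch follows; the cluster names echo Table 1, but the values are invented for illustration.

```python
import matplotlib.pyplot as plt

# Toy cluster means: (importance, feasibility); values are illustrative
clusters = {'Facilitate change': (4.0, 3.4),
            'Co-design':         (3.8, 3.1),
            'Access resources':  (3.2, 2.6)}

fig, ax = plt.subplots()
for name, (imp, feas) in clusters.items():
    ax.plot([0, 1], [imp, feas], marker='o', label=name)  # one rung per cluster
ax.set_xticks([0, 1])
ax.set_xticklabels(['Importance', 'Feasibility'])
ax.set_ylabel('Mean rating (1-5)')
ax.legend()
plt.show()
```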

Comparison of results by discipline

We were able to examine cluster solutions separately by discipline, given that we found adequate stress values for the implementation expert data (0.200) and UCD expert data (0.251). However, the research team determined that the discipline-specific cluster solutions did not differ in meaningful ways from the primary 10-cluster solution, with one exception: UCD experts sorted UCD strategies somewhat differently from implementation experts, producing a 9-cluster solution that replaced one of the four UCD-dominant clusters (co-design) with a difficult-to-interpret cluster that contained several key approaches to the design process (e.g., iterative development, design in teams) as well as implementation facilitation. The other three UCD-dominant clusters in this alternate solution were all conceptually similar to those from the primary cluster solution—to the point that we retained the same names—but the makeup of strategies within those clusters differed by 43–67%. The alternate solution did not offer any major conceptual or practical advantages over the primary cluster solution (indeed, we were unable to agree on a name for the new cluster), so we focused on the primary cluster solution for our remaining analyses. For consideration, however, we present the cluster map of the four alternate UCD-dominant clusters in Additional file 2, and we indicate in Table 1 the alternate cluster assignment for each strategy from the UCD-dominant clusters.

Next, for the primary 10-cluster solution, we compared average cluster ratings between disciplines. Multivariate general linear models indicated significant differences between implementation and UCD experts’ ratings of importance (F(10, 43) = 5.12, p < 0.001, with significant differences for 5 individual clusters) and feasibility (F(10, 43) = 5.78, p < 0.001, with significant differences for 7 individual clusters). A post hoc repeated-measures analysis confirmed that these differences in cluster ratings were driven by participants’ tendency to rate strategies from their own discipline as more important and more feasible (F(2, 50) = 20.56, p < 0.001). Table 2 presents the importance and feasibility ratings of each cluster by implementation versus UCD experts. The table also reports, from the multivariate models, the statistical significance and magnitude (Cohen’s d effect sizes) of all between-discipline differences in ratings for each cluster. Whenever the difference between disciplines was significant, the higher ratings came from the same discipline as the majority of strategies in the cluster. The magnitude of differences fell within the small to medium range (0.2 < ds < 0.8). Despite these differences, the high-priority clusters received high importance and feasibility ratings (with small to negligible differences, ds < 0.5) across disciplines.
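
For transparency about the effect size metric: Cohen’s d here is the between-discipline mean difference in a cluster’s ratings divided by the pooled standard deviation. A minimal sketch with hypothetical ratings (not study data):

```python
import numpy as np

def cohens_d(x, y):
    """Standardized mean difference using the pooled standard deviation."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1))
                        / (nx + ny - 2))
    return (x.mean() - y.mean()) / pooled_sd

# e.g., one cluster's importance ratings by implementation vs. UCD experts
d = cohens_d([4.2, 3.8, 4.5, 4.0], [3.5, 3.2, 3.9, 3.0])  # standardized difference
```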

Table 2 Average cluster ratings compared between disciplines

We had originally planned [9] to examine differences between professions in the number and types of strategies in the identified clusters using χ2 analysis. Since we arrived at a common cluster solution for both disciplines, however, we deemed such an analysis unnecessary.

Discussion

Implementation researchers who wish to increase the public health impact of EBPs need to consider novel approaches, such as UCD, that can improve the fit between EBPs, the strategies used to implement them, and their implementation contexts. This concept mapping study explored 56 experts’ perspectives on the potential convergence and alignment between implementation strategies and UCD strategies. Based on their input, we identified 10 clusters of strategies (5 implementation-only, 2 UCD-only, and 3 trans-discipline) that deepen our understanding of how UCD strategies relate to traditional strategies for supporting implementation efforts. Given this observed clustering of strategies, we conclude that implementation science and UCD offer complementary approaches to improving health and well-being, with each discipline making unique contributions that could be strengthened by the other. This represents less interdisciplinary overlap than we had anticipated when planning the study, given the common objectives of the two fields (i.e., we referred to “integrating” implementation and UCD strategies in our protocol [9]), and demonstrates the value of using empirical methods to inform conceptualization and confirm (or disconfirm) impressions. Of course, as a preliminary study, this one also had limitations (e.g., the lower-than-anticipated recruitment of UCD experts) and left many unanswered questions, so we highlight needs for additional research throughout the discussion that follows.

The potential for collaboration between implementation and UCD experts is most evident in the three trans-discipline clusters of strategies, which experts from both disciplines rated above average on both importance and feasibility. This suggests that implementation and UCD experts may be most ready to align around the activities represented by these clusters in ways that produce mutual benefit. For example, UCD offers specific tools and methods that can help implementation experts achieve important aims such as identifying barriers and facilitators to change [38] (located in cluster 8, “understand systems and context”) and co-designing solutions with stakeholders (cluster 10) [39]. Through collaboration, implementation experts could incorporate more effective and feasible ways to achieve their aims, while UCD experts may benefit from increased opportunities to apply their expertise to pressing, large-scale health needs. The final trans-discipline cluster, “promote leadership and collaboration,” differs in that it is dominated by implementation strategies, but UCD contributes the strategy “build a user-centered organizational culture.” UCD experts may find that organization-focused strategies (such as “identify and prepare champions”) can help make a user-centered culture more feasible, while implementation experts might consider whether user-centeredness is an important dimension to address in existing leadership- and collaboration-oriented strategies (e.g., [40]). However, it is important to note that these results are at the cluster level; in future work, we plan to examine “go-zone” graphs [32] that plot individual strategies along key dimensions (e.g., importance vs. feasibility, implementation vs. UCD experts) to identify discrete strategies within and across clusters that are particularly promising for cross-disciplinary collaboration.

Most implementation and UCD strategies were located in distinct (rather than trans-discipline) clusters, which suggests an additional level of complementarity in that the two disciplines each contribute novel approaches to addressing common problems. In keeping with key implementation frameworks [10, 36], the expert panel identified clusters of implementation strategies that addressed intra- and inter-organizational contexts (“access resources,” “incentivize the innovation”), EBP providers (“support providers”), or the implementation process itself (“monitor change,” “facilitate change”). The latter two clusters were the remaining high-priority clusters, consistent with how the i-PARIHS framework [36] proposes facilitation as a key ingredient for successful implementation (CFIR [10] also includes a “process” domain but emphasizes it less). These observations provide validation of our cluster solution, even though the observed implementation-only clusters did not closely replicate ERIC clusters [8] (e.g., strategies from the ERIC cluster “utilize financial strategies” were split across “access resources” and “incentivize the innovation”). Rather than revealing some universal truth, a concept map represents how a group thinks about particular issues or ideas—so these differing conceptualizations are not problematic, as they were theoretically consistent and not directly contradictory. Of course, even implementation-specific clusters may still benefit from collaboration with UCD experts (e.g., by helping actualize effective strategies to “remind clinicians”), although the path forward may be less evident than in trans-disciplinary clusters.

The apparent context dependency of concept mapping solutions suggests a number of other future research directions. At the most basic level, it will be important to see how well other samples of implementation and UCD experts can replicate the observed cluster solution, especially across different subdomains in health care (e.g., medical vs. mental health, adults vs. pediatrics). More research is also needed to examine whether conceptualizations of these strategies differ among experts in the research versus practice of implementation and UCD, given that our recruitment strategy did not distinguish between these two types of expertise (between which there are notable gaps in both fields [41, 42]). Finally, the compilations from which we drew strategies for sorting and rating in this study [7, 22] are themselves context-dependent in that they primarily describe implementation and UCD activities within health care. A recent project adapted the ERIC implementation strategies for use in school settings [43] and repeated the importance and feasibility rating exercise for the adapted strategies [44]; ratings for one-third of the strategies shifted meaningfully from the original ERIC compilation to the school-adapted set. Future research should similarly consider how UCD strategies transfer across settings.

The two UCD-only clusters offer important extensions of the implementation strategies summarized above, as both provide approaches to address the often-overlooked innovation/intervention domain in implementation frameworks [10, 36]. These clusters were consistent with a separate framework for UCD [17], which proposes a cyclical process of identifying user needs (“consider user needs and experiences”) and then developing prototype solutions with which users interact (“develop and test solutions rapidly”). This rapid, iterative, and user-engaged approach to problem-solving is a key contribution that could help implementation experts achieve more rapid and pragmatic impact [45]—and again, UCD experts may also see their skills achieve broader-scale impact when paired with complementary implementation strategies. How best to conceptualize UCD strategies within clusters remains less clear, as evidenced by the alternate clusters generated from the UCD experts’ data, but this may reflect the more nascent state of efforts to describe UCD strategies. Four researchers with expertise in implementation and UCD developed our UCD strategy compilation through a rapid literature review [22], whereas ERIC was based on a systematic literature review followed by a Delphi consensus-building exercise with 35 implementation experts [7]. Like implementation science, UCD is a diverse, innovative field that remains highly variable in its language and approaches, yet to date it has focused less on consistently categorizing its own processes. Therefore, more research may be needed to achieve a compilation of UCD strategies that fully represents the field. For example, an interpretation session in which UCD experts consider and discuss the alternative UCD-only cluster solution—perhaps guided by follow-up questions from implementation experts—might offer insights into how UCD strategies could best be conceptualized and defined to maximize alignment with implementation science.

Conclusions

Implementation science and UCD offer complementary approaches with several key points of interdisciplinary alignment. It may be ideal for implementation and UCD experts to work side-by-side to execute strategies from trans-discipline clusters, but sequentially or in parallel for strategies from discipline-specific clusters. Yet such collaboration could encounter challenges for a variety of reasons. Experts tended to modestly favor their own discipline in their importance and feasibility ratings, suggesting that multidisciplinary teams could disagree about how to prioritize strategies when resources are limited. It will also be important to develop supports for multidisciplinary implementation-design teams, drawing on the growing science of team science [46]. In the future, our research team plans to investigate UCD-focused team science resources (e.g., mentored development programs) and tools (e.g., shared online workspaces) to complement the limited, but growing, offerings of implementation training initiatives [47] and our UCD strategy glossary for implementation experts [22]. Our efforts will be informed by continued analysis of additional data collected from participants in this study. For example, participants also provided rank-order and qualitative feedback about challenges and desired supports for cross-discipline collaboration (see the study protocol [9] for details).

In addition to support for collaboration, other advances will be needed to fully realize the potential impact of aligning implementation science and UCD. One important step will be to continue merging implementation-focused frameworks (e.g., [10, 36]) with frameworks that describe how to design for implementation (e.g., [15, 31]) to provide a more complete account of the levels and processes involved in successful implementation. Such guidance, along with recent efforts to advance causal models of the relations between specific implementation strategies and determinants [48, 49], could help decision-makers prioritize UCD strategies among the numerous other implementation strategies available (i.e., 73 in the ERIC project). It will also be necessary to test the impact of specific UCD strategies on implementation and clinical outcomes (e.g., the impacts of some strategies, such as personas, remain unclear), again bearing in mind the need to select strategies that address a given problem [49]. Evaluating cost-effectiveness will also be important, given the potentially high costs of incorporating UCD into implementation efforts [50]. Finally, it will be important to consider when research efforts involving UCD experts can be made “backwards compatible,” meaning they advance scientific understanding and large-scale impact within the UCD field, and/or “forwards compatible,” meaning design strategies are described in sufficient detail to inform relevant future implementation research. The promising level of alignment between implementation and UCD strategies observed in this study suggests that such efforts will be worthwhile for the advances they can bring in health, wellness, and EBP availability.

Availability of data and materials

The datasets generated and analyzed during this study are available from the corresponding author on reasonable request.

Abbreviations

CFIR: Consolidated Framework for Implementation Research
CSGM: Concept Systems Global MAX
EBP: Evidence-based practice
ERIC: Expert Recommendations for Implementing Change
i-PARIHS: Integrated Promoting Action on Research Implementation in Health Services framework
UCD: User-centered design

References

  1. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013. https://doi.org/10.1186/1748-5908-8-117.

  2. Shelton RC, Cooper BR, Wiltsey SS. The sustainability of evidence-based interventions. Annu Rev Public Health. 2018. https://doi.org/10.1146/annurev-publhealth-040617-014731.

  3. Lyon AR, Koerner K. User-centered design for psychosocial intervention development and implementation. Clin Psychol-Sci Pr. 2016. https://doi.org/10.1111/cpsp.12154.

  4. Aarons G, Green A, Palinkas L, Self-Brown S, Whitaker D, Lutzker J, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012. https://doi.org/10.1186/1748-5908-7-32.

  5. Chorpita BF, Daleiden EL. Structuring the collaboration of science and service in pursuit of a shared vision. J Clin Child Adolesc. 2014. https://doi.org/10.1080/15374416.2013.828297.

  6. Kazdin AE, Rabbitt SM. Novel models for delivering mental health services and reducing the burdens of mental illness. Clinical Psychological Science. 2013. https://doi.org/10.1177/2167702612463566.

  7. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015. https://doi.org/10.1186/s13012-015-0209-1.

  8. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015. https://doi.org/10.1186/s13012-015-0295-0.

  9. Dopp AR, Parisi KE, Munson SA, Lyon AR. Integrating implementation and user-centered design strategies to enhance the impact of health services: protocol from a concept mapping study. Health Res Policy Syst. 2019. https://doi.org/10.1186/s12961-018-0403-0.

  10. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009. https://doi.org/10.1186/1748-5908-4-50.

  11. Lyon AR, Bruns EJ. User-centered redesign of evidence-based psychosocial interventions to enhance implementation: hospitable soil or better seeds? JAMA Psychiatry. 2019. https://doi.org/10.1001/jamapsychiatry.2018.3060.

  12. Lau A, Barnett M, Stadnick N, Saifan D, Regan J, Wiltsey Stirman S, et al. Therapist report of adaptations to delivery of evidence-based practices within a system-driven reform of publicly funded children’s mental health services. J Consult Clin Psychol. 2017. https://doi.org/10.1037/ccp0000215.

  13. Park AL, Tsai KH, Guan K, Chorpita BF. Unintended consequences of evidence-based treatment policy reform: is implementation the goal or the strategy for higher quality care? Adm Policy in Ment Health. 2018. https://doi.org/10.1007/s10488-018-0853-2.

  14. Stirman SW, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019. https://doi.org/10.1186/s13012-019-0898-y.

  15. Mohr DC, Lyon AR, Lattie EG, Reddy M, Schueller SM. Accelerating digital mental health research from early design and creation to successful implementation and sustainment. J Med Internet Res. 2017. https://doi.org/10.2196/jmir.7725.

  16. Searl MM, Borgi L, Chemali Z. It is time to talk about people: a human-centered healthcare system. Health Res Policy Syst. 2010. https://doi.org/10.1186/1478-4505-8-35.

  17. Witteman HO, Dansokho SC, Colquhoun H, Coulter A, Dugas M, Fagerlin A, et al. User-centered design and the development of patient decision aids: protocol for a systematic review. Syst Rev. 2015. https://doi.org/10.1186/2046-4053-4-11.

  18. Goodman E, Kuniavsky M, Moed A. Observing the user experience: a practitioner’s guide to user research. 2nd ed. Waltham: Morgan Kaufmann; 2012.

  19. Hanington B, Martin B. Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions. Beverly: Rockport Publishers; 2012.

  20. Holtzblatt K, Beyer H. Contextual design: Design for life. 2nd ed. Cambridge: Morgan Kaufmann; 2017.

  21. IDEO. The field guide to human-centered design. 2015. http://www.designkit.org/resources/1.

  22. Dopp AR, Parisi KE, Munson SA, Lyon AR. A glossary of user-centered design strategies for implementation experts. Transl Behav Med. 2018. https://doi.org/10.1093/tbm/iby119.

  23. International Organization for Standardization. Ergonomic requirements for office work with visual display terminals (VDTs) – Part 11: Guidance on usability, vol. 9241. Geneva: International Organization for Standardization; 1998.

  24. Norman D. The design of everyday things: Revised and expanded edition. New York: Basic Books; 2013.

  25. Zomerdijk LG, Voss CA. Service design for experience-centric services. J Serv Res-US. 2010. https://doi.org/10.1177/1094670509351960.

  26. Gagne RM, Wager WW, Golas KC, Keller JM, Russell JD. Principles of instructional design. 5th ed. Fort Worth: Harcourt Brace Jovanovich; 2004.

  27. Ratwani RM, Fairbanks RJ, Hettinger AZ, Benda NC. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. J Am Med Inform Assoc. 2015. https://doi.org/10.1093/jamia/ocv050.

  28. Timmerman JG, Tönis TM, Dekker-van Weering MGH, Stuiver MM, Wouters MWJM, van Harten WH, et al. Co-creation of an ICT-supported cancer rehabilitation application for resected lung cancer survivors: Design and evaluation. BMC Health Serv Res. 2016. https://doi.org/10.1186/s12913-016-1385-7.

  29. Marcu G, Bardram JE, Gabrieli S. A framework for overcoming challenges in designing persuasive monitoring and feedback systems for mental illness. Proc Int Conf Pervasive Comput Technol Healthcare. 2011:1–8.

  30. Lyon AR, Wasse JK, Ludwig K, Zachry M, Bruns EJ, Unützer J, et al. The Contextualized Technology Adaptation Process (CTAP): optimizing health information technology to improve mental health systems. Adm Policy in Ment Health. 2016. https://doi.org/10.1007/s10488-015-0637-x.

  31. Lyon AR, Munson SA, Renn BN, Atkins DA, Pullmann MD, Friedman E, Areán PA. Use of human-centered design to improve implementation of evidence-based psychotherapies in low-resource communities: Protocol for studies applying a framework to assess usability. JMIR Research Protocols. 2019. https://doi.org/10.2196/14990.

  32. Kane M, Trochim WMK. Concept mapping for planning and evaluation. Thousand Oaks: Sage; 2007.

  33. Teddlie C, Yu F. Mixed methods sampling: a typology with examples. J Mix Method Res. 2007. https://doi.org/10.1177/2345678906292430.

  34. Concept Systems Inc. Concept Systems Global Max©. 2017. http://www.conceptsystems.com/content/view/the-concept-system.html

  35. Rosas SR, Kane M. Quality and rigor of the concept mapping methodology: a pooled study analysis. Eval Program Plann. 2012. https://doi.org/10.1016/j.evalprogplan.2011.10.003.

  36. Harvey G, Kitson A. PARIHS revisited: From heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016. https://doi.org/10.1186/s13012-016-0398-2.

  37. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015. https://doi.org/10.1186/s13012-015-0242-0.

  38. Nilsen P, Bernhardsson S. Context matters in implementation science: a scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv Res. 2019. https://doi.org/10.1186/s12913-019-4015-3.

  39. Greenhalgh T, Jackson C, Shaw S, Janamian T. Achieving research impact through co-creation in community-based health services: literature review and case study. Milbank Q. 2016. https://doi.org/10.1111/1468-0009.12197.

  40. Aarons GA, Ehrhart MG, Farahnak LR, Hurlburt MS. Leadership and organizational change for implementation (LOCI): a randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implement Sci. 2015. https://doi.org/10.1186/s13012-014-0192-y.

  41. Colusso L, Bennett CL, Hsieh G, Munson SA. Translational resources: reducing the gap between academic research and HCI practice. In: Proceedings of the 2017 Conference on Designing Interactive Systems. ACM; 2017. p. 957–68.

  42. Westerlund A, Nilsen P, Sundberg L. Implementation of implementation science knowledge: the research-practice gap paradox. Worldviews Evid Based Nurs. 2019. https://doi.org/10.1111/wvn.12403.

  43. Cook CR, Lyon AR, Locke J, Waltz T, Powell BJ. Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prev Sci. 2019. https://doi.org/10.1007/s11121-019-01017-1.

  44. Lyon AR, Cook CR, Locke J, Davis C, Powell BJ, Waltz TJ. Importance and feasibility of an adapted set of strategies for implementing evidence-based mental health practices in schools. J School Psychol. 2019. https://doi.org/10.1016/j.jsp.2019.07.014.

  45. Glasgow RE, Chambers D. Developing robust, sustainable, implementation systems using rigorous, rapid and relevant science. Clin Transl Sci. 2012. https://doi.org/10.1111/j.1752-8062.2011.00383.x.

  46. International Network for the Science of Team Science. INSciTS: Building the knowledge base for effective team science. 2018. https://www.inscits.org/.

  47. Darnell D, Dorsey CN, Melvin A, Chi J, Lyon AR, Lewis CC. A content analysis of dissemination and implementation science resource initiatives: what types of resources do they offer to advance the field? Implement Sci. 2017. https://doi.org/10.1186/s13012-017-0673-x.

  48. Lewis CC, Klasnja P, Powell B, Tuzzio L, Jones S, Walsh-Bailey C, Weiner B. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018. https://doi.org/10.3389/fpubh.2018.00136.

  49. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, Mandell DS. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017. https://doi.org/10.1007/s11414-015-9475-6.

  50. Oliver K, Kothari A, Mays N. The dark side of coproduction: do the costs outweigh the benefits for health research? Health Res Policy Syst. 2019. https://doi.org/10.1186/s12961-019-0432-3.

Acknowledgements

The authors thank the busy professionals who generously dedicated their time to participating in this study.

Funding

This work was supported by the Marie Wilson Howells Fund, University of Arkansas, Department of Psychological Science (PI: Dopp, #1711.01), and NIMH center grant P50-MH115837 to the University of Washington (PI: Areán). The funding bodies did not play any role in designing the study; collection, analysis, or interpretation of data; or writing this manuscript.

Author information

Authors and Affiliations

Authors

Contributions

AD designed and oversaw all aspects of the study, conducted data analysis, and wrote the first draft of this manuscript. KP assisted AD in designing the study and served as the study coordinator. SM and AL provided consultation to AD and KP regarding study design, recruitment, and data analysis/interpretation. All authors reviewed and provided feedback on this manuscript.

Corresponding author

Correspondence to Alex R. Dopp.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed by the Institutional Review Board of the University of Arkansas, which determined that the study was exempt from review given that the procedures pose minimal risk to participants. The study protocol included procedures for obtaining informed consent from participants.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1:

Reporting guidelines checklist for this study.

Additional file 2:

Alternate cluster map for clusters dominated by user-centered design (UCD) strategies.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Dopp, A.R., Parisi, K.E., Munson, S.A. et al. Aligning implementation and user-centered design strategies to enhance the impact of health services: results from a concept mapping study. Implement Sci Commun 1, 17 (2020). https://doi.org/10.1186/s43058-020-00020-w
