
Agency responses to a system-driven implementation of multiple evidence-based practices in children’s mental health services

Abstract

Background

Large mental health systems are increasingly using fiscal policies to encourage the implementation of multiple evidence-based practices (EBPs). Although many implementation strategies have been identified, little is known about the types and impacts of strategies that organizations use within implementation as usual. This study examined organizational-level responses to a fiscally-driven, rapid, and large-scale EBP implementation in children’s mental health within the Los Angeles County Department of Mental Health.

Methods

Qualitative methods based on the principles of grounded theory were used to characterize the responses of 83 community-based agencies to the implementation effort, drawing on documentation from site visits conducted 2 years after the reform.

Results

Findings indicated that agencies perceived the rapid system-driven implementation to have both positive and negative organizational impacts. Identified challenges were primarily related to system implementation requirements rather than to characteristics of specific EBPs. Agencies employed a variety of implementation strategies in response to the system-driven implementation, and agency size was associated with the strategies used. Moderate- and large-sized agencies were more likely than small agencies to have employed systematic strategies at multiple levels (i.e., organization, therapist, client) to support implementation.

Conclusions

These findings are among the first to characterize organizational variability in response to system-driven implementation and suggest ways that implementation interventions might be tailored by organizational characteristics.


Background

System-driven implementation of multiple EBPs

Efforts to implement evidence-based practices (EBPs) in community mental health systems have increased considerably [1, 2], with several public mental health systems instituting fiscal policies to support EBP delivery [3]. In Philadelphia, the city mental health system uses a request-for-proposal process to fund EBP implementation, with fiscal incentives for certain EBPs [4]. In Los Angeles County, the Department of Mental Health (LACDMH) enacted a plan to utilize a state revenue stream from a voter-approved ballot initiative, the Mental Health Services Act (MHSA), to promote the use of EBPs through new contracts for Prevention and Early Intervention (PEI) services [5, 6]. The EBP implementation was linked with two other major shifts in service provision: (1) expansion to a new target population (i.e., clients early in the course of mental illness) and (2) a requirement to collect clinical outcome measures linked to individual EBPs.

The MHSA was passed by voters in 2004, and each county was responsible for developing a PEI plan for state approval. Following a lengthy stakeholder-engaged planning process, the State approved LACDMH’s PEI plan in August 2009, which included a systematic roll-out of 52 approved evidence-based, promising, and community-defined, evidence-informed practices [7]. However, approximately 5 months later, the state budget crisis resulted in an LACDMH budget shortfall estimated at $32–42 million, which could have forced a devastating cut in service delivery affecting all provider agencies in fiscal year 2010–11. Agencies contracted with LACDMH were expected to lose 50% to 100% of their funding allocation under County General Funds, which threatened the viability of their continued operation. In the context of this urgent financial crisis, and to prevent the imminent closure of community-based agencies and a substantial reduction in the number of clients served, LACDMH transformed and accelerated the PEI implementation to allow agencies to leverage MHSA funds.

In May 2010, LACDMH facilitated the rapid launch of an initial set of six evidence-based/informed practices (hereafter referred to as “practices”) to address a range of prevalent youth mental health problems: Cognitive Behavioral Intervention for Trauma in Schools [CBITS], Child-Parent Psychotherapy [CPP], Managing and Adapting Practice [MAP], Seeking Safety [SS], Trauma-Focused Cognitive Behavioral Therapy [TF-CBT], and Triple P Positive Parenting Program [Triple P] (see Table 1). These practices were selected by LACDMH because they covered a wide range of presenting problems and their developers had the capacity to rapidly train a large number of staff. Consistent with the MHSA county requirements, LACDMH provided implementation support (i.e., training, consultation, implementation guidelines, and technical assistance). A timeline of major events across phases of PEI implementation in LACDMH is presented in Fig. 1 to highlight the rapid and accelerated nature of the system-driven implementation. This paper describes agencies’ experiences at the outset of the PEI implementation and the strategies they employed to support EBP implementation.

Table 1 Characteristics of the six PEI practices

Fig. 1 Timeline of major Mental Health Services Act-related events in Los Angeles County, 2004–2013

System-driven implementation strategies

Understanding EBP implementation in routine care requires attention to both the interventions to be deployed and the implementation strategies used to facilitate their deployment. Powell and colleagues [8] used a modified Delphi method to identify and define 73 implementation strategies and applied concept mapping to organize the strategies into distinct categories [9]. Consistent with the EPIS implementation framework [10] distinguishing multiple phases of implementation (i.e., Exploration, Preparation, Implementation, Sustainment) and multiple levels of influence (i.e., outer and inner contexts), the strategies may be employed at different implementation phases and levels. System-driven implementation efforts are unique in that strategies are likely employed concurrently in both the outer and inner contexts. Regarding the outer context, LACDMH employed a number of system-level implementation strategies identified by these studies [8, 9], primarily in the categories of utilizing financial strategies, developing stakeholder relationships, using evaluative and iterative strategies, and providing interactive assistance. The specific strategies included: fund and contract for the clinical innovation (i.e., rapidly amending contracts to deliver approved PEI practices in the context of a fiscal crisis), place innovation on formularies (i.e., claiming linked to individual practices), centralize technical assistance (i.e., furnishing initial training/consultation for specific practices for over 3300 therapists during the first year), develop a formal implementation blueprint (i.e., ongoing, iterative development of PEI Guidelines based on providers’ initial experiences), develop and implement tools for quality monitoring/provide local technical assistance (e.g., conducting Technical Assistance Site Visits), promote network weaving (e.g., Learning Networks), and use advisory boards and workgroups (i.e., community partner and LACDMH staff workgroups proposed PEI guidelines).

Some of the strategies were part of the initial, state-approved PEI Implementation Plan and developed through the stakeholder-involved planning process (e.g., the formulary for 52 approved practices). Other strategies were employed during the acute transformation prompted by the fiscal crisis (e.g., selecting six practices which had the capacity to quickly roll out trainings to a large workforce). The remaining strategies were developed in response to challenges and needs that arose during the initial phase of implementation (e.g., development of the PEI Implementation Handbook [11]). Use of these multiple system-level strategies contributed to the outer context of EBP implementation across organizations and practices.

Organizational characteristics and implementation processes and outcomes

Although the outer context and system-level implementation strategies may be held constant across agencies, individual organizations may vary in their responses based on their own contexts and characteristics. Research has identified several inner-context organizational factors that influence implementation. Having a supportive and open organizational climate has been linked to lower staff turnover and greater sustainment of practices [12, 13]. Organizational processes, such as shared decision-making and communication both within and between organizations, play an important role in positive implementation outcomes [14, 15].

In contrast, structural characteristics of agencies have not been as well studied. An early review identified structural characteristics, including organizational size, functional differentiation, and specialization (i.e., number of organizational units and specialties), as associated with adopting innovations in healthcare settings [16]. However, organizational size may be a proxy for more nuanced factors, such as having more slack resources to devote to new initiatives, resulting in a more receptive context. It is unclear how agency size itself may relate to EBP implementation experiences, as it could either facilitate or inhibit implementation support. It is possible that large agencies have more resources and, thus, more capacity to mount effective implementation strategies. Yet, it is also possible that larger agencies have greater bureaucracy that constrains innovative responses to new initiatives. Smaller agencies, on the other hand, may have fewer resources but may be able to respond more nimbly to change because of less bureaucracy. Understanding the role of agency size may inform ways to match implementation strategies to specific organizational profiles. Use of implementation strategies may also differ by agency bureaucratic structure, which may be indicated by the extent to which organizational administration and operations are centralized versus dispersed across units. It is plausible that the process of EBP implementation is more complex in agencies where providers are spread across multiple sites with varying structures and cultures.

Organizational implementation strategies

Although there is growing research on leaders’ and therapists’ perceptions of EBPs [17, 18] and on organizational outcomes of EBP implementation efforts (e.g., reduced staff turnover [19], improved implementation climate [20]), there is less research on how organizational leaders perceive and respond to system-level implementation strategies. The limited existing research indicates that key implementation strategies to facilitate success include stakeholder outreach and stakeholder-engaged planning, attending to short- and long-term implementation costs, and focusing on the scalability of EBP training [18, 21].

Few studies have examined strategies agency leaders have independently put into place following a large-scale system reform [22]. It is not known which implementation strategies agencies employ in response to system-driven implementation, nor how system-level implementation strategies affect organizations. Previous research has shown that different stakeholder groups (i.e., system leaders, agency leaders, front-line staff, treatment developers) have unique priorities and goals related to EBP implementation [23, 24], and these differences can be perceived as barriers [18]. In one study [18], implementation barriers and facilitators were shared across stakeholder groups and concentrated around specific inner context, outer context, and intervention factors (e.g., financing, therapist-level characteristics). However, the relative emphasis on each of these levels varied across stakeholder groups (e.g., system leaders were more concerned with the outer context, agency leaders with the inner context and intervention factors), with stakeholder priorities shaped by the factors most proximal to them.

Present study

This qualitative study provides an in-depth examination of organizational-level responses to a rapid, fiscally-driven EBP implementation in children’s mental health within the LACDMH PEI Transformation. Qualitative methods were used to characterize the responses of 83 community-based agencies to the system-wide implementation of multiple practices based on archived materials from site visits conducted 2 years after the PEI Transformation began. Documents from these site visits offer a view into the inner context of organizations undergoing a large-scale operations change. We examined site visit documentation to characterize the perceptions of agency leaders and system leaders concerning the use of specific organization-level implementation strategies and perceptions of the impact of system-level implementation strategies during the initial implementation of PEI practices. Lastly, we examined whether organizational structural characteristics were associated with the types of implementation strategies agencies used in this context.

Methods

Sample

As of August 2012, 124 agencies were contracted to provide PEI services and 119 site visits were conducted. Agencies were included in the current analyses if they provided PEI services to children or transition-age youth (TAY) in outpatient settings. Documents for 36 agencies were excluded, 19 based on the age of the population served and 15 based on service setting (i.e., agencies providing residential care only). Documents for 83 agencies were included.

Two structural agency characteristics were extracted from utilization management reports. Agency size was indexed by the average number of child clients (M = 368, SD = 461, range 0 to 2347) and TAY clients (M = 104, SD = 146, range 0 to 1056) served per agency within the fiscal year of the site visit (2011-12 or 2012-13). Based on these figures, agencies were categorized as serving a small number of child or TAY clients (< 100 child clients, N = 23; < 25 TAY clients, N = 23), a moderate number (between 100 and 500 child clients, N = 42; between 25 and 120 TAY clients, N = 33), or a large number (> 500 child clients, N = 17; > 120 TAY clients, N = 26). Agency centralization was based on whether agencies had a single program site (55%) or more than one program site (45%).
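
The size bands above amount to a simple threshold rule. As a minimal illustrative sketch (not the authors’ code), the Python function below reproduces that categorization from hypothetical utilization-report fields; how exact boundary values (e.g., exactly 100 child clients) were assigned is not stated in the text, so the boundary handling here is an assumption.

def size_category(avg_child_clients: float, avg_tay_clients: float) -> dict:
    """Assign the size bands described above: <100 / 100-500 / >500 child
    clients and <25 / 25-120 / >120 TAY clients per fiscal year."""
    def band(n: float, low: float, high: float) -> str:
        if n < low:
            return "small"
        if n <= high:  # boundary handling is assumed, not reported
            return "moderate"
        return "large"

    return {
        "child_size": band(avg_child_clients, 100, 500),
        "tay_size": band(avg_tay_clients, 25, 120),
    }

# An agency at the sample means (368 child, 104 TAY clients) is moderate on both.
print(size_category(368, 104))  # {'child_size': 'moderate', 'tay_size': 'moderate'}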

Data sources and extraction

The LACDMH PEI Implementation Unit conducted site visits with each PEI-contracted agency from August 2012 to September 2013. The purpose of these visits was to assess implementation milestones (e.g., practitioner training/credentialing, outcome tracking) and provide technical assistance for compliance with system-level functions.

  1. Utilization Management Reports. Prior to the site visits, LACDMH provided each agency with a utilization management report detailing client demographics, PEI allocations and claims for fiscal years 2010-11, 2011-12 and/or 2012-13, and outcome measurement compliance by practice. Agency characteristics, including size and cost per client, were based on data from these reports.

  2. PEI Technical Assistance Site Visit Provider Pre-Site Visit Questionnaire (PSVQ). An agency leader or program manager completed the mandatory PEI Provider PSVQ prior to their site visit. The PSVQ included specific questions related to: 1) plans for practices, 2) outcome measurement, 3) PEI implementation successes, 4) PEI implementation challenges, 5) unique infrastructure created to facilitate PEI implementation, 6) fidelity monitoring, 7) plans for sustainment, 8) practice-specific challenges/questions, and 9) training/technical assistance needs.

  3. PEI Technical Assistance Site Visit Summary Report (TASV). Site visits included a three-hour meeting led by LACDMH staff and attended by agency program managers, supervisors, and therapists. Between two and 15 agency representatives attended their agency’s site visit (559 in total), and between two and 11 system leaders (71 in total) attended each meeting. One or two of the four consultants were present at each site visit meeting and consolidated their findings into the site visit reports. A third-party company produced summary reports within 3 weeks of each visit. Each TASV included five standard components: 1) site visit participant information, 2) characteristics of the agency and an overview of PEI implementation, 3) strengths and successes, 4) challenges and concerns, and 5) next steps and follow-up actions.

Data analysis

A methodology of “Coding Consensus, Co-occurrence, and Comparison” [25], rooted in principles of grounded theory [26], was used to code qualitative data from the TASV reports and the PSVQs. All documents were analyzed using ATLAS.ti (ATLAS.ti Scientific Software Development, version 7.5.6; http://atlasti.com/). The initial coding team (A.H., M.B., J.R., & N.S.) independently reviewed six TASV reports and developed inductive, descriptive codes to categorize the content of the reports. After discussion about the clarity, comprehensiveness, and distinctiveness of the codes, the team agreed upon 32 codes (e.g., claiming, client severity, outcome measures, adaptations) and corresponding definitions. This revised list of 32 codes was applied to approximately 20% of the reports, which resulted in the addition of six inductive codes (e.g., treatment length, documentation, non-PEI practice). After independent coding of additional site visit reports using the revised code list, the initial coding team evidenced high agreement and reached consensus, resulting in a final coding scheme of 38 codes (available upon request). Two coders were added to the team to complete coding of all documents and evidenced high agreement with the initial six coded TASVs. The full coding team individually applied the final coding scheme to the remaining reports. As the content of the PSVQs was very similar to that of the TASVs, the same coding system and methodology were applied.

Once all documents were coded, individual codes were clustered into what ATLAS terms “code families” that captured similar content, to identify emergent themes (e.g., PEI implementation requirements, organizational response, clinical delivery of practices). Themes were determined to reflect agency or system perspectives according to which party was referenced in relation to that theme in the TASV text. All comments in the PSVQs were attributed to agencies. The role of agency characteristics was examined by creating ATLAS “document families,” or clusters of documents based on key characteristics. These families allowed us to compare and contrast code content within and across document families. For this analysis, we examined two structural agency characteristics: agency size and agency centralization. Document families for these two characteristics were distinct (i.e., small agencies could have multiple sites, and moderate or large agencies could be single-site). Themes were considered to differ substantially by agency structural characteristics if a theme occurred considerably more frequently (determined by consensus) in one document family than in the comparison families.
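
To make the document-family comparison concrete, the sketch below (illustrative only, not the ATLAS.ti workflow used in the study) tallies, for a hypothetical export of code applications, the proportion of agencies in each size family whose documents received a given code. The data rows and field names are invented for illustration.

from collections import defaultdict

# Hypothetical export of code applications: (agency_id, size_family, code).
codings = [
    ("A01", "small", "adaptations"),
    ("A01", "small", "outcome measures"),
    ("A02", "large", "train-the-trainer"),
    ("A03", "moderate", "train-the-trainer"),
    ("A04", "moderate", "adaptations"),
]

agencies_in_family = defaultdict(set)   # denominator: agencies per family
agencies_with_code = defaultdict(set)   # (family, code) -> agencies coded

for agency, family, code in codings:
    agencies_in_family[family].add(agency)
    agencies_with_code[(family, code)].add(agency)

for (family, code), agencies in sorted(agencies_with_code.items()):
    n, total = len(agencies), len(agencies_in_family[family])
    print(f"{code}: {n}/{total} ({100 * n / total:.0f}%) of {family} agencies")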

Results

Themes that emerged from both data sources were grouped into the following three categories: 1) Perceptions of System Implementation Strategies, 2) Perceptions of PEI Practice Implementation, and 3) Types of Agency Implementation Strategies. See Table 2 for illustrative quotes for each theme. Quotes in the text are annotated according to initial category, which is denoted by number; theme, which is denoted by letter; and then subtheme, which is denoted by Roman numeral. For example, 1ai corresponds to category 1: Perceptions of System Implementation Strategies, theme a: Important role of Training/ technical assistance, and subtheme i: Limited availability of training.

Table 2 Themes and representative quotes

Perceptions of system implementation strategies

Both agency and system leaders commented on the important role of training/technical assistance in supporting continued practice use. Specifically, almost all agency leaders considered the initial limited availability of therapist training an obstacle (1ai). At the same time, most agencies stated that having their staff trained in multiple practices during the initial rollout was a source of pride (1aii). Regarding ongoing training, agency leaders and system leaders had different perspectives on where the burden of funding should rest. Agency leaders cited the need for continued funding from the county to maintain a trained workforce in the face of turnover (1aiii), whereas LACDMH representatives encouraged agency leaders to plan for sustainment of PEI practices, including assuming fiscal responsibility for continued training (1aiv).

Related to the lack of available trainings, agency leaders also reported that it was difficult for them to use their full PEI funding allocation (1bi). With only a few clinicians able to attend PEI practice trainings at a time, agency leaders did not initially have the capacity to bill for these services. In addition, they did not immediately have access to the client populations most likely to benefit from early intervention. As such, agencies needed to launch time-intensive outreach initiatives to cultivate appropriate referral sources, which required staff time that was not eligible for reimbursement (1bii).

Challenges with outcome monitoring (a required change to services concurrent with the PEI practice implementation) were another major theme. Agency leaders described the burden of non-billable time for scoring and entering outcome measures (1ci) and reported that practices with more intensive routine progress monitoring presented more challenges, especially when delivering in-home services. They also noted that outcome measures were not available in the languages spoken by their clients, and existing translations were often confusing to families, particularly those with limited literacy skills (1cii). Clinicians feared that outcome measures might not capture progress given language barriers and client reluctance to endorse sensitive items, such as trauma exposure (1ciii). Agency leaders further emphasized the difficulty of collecting post-treatment measures in the context of high levels of drop-out and client mobility. System leaders, however, focused on increasing completion of outcome measures and provided agencies with feedback on their rates of outcome reporting (1civ).

Regarding compliance with PEI practice claiming allowances, agency and system leaders both indicated concerns regarding treatment length. Clinicians expressed that they struggled to complete treatment within the recommended time frame because clients presented with multiple stressors, such as new trauma exposures (1di). Therapists also reported needing to slow the pace of treatment to ensure that clients understood treatment principles and could implement skills outside session. Because many agencies had previously served clients with more severe and chronic problems, some agency leaders discussed the challenges of adjusting their care to fit brief models designed for early-stage illness and prevention (1dii).

LACDMH representatives were concerned with compliance with PEI requirements. During site visits, they frequently emphasized fidelity, which was often framed in terms of compliance with PEI practice guidelines (i.e., length of treatment and cost per client) rather than clinical fidelity marked by adherence to the protocol (1diii). Time was devoted to discussion of errors in processing claims for reimbursement (e.g., claims submitted for a non-contracted PEI practice; 1div) and under- or over-utilization of PEI-allocated funds. LACDMH staff often made suggestions for remedying these issues (e.g., generating new referral streams, reviewing client eligibility).

Finally, agency leaders commented on the challenges of complying with PEI implementation requirements, as these guidelines were still being developed during implementation. Agencies reported that it was difficult to keep track of current requirements as they changed over time. One agency described the process as trying to “fly the PEI Transformation plane while building it.” Agency leaders expressed the need for centralized communication regarding requirements and readily available documentation to ensure that agencies could achieve compliance.

Perceptions of PEI practice implementation

Overall, agency leaders recognized that the PEI Transformation had multiple impacts on their staff and their clients. Numerous agencies reported that at least one EBP addressed some portion of their clients in terms of demographics or presenting problem (2ai). However, many also reported that practices had to be adapted (e.g., treatment pace slowed) to better fit the population; and yet some clients were still not covered by these practices (2aii). Agencies that had implemented a PEI practice prior to the transformation often had existing infrastructure and reported having since further developed their capacity to implement (2bi).

Related to staff impact, a concern voiced across almost all agencies was the struggle to balance clinician availability and resources with the need to serve clients. Agency leaders would have ideally offered multiple practices to maximize the range and number of clients treated, but this was not feasible due to difficulties accessing trainings and the demands of learning multiple practices. More than half of agency leaders stated that it would be optimal for clinicians to be trained in 2–3 practices so they could serve a large contingent of clients. Agency leaders also reported that therapists differed in their opinions about PEI practices. Multiple agencies reported that their clinicians were enthusiastic about using the practices, especially early-career therapists with less experience. However, several leaders reported that their clinicians were concerned about brief treatment in the face of frequent client stressors, low literacy, and chronically disadvantaged caregivers, which often derail continuity and necessitate longer treatment (see Note 1). A common narrative was that clinicians who were initially reluctant became more comfortable over time, especially as they witnessed positive client outcomes and improvements in their clinical skills (2ci). Relatedly, agency leaders reported that staff had to spend additional time in session translating and explaining concepts to non-English-speaking clients, which lengthened sessions and increased workload (2cii). They expressed a need for more resources to cover this extra time and reported difficulties maintaining a culturally and linguistically diverse workforce.

Types of agency implementation strategies

A major aim of the current study was to characterize the implementation strategies used by agency leaders to facilitate delivery of PEI practices. In the context of multiple EBP implementation, a theme emerged concerning practice selection. Although they predominantly focused on existing client needs in selecting practices, many agency leaders considered clinician preferences as well to increase the potential for sustainment (3ai). Moderately-sized and large agencies selected a greater number of PEI practices to implement than smaller agencies, likely due to their greater capacity. Once practices were selected, many agency leaders indicated that they changed their intake case assignment procedures to prioritize both clinical and fiscal considerations (3bi), assigning clients to a practice rather than basing case assignments on therapist availability.

In order to manage the transition to providing PEI practices, agency leaders reported making infrastructure changes, including changes to staffing. Multiple agencies created new positions, such as practice coordinators or leads (3ci), or reallocated existing staff time to new functions. For example, intake coordinators took over entering outcome data, and clinical supervisors were tasked with monitoring PEI implementation compliance (3cii). Beyond staffing considerations, some agencies instituted other types of infrastructure for implementation support. These included structured opportunities for line staff and management to communicate, such as focus groups, all-staff meetings, or PEI practice consultation groups (3di). A number of agencies also utilized technology to facilitate PEI implementation, particularly by using Electronic Health Records (EHR) or information systems to monitor outcomes and remind staff to administer measures, create feedback reports, flag billing concerns, and provide example progress notes (3dii). Multiple agencies also reported addressing PEI requirements by making changes to clinical supervision procedures, such as devoting part of supervision to monitoring compliance (e.g., reviewing outcomes and progress notes; 3ei) or creating practice-specific supervision groups (3eii) to provide additional support for skill development.
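
As one concrete illustration of the kind of rule agencies described building into their EHRs or information systems, the sketch below flags clients whose practice-specific outcome measure appears overdue. It is a minimal sketch only: the practices listed, the re-administration windows, and the field names are hypothetical and are not drawn from the actual PEI requirements.

from datetime import date, timedelta

# Hypothetical re-administration windows in days; actual PEI requirements differ.
REMEASURE_INTERVAL = {"TF-CBT": 90, "MAP": 60}

clients = [
    {"id": "C1", "practice": "TF-CBT", "last_measure": date(2013, 1, 5)},
    {"id": "C2", "practice": "MAP", "last_measure": date(2013, 3, 20)},
]

def overdue(clients: list, today: date) -> list:
    """Return IDs of clients whose outcome measure is past its window."""
    flagged = []
    for c in clients:
        window = timedelta(days=REMEASURE_INTERVAL[c["practice"]])
        if today - c["last_measure"] > window:
            flagged.append(c["id"])
    return flagged

print(overdue(clients, today=date(2013, 4, 30)))  # ['C1']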

Among agencies that discussed a high investment in the sustainment of practices, most reported adopting a train-the-trainer model (i.e., supervisors are trained to lead in-house trainings for agency employees to reduce reliance on external trainings) (3fi). Staff turnover was discussed as a perpetual problem in community mental health settings that threatened practice sustainment. Staff members trained in multiple practices were considered especially valuable, and the departure of even one clinician who could serve multiple client types represented a significant loss. Thus, some agencies mentioned innovative strategies to deter turnover, including seniority incentive systems, creating greater opportunities for internal advancement, and maximizing the fit of therapists’ professional goals with the agency mission during hiring (3fii).

Many agencies reported that it was necessary to significantly increase their outreach and engagement efforts to generate a base of clients who would benefit from PEI services. Some agencies discussed employing innovative practices to increase referrals, such as holding mental health fairs in partnership with non-mental health agencies, offering services in non-traditional formats (e.g., summer camps, afterschool groups), and partnering with local schools and organizations (3gi). Strategies to increase client engagement were also frequently discussed, including conducting home visits and providing bilingual services. A few agencies reported offering client incentives, such as food, childcare, or transportation vouchers, or incorporating a family resource center (3gii). Once clients entered services, some agency leaders discussed the need to adapt PEI practices based on age, developmental level, and culture. Some practices originally developed for adults had to be adapted to be developmentally appropriate for youth (e.g., using age-appropriate words, making situations more relevant to adolescents; 3hi). Agencies that served youth with intellectual disabilities discussed reducing emphasis on the cognitive components of practices.

Agency structural characteristics associated with agency implementation strategies

Use of specific implementation strategies differed by agency structural characteristics (see Table 3). Overall, moderately-sized, large, and multi-site agencies employed more implementation strategies targeting multiple levels (i.e., organization, therapist, client) in response to the system-driven implementation than small and single-site agencies. For example, more of these agencies reported creating new positions or reallocating time in existing positions than small and single-site agencies. A substantially larger percentage of moderately-sized and large agencies than of small agencies also referenced using technology to facilitate PEI implementation, implementing practice-specific supervision, and using train-the-trainer models. A large percentage of small agencies reported using adaptations to treatment delivery (e.g., modifying content) as an implementation strategy, whereas medium and large agencies tended to report a range of strategies. A substantial percentage of small and single-site agencies also indicated a particular focus on practice selection, choosing practices that best fit the agency in order to preserve resources and promote sustainment.

Table 3 Differences in agency implementation strategies by agency structural characteristics

Discussion

Using document review of qualitative data, we identified recurrent themes in the early implementation experiences of community mental health agencies undergoing a rapid, large-scale, system-driven transformation that provided reimbursement for the delivery of multiple practices and was initiated in the context of a budget crisis with the goal of avoiding agency closures and substantial reductions in services. These themes related to perceptions of system implementation strategies, perceptions of practice implementation, and types of agency implementation strategies. As in previous studies, experience with EBPs was associated with improved staff attitudes and implementation quality [27]. Many of the implementation strategies utilized by agency leaders in response to the PEI transformation have been associated with success in previous implementation efforts [7, 28], particularly the creation of more structured opportunities for management and front-line staff to communicate [14, 15], generating partnerships with other service systems [29], utilizing technology [30], and the train-the-trainer model [2, 8]. Overall, it appears that, when faced with a system-driven reform, agency leaders independently chose and utilized implementation strategies that are supported in the literature.

Findings also suggested that, in the absence of significant external assistance (e.g., practice developer involvement or university-community partnerships), agencies established strategies that appeared to fit within the context of their resources. Results revealed differences in organizational responses to the PEI roll-out as a function of agency structural characteristics. In general, moderate, large, and multi-site agencies were more likely to utilize a range of implementation strategies and to exhibit greater flexibility in redistributing and utilizing resources. Large organizations with a more decentralized, multi-site structure appeared better equipped to adopt multiple innovations to support implementation, perhaps because they had more slack resources with which to prepare for changes within this major reform [16]. Single-site agencies tend to have fewer staff to dedicate to purely administrative roles and, thus, may need to rely on existing infrastructure (e.g., supervision) to fulfill those needs. A large percentage of smaller agencies discussed efforts to increase client engagement and adapt treatments, which may indicate that these agencies did not initially have a strong intervention-population fit and needed to use their resources to enhance fit by modifying the practice and/or by developing a client base appropriate to the practice.

Notably, although specific PEI practices were often discussed in the site visits, they were predominantly discussed in the context of PEI implementation requirements (e.g., claiming, outcome measurement). References to specific PEI practices were typically brief and varied widely across agencies, with no consensus or clear emergence of themes. This was surprising because agencies were implementing multiple practices simultaneously, and practice-specific concerns have been noted in similar implementation efforts [31, 32]. It is possible that concerns regarding PEI requirements took precedence over issues related to the practices themselves, given the foremost pressure to demonstrate compliance with the fiscal mandate and the urgency of a PEI roll-out that occurred on an accelerated timeline and included three linked reforms (i.e., PEI practice delivery, expansion of the patient population, and clinical outcome collection). It may be that practice-specific concerns did not emerge until after the early implementation period, once the more general concerns had been addressed.

These findings were likely shaped by both characteristics of the implementation approach and conditions of the outer context during the initial implementation of PEI. First, this implementation was initiated by the system (LACDMH), in contrast to other efforts initiated by researchers, intervention purveyors, or service organizations. The system used a reform of fiscal reimbursement policies and practices to facilitate rapid implementation. This occurred in the context of a state budget crisis that put significant financial strain on service organizations and threatened their viability. In response, LACDMH amended and accelerated the PEI implementation plan in order to preserve the viability of agencies and the availability of services to clients. As such, many of the themes that emerged related to agencies rapidly restructuring to comply with new and evolving requirements for reimbursement. Additionally, the themes reflected the context of simultaneous and rapid implementation of multiple practices, which introduced the complexity of selecting practices to match the characteristics of the agency workforce and target client populations. Furthermore, the themes observed were likely influenced by the characteristics of the client population served in LACDMH, a predominantly ethnically and linguistically diverse community with relatively high levels of acuity. These client characteristics often required more intensive levels of care than intended within the prevention and early intervention focus of these services. Lastly, the EBP implementation occurred in the context of multiple, concurrent shifts in services, including a change in target population and the implementation of an outcome collection requirement.

System and agency leaders both highlighted barriers and facilitators regarding adequate training resources, ongoing funding, progress monitoring, and billing allowances, which have been identified as important themes in implementation as usual [18, 23, 33]. LACDMH staff viewed agency responses through the lens of ensuring systematic accountability to the broader goals of PEI (e.g., outcome-focused, sustainment of practices, reaching the intended population) similar to policymakers at the state and county levels in other systems [24], whereas agency leaders tended to view these factors through the lens of their local contexts and consumer needs [23, 34]. Agency leaders also emphasized the fit of individual practices with organizational, therapist, and client characteristics. In sum, system and agency leaders focused on the most immediate pressures facing each level of stakeholder. LACDMH system leaders’ focus on contractual compliance reflected their responsibility to ensure accountability in spending public funds. Agency leaders had to attend to the immediate viability of their programs and ensuring continuity of care for their vulnerable clients.

Given that the EBP implementation described in the present study occurred within a unique and rapidly-changing environment, it is difficult to know the extent to which the findings are attributable to the agencies themselves versus the external context. It is also a challenge to estimate the relative importance or impact of each implementation strategy on the outcomes. Because this was an observational study with no manipulation of variables, these limitations should be considered when interpreting the findings. As dissemination and implementation science continues to progress, it would be beneficial to conduct dismantling studies of implementation strategies as well as prospective studies of implementation to determine the unique benefits of different implementation strategies. It is imperative to customize implementation strategies to an organization’s needs rather than using a prescriptive, one-size-fits-all approach, and more work is needed to determine how to tailor individual strategies [35].

There are important limitations to note regarding this study. Utilization management reports furnished data regarding factors at the organizational level (e.g., unique client count). These measures were used as proxies for organizational characteristics and therefore do not directly measure features of agencies, such as their staffing resources. The TASVs were also prepared by a third-party organization and, as such, are not a direct representation of agency and LACDMH perspectives. Given the length of time that has passed since these meetings and the high turnover rate of agency staff since they occurred, we were not able to independently validate these documents. These documents were also intended to provide a record of the major points discussed during the visit and, therefore, may not provide the level of detail and consistency regarding themes that primary data collection would provide. Surveys and interviews directly querying therapists and program managers about organizational factors and their perceptions of PEI implementation are currently being administered and can provide this level of analysis. Lastly, given that this is a retrospective, cross-sectional study that occurred within a unique context, the conclusions may not be generalizable to other contexts undergoing implementation efforts, particularly because this implementation was marked by a time-sensitive budget process.

Conclusions

Despite these limitations, this study has a number of important implications and adds to the limited body of research regarding system-driven implementation of multiple EBPs. Overall, this study highlights the complexity involved in rapid, large-scale system reform and suggests that there are unique factors to consider at each level of the inner and outer context that may facilitate successful implementation. At the agency level, it is important to select and utilize implementation strategies that fit well with the organization, such as making changes to triage procedures or re-organizing staff positions. At the system level, it is important to facilitate collaboration between agencies and disseminate strategies that agencies find beneficial. It may also be important for leaders at both the agency and system levels to consider agency characteristics when selecting implementation strategies, as these may affect the feasibility and fit of strategies such as utilizing technology or hiring new staff. Whereas larger agencies may have the capacity to enact these implementation strategies independently, smaller agencies might require additional supports. Finally, simultaneous implementation of multiple practices requires shifts in care delivery, and the requirements associated with individual EBPs (e.g., outcome measures, claiming compliance) may pose challenges for agencies beyond the clinical components of delivering the interventions themselves. This is particularly salient because system leaders may rank implementation priorities differently than agency leaders. In preparing agencies to deliver multiple EBPs, it is important to choose practices that fit the agency’s structure and organization and the client population, but it may be equally or even more important to streamline implementation requirements and provide initial and ongoing support in meeting requirements across agencies.

Notes

  1. Session limits were put into place for each practice by LACDMH based on developer feedback.

Abbreviations

CBITS: Cognitive Behavioral Intervention for Trauma in Schools

CPP: Child-Parent Psychotherapy

EBP: Evidence-based practice

EHR: Electronic Health Record

EPIS: Exploration, Preparation, Implementation, Sustainment framework

LACDMH: Los Angeles County Department of Mental Health

MAP: Managing and Adapting Practice

MHSA: Mental Health Services Act

PEI: Prevention and Early Intervention

PSVQ: PEI Provider Pre-Site Visit Questionnaire

SS: Seeking Safety

TASV: PEI Technical Assistance Site Visit Summary Report

TAY: Transition-age youth

TF-CBT: Trauma-Focused Cognitive Behavioral Therapy

Triple P: Triple P Positive Parenting Program

References

  1. Novins DK, Green AE, Legha RK, Aarons GA. Dissemination and implementation of evidence-based practices for child and adolescent mental health: a systematic review. J Am Acad Child Adolesc Psychiatry. 2013;52(10):1009–25. doi:10.1016/j.jaac.2013.07.012.


  2. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments: a review of current efforts. Am Psychol. 2010;65(2):73–84. doi:10.1037/a0018121.


  3. Bruns EJ, Kerns SE, Pullmann MD, Hensley SW, Lutterman T, Hoagwood KE. Research, data, and evidence-based treatment use in state behavioral health systems, 2001–2012. Psychiatr Serv. 2015;67(5):496–503. doi:10.1176/appi.ps.201500014.


  4. Beidas RS, Aarons G, Barg F, Evans A, Hadley T, Hoagwood K, et al. Policy to implementation: evidence-based practice in community mental health—study protocol. Implement Sci. 2013;8(1):8–38. doi:10.1186/1748-5908-8-38.


  5. Brookman-Frazee L, Stadnick N, Roesch S, Regan J, Barnett M, Bando L, et al. Measuring sustainment of multiple practices fiscally mandated in children’s mental health services. Admin Pol Ment Health. 2016. doi:10.1007/s10488-016-0731-8.

  6. Lau AS, Brookman-Frazee L. The 4KEEPS study: identifying predictors of sustainment of multiple practices fiscally mandated in children’s mental health services. Implement Sci. 2016. doi:10.1186/s13012-016-0388-4.

  7. Los Angeles County Department of Mental Health, Mental Health Services Act prevention and early intervention plan for Los Angeles County. 2009 Aug [cited 2017 Jan 4]. Available from: http://file.lacounty.gov/SDSInter/dmh/159376_LACPEI_Plan_Final_8-17-2009.pdf.

  8. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015. doi:10.1186/s13012-015-0209-1.

  9. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the expert recommendations for implementing change (ERIC) study. Implement Sci. 2015. doi:10.1186/s13012-015-0295-0.

  10. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Admin Pol Ment Health. 2011;38(1):4–23. doi:10.1007/s10488-010-0327-7.


  11. Los Angeles County Department of Mental Health. Mental Health Services Act prevention and early intervention implementation handbook. 2016 Jul [cited 2017 Jan 4]. Available from: http://file.lacounty.gov/SDSInter/dmh/247145_PEIImplementationHandbook-PDFforWebsiterev.7-27-16.pdf

  12. Glisson C, Schoenwald SK, Kelleher K, Landsverk J, Hoagwood KE, Mayberg S, et al. Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Admin Pol Ment Health. 2008;35(1-2):124–33. doi:10.1007/s10488-007-0152-9.


  13. Aarons GA, Sawitzky AC. Organizational climate partially mediates the effect of culture on work attitudes and staff turnover in mental health services. Admin Pol Ment Health. 2006;33(3):289–301. doi:10.1007/s10488-006-0039-1.


  14. Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41(3-4):327–50. doi:10.1007/s10464-008-9165-0.


  15. Fixsen DL, Naoom SF, Blasé KA, Friedman RM. Implementation research: a synthesis of the literature. 2005. http://nirn.fpg.unc.edu/resources/implementation-research-synthesis-literature. Accessed 27 July 2016.


  16. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. doi:10.1111/j.0887-378X.2004.00325.x.


  17. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the evidence-based practice attitude scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74.


  18. Beidas RS, Stewart RE, Adams DR, Fernandez T, Lustbader S, Powell BJ, et al. A multi-level examination of stakeholder perspectives of implementation of evidence-based practices in a large urban publicly-funded mental health system. Admin Pol Ment Health. 2015; doi:10.1007/s10488-015-0705-2.

  19. Aarons GA, Sommerfeld DH, Hecht DB, Silovsky JF, Chaffin MJ. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: evidence for a protective effect. J Consult Clin Psychol. 2009;77(2):270–80. doi:10.1037/a0013223.


  20. Glisson C, Hemmelgarn A, Green P, Dukes D, Atkinson S, Williams NJ. Randomized trial of the availability, responsiveness, and continuity (ARC) organizational intervention with community-based mental health programs and clinicians serving youth. J Am Acad Child Adolesc Psychiatry. 2012;51(8):780–7. doi:10.1016/j.jaac.2012.05.010.


  21. D’Angelo G, Pullmann MD, Lyon AR. Community engagement strategies for implementation of a policy supporting evidence-based practices: a case study of Washington state. Admin Pol Ment Health. 2015. doi:10.1007/s10488-015-0664-7.

  22. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res and Rev. 2012;69(2):123–57. doi:10.1177/1077558711430690.


  23. Aarons GA, Wells RS, Zagursky K, Fettes DL, Palinkas LA. Implementing evidence-based practice in community mental health agencies: a multiple stakeholder analysis. Am J Public Health. 2009;99(11):2087–95. doi:10.2105/AJPH.2009.161711.


  24. Willging CE, Green AE, Gunderson L, Chaffin M, Aarons GA. From a “perfect storm” to “smooth sailing”: policymaker perspectives on implementation and sustainment of an evidence-based practice in two states. Child Maltreat. 2015;20(1):24–36. doi:10.1177/1077559514547384.


  25. Willms DG, Best AJ, Taylor DW, Gilbert JR, Wilson DMC, Lindsay EA, Singer JA. Systematic approach for using qualitative methods in primary prevention research. Med Anthropol Q. 1990;4:391–409.


  26. Glaser B, Strauss A. The discovery of grounded theory: strategies for qualitative research. Chicago: Aldine; 1967.


  27. Whitley R, Gingerich S, Lutz WJ, Mueser KT. Implementing the illness management and recovery program in community mental health settings: facilitators and barriers. Psychiatr Serv. 2009;60(2):202–9. doi:10.1176/appi.ps.60.2.202.


  28. Mancini AD, Moser LL, Whitley R, McHugo GJ, Bond GR, Finnerty MT, Burns BJ. Assertive community treatment: facilitators and barriers to implementation in routine mental health settings. Psychiatr Serv. 2009;60(2):189–95. doi:10.1176/appi.ps.60.2.189.


  29. Bond GR, Becker DR, Drake RE, Rapp CA, Meisler N, Lehman AF, et al. Implementing supported employment as an evidence-based practice. Psychiatr Serv. 2001;52(3):313–22. doi:10.1176/appi.ps.52.3.313.


  30. Alexander JA, Weiner BJ, Shortell SM, Baker LC, Becker MP. The role of organizational infrastructure in implementation of hospitals’ quality improvement. Hosp Top. 2006;84(1):11–21.


  31. Isett KR, Burnam MA, Coleman-Beattie MA, Hyde PS, Morrissey JP, Magnabosco J, et al. The state policy context of implementation issues for evidence-based practices in mental health. Psychiatr Serv. 2007;58(7):914–21. doi:10.1176/appi.ps.58.7.914.


  32. Ringle VA, Read KL, Edmunds JM, Brodman DM, Kendall PC, Barg F, Beidas RS. Barriers to and facilitators in the implementation of cognitive-behavioral therapy for youth anxiety in the community. Psychiatr Serv. 2015;66(9):938–45. doi:10.1176/appi.ps.201400134.


  33. Brunette MF, Asher D, Whitley R, Lutz WJ, Wieder BL, Jones AM, McHugo GJ. Implementation of integrated dual disorders treatment: a qualitative analysis of facilitators and barriers. Psychiatr Serv. 2008;59(9):989–95. doi:10.1176/appi.ps.59.9.989.


  34. Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, et al. Evidence-based practice implementation strategies: results of a qualitative study. Community Ment Health. 2008;44(3):213–24. doi:10.1007/s10597-007-9109-4.


  35. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillan JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2015;44(2):177–94. doi:10.1007/s11414-015-9475-6.


  36. Lieberman AF, Van Horn P. Don’t hit my mommy. Washington, DC: Zero to Three Press; 2005.


  37. Jaycox LH. CBITS: Cognitive behavioral intervention for trauma in schools. Dallas: Sopris West; 2003.

  38. Chorpita BF, Daleiden EL. Structuring the collaboration of science and service in pursuit of a shared vision. J Clin Child Adolesc Psychol. 2014;43(2):323–38. doi:10.1080/15374416.2013.828297.


  39. Najavits LM. Seeking safety: a treatment manual for PTSD and substance abuse. New York: Guilford Press; 2002.


  40. Cohen JA, Mannarino AP, Deblinger E. Treating trauma and traumatic grief in children and adolescents. New York: Guilford Press; 2006.


  41. Sanders MR. Triple P-positive parenting program: towards an empirically validated multilevel parenting and family support strategy for the prevention of behavior and emotional problems in children. Clin Child Fam Psych Rev. 1999;2(2):71–90.



Acknowledgements

We appreciate the support that the Los Angeles County Department of Mental Health (LACDMH) has provided for this project. Funding for this research project was supported by NIMH grant R01 MH100134. We would also like to acknowledge Dana Saifan and Jessica Issacs for their assistance with coding the qualitative documents.

Funding

Funding for this research project, including the study design and data collection, analysis and interpretation of data, was supported by NIMH Grant R01 MH100134.

Availability of data and materials

The data that support the findings of this study are available from the Los Angeles County Department of Mental Health but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. Data are however available from the authors upon reasonable request and with permission of the Los Angeles County Department of Mental Health.

Author information


Contributions

JR, MB, NS, and AH coded and analyzed the qualitative documents. AH also provided expertise on qualitative coding methods. AL and LBF are the principal investigators of this multiple PI study and contributed to the study design, data collection, and interpretation of data. KP and LB furnished the site report documents and contributed to the interpretation of the data. All authors were involved in the preparation of the manuscript and read and approved the final manuscript.

Corresponding author

Correspondence to Jennifer Regan.

Ethics declarations

Ethics approval and consent to participate

Approval by the institutional review boards of the University of California, Los Angeles and the University of California, San Diego was obtained prior to the study. This study was conducted in compliance with APA ethical standards in treatment of individuals participating in research.

Consent for publication

Consent for publication of quotations from the site visit data sources was given by the Los Angeles County Department of Mental Health.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.



Cite this article

Regan, J., Lau, A.S., Barnett, M. et al. Agency responses to a system-driven implementation of multiple evidence-based practices in children’s mental health services. BMC Health Serv Res 17, 671 (2017). https://doi.org/10.1186/s12913-017-2613-5
