A Sustainable Alternative to the Gold Standard EBP: Validating Existing Programs

The Journal of Behavioral Health Services & Research

Abstract

Increasingly, jurisdictions are requiring the adoption of certified evidence-based programs (EBPs) for behavioral health and human services for children, youth, and their families. Often, such adoption of proven, prepackaged programs is done without regard to existing, yet effective, locally developed program models. This study presents a replicable six-step process that identifies key researched elements from within existing programs and creates program-specific fidelity scoring and tracking tools for routine use during clinical supervision to ensure that these elements are implemented well. A case study is used to demonstrate that a locally developed program model, when implemented with high fidelity, can serve clients with outcomes comparable to its EBP counterpart at a much lower cost. The results underscore the importance of one common element among EBPs and effective services in general: measuring key elements of the service and client outcomes and feeding these data back to clinicians for continuous improvement.
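The central mechanism the abstract describes, scoring program-specific fidelity elements during routine supervision and feeding the results back to clinicians, can be sketched in a few lines. The element names, weights, and review threshold below are hypothetical placeholders, not the instrument the article developed; this is a minimal illustration of the scoring-and-feedback loop, assuming supervisors rate each element on a 0-1 scale.

from dataclasses import dataclass

# Hypothetical fidelity elements and weights, for illustration only;
# the article's actual program-specific instrument is not reproduced here.
FIDELITY_ELEMENTS = {
    "engagement_contact_frequency": 0.3,
    "family_goal_setting": 0.3,
    "skill_practice_in_session": 0.2,
    "outcome_data_reviewed": 0.2,
}

REVIEW_THRESHOLD = 0.8  # assumed cutoff that triggers supervisory follow-up

@dataclass
class SupervisionReview:
    clinician: str
    ratings: dict  # element name -> supervisor rating on a 0-1 scale

    def fidelity_score(self):
        # Weighted average across the tracked elements; unrated
        # elements count as zero so gaps stay visible, not hidden.
        return sum(weight * self.ratings.get(name, 0.0)
                   for name, weight in FIDELITY_ELEMENTS.items())

    def feedback(self):
        # Turn the score into a short message for the clinician,
        # naming the specific elements that need attention.
        score = self.fidelity_score()
        weak = [name for name in FIDELITY_ELEMENTS
                if self.ratings.get(name, 0.0) < REVIEW_THRESHOLD]
        if not weak:
            return f"{self.clinician}: fidelity {score:.2f}, on model."
        return (f"{self.clinician}: fidelity {score:.2f}, "
                f"review elements: {', '.join(weak)}")

review = SupervisionReview(
    clinician="Clinician A",
    ratings={
        "engagement_contact_frequency": 0.9,
        "family_goal_setting": 0.6,
        "skill_practice_in_session": 0.8,
        "outcome_data_reviewed": 1.0,
    },
)
print(review.feedback())
# -> Clinician A: fidelity 0.81, review elements: family_goal_setting

Run per review during supervision, a tracker like this yields both a trend line per clinician and a per-element breakdown, which is the continuous-improvement feedback the abstract emphasizes.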



Acknowledgements

This research was supported by a grant from the Allegheny County Department of Human Services awarded to Wesley Spectrum Services for an evaluation of the WSIH program. The authors thank all of the staff of Wesley Spectrum Services, especially Doug Muetzel (CEO) and Pam Weaver (CPO), and the staff of the WSIH program, who worked closely with the evaluation consultants in developing the Model Value Management six steps. We also thank Katy Collins, PhD, who was affiliated with the University of Pittsburgh Graduate School of Public and International Affairs at the time of this research, for her work on the cost–benefit analysis of these services. We thank Michele Garrity, also formerly of the Graduate School of Public and International Affairs, for her final editing. Finally, we thank our Pittsburgh research and community advisors who encouraged this work and supported the exploration of credible alternatives to pre-packaged EBP models: Ed Ricci, PhD, University of Pittsburgh School of Public Health; Ed Mulvey, PhD, University of Pittsburgh School of Medicine; Marybeth Rauktis, PhD, University of Pittsburgh School of Social Work; Nancy Kukovich, CEO, Adelphoi Village; Rochelle Haimes, COA national consultant; Brandi Mauck, CEO, Allegheny Health Choices; Robert Sheen, PhD, clinical consultant; and Laura Maines, CEO, Every Child, Inc.

Author information


Corresponding author

Correspondence to Pamela Meadowcroft, PhD.

Ethics declarations

Conflict of Interest Statement

The authors served as paid outside evaluation consultants for Wesley Spectrum Services, the case study site for the research reported in this manuscript. They had no role in developing or maintaining the program, serving only as outside evaluators. Although program staff were actively involved in data collection and in discussions about the meaning of the information, the authors had full access to all relevant data and were under no obligation to report findings in any particular way. None of the authors has been employed by or engaged with any of the companies related to the standardized measurement tools referenced in this case study, nor with any of the comparison programs cited in this article (e.g., Multisystemic Therapy). The authors take responsibility for the integrity and accuracy of the data analysis.


Cite this article

Meadowcroft, P., Townsend, M.Z. & Maxwell, A. A Sustainable Alternative to the Gold Standard EBP: Validating Existing Programs. J Behav Health Serv Res 45, 421–439 (2018). https://doi.org/10.1007/s11414-018-9599-6
