
Replication in Prevention Science

Published in Prevention Science

Abstract

Replication research is essential for the advancement of any scientific field. In this paper, we argue that prevention science will be better positioned to help improve public health if (a) more replications are conducted; (b) those replications are systematic, thoughtful, and conducted with full knowledge of the trials that have preceded them; and (c) state-of-the-art techniques are used to summarize the body of evidence on the effects of the interventions. Under real-world demands it is often not feasible to wait for multiple replications to accumulate before making decisions about intervention adoption. To help individuals and agencies make better decisions about intervention utility, we outline strategies that can be used to help understand the likely direction, size, and range of intervention effects as suggested by the current knowledge base. We also suggest structural changes that could increase the amount and quality of replication research, such as the provision of incentives and a more vigorous pursuit of prospective research registers. Finally, we discuss methods for integrating replications into the roll-out of a program and suggest that strong partnerships with local decision makers are a key component of success in replication research. Our hope is that this paper can highlight the importance of replication and stimulate more discussion of the important elements of the replication process. We are confident that, armed with more and better replications and state-of-the-art review methods, prevention science will be in a better position to positively impact public health.
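The evidence-summarizing techniques the abstract alludes to include meta-analytic pooling of effect sizes across replications. As a minimal illustration (not the authors' own method), the sketch below applies standard inverse-variance (fixed-effect) pooling to three hypothetical replication results; the function name and the example numbers are assumptions for illustration only.

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance (fixed-effect) pooling of study effect sizes.

    effects   -- per-study effect estimates (e.g., standardized mean differences)
    variances -- their sampling variances
    Returns (pooled_effect, standard_error, ci_low, ci_high) with a 95% CI.
    """
    # Each study is weighted by the inverse of its sampling variance,
    # so more precise studies contribute more to the pooled estimate.
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se, pooled - 1.96 * se, pooled + 1.96 * se

# Three hypothetical replications of a prevention program:
est, se, lo, hi = fixed_effect_pool([0.30, 0.15, 0.25], [0.01, 0.02, 0.015])
```

The resulting confidence interval conveys exactly what the abstract asks decision makers to weigh: the likely direction, size, and range of the intervention effect given the trials run so far. (A random-effects model would be the usual choice when replications are expected to differ systematically.)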


Notes

  1. In this paper we use the terms program, intervention, and treatment interchangeably.

  2. See http://www.preventionresearch.org/StandardsofEvidencebook.pdf.

  3. See http://ies.ed.gov/ncee/wwc/

  4. See http://www.nrepp.samhsa.gov/

  5. See http://www.cochrane.org

  6. See http://www.campbellcollaboration.org



Author Note

This paper is the result of the deliberations of the Standards of Evidence Taskforce, convened and funded by the Society for Prevention Research (Brian R. Flay, Chair). The Board of Directors of the Society for Prevention Research is pleased to have supported the preparation of this paper in hopes that it will stimulate further discussion about the importance of replication.

The views expressed in this paper are the authors’, and do not necessarily reflect the views of the authors’ institutions or the Society for Prevention Research. With the exception of the first author, order of authorship is alphabetical.

We thank Richard Catalano, Harris Cooper, Adam Haldahl, Mark Lipsey, and Patrick Tolan for their valuable feedback on earlier versions of this paper, and Kirsten Sundell for editing the final manuscript.

Corresponding author

Correspondence to Jeffrey C. Valentine.


Cite this article

Valentine, J.C., Biglan, A., Boruch, R.F. et al. Replication in Prevention Science. Prev Sci 12, 103–117 (2011). https://doi.org/10.1007/s11121-011-0217-6
