
Generalizing findings from a randomized controlled trial to a real-world study of the iLookOut, an online education program to improve early childhood care and education providers’ knowledge and attitudes about reporting child maltreatment

  • Chengwu Yang ,

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    chengwu.yang@nyu.edu (CY); bhlevi@psu.edu (BL)

    Affiliation Department of Epidemiology and Health Promotion, New York University College of Dentistry, New York City, New York, United States of America

  • Carlo Panlilio,

    Roles Investigation, Writing – original draft, Writing – review & editing

    Affiliation Department of Educational Psychology, Counseling and Special Education, Pennsylvania State University, University Park, Hershey, Pennsylvania, United States of America

  • Nicole Verdiglione,

    Roles Data curation, Investigation, Project administration, Writing – review & editing

    Affiliation Departments of Humanities, Penn State College of Medicine, Hershey, Pennsylvania, United States of America

  • Erik B. Lehman,

    Roles Data curation, Formal analysis, Investigation, Validation, Visualization, Writing – review & editing

    Affiliation Departments of Population Health Sciences, Penn State College of Medicine, Hershey, Pennsylvania, United States of America

  • Robert M. Hamm,

    Roles Investigation, Methodology, Writing – review & editing

    Affiliation Department of Family and Preventive Medicine, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, United States of America

  • Richard Fiene,

    Roles Investigation, Writing – original draft, Writing – review & editing

    Affiliations Departments of Psychology & Human Development Research Center, Pennsylvania State University, Middletown, Pennsylvania, United States of America, Research Institute for Key Indicators, Middletown, Pennsylvania, United States of America

  • Sarah Dore,

    Roles Data curation, Investigation, Project administration, Writing – review & editing

    Affiliation Departments of Humanities, Penn State College of Medicine, Hershey, Pennsylvania, United States of America

  • David E. Bard,

    Roles Investigation, Writing – review & editing

    Affiliation Department of Pediatrics, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, United States of America

  • Breanna Grable,

    Roles Data curation, Project administration, Writing – review & editing

    Affiliation Departments of Psychology & Human Development Research Center, Pennsylvania State University, Middletown, Pennsylvania, United States of America

  • Benjamin Levi

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – original draft, Writing – review & editing

    chengwu.yang@nyu.edu (CY); bhlevi@psu.edu (BL)

    Affiliation Departments of Humanities & Pediatrics, Penn State College of Medicine, Hershey, Pennsylvania, United States of America

Abstract

In recent years, real-world studies (RWS) have attracted increasing interest because they can generate more realistic and generalizable results than randomized controlled trials (RCT). In 2017, we published an RCT involving 741 early childhood care and education providers (CCPs). That trial constituted Phase I of our iLookOut for Child Abuse project (iLookOut), an online, interactive learning module about reporting suspected child maltreatment. It demonstrated that, in an RCT setting, iLookOut is effective at improving CCPs’ knowledge of and attitudes toward child maltreatment reporting. However, whether that RCT’s results generalize to a real-world setting remained unknown. To address this question, we designed and conducted this large RWS with 11,065 CCPs, which constitutes Phase II of iLookOut. We hypothesized replication of the earlier RCT findings, i.e., that iLookOut can improve CCPs’ knowledge of and attitudes toward child maltreatment reporting in a real-world setting. In addition, this RWS explored whether demographic factors affect CCPs’ performance. The results confirmed the generalizability of the previous RCT’s findings in a real-world setting, yielding effect sizes for knowledge and attitudes similar to those found in the earlier RCT: Cohen’s d for knowledge improvement was 0.95 in the RCT and 0.96 in this RWS; Cohen’s d for attitude improvement was 0.98 in the RCT and 0.80 in this RWS. We also found several significant differences in knowledge and attitude improvement with regard to age, race, education, and employment status. In conclusion, iLookOut improves CCPs’ knowledge and attitudes about child maltreatment prevention and reporting in a real-world setting. The generalizability of the initial RCT findings to this RWS provides strong evidence that iLookOut will be effective in other real-world settings, and it can serve as a useful model for other interventions aimed at preventing child maltreatment.

Clinical trial registration for the original RCT: NCT02225301 (ClinicalTrials.gov Identifier)

Introduction

While randomized controlled trials (RCT) have long been seen as the “gold standard” for evaluating the efficacy of interventions, there are well-known limitations to their generalizability [1]. Accordingly, there has been growing interest in real-world studies (RWS) that generate real-world evidence (RWE), which is more realistic and generalizable [2–9], and RWE is increasingly valued by regulators and payers [10]. Moreover, RWE and RCTs can co-exist and complement each other [9].

Recently, we published data from an RCT of an online educational intervention, iLookOut for Child Abuse (iLookOut), showing that it improved early childhood care and education providers’ (CCPs’) knowledge and attitudes about child maltreatment and its reporting [11]. In this follow-up study, an RWS, we evaluate whether those results generalize to a broad population of CCPs in a real-world setting.

There are more than 675,000 confirmed cases of child maltreatment annually in the United States [12], but less than 1% of these are reported by CCPs (U.S. DHHS, 2017). This extremely low reporting rate by CCPs is alarming given that about 12 million U.S. children are served in some form of child care setting, that children five years old or younger account for 46% of confirmed maltreatment and more than 75% of maltreatment-related deaths (U.S. DHHS, 2017), and that the true incidence of child maltreatment is likely much higher than currently detected [13, 14]. Such underreporting suggests a need for CCPs to become better prepared to protect young children from maltreatment by improving their knowledge of and attitudes toward child maltreatment reporting. As the Institute of Medicine and others have noted, a key obstacle to improving awareness and reporting is the lack of evidence-based interventions [15–17]. In addition, the US Preventive Services Task Force (USPSTF) recently called for more evidence-based primary care interventions to prevent child maltreatment [12]. Several small studies have evaluated in-person training for CCPs [18, 19], as well as a brief online intervention [20, 21]. However, large studies involving scalable interventions are still lacking.

To meet this need, we created iLookOut, an interactive online learning program designed specifically for CCPs (https://ilookoutproject.org/). An initial RCT using a test/re-test design with 741 participants demonstrated the feasibility of this three-hour online training, as well as its efficacy at increasing knowledge and changing attitudes about child maltreatment and its reporting [11]. Though this initial trial was promising, with large Cohen’s d effect sizes for knowledge (0.95) and attitudes (0.98), its generalizability was limited by several factors, notably the potential for selection bias: participants were enrolled only if the director of their child care program responded to the recruitment mailing; family- and home-based CCP programs were under-represented, as were racial and ethnic minorities; and enrollment was limited to a four-week period in early summer. The sample size also limited the opportunity for in-depth comparisons among subgroups.

To address these limitations, the present RWS used a statewide, open-enrollment design to enlist a larger, more representative sample of CCPs. We hypothesized that iLookOut’s efficacy at increasing knowledge and improving attitudes would be confirmed in this real-world sample; our exploratory aim was to evaluate the impact of key demographic characteristics.

Materials and methods

Design

The Penn State College of Medicine Institutional Review Board approved this study prior to its initiation (IRB #: 1243). This RWS employed an open-enrollment, single-group, pre- and post-test design. Participants completed a demographic questionnaire, as well as previously validated knowledge and attitude measures regarding child maltreatment and its reporting [11]. Given the observational nature of this RWS, we ensured that the manuscript adheres to the appropriate EQUATOR Network guidelines, specifically the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) Statement [22].

Participants

Because this was an open-enrollment RWS, participants were not actively recruited. However, all mandated reporters in Pennsylvania (including CCPs) are required by law to complete mandated reporter training, and iLookOut was one of more than a dozen state-approved trainings listed on Pennsylvania’s Department of Human Services website, available online at no charge. As such, online searches and word of mouth were the means of dissemination. Participant data reported here are from CCPs who completed iLookOut between January 2015 and March 2018. CCPs provided online informed consent prior to participating and earned three hours of professional development credit for completing the learning program. No other incentives or remuneration were provided.

Intervention

The iLookOut online learning program uses an interactive, video-based storyline in which learners take on the role of a teacher of 4–5-year-olds at a child care facility. As key events unfold through interactions involving children, parents, and co-workers (all played by actors), learners must decide how best to respond. At some points, learners are posed questions and, based on their answers, are presented with didactic material about various aspects of child maltreatment; at other points, they must choose how to respond to events in the story. Throughout the learning program, CCPs can access multiple resource files covering definitions of maltreatment, facts about maltreatment, red flags, etc. [11].

Measures

The pre- and post-test comprise two parts. The first is a 21-item, true/false, expert-validated instrument previously described [11]. It measures individuals’ knowledge about what constitutes child maltreatment, risk factors for maltreatment, and legal requirements for reporting suspected maltreatment. A correct answer to each of the 21 true/false items is scored as 1 point and an incorrect answer as 0 points, so the total knowledge score ranges from 0 to 21, with higher scores representing more knowledge about child maltreatment. The second part contains 13 items, rated on 7-point Likert-style scales, from a previously validated instrument [23] adapted to comport with Pennsylvania jurisdictional standards. It measures individuals’ attitudes toward reporting potential child maltreatment. An individual’s attitude score is the average of the 13 items and ranges from 1 to 7, with higher scores representing a more positive attitude toward reporting potential child maltreatment. The pre- and post-test items were identical, but to minimize recall bias, their order was changed between the pre- and post-test.
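The scoring rules described above can be sketched as follows (a minimal illustration: the answer key, respondent data, and function names are hypothetical; only the scoring arithmetic — sum of 21 binary items, mean of 13 Likert items — comes from the text):

```python
def knowledge_score(answers, key):
    """Total knowledge score: 1 point per correct true/false item, range 0-21."""
    assert len(answers) == len(key) == 21
    return sum(1 for a, k in zip(answers, key) if a == k)

def attitude_score(ratings):
    """Attitude score: mean of the 13 Likert items (each 1-7), range 1-7."""
    assert len(ratings) == 13 and all(1 <= r <= 7 for r in ratings)
    return sum(ratings) / len(ratings)

# Hypothetical respondent: 16 of 21 knowledge items correct,
# all 13 attitude items rated 5.
key = [True] * 21
answers = [True] * 16 + [False] * 5
print(knowledge_score(answers, key))   # 16
print(attitude_score([5] * 13))        # 5.0
```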

Sample size and statistical analysis

Given the RWS nature of this study, no a priori sample size estimation was planned. However, post-hoc power analyses were implemented to check the statistical power for some important subgroup analyses [4]. We also compared participant demographics between the initial RCT and this RWS.

As with the RCT, the statistical analysis of this RWS examined iLookOut’s impact on CCPs’ knowledge and attitudes related to child maltreatment and its reporting. The two primary outcome variables were the total knowledge score and the total attitude score, both measured as change, i.e., total score at post-test minus total score at pre-test. The analysis focused on whether the present RWS confirmed the results of the initial RCT. To compare effect sizes between the RCT and the RWS, we used two measures: 1) the absolute difference, i.e., the measured pre- to post-test change in the RWS minus the measured change in the initial RCT; and 2) Cohen’s d [24]. In addition, we explored the impact of demographic factors on the two primary outcome variables through analysis of covariance (ANCOVA), treating demographic variables as covariates and adjusting for pre-measurement scores. The demographic variables included age, gender, race/ethnicity, education, employment, parent/guardian status, prior training status, work environment, years as a practitioner, primary job responsibilities, and religiosity. We used SAS, version 9.4, for statistical analyses and G*Power, version 3.1.9, for post hoc power analyses.
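As a concrete illustration of the effect-size measure, a standardized mean difference for a pre/post design can be computed as the mean change divided by the standard deviation of the change scores. This is one common formulation of Cohen’s d; the paper cites Cohen [24] but does not state the exact denominator used, so that choice — and the toy data — are assumptions:

```python
from statistics import mean, stdev

def cohens_d_change(pre, post):
    """Cohen's d for paired pre/post scores: mean change / SD of change
    (one common formulation; other denominators, e.g. the pooled SD,
    are also in use)."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / stdev(diffs)

# Hypothetical knowledge scores for five participants
pre = [14, 15, 13, 16, 12]
post = [17, 18, 16, 18, 15]
print(round(cohens_d_change(pre, post), 2))  # 6.26
```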

Results

During the 38 months of the RWS reported here, 11,065 CCPs completed the iLookOut online training. Compared to the CCPs in the initial RCT, these RWS participants were more representative of the general population of CCPs in Pennsylvania, particularly in the enrollment of Black (20.8% vs. 8.0%) and male (10.9% vs. 2.3%) participants. In addition, the CCPs in this RWS were younger (48.0% vs. 40.4% under age 30), and a greater proportion worked in more urban areas (36.4% vs. 22.1%). Table 1 compares the full demographics of the two studies.

Table 1. Comparisons of demographic characteristics of early childcare professionals.

https://doi.org/10.1371/journal.pone.0227398.t001

Table 2 compares the iLookOut training’s effect sizes on knowledge and attitude scores between this RWS and the RCT, demonstrating improved knowledge and attitudes about child maltreatment reporting in both studies. The pre- to post-test knowledge score increased by 2.80 for RWS participants, compared to 2.65 in the initial RCT, a 5.7% relative change. Cohen’s d for the total knowledge score was 0.96 in this RWS versus 0.95 in the RCT, a 1% relative change. The pre- to post-test change in average attitude score was 0.50 for RWS participants versus 0.59 in the initial RCT, a -15.3% relative change. Cohen’s d for the average attitude score was 0.80 in this RWS versus 0.98 in the RCT, a relative change of -18.4%.
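The relative changes quoted above can be checked directly from the reported values (the function name is ours; the inputs are the figures reported in the text):

```python
def rel_change(rws, rct):
    """Relative change of the RWS value versus the RCT value, in percent."""
    return 100 * (rws - rct) / rct

print(round(rel_change(2.80, 2.65), 1))  # knowledge score change: 5.7
print(round(rel_change(0.96, 0.95), 1))  # knowledge Cohen's d: 1.1 (~1%)
print(round(rel_change(0.50, 0.59), 1))  # attitude score change: -15.3
print(round(rel_change(0.80, 0.98), 1))  # attitude Cohen's d: -18.4
```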

Table 2. Comparisons of effect sizes on knowledge and attitude scores.

https://doi.org/10.1371/journal.pone.0227398.t002

Table 3 summarizes the results of exploratory multivariate analyses (ANCOVA) for each of the two outcome variables (knowledge and attitude scores) with all of the demographic variables. After adjustment for pre-measurement scores and all other demographic variables, only four demographic factors (age, race, education, and employment) showed an impact on either outcome variable, with age and education positively correlated with increases in knowledge scores.

Table 3. Summary of analysis of covariance (ANCOVA) results.

https://doi.org/10.1371/journal.pone.0227398.t003

Post hoc power analysis indicates that with a total sample size of 11,065, at an alpha level of 0.05 and 80% power, the ANCOVA with 11 covariates could detect an effect size as small as 0.04 among six groups; using an effect-size cutoff of 0.25, the power would approach 99.5%.

Discussion

The results from this RWS demonstrate that, in a large, representative sample of child care professionals (CCPs), the online iLookOut learning program is effective at improving knowledge and changing attitudes about child maltreatment and its reporting. These findings confirm the conclusions of the initial RCT of iLookOut and demonstrate the feasibility of scaling this evidence-based, online mandated reporter training. This is notable insofar as more than 11,000 CCPs completed iLookOut even though no special incentives were offered, and they reported being highly satisfied with the learning experience (paper forthcoming). No significant differences were identified with regard to CCPs’ parenting status, previous training, work environment, years as a practitioner, primary job responsibility, or religiosity. However, age, race, education, and employment affected changes in knowledge or attitude scores, with older and more highly educated CCPs achieving greater gains in knowledge scores.

The generalizability of the initial RCT findings provides supporting evidence that the iLookOut online learning program will be effective in other real-world settings and may be a useful model for other interventions aimed at preventing child maltreatment [12]. iLookOut’s general storyline and overall format are applicable to all kinds of CCPs in all U.S. states, in part because state-specific information is housed in discrete learning modules (within the learning program) that can readily be adapted to comport with the laws and policies of different states. The efficacy of iLookOut does not appear to be affected by previous training, work environment, years as a practitioner, primary job responsibility, parenting status, or religiosity. However, larger gains in knowledge were seen in CCPs who were older, more highly educated, employed seasonally, or white. More research is warranted to better understand the underpinnings of these differences and how best to optimize gains in knowledge for all CCPs.

The statistical analyses reported here focus on effect sizes instead of p-values, for several reasons. First, p-values are not a good measure of evidence [25]. Second, the misuse and misinterpretation of p-values has led both researchers and the American Statistical Association to raise concerns about the limitations of p-value-driven conclusions [26, 27]. Third, the very large sample size (over 11,000) of this RWS could yield statistically significant findings for even very small effect sizes that have no clinical significance [28]. Fourth, the large difference in sample size between the initial RCT and this RWS renders effect sizes a more meaningful comparison than p-values. Finally, for the proposed subgroup analyses involving many demographic covariates, p-values are less likely to yield meaningful findings [29]. Accordingly, we compared effect sizes by examining the overlap of their confidence limits.

The present findings are limited by potential biases encountered in all RWS, including selection bias, information bias, and confounding [3]. Multivariate analysis (ANCOVA) was used to try to account for these factors, and the initial RCT does provide additional reassurance that the present findings are valid. However, without qualitative data, an explanatory model for the present findings will remain incomplete.

Conclusion

This real-world study of more than 11,000 early childhood care and education providers (CCPs), who were neither recruited nor incentivized to complete iLookOut for Child Abuse, confirms that iLookOut significantly improves knowledge and attitudes regarding child maltreatment and its reporting. These results provide strong evidence that interactive, online interventions to help prevent child maltreatment are both effective and scalable. A 5-year randomized controlled trial (https://clinicaltrials.gov/ct2/show/NCT02225301?term=NCT02225301&rank=1) is currently underway to evaluate how well iLookOut helps CCPs identify and report true child maltreatment.

Acknowledgments

We are grateful to all study participants and their institutions; to the Pennsylvania early childhood education and care agencies which facilitated the research; to participants and expert reviewers in the pilot phases; to the Center for the Application of Information Technologies (CAIT) for technological support; and to the Penn State University Center for the Protection of Children for general support.

References

  1. Kaptchuk TJ. The double-blind, randomized, placebo-controlled trial: gold standard or golden calf? J Clin Epidemiol. 2001;54(6):541–9. pmid:11377113
  2. Garrison LP Jr, Neumann PJ, Erickson P, Marshall D, Mullins CD. Using real-world data for coverage and payment decisions: The ISPOR real-world data task force report. Value in Health. 2007;10(5):326–35. pmid:17888097
  3. Sherman RE, Anderson SA, Dal Pan GJ, Gray GW, Gross T, Hunter NL, et al. Real-World Evidence—What Is It and What Can It Tell Us? N Engl J Med. 2016;375(23):2293–7. pmid:27959688
  4. US Food and Drug Administration. Use of real-world evidence to support regulatory decision-making for medical devices. Guidance for industry and Food and Drug Administration staff. Docket Number: FDA-2016-D-2153. 2017.
  5. Sun X, Tan J, Tang L, Guo JJ, Li X. Real world evidence: experience and lessons from China. BMJ. 2018;360:j5262. pmid:29437644
  6. Corrigan-Curay J, Sacks L, Woodcock J. Real-world evidence and real-world data for evaluating drug safety and effectiveness. JAMA. 2018;320(9):867–8. pmid:30105359
  7. US Food and Drug Administration. Framework for FDA's real-world evidence program. 2018.
  8. Klonoff DC. The Expanding Role of Real-World Evidence Trials in Health Care Decision Making. Journal of Diabetes Science and Technology. 2019:1932296819832653.
  9. Evans K. Real World Evidence: Can We Really Expect It to Have Much Influence? Drugs - Real World Outcomes. 2019;6(2):43–5. pmid:31016548
  10. Calvert MJ, O'Connor DJ, Basch EM. Harnessing the patient voice in real-world evidence: the essential role of patient-reported outcomes. Nature Reviews Drug Discovery. 2019;18(5).
  11. Mathews B, Yang C, Lehman EB, Mincemoyer C, Verdiglione N, Levi BH. Educating early childhood care and education providers to improve knowledge and attitudes about reporting child maltreatment: A randomized controlled trial. PLoS ONE. 2017;12(5):e0177777. pmid:28542285
  12. Curry SJ, Krist AH, Owens DK, Barry MJ, Caughey AB, Davidson KW, et al. Interventions to prevent child maltreatment: US Preventive Services Task Force Recommendation Statement. JAMA. 2018;320(20):2122–8. pmid:30480735
  13. Kohl PL, Jonson-Reid M, Drake B. Time to leave substantiation behind: Findings from a national probability study. Child Maltreatment. 2009;14(1):17–26. pmid:18971346
  14. Stoltenborgh M, Bakermans-Kranenburg MJ, van IJzendoorn MH, Alink LR. Cultural-geographical differences in the occurrence of child physical abuse? A meta-analysis of global prevalence. International Journal of Psychology. 2013;48(2):81–94. pmid:23597008
  15. Alvarez KM, Kenny MC, Donohue B, Carpin KM. Why are professionals failing to initiate mandated reports of child maltreatment, and are there any empirically based training programs to assist professionals in the reporting process? Aggression and Violent Behavior. 2004;9(5):563–78.
  16. Dinehart L, Kenny MC. Knowledge of child abuse and reporting practices among early care and education providers. Journal of Research in Childhood Education. 2015;29(4):429–43.
  17. IOM, NRC. New directions in child abuse and neglect research. The National Academies Press, Washington, DC; 2014.
  18. McGrath P, Cappelli M, Wiseman D, Khalil N, Allan B. Teacher awareness program on child abuse: a randomized controlled trial. Child Abuse & Neglect. 1987;11(1):125–32.
  19. Khan A, Rubin D, Winnik G. Evaluation of the mandatory child abuse course for physicians: do we need to repeat it? Public Health. 2005;119(7):626–31. pmid:15925678
  20. Kenny MC. Web-based training in child maltreatment for future mandated reporters. Child Abuse & Neglect. 2007;31(6):671–8.
  21. Kenny MC, Lopez-Griman AM, Donohue B. Development and initial evaluation of a cost-effective, internet-based program to assist professionals in reporting suspected child maltreatment. Journal of Child & Adolescent Trauma. 2017;10(4):385–93.
  22. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med. 2007;4(10):e296. pmid:17941714
  23. Walsh K, Rassafiani M, Mathews B, Farrell A, Butler D. Exploratory factor analysis and psychometric evaluation of the teacher reporting attitude scale for child sexual abuse. J Child Sex Abus. 2012;21(5):489–506. pmid:22994689
  24. Cohen J. A power primer. Psychological Bulletin. 1992;112(1):155. pmid:19565683
  25. Piantadosi S. Clinical trials: a methodologic perspective. John Wiley & Sons; 2017.
  26. Nuzzo R. Scientific method: statistical errors. Nature News. 2014;506(7487):150.
  27. Wasserstein RL, Lazar NA. The ASA's Statement on p-Values: Context, Process, and Purpose. The American Statistician. 2016;70(2):129–33.
  28. Yang C, Vrana KE. Rescuing Suboptimal Patient-Reported Outcome Instrument Data in Clinical Trials: A New Strategy. Healthcare (Basel). 2018;6(1).
  29. Rothwell PM. Subgroup analysis in randomised controlled trials: importance, indications, and interpretation. The Lancet. 2005;365(9454):176–86.