  • Research article
  • Open access

Changes in practitioners’ attitudes, perceived training needs and self-efficacy over the implementation process of an evidence-based parenting program

Abstract

Background

Evidence-based family support programs such as the Triple P – Positive Parenting Program have the potential to enhance the well-being of children and families. However, they cannot achieve their expected outcomes if insufficient attention is paid to the implementation process. It has been demonstrated that practitioners’ attitudes towards evidence-based programs (EBPs), perceived training needs and self-efficacy for working with parents influence implementation outcomes (e.g., program acceptability, adoption, adherence and sustainability). At the same time, the experience of being involved in the implementation process of an EBP could enhance practitioners’ perceptions of the initiative. This study aimed to assess changes in practitioners’ attitudes, perceived training needs and self-efficacy over a two-year EBP implementation process, in interaction with their appraisal of their organization’s capacity to implement the EBP.

Methods

In the province of Quebec, Canada, Triple P was implemented and evaluated in two communities. Ninety-nine practitioners from various organizations completed questionnaires shortly before their training in Triple P and two years later.

Results

Findings show that practitioners who displayed more initial skepticism regarding their organization’s capacity to implement the program reported greater improvements in attitudes over time, while practitioners who showed more optimism at baseline reported a greater decrease in their perceived training needs. Practitioners’ self-efficacy increased moderately regardless of perceived organizational capacity.

Conclusions

These results are encouraging given that more positive perceptions of EBPs could foster the systematic use of these programs in communities, for the potential benefit of a greater number of families.


Background

It is increasingly recognized that choosing empirically supported interventions is not enough to improve the well-being of children and families. The way evidence-based programs (EBPs) are implemented within existing delivery systems also matters. Implementation is considered the “missing link” between research and practice [1]. According to Durlak and DuPre [2], the mean effect size of a program’s outcomes can be up to twelve times higher when ideal implementation conditions are met. Optimal program outcomes are contingent on the achievement of implementation outcomes such as program adoption (intention to try), adherence (program delivered as intended) and sustainability (sustained use) [3].

Practitioners’ attitudes towards EBPs, perceived training needs and self-efficacy for working with parents, in particular, have been shown to predict these positive implementation outcomes [4, 5]. At the same time, the experience of being involved in the implementation of an EBP could change practitioners’ perceptions regarding the relevance of EBPs in their practice, potentially reducing their resistance to these programs, sometimes considered a major barrier to the systematic adoption of EBPs in communities [6, 7]. This study thus aims to document changes over time in the attitudes, perceived training needs and self-efficacy of practitioners involved in the implementation of an evidence-based parenting program, namely the Triple P – Positive Parenting Program.

Attitudes, both affective and cognitive in nature, play an important role in orienting people’s decisions and behaviors [8, 9] and are essential components of many motivational theories [10]. Drawing on these theoretical perspectives, many authors have investigated practitioners’ attitudes as predictors of implementation outcomes [11, 12]. Studies have shown that favorable attitudes towards EBPs are related to program adherence [4], commitment to and satisfaction with EBP training, and subsequent use of EBPs [13]. Various factors also appear to influence the valence and intensity of practitioners’ attitudes towards EBPs, such as their prior knowledge of these programs [14], level of education and amount of previous experience as clinicians [15].

Practitioners’ perceived training needs related to their intervention abilities are considered a dimension of motivational readiness for change, according to Simpson’s [16] conceptual framework for transferring research into practice. When practitioners perceive that they could benefit from further training to enhance their work with clients, they may be more inclined to bring about changes in their practice. One way of achieving this could be to engage practitioners in the implementation process of an EBP. Higher motivational readiness, measured, for example, among practitioners in the field of addiction treatment through a combination of perceived pressure to change and perceived training needs, has been linked to increased adherence to the core components of a cognitive-behavioral EBP [17].

According to Bandura [18], self-efficacy refers to a person’s confidence in his/her capability to perform a specific task and is thought to have a greater influence on actual behavior and performance than the person’s objective ability to do the task [19]. In the implementation field, self-efficacy refers to the degree of practitioners’ confidence in their ability to deliver the program components. When it comes to delivering evidence-based parenting programs in particular, practitioners’ level of self-efficacy is considered an important predictor of implementation outcomes such as increased program use [5, 20] and increased ability to deliver the program with both flexibility and fidelity to its core components [21]. Mazzucchelli and Ralph [22] conceptualize self-efficacy (the capacity to undertake specific therapeutic tasks) as a component of self-regulation (the ability to manage one’s own emotions and behaviors to achieve specific goals), along with self-management (the capacity to define and monitor goals for oneself and the client), personal agency (the tendency to attribute change to clients’ and one’s own efforts rather than to chance), self-sufficiency (the ability to be an independent problem solver who also seeks support when needed) and problem solving (the capacity to define a problem and select strategies to overcome it). According to these authors, self-regulatory skills are crucial because they allow practitioners to change their own behavior in response to cues and information about the current needs of parents, making them more effective when working with them, independently of the organizational culture or context [22].

While practitioners’ attitudes towards EBPs, perceived training needs and self-efficacy have mainly been examined as determinants of a program’s efficacy or effectiveness, few studies have investigated changes in these variables over time, particularly in the child and family services field. Regarding attitudes, Lim et al. [14] observed an increase in the appeal of EBPs as perceived by community mental health practitioners immediately following their participation in three workshops on evidence-based techniques intended to decrease internalizing and externalizing problems among youths. Another study involving five measures over a 14-month period yielded different results, with no changes in attitudes towards EBPs being observed among practitioners in the child welfare sector who received training in an evidence-based parenting program [23]. However, as pointed out by the authors of this study, the study context did not involve a “full implementation strategy.” Had such a strategy been applied, it is possible that training in this program would have had a greater impact on participants’ attitudes. The same limitation applies to a study conducted over a two-year period in which no changes were found in perceived training needs among substance abuse treatment counselors who participated in a workshop on EBPs at mid-point in the study [24]. However, being trained in an EBP appears to have a significant positive effect on practitioners’ self-efficacy for delivering the program’s components. Studies in various fields, such as the promotion of healthy habits among children [25] and parenting skills training [5, 26], have demonstrated this effect, with the interval between measures ranging from a few days to more than two years.

In summary, studies examining the evolution of practitioners’ attitudes, perceived training needs and self-efficacy have yielded mixed results. Moreover, most of these studies did not take place in a context where practitioners participated in a structured implementation process. Such a process involves multiple steps requiring sensitivity to the context and the point of view of actors in the field [27]. Among the many frameworks describing the steps or stages of implementation [28], the present study used the Quality Implementation Framework (QIF) as the basis for the hypotheses formulated. The QIF was developed by Meyers, Durlak, and Wandersman [29] by synthesizing 25 previous models. It conceptualizes the implementation process in terms of fourteen steps, such as conducting a fit assessment between the host setting and the chosen program, recruiting and training staff, and creating an ongoing monitoring system to provide technical assistance and supportive feedback. The last step of this process is labeled “learning from experience.” The QIF is a cyclical model based on the assumption that the experience gained through the implementation process of any EBP will lead to new learning. This learning will be useful for building organizational capacity (i.e., resources, competencies, attitudes, coordination, etc.) and can later be generalized to start a new cycle of implementation [29]. This assumption suggests that being part of a structured implementation process will have a greater impact on practitioners’ attitudes, perceived training needs, and self-efficacy than mere exposure to an EBP training program. In any case, as pointed out by Weisz et al. [30], given all the efforts that have been put into EBP implementation processes, it is now important to focus on practitioners’ responses to them.

An investigation of this nature should take into account the organizational context in which the implementation of an EBP takes place. Although implementation models typically emphasize the role of both provider- and organization-level factors [31, 32], little is known regarding the interaction between these two levels of factors in the implementation process of EBPs [33, 34].

An organization’s capacity to implement an EBP involves multiple aspects such as administrative support, funding, the clarity of the agency’s mission and goals, staff and supervisor buy-in, staff cohesion, and the quality of clinical supervision [35,36,37]. Practitioners’ subjective appraisal of these organizational factors has been linked to their attitudes, perceived training needs and self-efficacy. For instance, Izmirian and Nakamura [38] found that practitioners in youth mental health services were more likely to have positive attitudes towards EBPs when they reported experiencing a less stressful work environment. Nurses’ attitudes towards EBPs were also found to improve following an organizational intervention that included mentoring by a nurse researcher and the provision of funding to attend conferences promoting the use of evidence-based practices [39]. Moreover, having a supervisor who promotes teamwork and cohesion has been linked to higher levels of self-efficacy, especially among practitioners with less than two years’ experience in their field [19]. Finally, employees’ level of trust in their organization (defined as positive expectations regarding organizational support, integrity and consistency) has been found to moderate the influence of self-efficacy on job satisfaction and task performance [40].

In light of these considerations, the present study aimed to assess changes in practitioners’ attitudes towards EBPs, perceived training needs, and self-efficacy for working with parents over a two-year EBP implementation process. Based on the aforementioned studies and the assumption of the QIF that any implementation process will generate new learning [29], we expected to see an improvement in attitudes, an increase in self-efficacy, and a decrease in perceived training needs over time. It was hypothesized that the level of change in all these variables would be moderated by the level of confidence in the organization’s capacity to implement the program at baseline (i.e., a higher subjective rating of organizational capacity would lead to a greater improvement in attitudes, a greater increase in self-efficacy and a greater decrease in perceived training needs).

Methods

Context

This study is part of a larger study evaluating the implementation of an EBP, the Triple P – Positive Parenting Program, in two health services catchment areas in the province of Quebec, Canada. Triple P entails a five-level integrated system of universal, selective, and indicated interventions whose intensity increases along with the needs of parents of 0–12 year-old children [41]. There is significant scientific evidence supporting Triple P’s efficacy for increasing positive parenting practices and reducing emotional and behavioral problems in children [42,43,44,45]. There is also some evidence that Triple P prevents child maltreatment [46, 47]. The present study focused on Selected Triple P (Level 2 – public seminars), Primary Care Triple P (Level 3 – individual coaching), Group Triple P (Level 4 – parent training), and Pathways Triple P (Level 5 – active skills training including cognitive reattribution components), delivered by trained practitioners [41]. Service delivery was supported by a promotional campaign (Level 1) developed locally [48, 49]. In each community implementing Triple P, a team of community partners carefully planned the implementation process [50]. These partners came from different sectors of activity (child care services, schools, non-governmental and governmental organizations). Managers in the partner organizations targeted practitioners to receive training in one or more levels of Triple P. To receive the proposed training, practitioners had to agree to participate in the study. Data were collected among trained practitioners through a pre-implementation survey (prior to Triple P training) and a post-implementation survey (1–2 years later). Meanwhile, the practitioners were expected to deliver the various components of Triple P and monitor their Triple P interventions on an ongoing basis, with the support of the research team. This procedure was approved by the relevant ethics research board.

Several means were put in place to ensure optimal implementation of Triple P in the communities. First of all, the implementation was carefully prepared in accordance with the QIF [29]. In particular, the needs and resources of the targeted communities were assessed, as well as their readiness to act in maltreatment prevention. In addition, the differentiation of Triple P from other parenting support programs in use in Quebec was established, in order to ensure possible linkages with other programs. Two local implementation coordinators, one from each of the communities, were hired to act as resource persons during all phases of implementation. Their role included mobilizing other partners in the field and acting as a bridge between the research team and the partners.

A local implementation committee was formed in each of the communities, bringing together regional and local partners, i.e., representatives of government authorities (e.g., public health department, youth protection department), the local implementation coordinator for the territory, as well as managers or representatives of partner organizations. The mandate of these implementation teams was to plan the concerted implementation of Triple P on their territory.

During the active implementation phase of the program, the local implementation coordinators were mandated to provide supervision, to help the practitioners while promoting their self-regulation, and to help refer parents to the level most suited to their needs. The managers were briefed on their role in supporting practitioners, which included informing the implementation team members of the needs of their staff, providing time and tools to practitioners to become efficient in delivering the program, and working in collaboration with the other organizations to share resources and knowledge. Finally, the research team established procedures to facilitate the work of practitioners, for example, by providing them with an electronic tablet that they could use to show parents intervention materials (Triple P videos and tip sheets, for example) and by encouraging them to document their interventions using specially designed computerized monitoring tools. While the research team was more involved in the planning and coordination of the initiative at the beginning of the project, it took on more of a coaching role over time so that community partners could take ownership of the initiative and develop their collective capacity for implementation on their own.

Participants

Participants were 115 practitioners (93% females) trained in at least one level of Triple P in fall 2014 (n = 94) or fall 2015 (n = 21). Of these, 99 completed the posttest (retention rate: 86%). Participants’ characteristics are presented in Table 1. Posttest completers and non-completers were similar with regard to all sociodemographic variables, except the number of years of experience working with families, with completers having significantly more experience (M = 14.04, SD = 9.41) than non-completers (M = 8.29, SD = 5.33), t (26.7) = − 3.35, p = .002.

Table 1 Sociodemographic Characteristics of Participants (N = 99)

Measures

Variables were assessed using four validated questionnaires completed at pretest and posttest. All measures were translated into French by the research team (except for the PCSC measure, which was translated by Triple P International) and contextualized to the implementation of Triple P when applicable. Since the members of the research team are bilingual and include native speakers of both French and English, the translation of the questionnaires was the result of collaborative and iterative work among them. The back-translation process recommended by some authors [51, 52] was not considered necessary in this context. Internal consistency was calculated for each questionnaire translated and used in the present study to ensure the validity of the measures. A sociodemographic questionnaire was included to collect background information on participants (sex, academic background, discipline, years of experience working with families and type of organization). All translated versions of the questionnaires used in this study are provided as supplementary files (see Additional file 1 for pretest questionnaires and Additional file 2 for posttest questionnaires).
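The internal-consistency checks reported for each translated questionnaire rely on Cronbach's alpha. As a minimal illustrative sketch (not the authors' actual analysis code, which used SPSS), alpha can be computed from a respondents-by-items score matrix as follows:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

For perfectly parallel items alpha equals 1; values in the .70–.95 range, like those reported for the scales below, are conventionally taken to indicate acceptable to excellent internal consistency.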

Attitudes towards EBPs

Participants’ attitudes towards EBPs were assessed using the Evidence-Based Practice Attitude Scale (EBPAS) [15]. This questionnaire comprises 15 items rated on a Likert-type scale (1 = not at all to 5 = to a very great extent) and divided into four subscales: Appeal (extent to which EBPs are intuitively appealing to the practitioner); Requirements (extent to which the practitioner would adopt an EBP if his/her supervisor required it); Openness (general receptivity to new practices); and Divergence (perceived divergence between EBPs and the current practice). With the exception of the Divergence subscale, higher scores indicate more favorable attitudes towards EBPs. Internal consistency was satisfactory in both Aarons’ [15] original validation study (Cronbach’s alphas for subscales = .80, .90, .78, .59, respectively) and the present study (.73, .93, .87 and .71).

Perceived training needs

This variable was assessed using the Training Needs subscale of the Organizational Readiness for Change measure (ORC) [53], comprising 8 Likert-type items, with subscale scores ranging from 10 (strongly disagree) to 50 (strongly agree). In the present study, the last item, relating to “using computerized client assessments,” was removed because it did not apply to the context. The remaining items assessed, for example, the extent to which practitioners felt they needed more training to increase client participation in treatment, monitor client progress or improve client thinking and problem-solving skills. This subscale, conceptualized as a measure of motivational readiness for change, demonstrated good internal consistency in both Lehman et al.’s study [53] (Cronbach’s α = .84) and the present study (α = .87).

Self-efficacy

The Parent Consultation Skills Checklist (PCSC) [5], translated into French by Triple P International, was used to assess the practitioners’ level of confidence in their skills for working with parents reporting difficulties with their children. This measure, developed by Turner and Sanders [54], is specifically tailored to levels 2, 3, 4, and 5 of the Triple P program. Items refer to both content self-efficacy (e.g., teaching positive parenting principles to parents) and process self-efficacy (e.g., installing and using the audiovisual equipment required for the session) [26], and are rated on a Likert-type scale (1 = not at all confident to 7 = very confident). This instrument showed good internal consistency in both Turner et al.’s study [5] (Cronbach’s α = .96 to .97 for the different program levels) and the present study (Cronbach’s α = .92, .96, .94 and .95, respectively). At pretest, the PCSC was completed just before training in each level of Triple P. When practitioners were trained in more than one level, only the score on their first completed pretest PCSC was used in the analyses. At posttest, practitioners completed a PCSC for each level of Triple P in which they had received training. A mean score for all the completed posttest PCSCs (ranging from 1 to 7) was computed and used in the analyses.

Perceived organizational capacity at pretest

This variable was assessed by computing an aggregated score for three subscales of the Factors Related to Program Implementation measure (FRPI) [36]: Ideal Agency, Ideal Staff, and Ideal Champion. This procedure was justified given the high correlation found between these three subscales (r ranging from .51 to .80). Practitioners rated 24 Likert-type items assessing the extent to which various characteristics of the agency, staff, and supervisor would be a barrier or an asset to the implementation of Triple P (1 = significant barrier to 5 = significant asset). FRPI items cover different agency characteristics (e.g., perceived coherence of Triple P with organizational mandate, perceived quality of program coordination), staff characteristics (e.g., perceived level of motivation and competence, and communication between team members), and supervisor characteristics (e.g., perceived level of motivation, competence, availability and support). The aggregated score showed good internal consistency in the present study (α = .85).

Procedures

Pretest surveys were completed a few days prior to Triple P training. Posttest surveys were sent to participants and collected in fall 2016, or earlier if the practitioner was going to be leaving the organization for any reason, such as maternity leave, prolonged sick leave or a change of assignment. To increase the response rate, follow-up calls were made to practitioners who did not return their questionnaire within the prescribed period.

Statistical analyses

Descriptive analyses of variable distributions revealed no problems related to the conditions of use of any of the planned analyses. A negligible amount of missing data was found for each dependent variable (3.5% on average). Consequently, procedures for handling missing data were deemed unnecessary [55]. Analyses were conducted using SPSS and SPSS macro PROCESS [56].

Using a bootstrapping method, six linear regressions were conducted to test the interaction effect of perceived organizational capacity on the level of change in the dependent variables over time. The six dependent variables were the levels of change in the four attitude subscales of the EBPAS (Appeal, Requirements, Openness and Divergence), the ORC Training Needs subscale and the PCSC Self-efficacy measure; the moderator was the global FRPI score; and the independent variable was time (pretest, posttest). Figure 1 illustrates the moderation model tested.

Fig. 1

Illustration of the moderation model tested. Note. X = Predictive variable, M = Moderator, Y = Predicted variable (i.e. Change in: Appeal of EBPs; Propensity to use an EBP if required by the supervisor; Openness to new practices; Perceived divergence between EBPs and the current practice; Perceived training needs related to working with parents; and Self-efficacy for working with parents)

The Johnson-Neyman procedure, probing interactions with continuous moderators, was performed to determine regions of significance of the interaction effect. This procedure indicates the value of the moderator (i.e., the specific score of the FRPI) at which the interaction effect becomes significant. The advantage of this method is that it provides a more complete picture of the interaction effect and does not require an arbitrary dichotomization of the moderating variable [57].
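As an illustration of the moderation model and the Johnson-Neyman-style probing described above, the following sketch uses simulated, hypothetical data (not the study's dataset; all variable names and generating values are assumptions) with Python's `statsmodels` rather than SPSS PROCESS. The conditional effect of time at moderator value m is b1 + b3·m, and its standard error follows from the coefficient covariance matrix:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per practitioner per wave.
rng = np.random.default_rng(42)
n = 99
capacity = rng.uniform(1.8, 5.0, n)   # stand-in for the FRPI moderator score
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), 2),
    "time": np.tile([0, 1], n),       # 0 = pretest, 1 = posttest
    "capacity": np.repeat(capacity, 2),
})
# Simulated outcome: improvement over time is larger when capacity is low.
df["appeal"] = (3.5 + 0.2 * df["capacity"]
                + df["time"] * (1.2 - 0.25 * df["capacity"])
                + rng.normal(0, 0.4, 2 * n))

# Moderation model: Y = b0 + b1*time + b2*capacity + b3*time*capacity
model = smf.ols("appeal ~ time * capacity", data=df).fit()
b = model.params
V = model.cov_params()

# Johnson-Neyman-style probing: conditional effect of time at moderator
# value m is b1 + b3*m, with variance
#   var(b1) + m^2 * var(b3) + 2 * m * cov(b1, b3).
for m in np.linspace(1.8, 5.0, 9):
    effect = b["time"] + b["time:capacity"] * m
    se = np.sqrt(V.loc["time", "time"]
                 + m ** 2 * V.loc["time:capacity", "time:capacity"]
                 + 2 * m * V.loc["time", "time:capacity"])
    print(f"capacity={m:.2f}  effect={effect:.2f}  t={effect / se:.2f}")
```

Scanning the moderator range for the values at which |t| crosses the critical threshold recovers the region of significance without dichotomizing the moderator, which is the advantage noted above.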

Regression analyses were conducted to test for the presence of a time effect when the interaction effect was not significant. These analyses controlled for participant characteristics (e.g., level of education, prior experience, type of organization) previously associated in the literature with participants’ attitudes, training needs or self-efficacy [12, 58, 59]. The analyses also controlled for the length of time between pretest and posttest, since it varied between one and two years across participants.

Preliminary analyses involving sociodemographic data showed that only two control variables were significant predictors in some tested models: practitioners’ prior experience working with families (in number of years) and community membership (i.e., working in one health catchment area or the other). Including these variables as covariates did not change the direction, magnitude or significance of the results. More parsimonious models excluding these covariates are thus presented below.

Results

Average scores for practitioners’ attitudes, perceived training needs and self-efficacy at pretest and posttest are presented in Table 2. The average score for perceived organizational capacity, measured at pretest, was 4.12 (SD = .73).

Table 2 Scores for Practitioners’ Attitudes, Perceived Training Needs and Self-Efficacy

As indicated in Table 3, a significant interaction effect of time × perceived organizational capacity was found for Appeal of EBPs, Openness to new practices and Perceived training needs. No significant interaction effect of time × perceived organizational capacity was found for Self-efficacy. However, a main effect of time was observed, with practitioners’ level of self-efficacy for working with parents significantly increasing from pretest to posttest, F (1, 201) = 49.83, p < .001, R2 = .20, b = .87, t (201) = 7.06, p < .001. The effect size was moderate, with a standardized beta coefficient of .45. No significant interaction or time effects were found for Requirements (propensity to use an EBP if required by the supervisor) or Divergence (perceived divergence between EBPs and the current practice).

Table 3 Interaction Effect of Perceived Organizational Capacity on the Level of Change in Practitioners’ Attitudes, Perceived Training Needs and Self-Efficacy, and Time Effect

As illustrated in Fig. 2, the magnitude of the positive change in the Appeal of EBPs increased when practitioners’ rating of Organizational capacity was lower at pretest. The Johnson-Neyman technique revealed that the interaction effect was significant when the Organizational capacity score was between 1.80 and 3.67, that is, when practitioners tended to perceive more barriers than assets regarding their organization’s capacity to implement Triple P. The effect size varied from large to moderate in this zone of significance, with standardized beta coefficients ranging from 1.35 (FRPI = 1.80) to .32 (FRPI = 3.67). The same pattern was observed for Openness to new practices. Overall, the effect of time on Openness was stronger when practitioners’ rating of Organizational capacity was low. The interaction effect was significant when the FRPI global score was between 1.80 and 3.06, with standardized beta coefficients ranging from 1.03 (FRPI = 1.80) to .47 (FRPI = 3.06). Also, a greater decrease in Perceived training needs was observed when practitioners tended to perceive more assets than barriers regarding their organization’s capacity to implement Triple P. Specifically, the Johnson-Neyman technique revealed that the time × organizational capacity interaction effect was significant when the Organizational capacity score was between 3.83 and 5.00. The effect size varied from moderate to large in this zone of significance, with standardized beta coefficients ranging from −.31 (FRPI = 3.83) to −.91 (FRPI = 5.00).

Fig. 2

Moderation of Variables A, B and C Over Time by Perceived Organizational Capacity. Note. *The moderation effect for this value is in the zone of significance (p < .05)

Discussion

This study aimed to assess changes over time in the attitudes, perceived training needs and self-efficacy of practitioners involved in the implementation of an EBP, namely, the Triple P program. Results suggest that even before being trained in Triple P, the practitioners as a group showed favorable attitudes towards EBPs and felt quite confident in their ability to deliver the program components. However, they expressed a moderate need for training related to working with parents. The less confident the practitioners felt regarding their organization’s capacity to implement Triple P at pretest, the greater the extent to which the appeal of EBPs and the practitioners’ openness to new practices increased over the course of implementing this program. Moreover, a higher level of initial confidence regarding their organization’s capacity to implement Triple P was associated with a greater decrease in perceived training needs over time. A moderate increase in self-efficacy over time was seen for all practitioners, regardless of their initial perception of organizational capacity.

These favorable changes in the practitioners’ attitudes, perceived training needs and self-efficacy could reflect the considerable effort made by the local coalitions to ensure a high-quality implementation process [28, 57]. In support of this idea, this study appears to be among the only ones to find positive changes over time in practitioners’ attitudes towards EBPs and perceived training needs. Contrary to the present study, no structured implementation strategy was put in place to support practitioners following their EBP training in the other studies [23, 24], and their authors suggested that this may be why no change in practitioners’ perceptions was observed over time. Moreover, in the present study, no decrease in attitudes or self-efficacy and no increase in perceived training needs were reported. These findings partially support this study’s hypothesis that positive changes in all variables would be observed, based on the assumption of the QIF that every cycle of implementation fosters learning and improvements that can later be used to start a new cycle [29]. However, no changes were observed in two of the attitude variables, and some of the positive changes were moderated by the practitioners’ initial perception of their organization’s capacity to implement Triple P.

Regarding attitudes towards EBPs, the results indicate that the changes in the appeal of EBPs and in practitioners’ openness to new practices were moderated by perceived organizational capacity, but not in the expected direction. It was hypothesized that a more favorable perception of organizational capacity would predict greater improvements in attitudes, based on previous studies reporting positive associations between attitudes and organizational factors [38, 39]. Instead, this study showed that perceiving more barriers than assets to the implementation of Triple P predicted a greater improvement in practitioners’ attitudes. These findings bring out nuances in the suggestion, emerging from the implementation literature, that initial staff buy-in should be obtained before engaging in any implementation process [15, 57]. For instance, having found that school counselors achieved better implementation outcomes when they displayed initial characteristics such as not being cynical and not being limited by excessive managerial control, Lochman et al. [34] emphasized the need to carefully screen the staff to be trained before beginning the implementation of a program. While some minimal staff buy-in at the outset is likely necessary for an EBP to be offered at all, the results of the present study show that such buy-in may not need to be very high or uniform among practitioners. Indeed, in the present study, an initially critical or neutral stance regarding the organization’s capacity to implement an EBP was associated with greater positive changes in both the appeal of EBPs and the practitioners’ openness to new practices over time. As demonstrated by Leathers et al. [23], such an improvement in attitudes could lead to higher engagement in support activities following training (e.g., seeking consultation with a mentor), which could in turn improve implementation and program outcomes.

On the other hand, the results pertaining to training needs confirmed the initial hypothesis of this study, with perceived training needs decreasing over time, especially among practitioners who initially displayed more optimism regarding their organization’s capacity to implement Triple P. It is possible that, having greater confidence in the successful implementation of the program, these practitioners were able to engage more actively in the training and subsequent clinical supervision provided. They may thus have drawn greater benefit from their participation in the implementation process, as reflected in a decrease in their perceived training needs.

In this study, practitioners’ confidence in their skills for delivering Triple P interventions increased moderately over time, and the extent of this change was not moderated by perceived organizational capacity. Many studies have shown that active training, such as that provided for Triple P, tends to increase practitioners’ self-efficacy, an effect that can be maintained over time [5, 60]. The findings of the present study raise the following question: Did the increase observed after two years simply reflect the sustained effect of the initial training or, as hypothesized, could it also have been due to the practitioners’ experience of being involved in a structured and supportive implementation process? Given that the practitioners’ initial appraisal of organizational capacity did not moderate changes in their level of self-efficacy over time, it is possible that the observed changes in self-efficacy were moderated or mediated by other factors which came into play during or after the initial training, such as increased practice with the program or higher perceived benefits for parents. Turner et al. [5] observed, for example, that post-training self-efficacy predicted the level of program use six months later. Without testing their hypothesis, these authors proposed that successful use of the program with clients would, in turn, likely increase practitioners’ confidence in their skills and motivate them to use the program again, leading to a positive feedback cycle between self-efficacy and level of use. Moreover, Shapiro, Prinz and Sanders [21] highlighted the role of “early wins” (i.e., early experiences of success with the program and related positive impact on families) in the later experience of providers who reported sustained use of the program over a number of years.
It is possible that experiencing such “early wins” builds practitioners’ self-efficacy, assuming that practitioners’ skills for delivering the program, using an effective balance of flexibility and fidelity [61], actually begin to improve [5, 20]. The fact that practitioners’ self-efficacy improved over time independently of their perception of organizational capacity could also mean that changes in self-efficacy are linked more to individual factors than to contextual and organizational factors. Indeed, Mazzucchelli and Ralph [22] conceptualize self-efficacy as a component of self-regulatory skills, which tend to vary from one practitioner to another even within a similar work context. These authors do not overlook the influence of external factors, however, placing particular emphasis on how the self-regulatory skills of stakeholders can be strengthened by different interventions. For example, they recommend that practitioners receive EBP training from a trainer who helps them self-evaluate and self-reinforce, that they receive peer supervision once they begin using the EBP, and that they be able to monitor their results with their clients. All of these elements were generally put in place by the local coalitions in the current implementation initiative and could therefore have helped improve practitioners’ self-efficacy over time, as seen in this study’s results.

Regardless of the reason why these positive changes in self-efficacy occurred, these results are of clinical importance, since higher self-efficacy is associated with better actual skills in performing the task and greater resilience to stressors [18, 19]. Therefore, a higher mastery of the skills needed to deliver the program components and greater persistence through the numerous challenges of implementation should lead to better outcomes for families [19, 21].

This study has some limitations. First, the absence of a comparison group makes it impossible to determine whether the changes observed among the practitioners over time were actually due to their participation in the implementation process, or to other factors such as the simple passage of time or mere exposure to Triple P training. An experimental design randomizing practitioners into experimental and control groups would also have eliminated the threat of statistical regression to the mean that can occur in a pretest/posttest design [62]. Second, the results for attitudes towards EBPs were limited by the instrument chosen: the possibility that perceived divergence decreased with regard to Triple P specifically could not be captured, since the EBPAS items focus on manual-based programs in general [14, 15, 63]. Third, the implementation portrait provided in this study may predominantly reflect the views of more experienced practitioners, since less experienced practitioners were underrepresented in the sample. However, the results were unaffected when the amount of previous experience was included as a control variable in the analyses, which raises confidence that this limitation does not threaten the validity of the present findings. Fourth, the French translations of the questionnaires used in this study were not subjected to a full adaptation process in accordance with the guidelines of the International Test Commission (ITC) [64], which recommend in particular that a translated instrument be tested to ensure its validity with the new population. In this study, the only analyses conducted on the translated questionnaires verified their internal consistency.

Conclusion

At a time when the scientific community and decision-makers are promoting the use of practices based on the best available evidence, it is interesting to see that the practitioners who participated in the present study generally displayed confidence and enthusiasm at the beginning of the implementation of this EBP. It is even more encouraging to observe that being involved in this process appears to have positively influenced the perceptions of practitioners who were less confident at the outset. These findings support the idea that the efforts invested in the implementation process of EBPs by communities wishing to adopt such programs are worthwhile, even when some staff members initially appear less inclined to engage in the initiative.

Availability of data and materials

The data that support the findings of this study are available from Prof. Marie-Hélène Gagné but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. Data are however available from the authors upon reasonable request and with permission of Prof. Marie-Hélène Gagné, who can be reached by e-mail: marie-helene.gagne@psy.ulaval.ca.

Abbreviations

EBP: Evidence-based program

EBPAS: Evidence-Based Practice Attitude Scale

FRPI: Factors Related to Program Implementation

ORC: Organizational Readiness for Change

PCSC: Parent Consultation Skills Checklist

Triple P: Positive Parenting Program

QIF: Quality Implementation Framework

References

  1. Fixsen DL, Naoom S, Blase K, Wallace F. Implementation: the missing link between research and practice. APSAC Advis. 2007;19(1–2):4–11.
  2. Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41(3–4):327–50.
  3. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65–76.
  4. Beidas RS, Mychailyszyn MP, Edmunds JM, Khanna MS, Downey MM, Kendall PC. Training school mental health providers to deliver cognitive-behavioral therapy. School Ment Health. 2012;4(4):197–206.
  5. Turner KMT, Nicholson JM, Sanders MR. The role of practitioner self-efficacy, training, program and workplace factors on the implementation of an evidence-based parenting intervention in primary care. J Prim Prev. 2011;32(2):95–112.
  6. Gagné M-H, Drapeau S, Saint-Jacques M-C. Qu’est-ce qui fonctionne pour prévenir la maltraitance? In: Gagné M-H, Drapeau S, Saint-Jacques M-C, editors. Les enfants maltraités : de l’affliction à l’espoir. Québec, Canada: Presses de l’Université Laval; 2012. p. 9–40.
  7. Johnson M, Austin MJ. Evidence-based practice in the social services. Adm Soc Work. 2006;30(3):75–104.
  8. Bouckenooghe D. Positioning change recipients’ attitudes toward change in the organizational change literature. J Appl Behav Sci. 2010;46(4):500–31.
  9. Allport GW. Attitudes. In: Murchison C, editor. Handbook of social psychology. Worcester, MA: Clark University Press; 1935. p. 798–844.
  10. Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211.
  11. Reding MEJ, Chorpita BF, Lau AS, Innes-Gomberg D. Providers’ attitudes toward evidence-based practices: is it just about providers, or do practices matter, too? Adm Policy Ment Health Ment Health Serv Res. 2014;41(6):767–76.
  12. Aarons GA, Glisson C, Hoagwood K, Kelleher K, Landsverk J, Cafri G. Psychometric properties and U.S. national norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess. 2010;22(2):356–65.
  13. Stirman SW, Gutner CA, Langdon K, Graham JR. Bridging the gap between research and practice in mental health service settings: an overview of developments in implementation theory and research. Behav Ther. 2016;47(6):920–36.
  14. Lim A, Nakamura BJ, Higa-McMillan CK, Shimabukuro S, Slavin L. Effects of workshop trainings on evidence-based practice knowledge and attitudes among youth community mental health providers. Behav Res Ther. 2012;50(6):397–406.
  15. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74.
  16. Simpson DD. A conceptual framework for transferring research to practice. J Subst Abuse Treat. 2002;22(4):171–82.
  17. Henggeler SW, Chapman JE, Rowland MD, Halliday-Boykins CA, Randall J, Shackelford J, et al. Statewide adoption and initial implementation of contingency management for substance-abusing adolescents. J Consult Clin Psychol. 2008;76(4):556–67.
  18. Bandura A. Self-efficacy: the exercise of control. New York, NY: W. H. Freeman & Co.; 1997.
  19. Collins-Camargo C, Royse D. A study of the relationships among effective supervision, organizational culture promoting evidence-based practice, and worker self-efficacy in public child welfare. J Public Child Welf. 2010;4(1):1–24.
  20. Gist ME, Mitchell TR. Self-efficacy: a theoretical analysis of its determinants and malleability. Acad Manag Rev. 1992;17(2):183–211.
  21. Shapiro CJ, Prinz RJ, Sanders MR. Sustaining use of an evidence-based parenting intervention: practitioner perspectives. J Child Fam Stud. 2015;24(6):1615–24. https://doi.org/10.1007/s10826-014-9965-9.
  22. Mazzucchelli TG, Ralph A. Self-regulation approach to training child and family practitioners. Clin Child Fam Psychol Rev. 2019;22:129–45. https://doi.org/10.1007/s10567-019-00284-2.
  23. Leathers SJ, Melka-Kaffer C, Spielfogel JE, Atkins MS. Use of evidence-based interventions in child welfare: do attitudes matter? Child Youth Serv Rev. 2016;70:375–82. https://doi.org/10.1016/j.childyouth.2016.10.022.
  24. Simpson DD, Joe GW, Rowan-Szal GA. Linking the elements of change: program and client responses to innovation. J Subst Abuse Treat. 2007;33(2):201–9.
  25. Bohman B, Ghaderi A, Rasmussen F. Training in methods of preventing childhood obesity increases self-efficacy in nurses in child health services: a randomized, controlled trial. J Nutr Educ Behav. 2014;46(3):215–8. https://doi.org/10.1016/j.jneb.2013.10.006.
  26. Sethi S, Kerns SEU, Sanders MR, Ralph A. The international dissemination of evidence-based parenting interventions: impact on practitioner content and process self-efficacy. Int J Ment Health Promot. 2014;16(2):126–37.
  27. Daro D, Budde S, Baker S, Nesmith A, Harden A. Creating community responsibility for child protection: findings and implications from the evaluation of the community partnerships for protecting children initiative. Chicago, IL; 2005.
  28. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50. https://doi.org/10.1016/j.amepre.2012.05.024.
  29. Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. Am J Community Psychol. 2012;50(3–4):462–80.
  30. Weisz JR, Ng MY, Bearman SK. Odd couple? Reenvisioning the relation between science and practice in the dissemination-implementation era. Clin Psychol Sci. 2014;2(1):58–74.
  31. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. https://doi.org/10.1111/j.0887-378X.2004.00325.x.
  32. Aldridge WA II, Boothroyd RI, Veazey CA, Powell BJ, Murray DW, Prinz RJ. Ensuring active implementation support for North Carolina counties scaling the Triple P system of interventions. Chapel Hill, NC: Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill; 2016. https://eric.ed.gov/?id=ED588821.
  33. Aarons GA. Measuring provider attitudes toward evidence-based practice: consideration of organizational context and individual differences. Child Adolesc Psychiatr Clin N Am. 2005;14(2):255–71.
  34. Lochman JE, Powell NP, Boxmeyer CL, Wells KC, Windle M. Implementation of a school-based prevention program: effects of counselor and school characteristics. Prof Psychol Res Pract. 2009;40(5):476–82.
  35. Crisp BR, Swerissen H, Duckett J. Four approaches to capacity building in health: consequences for measurement and accountability. Health Promot Int. 2000;15(2):99–107.
  36. Flaspohler P, Duffy J, Wandersman A, Stillman L, Maras MA. Unpacking prevention capacity: an intersection of research-to-practice models and community-centered models. Am J Community Psychol. 2008;41(3–4):182–96.
  37. Mihalic SF, Irwin K. Blueprints for violence prevention: from research to real-world settings—factors influencing the successful replication of model programs. Youth Violence Juv Justice. 2003;1(4):307–29.
  38. Izmirian SC, Nakamura BJ. Knowledge, attitudes, social desirability, and organizational characteristics in youth mental health services. J Behav Health Serv Res. 2015;43(4):630–47. https://doi.org/10.1007/s11414-015-9491-6.
  39. Munroe D, Duffy P, Fisher C. Nurse knowledge, skills, and attitudes related to evidence-based practice: before and after organizational supports. Medsurg Nurs. 2008;17(1):55–60.
  40. Ozyilmaz A, Erdogan B, Karaeminogullari A. Trust in organization as a moderator of the relationship between self-efficacy and workplace outcomes: a social cognitive theory-based examination. J Occup Organ Psychol. 2018;91(1):181–204.
  41. Sanders MR. Triple P-Positive Parenting Program as a public health approach to strengthening parenting. J Fam Psychol. 2008;22(3):506–17.
  42. de Graaf I, Speetjens P, Smit F, de Wolff M, Tavecchio L. Effectiveness of the Triple P Positive Parenting Program on parenting: a meta-analysis. Fam Relat. 2008;57:553–66.
  43. de Graaf I, Speetjens P, Smit F, de Wolff M, Tavecchio L. Effectiveness of the Triple P Positive Parenting Program on behavioral problems in children: a meta-analysis. Behav Modif. 2008;32(5):714–35.
  44. Nowak C, Heinrichs N. A comprehensive meta-analysis of Triple P-Positive Parenting Program using hierarchical linear modeling: effectiveness and moderating variables. Clin Child Fam Psychol Rev. 2008;11(3):114–44.
  45. Sanders MR, Kirby JN, Tellegen CL, Day JJ. The Triple P-Positive Parenting Program: a systematic review and meta-analysis of a multi-level system of parenting support. Clin Psychol Rev. 2014;34(4):337–57. https://doi.org/10.1016/j.cpr.2014.04.003.
  46. Prinz RJ, Sanders MR, Shapiro CJ, Whitaker DJ, Lutzker JR. Population-based prevention of child maltreatment: the U.S. Triple P system population trial. Prev Sci. 2009;10(1):1–12.
  47. Prinz RJ, Sanders MR, Shapiro CJ, Whitaker DJ, Lutzker JR. Addendum to “Population-based prevention of child maltreatment: the U.S. Triple P system population trial”. Prev Sci. 2016;17(3):410–6.
  48. Charest E, Gagné M-H, Goulet J. Development and pretest of key visual imagery in a campaign for the prevention of child maltreatment. Glob Health Promot. 2017;0(0):1–9.
  49. Gagné MH, Bélanger-Gravel A, Clément MÈ, Poissant J. Recall and understanding of a communication campaign designed to promote positive parenting and prevent child maltreatment. Prev Med Rep. 2018;12:191–7.
  50. Delawarde C, Gagné M-H, Brunson L, Drapeau S. Implementing a multilevel prevention strategy under an intersectorial partnership: the case of the Triple P program. Child Youth Serv Rev. 2018;88:170–9.
  51. Brislin RW. Back-translation for cross-cultural research. J Cross-Cult Psychol. 1970;1(3):185–216.
  52. Vallerand RJ. Vers une méthodologie de validation trans-culturelle de questionnaires psychologiques: implications pour la recherche en langue française [Toward a methodology for the cross-cultural validation of psychological questionnaires: implications for French-language research]. Can Psychol. 1989;30(4):662–80.
  53. Lehman WEK, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abuse Treat. 2002;22(4):197–209.
  54. Turner KMT, Sanders MR. Parent consultation skills checklist. Brisbane, QLD, Australia: Parenting and Family Support Centre, The University of Queensland; 1996.
  55. Tabachnick BG, Fidell LS. Cleaning up your act: screening data prior to analysis. In: Using multivariate statistics. 5th ed. Needham Heights, MA: Allyn & Bacon/Pearson Education; 2007.
  56. Hayes AF. Introduction to mediation, moderation, and conditional process analysis: a regression-based approach. New York, NY: The Guilford Press; 2013.
  57. Montoya AK. Moderation analysis in two-instance repeated measures designs: probing methods and multiple moderator models. Behav Res Methods. 2019;51:61–82. https://doi.org/10.3758/s13428-018-1088-6.
  58. Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol Sci Pract. 2010;17(1):1–30.
  59. Bowles S, Louw J, Myers B. Perceptions of organizational functioning in substance abuse treatment facilities in South Africa. Int J Ment Health Addict. 2011;9(3):308–19.
  60. Gioia D. Using an organizational change model to qualitatively understand practitioner adoption of evidence-based practice in community mental health. Best Pract Ment Health. 2007;3(1):1–15.
  61. Mazzucchelli TG, Sanders MR. Facilitating practitioner flexibility within an empirically supported intervention: lessons from a system of parenting support. Clin Psychol Sci Pract. 2010;17(3):238–52.
  62. Kazdin AE. Drawing valid inferences I: internal and external validity. In: Kazdin AE, editor. Research design in clinical psychology. 4th ed. Boston, MA: Allyn & Bacon; 2002. p. 22–54.
  63. Borntrager C, Chorpita B, Higa-McMillan C, Weisz J. Provider attitudes toward evidence-based practices: are the concerns with the evidence or with the manuals? Psychiatr Serv. 2009;60(5):677–81. https://doi.org/10.1176/appi.ps.60.5.677.
  64. International Test Commission (ITC). ITC guidelines for translating and adapting tests (second edition). Int J Test. 2018;18:101–34.


Acknowledgments

We would like to thank Sylvie Drapeau and Marie-Claude Richard (advisory committee members), Hélène Paradis and Bei Feng (statistical consultants), Nguyen & Murray associées (language review) and the implementation partners (who made data collection possible) for their contribution to this study.

Funding

The Triple P implementation project and associated research projects were made possible by a grant from the Social Sciences and Humanities Research Council of Canada (SSHRC) (895–2011-1016). The SSHRC is a federal research funding agency that promotes and supports postsecondary-based research and training in the humanities and social sciences. The funds were used to cover the costs of the research project, including various costs of implementing the Triple P program (e.g., financing Triple P training, purchasing equipment, hiring research assistants). SSHRC grants and fellowships are awarded through independent, national, merit review processes. During her graduate studies, the first author received scholarships from the Fonds de recherche du Québec – Société et culture (FRQSC), the Social Sciences and Humanities Research Council (SSHRC), the Chaire de partenariat en prévention de la maltraitance and the Centre jeunesse de Québec – Institut universitaire (CJQ-IU), CIUSSS-CN. These scholarships were intended to financially support the first author in her doctoral studies, and the amounts received were not used to defray costs related to the research project.

Author information


Contributions

The first author (MKC) contributed to data collection, conducted the statistical analyses of the data and was the major contributor to this manuscript. The second author (MHG) is the first author’s research director. She is responsible for the whole research project (i.e., implementation of Triple P in two communities in Quebec, Canada). She developed the concept of the study, contributed to data collection by mobilizing partners, reviewed the manuscript and provided supervision at each step of the project. All authors have read and approved the manuscript.

Corresponding author

Correspondence to Marie-Kim Côté.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the research ethics board of the Centre jeunesse de Québec – Institut universitaire du CIUSSS-Capitale Nationale (April 15, 2015, MP-CJQ-IU-13-017). Written informed consent was obtained from all participants. At the time of the study, all questionnaires used were available free of charge. Their use and translation for research purposes did not require licensing or authorization.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Questionnaire de pré-implantation/Pretest questionnaire. Description of data: set of questionnaires completed by participants at pretest, including a sociodemographic questionnaire and the French translations of the EBPAS, ORC and FRPI questionnaires. The French translation of the PCSC questionnaire was provided to participants by Triple P International before the Triple P training sessions and is therefore not included in Additional file 1; it is, however, included in Additional file 2.

Additional file 2.

Questionnaire de fin de participation/Posttest questionnaire. Description of data: set of questionnaires completed by participants at posttest, including the French translations of the EBPAS, ORC, FRPI and PCSC questionnaires.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Côté, MK., Gagné, MH. Changes in practitioners’ attitudes, perceived training needs and self-efficacy over the implementation process of an evidence-based parenting program. BMC Health Serv Res 20, 1092 (2020). https://doi.org/10.1186/s12913-020-05939-3
