Published in: Mindfulness 2/2023

Open Access 18-10-2022 | ORIGINAL PAPER

Implementation Reporting Recommendations for School-Based Mindfulness Programs

Authors: Rebecca N. Baelen, Laura F. Gould, Joshua C. Felver, Deborah L. Schussler, Mark T. Greenberg


Abstract

Objectives

Research on school-based mindfulness programs (SBMPs) indicates promising, albeit mixed, effects. However, there has been a lack of consistency and completeness in implementation reporting, frustrating efforts to draw causal inferences about the implementation elements that influence program outcomes. To address these issues, we crafted a conceptual framework with an accompanying set of key terms for SBMP implementation elements to guide the development of flexible and practical implementation reporting recommendations for studies of SBMPs.

Methods

To develop the framework and recommendations, we drew insights from the implementation science and school-based prevention literature, explored reporting standards across behavioral science fields, and examined reviews and studies of SBMPs that had an implementation focus.

Results

The SBMP Implementation Framework (SBMP-IF) is organized by four broad categories (i.e., the program, participants, context, and implementation), which inform the reporting recommendations. The recommendations nudge researchers toward more complete and consistent reporting of school contextual factors, participant characteristics and responsiveness, and teacher training/competence. They also encourage researchers to explicitly identify and incorporate into their theories of change and measurement strategies the Hypothesized and/or Validated Core Components of the program, as well as the key elements of the Implementation Support System. Finally, the recommendations urge researchers to define and operationalize mindfulness in their theories of change and consider child development when implementing and studying SBMPs.

Conclusions

The recommendations offered are novel for the field of SBMPs and represent a bold effort to strengthen the evidence base and help discern for whom SBMPs work best and under which conditions.
Notes
The original online version of this article was revised to update the article title of reference Roeser et al., 2022b.
A correction to this article is available online at https://doi.org/10.1007/s12671-022-02040-0.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
School-based mindfulness programs (SBMPs)—programs implemented within the school setting (PreK-12) with a central feature of mindfulness and/or contemplative principles and practices—have grown in popularity in the past 15 years, as has the evidence base (Roeser et al., 2022a; Zenner et al., 2014). The emerging research on these programs indicates that SBMPs produce promising, albeit mixed, effects on student outcomes (Emerson et al., 2020; Felver et al., 2016; Phan et al., 2022; Roeser et al., 2022a). Roeser et al. (2022a) found that SBMPs generate salutary effects for student mindfulness and self-regulatory skills, as well as help to reduce anxiety and depression, support physical health, and bolster engagement in healthy relationships. However, much of this research is lacking in scientific rigor, thereby diminishing the conclusions that can be drawn (Felver et al., 2016; Greenberg & Harris, 2012). In addition, reviews of SBMPs and mindfulness programming with youth indicate a lack of consistency in implementation reporting and a failure to connect implementation to outcomes (Emerson et al., 2020; Felver et al., 2016; Gould et al., 2016; Roeser et al., 2022a).
To date, there exists no guidance on implementation reporting for the field of SBMP research, which may contribute to the lack of rigor in the existing evidence base. Greater attention to, and reporting of, program implementation is an essential next step for strengthening the evidence base on SBMPs. The inconsistency, lack of detail, and under-reporting of SBMP implementation elements frustrate efforts to confidently draw causal inferences about the full array of elements that may impact program outcomes, as well as efforts to discern for whom and under what conditions SBMPs produce outcomes (Emerson et al., 2020; Gould et al., 2016). Ultimately, insufficient implementation reporting impedes replication and the ability to use findings to inform future research, policy, and practice (e.g., Roeser et al., 2022a).
Delivering programming in complex social settings, such as schools, requires researchers to pay even greater attention to implementation reporting and the quality of implementation than in controlled clinical settings. Schools are dynamic social environments where myriad contextual and participant characteristics interact to influence how a program is implemented and the outcomes that are generated (see Roeser et al., 2022b). For instance, research on Social-Emotional Learning (SEL) programs in school settings has focused on and documented the importance of implementation quality for student outcomes. SEL programs implemented with high quality have been found to produce mean effect sizes at least two to three times higher than those implemented with low quality, a difference that was even larger when multiple implementation elements were assessed (Durlak & DuPre, 2008; Kutash et al., 2012).
Implementation is a broad, multi-dimensional term—generally defined as “what a program consists of when it is delivered in a particular setting” (Durlak & DuPre, 2008, p. 329). Implementation fidelity refers more specifically to the degree to which an intervention is conducted as it was originally intended (Durlak & DuPre, 2008). Different terms have been used to refer to implementation fidelity (e.g., adherence, compliance, and/or program integrity; Dane & Schneider, 1998), each with its own unique conceptualization. Within the education literature, implementation fidelity is understood more broadly as the degree to which program delivery adheres to the intervention developers’ model, and is made up of the sub-dimensions of quality, adherence, dosage, and uptake (Gould et al., 2016). With so many terms and conceptualizations in use, it can be difficult to understand what is meant by implementation or implementation fidelity.
Therefore, as a starting point for improving implementation reporting in studies of SBMPs, we first present a guiding conceptual framework of SBMP implementation elements and define a set of key terms to promote greater clarity and consistency in implementation reporting. The framework incorporates multidisciplinary conceptualizations from the fields of implementation science, contemplative science, and education research. Second, we provide a set of implementation reporting recommendations for SBMPs based on the conceptual framework. The goal of these recommendations is to improve the transparency, replicability, consistency, and quality of implementation data for SBMPs, which can help to build a more rigorous evidence base for the field of SBMP research. Consistent and complete approaches to implementation measurement, analysis, and reporting will allow for high-quality synthesis-based research (e.g., systematic reviews and meta-analyses). In short, such data can help SBMP researchers, practitioners, and policy makers better ascertain what works, for whom, and under what conditions, both within and across studies.
These recommendations also encourage researchers to report a range of implementation elements to assist in the identification of Core Components (CCs) that drive and shape outcomes of SBMPs. Core Components are the parts, features, attributes, or characteristics of a program and its implementation that have been empirically shown to influence the program’s outcomes when implemented effectively (Dymnicki et al., 2020; Ferber, et al., 2019). Because little is known about which SBMP implementation elements are the CCs of SBMPs, we recommend that researchers identify the elements that are “hypothesized” to influence program outcomes but have yet to be empirically validated—we refer to these as Hypothesized Core Components (Hypothesized CCs). Reporting on Hypothesized CCs allows for empirical examination (and possible validation) of these elements across studies. In other words, measuring and reporting a diverse and targeted array of Hypothesized CCs may move the field of SBMPs toward identifying Validated CCs that are in line with identified programmatic competencies (see Felver et al., 2022).
Based upon the rationale provided by the creators of the Journal Article Reporting Standards (JARS; APA Publications and Communications Board Working Group on Journal Article Reporting Standards, 2008), we use the term recommendations rather than standards or requirements. The term recommendations was selected at this time to promote inclusion of the diverse disciplines and methodological approaches involved in SBMP research. Standards and requirements imply an established authority and a degree of consensus among researchers that does not yet exist in the field of SBMP research. In line with the JARS Group’s logic, we consider the proposed recommendations for SBMP implementation “as a beginning effort at developing standards” (APA Publications and Communications Board Working Group on Journal Article Reporting Standards, 2008, p. 7). The recommendations offered will require consensus building, testing, and validation to be finalized into a set of standards, and potentially requirements, in the future. Because researchers study SBMPs from a variety of disciplines and methodological approaches, these recommendations are intended to be flexible enough to meet the various disciplinary needs involved and to be adapted to suit diverse study designs and aims. To support this intention, the recommendations are accompanied by a set of suggestions for specific study designs and aims (e.g., additional guidance on qualitative or sub-group analyses).
Our set of recommendations is derived from a review of implementation standards used in research with youth and in complex social settings, as well as an examination of standards developed for research in healthcare, clinical, and educational settings. In the past two decades, scientists have endorsed the use of rigorous reporting standards for disseminating the results of clinical trials. One of the earliest efforts to create such reporting standards came from the Consolidated Standards of Reporting Trials (CONSORT; Altman et al., 2001; Moher et al., 2001), which provided guidance on reporting methods and outcomes for randomized controlled trials. The CONSORT guidelines have been extremely influential and are commonly required by top-tier journals. Other, more recent efforts have sought to create reporting standards for a broader range of study designs. For instance, Hoffmann et al. (2014) developed the Template for Intervention Description and Replication (TIDieR)—a checklist that provides more generic and comprehensive reporting guidelines for an array of study designs. Similarly, other groups in the field of education have developed broadly applicable reporting standards (e.g., Standards for Reporting on Empirical Social Science Research in AERA Publications; American Educational Research Association, 2006).
Taken as a whole, these reporting standards are intended to create accurate, transparent, consistent, and complete records of study procedures, which allows for readers to make informed decisions about a study’s scientific merits, supports the building of a rigorous evidence base, facilitates comparisons across studies, and aids practitioners in their implementation of new programs (Montrosse-Moorhead & Griffith, 2017). Requiring reporting standards also nudges researchers toward considering and reporting on study attributes that they may otherwise ignore (e.g., participant characteristics or contextual factors)—opening new avenues of scientific investigation through increased attention to and awareness of previously underrepresented participants or communities. Reporting standards also facilitate replication trials, as full descriptions of procedures are made transparent to independent investigators. Finally, reporting standards can help reviewers and journal editors to establish consistent standards and expectations of what needs to be reported on in studies.
We drew upon four reporting tools that were most aligned with and suited to our aim of developing a set of reporting recommendations for SBMPs: (1) the JARS (Appelbaum et al., 2018; Levitt et al., 2018); (2) the Standards for Reporting Implementation Studies (StaRI; Pinnock et al., 2017); (3) the Oxford Implementation Index (Montgomery et al., 2013); and (4) the TIDieR (Hoffmann et al., 2014). Two derivatives of the TIDieR were also examined. One iteration emphasizes reporting on specific contextual factors for applied health contexts and documenting heterogeneity of interventions in non-therapeutic settings (Cotterill et al., 2018). Another offers specific reporting recommendations for studies of mindfulness-based programs (MBPs) with adults in therapeutic settings (Crane & Hecht, 2018). Lastly, we utilized a CC approach for the design of our reporting recommendations, derived from an implementation reporting framework for studies of programming conducted in youth contexts (Dymnicki et al., 2020).
The reporting formats of the aforementioned tools inform the present recommendations. For instance, the StaRI recommends that researchers report on and provide more detailed descriptions of contextual factors and adaptations made to the intervention based on specific local needs—an approach that lends itself to research being conducted in complex and dynamic school settings. The StaRI also provides guidance for reporting the “logic pathway”—also referred to as the logic model, theory of change, or cause-and-effect diagram—which involves detailing how the implementation strategy will guide the intervention delivery and the mechanisms through which the intervention is expected to produce outcomes (Pinnock et al., 2017).
Additionally, TIDieR and its iterations informed several recommendations reported herein (Cotterill et al., 2018; Crane & Hecht, 2018; Hoffmann et al., 2014). Crane and Hecht (2018) adapted the TIDieR guidelines specifically for researchers studying MBPs with adult populations. This pioneering work focused mostly on implementation elements central to teaching integrity. For instance, they recommend that researchers report on teacher competence (using the Mindfulness-Based Interventions: Teaching Assessment Criteria (MBI:TAC); Crane et al., 2013; Crane & Kuyken, 2019) and teachers’ training in mindfulness-based practices, as well as their adherence to the norms of good practice for teaching mindfulness.
Given the context of SBMPs, Crane and Hecht’s (2018) suggestions are of great use, but they are not fully applicable to educational settings for several reasons. MBPs with adults (1) have quite different intervention and therapeutic goals than SBMPs, (2) rely on voluntary participants or individuals seeking help, and (3) depend solely on trained teachers or facilitators to implement the program in a relatively controlled setting. Conversely, SBMPs are often universally administered (by external facilitators, classroom teachers, or both) to all students as part of the standard curriculum. A recent review of SBMP studies found that only a small percentage of studies (11%) examined programs administered to a targeted subgroup of students (Roeser et al., 2022a). SBMPs are also implemented with participants at various stages of development, and their developmental needs inform the type of programming offered, the program goals, and approaches to program implementation. External facilitators and teachers also deliver SBMPs within complex and uncontrolled school settings that are hierarchically nested contexts (e.g., students, classrooms, schools, districts, communities). Thus, the conditions and participants within each level of these complex social settings influence how a program is implemented and the types of supports that are offered to support implementation. Finally, implementing and studying SBMPs requires buy-in from teachers, staff, administrators, parents, and community members—a factor that is often not a consideration with MBPs carried out with adults (Wilde et al., 2019).
Despite the many strengths of the reporting standards described above, they fail to account for considerations unique and vital to SBMP research. Important elements of the educational context are critical to understanding the relationship between the methods employed and the inferences drawn, both of which are highly dependent on the nature of the phenomena under investigation and the context in which the study occurs. Research investigating SBMPs involves phenomenological (mindfulness-based) and contextual (school systems) factors that are not captured in any of the existing reporting guidelines.

Development of the Recommendations

To guide our set of reporting recommendations, we examined reviews and articles of SBMPs that focused on implementation (Dariotis et al., 2017; Emerson et al., 2020; Espil et al., 2020; Gould et al., 2016; Meixner et al., 2019; Montero-Marin et al., 2022; Rempel, 2012; Tudor et al., 2022; Wilde et al., 2019), as well as other reviews of SBMPs that offer suggestions about SBMP implementation in their discussion sections (e.g., Felver et al., 2016; Roeser et al., 2022a). In particular, we prioritized two reviews that focused on SBMP implementation, as opposed to SBMP outcomes (Emerson et al., 2020; Gould et al., 2016). These two systematic reviews highlight several areas that are inconsistently reported and under-reported in SBMP studies. Based on our examination of the extant literature, six main areas emerged as warranting greater attention and more consistent and complete reporting: (1) reporting and providing detail on multiple implementation elements; (2) using consistent implementation terminology; (3) providing a study theory of change, especially one that identifies Hypothesized and/or Validated CCs and operationalizes mindfulness; (4) reporting on teacher training and competence; (5) reporting on contextual factors; and (6) reporting on participant characteristics and experiences—both of those delivering and those receiving the SBMP.
Generally, studies of SBMPs offer little detail and provide inconsistent information about implementation, and also use different terminology and approaches to assess implementation, making it difficult to draw comparisons across studies (Emerson et al., 2020; Gould et al., 2016; Roeser et al., 2022a; Tudor et al., 2022). For example, in one review of SBMP studies, fewer than 20% of studies assessed aspects of implementation beyond dosage—components such as participant responsiveness, uptake, and integrity were significantly underreported (Gould et al., 2016). There also exists terminological confusion regarding implementation. For example, in some studies, dosage is used to refer to the number of sessions delivered, while in others it refers to the number of sessions that participants attended (see Tudor et al., 2022).
Most studies of SBMPs fail to outline Hypothesized and/or Validated CCs in their theory of change for the program of study, as well as fail to operationalize mindfulness in this theory. SBMPs often involve multi-component programming and varying approaches to program structure and delivery, which necessitates the need to identify Hypothesized CCs to ultimately understand and empirically validate which implementation elements are driving the effects observed (see Felver et al., 2022). One review of SBMP studies that focused on implementation found only 10% of studies articulated Hypothesized CCs and only 6% referenced a logic model or theory of change (Gould et al., 2016). Another review of SBMP studies found that only 13% of studies outlined the relationship between elements of the mindfulness intervention and the outcomes assessed (Felver et al., 2016). Furthermore, studies of SBMPs offer disparate definitions of mindfulness—no agreed-upon definition currently exists in the field—and often researchers do not operationalize or include mindfulness in their theory of change, making it difficult to discern how mindfulness might generate outcomes or which core mindfulness competencies under the broad umbrella of mindfulness (e.g., self-awareness, non-judging) are of focus (see Felver et al., 2022).
Studies of SBMPs also involve a range of facilitation approaches (e.g., external facilitator, trained classroom teacher, or a combination) and lack consistency, transparency, and detail in reporting on teachers’ and facilitators’ training and competence to deliver SBMPs. Roeser et al. (2022a) found a range of facilitation approaches used with SBMPs: 50% were administered by external facilitators, 41% by trained classroom teachers, and 7% by a combination of classroom teachers and external facilitators. Very few studies or reviews of SBMPs have examined the ways in which these different administration approaches impact outcomes. A recent review of SBMPs found no studies that described teacher training in sufficient detail (Emerson et al., 2020), as determined by criteria for teacher training and competence used to assess MBPs with adults (Crane et al., 2013, 2017). This same review also revealed that only 6% of studies used and/or reported an assessment of teacher competence. However, some of the most recent studies of SBMPs have started to provide more detailed reports of teacher training and competence using these criteria, and have begun exploring the relationship between teacher/facilitator factors and program outcomes (Crane et al., 2020; Montero-Marin et al., 2022). Since facilitator competency has been linked to program outcomes (Atkinson & Wade, 2015), this is an area in need of further and more consistent documentation.
Finally, our examination of the literature found incomplete reporting of contextual elements, participant characteristics, and participant responsiveness, as well as vague descriptions (if any) of the relationships between these elements and a program’s implementation and outcomes (Emerson et al., 2020; Felver et al., 2016). For example, in one review, 71% of studies failed to provide information about participant or community socio-economic status (SES) or students with identified disabilities (Felver et al., 2016). A more recent review found that 43% of studies did not report on the SES of the participating students or the local community (Roeser et al., 2022a). In addition, SBMP studies have largely failed to collect in-depth assessments of participant responsiveness and experiences, critical aspects of implementation that have been shown to contribute to intended outcomes (e.g., Monteiro, 2020; Roeser et al., 2022a). In their review of SBMP studies, Tudor et al. (2022) found only one study that explored the relationship between participant responsiveness and outcomes (Metz et al., 2013). As the field moves toward larger-scale studies, school-related factors (e.g., teacher buy-in, psychological supports, existing programming) will be important to consider, as they may have significant implications for program design, delivery, and outcomes. Without consistent and complete reporting of these contextual and participant factors, it will remain unclear for whom SBMPs work best and under which conditions.

Conceptual Framework and Reporting Recommendations

The extensive effort to create implementation reporting standards in other fields has generated substantial benefit in terms of improved reporting and study quality (Montrosse-Moorhead & Griffith, 2017). Similarly, we hope the creation of implementation reporting recommendations for SBMPs will increase the rigor and interpretability of evidence in the field. In this next section, we offer our conceptual framework of SBMP implementation elements and the corresponding key terms that guide and inform a set of implementation reporting recommendations specifically for studies of SBMPs. This present work encourages researchers to identify Hypothesized CCs of SBMPs and to incorporate them into their theories of change—linking implementation elements with outcomes—and measurement strategies. In addition, the recommendations shed light on implementation elements requiring more attention and complete reporting in studies of SBMPs, such as school context, participant characteristics, teacher training and competence, and developmental considerations for implementing SBMPs with students across the PreK-12 spectrum.

The SBMP Implementation Framework

The goal of this work is to propose an inclusive conceptual framework to foster a common approach for conceptualizing implementation and reporting on SBMP implementation elements. We refer to the framework as the SBMP Implementation Framework (SBMP-IF; see Fig. 1). The SBMP-IF is broad and designed to provide an overarching set of categories and constructs to inform our reporting recommendations. It is not meant to be exhaustive; rather, it focuses on those implementation elements that are most relevant to the field of SBMP research. To inform the development of the framework, we drew from the school-based prevention literature (Domitrovich & Greenberg, 2000; Domitrovich et al., 2008), literature on the implementation of mindfulness programs (Broderick et al., 2019; Crane & Hecht, 2018; Emerson et al., 2020; Dariotis et al., 2017; Espil et al., 2021; Gould et al., 2016; Meixner et al., 2019; Monteiro, 2020; Montero-Marin et al., 2022; Rempel, 2012; Tudor et al., 2022; Wilde et al., 2019), and the implementation science literature (Berkel et al., 2011; Blase & Fixsen, 2013; Durlak & DuPre, 2008; Dymnicki et al., 2020). These categories and constructs are not necessarily distinct nor are they completely hierarchical in nature; however, they provide guidance for reporting on the implementation elements of SBMPs that might help to eventually identify CCs (see Dymnicki et al., 2020). Indeed, the SBMP-IF is intended to be of heuristic value to clarify and support more consistent and complete reporting, measurement, and testing of SBMP elements, thereby better identifying factors that lead to, mediate, and/or moderate outcomes.
Additionally, we offer a set of corresponding key terms and definitions (see Table 1) found both in the SBMP-IF and in the reporting recommendations. By explicitly establishing this implementation terminology, it is our hope to create a common language related to SBMP implementation that provides greater consistency in reporting and enhances comparability across studies. We encourage researchers and practitioners to adopt this language and/or provide explanation for deviating from it.
Table 1
Definition of Key Terms
Key Terms
Definition
Works referenced
School-based Mindfulness Program (SBMP)
Any program implemented within the school context (PreK-12) with a central feature of mindfulness and/or contemplative principles and practices
 
SBMP Implementation Elements
Key elements of SBMPs that, when specified, allow researchers to identify the Core Components (CCs) of SBMPs and help to elucidate for whom and under what conditions SBMPs work. These key elements are organized into four broad categories: program, participants, context, and implementation
Dymnicki et al. (2020)
Core Components (CC) and Hypothesized Core Components
The parts, features, attributes, or characteristics of an SBMP that empirically influence its success when implemented effectively. The essential or active ingredients necessary to produce desired outcomes. Prior to empirical validation, these are referred to as Hypothesized Core Components. Core Components serve as the unit of analysis for researchers to determine or describe “what works.” They become the components that practitioners and policymakers seek to replicate in and across a range of related programs and systems
Ferber et al. (2019); Dymnicki et al. (2020); Domitrovich et al. (2008)
Program
Program (Program Design)
The strategies or innovations that are causally linked to specified, intended outcomes. They can include programs, policies, processes, or principles. Requires specification prior to implementation to determine the extent to which program components are implemented as intended (also referred to as Program Design)
Domitrovich et al. (2008); Saul et al. (2008)
Core Program Components (CPCs)
Essential parts of the program itself, which include the practices, policies, processes, or principles, that are empirically or hypothetically linked to program and participant outcomes. CPCs produce outcomes
Domitrovich et al. (2008); Saul et al. (2008)
Implementation Support System (ISS)
Practices, policies, and supports that help reduce variability in high-quality implementation by providing the infrastructure necessary to coordinate the deployment of the program through elements such as teacher training. Elements of the ISS promote high-quality implementation and can support integrity to CPCs. The program itself and the corresponding support system are independent, though interrelated, elements of a whole
Domitrovich et. al. (2008)
Participants
All people who are involved with and affected by a particular program. Participants include both the recipients of the program and the deliverers of the program (e.g., teachers, students, staff, and community members). Reporting on participants can include background characteristics, risk and protective factors, etc.
Dymnicki et al. (2020)
Context
The setting and characteristics of the locale and school system/site within which the program is being implemented. Reporting can include information on school structure, values, buy-in, demand, locale and community characteristics, and relational trust, as well as broader ethical or legal considerations
Dymnicki et al. (2020)
Implementation
Program Implementation
What a program consists of when it is delivered in a particular setting. Program implementation is comprised of three broad dimensions and 8 sub-dimensions to support consistency in reporting
Durlak and Dupre (2008)
Quality of Implementation (QOI)
The extent to which a provider approached a theoretical ideal in delivering a program or the effectiveness with which a program is delivered. Comprised of three inter-related sub-dimensions of integrity, teacher/facilitator competence, and adaptations. High QOI is more likely to produce program impacts. It can be helpful to set a priori benchmarks of QOI to determine if a program was implemented well enough to anticipate participant outcomes
Durlak and Dupre (2008)
Integrity
The extent to which a program’s CPCs, objectives, and principles are implemented as intended. Emphasis is on integrity to CPCs not solely to a manual, rigid set of practices, curriculum, or protocol. Involves a degree of flexibility and alignment to Validated and/or Hypothesized CCs and/or program objectives. Also referred to by others as fidelity or adherence
Greenhalgh and Papoutsi (2019)
Competence
The level of skill a teacher has in teaching the program (e.g., embodiment of foundational mindfulness qualities, knowledge, proficiency in teaching the program, commitment to mindfulness practice, and participation with students in a process of inquiry during the teaching process). Can involve domains of planning, organization, curriculum coverage, teaching mindfulness, guiding practices, and facilitation of the learning environment for programming
Broderick et al. (2019); Crane et al. (2013)
Adaptations
Additions or modifications made to the program either pre-emptively to adapt/align with context or participant needs (planned adaptations) or during implementation (unplanned adaptations)
Berkel et al. (2011)
Amount
The quantity of the program itself that is delivered and/or received
 
Dosage
The amount or how much of a program that is delivered (can also be referred to as exposure). Can include number of sessions offered, intensity of sessions, and length of time of sessions
Durlak and DuPre (2008)
Uptake
The amount of the program received and practiced by recipients. Can include number of sessions attended, amount of in-class and out-of-class practice, time engaged in formal and informal practice—structured practice such as sitting meditation v. bringing the skills acquired through formal practice into the moments and events of everyday life
Montgomery et al. (2013)
Goodness of fit
Compatibility or alignment of program elements (as implemented) with aspects of a particular school context or locale (e.g., to the cultural and developmental needs and capacities of students, educators, and the school community). The fit of the particulars of the program with the particulars of the participants and context
Roeser et al. (2022a)
Participant responsiveness
The extent to which participants are engaged with, receptive to, and interested in the activities and content of the program. Responsiveness is distinct from uptake (one could attend all sessions and not be engaged) and captures any potential harms or confusion participants may have experienced
Berkel et al. (2011); Durlak and DuPre (2008); Roeser et al. (2022b)
Feasibility
The ease with which a program is implemented within a specific setting. Feasibility indicates whether a program is ready to be examined in a full-scale study or needs further testing. Includes buy-in, relevance, resource availability, capacity, sustainability, barriers, and facilitators
Bowen et al. (2009); Emerson et al. (2020)
Acceptability
The extent to which a program is judged as suitable, satisfying, or attractive to program deliverers, recipients, the overall school community, other teachers and school staff, and parents. Sample outcomes include satisfaction, intent to continue to use, perceived appropriateness, and fit within organizational culture. Typically reported as part of pilot and feasibility studies
Bowen et al. (2009)
The SBMP-IF (Fig. 1) focuses on SBMP Implementation Elements, which we define as the key elements of SBMPs that, when specified, allow researchers to identify the CCs of the program and discern for whom and under what conditions the program works. These implementation elements encompass everything that is important to consider when developing, implementing, and studying SBMPs. We organize them into four broad categories: the program (as designed to be implemented), participants, context, and implementation (the program as it is actually delivered and received). These four categories, adapted from Dymnicki et al. (2020), encompass the myriad elements that influence and are involved in SBMP implementation. Within each of these four categories, we describe sub-dimensions to provide further guidance for conceptualizing and reporting on implementation. For definitions of the categories and sub-dimensions outlined below, refer to Table 1. For further detail on the reporting recommendations related to these categories and sub-dimensions, refer to the full reporting recommendations provided in Table 2.
Table 2
Reporting recommendations
Paper sections & sub-sections
Description
Recommended reporting items
Supplementary reporting suggestions
Title page & abstract
Search Terms
 
✓ Include keyword search terms: "school" and "mindfulness"
✓ Include secondary search terms (if applicable): Implementation; yoga; contemplative; name of program; key outcomes; child; adolescent; youth; classroom
Abstract
Identification as a School-Based Mindfulness Program (SBMP) study. Description of program to be tested, school and/or classroom context(s), methodology, and key outcomes and implementation elements assessed
✓ Name of Program
✓ If the study was pre-registered, state where and include the registration number.
✓ Identification as SBMP study
✓ Methodology
✓ Key outcomes assessed
✓ Key implementation elements assessed
✓ Pre-registration status
Introduction
Study Background & Problem Addressed
Scientific background and theoretical rationale for SBMP being implemented.
✓ Describe the problem that the study aims to address.
✓ Note any subgroups of interest or nested studies.
✓ Describe the specified target population and developmental considerations for the SBMP based on participants' age(s).
✓ Provide definition of mindfulness and/or relevant contemplative area(s) of focus (with attention to developmental considerations).
  
✓ Provide theoretical rationale for SBMP being implemented, as well as an accompanying theory of change with description and cohesive integration of programmatic and implementation theory.
✓ Provide explanation of how mindfulness is being operationalized in theory of change, highlighting the core mindfulness competencies to be targeted.
✓ Provide visual representation of theory of change (e.g., Logic Model)a that incorporates CPCs and elements of the ISS, linking these components to Quality of Implementation (QOI) and relevant participant outcomes.
Core Components (CCs)
Outline Hypothesized and/or Validated Core Program Components (CPCs) and elements of the Implementation Support System (ISS)
✓ Clearly articulate CPCs and how they relate to participant outcomes.
CPCs: Structure, content, process, and principles of the SBMP that are hypothesized to be causally linked to outcomes.
✓ Clearly articulate the ISS and how elements of the ISS support high-implementation and program delivery.
ISS: Practices, policies, and supports that promote high-quality program implementation (e.g., teacher training, teacher support, manual(s) and resources, peer learning, and incentives).
Overview of SBMP & ISS
General description of SBMP including the materials used to support the program design and delivery. Description of full ISS, explicating how the ISS promotes SBMP delivery and high quality implementation.
✓ Provide name of the manualized SBMP and reference to the most recent curriculum guide.
✓ Describe program activities in relation to CPCs, noting approximate percentage of SBMP or amount of time allocated to each CPC (see Jennings et al., 2013)
✓ Describe program curriculum in detail, noting its manualized sequencing and activities.
✓ Describe any physical or informational materials used in SBMP (i.e., those provided to participants, used in program delivery, and training of administrators and/or teachers or program staff).
✓ Physical or informational materials might include written course materials, guided mindfulness practices, instructor scripts, materials for caregivers or parents, materials for home practice, etc. (materials can be included in supplement based on journal space and wording requirements).
✓ Provide an overview of ISS used to promote program delivery and high-quality implementation (e.g., teacher training, school staff buy-in, relational trust, etc.)
 
Study Aims & Objectives
Outline aims and objectives of the current study.
✓ Describe overarching aims and objectives of current study.
✓ Describe post-hoc or exploratory questions or analyses (if applicable).
✓ Describe a priori hypotheses or research questions and/or underlying assumptions.
Methods
Study Design
Describe overall study design and key features of the evaluation. Provide eligibility and selection criteria for both SBMP and comparison/control conditions.
✓ Describe methods of how individuals or groups were selected or constructed. Describe comparison condition, randomization procedures (e.g., whether blinded or not) or process for waitlist or matching.
✓ Recruitment procedures: Inclusion/exclusion criteria for BOTH those delivering and receiving the SBMP and comparison/control conditions, consent procedures (i.e., active or passive), assent procedures, and incentives (if applicable).
✓ Measurement procedures overview: Provide brief overview of measurement including number of measurements and timing relative to SBMP delivery (i.e., long-term follow-up).
✓ Describe how sub-groups were created and/or procedures for nested studies (if applicable).
✓ If multiple groups are assigned, include CONSORT Transparent Reporting Diagram
Context
Describe characteristics of the school and classroom context for both SBMP and comparison/control conditions. Describe the characteristics of the broader community or cultural context.
School-Level:
✓ Geographic location of recruited schools (e.g., rural, urban, suburban)
✓ School type (public/private/charter/other)
✓ Supports relevant at classroom or school level (e.g., SEL, mental health supports; Tier 1 in the U.S.)
✓ Note any salient school characteristics as they pertain to implementation (as available) including grade-levels; school climate & resources (e.g., space, number of school counselors or psychologists); classroom structure; language immersion; student-to-teacher ratio
✓ Describe in detail supports (e.g., years of implementation)
✓ Representativeness: Relevant population-level characteristics (e.g., state/province, district or local jurisdiction, country, school, and/or classroom) including (at a minimum): race, ethnicity, gender
✓ Demographics and average class size
✓ (If study conducted in U.S.) Report % of students with IEPs; % of students in IDEA disability categories for IEPs; % with 504 plans, % ELL status of students; other salient characteristics (ICD-10/DSM-5 diagnoses), % of students receiving Free or Reduced-Price Lunch (FRPL) – studies conducted in other countries should include comparable and relevant statistics
✓ (If school level SBMP) Note any special features of recruited school and representativeness of school relative to population-level characteristics (e.g., other schools in the district or local jurisdiction, state/province, or country).
✓ Note staff characteristics (as available) including demographics of race, ethnicity, and gender; teacher demographic match with participants; teacher and staff retention; years teaching; educational credentials of teachers; teachers/students’ sense of relational trust
✓ (If classroom level SBMP) Note classroom characteristics of recruited schools as they pertain to implementation (e.g., subject matter taught; class size; inclusion) and representativeness characteristics relative to population characteristics
Broader-Level (as applicable):
 
✓ Ethical/legal considerations (e.g., county or state/provincial laws, regulations or SEL mandates)
 
✓ Local community characteristics (e.g., economic, violence/crime, community knowledge of and support for SBMP; traumatic events; recent mental health issues)
 
Participants
Description of participants (both those delivering and those receiving the SBMP and comparison/control conditions).
Participant Characteristics:
✓ Provide relevant characteristics of those delivering the SBMP, including relevant demographics (i.e., race, gender, ethnicity, socio-economic status) and experience with contemplative practice (years practiced, ongoing personal practice, prior completion of any required training, buy-in).
✓ Provide relevant characteristics of those receiving the program (teachers, students, or both; see above for guidance on reporting and note if participant characteristics mirrored those of the school)
✓ Representativeness of participant sample to population unit of analysis (i.e., classroom, school, district, state/province)
✓ For those delivering the program: Note background and experience as a teacher (e.g., grade-level and subjects taught, # of years teaching).
✓ For those receiving the program (as available/ or relevant): Language spoken at home/immigration status/acculturation; % homelessness and/or other risk/protective factors; internalizing and externalizing symptoms; racism, bullying/violence, involvement or experience in the justice system; family and peer relationships; social-emotional competencies.
✓ Describe any subgroups recruited and/or studies with sub-samples.
Measures
Detail approach to measurement and measures used to assess program implementation and outcomes in both conditions.
✓ Describe overall approach to assessing implementation & outcomes in both conditions, ideally linked to logic model/theory of change.
Implementation Measures: Outline measures for:
Quality of Implementation (QOI):
✓ Integrity: Degree to which program is delivered as intended in relation to CPCs, objectives, or principles.
✓ Adaptations: Planned and unplanned additions or modifications made to program during delivery. If no adaptations were made from the SBMP protocol/curriculum guide, this must be explicitly stated.
✓ Competence: Skill level of teacher or external facilitator for teaching SBMP (e.g., embodiment of foundational mindfulness qualities, knowledge and competency in teaching SBMPs, commitment to mindfulness practice, organization, planning, curriculum coverage, guiding of practices and facilitation of the learning environment for programming).
✓ Use the guiding table for implementation measurement (see Gould et al., 2014); consider collecting measures from multiple reporters and/or observational measures; and consider calculating reliability and validity for all measures.
Quality of Implementation (QOI):
✓ Integrity: Articulate adherence "to what" - ideally to CPCs, central lesson objectives, or overarching principles (not only to a manual) as well as any strategies to maintain/monitor fidelity.
✓ Adaptations: Detail planned and unplanned adaptations; Planned - If the SBMP was meant to be personalized, titrated or adapted, then describe what, why, when and how; Unplanned - Describe how individual needs/vulnerabilities were addressed by those delivering the SBMP.
✓ Competence and ISS: Assess competence via observational measures (e.g., TMEOS; Broderick et al., 2019). If possible, assess aspects of ISS (e.g., the quality and dosage of teacher training and ongoing coaching or PD)
Amount of Program:
✓ Dosage: Amount of program delivered to participants.
✓ Participant Uptake: Amount of program received by participants.
Amount of Program:
✓ Dosage: Note total instructional time, number of sessions delivered, length of time, and intensity of sessions. Differentiate didactic instruction, inquiry, activities, and contemplative practice.
✓ Participant Uptake: Number of sessions attended or total exposure, time spent engaged in formal v. informal practice, as well as in-class v. out-of-class practice or homework.
Goodness of Fit:
✓ Participant Responsiveness: Extent to which participants are receptive to, engaged with, and interested in material & content.
✓ Feasibility: Ease of implementation, cost of SBMP, barriers to implementation, and facilitators of implementation.
✓ Acceptability: Suitability of and satisfaction with SBMP based on staff, teacher, and student feedback.
Goodness of Fit:
✓ Participant Responsiveness: Assess participant engagement, motivation, interest in, and feedback about SBMP. Note participant harms (i.e., all important harms or iatrogenic effects for each condition).
✓ Feasibility/Acceptability: Collect qualitative measures related to implementation barriers and supports; quantitative measures related to staff time; average cost per student/teacher/instructor.
Outcome Measures:
✓ Describe participant outcomes relevant to theory of change.
✓ Contextual outcomes (if applicable).
✓ Note developmental considerations with outcome measures.
Procedures
Describe data collection and implementation procedures that took place for both the SBMP and control/comparison condition in enough detail to allow for replication.
Data Collection Procedures:
✓ Describe data collection procedures for implementation and outcome measures including who, when, where, how long, and at what time intervals.
 
Implementation Procedures:
✓ Describe implementation procedures for ISS in detail and be sure to describe:
 
✓ the MBP teacher training that those delivering the program received (e.g., hours, in-school v. out-of-school PD, who administered the training and in what format)
✓ the guidelines given to teachers and/or program staff (i.e., what, and how to implement with fidelity and any planned adaptations)
✓ ongoing supports and monitoring that occurred during program delivery (e.g., coaching, peer learning, etc.)
ISS: Ideally document and assess teacher training (e.g., prior, advanced, or follow-up training); ongoing coaching/support/peer learning; resources; manuals; toolkits; incentives – noting how ISS elements differed or were altered from description in Introduction
✓ Describe implementation procedures for the program delivery in enough detail for replication and be sure to describe:
 
✓ identify the actors who enacted the program and who oversaw or enabled the ISS (e.g., school administrators, teachers, program staff). The role of the investigator(s) should be explicit.
✓ describe the modes of delivery (e.g., face to face, online, pre-recorded videos or practices).
✓ describe whether it was delivered to the whole class, in groups, or individually.
✓ outline the timing of delivery, including sequencing and when it was delivered and whether SBMP occurred during school hours.
✓ describe what the comparison/control condition received (ideally with same level of detail as SBMP condition).
✓ describe what services members of the SBMP condition and control/comparison condition received beyond the program being evaluated (e.g., other relevant SEL or mental health programs)
✓ report any context changes or significant events that occurred during the SBMP delivery period
✓ Provide a visual timeline of data collection and implementation procedures (can be provided in online supplement).
Differentiation: Describe differences between conditions in terms of theory and practice, as well as differences between SBMP and other approaches (e.g., SEL programming).
Monitoring: Assess and note other relevant programs (e.g., SEL programs) being implemented.
Analyses
Analytic methods used and rationale for outcomes evaluation, ideally testing mechanisms of change.
Implementation Analyses:
 
✓ Describe analytic methods (e.g., methodology for calculating scores for each implementation dimension and overall QOI).
✓ Articulate a priori benchmarks or cutoffs, as well as accompanying rationale for low, medium, and high QOI.
✓ If variation in QOI exists, ideally assess the relationship between ISS and QOI and between QOI and participant outcomes.
✓ Consider return on investment or cost-benefit analyses (if applicable)
Outcomes Analyses:
 
✓ Outline data analysis plan and rationale for overall analytic approach including how to account for any nested and missing data.
✓ Describe approach for addressing attrition (note any deviation).
✓ (If quantitative methods used) Describe method for determining group equivalence between SBMP and comparison/control conditions on relevant measures. Describe approach to outcome analyses for main effects. Describe any subgroup/moderator and mediation analyses (if applicable).
✓ State any modifications of or deviations from data analysis plan.
✓ Explore effects on available school data (e.g., academic performance, teacher attrition, teacher health care utilization, student attendance, etc.)
✓ Subgroup/Moderator Analyses: Consider testing moderation of demographic covariates
✓ For quantitative analyses: Ideally test mechanisms of change through mediation analyses
✓ (If qualitative methods used) Explain in detail the development of codes (a priori or emergent) and how they relate to the central research question(s) or CPCs and the ISS. Identify the unit of analysis. Provide background and positionality of researchers/coders related to SBMPs and how this impacted data analysis.
✓ For qualitative analyses: Refer to good practice reporting (see Levitt et al., 2018).
Results
 
Provide results for all outcome and implementation analyses. Describe results of relationship between QOI and SBMP outcomes. Describe possible differences in effects based on subgroups.
✓ Report full results (both null and statistically significant) in-line with stated study aims & objectives
✓ Use QOI Table to consolidate results (see Gould et al., 2014).
✓ Report results with consideration to participant demographic characteristics
Implementation:
✓ Report overall level of QOI and whether it met a priori benchmarks for QOI, and variation in QOI across classrooms/schools (if it exists). Report both (1) null results and (2) significant results for what worked for whom under what conditions.
 
Outcomes:
✓ Applying the implementation framework theory described in the Introduction, report both (1) null results and (2) significant results of what worked for whom under what conditions.
 
Discussion
Summary
Summary of findings, strengths and limitations, comparisons to other studies.
✓ Summarize findings.
✓ Describe how the study advances knowledge about SBMPs and their implementation, especially in comparison to other studies.
✓ Discuss the unique contributions of SBMPs, especially in relation to other programming (e.g., SEL)
✓ Describe how SBMP implementation was connected to and informed results.
✓ Describe how study contributes to understanding of how SBMPs and their implementation impact diverse participants, particularly for historically marginalized and understudied populations (e.g., racial minorities and individuals with disabilities). If not explored, report this.
Implications
Discussion of policy, practice and/or research implications of both the SBMP and the ISS.
✓ Discuss implications of SBMP and ISS for policy, practice, and/or research.
✓ Consider implications for scalability
✓ Describe future directions and offer concluding thoughts.
General
 
Include statement(s) of regulatory approvals, funding, and conflicts of interest (if applicable).
✓ Note regulatory approvals (e.g., ethical approval, confidential use of routine data)
 
✓ Acknowledge funding and conflicts of interest (e.g., involvement of the owner or creator of the SBMP in the implementation and/or the evaluation)
aFor resources to support the development of a program logic model, refer to W.K. Kellogg Foundation (2004). Logic Model Development Guide. Michigan: W.K. Kellogg Foundation. Retrieved from http://www.compact.org/wp-content/uploads/2010/03/LogicModelGuidepdf1.pdf

Program (Program as Designed)

The program includes what a given SBMP consists of as designed to be delivered and is made up of two distinct though interrelated sub-dimensions: (1) Core Program Components (CPCs) and (2) the corresponding Implementation Support System (ISS). CPCs are the essential aspects of the program itself, including the practices, processes, or principles that are hypothesized to be causally linked to specified outcomes (Domitrovich et al., 2008; Saul et al., 2008). These CPCs can include the practices being offered (e.g., focused attention and loving kindness practices), as well as the way they are offered (e.g., through embodied presence or appreciative inquiry). Elements that are not “core” are those that are not thought to be responsible for driving effects (e.g., discussion prompts or classroom organization) (see Felver et al., 2022 and Gould et al., 2016). The ISS comprises the practices, policies, and supports that promote high-quality program implementation and integrity to CPCs by providing the infrastructure needed to coordinate the deployment of the program and reduce variability in implementation quality (see Table 1; Domitrovich et al., 2008; Domitrovich & Greenberg, 2000). For instance, teacher training is an especially important feature of the ISS that promotes high-quality implementation and should be reported in detail.
The current recommendations emphasize clear articulation of both CPCs and the ISS because these two interrelated sub-dimensions of the program form the foundation of a comprehensive theory of change and allow for improved measurement, analysis, and conclusions. To be clear, CPCs are the essential program ingredients that are validated or hypothesized to lead to participant and program outcomes, whereas elements of the ISS (e.g., teacher training) are validated or hypothesized to promote high-quality SBMP implementation. Clear articulation and description of the ISS are often missing from theories of change of SBMPs, neglecting aspects that may impact quality of implementation in complex social systems such as schools. We also suggest that SBMP researchers provide a visual representation (e.g., a logic model) depicting both CPC and ISS elements as they relate to the overall theory of change.

Participants

The implementation reporting recommendations for participants urge more detailed reporting of the relevant characteristics of both those who deliver the program and those who receive the program—students, teachers, and/or staff (e.g., relevant demographics, experience with contemplative practices prior to training or the study). Including relevant characteristics of those delivering the program supports a better understanding of the pre-conditions necessary for effective delivery, and including relevant characteristics of those receiving the program helps to ascertain for whom SBMPs may work (Felver et al., 2016; Roeser et al., 2022a). This is particularly important given the developmental needs of the range of students participating in SBMPs (see Roeser et al., 2022b). Furthermore, the recommendations encourage researchers to report the representativeness of the sample relative to the population, which helps to clarify interpretations related to generalizability and external validity.

Context

It is important to report the relevant characteristics of the school and classroom contexts in which the program is implemented, as well as the broader community and cultural contexts. Domitrovich et al. (2008) detail several of these contextual elements that are unique to schools and are of central importance for fully interpreting program effects. Such elements include federal, state, and district policies (e.g., mandated standards for SEL programs) and administrative leadership (e.g., a principal’s commitment of resources to support program implementation). Additionally, relational trust between the SBMP implementer (i.e., the teacher, external facilitator, or both) and others within the school and broader community (e.g., the SBMP professional learning team, district/school administrators, students’ families, and students) is foundational to the experience of delivering the SBMP and is linked to integrity of implementation (Mischenko et al., 2022). Thus, these types of contextual considerations are important to document and have been incorporated into the recommendations offered, along with those outlined by Dymnicki et al. (2020).

Implementation (Program as Delivered and Received)

The final category, implementation, refers to what actually gets delivered and what is received, not what is intended. The salient sub-dimensions of implementation for SBMPs are described in greater detail below—that is, those dimensions that have been shown to predict program impact on participant outcomes (Durlak & DuPre, 2008). There is general agreement on a wide range of dimensions; however, the terms that are used for specific dimensions vary. For example, to ascertain whether a program was delivered as intended, researchers might use the terms integrity, fidelity, or adherence, all of which are appropriate. For simplicity and clarity, the recommendations offered here are guided by an integrated conceptual grouping of three sub-dimensions: Quality of Implementation, Amount, and Goodness of Fit (see Fig. 1 and Table 1). Additionally, these three sub-dimensions are conceptually divided into two categories within the provided model: (1) implementation elements delivered, and thus within the control of those delivering the SBMP (e.g., dosage); and (2) implementation elements received (e.g., participants’ experiences of and responsiveness to the SBMP; see Berkel et al., 2011).
Quality of Implementation (QOI) refers to the overall effectiveness with which an SBMP was delivered, or the extent to which the program delivery met a theoretical ideal (see Durlak & DuPre, 2008). We propose that QOI comprises three inter-related, yet separate, sub-dimensions: (1) integrity (sometimes called fidelity or adherence), (2) competence (of teachers or external facilitators), and (3) adaptations. Together these three aspects of QOI allow for assessment of whether a program was implemented well enough to anticipate relevant participant or context-level outcomes. As such, it is vital to set a priori benchmarks for what constitutes “as intended” or “high/low” levels of QOI. Gould et al. (2016) recommend setting a priori benchmarks based on empirical data or educated guesses, so that within a given trial, researchers know whether to expect programmatic outcomes. There are no accepted thresholds or benchmarks for implementation elements (e.g., dosage) in the field of SBMPs; however, recent studies have started to set and report their own benchmarks (see Montero-Marin et al., 2022).
Integrity captures the extent to which a program is delivered as intended (see Greenhalgh & Papoutsi, 2019). In our conceptualization, integrity is assessed in relation to CPCs, program objectives, or principles, and not necessarily in sole relation to following a fixed manual, rigid set of practices, curriculum, or protocol (commonly known as adherence). Given the nature of the phenomena that SBMPs aim to model and teach (e.g., embodiment, presence, non-judgment, common humanity) and the dynamic environment of schools, rigid adherence to a manual, for example, is not necessarily appropriate, particularly for manuals that may not integrate such phenomena or explicitly account for such contexts in the prescribed protocol (see Table 1). Adhering to the steps detailed by developers of a curriculum, manual, protocol, or rigid set of practices is an essential element of the scientific replication process; however, to most fully understand whether a program is delivered as intended, we suggest the more nuanced, flexible, and inclusive conceptualization of integrity—one that incorporates a degree of flexibility and adaptation and focuses on alignment to Hypothesized CCs or eventually Validated CCs and/or program objectives (see description of adaptations below).
Competence refers to the level of skill a teacher or external facilitator has in teaching SBMPs (see Broderick et al., 2019; Crane et al., 2013). Competence is comprised of many components, some of which are unique to SBMPs (see Table 1 for a more extensive description of such components). Unlike traditional conceptions of workplace competence, which conjure up notions of performing certain roles with “expertise,” competence with mindfulness involves “‘a way of being’ which emerges through sustained engagement with the practice” (Crane et al., 2013, p. 79). Competence in this conception includes the extent to which one conveys and embodies the message that there are “universal aspects to the experience of being human” and therefore includes modeling of this vulnerability and authenticity in one’s teaching (p. 79). Additionally, it involves the embodied delivery of mindfulness practices and participation with students in a process of inquiry (see also Roeser, 2016). Crane et al. (2013) developed an instrument (the MBI:TAC) to assess the many domains of teacher competence related to mindfulness, which offers helpful guidance for assessing this aspect of implementation quality. Building off of this work, Broderick et al. (2019) developed the Teaching Mindfulness in Education Observation Scale (TMEOS)—an observational measure designed to more specifically assess teacher quality and competence in implementing SBMPs in K-12 classrooms (a suggested tool offered in Table 2).
Adaptations are additions or modifications made to a program either pre-emptively, to align with a given school context or participant needs (planned), or during implementation (unplanned) (see Berkel et al., 2011). Adaptations can detract from overall QOI if they interfere with, misalign with, or omit CPCs. Conversely, adaptations can enhance overall QOI if enacted to align implementation with CPCs (e.g., to meet the needs and capacities of participants, teachers, and local communities) (Berkel et al., 2011). For SBMPs, adaptations are to be expected as teachers model presence and flexibility, allowing what arises for participants to shape their delivery (Brandsma, 2017). As such, adaptations can be viewed as desirable, in that they allow instructors the flexibility to respond to participant needs in the moment. In a recent review of SBMPs, Tudor et al. (2022) found no studies that provided information or analysis related to adaptations. Moving forward, documenting adaptations can allow for testing their relationship to integrity and competence, as well as provide a better understanding of QOI and program effectiveness (Stirman et al., 2019).
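To make the documentation of adaptations concrete, such information could be captured as structured records. The following Python sketch is purely illustrative; the record fields (e.g., `aligns_with_cpcs`) are our own assumptions, not an instrument drawn from the literature.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record structure for documenting program adaptations.
# Field names are illustrative only and not from a published instrument.
@dataclass
class AdaptationRecord:
    session: int
    planned: bool              # pre-emptive (planned) vs. in-the-moment (unplanned)
    description: str
    rationale: str             # e.g., responding to participant needs
    aligns_with_cpcs: bool     # does the change preserve the Core Program Components?
    recorded_on: date = field(default_factory=date.today)

def summarize(records):
    """Tally adaptations, separating planned/unplanned and CPC-aligned/interfering."""
    planned = sum(r.planned for r in records)
    aligned = sum(r.aligns_with_cpcs for r in records)
    return {"total": len(records),
            "planned": planned,
            "unplanned": len(records) - planned,
            "cpc_aligned": aligned,
            "cpc_interfering": len(records) - aligned}
```

A summary of this kind could then feed analyses relating adaptations to integrity, competence, and QOI, as suggested by Stirman et al. (2019).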
Amount refers to how much of a program was delivered (i.e., dosage) and received (i.e., uptake). In our conceptualization, dosage captures the amount of the program delivered—that is, the number of sessions offered (frequency), intensity of sessions, and length of time of sessions (duration) (see Durlak & Dupre, 2008). Uptake, on the other hand, captures the amount of the program received by recipients, including the number of sessions attended and the time spent engaged in practices (formal and informal practice), as well as homework completion (see Montgomery et al., 2013). While both dosage and uptake are quantitative assessments of the amount of program delivered and received, it is important to distinguish between them; studies of SBMPs have typically reported both under the single heading of dosage (Gould et al., 2016; Tudor et al., 2022).
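The dosage/uptake distinction can be illustrated with a small computational sketch. The session data, field names, and functions below are hypothetical and serve only to make the two quantities concrete.

```python
# Illustrative sketch: dosage (amount delivered, a program-level quantity)
# vs. uptake (amount received, a participant-level quantity).
# All values and field names are hypothetical.

def dosage(sessions):
    """Amount delivered: frequency and total duration of sessions offered."""
    return {"frequency": len(sessions),
            "total_minutes": sum(s["minutes"] for s in sessions)}

def uptake(sessions, attended_ids, practice_minutes):
    """Amount received by one participant: attendance plus home practice time."""
    attended = [s for s in sessions if s["id"] in attended_ids]
    return {"sessions_attended": len(attended),
            "minutes_attended": sum(s["minutes"] for s in attended),
            "home_practice_minutes": sum(practice_minutes)}

sessions = [{"id": 1, "minutes": 45},
            {"id": 2, "minutes": 45},
            {"id": 3, "minutes": 30}]

print(dosage(sessions))                       # program-level dosage
print(uptake(sessions, {1, 3}, [10, 15, 5]))  # one participant's uptake
```

Reporting the two quantities separately, as sketched here, avoids the conflation of dosage and uptake noted by Gould et al. (2016) and Tudor et al. (2022).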
Goodness of fit refers to the extent to which a program and the associated practices are compatible or aligned with the "cultural and developmental needs and capacities of students, educators, and the school community" (Roeser et al., 2022a, p. 6). Goodness of fit comprises responsiveness, feasibility, and acceptability (see Table 1 for full definitions). Roeser et al. (2022a) theorize that a good program "fit" can help to promote teacher and student engagement with and receptiveness to the program and practices, which in turn may lead to more beneficial outcomes.
In our conceptualization, participant responsiveness refers to participants' experience of SBMPs, namely their level of engagement with, receptiveness to, and interest in the program material or content (see Table 1; Berkel et al., 2011; Durlak & Dupre, 2008; Roeser et al., 2022b). It also captures the extent to which participants experience confusion or any harm. We recognize that participant responsiveness is related to and can inform the amount of programming participants receive (i.e., dosage and uptake). That said, we view responsiveness as a distinct and understudied element of SBMP implementation (Emerson et al., 2020; Monteiro, 2020), one that is associated with meaningful participant outcomes, as evidenced by research in other areas of implementation science (Berkel et al., 2011). While uptake captures the amount of a program received and can be assessed more quantitatively, responsiveness is concerned with other multi-faceted aspects of participants' experiences, including active participation and engagement, that can be captured both qualitatively and quantitatively. As discussed in Roeser et al. (2022b), "responsiveness of students" centers on whether programs cultivate students' motivation and engagement to learn mindfulness through approaches that are developmentally attuned to student needs. Responsiveness is important for understanding how to implement programs effectively for specific kinds of students in the future.
The final sub-dimensions of goodness of fit are feasibility and acceptability. We conceptualize feasibility as the level of ease with which a program is implemented by teachers or external facilitators within a specific setting (Bowen et al., 2009; Emerson et al., 2020). Assessments of feasibility might include the documentation of implementation barriers and supports, as well as the average program cost and time per student and/or teacher. We define acceptability as the extent to which a program is judged as suitable, satisfying, or attractive to program deliverers and recipients, as well as the overall school community, other teachers and school staff, and parents (see Bowen et al., 2009). Assessments of acceptability might include consumer satisfaction surveys or interviews. Goodness of fit indices are critical for understanding student motivation and the long-term adaptability and sustainability of SBMPs.
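As a purely illustrative sketch of how feasibility and acceptability indices might be summarized, the functions below compute hypothetical per-student cost and satisfaction metrics. The variable names, sample values, and the 1-5 rating scale are our assumptions, not measures from this paper.

```python
# Hypothetical feasibility and acceptability summaries.
# All numbers and names are invented for illustration.

def per_student_cost(total_program_cost, n_students):
    """Feasibility index: average program cost per student."""
    return total_program_cost / n_students

def acceptability(ratings, scale_max=5):
    """Acceptability index from satisfaction ratings on a 1..scale_max scale:
    mean rating and the proportion of respondents at or above the midpoint."""
    mean = sum(ratings) / len(ratings)
    midpoint = (scale_max + 1) / 2
    return {"mean": round(mean, 2),
            "prop_satisfied": sum(r >= midpoint for r in ratings) / len(ratings)}
```

Simple indices like these could complement the barrier/support documentation and consumer satisfaction interviews described above.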

Reporting Recommendations

The full set of reporting recommendations for SBMP Implementation Elements can be found in Table 2. The items listed as "recommendations" in the table (see column, "Recommended Reporting Items") are central and should be included somewhere in the write-up and reporting of study findings, while those listed as "suggestions" (see column, "Supplementary Reporting Suggestions") are illustrative options to further enhance rigor and/or specificity. Which supplementary items to include depends on the kind of study being conducted, the context within which the research occurs, and the stage of the research. The recommendations are meant to be flexible enough to apply to quantitative, qualitative, and mixed-methods studies.
The SBMP-IF (Fig. 1) guided our decisions for organizing and including the various recommendations and suggestions. The reporting recommendations are categorized into sections that correspond to the general organization and formatting of peer-reviewed journal publications, like the JARS and StaRI reporting standards. This organization supports consistent and complete reporting at the publication stage. Moreover, beginning the research design process with the publication in mind encourages researchers to consider implementation elements and implementation reporting at the very start of study design. These recommendations can thus inform and guide the study design, implementation, analytic plan, and reporting processes.
On the surface, it might appear that the recommendations are at times redundant; however, the intention is to encourage researchers to report on implementation elements thoroughly and consistently throughout their publications. Take, for example, the case of reporting on CPCs and the ISS. In the introduction section of a paper, we recommend that researchers clearly articulate the Hypothesized CPCs (eventually Validated CPCs) and the ISS elements. Then, later in the measures section, researchers are encouraged to measure and report program integrity in relation to CPCs. And finally, in the results section, we suggest that researchers also analyze outcomes in relation to Hypothesized and/or Validated CPCs (see suggested reporting in Table 2). In essence, researchers should consider CPCs throughout the design, implementation, and, subsequently, the reporting phases of a study. This grounds the documentation of what was intended to happen, and what actually occurred, in relation to Hypothesized and/or Validated CPCs and the theoretical rationale or theory of change of the study. The recommendations facilitate a focus on the outlined theoretical or empirical rationale and nudge researchers to address the ways in which implementation meets or fails to meet integrity to Hypothesized and/or Validated CPCs.
Many of the recommendations offered are important for any study publication; however, several are specific to SBMPs. For one, studies of SBMPs have been inconsistent in their definitions of mindfulness (or the contemplative area of focus) and have often not situated or considered mindfulness or mindfulness competencies in their theories of change (see Roeser et al., 2022b). To develop a better understanding of the role of mindfulness in influencing outcomes, researchers should explicitly define and operationalize this construct in their theory of change for students of a specific age. This recommendation encourages researchers to document how mindfulness, or the specified contemplative area of the study, addresses the articulated problem of focus. More consistent and complete reporting will promote greater understanding of how mindfulness is defined and measured, and will support measurement and analytic efforts across studies. It may even help to move the field toward a more unifying definition of mindfulness that takes developmental considerations into account (see Roeser et al., 2022b).
The provided recommendations highlight the need to report on developmental considerations and how they inform the program itself and the theory of change. For instance, many SBMPs are adapted from adult programming (e.g., MBSR and MBCT) and therefore may not be developmentally appropriate or engaging for children or youth. In fact, Roeser et al. (2022a) found that 52% of the SBMPs included in their review were adapted from adult programming. If SBMPs are to be conducted in the unique context of school settings with youth at different developmental stages, more reporting is needed to identify the programmatic aspects that are developmentally unique or appropriate for students of different ages, insofar as they target specific elements of mindfulness or mindfulness competencies (Roeser et al., 2022b).
The recommendations are also intended to support the navigation of the multifaceted nature of implementation in complex social settings like schools. These recommendations nudge researchers toward reporting on contextual elements at both the school and broader levels, facilitating a more ecological approach to reporting (Roeser et al., 2021). Researchers are encouraged to collect as much school-level data as might be relevant to their study, detailing data on representativeness and documenting school characteristics that might pertain to implementation (e.g., principal support, teacher attitudes and readiness). Additionally, gathering school, classroom, and community-level data can support greater understanding of implementation and the factors driving program outcomes. For instance, researchers can collect quantitative and qualitative data to report on factors like relational trust, which have been shown to affect program implementation (Mischenko et al., 2022).
Finally, these recommendations are designed to be adaptable and flexible based on the study design, the researcher’s disciplinary foundation, and the dissemination outlet (e.g., journal publication). The intention guiding these recommendations is for researchers to engage in more consistent and complete reporting of implementation elements in the publication of SBMP studies—helping to eventually determine which elements are in fact “core” to the implementation and efficacy of programs with certain populations and in particular settings.

Discussion

We developed a set of implementation reporting recommendations for SBMPs to provide more consistent and complete documentation of both the implementation design (what was intended) and the implementation delivery (what actually happened), as well as the factors that impact delivery of SBMPs. The development of reporting recommendations can support greater consensus about the Core Components (CCs) of SBMPs, improve assessment of whether a program was implemented well enough to anticipate relevant participant outcomes (Gould et al., 2016), and inform for whom and under what conditions SBMPs work. In the recent My Resilience in Adolescence (MYRIAD) trial, the largest randomized controlled trial of an SBMP, researchers explored a range of implementation factors and analyzed their impact on program outcomes (e.g., setting a priori benchmarks related to student dosage and exploring the effects of fidelity, quality, and student practice on program outcomes) (Montero-Marin et al., 2022). Collecting this type of data can help researchers draw conclusions about the factors impacting and/or driving program outcomes. Studies like this one help to increase understanding of implementation in the field of SBMPs; however, without reporting recommendations and consensus on a priori benchmarks for elements like QOI or practice, reporting will remain non-systematic and inconsistent. Thus, in developing these reporting recommendations, the intention is that researchers in the field of SBMPs will be more likely to attend to a wide range of implementation elements in the design, implementation, and reporting stages of research, and will do so in a consistent manner, an effort that we hope will be reinforced through heightened journal expectations related to reporting.
Practitioners and policymakers can also gain further insight about these programs, as they will be equipped with more information about implementation and the program itself, which can then inform decisions about selecting and implementing SBMPs for specific contexts and sets of participants.
Many of these reporting recommendations address aspects of implementation reporting that went unaddressed in the early years of the study of SBMPs. As such, much more attention to and reporting on teacher training and competence, participant characteristics, responsiveness and uptake of mindfulness practices, and the school and broader level contexts is necessary for understanding findings and informing future research—reporting that involves refinement, consensus, and standardization of the measures used to assess these implementation elements. Furthermore, significantly more developmental consideration is needed when it comes to program adoption and implementation, as well as measurement of mindfulness and related constructs (see Roeser et al., 2022b).
With regard to measurement, following these recommendations and engaging in more detailed implementation reporting will likely require the development of new measures to capture a broader array of implementation elements. As such, we recommend starting with the development of a clear theory of change that incorporates implementation, as this focus will influence what is measured and tested. To guide this process, researchers of SBMPs might find it helpful to utilize the CORE process model (Gould et al., 2016) as a resource for supporting rigorous measure development from feasibility studies to full program scale-up efforts. Additionally, interdisciplinary collaborations with researchers from clinical, educational, and developmental science backgrounds may afford strengths for developing Common Implementation Measures that attend to the contextual features of life in schools and can be used across studies of different SBMPs (Gould et al., 2016). One such example of a Common Implementation Measure is the TMEOS (Broderick et al., 2019)—a measure that has been used to assess quality of mindfulness instruction and teacher competency in K-12 classroom settings.
It may seem daunting to collect the contextual data outlined here, especially for those unfamiliar with school contexts. So, we recommend collecting and reporting on these data as an ideal to be attained, recognizing that schools offer varying ranges of data and data access. For example, demographic and attendance data are typically collected by schools and easily obtainable. Other characteristics and data may require additional partnership and collaboration with participating school systems. Building trusting relationships with these school systems can facilitate data access and collection, bolstering both the scientific validity of the outcomes reported and the contextual relevance (Bryk et al., 2011).
These reporting recommendations can support the study of SBMPs in underserved and marginalized communities—an often under-studied context in SBMP research—as they encourage more detailed reporting on participant characteristics and school contexts. Recent reviews of studies of MBPs with adults and SBMPs with youth have shown a lack of focus on and representation of participants from underserved and marginalized populations (Eichel et al., 2021; Roeser et al., 2022a). To remedy this, we see more complete reporting and attention to the collection of participant voices and data (e.g., participant engagement and receptivity to programs, data on potential harms, attention to identity, agency, and belonging) as important avenues for understanding the reception of programs and for identifying ways to adapt them to be more sensitive and responsive to the diverse needs of marginalized communities. Careful articulation of specific implementation elements can shed light on the effects of SBMPs for different groups of students and various school contexts (e.g., Luthar et al., 2020). For instance, it could be that certain implementation elements drive effects in more affluent and privileged communities, while other elements may influence implementation and outcomes for underserved and marginalized communities. Taking these steps, the field of SBMP research has the potential to become more "transformative" in its approach—helping to interrupt the "reproduction of inequitable educational environments" (Rivas-Drake et al., 2021, p. 1).
With regard to data collection and analysis, our recommendations offer guidance on reporting for both quantitative and/or qualitative studies. Regardless of the research methodology, researchers should describe any adaptations to their data collection strategies in response to evolving findings, contextual realities, or their study rationale. Furthermore, we recommend that data not systematically analyzed also be described (e.g., field notes, researcher reflexive notes), highlighting how these data might have informed the implementation of the SBMP, an understanding of the context, or subsequent data collection. It is also important to note that certain recommendations outlined here are important to report on but may be best suited for other research or other reporting outlets (e.g., a focus on program costs and potential cost-effectiveness).
Although not stated as a standalone item in the recommendations, transparency is essential in study reports. We do explicitly recommend that researchers pre-register studies and analyses, as well as report null and/or adverse findings. Researchers should also be clear about their perspectives and potential biases, which might impact the study and its procedures. It can be helpful to detail the researchers' own backgrounds, describing their perspectives and experiences, especially as related to SBMPs. Acknowledgement of these perspectives and experiences is vital for replication efforts, as is the acknowledgement of any biases and a description of how they were managed. Finally, a short description of the relation between the researchers and participants can inform future study efforts and the interpretations of study findings.
There are limitations to our approach. For one, the review of reporting tools in other behavioral science fields and the review of the SBMP literature were not exhaustive. As such, there are potential resources and approaches that we did not explore. Secondly, the set of recommendations offered may not be fully comprehensive (e.g., certain methodological considerations might not have been included; see Roeser et al., 2022b). Thirdly, this work centered around SBMPs, which we defined (based on conventions in the broader field of mindfulness research) as any program implemented within a school context (PreK-12) with mindfulness and/or contemplative principles and practices as a central feature. However, this is a poorly operationalized definition of what actually constitutes an SBMP, as it includes any programming that self-identifies as teaching mindfulness and/or contemplative principles or practices despite significant heterogeneity in the actual programming (Felver et al., 2016). Similar to past efforts that brought operational specificity to the field of SEL programming research (Weissberg et al., 2015), future research should carefully consider identified constituent domains and core mindfulness competencies of MBPs for youth (Felver et al., 2022) to support understanding of the effects and processes of SBMPs. Finally, the recommendations have yet to be pilot tested, which limits the claims that can be made regarding their simplicity, practicality, and usability. One future research effort might involve a consensus-building effort with a larger audience of peers who are experts in the fields of SBMPs and implementation (i.e., a Delphi study) to adapt and solidify these recommendations into a set of standards, building off the work of Felver et al. (2022). We hope studies will adopt these reporting recommendations to generate new learnings about implementation of SBMPs and identify advantages and disadvantages of using the recommendations.
With the proliferation of SBMPs, more consistent and complete reporting of implementation is needed to strengthen the evidence base and determine the Core Components (i.e., of the program and the surrounding support system), as well as discern for whom and under what conditions these programs work (Roeser et al., 2022a). The set of recommendations offered here is the first proposed in the field of SBMPs. As such, the recommendations are novel and bold. They are designed to be usable and flexible with the intent of supporting researchers from a wide range of disciplines and their myriad study designs and aims. Finally, they are meant to be practical, equipping practitioners and policymakers with more information about SBMPs to inform both their program selection decisions and overall approaches to implementation based on specific contexts and sets of participants.

Declarations

IRB Ethical Approval

The manuscript does not contain clinical studies or patient data. Therefore, we did not need to obtain ethics approval.

Informed Consent

The manuscript does not contain clinical studies or patient data. Therefore, informed consent was not collected.

Conflict of Interest

The authors declare no competing interests.

Disclaimer

The authors are responsible for the review of the literature, the development of the conceptual framework and reporting recommendations, as well as the decision to submit this manuscript for publication. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Robert Wood Johnson Foundation.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References
go back to reference American Educational Research Association. (2006). Standards for reporting on empirical social science research in AERA publications. Educational Researcher, 35(6), 33–40.CrossRef American Educational Research Association. (2006). Standards for reporting on empirical social science research in AERA publications. Educational Researcher, 35(6), 33–40.CrossRef
go back to reference Appelbaum, M., Cooper, H., Kline, R. B., Mayo-Wilson, E., Nezu, A. M., & Rao, S. M. (2018). Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73(1), 3. https://doi.org/10.1037/amp0000191CrossRefPubMed Appelbaum, M., Cooper, H., Kline, R. B., Mayo-Wilson, E., Nezu, A. M., & Rao, S. M. (2018). Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73(1), 3. https://​doi.​org/​10.​1037/​amp0000191CrossRefPubMed
go back to reference Brandsma, R. (2017). The mindfulness teaching guide: Essential skills and competencies for teaching mindfulness-based interventions. New Harbinger Publications. Brandsma, R. (2017). The mindfulness teaching guide: Essential skills and competencies for teaching mindfulness-based interventions. New Harbinger Publications.
go back to reference Broderick, P. C., Frank, J. L., Berrena, E., Schussler, D. L., Kohler, K., Mitra, J., Khan, L., Levitan, J., Mahfouz, J., Shields, L., & Greenberg, M. T. (2019). Evaluating the quality of mindfulness instruction delivered in school settings: Development and validation of a teacher quality observational rating scale. Mindfulness, 10(1), 36–45. https://doi.org/10.1007/s12671-018-0944-xCrossRef Broderick, P. C., Frank, J. L., Berrena, E., Schussler, D. L., Kohler, K., Mitra, J., Khan, L., Levitan, J., Mahfouz, J., Shields, L., & Greenberg, M. T. (2019). Evaluating the quality of mindfulness instruction delivered in school settings: Development and validation of a teacher quality observational rating scale. Mindfulness, 10(1), 36–45. https://​doi.​org/​10.​1007/​s12671-018-0944-xCrossRef
go back to reference Bryk, A. S., Gomez, L. M., & Grunow, A. (2011). Getting ideas into action: Building networked improvement communities in education. In M. T. Hallinan (Ed.), Frontiers in sociology of education (pp. 127–162). Springer.CrossRef Bryk, A. S., Gomez, L. M., & Grunow, A. (2011). Getting ideas into action: Building networked improvement communities in education. In M. T. Hallinan (Ed.), Frontiers in sociology of education (pp. 127–162). Springer.CrossRef
go back to reference Crane, C., Ganguli, P., Ball, S., Taylor, L., Blakemore, S. J., Byford, S., ... & Williams, J. M. G. (2020). Training school teachers to deliver a mindfulness program: Exploring scalability, acceptability, effectiveness, and cost-effectiveness. Global Advances in Health and Medicine, 9, 1–15. https://doi.org/10.1177/2164956120964738 Crane, C., Ganguli, P., Ball, S., Taylor, L., Blakemore, S. J., Byford, S., ... & Williams, J. M. G. (2020). Training school teachers to deliver a mindfulness program: Exploring scalability, acceptability, effectiveness, and cost-effectiveness. Global Advances in Health and Medicine, 9, 1–15. https://​doi.​org/​10.​1177/​2164956120964738​
go back to reference Dariotis, J. K., Mirabal-Beltran, R., Cluxton-Keller, F., Feagans Gould, L., Greenberg, M. T., & Mendelson, T. (2017). A qualitative exploration of implementation factors in a school-based mindfulness and yoga program: Lessons learned from students and teachers. Psychology in the Schools, 54(1), 53–69. https://doi.org/10.1002/pits.21979CrossRefPubMed Dariotis, J. K., Mirabal-Beltran, R., Cluxton-Keller, F., Feagans Gould, L., Greenberg, M. T., & Mendelson, T. (2017). A qualitative exploration of implementation factors in a school-based mindfulness and yoga program: Lessons learned from students and teachers. Psychology in the Schools, 54(1), 53–69. https://​doi.​org/​10.​1002/​pits.​21979CrossRefPubMed
go back to reference Dymnicki, A., Trivits, L., Hoffman, C., & Osher, D. (2020). Advancing the use of core components of effective programs: Suggestions for researchers publishing evaluation results. US Department of Health and Human Services: Office of Assistant Secretary for Planning and Evaluation. Dymnicki, A., Trivits, L., Hoffman, C., & Osher, D. (2020). Advancing the use of core components of effective programs: Suggestions for researchers publishing evaluation results. US Department of Health and Human Services: Office of Assistant Secretary for Planning and Evaluation.
go back to reference Eichel, K., Gawande, R., Acabchuk, R. L., Palitsky, R., Chau, S., Pham, A., Cheaito, A., Yam, D., Lipsky, J., Dumais, T., Zhu, Z., King, J., Fulwiler, C., Schuman-Olivier, Z., Moitra, E., Proulx, J., Alejandre-Lara, A., & Britton, W. (2021). A retrospective systematic review of diversity variables in mindfulness research, 2000–2016. Mindfulness, 12(11), 2573–2592. https://doi.org/10.1007/s12671-021-01715-4CrossRef Eichel, K., Gawande, R., Acabchuk, R. L., Palitsky, R., Chau, S., Pham, A., Cheaito, A., Yam, D., Lipsky, J., Dumais, T., Zhu, Z., King, J., Fulwiler, C., Schuman-Olivier, Z., Moitra, E., Proulx, J., Alejandre-Lara, A., & Britton, W. (2021). A retrospective systematic review of diversity variables in mindfulness research, 2000–2016. Mindfulness, 12(11), 2573–2592. https://​doi.​org/​10.​1007/​s12671-021-01715-4CrossRef
go back to reference Felver, J. C., Cary, E. L., Helminen, E. C., Schutt, M. K., Gould, L. F., Greenberg, M. T., Roeser, R. W., Baelen, R. N., & Schussler, D.L. (2022). Core program components of mindfulness-based programs for youth: Delphi approach consensus outcomes. Mindfulness. Advance of Print. Felver, J. C., Cary, E. L., Helminen, E. C., Schutt, M. K., Gould, L. F., Greenberg, M. T., Roeser, R. W., Baelen, R. N., & Schussler, D.L. (2022). Core program components of mindfulness-based programs for youth: Delphi approach consensus outcomes. Mindfulness. Advance of Print.
go back to reference Gould, L. F., Mendelson, T., Dariotis, J. K., Ancona, M., Smith, A. S. R., Gonzalez, A. A., Smith, A. A., & Greenberg, M. T. (2014). Assessing fidelity of core components in a mindfulness and yoga intervention for urban youth: Applying the CORE process. New Directions for Youth Development, 142, 59–81. https://doi-org.proxy.cc.uic.edu/10.1002/yd.20097 Gould, L. F., Mendelson, T., Dariotis, J. K., Ancona, M., Smith, A. S. R., Gonzalez, A. A., Smith, A. A., & Greenberg, M. T. (2014). Assessing fidelity of core components in a mindfulness and yoga intervention for urban youth: Applying the CORE process. New Directions for Youth Development, 142, 59–81. https://​doi-org.​proxy.​cc.​uic.​edu/​10.​1002/​yd.​20097
go back to reference Hoffmann, T. C., Glasziou, P. P., Boutron, I., Milne, R., Perera, R., Moher, D., Altman, D. G., Barbour, V., Macdonald, H., Johnston, M., Kadoorie, S. E. L., Dixon-Woods, M., McCulloch, P., Wyatt, J. C., Chan, A.-W., & Michie, S. (2014). Better reporting of interventions: Template for intervention description and replication (TIDieR) checklist and guide. BMJ, 348. https://doi.org/10.1136/bmj.g1687 Hoffmann, T. C., Glasziou, P. P., Boutron, I., Milne, R., Perera, R., Moher, D., Altman, D. G., Barbour, V., Macdonald, H., Johnston, M., Kadoorie, S. E. L., Dixon-Woods, M., McCulloch, P., Wyatt, J. C., Chan, A.-W., & Michie, S. (2014). Better reporting of interventions: Template for intervention description and replication (TIDieR) checklist and guide. BMJ, 348. https://​doi.​org/​10.​1136/​bmj.​g1687
Jennings, P. A., Frank, J. L., Snowberg, K. E., Coccia, M. A., & Greenberg, M. T. (2013). Improving classroom learning environments by Cultivating Awareness and Resilience in Education (CARE): Results of a randomized controlled trial. School Psychology Quarterly, 28(4), 374–390. https://doi.org/10.1037/spq0000035
Levitt, H. M., Bamberg, M., Creswell, J. W., Frost, D. M., Josselson, R., & Suárez-Orozco, C. (2018). Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73(1), 26. https://doi.org/10.1037/amp0000151
Meixner, T., Irwin, A., Wolfe Miscio, M., Cox, M., Woon, S., McKeough, T., & Milligan, K. (2019). Delivery of Integra Mindfulness Martial Arts in the secondary school setting: Factors that support successful implementation and strategies for navigating implementation challenges. School Mental Health, 11(3), 549–561.
Mischenko, P. P., Nicholas-Hoff, P., Schussler, D. L., Iwu, J., & Jennings, P. A. (2022). Implementation barriers and facilitators of a mindfulness-based social emotional learning program and the role of relational trust: A qualitative study. Psychology in the Schools, 59(8), 1643–1671. https://doi.org/10.1002/pits.22724
Montero-Marin, J., Allwood, M., Ball, S., Crane, C., De Wilde, K., Hinze, V., Jones, B., Lord, L., Nuthall, E., Raja, A., Taylor, L., Tudor, K., & MYRIAD Team. (2022). School-based mindfulness training in early adolescence: What works, for whom and how in the MYRIAD trial? Evidence-Based Mental Health, 25(3), 117–124. https://doi.org/10.1136/ebmental-2022-300439
Pinnock, H., Barwick, M., Carpenter, C. R., Eldridge, S., Grandes, G., Griffiths, C. J., Rycroft-Malone, J., Meissner, P., Murray, E., Patel, A., Sheikh, A., & Taylor, S. J. C. (2017). Standards for reporting implementation studies (StaRI) statement. BMJ, 356. https://doi.org/10.1136/bmj.i6795
Rempel, K. (2012). Mindfulness for children and youth: A review of the literature with an argument for school-based implementation. Canadian Journal of Counselling and Psychotherapy, 46(3).
Rivas-Drake, D., Rosario-Ramos, E., McGovern, G., & Jagers, R. J. (2021). Rising up together: Spotlighting transformative SEL in practice with Latinx youth. CASEL.
Roeser, R. W., Mashburn, A. J., Skinner, E. A., Choles, J. R., Taylor, C., Rickert, N. P., Pinela, C., Robbeloth, J., Saxton, E., Weiss, E., Cullen, M., & Sorenson, J. (2021). Mindfulness training improves middle school teachers’ occupational health, well-being, and interactions with students in their most stressful classrooms. Journal of Educational Psychology, 114(2), 408–425. https://doi.org/10.1037/edu0000675
Roeser, R. W., Galla, B. M., & Baelen, R. N. (2022a). Mindfulness in schools: Evidence on the impacts of school-based mindfulness programs on student outcomes in P–12 educational settings. The Pennsylvania State University.
Roeser, R. W., Greenberg, M. T., Frazier, T., Galla, B. M., Semenov, A., & Warren, M. T. (2022b). Beyond all splits: Envisioning the next generation of science on mindfulness and compassion in schools for students. Mindfulness. Advance online publication.
Roeser, R. W. (2016). Processes of teaching, learning, and transfer in Mindfulness-Based Interventions (MBIs) for teachers: A contemplative educational perspective. In K. A. Schonert-Reichl & R. W. Roeser (Eds.), Handbook of mindfulness in education: Integrating theory and research into practice (pp. 149–170). Springer. https://doi.org/10.1007/978-1-4939-3506-2_10
Tudor, K., Maloney, S., Raja, A., Baer, R., Blakemore, S., Byford, S., Crane, C., Dalgleish, T., De Wilde, K., Ford, T., Greenberg, M., Hinze, V., Lord, L., Radley, L., Opaleye, E. S., Taylor, L., Ukoumunne, O. C., Viner, R., MYRIAD Team, … Montero-Marin, J. (2022). Universal mindfulness training in schools for adolescents: A scoping review and conceptual model of moderators, mediators, and implementation factors. Prevention Science, 23, 934–953. https://doi.org/10.1007/s11121-022-01361-9
Weissberg, R. P., Durlak, J. A., Domitrovich, C. E., & Gullotta, T. P. (2015). Social and emotional learning: Past, present, and future. In J. A. Durlak, C. E. Domitrovich, R. P. Weissberg, & T. P. Gullotta (Eds.), Handbook of social and emotional learning: Research and practice (pp. 3–19). The Guilford Press.
Metadata
Title
Implementation Reporting Recommendations for School-Based Mindfulness Programs
Authors
Rebecca N. Baelen
Laura F. Gould
Joshua C. Felver
Deborah L. Schussler
Mark T. Greenberg
Publication date
18-10-2022
Publisher
Springer US
Published in
Mindfulness / Issue 2/2023
Print ISSN: 1868-8527
Elektronisch ISSN: 1868-8535
DOI
https://doi.org/10.1007/s12671-022-01997-2
