Pediatric traumatic brain injury (TBI) is a highly prevalent problem known to result in a range of cognitive impairments (Crowe, Catroppa, & Anderson, 2015). In addition, childhood brain injury has been linked to socioemotional problems, the neural and cognitive substrates of which remain underexplored despite evidence suggesting that social and affective difficulties worsen with time since injury (Anderson, Brown, Newitt, & Hoile, 2009; Beauchamp, Dooley, & Anderson, 2010; León-Carrión & Ramos, 2003; Muscara, Catroppa, Eren, & Anderson, 2009). In fact, even mild TBI (mTBI), the most prevalent form, accounting for 90 % of all TBI, is known to increase the risk of psychological and psychiatric illness (Massagli et al., 2004) and of difficulties in the sociocognitive skills underlying appropriate social behaviors (Beauchamp, Dooley, & Anderson, 2013). Moreover, brain injury is sustained most frequently before the age of 5 years (McKinlay et al., 2008), which is also potentially the most damaging period in which to sustain one (Anderson, Spencer-Smith, et al., 2009). Yet, apart from a recent study showing that mTBI in preschool children is associated with theory of mind difficulties (Bellerose, Bernier, Beaudoin, Gravel, & Beauchamp, 2015), almost nothing is known about the first manifestations of socioemotional difficulties after early mTBI.

Among socioemotional abilities, emotional facial expression (EFE) decoding constitutes a key adaptive function that allows the establishment of appropriate interpersonal relationships by enabling rapid decoding of the affective state of others (Adolphs, Tranel, & Damasio, 2003). This skill therefore has a crucial role in social interactions (Frith, 2009) and is considered to be at the heart of socialization processes (de Haan, Belsky, Reid, Volein, & Johnson, 2004; Malatesta-Magai et al., 1994; Todd, Lewis, Meusel, & Zelazo, 2008). EFE decoding abilities arise in the first months of life (e.g., Bornstein & Arterberry, 2003; Kobiella, Grossmann, Reid, & Striano, 2008; Serrano, Iglesias, & Loeches, 1992) and continue to develop throughout childhood and adolescence (Gao & Maurer, 2010; see Herba & Phillips, 2004, for a review). The preschool period is particularly critical to the establishment of sound EFE decoding, especially as more complex social communication emerges and children begin to engage in meaningful social interactions, notably with their parents (Izard, Fine, Mostow, Trentacosta, & Campbell, 2002). Accordingly, studies have shown that EFE decoding is related to social competence during the preschool years (Philippot & Feldman, 1990; Walden & Field, 1990) and predicts social behavior during later childhood (Izard et al., 2001). Early impairments in EFE perception could therefore lead to serious behavioral and social maladjustment (e.g., Izard et al., 2001). In spite of these findings, it remains unknown whether early childhood mTBI has an impact on EFE processing.

In this context, event-related potentials (ERPs) constitute an ideal technique for investigating EFE processing at a young age. ERPs provide a noninvasive measure of the electrical brain activity associated with cognitive functioning. Despite having lower spatial resolution than other neuroimaging techniques such as fMRI, the ERP technique, owing to its high temporal resolution, provides information about the time course of the cerebral activations related to successive stages of face processing (Dekowska, Kuniecki, & Jaskowski, 2008; Olivares, Iglesias, Saavedra, Trujillo-Barreto, & Valdes-Sosa, 2015; Rugg & Coles, 1995). As a result, measuring ERPs while individuals are presented with emotionally laden pictures permits the investigation of the real-time neurocognitive processes that occur during the perception of EFE (Kobiella et al., 2008). A further strength of the ERP technique is that it is well suited to experimentation with young children, as recordings can be obtained passively (i.e., without requiring a behavioral response from the participant).

Studies of face processing in older children and adults consistently identify a number of underlying ERP components (for reviews, see de Haan, Johnson, & Halit, 2003; Olivares et al., 2015; Rossion, 2014; Taylor, Batty, & Itier, 2004). First, P1 is a positive electrophysiological potential that peaks around 100 ms after the onset of a visual stimulus at occipital-temporal sites and can be observed in individuals of all ages (e.g., Batty & Taylor, 2003, 2006). The P1 component is associated with visuospatial processing (Hopf & Mangun, 2000; Taylor, McCarthy, Saliba, & Degiovanni, 1999) and may also index an early stage of face processing, reflecting the detection of configural aspects associated with the presence of faces (Olivares et al., 2015). Second, the N170 is a negative potential elicited at occipitotemporal sites that peaks around 170 ms after stimulus onset in adults. This component is more clearly associated with the detection of faces and their defining features than P1 (Olivares et al., 2015). The N170 is thus considered the “face-specific component” (Bentin, Allison, Puce, Perez, & McCarthy, 1996; George, Evans, Fiori, Davidoff, & Renault, 1996; Rossion & Jacques, 2008), reflecting the structural encoding of faces (e.g., Sagiv & Bentin, 2001) and therefore a visual expertise stage of information processing. Numerous studies have reported observing the N170 component in preschoolers, sometimes with a latency extending up to 300 ms (e.g., Batty & Taylor, 2006; Cicchetti & Curtis, 2005; Curtis & Cicchetti, 2011; Dawson, Webb, Carver, Panagiotides, & McPartland, 2004; Kuefner, de Heering, Jacques, Palmero-Soler, & Rossion, 2010; Parker, Nelson, & Bucharest Early Intervention Project Core, 2005; Peykarjou, Westerlund, Cassia, Kuefner, & Nelson, 2013; Taylor et al., 1999). 
Finally, the N250r and N400 are later electrophysiological components observed during face processing in adults and are associated with long-term face representations (Olivares et al., 2015). The N250r is thought to index familiarity processing and appears as a negative deflection around 250 ms at posterior temporal sites (e.g., Schweinberger, Pickering, Burton, & Kaufmann, 2002). To the best of our knowledge, this component has not previously been observed in preschoolers (but see Kuefner et al., 2010). In contrast, the Nc is a frontocentral negative deflection around 200–700 ms that has been consistently identified in preschoolers (Batty & Taylor, 2006; Carver et al., 2003; Cicchetti & Curtis, 2005; Curtis & Cicchetti, 2011; Dawson et al., 2002; Dawson et al., 2004; Nelson & Nugent, 1990; Parker et al., 2005; Todd et al., 2008) and may be equivalent to the N400 observed in adults in response to meaningful stimuli, including faces (Bentin & McCarthy, 1994). The Nc is thought to reflect young infants’ attentional allocation and/or recognition memory (de Haan et al., 2003), and there is evidence that this component is sensitive to facial familiarity in preschoolers (Dawson et al., 2002).

A limited number of ERP studies have investigated EFE processing in preschoolers and have led to inconsistent findings concerning emotional modulations of the ERP components described above. Some studies have found higher P1 amplitude for emotional faces as compared to neutral faces (happy faces in 3.5-year-old children, Curtis & Cicchetti, 2011; fearful faces in 3–4-year-old children, Dawson et al., 2004; fearful faces in 3–8-year-old children, Vlamings, Jonkman, & Kemner, 2010), while others have not (in 4–15-year-old children, Batty & Taylor, 2006; in 4–6-year-old children, Todd et al., 2008). Moreover, some studies have found that P1 latency was shorter for fearful faces as compared to neutral faces (Dawson et al., 2004) as well as for happy faces as compared to negative emotions (fear, disgust, and sadness in 4–7-year-old children only, Batty & Taylor, 2006), and delayed for fear as compared to positive emotions (happiness and surprise, Batty & Taylor, 2006), while others did not observe any emotional effect on P1 latency. Dawson et al. (2004) reported that healthy 3–4-year-old children exhibited a faster and larger N300 (which can be considered an N170) in response to fearful faces compared with neutral faces, whereas Parker et al. (2005) found that children (ages 8–32 months) showed greater N170 amplitude in response to sadness relative to fear. In contrast, several other studies did not find any emotional effect on N170 amplitude or latency (Batty & Taylor, 2006; Curtis & Cicchetti, 2011; Todd et al., 2008). Finally, whereas Cicchetti and Curtis (2005) observed a greater Nc amplitude for happy faces compared to angry and neutral faces in 2.5-year-old children, Curtis and Cicchetti (2011) found that both angry and happy faces elicited a greater Nc response than neutral faces in 3.5-year-old children. 
Conversely, other authors reported either greater Nc amplitude for negative faces as compared with positive faces (in 4–15-year-old children, Batty & Taylor, 2006; in 4–6-year-old children, Nelson & Nugent, 1990) or no emotional effect (in 3–4-year-old children, Dawson et al., 2004; in 4–6-year-old children, Todd et al., 2008). In summary, these findings suggest that emotional effects may be found all along the face-processing stream, and even if no firm conclusions can be drawn from these studies given the inconsistencies in the results, it appears that EFE decoding in preschoolers may occur as early as the perceptual stage.

This study aimed to investigate whether mTBI sustained during the preschool period alters the electrophysiological correlates of EFE perception. To this end, electroencephalograms were recorded while children were presented with photographs of faces expressing happiness, anger, or no emotion (neutral). No behavioral response was required, but participants were instructed to decode the emotion expressed by the photographed faces. Given the total absence of data regarding EFE decoding difficulties in preschoolers who have sustained mTBI and the inconsistencies in the preschool ERP literature regarding this process, we chose to use facial stimuli displaying one positive emotion (happiness), one negative emotion (anger), and no emotion (neutral) in order to test for (1) overall emotional effects, which we labeled “arousal effects,” when significant differences were observed for both happy and angry faces as compared with neutral faces; and (2) valence effects, when significant differences were observed between angry faces and happy faces. This strategy was chosen because, even though children between 2 and 3 years of age appear able to correctly label some EFE (such as happiness, anger, and sadness), some authors argue that the structure of emotions in preschoolers is a two-dimensional space composed of arousal and valence and that discrete emotion labels may not be readily available (Russell & Bullock, 1985; Widen, 2013). Accordingly, the main objective of this study was to determine whether arousal and valence effects differ between preschoolers who have sustained mTBI and healthy controls. If so, several hypotheses can be put forward. 
Impaired EFE processing in preschoolers with mTBI, as compared with healthy controls, could be associated with (1) reduced or absent arousal and/or valence effects on the latency and/or amplitude of ERP components, which would suggest disrupted EFE processing at these particular steps of information processing; and/or (2) different temporal patterns of arousal and/or valence effects, with healthy preschoolers showing emotional effects at an earlier stage of information processing than preschoolers with mTBI. This second pattern would suggest that electrophysiological deficits are, at least partly, compensated for by an alternative strategy. For instance, a specific emotional effect observed on the P1 component (but not on the N170 component) in healthy preschoolers might be observed on the N170 component (but not on the P1 component) in preschoolers with mTBI. In this particular example, the lack of EFE processing at the visuospatial stage (indexed by the P1 component) would be compensated for at the visual expertise stage.

Method

Participants and behavioral measures

The study protocol was reviewed and approved by the ethics committee of the Sainte-Justine Hospital Research Centre and was conducted in accordance with the Declaration of Helsinki. Participants were recruited in the emergency department as part of a larger prospective longitudinal cohort study investigating cognitive and social outcomes of early TBI at 6 and 12 months postinjury. In adolescents and adults, mTBI has been defined and classified based on the presence of confusion, loss of consciousness, headache, or decrease in memory (Hayden, Jandial, Duenas, Mahajan, & Levy, 2007). However, given that these symptoms cannot be reliably assessed in young children, there is no clear definition in the medical literature of what constitutes mTBI in preschool children (Greenes, 2003). Moreover, studies report that loss of consciousness has a sensitivity ranging between 45 % and 68 % in older children, but of only 3 % in toddlers/preschoolers (Bonadio, 1989; Dietrich, Bowman, Ginn-Pease, Kosnik, & King, 1993; Greenes & Schutzman, 1999). In light of this, we considered the full range of mild injury severity and used the definition reported by Osmond et al. (2010) in the CATCH study, which developed a clinical decision rule for children sustaining mTBI. Accordingly, we included all children with suspected accidental mTBI, defined by a history of trauma or acceleration–deceleration movement applied to the head, a Glasgow Coma Scale (GCS) score of 13 to 15 at the emergency department (Ewing-Cobbs et al., 1997; Greenes & Schutzman, 1999; Kuppermann, 2008; Quayle et al., 1997; Schutzman et al., 2001), and at least one of the following: history of loss of consciousness, excessive irritability according to the parents, persistent vomiting (two or more times), history of confusion, amnesia, or worsening headache in a verbal child.

To limit confounds related to other pathologies, multiple injuries, language, and suspected abusive family environments (Duhaime et al., 1987; Keenan, 2008; Keenan, Hooper, Wetherington, Nocera, & Runyan, 2007; Keenan, Runyan, Marshall, Nocera, & Merten, 2004), exclusion criteria for the two groups were (1) known premorbid congenital, neurologic, developmental, psychiatric, or metabolic disorder; (2) gestational age under 36 weeks; (3) history of prior TBI serious enough to warrant a visit to the emergency room; (4) focal neurological signs or multiple injuries necessitating surgery or a cast (e.g., polytrauma); (5) suspicion of inflicted TBI; and (6) insufficient mastery of either English or French to understand and complete the measures. Information for exclusion criteria 1–3 was gleaned from the standardized TBI medical history questionnaire developed by the international Common Data Elements (CDE) Working Group (NINDS; Miller, 2010), whereas criteria 4 and 5 were determined based on the child’s medical record.

At the time this study was conducted, the cohort consisted of 54 preschool children. One of the two parents completed an in-house developmental and demographic questionnaire including items related to developmental history, family constellation, parent education, and handedness. These items allow calculation of familial socioeconomic status using the Blishen Socioeconomic Index (Blishen, Carroll, & Moore, 1987). A socioeconomic score was attributed to each child according to caregiver occupation; for double-earner families, the higher of the two scores was used. Intellectual functioning and verbal ability were measured using the Bayley Scales of Infant and Toddler Development Cognitive and Language subscales (Bayley-III; Bayley, 2005) for children between the ages of 24 and 30 months, and the Wechsler Preschool and Primary Scale of Intelligence (WPPSI-III; Wechsler, 2002) for children 31 months and older. The Cognitive Composite of the Bayley-III and the Global Index of the WPPSI-III were used as measures of intellectual functioning, whereas the Bayley-III Language Composite and the WPPSI-III Verbal Index served as approximations of verbal ability. Percentile ranks were used to allow direct comparisons between assessment tools.

For the present study, the families of 16 children were not solicited because the child had sustained an mTBI before the age of 2 years (n = 8) or a moderate (n = 6) or severe (n = 2) TBI. Four children were not included because of missing data (socioeconomic status of parents, n = 2; intellectual functioning and verbal ability measures, n = 2; see Procedure and behavioral measures for further details on these measures). The remaining subset was invited to participate in the ERP study at 6 months postinjury. Informed consent was obtained from parents or legal caregivers before the experiment, following a full explanation of the procedures; as all participants were minors, consent for participation in the study was signed by a parent. The parents of 16 children with mTBI declined to let their child take part in the study. Eighteen preschool children who sustained mTBI were thus included (mTBI group; mean age 53 ± 8 months). Fifteen healthy controls (mean age 55 ± 11 months), recruited via advertisements in local daycares, also participated. Their demographic characteristics are presented in Table 1.

Table 1 Participant demographics and characteristics

Participants were invited to the laboratory at 6 months postinjury (or shortly after recruitment for controls) to participate in the ERP study (approximately 2 hours in total). Parents of participants completed the Griffith Empathy Measure (Dadds et al., 2008), a brief parent-report measure of children’s empathic skills. For each child, we calculated the affective empathy score, which has the advantage of being unrelated to verbal intelligence. This measure was used to control for the influence of emotional empathy on potential emotional modulations of ERP components: evidence suggests that adolescents who sustained TBI may have lower empathy levels than healthy controls (Beauchamp et al., 2013) and that people with high emotional empathy, compared with people with low emotional empathy, evaluate angry faces as expressing more anger and happy faces as being happier (Dimberg et al., 2011).

At the end of the study, families were compensated with $30.00 for their time. In addition, the children were given a certificate of completion and a toy for their involvement.

ERP stimuli

Faces (angry, neutral, and happy) were taken from the NimStim Set of Facial Expressions (Tottenham et al., 2009). For the purpose of this experiment, a pretest was carried out during which 20 adult participants (22–36 years) were asked to identify the EFE they perceived (open answer) within a selection of 258 faces expressing anger, happiness, or no emotion (neutral), and to report the certainty of their answer on a 0–10-point scale. Following this pretest, we selected 20 identities (10 men, 10 women) based on the following criteria: (1) the percentage of correct EFE identification across participants was greater than or equal to 70 %, (2) the median certainty was greater than or equal to 7, and (3) Criteria 1 and 2 were verified for each of the three facial expressions of the same identity (percentage of correct identification: angry, M = 91 %, SD = 6; neutral, M = 95 %, SD = 7; happy, M = 94 %, SD = 9; median certainty: angry, M = 7.7, SD = 0.8; neutral, M = 7.8, SD = 0.6; happy, M = 8.1, SD = 0.6). No significant difference was found between the sets of pictures on these two measures, F(2, 59) = 2.390, p = .101, for percentage of correct identification, and F(2, 59) = 1.403, p = .254, for median certainty. Of note, to avoid possible configural effects on early ERP components, all the selected faces had a globally similar configuration (e.g., eyes wide open).
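The identity-selection rule above (both thresholds must hold for all three expressions of the same identity) can be sketched as follows; the function name and array layout are illustrative, not part of the original protocol:

```python
import numpy as np

def select_identities(pct_correct, certainty, min_pct=70.0, min_cert=7.0):
    """Select stimulus identities for which every expression meets both
    pretest criteria. Inputs are (identities x expressions) arrays of
    percentage of correct identification and median certainty (0-10)."""
    ok = (pct_correct >= min_pct) & (certainty >= min_cert)
    return np.where(ok.all(axis=1))[0]  # identities passing for all 3 EFE
```

With these thresholds, an identity failing either criterion on any single expression is dropped entirely, which is what guarantees matched angry/neutral/happy versions of each retained face.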

Extrafacial characteristics (contours and hair) from the stimuli were retained, but to standardize them on the basis of emotion we proceeded as follows: for each identity, extrafacial characteristics from the neutral face formed the base for each of the three conditions; facial characteristics from the happy and angry faces were then extracted and pasted on this base in order to obtain three facial stimuli that differed only in terms of facial expressions (angry, neutral, happy; see Fig. 1). The mean luminance and color saturation levels (for red, green, and blue) were measured for each stimulus, and these values were adjusted for some pictures to maintain homogeneity between angry, neutral, and happy sets (Image J software, http://rsb.info.nih.gov/ij/; see Calvo & Lang, 2005). Separate analyses of variance (ANOVAs) performed on the mean luminance, the color saturation, and the standard deviation of luminance values (i.e., contrast indexes) did not reveal any significant difference between the three sets, all Fs(2, 59) < 0.312, ps > .698.

Fig. 1
figure 1

Examples of stimuli used in the task. The three EFE conditions (angry, neutral, and happy) are presented for the same identity (modified from the NimStim set of facial expressions; Tottenham et al., 2009)

ERP procedure

The recording sessions took place in a dark, soundproof Faraday room. Participants were seated on their parents’ lap in a comfortable chair. The task was administered using Presentation software (Version 16, Neurobehavioral Systems, Inc., Albany, CA). All stimuli (335 × 430 pixels; size: 8.7° × 10.5°) were presented on a 40.5 × 30.5 cm ViewSonic monitor (ViewSonic, Canada; resolution: 1280 × 1024 pixels; frame refresh rate of 60 Hz) at a distance of 70 cm from the child during electrophysiological recordings. Within one block, EEG was recorded across 60 trials (20 identities × 3 EFE). Each trial proceeded as follows (see Fig. 2): a black fixation cross was centrally presented on a gray background (R = 128, G = 128, B = 128) for a period that varied randomly from trial to trial between 1,000 and 2,000 ms and alerted the participant to the imminent arrival of the picture, which was projected for 500 ms; the gray background was then presented alone for 1,000 ms. This block was repeated three times (for a total of 180 trials, 60 per EFE category) with a break between blocks. Before the experiment, participants were familiarized with a small set of different EFE pictures. All participants were monitored through an infrared camera equipped with an integrated speaker. An assistant observed whether the participant looked at the screen at all times and gave a signal whenever the participant looked elsewhere, in order to exclude the associated EEG segments from analysis.
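The trial structure described above (jittered fixation, 500-ms face, 1,000-ms blank; 60 trials per block) can be summarized in a short sketch; the function and field names are hypothetical, not taken from the Presentation script:

```python
import random

def build_block(identities=20, emotions=("angry", "neutral", "happy"), seed=0):
    """Build one randomized block of 60 trials (20 identities x 3 EFE),
    each with a fixation duration jittered between 1,000 and 2,000 ms."""
    rng = random.Random(seed)
    trials = [
        {"identity": i, "emotion": e,
         "fixation_ms": rng.uniform(1000, 2000),  # jittered fixation cross
         "stimulus_ms": 500,                      # face presentation
         "blank_ms": 1000}                        # gray background
        for i in range(identities) for e in emotions
    ]
    rng.shuffle(trials)
    return trials

block = build_block()
```

The jittered fixation interval prevents the stimulus onset from becoming temporally predictable, which would otherwise contaminate the prestimulus baseline with anticipatory activity.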

Fig. 2
figure 2

Sequence of events in a trial. A trial begins with the presentation of a central fixation cross on a gray background for a period that varies randomly from trial to trial between 1,000 and 2,000 ms. This central fixation cross alerts the participant to the imminent arrival of a photographed face (modified from the NimStim set of facial expressions; Tottenham et al., 2009) that is projected for 500 ms. In this example, a photographed face expressing happiness is illustrated. After the offset of the picture, the gray background is presented alone for 1,000 ms

EEG recording and preprocessing

A high-density Geodesic Sensor Net system with 128 electrodes was used for recording (Electrical Geodesics System Inc., Eugene, OR, USA). Electrode impedances were considered acceptable if at or below 40 kΩ, and the vertex was used as the reference lead during acquisition (Tucker, 1993). The ground channel was located on the midline between Cz and Pz (CPz). EEG signals were collected using NetStation software (Version 4.4.1) on an Apple Mac Pro computer and synchronized with the onset of stimulus presentation. Blinks and vertical eye movements were registered by means of electrodes below and above the eyes, and horizontal movements were recorded from electrodes near the outer canthi of the eyes, all integrated in the sensor net. Digitization of EEG data was carried out continuously at a sampling rate of 250 Hz, and an analog 0.1–100 Hz bandpass filter was applied. EEG data were then digitally filtered with a 0.3 Hz first-order high-pass filter and exported using NetStation 4.4.1 analysis software (Electrical Geodesics Inc., Eugene, OR) for further off-line analyses with BrainVision Analyzer software (Version 2.02, Brain Products, Munich, Germany). Out of the original 128 electrodes, 38 electrodes, including the last row of posterior electrodes as well as electrodes around the ears and on the cheeks, were rejected by default in both groups. These electrodes tend to be particularly noisy because of muscle artifacts inherent to their locations at the underside of the head (Rousselet, Husk, Bennett, & Sekuler, 2008). The remaining electrodes provide satisfactory coverage of the scalp and include those corresponding to the international 10–10 electrode positions (see Luu & Ferree, 2005), notably the electrodes classically used to quantify the ERP components of interest in this study (at occipitotemporal sites for P1 and N170 and at frontocentral sites for Nc; e.g., Batty & Taylor, 2006). The EEG data were low-pass filtered at 30 Hz (Butterworth zero-phase filter, 12 dB/oct). 
An additional 60 Hz notch filter was applied. Ocular artifact removal was carried out through an independent component analysis (ICA)-based strategy (Jung et al., 2000). EEG data were then re-referenced to the average of all signals. EEG segments consisting of a 200-ms baseline and 800 ms following stimulus onset were constructed and baseline corrected on the basis of the mean activity during the 200 ms prior to stimulus onset. The EEG segments during which the participants were not focusing on the facial stimuli were removed. An analysis of variance performed on the proportions of EEG segments excluded (in the CTL group, angry: M = 2 %, SD = 4; neutral: M = 2 %, SD = 3; happy: M = 2 %, SD = 3; in the mTBI group, angry: M = 4 %, SD = 5; neutral: M = 4 %, SD = 4; happy: M = 4 %, SD = 6), with EFE (angry, neutral, happy) as a within-subjects factor and group (control, mTBI) as a between-subjects factor, did not reveal any significant main effect of group, F(1, 31) = 2.273, p = .142, or EFE, F(2, 62) = 0.757, p = .473, or a group-by-EFE interaction, F(2, 62) = 0.228, p = .797. Then, algorithmic rejection of segments with voltages exceeding ±200 μV was followed by visual inspection of the segmented data, during which segments with artifacts were manually rejected. Following this procedure, an analysis of variance performed on the number of remaining EEG segments (in the CTL group, angry: M = 54, SD = 7; neutral: M = 55, SD = 7; happy: M = 55, SD = 7; in the mTBI group, angry: M = 51, SD = 8; neutral: M = 52, SD = 6; happy: M = 51, SD = 8), with EFE (angry, neutral, happy) as a within-subjects factor and group (control, mTBI) as a between-subjects factor, did not reveal any significant main effect of group, F(1, 31) = 1.684, p = .204, or EFE, F(2, 62) = 0.427, p = .654, or a group-by-EFE interaction, F(2, 62) = 0.308, p = .736. Individual subject averages were constructed separately for each emotion.
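The segmentation, baseline-correction, and algorithmic rejection steps can be illustrated with a simplified numpy sketch. The actual pipeline used BrainVision Analyzer; the sampling rate, epoch bounds, and ±200 μV rule follow the text, while the data layout and function name are schematic:

```python
import numpy as np

def segment_and_reject(eeg, onsets, sfreq=250, baseline_s=0.2, post_s=0.8,
                       thresh_uv=200.0):
    """Segment continuous EEG (channels x samples, in microvolts) around
    stimulus onsets, baseline-correct each segment using the mean of the
    200-ms prestimulus interval, and reject any segment whose absolute
    voltage exceeds the threshold on any channel."""
    pre = int(baseline_s * sfreq)    # 50 samples at 250 Hz
    post = int(post_s * sfreq)       # 200 samples
    kept = []
    for onset in onsets:
        seg = eeg[:, onset - pre:onset + post]
        # Baseline correction: subtract mean prestimulus activity per channel
        seg = seg - seg[:, :pre].mean(axis=1, keepdims=True)
        if np.abs(seg).max() <= thresh_uv:   # algorithmic artifact rejection
            kept.append(seg)
    return (np.stack(kept) if kept
            else np.empty((0, eeg.shape[0], pre + post)))
```

Segments surviving this step would then be averaged per emotion and per participant, as described above (the manual inspection stage has no simple algorithmic counterpart).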

Data analysis

Spatial principal components analysis (sPCA) and definition of regions of interest (ROIs)

Given the limited ERP literature on EFE processing among preschoolers, we aimed to define ROIs using a data-driven approach. To this end, electrode groupings were carried out through a covariance-matrix-based sPCA, using the 90 electrode sites as dependent variables and time points (250), participants (33), and conditions (angry, neutral, happy) as observations (IBM SPSS Statistics for Windows, Version 22, Armonk, NY, IBM Corp.; Pourtois, Delplanque, Michel, & Vuilleumier, 2008). sPCA provides a division of the scalp into different regions, or spatial factors (SFs), and each SF can be visualized by topographic maps of the factor loadings (Cartool software Version 3.54, D. Brunet, http://www.fbmlab.com/cartool-software/). Each SF is formed from the electrode sites at which recordings tend to covary. For each SF, the factor loading attributed to each electrode site corresponds to the SF’s contribution to the original variables (i.e., how much the SF accounts for the voltage recorded at each electrode). SFs that explained at least 1 % of the total variance were selected and submitted to varimax rotation. Symmetrical (with respect to the midline) ROIs were created by clustering electrodes (1) whose loadings were higher than 0.5 or lower than −0.5 and (2) where ERP components could be clearly identified.

Identification and quantification of ERP components

On the basis of the ERP literature on EFE processing in preschoolers described in the introduction, as well as visual inspection of the grand-averaged data and the data from individual participants, we were interested in three ERP components: the P1, N170, and Nc were quantified between 90 and 200 ms, 150 and 300 ms, and 300 and 600 ms poststimulus onset, respectively. As visual inspection of waveforms suggested that potential differences at the N170 may be driven by differences at the preceding P1, a peak-to-trough subtraction was computed for N170 amplitude and a difference score was calculated for N170 latency (Peykarjou et al., 2013; see also, e.g., Hoehl, Wiese, & Striano, 2008; Kuefner et al., 2010; Pinal, Zurron, & Diaz, 2015). More precisely, for each individual, at each concerned ROI and in each EFE condition, adjusted values for the N170 amplitude and latency were calculated by subtracting P1 peak amplitude from N170 peak amplitude, and P1 peak latency from N170 peak latency. Finally, given the absence of clear peaks in the individual averages for the Nc, only mean amplitudes within the specified time window were analyzed for this component. Moreover, as visual inspection of waveforms suggested that potential differences at the Nc may be driven by differences at the preceding positive peak (positive counterpart of the N170 observed at the frontocentral level; e.g., Joyce & Rossion, 2005), adjusted values were again computed, this time by calculating differences with the mean activity within the 200–300-ms time window directly preceding the Nc.
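The peak-to-trough adjustment can be made concrete with a short sketch operating on one averaged waveform; the window bounds follow the text, and the function name is illustrative:

```python
import numpy as np

def adjusted_n170(erp, times):
    """Quantify P1 (positive peak, 90-200 ms) and N170 (negative peak,
    150-300 ms) from an averaged waveform, then return the peak-to-trough
    adjusted N170 amplitude and the latency difference score. `erp` holds
    the averaged voltages and `times` the matching latencies in ms."""
    def peak(t0, t1, polarity):
        win = (times >= t0) & (times <= t1)
        idx = np.argmax(polarity * erp[win])  # polarity -1 finds the minimum
        return erp[win][idx], times[win][idx]
    p1_amp, p1_lat = peak(90, 200, +1)
    n170_amp, n170_lat = peak(150, 300, -1)
    return {"n170_amp_adj": n170_amp - p1_amp,   # peak-to-trough amplitude
            "n170_lat_adj": n170_lat - p1_lat}   # latency difference score
```

Subtracting the P1 peak in this way removes variance at the N170 that merely carries over from the preceding positivity, which is the rationale given by Peykarjou et al. (2013).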

Analyses of experimental effects

The values used for analyses of experimental effects were (1) peak amplitudes and latencies for the P1; (2) adjusted peak amplitudes and latencies for the N170; and (3) adjusted mean amplitudes for the Nc. Effects of experimental conditions were analyzed with the GLM procedure (IBM SPSS Statistics for Windows, Version 22, Armonk, NY, IBM Corp.), with EFE (angry, neutral, happy) and hemisphere (left, right) as within-subjects factors and group (control, mTBI) as a between-subjects factor for the P1 and N170 analyses. The same procedure was used for the Nc, but with EFE (angry, neutral, happy), hemisphere (left, right), and ROI (frontal, central; see the Results section, sPCA, definition of ROIs, and quantification of ERP components) as within-subjects factors and group (control, mTBI) as a between-subjects factor. Linear and quadratic contrasts were used to evaluate valence (angry vs. happy) and arousal (angry and happy vs. neutral) effects, respectively. The significance level was set at p < .05 (two-tailed). All significant interactions were decomposed into simple effects, with a Bonferroni adjustment applied to the alpha level. Only statistically significant results are reported in the Results section. Finally, complementary correlational analyses were performed using Pearson’s correlations (corrected for multiple comparisons using the Bonferroni procedure) to explore the potential influence of demographic and behavioral variables on the experimental results.
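The valence and arousal contrasts reduce, for each participant, to a weighted sum of the three condition means tested against zero. The following is a simplified numpy illustration of that logic (the published analyses used the SPSS GLM procedure, which additionally models hemisphere, ROI, and group):

```python
import numpy as np

def contrast_t(data, weights):
    """One-sample t test on a within-subject contrast.
    `data`: participants x conditions (angry, neutral, happy);
    `weights`: zero-sum contrast weights. Returns t and its df."""
    weights = np.asarray(weights, dtype=float)
    assert abs(weights.sum()) < 1e-12, "contrast weights must sum to zero"
    scores = np.asarray(data, dtype=float) @ weights  # one score per subject
    n = scores.size
    t = scores.mean() / (scores.std(ddof=1) / np.sqrt(n))
    return t, n - 1

valence_weights = [-1.0, 0.0, 1.0]   # linear contrast: angry vs. happy
arousal_weights = [1.0, -2.0, 1.0]   # quadratic: angry + happy vs. neutral
```

With the conditions ordered angry, neutral, happy, the linear trend weights isolate the valence comparison and the quadratic trend weights isolate the arousal comparison, matching the contrast labels used in the text.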

Results

Demographic and behavioral data

The control and mTBI groups did not differ in age, sex, handedness, intellectual functioning, verbal ability, or affective score on the Griffith Empathy Measure (GEM; see Table 1). However, they differed significantly in familial socioeconomic status (SES; Table 1).

ERP data

sPCA, definition of ROIs, and quantification of ERP components

Nine SFs were selected that, taken together, described 91 % of the spatial variance in the data set and were submitted to varimax rotation (see Fig. 3). Following the strategy described in the Method section—spatial Principal Components Analysis (sPCA) and definition of regions of interest (ROIs)—six ROIs were defined (see Fig. 4): from SF1, two occipitotemporal ROIs were defined in the left (electrodes 58, 59, 65, 66, 70) and right (electrodes 83, 84, 90, 91, 96) hemispheres, as well as two central ROIs in the left (electrodes 7, 13, 31, 37) and right (electrodes 80, 87, 105, 112) hemispheres; from SF3 and SF2, a left frontal ROI (electrodes 18, 19, 20, 22, 23, 24, 26, 27, 28, 33) and a right frontal ROI (electrodes 2, 3, 4, 9, 10, 117, 118, 122, 123, 124) were defined.
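For readers unfamiliar with the procedure, spatial PCA treats electrodes as the variables and extracts spatial factors whose rotated loadings guide the definition of ROIs. A schematic sketch (random toy data; the electrode count, factor count, and all names here are illustrative, not the study's montage or preprocessing):

```python
import numpy as np

def varimax(loadings, tol=1e-6, max_iter=100):
    """Orthogonal varimax rotation of a (variables x factors) loadings matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p))
        rotation = u @ vt
        var_new = s.sum()
        if var_new - var_old < tol:
            break
        var_old = var_new
    return loadings @ rotation

# Toy data set: rows = time points x conditions x subjects, columns = electrodes
rng = np.random.default_rng(1)
data = rng.normal(size=(300, 20))

# Spatial PCA: electrodes are the variables
cov = np.cov(data - data.mean(axis=0), rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]            # sort factors by variance described
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = 9
explained = eigvals[:n_factors].sum() / eigvals.sum()
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
rotated = varimax(loadings)  # rotation preserves each electrode's communality
```

Because the rotation is orthogonal, each electrode's total loading (communality) is unchanged; varimax merely redistributes variance across factors so that each factor loads strongly on a compact set of electrodes, which is what makes the topographies interpretable as ROIs.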

Fig. 3
figure 3

Topographic maps of the factor loadings for the spatial factors (varimax rotation). The percentage of variance accounted for by each factor is indicated

Fig. 4
figure 4

Electrodes used in the regions of interest (ROIs). In green, electrodes selected for the left and right occipitotemporal ROIs. In orange, electrodes selected for the left and right central ROIs. In light blue, electrodes selected for the left and right frontal ROIs

Grand-average ERPs in the angry, neutral, and happy conditions at the frontal, central, and occipitotemporal ROIs in the left and right hemispheres are presented in Fig. 5 for the group of healthy controls and in Fig. 6 for the group of preschoolers who sustained mTBI. The P1 and N170 were identified and measured at the left and right occipitotemporal ROIs, whereas the Nc was measured at the left and right central and frontal ROIs. Table 2 shows the means and standard deviations of the uncorrected amplitudes and latencies for the P1 and N170, as well as the uncorrected mean amplitudes for the Nc, according to EFE (angry, neutral, happy) and hemisphere (left, right) in the two groups (control, mTBI).

Fig. 5
figure 5

Grand-average ERP waveforms for the group of healthy controls (CTL). From top to bottom, grand-averaged ERPs in the angry, neutral, and happy conditions at the frontal, central, and occipitotemporal ROIs in the left and right hemispheres. The three analyzed ERP components are identified (P1, N170, Nc)

Fig. 6
figure 6

Grand-average ERP waveforms for the group of preschoolers who sustained mTBI. From top to bottom, grand-averaged ERPs in the angry, neutral, and happy conditions at the frontal, central, and occipitotemporal ROIs in the left and right hemispheres. The three analyzed ERP components are identified (P1, N170, Nc)

Table 2 Uncorrected amplitudes and latencies for each component according to emotional facial expression (angry, neutral, happy) and hemisphere (left, right) in the two groups (control, mTBI)

ERP results

P1: Amplitude

There was a significant Valence × Group interaction, F(1, 31) = 4.664, p = .039, ηp² = 0.131. Statistical significance of a simple effect was accepted at a Bonferroni-adjusted alpha level of 0.025: there was a significant simple valence effect in the control group, F(1, 14) = 11.511, p = .004, with greater P1 amplitude in response to happy faces (M = 15.32 μV, SD = 5.37) than to angry faces (M = 13.02 μV, SD = 5.25), but not in the mTBI group, F(1, 17) = 0.378, p = .547; angry: M = 13.42 μV, SD = 6.02; happy: M = 13.78 μV, SD = 5.67. There was also a significant hemisphere effect, F(1, 31) = 9.610, p = .004, ηp² = 0.237: P1 amplitude was significantly larger in the right hemisphere (M = 15.78 μV, SD = 7.62) than in the left hemisphere (M = 11.68 μV, SD = 5.02).

P1: Latency

There was a significant Arousal × Group interaction, F(1, 31) = 9.465, p = .004, ηp² = 0.234. Statistical significance of a simple effect was accepted at a Bonferroni-adjusted alpha level of 0.025. The simple arousal effect did not reach significance in either the control group, F(1, 14) = 5.190, p = .039, or the mTBI group, F(1, 17) = 4.041, p = .061.

N170: Amplitude

Analyses revealed a significant Valence × Group interaction, F(1, 31) = 4.414, p = .044, ηp² = 0.125. Statistical significance of a simple effect was accepted at a Bonferroni-adjusted alpha level of 0.025. The simple valence effect did not reach significance in either the control group, F(1, 14) = 2.640, p = .126, or the mTBI group, F(1, 17) = 1.623, p = .220.

N170: Latency

There was a significant Arousal × Group interaction, F(1, 31) = 5.070, p = .032, ηp² = 0.141. Statistical significance of a simple effect was accepted at a Bonferroni-adjusted alpha level of 0.025. There was a significant simple arousal effect in the control group, F(1, 14) = 8.626, p = .011, with shorter N170 latency in response to emotional faces (angry: M = 72.40 ms, SD = 11.64; happy: M = 76.67 ms, SD = 8.23) than to neutral faces (M = 82.53 ms, SD = 9.81), but not in the mTBI group, F(1, 17) = 0.169, p = .686; angry: M = 78.67 ms, SD = 13.96; neutral: M = 77.22 ms, SD = 16.76; happy: M = 78.22 ms, SD = 10.33.

Nc

There was a significant Valence × ROI × Hemisphere × Group interaction, F(1, 31) = 4.494, p = .042, ηp² = 0.127. Statistical significance of a simple effect was accepted at a Bonferroni-adjusted alpha level of 0.025. The simple Valence × ROI × Hemisphere interaction did not reach significance in either the control group, F(1, 14) = 1.189, p = .294, or the mTBI group, F(1, 17) = 3.782, p = .069. There was a significant ROI × Group interaction, F(1, 31) = 4.442, p = .043, ηp² = 0.125. Statistical significance of a simple effect was accepted at a Bonferroni-adjusted alpha level of 0.025. The simple ROI effect was not significant in either the control group, F(1, 14) = 4.004, p = .065, or the mTBI group, F(1, 17) = 0.835, p = .374. There was also a significant ROI × Hemisphere interaction, F(1, 31) = 6.771, p = .014, ηp² = 0.179. Statistical significance of a simple effect was accepted at a Bonferroni-adjusted alpha level of 0.025. The simple ROI effect did not reach significance in either the left hemisphere, F(1, 32) = 0.115, p = .736, or the right hemisphere, F(1, 32) = 4.401, p = .044. Finally, there was a significant hemisphere effect, F(1, 31) = 11.745, p = .002, ηp² = 0.275: Nc amplitude was larger in the right hemisphere (M = −3.96 μV, SD = 1.89) than in the left hemisphere (M = −3.05 μV, SD = 1.60).

Complementary analyses

First, the potential influence of family SES on the ERP results was assessed. To this end, Pearson’s correlation coefficients (within each group and across groups) were computed between the Blishen Socioeconomic Index (Blishen et al., 1987) and ERP results. No significant correlations were found (p > .05 for every correlation). Second, the putative association between ERP results and behavioral variables was assessed by computing Pearson’s correlation coefficients (within each group and across groups) between ERP results and intellectual functioning, verbal abilities, as well as the GEM affective score. No significant correlations were observed (p > .05 for every correlation).
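The correction applied here is the standard Bonferroni procedure: with k correlations tested against the same outcome, each is evaluated against alpha divided by k. A minimal sketch (toy data; the variable names and number of tests are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
ses = rng.normal(50.0, 10.0, size=33)    # e.g., a per-child SES index
erp_measures = rng.normal(size=(33, 5))  # five illustrative ERP measures

alpha = 0.05
n_tests = erp_measures.shape[1]
adjusted_alpha = alpha / n_tests         # Bonferroni-adjusted threshold

# One Pearson correlation per ERP measure; keep the p-values
pvalues = [stats.pearsonr(ses, erp_measures[:, j])[1] for j in range(n_tests)]
significant = [p < adjusted_alpha for p in pvalues]
```

With five tests, each correlation must reach p < .01 rather than p < .05 to be declared significant, which guards against inflated family-wise error at the cost of some power.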

Discussion

The aim of this study was to compare the electrophysiological correlates of EFE processing between preschool children who sustained mTBI and healthy controls. The main results showed that, at occipitotemporal sites, the healthy control group was the only one to present (1) a higher P1 amplitude for happy faces than for angry faces, and (2) a shorter N170 latency for emotional faces than for neutral faces. Thus, preschoolers who sustained mTBI did not present the early valence and arousal effects observed in healthy controls at the visuospatial and visual expertise stages, respectively.

The first key finding is the presence of a valence effect on P1 amplitude at occipitotemporal sites only in the control group. This early valence effect showed that happy faces, as compared with angry faces, elicited greater visuospatial processing in healthy preschoolers. This suggests rapid EFE processing in young children, who appear to distinguish between happiness and anger from the primary detection of facial cues. The topography and latency of the P1 were very similar to previous reports in children, notably with a higher amplitude in the right hemisphere than in the left hemisphere (e.g., Batty & Taylor, 2006). Past studies in children of preschool age also found an emotional modulation of P1 amplitude (Curtis & Cicchetti, 2011; Dawson et al., 2004; Vlamings et al., 2010). However, the present results contrast with other studies showing emotional effects on P1 latency (Batty & Taylor, 2006; Dawson et al., 2004), or no emotional effect on either P1 amplitude or P1 latency (Todd et al., 2008). Beyond the differences in experimental procedures (e.g., explicit processing vs. implicit processing vs. passive viewing) that could explain these discrepancies, it should be noted that the present study controlled for extrafacial characteristics and low-level physical differences between the angry, neutral, and happy facial stimuli, such that the groups of pictures differed only according to EFE. The greater visual processing of happy faces may thus reflect their particular salience for healthy preschoolers. Emotional modulations of visual processing are known to be critical for flexible and adaptive behaviors (Pourtois, Schettino, & Vuilleumier, 2013) and to depend centrally on direct and possibly indirect influences of the amygdala via its connections with visual areas, frontal and parietal regions, and subcortical nuclei (Vuilleumier, 2015).
Notably, recent ERP data suggest that emotional effects on P1 amplitude may be mediated by the amygdala (Rotshtein et al., 2010). Even though these emotional influences on visual perception have been repeatedly found for threatening stimuli in adults, there is also evidence that they can be driven by positive signals (Vuilleumier, 2015).

From an evolutionary perspective, one might expect that both kinds of emotional faces (angry and happy) elicit a higher P1 amplitude than neutral faces, because preferential processing of emotional cues at the perceptive stage would allow rapid response to potentially advantageous or harmful stimuli. One possible explanation is that children are more likely to be sensitive to the stimuli that are frequent within their environment (i.e., happy faces; Batty & Taylor, 2006), and this greater sensitivity may be acquired through a learning process affecting perceptual mechanisms (Pollak, 2008). Evidence in favor of modifications in children’s perceptual processing of familiar environmental stimuli notably comes from studies on EFE processing in maltreated children. Behavioral findings show that maltreated children overidentify anger as compared with nonmaltreated children (Pollak & Kistler, 2002). At the electrophysiological level, a study by Curtis and Cicchetti (2011) found higher P1 amplitude in response to angry faces for maltreated children as compared with nonmaltreated children. According to these authors, this result would be a direct consequence of experience-dependent fine-tuning of perception: Visual processing of anger is higher in maltreated children because they more often experience anger expressed in a caregiver’s face as compared with nonmaltreated children. Furthermore, Curtis and Cicchetti (2011) also found that only nonmaltreated children displayed an enhanced P1 amplitude in response to happy faces as compared with neutral faces. We therefore suggest that healthy young children are likely to modify their perceptual mechanisms in such a way that visual processing of happy faces, probably the most familiar EFE for them, is enhanced. 
This hypothesis is also supported by behavioral data showing that happy faces are identified much better than other basic emotions by 2-year-old children. Even though preschoolers improve with age for other EFE (notably anger and sadness), happiness still appears to be better processed than other EFE until 6 years of age (Widen, 2013). Interestingly, a recent study showed that the efficiency of facial emotion recognition in young adults is also modulated by the familiarity of the expressions, with happy faces being the most frequently occurring EFE (Calvo, Gutiérrez-García, Fernández-Martín, & Nummenmaa, 2014).

The second key finding from this study is the presence of an arousal effect on N170 latency, again only in the control group. Happiness and anger had shorter latencies than neutral faces, suggesting faster structural encoding of emotional than of neutral faces. This result is in agreement with a previous study showing a shorter N170 latency for emotional (fearful) faces than for neutral faces in preschoolers (Dawson et al., 2004). However, that study, as well as that of Parker et al. (2005), also found emotional modulations of N170 amplitude, and conversely, others did not find any emotional effect on either N170 amplitude or N170 latency (Batty & Taylor, 2006; Curtis & Cicchetti, 2011; Todd et al., 2008). Differences in experimental procedures are likely to explain, at least in part, these inconsistencies. However, it should be noted that, beyond the controls performed on our stimulus selection, as mentioned earlier, this study is also the first on EFE processing in preschoolers to control for the potential influence of the preceding P1 on N170 results. Results from ERP research in adults, as in preschoolers, are rather mixed concerning emotional influences on the N170 (Vuilleumier & Pourtois, 2007). Several studies did not find any emotional effects on N170 amplitude and/or latency (e.g., Eimer & Holmes, 2002), and some authors even suggest that the presence or absence of emotional effects on the N170 depends on the EEG reference that is used (Rellecke, Sommer, & Schacht, 2013). This absence of emotional effect is in line with Bruce and Young's theory of face recognition (Eimer, 2011), according to which facial structural encoding and EFE detection are two parallel and independent stages of face processing (Bruce & Young, 1986).
However, there is now increasing evidence for emotional effects on N170 (e.g., Batty & Taylor, 2003; Hinojosa, Mercado, & Carretié, 2015; Sprengelmeyer & Jentzsch, 2006), which may reflect the impact, at the structural encoding stage, of configural differences between emotional and neutral facial stimuli.

In agreement with the results of this study, it has been proposed that emotional influences on the N170 are not specific to either negative or positive valence (Vuilleumier & Pourtois, 2007; see also Hinojosa et al., 2015). This more detailed emotional processing at the visual expertise stage may be based on brain responses of a distributed network involving extrastriate visual regions like the fusiform gyrus, as well as the superior temporal sulcus, prefrontal regions and amygdala (see George, 2013; Vuilleumier & Pourtois, 2007). The shorter N170 latency observed in this study shows that healthy preschoolers made a distinction between emotional and neutral facial stimuli, and suggests that this distributed network is functional. This finding also suggests that the detailed processing of expression at this stage may be easier for emotional faces than neutral ones, which is in agreement with data showing difficulties in the decoding of adult neutral facial expressions among children (see the review by Somerville, Fani, & McClure-Tone, 2011).

The crucial question then becomes: what causes the lack of these emotional effects in the mTBI group? There is evidence that mTBI can be associated with structural and functional brain disturbances. These disturbances notably include subtle brain microhemorrhages that go undetected by traditional clinical imaging (Beauchamp et al., 2011) and have been shown to be associated with poorer cognitive and social outcomes (Beauchamp et al., 2013; Ryan et al., 2015a, 2015b); diffuse injuries, caused by the tearing and shearing of axons, which appear to have severe consequences for the integrity of white matter structures in the developing brain; and changes in several cortical and subcortical regions (Arfanakis et al., 2002; Ashwal, Tong, Ghosh, Bartnik-Olson, & Holshouser, 2014; Eierud et al., 2014; Henry, Tremblay, Boulanger, Ellemberg, & Lassonde, 2010; Shenton et al., 2012; see Toledo et al., 2012, for a review; Zhou et al., 2013). Many of the brain regions particularly vulnerable to the effects of childhood TBI are also those associated with fundamental sociocognitive skills (e.g., Serra-Grabulosa et al., 2005; Toledo et al., 2012; Wilde et al., 2005). Thus, the lack of emotional effects on both the P1 and N170 components in preschoolers who sustained mTBI could be due to direct alterations of the brain regions involved in emotional processing (or of the connections between them). These alterations may concern the two mechanisms proposed to underlie the emotional influences on the P1 and the N170 (George, 2013). The first is coarse emotional processing via a fast subcortical route to the amygdala, which may already be functional in newborns (Johnson, 2005; but see Vlamings et al., 2010), or fast top-down influences of anterior regions (amygdala and/or orbitofrontal regions) on visual regions.
The second would be a more detailed information processing involving the interactions between a larger set of brain regions, and there is evidence suggesting that N170 precursors in infants are also modulated by EFE (Leppanen, Moulson, Vogel-Farley, & Nelson, 2007).

Contrary to the traditionally held notion that the young brain is flexible and able to reorganize (“plasticity theory”; Kennard, 1936), early brain injuries tend to result in worse outcomes (Anderson, Catroppa, Morse, Haritou, & Rosenfeld, 2005; Anderson, Godfrey, Rosenfeld, & Catroppa, 2012; Anderson, Spencer-Smith, et al., 2009; Beauchamp et al., 2010). This increased vulnerability of the developing brain can be attributed to the fact that cognitive and socioemotional functions coming online during the first 5 years of life are critically dependent on the integrity of particular brain structures at key developmental stages (“vulnerability theory”; Anderson et al., 2005; Anderson, Spencer-Smith, et al., 2009). Maturation of brain circuits involved in emotional processing is likely to continue across childhood and later. For instance, there is evidence from animal data that maturation of connectivity between amygdala and frontal cortical regions continues into adolescence and early adulthood (Cunningham, Bhattacharyya, & Benes, 2002). Thus, the absence of emotional effects evidenced here could (1) also be due to disturbances induced by mTBI in the maturation of cerebral networks (and notably interactions between the brain regions involved) sustaining emotional processing, and (2) have an impact on EFE processing later in childhood, adolescence, and adulthood. Due to the young age of the participants, this study did not use an EFE decoding task because behavioral responses could have been an important source of artifacts on EEG signals and would have also significantly lengthened the total duration of the experiment. Further studies should, therefore, investigate whether emotional deficits after mTBI at the preschool age, observed here at the electrophysiological level, are also evidenced at the behavioral level. 
Indeed, there are data suggesting that preschoolers around 2–3 years of age are already capable of using the expected labels to identify happiness and anger (Widen, 2013). Thus, one can hypothesize that early childhood mTBI might be associated with deficits in the labeling of these specific EFE. However, because several authors suggest that preschoolers mainly use the arousal and valence dimensions to discriminate between emotional stimuli (Russell & Bullock, 1985; Widen, 2013), future studies should also investigate whether evaluation of EFE along these dimensions differs between preschoolers with mTBI and healthy controls. Of note, it is also possible that children with mTBI use an alternative strategy that acts as a compensatory mechanism, enabling them to perform at a level equivalent to that of healthy children in an EFE labeling task or to provide equivalent valence and arousal evaluations. Moreover, future studies should determine whether putative behavioral difficulties have an impact on the interpersonal relationships of these children. Indeed, given that quick analysis of facial expressions is crucial in social communication, EFE decoding deficits in preschoolers are likely to affect their social interactions. Further studies should also investigate whether adults who sustained a TBI during childhood still present deficits in EFE decoding that may impact their social interactions. In support of this, several studies indicate that, in adolescence and adulthood, survivors of childhood TBI have adverse social outcomes impacting their ability to engage in appropriate social interactions and abide by societal rules, placing them at risk for maladaptive behaviors and their social and legal consequences (e.g., Anderson, Brown, et al., 2009; Baguley, Cooper, & Felmingham, 2006; Beauchamp et al., 2013).

It should also be noted that another possible explanation could account specifically for the lack of a positive valence effect on P1 amplitude in preschoolers who sustained mTBI. This alternative explanation, which does not preclude the hypothesis of impaired brain regions and/or connectivity, is that happy faces are not more familiar than angry faces in the environment of preschoolers who sustain mTBI. In other words, the lack of increased P1 amplitude for happy faces would result from these children being confronted with happy faces in their environment less often than other children. Although speculative, this explanation must be considered given data suggesting that childhood TBI has an impact on family life (Stancin, Wade, Walz, Yeates, & Taylor, 2008) and notably on the quality of child–parent interactions (Wade et al., 2008; Lalonde, Jocelyn, Bernier, & Beauchamp, 2016). In fact, it is also possible that the absence of a positive valence influence on P1 amplitude constitutes a premorbid state. Unfortunately, we were unable to test these hypotheses, as we did not have data regarding the frequency with which the preschoolers included in our study were confronted with happy or angry faces in their everyday lives (before and after injury). Future studies could attempt to replicate these results and determine whether such a link between the frequency of an EFE and the amplitude of the P1 component exists, as well as whether happiness is less frequently experienced by preschoolers who sustain mTBI. Another line of evidence concerning caregivers' influence on EFE processing in infants comes from a recent ERP study showing that sensitive parenting is associated with increased Nc amplitudes to happy faces as compared with neutral faces (Taylor-Colls & Fearon, 2015).
Indeed, some studies have also shown that Nc amplitude is modulated by the emotional content of the face in infants (see de Haan et al., 2003) and in children (Batty & Taylor, 2006; Cicchetti & Curtis, 2005; Curtis & Cicchetti, 2011; Nelson & Nugent, 1990), although results are mixed for preschoolers, with some studies, like ours, showing no emotional effect (Dawson et al., 2004; Todd et al., 2008).

To conclude, this study demonstrates for the first time that an mTBI sustained during the preschool period affects the cerebral processing of emotional facial expressions. More precisely, early childhood mTBI is associated with the absence of the early valence and arousal effects observed during EFE processing in healthy preschoolers at the visuospatial and visual expertise stages. This study constitutes a first observation of electrophysiological modifications associated with emotional processing in mTBI, and sheds new light on the socioemotional impact of this frequent health problem in children.