
Quantifying Facial Expression Synchrony in Face-To-Face Dyadic Interactions: Temporal Dynamics of Simultaneously Recorded Facial EMG Signals

  • Original Paper
  • Published in: Journal of Nonverbal Behavior

Abstract

Human social interaction is enriched with synchronous movement, which is said to be essential to establish interactional flow. One commonly investigated phenomenon in this regard is facial mimicry, the tendency of humans to mirror facial expressions. Because studies investigating facial mimicry in face-to-face interactions are lacking, the temporal dynamics of facial mimicry remain unclear. We therefore developed and tested the suitability of a novel approach to quantifying facial expression synchrony in face-to-face interactions: windowed cross-lagged correlation analysis (WCLC) for electromyography signals. We recorded muscle activations related to smiling (Zygomaticus Major) and frowning (Corrugator Supercilii) of two interaction partners simultaneously in 30 dyadic affiliative interactions. We expected WCLC to reliably detect facial expression synchrony above chance level and, based on previous research, expected the occurrence of rapid synchronization of smiles within 200 ms. WCLC significantly detected synchrony of smiling but not frowning compared to a control condition of chance-level synchrony in six different interactional phases (smiling: dz = .85–1.11; frowning: dz = .01–.30). Synchronizations of smiles between interaction partners predominantly occurred within 1000 ms, with a significant amount occurring within 200 ms. This rapid synchronization of smiles supports the notion of the existence of an anticipated mimicry response for smiles. We conclude that WCLC is suited to quantify the temporal dynamics of facial expression synchrony in dyadic interactions and discuss implications for different psychological research areas.


Notes

  1. For this article, we assume that behavioral mimicry is a form of nonverbal synchrony that is restricted to matching movements and that the timing component of the mimicry response can be accounted for by using measures of nonverbal synchrony for matching behaviors. For a discussion of the overlap of behavioral mimicry and nonverbal synchrony see Chartrand and Lakin (2013).

  2. We use the term ‘synchronization’ to refer to the active process of synchronizing a movement to match another movement. Contrarily, we use the term ‘synchrony’ to refer to the state of movements being synchronous.

  3. By setting the time constant of the moving average to a value of .5 s, we approximated the approach taken by Ramseyer and Tschacher (2011).

  4. For an optimal temporal resolution of the WCLC output, this increment should be set equal to the time interval between single observations in the time series (e.g., .1 s when the data was sampled at 10 Hz).

  5. Practical considerations may also play a role in the analysis. For example, smaller window sizes and larger maximum time lags both increase the number of correlations calculated by WCLC, and thus both processing time and the family-wise error rate.

  6. For analyses across muscle sites, we averaged the synchrony emerging for the two combinations of muscle sites (i.e. Zygomaticus person 1 » Corrugator person 2 and Corrugator person 1 » Zygomaticus person 2).

  7. In cross-lagged correlation analyses (note: not windowed cross-lagged correlation), the time lag corresponding to the peak value is a commonly used outcome to determine the time lag of the association between two analyzed time series (cf. Boker et al. 2002). Because in WCLC each row (= each time window) represents a single cross-lagged correlation, our peak counting approach is analogous to this established approach.
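The procedure described in Notes 4–7 — sliding a window over two simultaneously recorded time series, correlating them at a range of lags, and counting the lag at which each window's correlation peaks — can be sketched as follows. This is a minimal illustration, assuming evenly sampled, already preprocessed (e.g., moving-averaged) EMG signals; the function and parameter names (`wclc`, `peak_lags`, `window`, `max_lag`, `step`) are ours for illustration, not the authors' implementation.

```python
import numpy as np

def wclc(x, y, window, max_lag, step=1):
    """Windowed cross-lagged correlation sketch.

    For each window start s (advanced by `step` samples, cf. Note 4),
    correlate x[s:s+window] with y shifted by every lag in
    [-max_lag, +max_lag]. Returns an (n_windows, n_lags) matrix of
    Pearson correlations, the window start indices, and the lags
    (in samples; positive lag = y trails x).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = min(len(x), len(y))
    # Keep a max_lag margin on both sides so every shifted window fits.
    starts = list(range(max_lag, n - window - max_lag + 1, step))
    lags = list(range(-max_lag, max_lag + 1))
    r = np.empty((len(starts), len(lags)))
    for i, s in enumerate(starts):
        xw = x[s:s + window]
        for j, lag in enumerate(lags):
            yw = y[s + lag:s + lag + window]
            r[i, j] = np.corrcoef(xw, yw)[0, 1]
    return r, starts, lags

def peak_lags(r, lags):
    """Per-window peak lag (cf. Note 7): the lag of each row's maximum."""
    return [lags[j] for j in np.argmax(r, axis=1)]
```

With two signals where one trails the other by a fixed number of samples, the peak-lag histogram concentrates at that offset, which is the logic behind counting peaks per lag bin to characterize synchronization timing. A real analysis would additionally compare the result against a chance-level control (e.g., correlating signals from non-interacting partners), which this sketch omits.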


Acknowledgements

We wish to thank all participants for their participation and Rukiye Köysürenbars, Mathias Osterried, Beatrice Salewski, Olga Schulz, and Katrin Zilliken, who helped with the data collection.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Marcel Riehle.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures were approved by a local ethics committee (Psychotherapeutenkammer Hamburg) and performed in accordance with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (DOCX 773 kb)

About this article

Cite this article

Riehle, M., Kempkensteffen, J. & Lincoln, T.M. Quantifying Facial Expression Synchrony in Face-To-Face Dyadic Interactions: Temporal Dynamics of Simultaneously Recorded Facial EMG Signals. J Nonverbal Behav 41, 85–102 (2017). https://doi.org/10.1007/s10919-016-0246-8
