Four Research Designs and a Comprehensive Analysis Strategy for Investigating Common Method Variance with Self-Report Measures Using Latent Variables

  • Original Paper
  • Journal of Business and Psychology

Abstract

Common method variance (CMV) is an ongoing topic of debate and concern in the organizational literature. We present four latent variable confirmatory factor analysis model designs for assessing and controlling for CMV: designs based on unmeasured latent method constructs, marker variables, and measured cause variables, as well as a new hybrid design in which these three types of latent method variables are used concurrently. We then describe a comprehensive analysis strategy that can be used with all four designs and demonstrate it with the new design, the Hybrid Method Variables Model. In our discussion, we address issues that arise in implementing these designs and analyses, offer practical guidance, and advocate for the use of the Hybrid Method Variables Model. Through these means, we hope to promote a more comprehensive and consistent approach to assessing CMV in the organizational literature and more extensive use of hybrid models that include multiple types of latent method variables.

Notes

  1. It is preferable to test for CMV using a CFA model, as compared to a full structural equation model with exogenous and endogenous latent variables. The CFA model is the least restrictive in terms of latent variable relations (all latent variables are allowed to correlate), so there is no risk that a misspecified path model will compromise the CMV tests. In addition, the complex method variance measurement model is less likely to produce estimation and convergence problems when implemented in a CFA than in a path model (a minimal sketch of such a CFA follows these notes).

  2. With LISREL, latent variable standardization is the default, and the factor loadings and error variances to be used are referred to as LISREL Estimates. With Mplus, the default is to achieve identification by fixing a referent factor loading at 1.0 and estimating the factor variance; this default must be released so that the referent factor loading is estimated and the corresponding factor variance is fixed at 1.0 (see the Mplus sketch following these notes). Once this is done, the Mplus unstandardized estimates are used as fixed values for the relevant factor loadings and error variances.

  3. As part of the original CFA Marker Technique, Williams et al. (2010) also included a Phase III sensitivity analysis based on Lindell and Whitney (2001) to address the degree to which conclusions might be influenced by sampling error (see Williams et al., pp. 500, 503; the underlying adjustment is summarized after these notes). Sensitivity analysis is not part of our current strategy and should be viewed as optional. Based on the results of Williams et al., researchers may consider including it only if their sample size is very small and there is concern that sampling error may be influencing their point estimates and that method variance effects might be underestimated.
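
To make Note 1 concrete, here is a minimal sketch in Mplus syntax of a CFA of the recommended form: two substantive factors that Mplus leaves freely correlated by default, plus an orthogonal unmeasured latent method construct. All factor and indicator names (TRAIT1, TRAIT2, METHOD, x1–x3, y1–y3) are hypothetical placeholders, not variables from the article.

    MODEL:
      ! Substantive factors, identified by freeing all loadings (*) and
      ! fixing factor variances at 1.0 (see Note 2). TRAIT1 WITH TRAIT2
      ! is estimated by default, so no structural path can be misspecified.
      TRAIT1 BY x1* x2 x3;
      TRAIT1@1;
      TRAIT2 BY y1* y2 y3;
      TRAIT2@1;
      ! Unmeasured latent method construct loading on all substantive
      ! indicators and held orthogonal to the substantive factors.
      METHOD BY x1* x2 x3 y1 y2 y3;
      METHOD@1;
      METHOD WITH TRAIT1@0 TRAIT2@0;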
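
For Note 2, a minimal Mplus sketch contrasting the default referent-loading identification with the released specification the note requires; the factor name F1 and indicators y1–y3 are hypothetical.

    ! Mplus default: referent loading of y1 fixed at 1.0, variance of F1 free.
    !   F1 BY y1 y2 y3;
    ! Released specification: all loadings estimated, factor variance fixed.
    F1 BY y1* y2 y3;
    F1@1;

The unstandardized loadings and error variances from this run are the values to be fixed in subsequent models.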
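
For Note 3, the sensitivity analysis in Lindell and Whitney (2001) rests on their CMV-adjusted correlation, in which an estimate $r_M$ of the method correlation (e.g., the smallest correlation involving the marker variable) is partialed from an observed correlation $r_{YX}$:

$$r_{YX \cdot M} = \frac{r_{YX} - r_M}{1 - r_M}$$

Sensitivity to sampling error is probed by recomputing this adjustment with $r_M$ set at values from its confidence interval. This summary follows Lindell and Whitney (2001); Phase III of Williams et al. (2010) applies the same logic within the CFA marker model.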

References

  • Anderson, J. C., & Gerbing, D. W. (1988). Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin, 103(3), 411–423.

  • Bagozzi, R. P. (1982). A field investigation of causal relations among cognitions, affect, intentions, and behavior. Journal of Marketing Research, 19, 562–584.

  • Bagozzi, R. P. (1984). Expectancy-value attitude models: An analysis of critical measurement issues. International Journal of Research in Marketing, 1(4), 295–310.

  • Barrick, M. R., & Mount, M. K. (1996). Effects of impression management and self-deception on the predictive validity of personality constructs. Journal of Applied Psychology, 81, 261–272.

  • Bentler, P. M., & Mooijaart, A. (1989). Choice of structural model via parsimony: A rationale based on precision. Psychological Bulletin, 106, 315–317.

  • Brief, A. P., Burke, M. J., George, J. M., Robinson, B. S., & Webster, J. (1988). Should negative affectivity remain an unmeasured variable in the study of job stress? Journal of Applied Psychology, 73, 193–198.

  • Brown, T. A. (2006). Confirmatory factor analysis for applied research. London: Guilford Press.

  • Bryant, F. B., & Satorra, A. (2012). Principles and practice of scaled difference chi-square testing. Structural Equation Modeling: A Multidisciplinary Journal, 19, 372–398.

  • Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81–105.

  • Chan, D. (2001). Method effects of positive affectivity, negative affectivity, and impression management in self-reports of work attitudes. Human Performance, 14(1), 77–96.

  • Chen, P. Y., & Spector, P. E. (1991). Negative affectivity as the underlying cause of correlations between stressors and strains. Journal of Applied Psychology, 76, 398–407.

  • Conway, J. M., & Lance, C. E. (2010). What reviewers should expect from authors regarding common method bias in organizational research. Journal of Business and Psychology, 25, 325–334.

  • Dawson, J. F. (2014). Moderation in management research: What, why, when, and how. Journal of Business and Psychology, 29, 1–19.

  • Diener, E., Emmons, R. A., Larsen, R. J., & Griffin, S. (1985). The satisfaction with life scale. Journal of Personality Assessment, 49, 71–75.

  • Ding, C., & Jane, T. (2015, August). Re-examining the effectiveness of the ULMC technique in CMV detection and correction. In L. J. Williams (Chair), Current topics in common method variance. Presented at the annual Academy of Management Conference, Vancouver, BC.

  • Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.

  • Ganster, D. C., Hennessey, H. W., & Luthans, F. (1983). Social desirability response effects: Three alternative models. Academy of Management Journal, 26, 321–331.

  • Haynes, C. E., Wall, T. D., Bolden, R. I., Stride, C., & Rick, J. E. (1999). Measures of perceived work characteristics for health services research: Test of a measurement model and normative data. British Journal of Health Psychology, 4, 257–275.

  • Heise, D. R., & Bohrnstedt, G. W. (1970). Validity, invalidity, and reliability. Sociological Methodology, 2, 104–129.

  • Johnson, R. E., Rosen, C. C., & Djurdjevic, E. (2011). Assessing the impact of common method variance on higher order multidimensional constructs. Journal of Applied Psychology, 96, 744–761.

  • Jöreskog, K. G. (1971). Simultaneous factor analysis in several populations. Psychometrika, 36, 409–426.

  • Karasek, R. (1979). Job demands, job decision latitude, and mental strain: Implications for job re-design. Administrative Science Quarterly, 24, 285–306.

  • Kenny, D. A., & Kashy, D. A. (1992). Analysis of the multitrait-multimethod matrix by confirmatory factor analysis. Psychological Bulletin, 112, 165–172.

  • Kline, R. B. (2010). Principles and practice of structural equation modeling (3rd ed.). New York: Guilford Press.

  • Landis, R. S. (2013). Successfully combining meta-analysis and structural equation modeling: Recommendations and strategies. Journal of Business and Psychology, 28, 251–261.

  • Lindell, M. K., & Whitney, D. J. (2001). Accounting for common method variance in cross-sectional research designs. Journal of Applied Psychology, 86, 114–121.

  • McDonald, R. P., & Ho, M. H. R. (2002). Principles and practice in reporting structural equation analyses. Psychological Methods, 7, 64–82.

  • McGonagle, A. K., Fisher, G. G., Barnes-Farrell, J. L., & Grosch, J. W. (2015). Individual and work factors related to perceived work ability and labor force outcomes. Journal of Applied Psychology, 100, 376–398. doi:10.1037/a0037974.

  • McGonagle, A. K., Williams, L. J., & Wiegert, D. (2014, August). A review of recent studies using an unmeasured latent method construct in the organizational literature. In L. J. Williams (Chair), Current issues in investigating common method variance. Presented at the annual Academy of Management Conference, Philadelphia, PA.

  • Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88, 879–903.

  • Podsakoff, P. M., MacKenzie, S. B., Moorman, R. H., & Fetter, R. (1990). Transformational leader behaviors and their effects on followers’ trust in leader, satisfaction, and organizational citizenship behaviors. Leadership Quarterly, 1(2), 107–142.

  • Podsakoff, P. M., MacKenzie, S. B., & Podsakoff, N. P. (2012). Sources of method bias in social science research and recommendations on how to control it. Annual Review of Psychology, 63, 539–569.

  • Podsakoff, P. M., & Organ, D. W. (1986). Self-reports in organizational research: Problems and prospects. Journal of Management, 12, 531–544.

  • Ragins, B. R., Lyness, K. S., Williams, L. J., & Winkel, D. (2014). Life spillovers: The spillover of fear of home foreclosure to the workplace. Personnel Psychology, 67, 763–800.

  • Richardson, H. A., Simmering, M. J., & Sturman, M. C. (2009). A tale of three perspectives: Examining post hoc statistical techniques for detection and correction of common method variance. Organizational Research Methods, 12, 762–800.

  • Rogers, W. M., & Schmitt, N. (2004). Parameter recovery and model fit using multidimensional composites: A comparison of four empirical parceling algorithms. Multivariate Behavioral Research, 39, 379–412.

  • Schaubroeck, J., Ganster, D. C., & Fox, M. L. (1992). Dispositional affect and work-related stress. Journal of Applied Psychology, 77, 322–335.

  • Schaufeli, W. B., & Bakker, A. B. (2003). The Utrecht Work Engagement Scale (UWES). Test manual. Utrecht: Department of Social & Organizational Psychology.

  • Schaufeli, W. B., Bakker, A. B., & Salanova, M. (2006). The measurement of work engagement with a short questionnaire: A cross-national study. Educational and Psychological Measurement, 66, 701–716.

  • Schmitt, N. (1978). Path analysis of multitrait-multimethod matrices. Applied Psychological Measurement, 2, 157–173.

  • Schmitt, N., Nason, E., Whitney, D. J., & Pulakos, E. D. (1995). The impact of method effects on structural parameters in validation research. Journal of Management, 21, 159–174.

  • Schmitt, N., Pulakos, E. D., Nason, E., & Whitney, D. J. (1996). Likability and similarity as potential sources of predictor-related criterion bias in validation research. Organizational Behavior and Human Decision Processes, 68(3), 272–286.

  • Schmitt, N., & Stults, D. M. (1986). Methodology review: Analysis of multitrait-multimethod matrices. Applied Psychological Measurement, 10(1), 1–22.

  • Simmering, M. J., Fuller, C. M., Richardson, H. A., Ocal, Y., & Atinc, G. M. (2015). Marker variable choice, reporting, and interpretation in the detection of common method variance: A review and demonstration. Organizational Research Methods, 18, 473–511. doi:10.1177/1094428114560023.

  • Smith, D. B., & Ellingson, J. E. (2002). Substance versus style: A new look at social desirability in motivating contexts. Journal of Applied Psychology, 87, 211–219.

  • Smith, C. S., Tisak, J., Hahn, S. E., & Schmieder, R. A. (1997). The measurement of job control. Journal of Organizational Behavior, 18, 225–237.

  • Spector, P. E. (2006). Method variance in organizational research: Truth or urban legend? Organizational Research Methods, 9, 221–232.

  • Spector, P. E., & Brannick, M. T. (2010). Common method issues: An introduction to the feature topic in organizational research methods. Organizational Research Methods, 13, 403–406.

  • Spector, P. E., Rosen, C. C., Johnson, R. E., Richardson, H. A., & Williams, L. J. (2015). Legend or legendary: A measure-centric model of method variance. Unpublished manuscript.

  • Thompson, E. (2007). Development and validation of an internationally reliable short-form of the Positive and Negative Affect Schedule (PANAS). Journal of Cross-Cultural Psychology, 38, 227–242.

  • Watson, D., Clark, L. A., & Tellegen, A. (1988). Development and validation of brief measures of positive and negative affect: The PANAS Scales. Journal of Personality and Social Psychology, 54, 1063–1070.

  • West, S. G., Taylor, A. B., & Wu, W. (2012). Model fit and model selection in structural equation modeling. In R. H. Hoyle (Ed.), Handbook of structural equation modeling. New York: Guilford Press.

  • Widaman, K. F. (1985). Hierarchically nested covariance structure models for multitrait-multimethod data. Applied Psychological Measurement, 9(1), 1–26.

  • Williams, L. J. (2014, August). Use of an unmeasured latent method construct (ULMC) in the presence of multidimensional method variance. In L. J. Williams (Chair), Current issues in investigating common method variance. Presented at the annual Academy of Management Conference, Philadelphia, PA.

  • Williams, L. J., & Anderson, S. E. (1994). An alternative approach to method effects by using latent-variable models: Applications in organizational behavior research. Journal of Applied Psychology, 79, 323–331.

  • Williams, L. J., Edwards, J. R., & Vandenberg, R. J. (2003a). Recent advances in causal modeling methods for organizational and management research. Journal of Management, 29, 903–936.

  • Williams, L. J., Gavin, M. B., & Williams, M. L. (1996). Measurement and nonmeasurement processes with negative affectivity and employee attitudes. Journal of Applied Psychology, 81, 88–101.

  • Williams, L. J., Hartman, N., & Cavazotte, F. (2003). Method variance and marker variables: An integrative approach using structural equation methods. Paper presented at the annual Academy of Management Conference.

  • Williams, L. J., Hartman, N., & Cavazotte, F. (2010). Method variance and marker variables: A review and comprehensive CFA marker technique. Organizational Research Methods, 13, 477–514.

  • Williams, L. J., & O’Boyle, E. H. (2008). Measurement models for linking latent variables and indicators: A review of human resource management research using parcels. Human Resource Management Review, 18, 233–242.

  • Williams, L. J., & O’Boyle, E. H. (2015). Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters. Journal of Applied Psychology, 100(5), 1579–1602.

  • Zickar, M. J. (2015). Digging through dust: Historiography for the organizational sciences. Journal of Business and Psychology, 30, 1–14.

Author information

Corresponding author

Correspondence to Larry J. Williams.

About this article

Cite this article

Williams, L.J., McGonagle, A.K. Four Research Designs and a Comprehensive Analysis Strategy for Investigating Common Method Variance with Self-Report Measures Using Latent Variables. J Bus Psychol 31, 339–359 (2016). https://doi.org/10.1007/s10869-015-9422-9
