Elsevier

Social Science Research

Volume 40, Issue 4, July 2011, Pages 1037-1050
Are we keeping the people who used to stay? Changes in correlates of panel survey attrition over time

https://doi.org/10.1016/j.ssresearch.2011.03.001

Abstract

As survey response rates decline, correlates of survey participation may also be changing. Panel studies provide an opportunity to study a rich set of correlates of panel attrition over time. We look at changes in attrition rates in the American National Election Studies from 1964 to 2004, a repeated panel survey with a two-wave pre-post election design implemented over multiple decades. We examine changes in attrition rates by three groups of variables: sociodemographic and ecological characteristics of the respondent and household, party affiliation and political and social attitudes recorded at the first interview, and paradata about the first wave interview. We find relatively little overall change in the pre-post election panel attrition rates, but important changes in demographic correlates of panel attrition over time. We also examine contact and cooperation rates from 1988 to 2004.

Introduction

A panel study timed around a political election is one common method of conducting an election study. Election studies are conducted to understand who votes and why they vote, and take place in many countries around the world, including the United States, Canada, countries throughout the European Union, Australia and Japan (ANES, 2008). A typical election study asks a random sample of voting age adults about their political and social attitudes immediately before an election. Soon after the election, these pre-election survey respondents are once again asked about their political and social attitudes and – importantly – their voting behavior. These surveys are referred to as pre- and post-election surveys.

Inevitably, some people who participated in the pre-election survey do not participate in the post-election survey. Attrition, or nonresponse to wave two or later of a panel survey, reduces sample size for analyses and biases survey estimates when those who participate are systematically different from those who do not (Bethlehem, 2002, Deming, 1953, Groves and Couper, 1998). Demographic characteristics have been shown to be related to panel attrition, with older people, minorities and males generally more likely to attrite (Aneshensel et al., 1989, Burkam and Lee, 1998, Fitzgerald et al., 1998, Gray et al., 1996, Loosveldt et al., 2002, Lynn et al., 2005, MaCurdy et al., 1998, Mirowsky and Reynolds, 2000, Peracchi, 2002, Sharot, 1991, Watson and Wooden, 2009). Persons with more education and higher income are less likely to attrite (Fitzgerald et al., 1998, Gray et al., 1996, Loosveldt et al., 2002, Lynn et al., 2005, MaCurdy et al., 1998, Mirowsky and Reynolds, 2000, Watson and Wooden, 2009), and persons living in urban areas are more likely to attrite (Gray et al., 1996, Lynn et al., 2005, Watson and Wooden, 2009). Attitudinal measures have also been examined as predictors of panel attrition, including social (Waterton and Lievesley, 1987) and political attitudes (Lepkowski and Couper, 2002, Loosveldt and Carton, 1997, Loosveldt et al., 2002). Nonrespondents in election studies tend to be those who are less politically interested or involved in politics (Brehm, 1993, Burden, 2000, Voogt and Saris, 2003; Voogt, 2005).

Unlike cross-sectional surveys, sampled units in later waves of a panel survey have already experienced at least one interview. As such, characteristics of the previous survey interview also predict panel attrition. Item nonresponse to questions in a previous wave of the study (Burkam and Lee, 1998, Lepkowski and Couper, 2002, Loosveldt and Carton, 1997, Loosveldt et al., 2002, Watson and Wooden, 2009) is associated with higher attrition rates, while longer interviews in prior waves are associated with higher retention rates (Bogen, 1996).

Although declines in response rates for cross-sectional household surveys over the last four decades are well documented (Curtin et al., 2005, de Leeuw and de Heer, 2002, Steeh, 1981), changes in panel survey attrition rates over these years have been examined less frequently, with limited evidence of declining trends (Agency for Healthcare Research, 2008, Atrostic et al., 2001, Westat, 1998). More importantly, it is unknown whether the kinds of people who participate in surveys are changing. To our knowledge, no research has examined whether these factors consistently predict attrition from year to year, holding the wave constant (that is, from wave 1 to wave 2). In other words, are today's nonrespondents different from those of a few decades ago?
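The wave-1-to-wave-2 attrition rate discussed above can be computed directly from respondent records, one rate per study year. A minimal sketch in Python; the records below are hypothetical illustrations, not actual ANES data:

```python
from collections import defaultdict

def attrition_rates(records):
    """Return {year: attrition rate}, where attrition means a wave-1
    (pre-election) respondent did not complete wave 2 (post-election)."""
    wave1 = defaultdict(int)   # wave-1 respondents per study year
    lost = defaultdict(int)    # of those, how many attrited
    for r in records:
        wave1[r["year"]] += 1
        if not r["completed_wave2"]:
            lost[r["year"]] += 1
    return {year: lost[year] / wave1[year] for year in wave1}

# Hypothetical respondent records for illustration only.
sample = [
    {"year": 1964, "completed_wave2": True},
    {"year": 1964, "completed_wave2": False},
    {"year": 1964, "completed_wave2": True},
    {"year": 1964, "completed_wave2": True},
    {"year": 2004, "completed_wave2": True},
    {"year": 2004, "completed_wave2": False},
]

rates = attrition_rates(sample)
# 1964: 1 of 4 attrited -> 0.25; 2004: 1 of 2 attrited -> 0.5
```

Comparing such year-by-year rates is the descriptive first step; the paper's question is whether the *correlates* of these rates, not just their levels, have shifted.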

Several social changes have the potential to affect who participates in surveys. Women’s labor force participation in the US increased substantially, with more women working outside the home (Blau, 1998, Dye, 2005, US Census Bureau, 2008). There has been a general aging of the US population (Hobbs and Stoops, 2002), accompanied by an increase in education levels (Stoops, 2004). The racial/ethnic composition of the US also became increasingly non-white (Hobbs and Stoops, 2002). Where people live has also changed, with a shift back to urban centers, especially in the 1990s and early 2000s (Mackun, 2005). Some of these changes over time can be attributed to historical events spurring social change, others to birth cohort differences, and others to within-individual changes over time (Alwin and McCammon, 2003, Fienberg and Mason, 1979). As the public changes, its views on the value of surveys, or the time that can be devoted to them, might change as well. We hypothesize that the social groups defined by sex, race, age, education, and urbanicity are the most likely to exhibit changes in participation rates over time.

In an election study, participation may be particularly attractive to respondents who are affiliated with certain political parties. We hypothesize that persons whose political party wins a Presidential election will be more likely to cooperate with the post-election survey request. Thus, as political power changes over time, so will cooperation rates. These differences may not be reflected in overall retention rates as we do not expect differences in contact rates over time for different political parties.

This paper examines changes in correlates of post-election survey participation in the American National Election Studies from 1964 to 2004. We also examine whether there are differences in correlates of making contact with the sampled person and cooperation, conditional on contact, between 1988 and 2004. These analyses permit us to understand whether there is change in characteristics of nonrespondents, and thus the potential for change in nonresponse bias of survey estimates, over these years.
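The distinction drawn here between contact and cooperation, conditional on contact, implies a simple decomposition: the overall retention rate is the product of the contact rate and the conditional cooperation rate. A hedged sketch in Python; the counts are hypothetical, and this ignores the finer AAPOR disposition-code distinctions:

```python
def panel_rates(n_wave1, n_contacted, n_interviewed):
    """Decompose wave-2 retention into contact and cooperation components.

    n_wave1: wave-1 respondents eligible for the post-election wave
    n_contacted: those reached at wave 2
    n_interviewed: those who completed the wave-2 interview
    """
    contact = n_contacted / n_wave1
    cooperation = n_interviewed / n_contacted  # conditional on contact
    retention = n_interviewed / n_wave1        # = contact * cooperation
    return contact, cooperation, retention

# Hypothetical counts for illustration only.
contact, cooperation, retention = panel_rates(1500, 1350, 1200)
```

The decomposition matters for the party-affiliation hypothesis above: a shift in cooperation among supporters of the losing party could leave the contact rate, and even the overall retention rate, largely unchanged.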


Data and methods

A repeated panel survey (Kalton and Citro, 1993), in which wave one and wave two of a panel survey are fielded at different points in time, is needed for this study. The American National Election Study (ANES) fits this criterion. The goal of the ANES is to understand voting behavior, political participation and political, social and economic attitudes of American adults.

Results

We start with the benchmark comparisons between the pre-election survey respondents and the benchmark Current Population Survey data. We then present results from the modeling of post-election survey retention rates. Modeling results are divided into characteristics for which we found changes in retention rates over time and those whose retention rates stayed relatively the same. We examine models for the overall time period (1964–2004) and then models separately for 20 year segments (1964–1984

Discussion and conclusion

The relationship between characteristics of people and survey participation today is not necessarily the same as the relationship between those characteristics and survey participation many years ago. We examined whether predictors of panel attrition changed magnitude or direction over a 40 year time period encompassing eleven administrations of the American National Election Studies, and predictors of contactability and cooperation over a 16 year time period. Based on cross-sectional literature,

Acknowledgments

An earlier version of this paper was presented at the American Association for Public Opinion Research, New Orleans, May 15, 2008. Any opinions, findings and conclusions or recommendations expressed in these materials are those of the authors and do not necessarily reflect the views of the funding organizations for the American National Election Studies.

References (73)

  • AAPOR, 2009. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys.
  • Agency for Healthcare Research and Quality, 2008. MEPS-HC Response Rates by...
  • L.S. Aiken et al., 1991. Multiple Regression: Testing and Interpreting Interactions.
  • D.F. Alwin et al. Generations, cohorts and social change.
  • American National Election Studies, 2004. The 2004 National Election Study. University of Michigan, Center for...
  • American National Election Studies, 2008. Other Election Studies. Stanford University and the University of Michigan,...
  • C.S. Aneshensel et al., 1989. Participation of Mexican American female adolescents in a longitudinal panel survey. Public Opinion Quarterly.
  • B.K. Atrostic et al., 2001. Nonresponse in US government household surveys: consistent measures, recent trends, and new insights. Journal of Official Statistics.
  • J. Bethlehem, 2002. Weighting nonresponse adjustments based on auxiliary information. In: Groves, R.M., Dillman, D.A.,...
  • F.D. Blau, 1998. Trends in the well-being of American women, 1970–1995. Journal of Economic Literature.
  • K. Bogen. The effect of questionnaire length on response rates – a review of the literature. In: Proceedings of the...
  • J. Brehm, 1987. Who’s Missing? An analysis of non-response and undercoverage in the 1986 national election studies...
  • J. Brehm, 1993. The Phantom Respondents: Opinion Surveys and Political Representation.
  • B.C. Burden, 2000. Voter turnout and the national election studies. Political Analysis.
  • D.T. Burkam et al., 1998. Effects of monotone and nonmonotone attrition on parameter estimates in regression models with educational data: demographic effects on achievement, aspirations and attitudes. Journal of Human Resources.
  • N. Burns, D.R. Kinder, S.J. Rosenstone, V. Sapiro, American National Election Studies, 2001. National Election...
  • L.M. Casper, L.E. Bass, 1998. Voting and registration in the election of November 1996. Current Population Reports,...
  • M.P. Couper, 1998. Measuring survey quality in a CASIC environment. In: Proceedings of the American Statistical...
  • R. Curtin et al., 2005. Changes in telephone survey nonresponse over the past quarter century. Public Opinion Quarterly.
  • E. de Leeuw et al. Trends in household survey nonresponse: a longitudinal and international perspective.
  • W.E. Deming, 1953. On a probability mechanism to attain an economic balance between the resultant error of response and the bias of nonresponse. Journal of the American Statistical Association.
  • M. DeBell, J.A. Krosnick, 2009. Computing weights for American national election study survey data. ANES Technical...
  • J.L. Dye. Fertility of American women: June 2004.
  • S.E. Fienberg et al., 1979. Identification and estimation of age-period-cohort models in the analysis of discrete archival data. Sociological Methodology.
  • J. Fitzgerald et al., 1998. An analysis of sample attrition in panel data: the Michigan panel study of income dynamics. Journal of Human Resources.
  • R. Gray et al., 1996. Exploring survey nonresponse: the effect of attrition on a follow-up of the 1984–85 health and life style survey. The Statistician.
  • R.M. Groves, 2006. Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly.
  • R.M. Groves et al., 1998. Nonresponse in Household Interview Surveys.
  • R.M. Groves et al., 2008. The impact of nonresponse rates on nonresponse bias: a meta-analysis. Public Opinion Quarterly.
  • R.M. Groves, J.M. Brick, M. Couper, W. Kalsbeek, B. Harris-Kojetin, F. Kreuter, B.-E. Pennell, Raghunathan,...
  • F. Hobbs et al. Demographic Trends in the 20th Century.
  • K. Holder, 2006. Voting and registration in the election of November 2004. Current Population Reports, Series P-20, No....
  • A. Jamieson, H.B. Shin, J. Day, 2002. Voting and registration in the election of November 2000. Current Population...
  • G. Kalton et al., 1993. Panel surveys: adding the fourth dimension. Survey Methodology.
  • J.M. Lepkowski et al. Nonresponse in the second wave of longitudinal household surveys.
  • G. Loosveldt, A. Carton. Evaluation of nonresponse in the Belgian Election Panel Study ‘91–‘95. In: 52nd Annual...
    1 NRC Picker (present address).