Does survey format influence self-disclosure on sensitive question items?

https://doi.org/10.1016/j.chb.2011.09.007

Abstract

Although internet-based survey research offers advantages over other formats, it remains an open question whether survey mode influences measurement equivalence. While most research on survey format finds little or no difference in measurement equivalence, the interaction between sensitive topics and survey modality is not fully understood. Additionally, research suggests gender differences in item response on sensitive topics. The present study examined archival data from a college health survey administered in both online and paper–pencil formats, evaluating the interaction between gender, survey format, and item sensitivity level. Results indicate that question topic sensitivity has a large effect on missing data and that survey format has a moderate effect. These findings have important implications for survey design and the interpretation of outcomes.

Highlights

► We examine gender and survey format’s influence on sensitive item self-disclosure.
► Web-based survey formats elicit greater levels of sensitive item self-disclosure.
► Male survey responders self-disclose at higher rates on Web-based surveys.

Introduction

Behavioral science research focuses on designs that help explore, describe, explain, and predict human behavior (Cozby, 2007). Survey research modalities are frequently used when the researcher seeks to explore or describe what people are thinking or doing. Surveys can be administered in interview or written format and use questions to collect information from respondents; standardizing those questions increases reliability and validity. One advantage of surveys is that they allow the collection of large amounts of data on a wide variety of subjects with comparatively little expense and time.

With the increased use of surveys, the science of research methodology has explored the comparative effectiveness of face-to-face interviews, phone surveys, and mailed paper–pencil surveys, and is now evaluating the efficacy of a variety of Web-based research formats (Knapp & Kirk, 2003). The current study explores whether item non-response rates on personally sensitive items differ significantly between Web-based and paper–pencil surveys, and whether gender influences item non-response rates. The study attempts to validate the use of electronic Web-based surveys for collecting sensitive information, using archival data from the American College Health Association’s National College Health Assessment (NCHA).

Section snippets

Changing survey considerations

With the widespread use of the World Wide Web, research is quickly transitioning from paper-based survey formats to Web-based designs. Electronic Web-based formats offer a number of benefits, including reduced cost, easier distribution, improved data accuracy, and relatively effortless data entry and coding (Dillman, 2007; Gunn, 2002; Lefever et al., 2007; Tourangeau, 2004). While these benefits encourage the increasing use of this survey format within a number of fields …

Web-based versus paper-based formats

Well-designed research includes careful selection of the survey format with the goal of limiting sampling error (i.e., sampling only a subset of units of the survey population), coverage error (i.e., failing to represent all elements of the study population), measurement error (i.e., subjects responding inaccurately due to “poor question wording and questionnaire construction” [p. 9]), and non-response error (i.e., differences in subject characteristics between those who respond to the survey and …

Self-disclosure of sensitive information in research

Self-disclosure is the revealing of previously private information about one’s self to others and can include thoughts, feelings, or experiences (Derlega et al., 1993; Joinson, 2001; Joinson & Paine, as cited in Joinson, McKenna, Postmes, & Reips, 2007). Personal self-disclosure is growing at a rapid rate on the internet (e.g., chat rooms, instant messaging, social networks, and blogs), and participants frequently share intimate and sensitive information. However, it is known that …

Purpose of the study

The American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (1999) indicate the need for cross-format equivalencies to be established prior to direct comparison of data from paper–pencil and internet surveys. With the current trend moving from paper–pencil formats to Web formats, establishing when and where cross-format equivalencies hold is critical. The present study explores whether there is a significant difference …

Method

Missing-data rates in the NCHA data were examined for men and women across Web-based and paper-based survey formats on health-related questions of high, moderate, low, and no sensitivity. There is currently no standard in the literature for measuring the sensitivity level of survey questions; question sensitivity was therefore based on ratings by a sample of 78 college students.
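The tabulation this section describes can be sketched as follows. The records, column meanings, and values here are hypothetical placeholders (the NCHA dataset is not reproduced); the sketch only illustrates computing an item non-response rate per format × gender × sensitivity cell:

```python
from collections import defaultdict

# Hypothetical response records: (format, gender, sensitivity, answer).
# None marks a skipped item (item non-response); values are illustrative only.
records = [
    ("web",   "male",   "high", None), ("web",   "male",   "high", 3),
    ("web",   "female", "high", 2),    ("web",   "female", "high", 4),
    ("paper", "male",   "high", None), ("paper", "male",   "high", None),
    ("paper", "female", "high", 1),    ("paper", "female", "high", None),
]

def nonresponse_rates(records):
    """Return the missing-data rate for each (format, gender, sensitivity) cell."""
    missing = defaultdict(int)
    total = defaultdict(int)
    for fmt, gender, sens, answer in records:
        cell = (fmt, gender, sens)
        total[cell] += 1
        if answer is None:
            missing[cell] += 1
    return {cell: missing[cell] / total[cell] for cell in total}

rates = nonresponse_rates(records)
print(rates[("paper", "male", "high")])  # 1.0: both paper/male items were skipped
```

These per-cell rates are the quantities compared across format, gender, and sensitivity level in the analyses that follow.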

Sensitivity of NCHA questions

Mean sensitivity ratings were calculated for 200 NCHA questions. Based on the mean ratings by the sample of 78 students, questions were divided into quartiles. The mean sensitivity ratings, on a 4-point Likert scale, for the least sensitive, low sensitivity, moderate sensitivity, and highest sensitivity questions were 1.33 (SD = 0.69), 1.58 (SD = 0.86), 1.98 (SD = 0.98), and 2.62 (SD = 1.14), respectively. Of the 200 total items, 192 (96%) showed no ethnicity difference in their sensitivity …
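The quartile split described above can be sketched as follows. The ratings are synthetic stand-ins on the 4-point scale (the actual 200 item means are not reproduced); each item is ranked by its mean rating and assigned to one of four equal-sized sensitivity groups:

```python
# Synthetic mean sensitivity ratings for illustration (4-point Likert scale);
# in the study, each of 200 items was rated by 78 students and averaged.
mean_ratings = [1.1, 1.4, 1.6, 1.9, 2.1, 2.4, 2.8, 3.2]

def quartile_labels(ratings):
    """Assign each item a sensitivity quartile label based on its mean rating."""
    labels = ["least", "low", "moderate", "high"]
    ranked = sorted(range(len(ratings)), key=lambda i: ratings[i])
    n = len(ratings)
    out = [None] * n
    for rank, idx in enumerate(ranked):
        out[idx] = labels[min(rank * 4 // n, 3)]
    return out

print(quartile_labels(mean_ratings))
# → ['least', 'least', 'low', 'low', 'moderate', 'moderate', 'high', 'high']
```

A rank-based split like this guarantees four equal-sized groups even when ratings cluster near the low end of the scale, as the reported quartile means (1.33 to 2.62) suggest they did.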

Discussion

Twenty-five years ago, evidence first suggested that fewer items remain unanswered on electronic formats, and that this could be a reason to question the interchangeability of data from two different survey modalities (Kiesler & Sproull, 1986). Our study confirms that Web-based surveys yield more data on sensitive questions than do paper-based surveys. Although recent studies (Gosling et al., 2004; Luce et al., 2007; Naus et al., 2009; Suris et al., 2007) suggest it is …

Conclusion

The results of this study support the benefits of Web-based surveys for collecting data on sensitive issues. While research suggests that perceptions of privacy and confidentiality play a greater role in sensitive information disclosure than survey modality does (Huang, 2006), alternative factors such as format familiarity and cultural expectations of self-disclosure may also influence survey response. While Web-based surveys may be efficient with college populations and other populations …

References (50)

  • R. Tourangeau et al.

    Humanizing self-administered surveys: Experiments on social presence in web and IVR surveys

    Computers in Human Behavior

    (2003)
  • American College Health Association

    American College Health Association National College Health Assessment (ACHA-NCHA), Spring 2006 Reference Group Data Report (Abridged)

    Journal of American College Health

    (2007)
  • American Educational Research Association, American Psychological Association, & National Council on Measurement in...
  • E. Babbie

    The practice of social research

    (2001)
  • Booth-Kewley, S., Larson, G., & Miyoshi, D. (2007). Social desirability effects on computerized and paper-and-pencil...
  • R. Carini et al.

    College student responses to web and paper surveys: Does mode matter?

    Research in Higher Education

    (2003)
  • J. Cohen

    A power primer

    Psychological Bulletin

    (1992)
  • M. Couper

    Web surveys: A review of issues and approaches

    The Public Opinion Quarterly

    (2000)
  • Cozby, P. (2007). Methods in behavioral research (9th ed.). New York,...
  • V. Derlega et al.

    Self-disclosure

    (1993)
  • D. Dillman

    Mail and internet surveys: The tailored design method

    (2007)
  • K. Dindia et al.

    Sex differences in self-disclosure: A meta-analysis

    Psychological Bulletin

    (1992)
  • R.D. Gibbons et al.

    Estimation of effect size from a series of experiments involving paired comparisons

    Journal of Educational Statistics

    (1993)
  • S.D. Gosling et al.

    Should we trust web-based studies? A comparative analysis of six preconceptions about internet questionnaires

    American Psychologist

    (2004)
  • J. Greene et al.

    Telephone and web: Mixed-mode challenge

    Health Services Research

    (2008)