Does survey format influence self-disclosure on sensitive question items?
Highlights
► We examine gender and survey format's influence on sensitive-item self-disclosure.
► Web-based survey formats elicit greater levels of sensitive-item self-disclosure.
► Male survey responders self-disclose at higher rates on Web-based surveys.
Introduction
Behavioral science research focuses on designs that help explore, describe, explain, and predict human behavior (Cozby, 2007). Survey research modalities are frequently used when the researcher seeks to explore or describe what humans are thinking or doing. Surveys can be administered in an interview or written format and use questions to collect information from respondents, which, if standardized, increases reliability and validity. Among the advantages of surveys are that they allow large amounts of data to be collected with less expense and time, and that they can gather information on a wide variety of subjects.
With the increased use of surveys, the science of research methodology has explored the comparative effectiveness of face-to-face interviews, phone surveys, and mailed paper–pencil surveys, and is now assessing the efficacy of a variety of Web-based research formats (Knapp & Kirk, 2003). The current study explores whether there is a significant difference in item non-response rates on personally sensitive items when using Web-based surveys versus paper–pencil surveys, and also whether gender influences item non-response rates. This study attempts to validate the use of electronic Web-based surveys in the collection of sensitive information using archival data from the American College Health Association’s National College Health Assessment (NCHA).
Section snippets
Changing survey considerations
With high use of the World Wide Web, research is quickly transitioning from paper-based survey formats to Web-based designs. There are a number of benefits offered by electronic Web-based formats including reduction of cost, increased ease of distribution, improved data accuracy and relatively effortless data entry and data coding (Dillman, 2007, Gunn, 2002, Lefever et al., 2007, Tourangeau, 2004). While these benefits encourage the increasing use of this survey format within a number of fields
Web-based versus paper-based formats
Well-designed research includes careful selection of the survey format with the goal of limiting sampling error (i.e., limited sampled units of the survey population), coverage error (i.e., missing a representation of all elements of the study population), measurement error (i.e., survey subjects respond inaccurately due to “poor question wording and questionnaire construction” [p. 9]), and non-response error (i.e., subject characteristic differences between those who respond to the survey and
Self-disclosure of sensitive information in research
Self-disclosure is the revealing of previously private information about one’s self to others and can include thoughts, feelings, or experiences (Derlega et al., 1993; Joinson, 2001; Joinson & Paine, as cited in Joinson, McKenna, Postmes, & Reips, 2007). Individual personal self-disclosure is growing at a rapid rate on the internet (e.g., chat rooms, instant messaging, social networks, and blogs), and participants frequently share intimate and sensitive information. However, it is known that
Purpose of the study
The American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (1999) indicate the need for cross-format equivalencies to be established prior to direct comparison of data on paper–pencil and internet surveys. With the current trend moving from paper–pencil formats to Web-formats, establishing when and where there are cross-format equivalencies is critical. The present study explores whether there is a significant difference
Method
The missing-data rates in the NCHA data were examined for data collected from men and women in Web-based and paper-based survey formats for health-related questions of high, moderate, low, and no sensitivity. There is currently no standard in the literature for measuring the sensitivity level of survey questions. As a result, question sensitivity was based on ratings by a sample of 78 college students.
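The core computation described above — an item non-response (missing-data) rate for each format-by-gender cell — can be sketched as follows. This is an illustrative sketch only, not the authors' code: the column names (`format`, `gender`, and the item columns) and the toy data are assumptions standing in for the actual NCHA variables.

```python
# Hypothetical sketch of the missing-data-rate computation; column names
# and data are stand-ins for the NCHA dataset, not its real codebook.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200

# Toy responses: NaN marks an unanswered (non-response) item.
df = pd.DataFrame({
    "format": rng.choice(["web", "paper"], size=n),
    "gender": rng.choice(["male", "female"], size=n),
    "item_high_sens": rng.choice([1.0, 2.0, np.nan], size=n),
    "item_low_sens": rng.choice([1.0, 2.0, np.nan], size=n),
})

item_cols = ["item_high_sens", "item_low_sens"]

# Item non-response rate = share of missing answers per format-by-gender cell.
rates = df.groupby(["format", "gender"])[item_cols].apply(
    lambda g: g.isna().mean()
)
print(rates)
```

Comparing these cell-level rates across formats and genders is the kind of contrast the study's research questions call for.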
Sensitivity of NCHA questions
Mean sensitivity ratings were calculated for 200 NCHA questions. Based on the mean sensitivity ratings by the sample of 78 students, questions were divided into quartiles. The mean sensitivity ratings, on a 4-point Likert scale, for the least sensitive, low sensitivity, moderate sensitivity, and highest sensitivity questions were 1.33 (SD = 0.69), 1.58 (SD = 0.86), 1.98 (SD = 0.98), and 2.62 (SD = 1.14), respectively. Of the 200 total items, 192 (96%) had no ethnicity difference in their sensitivity
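The quartile split described above can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the authors' procedure: the ratings are simulated stand-ins for the 78 students' actual ratings, and ranking before the cut is simply one common way to guarantee four equal-sized groups.

```python
# Illustrative sketch: divide 200 items into four equal-sized sensitivity
# groups by mean rating. The ratings here are simulated, not the real data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Toy stand-in: 200 items, each rated on a 4-point scale by 78 students.
ratings = rng.integers(1, 5, size=(78, 200))
mean_rating = pd.Series(ratings.mean(axis=0), name="mean_sensitivity")

# Ranking first guarantees unique cut points, so qcut yields exactly
# four groups of 50 items, ordered least to most sensitive.
quartile = pd.qcut(mean_rating.rank(method="first"), q=4,
                   labels=["least", "low", "moderate", "highest"])

# Mean rating within each quartile (analogous to the figures reported above).
print(mean_rating.groupby(quartile, observed=True).mean())
```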
Discussion
Twenty-five years ago, early evidence indicated that fewer items remain unanswered on electronic formats, and that this could be a reason to question the interchangeability of data coming from two different survey modalities (Kiesler & Sproull, 1986). Our study confirms that Web-based surveys yield more data on sensitive questions than do paper-based surveys. Although recent studies (Gosling et al., 2004, Luce et al., 2007, Naus et al., 2009, Suris et al., 2007) suggest it is
Conclusion
The results of this study support the benefits of Web-based surveys in research when collecting data on sensitive issues. While research suggests privacy and confidentiality perception play a greater role in sensitive information disclosure than does survey modality (Huang, 2006), alternative factors such as format familiarity and cultural expectation of self-disclosure may also influence survey response. While Web-based surveys may be efficient with college populations and other populations
References (50)
- et al. (2008). The impact of computer versus paper–pencil survey, and individual versus group administration, on self-reports of sensitive behaviors. Computers in Human Behavior.
- et al. (2010). Comparison of paper-and-pencil versus Web administration of the Youth Risk Behavior Survey (YRBS): Participation, data quality, and perceived privacy and anonymity. Computers in Human Behavior.
- et al. (2004). Pixels or pencils? The relative effectiveness of Web-based versus paper surveys. Library & Information Science Research.
- (2006). Do print and Web surveys provide the same results? Computers in Human Behavior.
- et al. (2008). Measuring self-disclosure online: Blurring and non-response to sensitive items in web-based surveys. Computers in Human Behavior.
- et al. (2003). Internet and touch-tone phones for self-administered surveys: Does methodology matter? Computers in Human Behavior.
- et al. (2007). Reliability of self-report: Paper versus online administration. Computers in Human Behavior.
- et al. (2006). Comparing web and mail responses in a mixed mode survey in college alcohol use research. Addictive Behaviors.
- et al. (2009). From paper to pixels: A comparison of paper and computer formats in psychological assessment. Computers in Human Behavior.
- et al. (2007). Aggression, impulsivity, and health functioning in a veteran population: Equivalency and test-retest reliability of computerized and paper-and-pencil administrations. Computers in Human Behavior.