
Determinants of Item Nonresponse to Web and Mail Respondents in Three Address-Based Mixed-Mode Surveys of the General Public¹

TECHNICAL REPORT 12-001

This report may be downloaded at: http://www.sesrc.wsu.edu/dillman/papersweb/2012.html

January 2012

Submitted by

Benjamin L. Messer, Graduate Research Assistant
Michelle L. Edwards, Graduate Research Assistant
and Don A. Dillman, Regents Professor

Social & Economic Sciences Research Center
PO Box 644014; Wilson Hall 133
Washington State University
Pullman, WA 99164-4014
509-335-1511
509-335-0116 (fax)
[email protected]
[email protected]

¹ Support for this research was provided by the USDA-National Agricultural Statistics Service and the NSF-National Center for Science and Engineering Statistics, under a Cooperative Agreement to the Social and Economic Sciences Research Center (SESRC) at Washington State University. Additional support was provided by the SESRC.


TABLE OF CONTENTS

ABOUT THE AUTHORS
ABSTRACT
I. INTRODUCTION
II. BACKGROUND
     Demographic Effects
     Question Effects
III. METHODS
     Statistical Analyses
IV. RESULTS
     Item Nonresponse Rates by Mode
     Demographic Analyses
     Question Analyses
V. CONCLUSIONS
REFERENCES


About the Authors

Benjamin L. Messer is a Ph.D. Candidate in the Department of Sociology at Washington State University (WSU) and a graduate research assistant in the Social and Economic Sciences Research Center. A graduate of the Georgia Institute of Technology (BS 2005), he has co-authored articles on methodological issues in Social Science Research and Public Opinion Quarterly and on environmental issues in Social Science Quarterly. His Ph.D. dissertation research combines these methodological and environmental interests in an experiment aimed at improving Web and mail survey response to multi-state household surveys, with an examination of the determinants of preferences for meeting future electricity needs.

Michelle L. Edwards is a Ph.D. Candidate in the Department of Sociology at Washington State University (WSU) and a graduate research assistant with the Social and Economic Sciences Research Center. A graduate of Rice University (BA 2005) and Texas State University-San Marcos (MA 2009), she has authored and co-authored articles on various topics, including social disorganization theory in Deviant Behavior and community organizers for agricultural workers in Organization & Environment. Her Ph.D. dissertation research examines resident perceptions of water governance institutions at different spatial scales across two states.

Don A. Dillman is Regents Professor, Department of Sociology, and Deputy Director for Research in the Social and Economic Sciences Research Center (SESRC) at Washington State University. He maintains an active research program of experimentation on ways of improving the quality of mixed-mode surveys. He is the author of more than 250 publications, including Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method, 3rd edition (Dillman, Smyth, and Christian; John Wiley Co., Hoboken, N.J.). Recent publications have emphasized measurement and nonresponse aspects of survey quality, including how to obtain responses over the Internet from general public populations that can only be contacted by mail.


Determinants of Item Nonresponse to Web and Mail Respondents in Three Address-Based Mixed-Mode Surveys of the General Public

Benjamin L. Messer, Michelle L. Edwards, and Don A. Dillman

ABSTRACT

Item nonresponse in self-administered modes such as Web and mail can be a major problem affecting survey data quality and, in some cases, may be as severe as unit nonresponse. Moreover, little is known about the determinants of item nonresponse in Web and mail household surveys. In this paper, we assess item nonresponse differences by Web and mail modes, question types (e.g., factual, attitudinal, behavioral) and formats (e.g., nominal, ordinal, multi-item, open-end), and respondent demographics (e.g., gender, age, education, race, and income) in three general public household surveys. For the three surveys, each conducted in the northwestern U.S. in 2007, 2008, and 2009, respectively, we used address-based sampling with the U.S. Postal Service's Delivery Sequence File and employed postal mail methods to send all contacts. Sampled respondents in each survey were presented with a) a mail-only response option, b) a mail response option with a Web follow-up sent two weeks later (i.e., mail+Web), or c) a Web response option with a mail follow-up sent two weeks later (i.e., Web+mail). The Web and mail questionnaires in each survey were designed very similarly in order to minimize and control for effects from visual design and layout. This paper serves to quantify and describe item nonresponse differences and the sources of those differences, and to identify potential ways of reducing item nonresponse in Web and mail modes of data collection.


I. Introduction

Finding effective methods for surveying the general public over the Internet is an important goal for survey research. One method that seems to be showing signs of success uses address-based sampling with different mail contact strategies to deliver a Web survey request to a sample of households. A "Web+mail" design, in which a Web survey request is mailed to households, followed by a mail alternative two weeks later, has been shown to produce response rates of 46-55%, with about two-thirds of the responses coming over the Web (Smyth et al. 2010; Messer & Dillman 2011). A mail-only design obtained higher overall response rates of 57-71%, but mail-only and Web+mail respondents were quite similar, indicating some consistency between the two designs in terms of nonresponse bias (Messer & Dillman 2011).

A potential shortcoming of data collection that depends upon using mail alone or as a supplement to the Web is item nonresponse. Web is generally perceived as a better option for controlling item nonresponse because of multiple design features (e.g., individual page construction, automatic branching from screen questions) that are typically not feasible in mail questionnaires (Kwak & Radler 2002). However, it remains unclear what specific factors might influence item nonresponse for Web and mail surveys. If use of the Web encourages respondents to complete more items in the questionnaire, then there may be a trade-off between the higher response rates obtained by mail and the higher quality data obtained by Web. This would further support the argument that there is value in encouraging greater numbers of households to respond via the Web rather than by mail.

Our purpose in this paper is to compare item nonresponse for three similarly-conducted Web and mail household studies, with a focus on potential sources of item nonresponse, including respondent demographics and question characteristics. Because individuals with quite different characteristics (e.g., age and education) choose to respond by Web and mail, we ascertain the extent to which differences in demographics vs. mode influence the patterns of item nonresponse. In addition, each of the surveys provides a variety of question types and formats that may also contribute to variations in item nonresponse. From these analyses we draw conclusions about the extent to which, and in what ways, differences in item nonresponse should be considered in the design of Web and mail mixed-mode surveys of the general public.


II. Background

Item nonresponse can be a significant source of error in surveys and is oftentimes higher than unit nonresponse, in which the respondent does not participate in or complete the survey at all (Dillman et al. 2002; Dixon & Tucker 2010). Item nonresponse occurs when a respondent participates in the survey but does not provide an answer to a question or item, or provides an answer that is not meaningful or substantive with regard to the question asked (Dillman et al. 2002). Item nonresponse results in missing data, which diminishes the validity and reliability of the data (Dillman et al. 2002).

Considerable research suggests that item nonresponse is a significant problem in mail surveys, at least compared to telephone and face-to-face interviewing (de Leeuw 1992; de Leeuw, Hox, & Huisman 2003). Despite the growing use of Web and mail survey modes, past research comparing item nonresponse between mail/paper questionnaires and Web/online questionnaires has primarily been limited to specific target populations (e.g., students, teachers/faculty, counselors) in which respondents are generally more Internet-literate and use the Internet more frequently than the general public (e.g., Brečko & Carstens 2007; Denscombe 2009; Kaplowitz, Hadlock, & Levine 2004; Kiesler & Sproull 1986; Kwak & Radler 2002; Schaefer & Dillman 1998; Wolfe et al. 2009).

In theory, item nonresponse can be eliminated from Web surveys by requiring answers to all questions. However, many Institutional Review Boards expect that all answers to survey items be voluntary, and thus all respondents must have the option to skip any item. In the three experiments in this study, we did not require answers or use special text messages² to encourage answers, because of the likely differential effect this would have on mid-survey terminations between Web and mail modes. Research on item nonresponse rates between Web and mail surveys has produced mixed results: some studies found lower rates for Web surveys than mail surveys (Boyer et al. 2002; Kiesler & Sproull 1986; Kwak & Radler 2002), some found similar rates (Wolfe et al. 2009), and others found higher rates for Web surveys than mail surveys (Brečko & Carstens 2007; Manfreda & Vehovar 2002).

Besides survey mode, a number of other factors may affect item nonresponse rates, including respondent characteristics (e.g., age, gender, income, education) and question formats and types (Alkaya & Esin 2005; de Leeuw et al. 2003; Dillman et al. 2002). Each of these sources potentially affects the cognitive effort and capabilities required of respondents, and the more cognitive effort required to answer a question, the more likely a respondent will not provide an answer or will provide an incorrect or invalid answer (Beatty & Herrmann 2002; Tourangeau & Bradburn 2010; Tourangeau, Rips, & Rasinski 2000). We are unable to account for variations in the cognitive processes of respondents, but we do address trends in item nonresponse by the sources described in more detail below.

² Excludes screen questions. Web respondents who tried to skip a branching item were told: "we need for an answer to be provided to this item so we know which items should be presented next."

Demographic Effects

Past research indicates that different types of people tend to respond to Web and mail surveys (i.e., younger, higher-educated, and more affluent people tend to respond to Web surveys at greater rates than other individuals) (Kaplowitz et al. 2004; Kwak & Radler 2002; Messer & Dillman 2010; Smyth et al. 2010). Although demographic comparisons are infrequently included in discussions of Web and/or mail item nonresponse (exceptions include Ferber 1966; Kaplowitz et al. 2004; Kwak & Radler 2002; Wolfe et al. 2009), several studies have shown that older and less educated respondents tend to have higher item nonresponse rates in many surveys (Dillman et al. 2002). In this study, examining demographic effects on item nonresponse by mode is made more difficult by the demographic differences between participants who responded by Web and participants who responded by mail when it was offered later (in the Web+mail design). Thus, it is unclear whether variation in item nonresponse rates is a function of the demographic differences between Web and mail respondents or a function of differential participation. In line with available research, we expect that older respondents with less education will exhibit higher item nonresponse rates than others, even when controlling for survey mode.

Question Effects

Question properties, including format and type, can influence item nonresponse in Web and mail surveys. Some question formats, such as open-ended, screened, and multi-item questions, might be relatively more challenging to answer than single, closed-ended questions. Past research has shown item nonresponse to be higher in the screened questions immediately following a branching question (Messmer & Seymour 1982). Other studies of question format in mail and Web surveys have largely focused on open- and closed-ended questions. For example, researchers have found that item nonresponse is lower in Web surveys than mail surveys for open-ended questions (Denscombe 2009) and closed-ended questions (Kwak & Radler 2002). We expect that question formats requiring more effort will have higher item nonresponse rates, regardless of mode, and that Web rates will be lower than mail rates.

Second, past research has yet to clearly identify which question types might be more likely to have higher item nonresponse rates. Dillman, Smyth, & Christian (2009) outline three different question types: factual, attitudinal, and behavioral. Factual questions ask respondents to provide information considered to be a fact, such as demographic questions. These questions are considered the most sensitive, since they are typically personal in nature, but they require less effort to answer since respondents typically know the answers beforehand (Dillman et al. 2009). Attitudinal questions ask respondents about their attitudes or opinions regarding the subject of the question, such as questions measuring respondent satisfaction. These questions can require very little to very much effort, largely depending on whether respondents have already-formed attitudes toward the topic and whether they will provide them (Dillman et al. 2009). Finally, behavioral questions ask respondents to report their behavior regarding the topic of the question, such as questions asking how respondents use the Internet or cell phones. These questions can require much effort to answer, depending on the topic in question and the specificity of the information requested (Dillman et al. 2009).

The limited research on item nonresponse by question type has produced mixed results, including lower item nonresponse for low-sensitivity items (Shoemaker et al. 2002) and lower item nonresponse for high-sensitivity items (Wolfe et al. 2009). In the surveys used here, we expect questions with high sensitivity and high required cognitive effort to have higher item nonresponse, regardless of mode.


III. Methods

We report analyses based on three experiments using address-based, random samples of general public households to test Web and mail strategies. Each experiment employed combinations of mail and Web modes. A "Web+mail" combination, or strategy, begins with a mailed request asking respondents to complete the survey on the Web, and concludes with a follow-up mail alternative (i.e., "mail follow-up") sent about two weeks later. A "mail+Web" strategy employs the opposite approach, starting with a mail survey and following up with a Web alternative (i.e., "Web follow-up"). A "mail-only" strategy was also used for comparison purposes, in which respondents received only mail questionnaires with no mention of the Web.

Experiment 1³ was a regional quality of life survey conducted in the rural Lewiston-Clarkston Valley in the Pacific Northwest in the summer of 2007. The questionnaire, titled the "2007 Lewiston & Clarkston Quality of Life Survey" (LCS), contained 51 numbered questions with up to 92 items about quality of life issues. Experiment 1 contained two treatment groups⁴ designed to test the effects of Internet and mail mixed-mode combinations: mail+Web⁵ and Web+mail⁶. A $5 cash incentive and an illustrated Web instruction card (Web card) were also sent to respondents with the initial survey request but were not tested experimentally.

³ See Smyth et al. (2009) for additional details regarding Experiment 1 not provided in this manuscript.
⁴ The original study contains four treatment groups, but only two (i.e., mail+Web and Web+mail) are congruent with the methods used in Experiments 2 & 3, so we excluded the other two.
⁵ Referred to as "mail preference" by the original authors. Respondents in this group were offered mail first, followed by a Web alternative sent two weeks later.
⁶ Referred to as "Web preference" by the original authors. Respondents in this group were offered Web first, followed by a mail alternative sent two weeks later.


Experiments 2 & 3⁷ were statewide surveys conducted in the state of Washington in 2008 and 2009, respectively. The questionnaire for Experiment 2, the "Washington Community Survey" (WCS), contained 41 numbered questions with up to 110 items about community satisfaction and quality of life issues. Nine treatment groups were fielded in two different phases to test the effects of mail+Web and Web+mail combinations, an initial cash incentive, and a Web instruction card. Experiment 3 used a questionnaire titled "Are You Better or Worse Off Than a Year Ago: A study of how households throughout Washington may have been affected by changes in the economy" (the Washington Economic Survey, or WES). It included 46 questions with up to 96 items about how the household had been affected by changes in the economy between September 2008 and 2009. Six treatment groups were designed to test the effects of Web and mail combinations using Priority Mail and a second $5 incentive.

Each experiment lasted about three months and employed four⁸ mail contacts; each contact was addressed to the "Resident" of the city or town in the postal address. Business, seasonal, and vacant addresses were excluded from the sample frame to ensure that sampled addresses were residential households that were also more likely to belong to full-time residents of the region (for Experiment 1) or state (for Experiments 2 and 3). Post Office Boxes that belonged to individuals were included in the sample frame because of the likelihood that most were alternatives for residential delivery.

⁷ See Messer & Dillman (2010; 2011) for additional details regarding Experiments 2 & 3 not provided in this manuscript.
⁸ A fifth contact was used in Experiment 3 (WES), but all respondents obtained after that request was mailed have been dropped to maintain consistency in the number of contacts used in the three studies (see Messer & Dillman 2011).


The three questionnaires contain many of the same questions on Internet and cell phone use and demographic characteristics. In addition, a unified-mode design strategy (Dillman 2000) was used to construct the paper and Web questionnaires in each experiment to minimize measurement differences. Both modes utilized the same graphical features, including colors, symbols, fonts, pictures, and spacing, and the same questions and question order. Questions in the paper survey were presented in stand-alone color boxes to emulate the "single question per page" design used in the Web version. Web respondents could move through the Web survey without providing answers to questions, the same as with a paper questionnaire. In addition, cascading style sheets were used in the Web survey to ensure compatibility across different Internet browsers. Considerable research shows that Web and paper responses are comparable when similar questionnaire constructions are used (Dillman et al. 2009).

Experiment 1 consisted of 1800 randomly selected residential addresses from Lewiston, ID and Clarkston, WA. Experiments 2 and 3 consisted of 5400 and 3900 randomly selected residential addresses in Washington, respectively, and each sample was stratified to include 50% of households from urban counties and 50% from rural counties. In addition, we combined some of the treatment groups in Experiments 2 & 3 for purposes of this analysis. Experiment 2 contained five Web+mail treatment groups, which we combined into one Web+mail group, as well as one mail-only treatment group and three mail+Web groups. We dropped the Web respondents from the mail+Web groups since there were so few (i.e., less than 3%) and combined these groups with the mail-only group (hereafter referred to as "mail-only"). Experiment 3 contained three Web+mail groups, which we combined into one Web+mail group, and three mail-only groups, which we combined into one mail-only group. This combining was done after statistical analyses showed minimal differences in respondent characteristics within the respective design groups of each experiment.

Statistical Analyses

Several statistical analyses were conducted with the data to determine the effects of demographic characteristics, question formats and types, and survey modes on item nonresponse rates. Due to questionnaire and population differences among the three experiments, the LCS, WCS, and WES are considered separately. In addition, post-stratification weights for the urban-rural county household population were applied to the WCS and WES data to offset the effects of disproportionately sampling rural county households (Lee & Forthofer, 2006). Further description of the weighting techniques utilized in this study is included in Messer and Dillman (2011).
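
To make the weighting step concrete, the sketch below computes a minimal post-stratification weight from an urban/rural county flag. This is a sketch under stated assumptions, not the report's actual procedure: the function name and the population share in the example are hypothetical, and the weights actually used follow Lee & Forthofer (2006) as described in Messer and Dillman (2011).

    # A minimal post-stratification weight, assuming each WCS/WES respondent
    # carries an urban/rural county flag. Under the 50/50 urban-rural
    # stratified samples, the sample share of urban households is about 0.5.
    def poststratification_weight(is_urban, pop_share_urban, sample_share_urban=0.5):
        """Weight = population share of the stratum / sample share of the stratum."""
        if is_urban:
            return pop_share_urban / sample_share_urban
        return (1.0 - pop_share_urban) / (1.0 - sample_share_urban)

    # Illustration only: if roughly 75% of a state's households were in urban
    # counties, urban respondents would be weighted 0.75/0.50 = 1.5 and rural
    # respondents 0.25/0.50 = 0.5, offsetting the oversample of rural counties.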

The dependent variables in the following analyses, item nonresponse rates, are calculated the same way for the mail and Web versions in each experiment. For each respondent, the number of missing responses was divided by the total number of possible complete responses and multiplied by 100. The total number of possible complete responses varied based on how respondents answered the branching questions. Overall rates are calculated by averaging individual rates for a particular mode. Respondents with over 50% of items missing were dropped from the analyses as partial respondents. Missing responses are indicated based on whether or not the respondent provided any answer on a particular item. Only unanswered items are counted as item nonresponses; non-substantive (i.e., "don't know" or "not sure") or incorrect responses are considered to be responses for purposes of these analyses.
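
A minimal sketch of this calculation, assuming a per-respondent record of which applicable items were answered (the data layout and names are hypothetical):

    # Item nonresponse rate for one respondent: missing / possible * 100,
    # where 'answered' covers only the items this respondent was eligible to
    # answer (branching shrinks the denominator) and is True if any answer,
    # including a non-substantive one, was given.
    def item_nonresponse_rate(answered):
        n_possible = len(answered)
        n_missing = sum(1 for a in answered if not a)
        return 100.0 * n_missing / n_possible

    # Overall rate for a mode: drop partials (over 50% missing), then average
    # the remaining individual rates.
    def overall_rate(individual_rates):
        kept = [r for r in individual_rates if r <= 50.0]
        return sum(kept) / len(kept)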

The demographic variables used in the analyses are: gender (dichotomous, female = 1); age (continuous); education (ordinal: high school or less = 0; some college, no degree = 1; 2-year, 4-year, or graduate/professional degree = 2); and income (nominal: less than $50,000 = 0; $50,000 to less than $100,000 = 1; $100,000 or more = 2; prefer not to say = 3).

The question format variables used in the analyses are: open-ended questions, in which respondents are asked to write or enter their responses in a blank answer space, and closed-ended questions, in which respondents are asked to fill in or select the radio button next to their preferred choice. The closed-ended questions are subdivided into different formats: ordinal scale, nominal scale, and multi-item questions. Ordinal scale questions contain answer categories that have a natural order (e.g., "1" to "10", "Very Good" to "Very Poor"), and range from 3 to 8 categories. Nominal scale questions contain answer categories without an order (e.g., marital status, yes/no), and range from 2 to 7 categories. Although many of the closed-ended questions contained "Don't Know," "Does not apply," or "Prefer not to say" answer categories, these were not counted in the scale length but were counted as responses for purposes of this analysis. Multi-item questions asked respondents to provide answers for multiple items in the same question; items ranged from three to 14, with an average of seven per question across the three experiments. The surveys also contained screened questions of different formats, which follow branching questions. In the mail questionnaires, branching was indicated by an arrow pointing to the next question for respondents who should branch and, for respondents who should skip ahead, by bold instructions ("Skip to QX") next to the relevant answer category(ies). On the Web, branching was automated so that respondents automatically received the appropriate next question, branch or not, by simply answering the previous question.

The question type variables used in the analyses are: 1) factual demographic, 2) factual non-demographic, 3) attitudinal, and 4) behavioral questions. Factual questions ask about a characteristic of the respondent or the respondent's household (e.g., age, employment status, income). Attitudinal questions ask about the respondent's attitude, opinion, or preference on a topic (e.g., "Do you feel/consider/think/believe..."). Behavioral questions ask about the respondent's behavior (e.g., using the Internet or a cell phone, changes in lifestyle).


IV. Results

Item Nonresponse Rates by Mode

Table 1 reports the sample sizes, unit response rates, and item nonresponse rates by survey design (mail-only or Web+mail) and mode (1st or 2nd mode) for each of the three experiments. Overall, the total unit response rates ranged from 40.1 percent (WCS) to 66.3 percent (LCS). The overall item nonresponse rates ranged from 3.6 percent (LCS) to 8.1 percent (WES). In each experiment, Web obtained the lowest item nonresponse rates, compared to mail-only and the mail follow-up; the latter obtained the highest rates. Table 2 shows that the differences between Web and mail follow-up item nonresponse rates are statistically significant across all experiments, with the Bonferroni-Holm correction. For the WCS and WES, the differences between mail follow-up and mail-only item nonresponse rates are also statistically significant. The other comparisons (Web vs. mail-only and Web+mail vs. mail-only) are not significant.

Table 1. Sample Sizes, Unit Response Rates¹, and Item Nonresponse Rates by Design and Mode, by Experiment.

Experiment (Total Items³) | Design    | N (N′)²     | Total Unit Resp. Rate % (n) | 1st Mode: Unit Resp. Rate % (n), Item NR Rate | 2nd Mode: Unit Resp. Rate % (n), Item NR Rate | Total Item NR Rate
LCS (92)                  | Mail Only | 800 (738)   | 66.3 (489)                  | Mail: 64.4 (475), 5.0                         | Web: 1.9 (14), DNC⁵                           | 5.0
LCS (92)                  | Web+Mail  | 600 (566)   | 55.1 (312)                  | Web: 40.8 (231), 2.7                          | Mail: 14.3 (81), 6.2                          | 3.6
WCS⁴ (110)                | Mail Only | 2200 (2069) | 50.4 (1043)                 | Mail: 49.2 (1017), 4.2                        | Web: 1.3 (26), DNC⁵                           | 4.2
WCS⁴ (110)                | Web+Mail  | 3200 (2993) | 40.1 (1200)                 | Web: 25.0 (747), 2.7                          | Mail: 15.1 (453), 6.9                         | 4.2
WES⁴ (96)                 | Mail Only | 1800 (1673) | 62.2 (1040)                 | Mail: 62.2 (1040), 8.1                        | Web: --                                       | 8.1
WES⁴ (96)                 | Web+Mail  | 2100 (1932) | 50.2 (969)                  | Web: 32.6 (630), 6.1                          | Mail: 17.5 (339), 11.6                        | 8.0

Notes: ¹Response rate = number of completes (I+P) / N′ = (I+P) / [(I+P)+(R+NC+O)+(UH+UO) − undeliverables] (AAPOR, 2009); ²N′ = N − undeliverables, which are the addresses in the sample that were no longer in service, as determined by whether the mailings were returned to the sender; ³Number of items varies per respondent depending on branching questions; ⁴Weighted data; ⁵DNC = did not calculate due to small sample size.


Table 2. Item Nonresponse Rate Comparisons for Treatment Groups Using a Chi-Squared Test with a Bonferroni-Holm Correction.

Comparison                   | LCS: F (p) [df]       | WCS¹: F (p) [df]       | WES¹: F (p) [df]
Web vs. Mail Follow-up       | 138.37* (.000) [1,51] | 418.19* (.000) [1,186] | 461.20* (.000) [1,209]
Web vs. Mail-only            | 89.92 (.707) [1,98]   | 210.15 (.079) [1,187]  | 271.45 (.029) [1,227]
Mail Follow-up vs. Mail-only | 140.57 (.012) [1,105] | 393.61* (.000) [1,252] | 442.66* (.000) [1,281]
Web+Mail vs. Mail-only       | 90.07 (.992) [1,125]  | 272.91 (.168) [1,262]  | 293.00 (.397) [1,291]

Notes: *p ≤ .01 with Bonferroni-Holm correction; ¹Weighted data.
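
To show how the correction in Table 2 operates, the sketch below applies the Holm (Bonferroni-Holm) step-down adjustment to one family of p-values. The example reuses the four WCS p-values from Table 2 as reported; the F statistics themselves come from the weighted tests and are not recomputed here.

    # Holm step-down adjustment: sort p-values ascending, multiply the k-th
    # smallest of m p-values by (m - k), and enforce monotonicity so adjusted
    # values never decrease down the sorted list.
    def holm_adjust(pvals):
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])
        adjusted = [0.0] * m
        running_max = 0.0
        for rank, i in enumerate(order):
            adj = min(1.0, (m - rank) * pvals[i])
            running_max = max(running_max, adj)
            adjusted[i] = running_max
        return adjusted

    # WCS column of Table 2, in row order:
    wcs_p = [0.000, 0.079, 0.000, 0.168]
    print(holm_adjust(wcs_p))  # [0.0, 0.158, 0.0, 0.168]: only the two
    # comparisons flagged * in Table 2 stay below the .01 threshold.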

Demographic Analyses

Table 3 displays the summary statistics for the demographic variables utilized in the item nonresponse analyses below. Overall, trends in respondent demographics appear to be mostly consistent across the three experiments. The percentage of female respondents was slightly higher in the mail follow-up group than in the Web and mail-only groups. The average age was lowest for Web respondents and highest for mail follow-up respondents. In terms of education, the percentage of respondents with a high school degree or less was highest in the mail follow-up group, followed by the mail-only group and the Web group. In contrast, the percentage of respondents with a 2-year, 4-year, or graduate/professional degree was highest in the Web group, followed by the mail-only group and the mail follow-up group. Similarly, the percentage of respondents with an income of less than $50,000 was highest in the mail follow-up group, followed by the mail-only group and the Web group, while the percentage with an income of $50,000 to less than $100,000 was highest in the Web group, followed by the mail-only group and the mail follow-up group. The same trend was also present in the income category of $100,000 or more. Thus, overall, the average mail follow-up respondent is more likely to be female and older, and to have lower education and income levels, than the average Web or mail-only respondent.

Table 3. Percentages (and Means) of Respondents in Each Demographic Category for Mail-only and Web+Mail Respondents, by Experiment.

                              |          LCS              |          WCS¹             |          WES¹
                              | Mail-only| Web  | Mail | Total | Mail-only| Web  | Mail | Total | Mail-only| Web  | Mail | Total
Total (n)                     | 475   | 224   | 80    | 304   | 1015  | 748   | 449   | 1197  | 1039  | 601   | 338   | 939
Gender (% Female)             | 57.6  | 59.2  | 68.4  | 61.6  | 59.4  | 55.8  | 63.1  | 59.4  | 55.6  | 56.6  | 57.4  | 58.2
Age (Mean)                    | (55.4)| (51.4)| (61.6)| (54.1)| (53.6)| (48.6)| (59.0)| (52.3)| (51.7)| (48.2)| (57.8)| (51.5)
  18-44 (%)                   | 25.9  | 27.2  | 17.5  | 24.7  | 29.5  | 40.9  | 19.9  | 33.3  | 35.0  | 38.8  | 21.5  | 32.9
  45-64                       | 40.4  | 53.6  | 31.3  | 47.7  | 44.9  | 39.8  | 36.4  | 38.5  | 42.4  | 45.5  | 42.5  | 44.5
  65 or more                  | 33.7  | 19.2  | 51.3  | 27.6  | 25.6  | 19.4  | 43.7  | 28.2  | 22.7  | 15.7  | 36.0  | 22.6
Education (%)
  HS or less                  | 30.1  | 21.4  | 38.0  | 25.7  | 19.1  | 11.4  | 26.6  | 17.2  | 43.3  | 34.1  | 62.1  | 44.3
  Some college, no degree     | 33.1  | 31.3  | 43.0  | 34.3  | 27.3  | 26.5  | 25.2  | 26.5  | 13.2  | 14.2  | 15.5  | 14.9
  2-,4-Yr., or Grad/Prof deg. | 36.9  | 47.3  | 19.0  | 39.9  | 52.2  | 61.7  | 44.0  | 56.3  | 42.4  | 50.4  | 20.3  | 40.9
Income (%)
  Less than $50K              | 53.5  | 39.0  | 68.8  | 46.7  | 34.2  | 29.6  | 44.5  | 35.8  | 40.0  | 29.3  | 57.7  | 39.7
  $50K to less than $100K     | 27.5  | 40.8  | 9.1   | 32.7  | 32.8  | 36.1  | 23.0  | 32.0  | 30.3  | 37.5  | 20.5  | 32.5
  $100K or more               | 8.08  | 9.9   | 6.5   | 9.0   | 17.5  | 20.1  | 9.0   | 16.5  | 19.1  | 23.7  | 6.1   | 18.2

Notes: ¹Weighted data. Within each experiment, the Web, Mail, and Total columns refer to the Web+Mail design.

Item nonresponse rates by demographic category, mode, and experiment are shown in Table 4. Overall, this table demonstrates that item nonresponse rates tend to be highest for mail follow-up respondents, followed by mail-only respondents, and lowest for Web respondents. In terms of the demographic categories, the highest item nonresponse rates tend to occur in the "65 or more" age category, the "high school or less" education category, and the "less than $50,000" and "prefer not to say" income categories. There are similar trends in item nonresponse for male and female respondents.


Table 4. Item Nonresponse Rates by Demographic Category for Mail-only and Web+Mail Respondents, by Experiment.

                              |          LCS              |          WCS¹             |          WES¹
                              | Mail-only| Web | Mail | Total | Mail-only| Web | Mail | Total | Mail-only| Web | Mail | Total
Overall Rate                  | 5.0  | 2.7  | 6.2  | 3.6  | 4.2  | 2.7  | 6.9  | 4.2  | 8.1  | 6.1  | 11.6 | 8.0
Gender
  Female                      | 4.7  | 2.4  | 6.6  | 3.6  | 4.3  | 2.7  | 6.8  | 4.3  | 7.8  | 6.1  | 10.6 | 7.6
  Male                        | 4.7  | 3.1  | 4.8  | 3.5  | 3.5  | 2.5  | 5.6  | 3.5  | 7.9  | 5.6  | 11.9 | 7.6
Age
  18-44                       | 2.0  | 1.7  | 2.3  | 1.8  | 2.2  | 2.0  | 3.2  | 2.2  | 5.5  | 4.4  | 8.4  | 5.3
  45-64                       | 2.8  | 2.9  | 4.1  | 3.1  | 3.3  | 2.7  | 4.5  | 3.3  | 7.8  | 6.5  | 9.2  | 7.3
  65 or more                  | 9.9  | 3.4  | 8.8  | 6.0  | 8.0  | 4.0  | 10.6 | 7.8  | 12.5 | 9.2  | 16.4 | 13.1
Education
  HS or less                  | 6.7  | 2.4  | 7.1  | 4.2  | 7.4  | 3.1  | 10.0 | 7.1  | 9.2  | 6.1  | 12.4 | 9.1
  Some college, no degree     | 4.6  | 3.0  | 4.9  | 3.6  | 3.4  | 2.9  | 6.6  | 4.2  | 8.8  | 5.5  | 10.3 | 7.2
  2-,4-Yr., or Grad/Prof deg. | 3.1  | 2.6  | 6.5  | 3.1  | 2.9  | 2.4  | 4.0  | 2.9  | 6.4  | 5.7  | 8.5  | 6.2
Income
  Less than $50K              | 4.7  | 2.1  | 6.2  | 3.6  | 5.1  | 2.7  | 7.5  | 4.9  | 9.3  | 5.6  | 11.5 | 8.6
  $50K to less than $100K     | 3.4  | 2.9  | 2.3  | 2.9  | 2.9  | 2.5  | 3.8  | 2.8  | 7.3  | 5.8  | 9.2  | 6.6
  $100K or more               | 2.9  | 3.4  | 3.3  | 3.4  | 2.7  | 2.2  | 3.6  | 2.5  | 5.7  | 5.7  | 7.5  | 6.0

Notes: ¹Weighted data. Within each experiment, the Web, Mail, and Total columns refer to the Web+Mail design.

We conducted bivariate and multivariate OLS regression models predicting item nonresponse rates by survey mode and individual demographic characteristics for each experiment. These results are displayed in Table 5. In Models 1, 3, and 5, we included only survey mode as a predictor of item nonresponse rates. These variables were all statistically significant at the 0.05 level or lower. In Models 2, 4, and 6, we included survey mode and also controlled for demographic characteristics. Using global F-tests, we found all of these models to be significant improvements over the models with only survey mode. In Models 2, 4, and 6, survey mode continues to be statistically significant (with one exception), even when controlling for demographic characteristics. Thus, on average, Web respondents tend to have significantly lower item nonresponse rates than mail-only respondents, holding demographics constant. Also, mail follow-up respondents tend to have significantly higher item nonresponse rates than mail-only respondents, controlling for demographic information. In terms of demographics, the trends vary somewhat across the three experiments. However, overall, age and education tend to be significant predictors of item nonresponse across all three experiments. With each additional year of age, the item nonresponse rate increases by about 0.10 units, holding other variables constant. Compared to respondents with a high school degree or less, respondents with at least some college tend to have lower item nonresponse rates, holding other variables constant.

Table 5. Bivariate and Multivariate OLS Regression Models¹ Predicting Item Nonresponse Rates by Survey Mode and Respondent Demographic Characteristics, by Experiment.

                              |              LCS                |              WCS²               |              WES²
                              | Model 1        | Model 2        | Model 3        | Model 4        | Model 5        | Model 6
Mode
  Mail-only                   | Reference      | Reference      | Reference      | Reference      | Reference      | Reference
  Web (of Web+Mail)           | -1.81*** (.476)| -1.19** (.444) | -1.27*** (.207)| -0.55** (.179) | -2.17*** (.328)| -1.49*** (.313)
  Mail (of Web+Mail)          | 1.52* (.741)   | 0.18 (.695)    | 2.12*** (.434) | 1.20** (.402)  | 2.97*** (.631) | 1.74** (.603)
Demographics
  Female                      | --             | -0.46 (.374)   | --             | 0.63** (.204)  | --             | -0.17 (.350)
  Age                         | --             | 0.13*** (.011) | --             | 0.10*** (.009) | --             | 0.12*** (.012)
  Education
    HS or less                | --             | Reference      | --             | Reference      | --             | Reference
    Some college, no deg.     | --             | -1.20* (.473)  | --             | -2.55*** (.463)| --             | -0.44 (.519)
    2-, 4-Yr., Grad/Prof deg. | --             | -1.83*** (.473)| --             | -2.76*** (.406)| --             | -1.57*** (.366)
  Income
    Less than $50K            | --             | Reference      | --             | Reference      | --             | Reference
    $50K to less than $100K   | --             | -0.54 (.436)   | --             | -0.87*** (.238)| --             | -0.69 (.412)
    $100K or more             | --             | -0.21 (.695)   | --             | -0.79** (.250) | --             | -1.3** (.411)
    Prefer not to say         | --             | 0.82 (.609)    | --             | -0.58 (.397)   | --             | -0.28 (.618)
R²                            | 0.02***        | 0.17***        | 0.05***        | 0.19***        | 0.05***        | 0.15***
N                             | 991            | 991            | 2143           | 2143           | 1901           | 1901

Notes: *p ≤ .05; **p ≤ .01; ***p ≤ .001; ¹Unstandardized coefficients reported (standard errors in parentheses); ²Weighted data.
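
As an illustration of the model pairs in Table 5, the sketch below fits the mode-only and mode-plus-demographics specifications and runs the global F-test comparing them. It is a sketch under assumed names: the data frame and variable coding are hypothetical stand-ins for the respondent-level data described in the Methods section, and the weighted experiments (WCS, WES) would use weighted least squares with the post-stratification weights rather than ordinary least squares.

    import statsmodels.formula.api as smf

    def fit_and_compare(df):
        """df: one row per respondent, with the item nonresponse rate
        ('inr_rate'), mode ('mail_only', 'web', or 'mail_followup'), and the
        demographic coding from the Methods section. Names are hypothetical."""
        base = smf.ols("inr_rate ~ C(mode, Treatment('mail_only'))", data=df).fit()
        full = smf.ols(
            "inr_rate ~ C(mode, Treatment('mail_only')) + female + age"
            " + C(education) + C(income)",
            data=df,
        ).fit()
        # Global F-test: does adding the demographics significantly improve fit?
        f_stat, p_value, df_diff = full.compare_f_test(base)
        return base, full, (f_stat, p_value, df_diff)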

In Table 6, instead of predicting item nonresponse using survey mode, we analyzed bivariate and multivariate OLS regression models using survey design, again adding the demographic variables in the even-numbered models. These analyses enabled us to evaluate overall differences in data quality between the Web+mail design and the mail-only design. Results indicate that survey design is a significant predictor of item nonresponse only in the LCS experiment. As shown in Models 1 and 2, Web+mail respondents have, on average, significantly lower item nonresponse rates than mail-only respondents. Similar to Table 5, the models with demographic variables provide a significant improvement over the models with only survey design. Also, both age and education tend to be significant predictors of item nonresponse, in the same directions discussed in the Table 5 analyses.

Table 6. Bivariate and Multivariate OLS Regression Models¹ Predicting Item Nonresponse Rates by Design and Respondent Demographic Characteristics, by Experiment.

                              |              LCS              |              WCS²              |              WES²
                              | Model 1      | Model 2        | Model 3      | Model 4        | Model 5      | Model 6
Design
  Web+Mail                    | -0.95* (.431)| -0.83* (.396)  | -0.09 (.240) | 0.07 (.217)    | -0.45 (.360) | -0.41 (.338)
  (Reference: Mail Only)
Demographics
  Female                      | --           | -0.48 (.374)   | --           | 0.71*** (.205) | --           | -0.18 (.355)
  Age                         | --           | 0.13*** (.011) | --           | 0.10*** (.009) | --           | 0.13*** (.012)
  Education
    HS or less                | --           | Reference      | --           | Reference      | --           | Reference
    Some college, no deg.     | --           | -1.19* (.474)  | --           | -2.68*** (.462)| --           | -0.54 (.528)
    2-, 4-Yr., Grad/Prof deg. | --           | -1.89*** (.472)| --           | -2.89*** (.408)| --           | -1.86*** (.370)
  Income
    Less than $50K            | --           | Reference      | --           | Reference      | --           | Reference
    $50K to less than $100K   | --           | -0.65 (.432)   | --           | -0.99*** (.239)| --           | -1.02* (.415)
    $100K or more             | --           | -0.24 (.695)   | --           | -0.94*** (.251)| --           | -1.73*** (.414)
    Prefer not to say         | --           | 0.81 (.610)    | --           | -0.61 (.399)   | --           | -0.24 (.633)
R²                            | 0.00*        | 0.17***        | 0.00         | 0.18***        | 0.00         | 0.13***
N                             | 991          | 991            | 2143         | 2143           | 1901         | 1901

Notes: *p ≤ .05; **p ≤ .01; ***p ≤ .001; ¹Unstandardized coefficients reported (standard errors in parentheses); ²Weighted data.

Question Analyses

Table 7 reports the summary statistics for the question variables utilized in the analyses below, by experiment. In terms of question format, the percentage of screened items was slightly higher in the LCS and WES experiments than in the WCS experiment. In contrast, the percentage of items that were part of a multi-item question was slightly higher in the WCS than in the LCS and WES experiments. Across all three experiments, ordinal items made up the highest percentage of items, compared to yes/no, other nominal, and open-ended items. In terms of question type, in the LCS and WCS experiments the highest percentage of items was attitudinal, compared to demographic, other factual (non-demographic), and behavioral items. In contrast, the WES experiment had the highest percentage of "other factual" items.

Table 7. Total Questions and Items by Question Format and Type, by Experiment.

Questions (Items)  | LCS     | WCS      | WES
Total              | 51 (92) | 41 (110) | 46 (96)
Question Format
  Screened         | 10 (18) | 6 (18)   | 6 (19)
  Multi-Item       | 7 (42)  | 7 (61)   | 8 (45)
  Ordinal          | 27 (56) | 18 (57)  | 28 (55)
  Y/N              | 7 (13)  | 7 (14)   | 13 (25)
  Other Nominal    | 7 (8)   | 15 (26)  | 8 (8)
  Open-End         | 10 (16) | 10 (14)  | 6 (8)
Question Type
  Demographic      | 8 (10)  | 9 (11)   | 10 (17)
  Other Factual    | 12 (16) | 8 (16)   | 13 (31)
  Attitudinal      | 21 (40) | 15 (45)  | 20 (24)
  Behavioral       | 10 (26) | 12 (38)  | 9 (24)

Item nonresponse rates by question format and type, mode, and experiment are shown in Table 8. Overall, this table demonstrates once again that item nonresponse rates tend to be highest for mail follow-up respondents, followed by mail-only respondents, and lowest for Web respondents. In terms of the question format categories, the highest item nonresponse rates tend to occur in the screened, multi-item, and open-ended categories. There do not appear to be consistent trends among the question type categories.


Table 8. Item Nonresponse Rates by Question Format and Question Type for Mail-only and Web+Mail Respondents, by Experiment.

                 |          LCS              |          WCS¹             |          WES¹
                 | Mail-only| Web | Mail | Total | Mail-only| Web | Mail | Total | Mail-only| Web | Mail | Total
Overall Rate     | 5.0  | 2.7  | 6.2  | 3.6  | 4.2  | 2.7  | 6.9  | 4.2  | 8.1  | 6.1  | 11.6 | 8.0
Question Format
  Screened       | 9.6  | 4.6  | 12.8 | 6.6  | 7.0  | 3.3  | 12.8 | 6.7  | 5.4  | 1.3  | 11.8 | 5.1
  Multi-Item     | 5.5  | 3.4  | 7.1  | 4.3  | 4.0  | 2.9  | 6.6  | 4.2  | 15.8 | 12.8 | 22.0 | 16.1
  Ordinal        | 2.8  | 0.6  | 4.0  | 1.4  | 2.3  | 0.7  | 4.9  | 2.2  | 10.2 | 8.6  | 14.3 | 10.6
  Other Nominal  | 2.8  | 0.3  | 3.4  | 1.1  | 3.2  | 0.8  | 5.5  | 2.5  | 2.3  | 1.0  | 4.9  | 2.4
  Y/N            | 7.9  | 8.5  | 10.0 | 8.8  | 8.2  | 9.4  | 11.4 | 10.1 | 4.5  | 1.2  | 8.3  | 3.9
  Open-End       | 11.4 | 8.8  | 13.6 | 9.9  | 12.2 | 12.2 | 16.6 | 13.8 | 6.4  | 3.2  | 10.7 | 5.8
Question Type
  Demographic    | 5.0  | 2.7  | 5.0  | 3.3  | 4.8  | 3.4  | 8.6  | 5.3  | 19.0 | 19.7 | 24.2 | 21.3
  Attitudinal    | 3.1  | 0.6  | 3.5  | 1.4  | 2.6  | 1.0  | 5.4  | 2.6  | 2.7  | 0.8  | 4.9  | 2.4
  Behavioral     | 6.9  | 5.0  | 11.9 | 6.4  | 4.9  | 3.7  | 7.3  | 4.9  | 2.3  | 0.6  | 4.6  | 1.9
  Other Factual  | 5.2  | 3.8  | 6.9  | 4.5  | 6.3  | 4.7  | 10.1 | 6.6  | 9.9  | 6.1  | 15.1 | 9.3

Notes: ¹Weighted data. Within each experiment, the Web, Mail, and Total columns refer to the Web+Mail design.

We conducted bivariate and multivariate OLS regression models predicting item nonresponse rates by survey mode and question characteristics for each experiment. These results are displayed in Table 9. In Models 1, 3, and 5, we included only survey mode as a predictor of item nonresponse rates. The Web variable was statistically significant for the LCS and WCS experiments, and the mail follow-up variable was significant for the WCS experiment. In Models 2, 4, and 6, we included survey mode and also controlled for question characteristics. Using global F-tests, we found all of these models to be significant improvements over the models with only survey mode.

In these models, survey mode was statistically significant (with one exception), even when controlling for question characteristics. On average, Web respondents tend to have significantly lower item nonresponse rates than mail-only respondents, holding question characteristics constant. Also, mail follow-up respondents tend to have significantly higher item nonresponse rates than mail-only respondents, controlling for other variables. In terms of question characteristics, the trends vary somewhat across the three experiments. However, overall, screened, multi-item, and other factual questions tend to be significant predictors of item nonresponse across all three experiments, in similar directions. Screened questions have higher item nonresponse rates than non-screened questions, even holding survey mode and other question characteristics constant. Similarly, multi-item questions have higher item nonresponse rates than single-item questions, controlling for other variables. Finally, other factual questions have lower item nonresponse rates than demographic questions, holding other variables constant.

Table 9. Multivariate OLS Regression Models¹ Predicting Item Nonresponse Rates by Question Type and Format, by Experiment.

                      |              LCS                  |              WCS                  |              WES
                      | Model 1        | Model 2          | Model 3        | Model 4          | Model 5       | Model 6
Mode
  Mail-only           | Reference      | Reference        | Reference      | Reference        | Reference     | Reference
  Web (of Web+Mail)   | -3.78** (1.356)| -3.78** (1.116)  | -2.67* (1.132) | -2.67** (.983)   | -3.52 (2.274) | -3.52* (1.583)
  Mail (of Web+Mail)  | 1.11 (1.356)   | 1.11 (1.116)     | 3.03** (1.132) | 3.03** (.983)    | 3.76 (2.274)  | 3.76* (1.583)
Question Format
  Screened            | --             | 10.17*** (1.409) | --             | 6.92*** (1.193)  | --            | 6.87*** (1.852)
  Multi-Item          | --             | 2.78* (1.134)    | --             | 2.55** (.987)    | --            | 15.26*** (1.655)
  Ordinal             | --             | -11.98*** (1.709)| --             | -13.14*** (1.757)| --            | 7.52* (3.010)
  Other Nominal       | --             | -8.60*** (2.041) | --             | -11.31*** (1.647)| --            | -4.37 (3.221)
  Y/N                 | --             | 1.66 (1.926)     | --             | -2.20 (1.738)    | --            | 0.33 (2.994)
  Open-Ended          | --             | Reference        | --             | Reference        | --            | Reference
Question Type
  Demographic         | --             | Reference        | --             | Reference        | --            | Reference
  Attitudinal         | --             | 2.31 (1.878)     | --             | 2.01 (1.743)     | --            | -19.35*** (2.477)
  Behavioral          | --             | 3.34 (1.907)     | --             | 1.03 (1.633)     | --            | -29.44*** (2.263)
  Other Factual       | --             | -11.72*** (2.292)| --             | -6.17*** (1.859) | --            | -11.43*** (2.078)
R²                    | 0.05***        | 0.38***          | 0.07***        | 0.32***          | 0.03**        | 0.55***
N                     | 276            | 276              | 330            | 330              | 288           | 288

Notes: *p ≤ .05; **p ≤ .01; ***p ≤ .001; ¹Unstandardized coefficients reported (standard errors in parentheses).

In Table 10, instead of predicting item nonresponse using survey mode, we analyzed bivariate and multivariate OLS regression models using survey design, again adding the question characteristic variables in the even-numbered models. Results indicate that survey design is not a significant predictor of item nonresponse in any of the three experiments. Similar to Table 9, the models with question characteristic variables provide a significant improvement over the models with only survey design. Also, screened, multi-item, and other factual questions remain significant predictors of item nonresponse, in the same directions discussed in the Table 9 analyses.

Table 10. Multivariate OLS Regression Models¹ Predicting Item Nonresponse Rates by Design, Question Format and Type, by Experiment.

                                     LCS                                     WCS                                     WES
                      Model 1            Model 2             Model 3            Model 4             Model 5            Model 6
Design
  Web+Mail            -1.33 (1.200)      -1.33 (.999)        0.18 (1.016)       0.18 (.894)         0.12 (2.001)       0.12 (1.420)
  (Reference: Mail Only)
Question Format
  Screened            --                 10.17*** (1.457)    --                 6.92*** (1.252)     --                 6.87*** (1.917)
  Multi-Item          --                 2.78* (1.172)       --                 2.55* (1.036)       --                 15.26*** (1.714)
  Ordinal             --                 -11.98*** (1.767)   --                 -13.14*** (1.844)   --                 7.52* (3.117)
  Other Nominal       --                 -8.60*** (2.110)    --                 -11.31*** (1.729)   --                 -5.37 (3.336)
  Y/N                 --                 1.66 (1.991)        --                 -2.20 (1.825)       --                 0.33 (3.100)
  Open-Ended          --                 Reference           --                 Reference           --                 Reference
Question Type
  Demographic         --                 Reference           --                 Reference           --                 Reference
  Attitudinal         --                 2.31 (1.942)        --                 2.01 (1.830)        --                 -19.35*** (2.565)
  Behavioral          --                 3.34 (1.971)        --                 1.03 (1.714)        --                 -29.44*** (2.344)
  Other Factual       --                 -11.72*** (2.369)   --                 -6.17** (1.951)     --                 -11.43*** (2.152)
R²                    0.00               0.33***             0.00               0.24***             0.00               0.51***
N                     276                276                 330                330                 288                288

Notes: *p ≤ .05; **p ≤ .01; ***p ≤ .001. ¹Unstandardized coefficients reported (standard errors in parentheses).
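The significant improvement of the even-numbered models over the design-only models can be verified with a nested-model F-test. Below is a minimal sketch under the same hypothetical data file and column-name assumptions as the sketch following Table 9 (web_plus_mail is assumed to be a 0/1 design indicator with mail-only as 0).

    # Nested-model comparison: does adding the question-characteristic
    # block significantly improve on the design-only model? All column
    # names remain illustrative assumptions.
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    items = pd.read_csv("question_level_inr.csv")  # hypothetical input file

    design_only = smf.ols("inr ~ web_plus_mail", data=items).fit()
    full = smf.ols(
        "inr ~ web_plus_mail + screened + multi_item"
        " + C(qformat, Treatment(reference='open_ended'))"
        " + C(qtype, Treatment(reference='demographic'))",
        data=items,
    ).fit()

    # anova_lm on two nested OLS fits reports the incremental F statistic
    # and its p-value for the added block of predictors.
    print(anova_lm(design_only, full))
    print(f"R2: {design_only.rsquared:.2f} -> {full.rsquared:.2f}")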


V. Conclusions

Past research shows that mail-only designs tend to obtain higher unit response rates than Web designs and to yield samples that are more representative of the general population. However, there appears to be a

tradeoff in using Web and mail modes. Web designs tend to obtain higher quality data, that is, lower item nonresponse rates, from those who respond, while mail designs tend to elicit more respondents but obtain lower data quality from those respondents. Recent studies

on mixed-mode methods have shown that researchers can use a “Web-plus-mail” design to

persuade the majority of participating households to respond via the Web in general public

household surveys, while achieving as demographically representative a sample as the mail-only

design (Messer & Dillman, 2011).

Results from two similarly designed and implemented statewide general public household surveys in the northwestern region of the U.S. (WCS and WES) show that the Web-plus-mail design does not produce significantly different item nonresponse rates from the mail-only design, even when controlling for respondents' demographic characteristics or survey question characteristics. In contrast, results from a regional general public household survey in the same region (LCS) show that the Web-plus-mail design produces significantly lower item nonresponse rates than the mail-only design, even when controlling for respondents' demographic characteristics. However, this difference is no longer statistically significant when survey question characteristics are held constant.

Although we analyzed demographic characteristics and question characteristics

separately, we found that both are significant sources of item nonresponse variation. For

example, across all surveys, age was a significant predictor of item nonresponse: older respondents had higher item nonresponse rates, holding either mode (mail-only, Web,


mail follow-up) or design (mail-only, Web-plus-mail) constant. Education was also found to be a

significant predictor of item nonresponse. Compared to respondents with a high school degree

or less, respondents with some college but no degree and respondents with some form of higher

education degree had lower item nonresponse rates, again controlling for either mode or design.

In terms of question characteristics, we found that in at least two of the three

experiments, question formats requiring more cognitive effort to answer – such as open-ended,

screened, or multi-item questions – obtained higher item nonresponse rates than single-item,

close-ended, or non-screened questions, even after controlling for either mode or design and

question type. Compared to demographic questions, other factual questions obtained significantly lower item nonresponse rates, holding other variables constant.

There are several important limitations of this study. First, alternative results may be

obtained with different populations, in different locations, and/or with different methods. Based

on these potential areas for variation, these results may be limited in their applicability to other

Web and mail survey contexts. Second, our measure of data quality, item nonresponse, is itself limited: it does not capture whether an answer was incorrect or invalid, whether it was not applicable (e.g., "Don't know"), or, on open-ended questions, differences in answer length. These aspects of data quality are important and may differ between designs and modes. Third, our combining of treatment groups

in each experiment might mask some important differences based on whether respondents

received an incentive, a Web instruction card, or a Priority Mail envelope.
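To make the second limitation concrete, the toy sketch below (with invented answers) shows how a single question's item nonresponse rate shifts depending on whether "Don't know" responses are counted as valid answers or as item nonresponse, a distinction our measure does not capture.

    # Toy illustration of how the treatment of "Don't know" changes a
    # question's item nonresponse rate. The answer values are invented.
    import pandas as pd

    answers = pd.Series(["agree", None, "Don't know", "disagree", None, "agree"])

    eligible = len(answers)
    blanks_only = answers.isna().sum() / eligible * 100
    blanks_plus_dk = (answers.isna() | answers.eq("Don't know")).sum() / eligible * 100

    print(f"INR counting blanks only:       {blanks_only:.1f}%")    # 33.3%
    print(f"INR also counting 'Don't know': {blanks_plus_dk:.1f}%")  # 50.0%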

In sum, Web and mail modes of data collection are likely to become even more prevalent

in the future. Thus, it is important for survey researchers to more extensively consider potential

differences in data quality based on mode and continue to identify the sources of these

differences.


References

Alkaya, Aylin & Alptekin Esin. 2005. “Item nonresponse reasons and effects.” G.U. Journal of

Science 18(4):577-89.

Boyer, Kenneth K., John R. Olson, Roger J. Calantone, & Eric C. Jackson. 2002. “Print versus

electronic surveys: a comparison of two data collection methodologies.” Journal of

Operations Management 20:357-373.

Brečko, Barbara Neza & Ralph Carstens. 2007. “Online data collection in SITES 2006: Paper

survey versus Web survey – Do they provide comparable results?” Proceedings of the

IEA International Research Conference (IRC 2006). Washington, D.C.: 261-269.

de Leeuw, Edith D. 1992. Data quality in mail, telephone, and face-to-face surveys. Amsterdam:

TT-Publicaties.

de Leeuw, Edith D., Joop Hox, & Mark Huisman. 2003. “Prevention and treatment of item

nonresponse.” Journal of Official Statistics 19(2):153-76.

Denscombe, Martin. 2009. “Item nonresponse rates: a comparison of online and paper

questionnaires.” International Journal of Social Research Methodology 12(4):281-291.

Dillman, Don A. 2000. Mail and Internet Surveys: The Tailored Design Method. New York,

NY: John Wiley & Sons, Inc.

Dillman, Don A., John L. Eltinge, Robert M. Groves, & Roderick J.A. Little. 2002. “Survey

nonresponse in design, data collection, and analysis,” in Survey Nonresponse, Groves,

R.M., D.A. Dillman, J.L. Eltinge, & R.J.A. Little (eds.), New York, NY: John Wiley & Sons, Inc. Pp.

3-26.

Dillman, Don A., Jolene D. Smyth, & Leah Melani Christian. 2009. Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: John Wiley & Sons, Inc.

Dixon, John & Clyde Tucker. 2010. “Survey nonresponse,” in Handbook of Survey Research,

Second Edition, Marsden, P.V. & J.D. Wright (eds.), Bingley, UK: Emerald Group

Publishing Ltd. Pp. 593-630.

Ferber, Robert. 1966. “Item nonresponse in a consumer survey.” Public Opinion Quarterly

30(3):399-415.


Kaplowitz, Michael D., Timothy D. Hadlock, & Ralph Levine. 2004. “A comparison of Web

and mail survey response rates.” Public Opinion Quarterly 68(1):94-101.

Kiesler, Sara & Lee S. Sproull. 1986. “Response effects in the electronic survey.” Public

Opinion Quarterly 50(3):402-413.

Kwak, Nojin & Barry Radler. 2002. “A comparison between mail and Web surveys: Response

pattern, respondent profile, and data quality.” Journal of Official Statistics 18(2):257-

273.

Lee, Eun Sul & Ron N. Forthofer. 2006. Analyzing complex survey data (2nd ed.). Thousand Oaks, CA: Sage Publications, Inc.

Manfreda, Katja Lozar & Vasja Vehovar. 2002. “Do Web and mail surveys provide the same

results?” Development in Social Science Methodology 18:149-169.

Messer, Benjamin L. & Don A. Dillman. 2011. “Surveying the general public over the Internet

using address-based sampling and mail contact procedures.” Public Opinion Quarterly,

75(3):429-57.

Messer, Benjamin L. & Don A. Dillman. 2010. “Using address-based sampling to survey the

general public by mail vs. ‘Web plus mail.’” Technical Report 10-13, Pullman, WA:

Social and Economic Science Research Center.

http://www.sesrc.wsu.edu/dillman/papersWeb/2010.html

Messmer, Donald J. & Daniel T. Seymour. 1982. “The effects of branching on item

nonresponse.” Public Opinion Quarterly 46(2):270-277.

Schaefer, David R. & Don A. Dillman. 1998. “Development of a standard e-mail methodology:

Results of an experiment.” Public Opinion Quarterly 62(3):378-397.

Shoemaker, Pamela J., Martin Eichholz, & Elizabeth A. Skewes. 2002. “Item nonresponse:

Distinguishing between don’t know and refuse.” International Journal of Public Opinion

Research 14(2):193-201.

Smyth, Jolene D., Don A. Dillman, Leah Melani Christian, & Allison C. O’Neill. 2010. “Using

the Internet to survey small towns and communities: Limitations and possibilities in the

early 21st century.” American Behavioral Scientist 53(9):1423-48.


Tourangeau, Roger & Norman M. Bradburn. 2010. “The psychology of survey response,” in Handbook of Survey Research (2nd ed.), Marsden, P.V. & J.D. Wright (eds.), Bingley, UK: Emerald Group Publishing Ltd. Pp. 315-346.

Tourangeau, Roger, Lance J. Rips, & Kenneth Rasinski. 2000. The psychology of survey

response. Cambridge, UK: Cambridge University Press.

Wolfe, Edward W., Patrick D. Converse, Osaro Airen, & Nancy Bodenhorn. 2009. “Unit and

item nonresponses and ancillary information in Web- and paper-based questionnaires

administered to school counselors.” Measurement and Evaluation in Counseling and

Development 21(2):92-103.