
PERSONNEL PSYCHOLOGY 2003, 56

FROM PAPER TO PIXELS: MOVING PERSONNEL SURVEYS TO THE WEB

LORI FOSTER THOMPSON East Carolina University and Army Research Institute

ERIC A. SURFACE, DON L. MARTIN North Carolina State University and Army Research Institute

MICHAEL G. SANDERS Army Research Institute

Practitioners are not adequately prepared to handle concerns related to the acceptability of the online survey medium from the worker’s viewpoint because the literature has only begun to address this issue. This study assessed reactions to Web-based questionnaires while moving an organization’s climate survey online. Initial questions, posed via a paper-and-pencil instrument, gathered opinions concerning online surveys (n = 437). A Web-based climate survey was then created and piloted (n = 98). Afterwards, the finalized instrument was administered (n = 403), and a follow-up questionnaire was disseminated (n = 175) to further gauge workers’ reactions. Despite some initial anonymity concerns, most personnel were amenable to online surveying, and the Web-based medium did not appear to discourage participation from any subgroup (based on gender, race, military versus civilian classification, and workgroup size comparisons). This article, which is intended for practitioners considering the transition to Web-based surveys as well as those interested in evaluating and improving current Web-based survey processes, outlines issues regarding online survey implementation, offers a tool for evaluating survey software, and concludes with lessons learned and avenues for future research/practice.

We thank Gary Barrett for his assistance with this research. We are also grateful for the helpful suggestions offered by Michael Rumsey, Jim Smither, and two anonymous reviewers.

The first, second, and third authors are affiliated with Army Research Institute through the Consortium Research Fellows Program. This investigation was funded in part by Army Research Institute. Throughout the article, the terms “study” and “research” are used interchangeably. The usage of the word “study” is not meant to imply that the investigation drew upon any particular source of funding within or outside of the organization. Furthermore, the views expressed in this article are those of the authors and do not necessarily reflect the positions or policies of the research sponsors.

Portions of this manuscript were presented at the 17th Annual Conference of the Society for Industrial and Organizational Psychology, Toronto, Ontario, April, 2002.

Correspondence and requests for reprints should be addressed to Lori Foster Thompson, Department of Psychology, 104 Rawl Building, East Carolina University, Greenville, NC 27858-4353; [email protected].

COPYRIGHT © 2003 PERSONNEL PSYCHOLOGY, INC.



As an information gathering tool, employee surveys play crucial roles in organizations worldwide. Individuals, departments, and institutions as a whole use data from personnel surveys to adapt and develop themselves in ways that help them meet the needs of subordinates, business associates, and other key stakeholders. Clearly, the workforce’s willingness to participate in these surveys can significantly impact the larger organizational system. In fact, the success of any employee questionnaire depends upon the cooperation of personnel who volunteer the time and effort required to respond to the survey items. These voluntary efforts historically involved the completion of paper-and-pencil questionnaires. Recent advances in technology have made electronic (e.g., e-mail and Web-based) surveys possible; consequently, a number of organizations are now moving their personnel attitude surveys online.

At present, there is a dearth of published research assessing workers’ reactions to Web-based survey methods. This study addresses that gap in the literature. Specifically, this study accomplishes three primary objectives: (a) identify workers’ concerns regarding electronic survey techniques prior to the implementation of a Web-based survey; (b) research, design, develop, and disseminate a Web-based survey; and (c) examine workers’ acceptance of the Web-based survey, once administered. The secondary objective of this article is to elaborate on some of the issues that need to be considered when implementing online surveys (e.g., criteria against which to judge electronic surveying software) to guide practitioners who are thinking about shifting from paper to Web-based questionnaires as well as those interested in evaluating and improving current online survey processes.

Research dealing with the differences between paper- and Web-based personnel surveys is in its infancy (Kraut, 2001). The work that has been done tends to compare the two survey media from the survey administrator’s point of view. Online questionnaires can benefit practitioners and organizations in numerous ways. Key advantages concern speed and cost efficiency. Once startup costs are absorbed, online surveys can save money by reducing the paper, ink, mailing, and environmental costs associated with their paper-and-pencil counterparts (Sheehan & McMillan, 1999). Internet technology can decrease manpower costs by eliminating steps such as photocopying surveys, mailing packets, typing, scanning, cleaning, and coding data. As noted by McFarland, Ryan, and Paul (1998), automatic data entry increases accuracy because coding errors are less likely. Furthermore, the survey is delivered to employees faster, responses are received more quickly, and the data analysis/feedback steps are automatic or accelerated (Dommeyer & Moriarty, 2000; Sheehan, 2001; Sheehan & McMillan, 1999; Yost & Homer, 1998; Yun & Trumbo, 2000), leading to more timely use of employee input.


These advantages have led some to suggest that online surveys are ultimately more cost efficient than paper alternatives (Couper, 2000; Dommeyer & Moriarty, 2000; Kraut & Saari, 1999; Schmidt, 1997; Sheehan, 2001).

In many situations, the preceding benefits largely encourage the shift toward electronic surveys. Nevertheless, this transition has faced some opposition (Couper, 2000). Various practitioners have encountered a “don’t mess with success” form of resistance (Kraut, 2001; Spera, 2000). Thus, paper-and-pencil surveys persist in both computer-savvy organizations and organizations with limited computer literacy and/or access among employees. The pros and cons of doing so are worth noting. A fundamental disadvantage of this decision involves the forfeiture of the efficiencies afforded by online survey technologies (e.g., quicker data entry). Then again, paper surveys currently generate fewer upfront costs and uncertainties. As Spera (2000) has pointed out, management is risk averse, and new technology is risky. This safety-efficiency tradeoff is often at the heart of decisions driving the survey medium of choice.

Notably, practitioners working on Web-based survey projects can take steps to both reduce the risks that accompany this undertaking and alleviate undue resistance among managers and other key decision makers. The literature emphasizes the importance of candidly addressing concerns upfront and establishing risk reduction methods (Kraut, 2001; Spera, 2000). There are at least six common concerns that practitioners should consider upfront each time they choose to implement an online survey. First, there is the apprehension that some or all of the population may not have convenient access to the equipment needed to fill out the survey. In anticipation of this concern, it is important to assess the proportion of the workforce with computer access and the cost of providing such access where necessary. Research has begun to demonstrate the psychometric equivalence of various survey media options (Donovan, Drasgow, & Probst, 2000; King & Miles, 1995; Magnan, Lundby, & Fenlason, 2000; Saphore, 2000; Spera & Moye, 2001; Stanton, 1998; Young, Daum, Robie, & Macey, 2000). If measurement invariance is established, it may be reasonable to administer Web surveys to some employees and use other methods (e.g., paper, kiosks) for subpopulations that lack broad Web access (Spera & Moye, 2001).

Second, some electronic surveys do not ensure “one employee, one survey,” thereby raising questions about ballot stuffing, which occurs when a survey is completed multiple times by the same person. In response to this suspicion, practitioners may wish to note that employees could find a way to complete paper surveys multiple times if they so desired (Spera, 2000). Spera (2000) has argued that the probability of people taking the time to submit multiple responses to an electronic survey is quite low.


Indeed, Church (2001) examined the possibility of identical survey submissions after administering an online questionnaire without access controls and was able to verify that none of his participants submitted the same survey responses multiple times. The implementation of survey passcodes (unique access control numbers) can altogether prevent ballot stuffing while barring unauthorized persons from the survey. On the downside, it has been speculated that the use of passcodes may negatively affect respondents’ perceptions of anonymity and hence their candor (Stanton, 1998); however, this belief remains untested.

Third, start-up costs in terms of labor and money are considerable (Donovan, 2000). Costs include things such as purchasing software and hardware, acquiring adequate computer network space, and time investments required when learning how to program the survey (Yun & Trumbo, 2000). Even though there is no way to get around these start-up expenses, it is important to recognize them as investments that will pay off on many future projects, rather than seeing them as absolute expenses tied to a particular survey (Donovan, 2000).

A fourth area of concern involves the fear that segments of the workforce will be unable to access the survey due to technical difficulties (e.g., unreliable connections and bandwidth limitations). Fifth, the survey administrator has less control over the look and feel of a Web-based questionnaire, compared to its paper counterpart, because the nuances of a Web-based survey are determined by factors (e.g., browser settings, processors, monitors, user preferences, and hardware platforms) that may change from respondent to respondent (Couper, 2000; Dillman, 2000). Keeping the online survey design as simple as possible helps to reduce both technical difficulties and the variability of the survey’s appearance. Furthermore, it is essential to pilot test the survey to ensure that it functions and appears properly across users/machines before it is sent to the overall workforce (Donovan, 2000; Magnan et al., 2000; Zatz, 2000).

The sixth (and perhaps the most compelling) concern that accompanies online surveys involves uncertainties regarding the degree to which personnel will accept the Web-based survey methodology. If an online format discourages large numbers of people from completing a questionnaire, then response rates will dwindle. Furthermore, if distinct segments of the workforce are not comfortable filling out online questionnaires, results may be skewed because those who respond may differ from nonrespondents in meaningful ways. It is currently difficult to respond to this sixth and final subject of concern due to the lack of published research to inform practice in this area. Clearly, the examination of workers’ reactions to electronic surveys is paramount.


Personnel Attitudes Toward Web-Based Surveys

For starters, we need to begin addressing questions such as: “Given the choice, which method of response do employees prefer?” (Church, 2001, p. 938). From one perspective, employees might prefer the convenience of a Web-based survey. Volunteer respondents may appreciate the speed with which survey feedback is relayed, particularly when administrators take advantage of the survey software’s automatic data analysis and reporting capabilities. Environmentally friendly questionnaires can be sent via the click of a button, and data can go directly to the survey administrators, thereby reducing the likelihood that supervisors or unauthorized others will see individual survey response sheets. Many people type faster than they write, making it easier to answer open-ended questions. Web-based surveys also eliminate the possibility that a respondent can be identified based on his or her handwriting style. Indeed, McFarland et al. (1998) have predicted that electronic surveys will increase respondents’ feelings of anonymity.

In contrast, most authors have argued that respondents who are asked to complete electronic surveys may be particularly inclined to question reassurances of anonymity, which could be breached by people who illegally tap into electronic survey data files and by “big brother” technologies that trace online survey responses back to individual computers or e-mail accounts (Couper, 2000; Donovan, 2000; Kraut & Saari, 1999; Magnan et al., 2000; Rosenfeld, Booth-Kewley, Edwards, & Thomas, 1996; Yost & Homer, 1998). Moreover, a workforce with weak Internet self-efficacy beliefs is likely to resist online opportunities and experience stress-inducing problem situations when they do use the Internet (Eastin & LaRose, 2000). Another potential problem involves assurances that data are actually transported. In many cases, respondents physically hand paper surveys to couriers or designated others. Respondents to an electronic survey, however, cannot see their data being sent to the survey administrator. Even when survey transmission succeeds, people who do not trust technology may feel uncertain about whether their responses were actually received. Finally, many practitioners have noted that employees who spend most of their time traveling or working from home demand flexibility in where they complete their surveys. If accessing an online questionnaire away from the office is more difficult than packing a paper survey in a briefcase, then on-the-go personnel may react negatively.

Unfortunately, practitioners lack data to refute or support these potential reactions to electronic questionnaires. Indeed, no published research has directly investigated employees’ perceptions of online surveying, which is not a trivial issue.


It is encouraging to note that some research has indirectly addressed employee acceptance by comparing paper versus Web-based survey response rates. Results are somewhat mixed. Several of these studies compared response rates across the two media after asking people to complete either a paper questionnaire or an online survey. A few found lower response rates for the online survey (Paolo, Bonaminio, Gibson, Partridge, & Kallail, 2000; Schuldt & Totten, 1994; Weible & Wallace, 1998), some revealed higher response rates for the electronic survey (Oppermann, 1995; Parker, 1992), and in others’ experiences participation rates have been similar across the two survey media (Kraut, 2001). Moving beyond simple comparisons of paper versus online survey response rates, additional research has been designed to assess true media preferences by giving people the choice of whether they wanted to complete their surveys online or on paper. Church (2001) found that, when given a choice, most (58%) of the respondents opted for the online rather than the paper format. Younger employees were especially likely to choose the online survey option. A follow-up study in the same agency yielded similar results. Conversely, most (approximately two-thirds) of the respondents in an unpublished study by Yost and Homer (1998) chose the paper option when they were given a choice between paper and Web-based surveys.

Interestingly, at the end of both the paper and Web versions of their survey, Yost and Homer (1998) asked an open-ended question to gauge how employees would feel about taking future surveys via the Web. Content analyses of the responses revealed support tempered by concerns regarding: anonymity, lack of access to the Web, and lack of training on how to complete online surveys. An unpublished study by McFarland et al. (1998) directly examined attitudes toward online surveys by asking employees to complete a paper questionnaire assessing feelings about computerized surveys. Rating data suggested that attitudes toward online surveys were quite favorable. McFarland et al. (1998) also found that the online medium increased perceptions of anonymity/confidentiality and noted that it was not clear why this result occurred.

Research Scope and Overview

In sum, practitioners are not adequately prepared to handle concerns related to the acceptability of the online survey medium from the worker’s viewpoint because the literature has only begun to address this issue. The administrative advantages of electronic surveys notwithstanding, it is presently unclear how personnel perceive and react to the prospect of completing their surveys online. Thus, the current study was designed to systematically assess apprehensions, perceptions, and reactions related to the use of Web-based personnel surveys.


This study describes a multimethod investigation designed to measure personnel attitudes and behaviors while moving a large military group’s organizational climate survey from paper to the Web. This initiative involved four unique phases. First, initial questions were provided near the end of a 1998 paper-and-pencil version of the organizational climate survey (n = 437) that asked people to rate their support for future online surveys and express any concerns regarding Web-based surveys in an open-ended response format. Second, the climate survey was moved online and pilot tested in February of 2001 (n = 98). Participants were asked to rate their reactions to various aspects of the survey. Third, the Web survey was administered in full in July of 2001 (n = 403). The data (perceptions concerning the survey and response rates) were subsequently analyzed. Fourth and finally, a brief follow-up questionnaire was administered in September of 2001 (n = 175) to provide an initial indication of the reasons why nonrespondents did not participate in the July 2001 Web-based survey.

Study One

The goal of the first phase of this research was to investigate how the workforce felt about shifting their familiar “command climate survey” from paper to the Web. This instrument was a census survey designed for recurrent administration to everyone working at the headquarters of a military organization. It is important to emphasize that the climate survey was administered to the headquarters staff only. These personnel were responsible for providing numerous types of support (e.g., logistics) to operational units distributed across multiple locations and deployed around the world. (The survey was not used for operational units, i.e., soldiers who deploy.) The participants in this study were essentially all office workers functioning in administrative, professional, and managerial capacities. Almost all of them used computers for their jobs on a daily basis, and all were presumed to be quite comfortable with computers. Approximately half of the workforce consisted of civilian employees, and nearly two-thirds of these civilians, who primarily worked under the General Schedule (GS) classification system, operated at a grade level of 11 or higher.¹ More than half of the military personnel were commissioned officers.

Most of the survey items used a rating scale ranging from 1 = strongly disagree to 5 = strongly agree. The survey measured perceptions of several areas: general satisfaction, immediate supervision, training/development, and so forth.

¹Grade is the numerical designation, GS-1 through GS-15, which identifies the range of difficulty and responsibility as well as the level of qualification requirements for civilian positions included in the General Schedule (GS) system (Workforce Compensation and Performance Service, 1991).


Due to the amount of time and labor involved in administering and analyzing the paper-and-pencil version of the climate questionnaire, the survey research staff proposed a Web-based alternative. To assess attitudes toward this change, personnel were questioned before the implementation of the Web-based version of the survey.

Method

In order to assess concerns about Web-based questionnaires, we added two items to the climate survey prior to distributing a paper-and-pencil version of the instrument. The first item was worded as follows: “I support the use of LAN or online surveying technology for the next command climate survey.” Respondents were asked to use a 5-point scale to rate their agreement with this statement. In the survey booklet, this item was preceded by an explanation of electronic surveying² and followed by the second item, an open-ended question asking for comments related to the use of electronic surveys. In October of 1998, the paper-and-pencil climate survey was distributed to the entire workforce. As usual, responses were anonymous, and participants were not identifiable. Data collection was terminated after 14 days.

Results and Discussion

The 83-item³ paper survey was completed and returned by 437 people (a 54% response rate). The sample consisted of 198 (45%) civilians, 186 (43%) military personnel, and 53 (12%) people who did not reveal their classification. Overall, 122 (28%) respondents were women, 256 (59%) were men, and 59 (14%) did not indicate their gender. The ethnic makeup of the sample was: 73 (17%) African American; 8 (2%) American Indian; 14 (3%) Asian; 267 (61%) Caucasian; 11 (3%) Hispanic; and 64 (15%) undeclared.⁴ All directorates (which are functional groups, e.g., personnel management) were represented. In general, the demographics of those who completed the survey were quite similar to the overall workforce demographics, suggesting that the sample of respondents was representative of the broader population.

²The 1998 paper climate survey included the following explanation before gathering opinions concerning online surveys: “We are considering using an online or LAN-based survey for the next command climate survey. . . a LAN-based survey [can be made] available for our use, and several companies offer Web-based surveying using the same security technology used for banking transactions. We would like to know what you think about using the online or LAN surveying technology.”

³The command climate survey asked several open-ended questions; the survey item totals reported here do not factor in these open-ended questions.

⁴The demographic group percentages reported in text do not always total to 100% due to rounding error.



Many respondents were supportive of the transition to Web-based surveys, as suggested by reactions to the questionnaire item that asked participants to use a 5-point scale to rate their support of Web-based technology for the next command climate survey (M = 3.43, SD = 1.33). Each participant was placed into one of three categories, based on his or her response to this item. Those who selected one of the top two scale options were considered supportive, those who chose the midpoint of the rating scale were considered neutral, and those who selected one of the bottom two scale options were assigned to the does not support category. Table 1 reports the degree to which members of various demographic groups expressed positive, neutral, and negative reactions to online surveying. As can be seen, more than 75% of the sample was either supportive or indifferent. Chi-square analyses were conducted to determine whether demographic groups varied in their support of implementing electronic surveying. As shown in Table 1, military and civilian personnel were the only groups that differed significantly, with military respondents expressing greater support than civilians. It should be noted that military and civilian respondents differed on most survey items and dimensions; with few exceptions, military personnel were more satisfied than civilians. No other meaningful demographic data were gathered from respondents due to the organization's concern that too many demographic items would affect response rates and candor. We were therefore unable to test relationships between online survey support and additional demographic variables, such as age.
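
As an illustration of the analysis just described, the following minimal sketch reproduces the classification comparison using the civilian and military counts reported in Table 1. It assumes Python with SciPy is available; it is offered as an example of this kind of chi-square test, not as the authors' original code.

```python
# Chi-square test of independence: support for electronic surveying (support,
# neutral, do not support) by classification, using the counts from Table 1.
# Illustrative sketch only; assumes SciPy is installed.
from scipy.stats import chi2_contingency

observed = [
    [85, 37, 51],    # civilian: support, neutral, do not support
    [114, 31, 24],   # military: support, neutral, do not support
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")  # reproduces chi2(2) = 14.43, p = .001
```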

Next, we content analyzed the qualitative data resulting from the open-ended question related to electronic surveying. Eighty-one respondents (19% of the paper survey sample) provided comments, which were first coded into categories related to their level of support for electronic surveying: support, conditional support, do not support, and other comments. This analysis indicated that 28 (35%) of the 81 participants who provided comments supported the use of electronic surveying techniques, 9 (11%) supported the use of electronic surveying under certain conditions, 21 (26%) did not support electronic surveying, and the remaining 23 (28%) provided comments that either made no direct mention of level of support or were indifferent.

Some comments did not contain specific content other than expressing a level of support. Content-laden comments were coded into one of four groupings: areas of concern, benefits, process issues, and more information needed (i.e., respondents felt they needed more information before they could determine their level of support). Of the 73 comments appropriate for this analysis, 59 (81%) addressed areas of concern with electronic surveying.


TABLE 1
Support for Electronic Surveys, as Expressed on Paper, by Demographic Groupings

                                      Level of supportᵃ
                               Support      Neutral     Do not support
Grouping                        n (%)        n (%)          n (%)          n      χ²     df     p

Gender                                                                            0.87    2   .646
  Male                        143 (61%)    46 (19%)       47 (20%)        236
  Female                       56 (55%)    21 (21%)       24 (24%)        101
Classificationᵇ                                                                  14.43    2   .001
  Civilian                     85 (49%)    37 (21%)       51 (29%)        173
  Military                    114 (67%)    31 (18%)       24 (14%)        169
Ethnic group                                                                      5.37    8   .718
  African American             44 (69%)    11 (17%)        9 (14%)         64
  American Indian               4 (57%)     1 (14%)        2 (29%)          7
  Asian                         7 (54%)     3 (23%)        3 (23%)         13
  Caucasian                   137 (57%)    47 (20%)       55 (23%)        239
  Hispanic                      8 (80%)     1 (10%)        1 (10%)         10
Subset of ethnic groupsᶜ                                                          3.18    2   .204
  African American             44 (69%)    11 (17%)        9 (14%)         64
  Caucasian                   137 (57%)    47 (20%)       55 (23%)        239
Overall sample totals         213 (57%)    71 (19%)       92 (24%)        376

Notes: Percentages provided in parentheses are rounded to the nearest whole number and based on the group total that appears in the same row. Percentages in the same row may not total to 100% due to rounding error. Respondents who failed to report the relevant demographic information were excluded from these analyses.

ᵃ The original 5-point scale was collapsed into three categories for ease of presentation. Each of the group comparisons presented above was also run on the full 5-point scale. In all cases, the two analysis types led to the same conclusions regarding group similarities/differences.

ᵇ Civilian and military respondents differed significantly.

ᶜ The valid use of chi-square requires that expected frequencies not be too small (Witte & Witte, 2001). We therefore reran the analysis testing ethnic group differences on the largest two ethnic subgroups.

The top issue related to anonymity/confidentiality (n = 44). Other concerns involved: data quality (n = 7), lack of access to the LAN/Internet (n = 3), data security (n = 2), computer literacy/need for training (n = 2), and the inability to complete an electronic survey at home (n = 1). Next, 8 (11%) of the content-laden comments described benefits, including more effective use of resources (n = 4), process improvement (n = 3), and environmental friendliness (n = 1). The remaining 6 (8%) of the 73 content-laden comments involved process issues (n = 3) and a need for more information (n = 3).

In sum, the questionnaire items provided useful information concerning the workforce’s attitudes toward Web-based surveys. Many of those who supported the transition mentioned key benefits associated with the technology; yet, we were arguably more interested in the areas of concern, especially those that can be eased by practitioners. Anonymity/confidentiality was the primary concern. We therefore continued our efforts with the intention of alleviating and further gauging this apprehension prior to a full implementation of the Web-based version of the survey.


TABLE 2
Features and Capabilities to Consider During Survey Software Evaluation

Survey design
1. High quality survey item library.
2. Item pool storage and reuse capabilities (to create/maintain pools of items from which to build future surveys).
3. Support for transforming item stems into Web-page format.
4. Ability to program questions that respondents must answer.
5. Ability to program branch questions/skip patterns based on respondents’ answers.
6. Ample items allowed for each survey.
7. Ability to create N x N item tables.
8. High quality response scale library.
9. Ability to program a wide range of response types (Likert, sliding scale, etc.).
10. Ability to include multiple types of response scales in the same table of items.
11. Ability to program help screens unique to each survey item.
12. Ability to customize background.
13. Ability to develop surveys that can be suspended midstream (mechanisms for storing data generated by respondents who pause and return to the survey at a later point in time).
14. Intuitive graphical user-interface for survey programmer.
15. Availability of menu-driven options to ease programming.
16. Easy drop-and-drag or pixel-level programming options.
17. Availability of a support/help component (super-imposed support system to ease programming and survey administration).
18. Ability to create surveys suitable for a range of delivery methods (e.g., HTML-Internet and intranets, opscan, kiosk, e-mail, diskette).

Survey implementation
19. Ability to enter lists of participants’ e-mail addresses.
20. Automatic sampling methods (cluster, stratified, random, etc.).
21. Support for announcing survey via a range of methods (e-mail, adding a link to a personal Web site, etc.).
22. A secure password-controlled response management environment.
23. Ability for survey administrator to control who can respond to survey based on the restriction(s) of choice (e.g., IP address, embedded ID).
24. Tracking features (for survey administrators to track responses back to specific people, events, or activities).
25. Ability to deploy multiple surveys concurrently.
26. Ability to host survey on own server.
Note: Items 27-31 are relevant for organizations that need to utilize a software vendor’s survey hosting capacity.
27. Provision (by software company) of ample disk space for each survey.
28. Ample responses allowed for each survey.
29. Provision (by software company) of reliable server space.
30. Provision (by software company) of secure server space.
31. Provision (by software company) of ample bandwidth.
32. Extensibility (i.e., the ability to extend survey software’s capabilities by adding custom programming scripts).

Survey analysis and reporting
33. Ability for survey administrator to retain control and ownership of raw data and graphs.
34. Ability to download data at will.
35. Ability to automate the data download process.
36. Automatic data summary features (e.g., “results so far”) for administrator’s use.
37. Automatic data summary features (e.g., ability to post updated survey results for respondents to view immediately upon survey submission).
38. Ability to share read-only survey files with colleagues.
39. Qualitative data coding support for open-ended responses.
40. Ability to perform basic statistical analyses (means, standard deviations, frequencies) within survey software program.
41. Ability to perform sophisticated statistical analyses within survey software program. For example:
    (a) Ability to cross-tabulate data to compare responses across questions.
    (b) Ability to filter data to analyze results based on specific subsets of respondents.
    (c) Ability to analyze data based on response day, month, or year.
42. Ability to export raw data to other programs.
43. Intuitive graphical user-interface for report creation.
44. Availability of menu-driven options to ease report creation.
45. Ability to export graphs and data to popular word processing, spreadsheet, or presentation programs.
46. Ability to report survey results via a range of options (e.g., pie graphs, bar graphs).
47. Ability to customize the report layout and presentation (the look and feel, not the statistics) in the surveying software.
48. Availability of a support/help component (super-imposed support system to ease report creation).

Other noteworthy concerns
• What are the software costs?
• List any ancillary resource requirements (e.g., Does the software require the presence of additional things, such as a resident common gateway interface script or Microsoft FrontPage extensions on the organizational server?).
• What kind of technical support does the software company offer?
• What kind of upgrade coverage is available?
• What kinds of demands (if any) will the software place on organizational server resources (e.g., how much server space and bandwidth will be used?)?
• In general, how flexible/modifiable/customizable is the survey software? Will survey designers be forced to use features they do not want?
• How easy/difficult will it be for respondents and decision makers to access the survey with the software?
• Other key advantages not covered previously?
• Other key disadvantages not covered previously?



Study Two

The goals of the second phase of this research were to investigate online survey software choices, select an appropriate software option, and design, develop, and pilot test a Web-based version of the climate questionnaire.

Method

Survey software evaluation. The second phase began with a review of the options provided by various online survey software packages. Software packages differ widely in terms of the number and types of features available. Feature options are largely intended to ease survey design, implementation, analysis, and/or the reporting of results. When evaluating a Web-based survey package, it is important to consider multiple factors including: cost, the availability of essential features, the ease with which features can be used, and the extent to which undesirable features can be turned off.

The list of available software features continues to grow, and evaluating a given package can be an overwhelming task. A “scientific selection” approach to software evaluation can facilitate this process considerably. It is important to assess the organization’s survey needs, evaluate the attributes of affordable software options, and then estimate the degree to which any given package fits the organization’s needs. Table 2 lists many of the currently available features, thereby providing a tool that can be used by practitioners wishing to evaluate the “fit” of the survey software that they currently have in place. Table 2 can also be used by those aiming to select a suitable software package. When selecting survey software, a multiple-hurdle approach may prove useful. Practitioners can begin by expanding Table 2 to include any noteworthy software features not listed. They can then evaluate the importance of the various features and sort the list by importance. This step encourages a focus on relevant software qualities. For example, the size and reliability of the server space offered by a software company is immaterial for organizations that wish to post questionnaires on their own servers; thus, these features should not affect the software selection process. Next, the pool of software candidates should be narrowed to affordable options. Affordable software candidates can then be evaluated in terms of the degree to which they cover areas of importance. This assessment restricts the list of candidates to those that adequately meet organizational needs. Lastly, the user friendliness of important, available features can be looked at to choose among the final contenders.
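
The following is a minimal sketch of how the multiple-hurdle logic described above might be expressed in code. The budget, feature names, importance weights, and candidate packages are hypothetical placeholders used only to illustrate the flow (affordability hurdle, then coverage of essential features, then ranking on usability); none of them describe real products or the authors' actual evaluation.

```python
# Multiple-hurdle screening sketch: drop unaffordable packages, require coverage
# of essential features, then rank survivors on rated usability.
# All names and numbers below are hypothetical illustrations.

BUDGET = 5000  # hypothetical budget ceiling

importance = {            # 1 = nice to have ... 3 = essential
    "item_pool_reuse": 3,
    "concurrent_surveys": 3,
    "host_on_own_server": 3,
    "immediate_results_page": 1,
}

packages = [
    {"name": "Package A", "cost": 3500, "usability": 4.2,
     "features": {"item_pool_reuse", "concurrent_surveys", "host_on_own_server"}},
    {"name": "Package B", "cost": 9000, "usability": 4.8,
     "features": {"item_pool_reuse", "concurrent_surveys", "immediate_results_page"}},
]

def passes_hurdles(pkg):
    affordable = pkg["cost"] <= BUDGET
    essentials = {f for f, weight in importance.items() if weight == 3}
    covers_essentials = essentials <= pkg["features"]
    return affordable and covers_essentials

finalists = sorted((p for p in packages if passes_hurdles(p)),
                   key=lambda p: p["usability"], reverse=True)
print([p["name"] for p in finalists])
```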


Such a systematic approach requires a clear understanding of the priorities and objectives associated with the electronic survey initiative. During the second phase of our study, we therefore explicitly determined our electronic surveying priorities. In terms of priorities, five critical areas of concern emerged. First, we needed software that permitted survey creation and administration by nontechnical personnel. Second, we required adaptable software that would be able to support concurrent, multiple surveys. Third, we sought the capability to create and maintain pools of items, thereby easing the construction of future surveys. Fourth, low infrastructure overhead was important. Fifth and finally, easy Web-based access for survey respondents and decision makers was essential. We needed to minimize the possibility that people would fail to complete the survey or use the results due to access difficulties.

With these five priorities in mind, we next attempted to identify appropriate software. Several approaches were considered. For instance, a standard, off-the-shelf software package could be used or a proprietary system could be developed. After considering both alternatives, the proprietary option was deemed less desirable due to its unpredictable cost, its present and future labor needs, and its potential impact on information technology personnel. The decision to use commercially available software rather than a vendor was largely affected by cost, security, and the continued viability of the survey process. A vendor’s bid to develop a static version of the climate survey with automated feedback was over 300% more costly than customizing and implementing the selected off-the-shelf solution. Data from vendor-based survey solutions suggested that the cost would, at best, remain level for future administrations (assuming a flat cost line from all vendors). In contrast, the recurring cost associated with the off-the-shelf software is limited to a minimal yearly upgrade and support fee, unless additional customization is required. Furthermore, by ensuring that sensitive data would be safeguarded and managed by in-house personnel, off-the-shelf software provided greater security guarantees. Finally, decision makers felt that compared to a vendor, an off-the-shelf system would more readily adapt to changing organizational survey demands in part because in-house personnel could maintain the software and resulting surveys without the inconvenience of going through a vendor.

We proceeded with an extensive review of commercially available survey software and selected a package after considering the costs, features, and capabilities of various options. Naturally, the software selection process entailed tradeoffs between cost and functionality, and our list of priorities helped us identify the features we should sacrifice in the interest of cost. For example, one potentially desirable option was the ability to post updated survey results immediately upon submission (see Table 2, #37).


Although this feature was available, it would have required an additional cost to implement. This extra cost (and the organizational protocol of briefing the results to the commanding general ahead of time) argued against this option, which did not correspond to our list of priorities.

Web-based survey design. We purchased a software package, transferred the climate survey to the Web, and prepared to pilot test the instrument. To provide the most direct comparison with the paper-based administration, the electronic questionnaire was based on items from the 1998 paper survey distributed in Study 1. The two surveys were therefore essentially the same in terms of content, with small exceptions (e.g., a noncontroversial survey dimension, titled military-civilian relations, was dropped after 1998). These differences resulted from the normal survey refinement process and changing stakeholder information requirements. Furthermore, most of the open-ended comment blocks were temporarily eliminated to minimize the burden placed on pilot participants.

An online introductory module was built to precede the Web-based survey. This module was used to announce the survey, address questions regarding its administration, and reassure participants that responses would remain anonymous. Other items stressed in the module included an indication that the data would be sent to survey research staff, clarification regarding the use of the data, an explanation that the survey was a prototype, and instructions for accessing and completing the survey. For the purpose of the pilot test, the module also informed respondents that they would be asked to help evaluate the online survey process.

When designing the online version of the survey itself, we paid special attention to the amount of time it would take to complete the questionnaire, as we did not want to increase the demands placed on the respondent. We kept the graphics to a minimum to increase the speed with which the survey would load onto participants’ computers. We also attempted to minimize scrolling requirements by designing the survey so that the entire question and response heading would appear on a screen without requiring the respondent to scroll from side to side. To stabilize the survey’s appearance across computers, we kept the format simple, using a small number of colors and fonts. We tested the survey with different monitor resolutions, to ensure that it would appear properly in different environments.

Respondents could not access the pilot questionnaire without a passcode assigned by the survey administrators. At this point, we had not yet decided whether the final survey would require passcodes. Because the use of passcodes was a possibility, we included them in the pilot test to confirm that this software feature worked as intended, blocking outsiders and preventing multiple submissions from the same person.


All identifiers were stripped from the instrument, and the process was designed to ensure that responses would remain anonymous.

The online questionnaire was constructed with a save and resume feature, which allowed respondents to close the survey midstream and then continue at a later point in time without losing their initial input. The survey was 27 pages or screens in length, with approximately five items per page. Respondents were allowed to move backward and forward as they wished, and they were allowed to skip survey questions if they so desired. An open-ended item at the end supplied a text box where participants were to type their input. All other items provided objective response scales, where participants were asked to click/select options. A “don’t know” option was not provided. For the purpose of the pilot test, 18 new questions were added to the end of the survey. These items asked participants to assess the online questionnaire in terms of speed, ease of access/completion, and appearance. In addition, respondents were asked to specify the resolution of their monitors, rate their satisfaction with the online process, and indicate concerns regarding anonymity. Finally, an open-ended item allowed input regarding additional concerns.

Web-based survey pilot test procedure. In February of 2001, the 73-item survey⁵ was moved online and pilot tested. The sample used for the prototype was estimated at approximately 12% of the personnel, or 80 people. We used stratified sampling based on four categories: gender, civilian versus military workforce status, large versus small directorate, and high versus low rank/grade. A roster of personnel was used to randomly select 4 to 5 people from each of the 16 profiles, resulting in 78 participants. Furthermore, one directorate requested that its entire staff be sampled. An additional 88 participants were therefore included, resulting in a sample size of 166.
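
For readers who want a concrete picture of the stratified sampling step, here is a minimal sketch of drawing 4 to 5 people from each of the 16 gender x status x directorate-size x grade profiles. The roster field names and values are hypothetical; the article does not describe the actual roster format, so this illustrates the general procedure rather than the code used in the study.

```python
# Sketch of stratified sampling across the 16 profiles formed by crossing four
# two-level factors. Roster field names ('gender', 'status', 'directorate_size',
# 'grade') are hypothetical placeholders.
import random
from itertools import product

PROFILES = list(product(["male", "female"],
                        ["civilian", "military"],
                        ["large", "small"],      # directorate size
                        ["high", "low"]))        # rank/grade; 2 x 2 x 2 x 2 = 16

def draw_pilot_sample(roster, per_profile=5, seed=None):
    """roster: list of dicts with keys 'id', 'gender', 'status', 'directorate_size', 'grade'."""
    rng = random.Random(seed)
    sample = []
    for gender, status, size, grade in PROFILES:
        cell = [p for p in roster
                if (p["gender"], p["status"], p["directorate_size"], p["grade"])
                == (gender, status, size, grade)]
        sample.extend(rng.sample(cell, min(per_profile, len(cell))))
    return sample
```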

Results and Discussion

By the end of the 7-day pilot period, 98 usable responses were received, yielding a 59% response rate. This is slightly higher than the previous paper-based survey return rate, which was 54%. The survey tracking system indicated that one person attempted to complete the questionnaire more than once, and the passcode prevented this occurrence. The ethnic makeup of the respondents was: 14 (14%) African American; 2 (2%) American Indian; 0 (0%) Asian; 68 (69%) Caucasian; 6 (6%) Hispanic; 2 (2%) who placed themselves in an “other” category; and 6 (6%) undeclared. The sample consisted of 53 (54%) civilians, 43 (44%) military personnel, and 2 (2%) people who did not reveal a classification.

⁵See footnote 3.


Overall, 28 (29%) of the respondents were women, 66 (67%) were men, and 4 (4%) did not choose to declare their gender.

To investigate whether some groups were particularly reluctant to respond to the electronic climate survey, we examined the participation rates within the stratified random sample. Forty-nine (63%) of the 78 randomly sampled personnel completed the survey, and chi-square analyses showed no difference in response rates based on gender, civilian versus military status, and work group size.

Eighty-six (88%) of the 98 respondents completed the evaluation items added to the end of the survey to assess the Web-based questionnaire’s usability. Table 3 presents participants’ responses to three evaluative survey items. As indicated in the table, respondents generally liked the online survey and the process for completing it. Approximately one-fourth of the participants indicated that they were concerned about their survey responses being identifiable.

Next, we tested the relationship between several demographic variables and responses to the three items shown in Table 3. Chi-square analyses indicated that no single demographic group (men vs. women; African Americans vs. Caucasians; military vs. civilian personnel) was particularly inclined to dislike/distrust the online survey method. These results, which corroborated the conclusions drawn from the response rate data presented previously, suggested that the online survey medium did not systematically shut out a particular demographic group.

Additional items were presented to gauge the appearance and usability of the online survey. The data indicated that the time requirements were comparable to the paper version of the survey. When asked if they experienced difficulty accessing the survey online, 76 respondents (88%) answered “no.” Additional analyses did not reveal any problems with the size, color, and appearance of the text; however, only 53 respondents (62%) indicated that they could see the entire question and response heading on the screen without scrolling.

Responses to the open-ended question were reviewed and coded to identify any additional problems, concerns, or suggestions for improvement. Thirty (35%) of the respondents generated a total of 32 comments (two people produced compound comments). Of the 32 comments, 16 (50%) suggested ways to improve the survey content but did not address the online format. Overall, 5 (16%) of the comments indicated that no improvement was needed; 5 (16%) suggested technical format improvements, such as “too much scrolling required” and “include more items per page”; 2 (6%) comments addressed system issues, such as lengthy download time per page; and the remaining 4 (13%) comments were unique contributions addressing some other area. Only one of these other comments expressed a concern with being identified.

TABLE 3
Reactions to Online Climate Survey Expressed During Pilot Study

                                                        Level of agreement
                                          Strongly                                        Strongly    Not
Survey item                               disagree    Disagree    Neutral     Agree       agree       answered     n     M      SD

The online survey is a useful way to
  complete the command climate survey.    1 (1%)      4 (5%)      9 (10%)    52 (60%)    19 (22%)    1 (1%)       86   3.99   0.79
I am satisfied with the online process
  for completing the command climate
  survey.                                 3 (3%)      3 (3%)     13 (15%)    52 (60%)    15 (17%)    0 (0%)       86   3.85   0.88
I am not concerned about my survey
  responses being identifiable.           8 (9%)     14 (16%)    20 (23%)    30 (35%)    13 (15%)    1 (1%)       86   3.31   1.20

Notes: Percentages provided in parentheses are based on the group total that appears in the same row. Percentages are rounded to the nearest whole number. Percentages in the same row may not total to 100% due to rounding error.


On the whole, the Study 2 results were encouraging. Although the pilot survey length and items were highly similar to the 1998 climate survey, the time required for the deployment, collection, and analysis of data was greatly reduced. Results indicated that the survey appeared as intended on most respondents’ computers, with some exceptions. Problems reported by respondents concentrated on screen presentation/scrolling and response anonymity. It was therefore determined that some graphical rework was in order. Furthermore, we wished to ease confidentiality apprehensions as much as possible prior to the online survey’s full implementation. Related to this objective, several organizational decision makers expressed concern regarding the use of passcodes. After a considerable amount of discussion, we decided to discontinue the use of passcodes prior to distributing the final survey to the entire organization in an effort to enhance perceptions of anonymity. Fortunately, the organization under investigation had a very secure computing environment. It was therefore highly improbable that an unauthorized person outside of the workforce would be able to access the climate survey, even with no passcodes in place. Furthermore, the likelihood of anyone taking the time to complete this particular questionnaire multiple times was presumed to be very small.

In short, it was concluded that the online survey system was conceptually sound and warranted continuation subsequent to the following modifications: (a) the use of passcodes would be eliminated; (b) the introductory module would be revised to eliminate the portions indicating that the survey was a pilot test; (c) more items would be included per page, thereby reducing the number of total pages from 27 to 11; (d) some additional graphical rework would occur to minimize scrolling requirements; (e) the open-ended response blocks would be added back into the final version of the survey as planned; and (f) the pilot items, which asked respondents to evaluate the usability of the survey, would be removed.

Study Three

The objectives of the third phase of this research were to fully deploy the online survey for the purpose of assessing command climate and to evaluate the electronic survey initiative in terms of the workforce’s participation rate and respondents’ survey satisfaction.

Method

The final survey was made available to the overall workforce in July 2001. Except for the modifications mentioned previously (e.g., elimination of passcodes), this instrument was the same as the one described in the Study 2 pilot test. Data collection was terminated after 10 days.



Results and Discussion

The 73-item⁶ electronic version of the command climate survey was completed and returned by 403 employees (a 60% response rate). Demographic analyses indicated that the sample was similar to the 1998 paper survey respondent group and reflective of the demographics within the organization as a whole. The 2001 survey was completed by 226 (56%) civilians, 168 (42%) military personnel, and 9 (2%) people who did not reveal their classifications. Overall, 139 (34%) of the respondents were women, 253 (63%) were men, and 11 (3%) did not indicate their gender. The ethnic makeup of the sample was: 72 (18%) African American; 3 (1%) American Indian; 8 (2%) Asian; 261 (65%) Caucasian; 17 (4%) Hispanic; and 17 (4%) undeclared. All directorates were represented in the respondent group.

To assess the organization’s reactions to the electronic survey, two measures were analyzed. First, we calculated the paper- and Web-based survey response rates to compare the workforce’s willingness to complete the two different versions of the survey. Second, we compared the 1998 (paper) and the 2001 (Web-based) respondents in terms of their satisfaction with the survey format. In terms of participation, results indicated that the questionnaire’s transition to the Web was accompanied by a slight increase in response rate. Approximately 60% of the workforce participated in the electronic survey; only 54% completed the paper version that preceded it.
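
The article reports this response-rate comparison descriptively. If a formal test were desired, a two-way frequency table of medium by response status could be analyzed as sketched below; the eligible-population counts are rough back-calculations from the reported rates (437/.54 and 403/.60) and are assumptions for illustration, not figures provided by the authors.

```python
# Sketch of a chi-square comparison of the 1998 paper and 2001 Web response rates.
# Eligible counts are approximations inferred from the reported rates, not reported data.
from scipy.stats import chi2_contingency

paper_responded, paper_eligible = 437, round(437 / 0.54)  # roughly 809 eligible
web_responded, web_eligible = 403, round(403 / 0.60)      # roughly 672 eligible

table = [
    [paper_responded, paper_eligible - paper_responded],  # 1998 paper: responded, did not
    [web_responded, web_eligible - web_responded],        # 2001 Web: responded, did not
]

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```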

Next, we examined replies to the following questionnaire item, which was offered on the 1998 paper and the 2001 Web-based surveys: “I am satisfied with the content and format of the command climate survey.” Participants were asked to use a 1 to 5 scale to rate their agreement with this statement. Because the survey content remained quite stable across the 1998 and 2001 administrations, ratings of this item were intended to provide an assessment of respondents’ reactions to the survey medium. For ease of reporting, the 5-point scale was collapsed into three categories: satisfied (the top two scale options), neutral (the midpoint of the rating scale), and dissatisfied (the bottom two scale options). Overall, 13% of the paper-and-pencil respondent group was dissatisfied with the survey, whereas only 7% of the online respondent group expressed dissatisfaction. Similarly, the percentage of respondents that felt neutral about the paper survey (31%) exceeded the percentage that felt neutral about the online survey (19%).

%ee footnote #3.

LORI FOSTER THOMPSON ET AL. 217

vey (19%). Lastly, the percentage that felt satisfied with the paper-and- pencil survey (57%) was substantially smaller than the percentage that felt satisfied with the electronic survey (74%).
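As an illustration of the category-collapsing step described above, the short sketch below maps 1-to-5 agreement ratings onto the three reporting categories and tabulates the percentages. The rating vectors are invented examples, since the item-level data are not reproduced in this article.

# A minimal sketch of collapsing a 5-point agreement scale into three categories.
from collections import Counter

def collapse(rating: int) -> str:
    """Map a 1-5 agreement rating onto the three reporting categories."""
    if rating <= 2:
        return "dissatisfied"
    if rating == 3:
        return "neutral"
    return "satisfied"

def category_percentages(ratings):
    counts = Counter(collapse(r) for r in ratings)
    n = len(ratings)
    return {cat: round(100 * counts[cat] / n)
            for cat in ("dissatisfied", "neutral", "satisfied")}

paper_1998 = [4, 3, 2, 5, 3, 4, 1, 5, 4, 3]   # hypothetical example responses
web_2001 = [5, 4, 4, 3, 5, 4, 4, 5, 2, 4]     # hypothetical example responses

print("1998 paper:", category_percentages(paper_1998))
print("2001 Web:  ", category_percentages(web_2001))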

Analyses were conducted to compare the following demographic subgroups in terms of their satisfaction with the climate questionnaire: males versus females, African Americans versus Caucasians, and civilian versus military personnel. No gender or race differences were found. Military personnel (M = 3.68, SD = 0.92) expressed more satisfaction than did civilian personnel (M = 3.47, SD = 0.99) with the paper survey completed in 1998 (t[372] = -2.07, p = .039); however, military and civilian respondents did not differ in terms of their satisfaction with the 2001 online survey.
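The military-civilian comparison reported above is an independent-samples t test. A minimal sketch of that computation appears below, with hypothetical rating arrays standing in for the 1998 item-level data; the reported degrees of freedom (372) suggest a pooled-variance test, which equal_var=True mirrors.

# A minimal sketch of the military-versus-civilian satisfaction comparison.
import numpy as np
from scipy.stats import ttest_ind

military_ratings = np.array([4, 5, 3, 4, 4, 5, 3, 4])   # placeholder values
civilian_ratings = np.array([3, 4, 3, 5, 2, 4, 3, 4])   # placeholder values

t_stat, p_value = ttest_ind(military_ratings, civilian_ratings, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")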

Before moving to the final phase of this research, it is important to acknowledge that factors other than (or in addition to) the survey medium could have affected both the response rate and the survey satisfaction indices examined in Study 3. The survey medium is confounded with time. The absence of random assignment and the presence of an excessive time lag create the possibility that variables related to the respondents and their perceptions of the organization or its leadership could have affected willingness to participate and satisfaction with the survey.

Study Four

In an effort to evaluate the online survey process and better understand why nonrespondents did not complete the electronic climate questionnaire, we asked the workforce to complete a follow-up survey in September 2001. The following three research questions guided this initiative: (a) Why did some personnel fail to respond to the electronic climate survey? (b) How did those who completed the electronic climate survey feel about the process? (c) What predicts one's willingness to complete an electronic survey in the future?

Method

A follow-up survey was distributed to the overall workforce in an electronic (Web-based) format. The final instrument, which was 12 items in length, did not request demographic information; however, it did ask the workforce to indicate whether they had completed the 2001 electronic climate survey. Those who did not participate in the electronic climate survey were asked to indicate the primary reason for their nonresponse. Those who did participate were asked to rate their reactions to it and their willingness to participate in future online surveys. Finally, open-ended items were included at the end of the survey to assess: (a) problems or issues with the online survey process or format, and (b) positive experiences or thoughts associated with the online survey process.

The design philosophies described previously (e.g., simple format, minimal number of colors) guided the construction of the follow-up survey, which was not password protected. In September 2001, the entire workforce was sent an e-mail request to participate. Data collection was terminated after 11 working days.

Results and Discussion

The 12-item follow-up questionnaire was completed and returned by 180 people (a 27% response rate). Five participants who produced logically inconsistent data were eliminated from the analyses, yielding a final sample size of 175. Approximately 79% (138 of the 175 respondents) indicated that they had completed the 2001 electronic climate survey.

Why did some personnel fail to respond to the electronic climate survey? Thirty-six of the 37 people who did not take the 2001 electronic climate survey provided reasons for their nonresponse: 12 (33%) were away at the time of the climate survey; 9 (25%) were new arrivals who did not belong to the organization at the time of the climate survey; 5 (14%) were too busy; 3 (8%) indicated that they did not receive the e-mail message requesting their participation; 3 (8%) were concerned about being identified; 1 (3%) experienced technical problems that prevented survey completion; and the remaining 3 (8%) cited some other reason for not completing the survey.

How did the personnel who completed the electronic climate survey feel about the process? The 138 follow-up respondents who previously participated in the electronic climate survey were asked to use a 1 to 5 scale to indicate their reactions to the process. Table 4 presents the responses to four follow-up evaluative items. As indicated in the table, personnel generally liked the online survey and the process for completing it. Furthermore, most did not experience technical difficulties during the process, and the majority intended to participate in future online surveys. Several open-ended items were included to further gauge the workforce's reactions. First, we asked participants to indicate any problems they had with the survey process or format. Twenty-eight (20%) of the 138 people who previously completed the online climate survey generated a total of 32 responses. Fifteen of these responses were unrelated to the online process/format. The remaining 17 responses fell into one of four categories: 6 comments described a problem accessing part or all of the survey; 6 expressed confidentiality/security concerns; 3 complained that an aspect of the survey was not user friendly; and 2 were unique comments that did not fall into the preceding categories.

TABLE 4
Reactions to Online Climate Survey Expressed During Follow-Up Initiative

The online survey is a useful way to complete the command climate survey.
    Strongly disagree 4 (3%); Disagree 3 (2%); Neutral 13 (9%); Agree 60 (43%); Strongly agree 57 (41%); Not answered 1 (1%); n = 138; M = 4.19; SD = 0.91

I am satisfied with the online process for the command climate survey.
    Strongly disagree 4 (3%); Disagree 3 (2%); Neutral 10 (7%); Agree 66 (48%); Strongly agree 54 (39%); Not answered 1 (1%); n = 138; M = 4.19; SD = 0.89

I experienced no technology issues while completing the online command climate survey.
    Strongly disagree 4 (3%); Disagree 8 (6%); Neutral 5 (4%); Agree 62 (45%); Strongly agree 58 (42%); Not answered 1 (1%); n = 138; M = 4.18; SD = 0.96

I will participate in future online surveys.
    Strongly disagree 5 (4%); Disagree 0 (0%); Neutral 9 (7%); Agree 67 (49%); Strongly agree 55 (40%); Not answered 2 (1%); n = 138; M = 4.23; SD = 0.87

Notes: Percentages in parentheses are based on the group total (n) shown in the same row and are rounded to the nearest whole number. Percentages in the same row may not total 100% due to rounding error.
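As a check on how the Table 4 statistics follow from the reported frequencies, the sketch below recomputes the mean and standard deviation for the first item from its response counts, under the assumption that the "not answered" cases are excluded; the result is consistent with the tabled M = 4.19 and SD = 0.91.

# A worked check of the first Table 4 row, recomputed from the frequency counts.
import math

# Strongly disagree (1) .. strongly agree (5) frequencies for
# "The online survey is a useful way to complete the command climate survey."
freqs = {1: 4, 2: 3, 3: 13, 4: 60, 5: 57}

n = sum(freqs.values())                                   # 137 valid responses
mean = sum(score * f for score, f in freqs.items()) / n
var = sum(f * (score - mean) ** 2 for score, f in freqs.items()) / (n - 1)
sd = math.sqrt(var)

print(f"n = {n}, M = {mean:.2f}, SD = {sd:.2f}")          # approx. M = 4.19, SD = 0.91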

Finally, we asked for positive experiences associated with the online survey process. Thirty-six (26%) of the 138-member sample generated 42 comments in response to this item. Six of these comments were unrelated to the online survey process. Eleven expressed a general positive sentiment (e.g., great idea, better than bubbling in responses). The remaining 25 comments highlighted specific benefits of online surveying: 12 indicated that online surveys are easy to use; 4 underscored the time savings associated with electronic surveys; 3 expressed an increased sense of privacy with the online method; 2 noted that the electronic survey was environmentally friendly; and 4 emphasized unique benefits that did not fall into the preceding categories.

What predicts willingness to complete an electronic survey in the future? Finally, we conducted a multiple regression analysis to determine whether the following experience and attitudinal variables predict intentions to participate in future electronic surveys: (a) satisfaction with the previous online process; (b) technology problems during previous online survey completion; and (c) a general belief that the online survey method is useful and effective. Table 5 presents the results of this analysis, which included data from the 136 follow-up participants who previously completed the electronic climate survey (two cases were eliminated from the initial 138-member sample due to incomplete data). As can be seen, satisfaction with the previous online survey process and technology problems during the online survey significantly predicted willingness to respond to electronic questionnaires in the future. These two past-experience predictors together accounted for 65% of the variability in intentions to complete future online surveys. Respondents' evaluations of the usefulness of online survey methods did not contribute unique variance over and above that provided by the two past-experience predictors.
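For readers who wish to reproduce this kind of analysis, the sketch below illustrates a three-step hierarchical regression of the sort summarized in Table 5, using the statsmodels library and a small invented data set in place of the 136 follow-up responses; variable names and values are hypothetical, and standardized coefficients would additionally require z-scoring the variables.

# A minimal sketch of a three-step hierarchical regression (illustrative data).
import pandas as pd
import statsmodels.api as sm

# Hypothetical ratings: satisfaction with the past online process, absence of
# technology problems, perceived usefulness, and willingness to participate again.
df = pd.DataFrame({
    "satisfaction": [5, 4, 4, 3, 5, 2, 4, 5, 3, 4],
    "no_tech_problems": [5, 5, 4, 3, 4, 2, 5, 5, 3, 4],
    "usefulness": [5, 4, 4, 3, 5, 3, 4, 5, 3, 4],
    "willingness": [5, 4, 5, 3, 4, 2, 4, 5, 3, 4],
})

blocks = [["satisfaction"],
          ["satisfaction", "no_tech_problems"],
          ["satisfaction", "no_tech_problems", "usefulness"]]

prev_r2 = 0.0
for i, predictors in enumerate(blocks, start=1):
    X = sm.add_constant(df[predictors])
    model = sm.OLS(df["willingness"], X).fit()
    print(f"Model {i}: R2 = {model.rsquared:.3f}, "
          f"delta R2 = {model.rsquared - prev_r2:.3f}")
    prev_r2 = model.rsquared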

TABLE 5
Standardized Coefficients for Regression of Willingness to Complete Future Online Surveys on Particular Attitudes and Past Experiences

Criterion: Willingness to complete future online surveys

Model 1 (R² = .617**)
    Satisfaction with online process during past survey: β = .786**

Model 2 (R² = .651**, ΔR² = .033**)
    Satisfaction with online process during past survey: β = .620**
    Absence of technology problems during past online survey: β = .246**

Model 3 (R² = .656**, ΔR² = .005)
    Satisfaction with online process during past survey: β = .488**
    Absence of technology problems during past online survey: β = .236**
    Evaluation of online method usefulness: β = .158

**p < .01

Overall, the follow-up survey suggested that anonymity concerns and technology troubles were not exceedingly problematic during the electronic climate questionnaire, and that workers' initial experiences with online surveys largely predicted their willingness to participate in such initiatives in the future. It is important to note that a paper format for the follow-up survey, which was not possible due to organizational constraints, would have been preferable during this fourth and final phase of the research. The Study 4 results should be interpreted with caution, as it is unclear whether the data accurately reflect the sentiments of the broader organization. People who were extremely resistant to online surveys due to anonymity concerns may have purposely avoided the follow-up survey because of its electronic format. In addition, those who could not access the electronic climate survey due to technology problems may have encountered similar difficulties the second time around, thereby preventing their participation in the online follow-up survey.

General Discussion

This study addressed a gap in the literature via a multimethod assessment of people's apprehensions about and reactions to electronic surveys. We questioned workers before, during, and after the period in which their climate survey moved online. This paper also elaborated on some of the issues that must be considered when implementing Web-based surveys. In general, we found that, although there were some initial anonymity concerns, most personnel spanning a variety of demographic categories were amenable to online surveying.

Lessons Learned and Practical Considerations

A number of noteworthy findings were uncovered during this research initiative, producing "lessons learned" that apply to practitioners who want to refine their current online survey processes as well as those who are considering the transition from paper to pixels. For starters, we discovered that most people were not against the idea of completing their surveys online. Among those who opposed Web-based surveys, confidentiality was the primary issue. Several approaches were used to allay these concerns: We developed an introductory presurvey module to assure respondents of their anonymity, we limited the number of demographic questions on the survey, and we allowed survey access without passcodes. One or all of these tactics probably increased perceptions of anonymity during the survey's full implementation. Other ideas for enhancing perceived anonymity, which were not employed here, include giving personnel the option of printing the survey and turning it in (Sheehan & McMillan, 1999), setting up kiosks to alleviate the concern that responses will be traced back to personal computers, encouraging personnel to swap passcodes to emphasize that passcodes are not tracked, and using an outside company to collect data (Zatz, 2000).

Next, we discovered that the development of a successful Web-based survey required a considerable up-front time investment. Time was spent establishing survey software priorities, projecting the organization's future needs for such software, evaluating and selecting adequate technology, programming the questionnaire, pilot testing the draft, and modifying the survey based on pilot findings. Practitioners who are developing or refining online surveys should take care to schedule time for each of these activities. Notably, most personnel liked the finished product, and the quality of workers' initial experiences with the electronic questionnaire predicted their willingness to complete online surveys in the future. Thus, ongoing benefits seemingly result from initial time investments.

The literature suggests that a "digital divide" separates those who are and are not comfortable with Internet technology, and demographic variables such as race are related to this problem (Eastin & LaRose, 2000). It was encouraging to learn that, across the various phases of this research, there was no indication that the Web-based medium systematically discouraged participation from any one type of person (based on gender, race, military vs. civilian classification, and directorate size). Barring one discrepancy between military and civilian personnel on the initial assessment that measured support of online survey techniques, demographic subgroup comparisons revealed no differences in response rates or attitudes toward online surveys. From a practical standpoint, this suggests that the online survey medium itself does not tend to oversample any particular demographic subgroup.

A rather unexpected lesson involved an increased potential for oversurveying. After the implementation of the Web-based climate survey, the project stakeholders were inundated with requests from various organizational elements to use the electronic surveying capacity. With paper questionnaires, the barriers of printing and distribution hold the number of different surveys dispersed in check. In contrast, the ease of use and immediacy of electronic surveying allow organizations to deploy multiple questionnaires daily if desired. Practitioners should be advised that a possible consequence of Web-based surveys is the propensity to oversurvey employees. With access to the survey software, various units may develop and implement their own surveys regularly. Moreover, organizational decision makers who witness the ease with which Web-based survey data are collected may request surveys more and more frequently. In short, without the natural barriers of conventional surveying, there is the potential for surveying run amok, which can lead to a variety of problems, including: (a) poorly conceived, poorly designed, or hastily implemented surveys that fail to capture the desired information and in turn irritate leaders and employees; (b) overreliance on questionnaires when surveying may not be the most appropriate needs assessment technique for data collection; (c) poor participation in future surveys (Luong & Rogelberg, 1998); and (d) low morale or negative employee reactions, which can result when decision makers are not in a position to promptly act on the large stockpile of employee input that has been collected (Rosenfeld, Doherty, Vicino, Kantor, & Greaves, 1989). All of these problems undermine the truly important questionnaires and the organizational development goals that surveys are used to facilitate. Perhaps the best strategy for dealing with the challenges of oversurveying is to prevent the problems up front. For example, organizations may wish to establish and communicate a policy on data collection and surveying. In addition, they may provide all people who will be authorized to develop and implement surveys with certification training, or they may wish to restrict surveying to the experts. Controlling the allocation of electronic survey resources through a centralized process or employee can also reduce the problem of oversurveying, as can the creation of a survey review and approval process or clearinghouse for the organization.

During the various phases of this study, we were reminded that the survey medium is only one determinant of a questionnaire's acceptance. For example, the manner in which the organization has used past survey results can influence acceptance, as can attitudes toward the survey sponsor (Luong & Rogelberg, 1998). The acceptance of any survey, including a Web-based instrument, is a function of many variables. Most of the traditional rules for facilitating survey acceptance (e.g., keeping surveys to a reasonable length, providing timely feedback) remain important regardless of the survey medium (Luong & Rogelberg, 1998). A Web-based questionnaire that does not adhere to these principles is not likely to be accepted and completed by personnel.

Although the current study encourages the use of electronic questionnaires, it is important to acknowledge that online surveys are not always the best solution. From a practical standpoint, computerized questionnaires make the most sense when large populations of far-flung personnel with adequate computer skills and access will be asked to complete one or more surveys at various points in time. At the other extreme, paper-and-pencil questionnaires would be a wise choice when a manageable number of accessible personnel who lack computer skills and access need to be surveyed at only a single point in time. As this study suggested, it takes time and money to move a survey online. This phase of the process is initially more expensive than creating a paper-based survey. The expense, which is likely to pay off when invested in a survey that is administered regularly, may not be worthwhile for single-administration questionnaires. Organizations that frequently survey various types of employees will probably benefit from investments in Web-based survey technology, whereas institutions that rarely conduct surveys will see smaller returns on their investments in survey software and related technologies.

Limitations and Future Directions

When considering the results and conclusions of this study, several limitations should be noted. To start with, it is important to recognize that variables other than the survey medium (time, organizational leadership, etc.) could have caused personnel to respond to the 1998 and 2001 survey formats differently. As is often the case with field research, we sacrificed some degree of internal validity (random assignment was not employed) in exchange for external validity (data were collected in a real work setting). We chose a multimethod assessment of the workforce's attitudes toward Web-based surveys in order to offset the preceding limitation. Nonetheless, it would be very useful for follow-up research to employ random assignment and compare reactions to identical paper and electronic surveys administered simultaneously.

Although the Study 2 pilot test and the Study 4 follow-up initiative indicated that participants liked the online survey and the process, it is necessary to acknowledge that results could have been skewed because those opposed to the process altogether may have failed to respond. Study 2 rating data revealed that approximately one-fourth of the participants were skeptical about the security and confidentiality of their responses. Presumably, there will always be a certain percentage of employees with this concern, regardless of the survey medium. Stated differently, it is unclear whether respondents' concerns about anonymity were greater than the anonymity concerns associated with paper surveys. Furthermore, asking questions about confidentiality may itself heighten concerns about the issue. It is possible that confidentiality was less of a concern before we asked about it; thus, our Study 2 numbers may overestimate this apprehension because we cued the respondents with the question. Indeed, only 10% of the (uncued) Study 1 respondents expressed anonymity concerns when asked to share their thoughts about electronic surveys.

It is also uncertain whether our results generalize to other populations. Replication work in private industry should test the degree to which the current results generalize to nonmilitary settings. Beyond the military-civilian distinction, the implications of additional characteristics of the present sample should be considered. For instance, computer access and skills were not a problem for our participants. One must question whether other kinds of personnel (e.g., manufacturing employees) would have reacted to an online survey differently. Follow-up research should test whether the link between survey administration mode and participant acceptance is moderated by variables such as level within the organization, comfort with computers, Internet self-efficacy, and convenience of computer access. Is there a digital divide based on these variables, which were not measured in the present study?

Next, it is important to acknowledge the atypically long timeframe over which the current study transpired. In most organizations, the development, deployment, and evaluation of a Web-based survey initiative will need to occur much more quickly. Practitioners in other settings could easily accelerate the process employed in the current study. Naturally, each organization is unique, with its own set of constraints and politics, and the requisite timeframe will vary from one institution to the next. In the context of the organization examined in this study, the entire transition from paper to pixels, excluding the initial paper poll that assessed reactions to the possibility of an online climate survey, occurred over a period of 6 months. (This estimate does not include the Study 4 follow-up survey administered in September 2001.) Importantly, the initial paper poll did not have to be attached to the 1998 paper climate survey; it could have occurred just prior to the investigation of survey software alternatives. Had we administered these questions as a standalone paper survey in 2001, the process described in this paper could readily have taken place within a matter of months. In short, the lessons learned during the present initiative can be helpful to practitioners working under time pressure.

Lastly, it is unclear whether our results generalize to other online surveys. Not all electronic survey processes are the same, and the specific process may affect user acceptance. Future research should examine the extent to which tactics such as introductory modules and the use of passcodes influence perceptions of anonymity and willingness to participate in online surveys. It would also be interesting to investigate whether passcode-swapping options and the use of outside vendors strengthen reassurances of confidentiality. Furthermore, future work should look at the effects of various survey design features such as "save and resume" options.


In conclusion, the present study provided an initial estimate of the ease with which different types of people working in real-world settings will freely complete personnel surveys that are conducted online. Much more research needs to occur in order to fully answer questions concerning who does not respond to online surveys and why. Alternative designs and samples are imperative. For example, it would be useful to give employees the option of either a paper-and-pencil or an online survey and to include items assessing the reasons why respondents did not opt for the survey medium they rejected. Such a program of research will help to ensure that electronic survey media do not inadvertently silence important segments of the workforce.

REFERENCES

Church AH. (2001). Is there a method to our madness? The impact of data collection methodology on organizational survey results. Personnel Psychology, 54, 937-969.
Couper MP. (2000). Web surveys: A review of issues and approaches. Public Opinion Quarterly, 64, 464-494.
Dillman DA. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York: Wiley.
Dommeyer CT, Moriarty E. (2000). Comparing two forms of an e-mail survey: Embedded versus attached. International Journal of Market Research, 42, 39-50.
Donovan MA. (2000, April). Web-based attitude surveys: Data and lessons learned. Unpublished paper presented at the 15th Annual Conference of the Society for Industrial and Organizational Psychology, New Orleans, LA.
Donovan MA, Drasgow F, Probst TM. (2000). Does computerizing paper-and-pencil job attitude scales make a difference? New IRT analyses offer insight. Journal of Applied Psychology, 85, 305-313.
Eastin MS, LaRose R. (2000). Internet self-efficacy and the psychology of the digital divide. Journal of Computer-Mediated Communication, 6(1), n.p.
King WC, Miles EW. (1995). A quasi-experimental assessment of the effect of computerizing noncognitive paper-and-pencil measurements: A test of measurement equivalence. Journal of Applied Psychology, 80, 642-651.
Kraut AI. (2001, April). An e-mail letter to a friend. The Industrial-Organizational Psychologist, 38(4), 37-39.
Kraut AI, Saari LM. (1999). Organizational surveys: Coming of age for a new era. In Kraut AI, Korman AK (Eds.), Evolving practices in human resource management: Responses to a changing world of work (pp. 302-327). San Francisco: Jossey-Bass.
Luong A, Rogelberg SG. (1998, July). How to increase your survey response rate. The Industrial-Organizational Psychologist, 36(1), 61-65.
Magnan SM, Lundby KM, Fenlason KJ. (2000, April). Dual media: The art and science of paper and Internet employee survey implementation. Unpublished paper presented at the 15th Annual Conference of the Society for Industrial and Organizational Psychology, New Orleans, LA.
McFarland LA, Ryan AM, Paul KB. (1998, April). Equivalence of an organizational attitude survey across administration modes. Unpublished paper presented at the 13th Annual Conference of the Society for Industrial and Organizational Psychology, Dallas, TX.
Oppermann M. (1995). E-mail surveys: Potentials and pitfalls. Marketing Research, 7(3), 29-33.
Paolo AM, Bonaminio GA, Gibson C, Partridge T, Kallail K. (2000). Response rate comparisons of e-mail- and mail-distributed student evaluations. Teaching and Learning in Medicine, 12, 81-84.
Parker L. (1992). Collecting data the e-mail way. Training and Development, 46, 52-54.
Rosenfeld P, Booth-Kewley S, Edwards J, Thomas M. (1996). Responses on computer surveys: Impression management, social desirability, and the big brother syndrome. Computers in Human Behavior, 12, 263-274.
Rosenfeld P, Doherty LM, Vicino SM, Kantor J, Greaves J. (1989). Attitude assessment in organizations: Testing three microcomputer-based survey systems. Journal of General Psychology, 116, 145-154.
Saphore RB. (2000). A psychometric comparison of an electronic and classical survey instrument. Dissertation Abstracts International Section A: Humanities and Social Sciences, 60(11-A), 3976.
Schmidt WC. (1997). World Wide Web survey research: Benefits, potential problems, and solutions. Behavior Research Methods, Instruments, and Computers, 29, 274-279.
Schuldt BA, Totten JW. (1994). Electronic mail vs. mail survey response rates. Marketing Research, 6, 36-39.
Sheehan KB. (2001). E-mail survey response rates: A review. Journal of Computer-Mediated Communication, 6(2), n.p.
Sheehan KB, McMillan SJ. (1999). Response variation in e-mail surveys: An exploration. Journal of Advertising Research, 39, 45-54.
Spera SD. (2000, April). Transitioning to Web survey methods: Lessons from a cautious adopter. Unpublished paper presented at the 15th Annual Conference of the Society for Industrial and Organizational Psychology, New Orleans, LA.
Spera SD, Moye NA. (2001, April). Measurement equivalence between paper and Web survey methods in a multinational company. Unpublished paper presented at the 16th Annual Conference of the Society for Industrial and Organizational Psychology, San Diego, CA.
Stanton JM. (1998). An empirical assessment of data collection using the Internet. Personnel Psychology, 51, 709-725.
Weible R, Wallace J. (1998). Cyber research: The impact of the Internet on data collection. Marketing Research, 10(3), 19-31.
Witte RS, Witte JS. (2001). Statistics (6th ed.). New York: Harcourt College.
Workforce Compensation and Performance Service. (1991). Introduction to the position classification standard. Retrieved June 29, 2002 from the World Wide Web: http://www.opm.gov/fedclass/gsintro.pdf
Yost PR, Homer LE. (1998, April). Electronic versus paper surveys: Does the medium affect the response? Unpublished paper presented at the 13th Annual Conference of the Society for Industrial and Organizational Psychology, Dallas, TX.
Young SA, Daum DL, Robie C, Macey WH. (2000, April). Paper versus Web survey administration: Do different methods yield different results? Unpublished paper presented at the 15th Annual Conference of the Society for Industrial and Organizational Psychology, New Orleans, LA.
Yun GW, Trumbo CW. (2000). Comparative response to a survey executed by post, e-mail, and Web form. Journal of Computer-Mediated Communication, 6(1), n.p.
Zatz D. (2000). Create effective e-mail surveys. HR Magazine, 45, 97-103.