
Tourism Management 31 (2010) 335–340

Contents lists available at ScienceDirect

Tourism Management

journal homepage: www.elsevier.com/locate/tourman

Using virtual communities in tourism research

Steven F. Illum a,*, Stanislav H. Ivanov b, Yating Liang a

a Missouri State University, 901 South National Ave., Springfield, MO 65897, USA
b International University College, 3 Bulgaria Str., 9300 Dobrich, Bulgaria

Article info

Article history:
Received 21 May 2008
Accepted 24 March 2009

Keywords:
Tourism research
Online survey
Survey methodology
Virtual communities
Response rate

* Corresponding author. Tel.: +1 417 836 4773; fax: +1 417 836 4200.
E-mail addresses: [email protected] (S.F. Illum), [email protected] (S.H. Ivanov).

0261-5177/$ – see front matter © 2009 Elsevier Ltd. All rights reserved.
doi:10.1016/j.tourman.2009.03.012

Abstract

In the current paper, the validity of web-based surveys was examined. An online questionnaire was sent to members of tourism research and travel-related virtual communities. The results showed extremely low opening and response rates, leading to biased samples. Several possible explanations for the results are given, along with practical recommendations for the implementation of web-based surveys.

© 2009 Elsevier Ltd. All rights reserved.

1. Introduction

An online virtual community (VC) is defined as a group of people trying to achieve certain purposes, with a similar interest, interested in relationship building, transaction, and fantasy under certain rules by using new information technology as their means (Kim, Lee, & Hiemstra, 2004: 345). It is "an aggregation of individuals or business partners who interact around a shared interest, where the interaction is at least partially supported and/or mediated by technology and guided by some protocols or norms" (Porter, 2004).

VCs can be used by businesses including tourist firms to create new types of services, enhance existing products and create new divisions and capabilities (Wang, Yu, & Fesenmaier, 2002), strengthen their positive image, establish relationships with their customers and contribute to customer loyalty and sales (Kim et al., 2004). Business potential is mainly used to create increased trust among a VC's members combined with quality services that may improve customer loyalty (Kardaras & Karakostas, 2007). Rothaermel and Sugiyama (2001) argue that a member's off-site communication, experience, perceived value of site management, content, and collectively-held knowledge are positively associated with a member's e-based economic transactions within a specific virtual community (TimeZone.com, in this case).

1.1. Rationale of research

There is a broadening emergence of VCs and a rise of research on using them for data collection (a scholarly journal is dedicated to web communities: the International Journal of Web Based Communities). Virtual communities have been an object of research since their dawning. Research is focused on social networking (Snyder, Carpenter, & Slauson, 2007), reasons for (Dholakia, Bagozzi, & Pearo, 2004; Wang & Fesenmaier, 2004), and consequences from (Kim et al., 2004) participating in VCs. The practice of using VCs as a source of information and an outlet for distributing a questionnaire has recently been adopted in social studies research (Chalkiti & Sigala, 2008; Kim et al., 2004; Thomas, Peters, & Tolson, 2007). In social studies research, surveys are often employed as measurement tools to collect information, especially in applied social research (Trochim, 2006). Many web-based studies use surveys because of their convenience, low cost and time efficiency. However, there are methodological issues related to web-based surveys, such as the nature of the sample, response rate, privacy and confidentiality, that may affect the validity of the findings (Duffy, 2002). This paper intends to add to the growing body of literature by presenting related outcomes from an online survey performed by the authors on the usability of highway maps by auto travellers. Its aim for tourism academics is to analyse the validity of web-based surveys, to identify potential pitfalls in such a method and to recommend measures for coping with non-response and other practical problems.

2. Literature review

2.1. Web-based surveys

Web-based surveys have been widely used in social science research (Converse, Wolfe, Huang, & Oswald, 2008; Kiernan, Kiernan, Oyler, & Gilles, 2005; Zhang, Wang, & Chen, 2001). They have been both praised and deeply criticized. Academics usually identify the rapid turnaround time as a major advantage of web surveys. Truell, Bartlett, and Alexander (2002), for example, stated that 85% of the total responses in a web survey are received within one week of the initial or follow-up contact. Jansen, Corley, and Jansen (2006: 5) added that web-based surveys could reach a large number of potential respondents quickly, provide for the use of multiple-question formats, provide direct database connectivity, allow for data quality checking, customized instrument delivery, and guaranteed confidentiality, all of which they suggested can serve to improve the reliability of the data. In comparison with other survey methods, web surveys could be at least as effective as traditional mail instruments (Kiernan et al., 2005; Truell et al., 2002), but higher response rates could be achieved if combined with mail contacts with respondents (Converse et al., 2008).

The web-based survey is not without its shortcomings. Jansen et al. (2006: 5) mentioned as drawbacks the time-consuming development of the software and questionnaire, limited access to potential users (only those with Internet access), potential technological problems, the possibility of poor security threatening the validity of the study, and lack of control over the sample, which is sometimes considered to be biased. Fricker and Schonlau (2002) also concluded that academics' perceptions about the costs and speed of web surveys cannot always be confirmed in empirical studies. Gosling, Vazire, Srivastava, and John (2004) found that the large Internet samples in their research were relatively diverse, that web survey results were not adversely affected by non-serious or repeat responders, and that they were consistent with findings from traditional methods. They added that although online web-based surveys are expensive to develop and specialized software is needed, when the reached population sample is large enough, the associated cost per contact will be low, thus offsetting the high initial development costs and creating a competitive advantage for this method over others.

2.2. Virtual communities

Virtual communities have been the object of research since their emergence. Major research topics on them include social networking (Snyder et al., 2007), reasons for (Dholakia et al., 2004; Wang & Fesenmaier, 2004) and consequences from (Kim et al., 2004) participating in VCs. The links between VC members may be maintained via a common mailing list (listserv), a blog or an e-group on a website like Facebook.com or MySpace.com.

VCs are used to share information among their members with common interests (Chalkiti & Sigala, 2008; Schmitz-Justen & Wilhelm, 2006, 2007). However, the process of information sharing is influenced by the trust each member has in the virtual community itself and in its individual members (Laat, 2005; Usoro, Sharratt, Tsui, & Shekhar, 2007). The Internet provides anonymity and a person can create different virtual identities and present him/herself as another person (for further discussion see Donath, 1998). In open access VCs, this is fairly easy. However, in communities requiring an invitation to join and/or membership approved and regulated by an administrator, the possibility of a fake identity of a member decreases, yet is not completely eliminated.

In the field of social sciences, web- and e-mail-based research communities (e.g., TRINet, Atlas, EuroCHRIE, ELMAR, all of which have regulated access) facilitate knowledge dissemination and the exchange of ideas among members (see also de Vries & Kommers, 2004), although recent research concludes that Internet use increases the international cooperation and productivity of academics only marginally (Sooryamoorthy & Shrum, 2007).

2.3. Employing virtual communities in social science research

VCs have recently been adopted in social studies research. Thomas et al. (2007) examined a discussion among community members with the potential to provide marketers with insights by making a content analysis of discussion forum messages in a fashion and style VC on MySpace.com. Casaló, Flavián, and Guinalíu (2007) performed a web survey targeting members of several free software VCs. Their results showed that participation in the activities carried out in a VC might foster consumer trust and loyalty to the mutual interest of the community.

In the field of tourism, Kim et al. (2004) distributed online questionnaires among Korean members of established, travel-related online VCs on portal sites in order to determine whether loyalty to an online VC would lead members to purchase products. The authors reported an effective response rate of 29% and gave practical recommendations about the features of a VC site on the company homepage which could lead to increased customer loyalty (p. 353). Chalkiti and Sigala (2008) examined the postings on the DIALOGOI virtual community of the Association of Greek Tourism Enterprises and distributed questionnaires to its members. They concluded that the VC promoted information sharing and idea generation, and that geographically dispersed members working for different sectors managed to communicate asynchronously, thus both initiating a social network and yielding usable information that could develop into knowledge once applied in a business context.

The papers above showed that VCs may successfully be used in social studies research. They generated relatively moderate response rates and usable research results. However, the current paper reveals that VCs cannot always be relied upon in research.

3. Methodology

The authors' initial purpose for the research was to identify features that make US highway maps usable by auto travellers. They identified two populations for study – tourism research academics and all auto travellers in the United States. They wanted to discover any significant differences between the opinions of auto travellers and members of the tourism academic community. An online web-based survey was chosen as the research tool in the interest of economy of time, ease of communicating the instrument, and reaching the largest possible number of respondents.

A questionnaire was created based on an inventory of 120 specific characteristics and items grouped in 13 categories (see Appendix 1). A five-point Likert scale was used to measure the level of importance of each highway map feature/item, with '1' denoting not important and '5' very important. The survey instrument was linked with inQsit (v10.12.0) software and uploaded to the website of the first author's university. The software allowed responses to be registered and primary data to be analysed (mean, standard deviation). It also recorded each opening of the survey link, providing the study with information about the date the survey was opened and the time spent completing the questionnaire. This allowed the use of time as a separator between the two samples of academics and auto travellers (see below in text).


Although the software showed the ID of the computer used to open and complete the questionnaire, respondents remained completely anonymous. On average, it took about 8–10 min for each respondent to complete the process.
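The time-based separation of the two samples can be illustrated with a minimal sketch. The field names, export format and cutoff date below are hypothetical (the paper does not describe inQsit's log format): openings recorded before the date the link was released to the second wave are attributed to the academic sample, later openings to the auto-traveller sample.

from datetime import datetime

# Hypothetical cutoff: the date the link was first posted to the traveller VCs.
CUTOFF = datetime(2008, 1, 15)

# Illustrative records of the kind the survey software logged.
responses = [
    {"computer_id": "A17", "opened_at": datetime(2007, 11, 20, 9, 30), "completed": True},
    {"computer_id": "B02", "opened_at": datetime(2008, 2, 3, 18, 5), "completed": False},
]

academics = [r for r in responses if r["opened_at"] < CUTOFF]
auto_travellers = [r for r in responses if r["opened_at"] >= CUTOFF]

print(len(academics), len(auto_travellers))  # -> 1 1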

To reach the population of academics, the link to the instrument was sent via the VCs' mailing lists in November 2007 (see Table 1) and to one tourism research-related blog (http://www.pslplan.co.uk/blog/?p=278). The selected VCs represent the main web-based communities of academics in the field of tourism studies – TRINet (Tourism Research Information Network), ATLAS (Association for Tourism and Leisure Education), and EuroCHRIE (European Council on Hotel, Restaurant & Institutional Education). Geographical and tourist maps can be used as a tourism promotional tool; in this regard, ELMAR (Electronic Marketing Academic Community of the American Marketing Association) was also used as a virtual community to reach marketing academics. Table 1 shows that the link to the survey form was sent to 7665 recipients but, due to double-counting (some academics are members of two or more of these communities), it likely reached a much smaller number of individual academics.

A disadvantage of using multiple VCs in the same field is that one cannot be sure about the exact number of questionnaires distributed. From a theoretical point of view, the total number of distributed questionnaires, when using multiple VCs, will not be lower than the number of members of the largest VC used for the survey (if all members of the other VCs are also members of this community), and not higher than the total number of members of all the VCs without adjusting for double-counting (if each member belongs to only one community). In practice, the total number of distributed questionnaires will be an uncertain number somewhere between these two figures. Therefore, it may be said that the number of contacted academics was between 5484 (the largest community – ELMAR) and 7665 (the total membership of all four communities).
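These bounds follow directly from the Table 1 membership counts; a minimal sketch of the arithmetic, assuming nothing beyond the figures reported in Table 1:

# Lower bound: every other member also belongs to the largest community (ELMAR).
# Upper bound: no member belongs to more than one community.
memberships = {"TRINet": 891, "Atlas": 1128, "EuroCHRIE": 162, "ELMAR": 5484}

lower_bound = max(memberships.values())
upper_bound = sum(memberships.values())

print(lower_bound, upper_bound)  # -> 5484 7665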

To reach the second population (US auto travellers), the American Automobile Association (AAA) was invited to include the link to the online questionnaire in an online newsletter to its members in exchange for the complete results of the survey. AAA was the most suitable and natural choice as it has more than 50 million members (AAA's website: http://www.aaa.com). Through this distribution channel the questionnaire might potentially reach one-sixth of the US population, all of whom are auto travellers. Using the AAA newsletter might also increase member trust in the survey and the subsequent response rate, since the distributor of the survey would be a respected institution. Even a response rate as low as 0.1% would generate nearly 50,000 completed questionnaires (many times more than the questionnaires completed in other tourism-related studies), and therefore overcome the drawbacks of web-based surveys identified by Jansen et al. (2006).

The American Automobile Association refused to include the link to the survey form in a newsletter, thus leading to two challenges: first, finding alternative ways to reach auto travellers and, second, the need to enlarge the population to represent all auto travellers regardless of country of residence. Without the help of AAA, the academics could not restrict the sample to US citizens either. Financial constraints would not permit respondent identification, face-to-face contact or postal mail to potential respondents from the redefined population. Thus it was decided to seek alternative means of online surveying. The authors next contacted an anonymous website manager for another nationwide association in the US that likely represented a large number of cross-country automobile vacationers, the American Association of Retired Persons (AARP). Initially, the idea was well received. However, it took more than two months for the Missouri State University human subjects committee to approve the study. When the authors were finally permitted by the University to post the link, a second anonymous website manager deleted the link.

Table 1
Scientific virtual communities used in the survey.

Virtual community          Number of members
TRINet                     891*
Atlas                      1128**
EuroCHRIE                  162*
ELMAR                      5484**
Total number of members    7665

* Number of members at date of posting.
** Number of members on 23rd March 2008.

Two weeks after the online survey was opened for the last time by the academics, the link to the survey was sent to other virtual communities of auto travellers, seasoned travellers, people who have travelled in the past, recreation-related travellers, and map collectors, listed in Table 2. Between 7 and 14 days after the first posting of the link, a second posting was added on the wallboards of some of these communities as a reminder, because the number of contacts had been identified as one of the factors that positively influences response rates in web-based surveys (Cook, Heath, & Thompson, 2000). If the first posting about the survey was still visible on the wallboard of the VC, the academics did not upload a second posting.

Table 2 presents the list of VCs and their respective membership numbers at the time of the first posting.

Table 2 shows that the link to the survey form was sent to between 20,408 and 78,389 members of VCs. Research-related groups in Facebook.com and MySpace.com were deliberately avoided as academics had already been contacted in the previous stage of the research. Sending them the questionnaire again would not allow discrimination between the responses of academics and the ordinary auto travellers.

4. Results and discussion

Table 3 summarizes the response rates of both academics and ordinary auto travellers. The results are calculated considering the lowest and highest possible number of distributed questionnaires for each of the two samples.
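The percentages in Table 3 follow directly from the raw counts of contacted, opened and completed questionnaires. A minimal sketch of the calculation, using only the counts reported in Table 3 (the printed percentages match this output to within rounding):

samples = {
    "academics, minimum contacted": (5484, 149, 35),
    "academics, maximum contacted": (7665, 149, 35),
    "auto travellers, minimum contacted": (20408, 121, 30),
    "auto travellers, maximum contacted": (78389, 121, 30),
}

for name, (contacted, opened, completed) in samples.items():
    print(name,
          f"opened/contacted = {opened / contacted:.2%},",
          f"completed/opened = {completed / opened:.2%},",
          f"completed/contacted = {completed / contacted:.2%}")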

Results can be summarized as follows:

1. The opening rates of the survey were extremely low – only approximately 2% of contacted tourism academics and 0.5% of auto travellers even opened the questionnaire.

2. Very low total response rates – only about one quarter of those who opened the survey completed it, which corresponds to 0.45–0.64% of the approached academics and between 0.04% and 0.15% of the approached auto travellers in their respective populations.

3. Both academics and ordinary auto travellers showed similar response rates calculated as percentages of opened questionnaires – 23.49% and 24.79%. Ordinary auto travellers even had a small advantage in this statistic.

4. 31.4% of all questionnaires completed by academics (11 out of 35), but only 6.7% of those completed by members of the second sample of ordinary auto travellers (2 out of 30), contained missing data, thus making them useless for statistical analysis.

5. Some questionnaires were completed from one and the same computer terminal (the IDs for these responses were the same), implying that probably the same person responded more than once or that a terminal with public access was used (e.g., in a library).

Table 3
Response rates of the web survey.

                                                      Academics             Ordinary auto travellers
                                                      Minimum    Maximum    Minimum    Maximum
Total number contacted                                5484       7665       20,408     78,389
Number of opened questionnaires                       149        149        121        121
Opened questionnaires as % of total contacted         2.72%      1.94%      0.59%      0.15%
Number of completed questionnaires                    35         35         30         30
Completed questionnaires as % of opened               23.49%     23.49%     24.79%     24.79%
Completed questionnaires as % of total contacted      0.64%      0.45%      0.15%      0.04%

Table 2
Non-scientific virtual communities used in the survey.

Virtual community                                                 Number of members at time of posting
Facebook.com groups:                                              76,681
  Cheap, Flexible & Easy. STA Travel's Wall                       20,408
  Addicted to Travelling!                                         8462
  the Official rules for the yellow car game                      8321
  Addicted to rock climbing                                       6100
  Explore the World                                               5251
  I wanna travel the world                                        5192
  I loooovvvvee to travel!!                                       2041
  All I want to do is Travel, and maybe live in Europe one day!   1336
  For The Love Of Travel                                          1235
  Travel Deals & Destinations                                     1214
  Get Paid to Travel!                                             1126
  Mountaineering | Mountain Climbing | Alpine Trekking, Backpacking & Hiking   1090
  Let's skip school to travel!                                    1080
  Holiday Travel & Tourism                                        996
  I Love Camping                                                  985
  A trip down Route 360, Is a trip home for me.                   956
  Travel Lovers                                                   764
  The Moose Travel Network!                                       716
  i LOVE completely random road trips                             688
  CSTV SEC ROAD TRIP                                              652
  A.D.I.D.A.T. – All Day I Dream About Travelling                 616
  Tallahassee Road Trips (FAMU, FSU, TCC)                         559
  Work and Travel                                                 517
  Trip of a Lifetime.                                             508
  PASSPORT: Youth Camping with a Mission                          500
  Im going travelling.HIGH FIVE                                   459
  Travelling Addicts                                              458
  Hiking/Camping in Nova Scotia                                   439
  Entertainment in Transit                                        424
  THE THIRSTY TRAVELER!                                           410
  I'm a Thirsty Traveler!                                         405
  National Trust for Historic Preservation                        396
  A trip down highway 410 is a trip home for me                   392
  Hospitality & Tourism Club                                      366
  Travel Masters (for travel & tourism)                           359
  Rough Guides                                                    328
  If it ain't a Landrover it ain't worth driving.                 237
  Gravel Travel and Offroading                                    225
  Sustainable Destinations                                        121
  2007 Evangelism Road Trip                                       104
  Eco tourism forum and sustainable tourism debate                103
  Road Trip                                                       71
  Road Trip/Funny/Embarrassing Stories                            71
MySpace.com groups:                                               1708
  Road Trip Nation                                                1399
  On The Road                                                     309
Total number of members                                           78,389

While one may expect ordinary people to ignore scientific studies, one would hope the related research community would not. Low opening and response rates may be attributed to the following reasons:

1) Incorrect virtual communities selected for questionnaire distribution. It is possible that the VCs chosen for the research were not appropriate. While it may be true that some of the selected VCs unite people with completely dispersed interests, most of them were dedicated to "travel" (in general) and maps.

2) Information overload. Today, Internet users may receive countless messages from the surrounding environment and, through the perceptual process of selective attention, they may screen out most of them (Kotler, Armstrong, Saunders, & Wong, 2002: 208). Possibly, the message about the highway maps survey was not even noticed by the potential respondents. A person may be a member of tens of groups in Facebook.com and/or MySpace.com and simply may not have the discretionary time to read all messages posted.

3) Irregular use of the virtual community account. Another possible explanation of the low opening rate could be that members of VCs do not regularly check the messages on their wallboards. de Valck, Langerak, Verhoef, and Verlegh (2007) recently showed that a member's satisfaction with a VC has a positive effect on his/her frequency of visits. If the member of the VC is not motivated to visit the community website, the probability of detecting a message on its wallboard may plummet.

4) Organisation of message distribution. Posting a message on the wallboard of a Facebook.com or MySpace.com group does not result in an automatic e-mail notification to the particular group's members. It is just the opposite situation for the scientific communities – Atlas and TRINet are e-mail-based (although a Facebook.com version of TRINet was created recently), while EuroCHRIE and ELMAR are web-based but posted messages are duplicated via e-mail, which assures that community members receive the message. This may be one of the reasons for the higher opening rates of the academic sample compared to those of the auto traveller sample.

5) Lack of interest. Perhaps the topic was not of interest to members of the contacted VCs. Higher opening and response rates might have been generated had the survey focused on a more popular ("hot") topic such as tourism and climate change, terrorism, tourism and poverty alleviation, or sustainability.

6) Length of questionnaire. A trade-off may have occurred between the length of the questionnaire (number of questions, time necessary for completion) and the number of responses. Potential respondents may have perceived the questionnaire as too time-consuming; it took approximately 8–10 min to complete, and this was explicitly mentioned in the sentences introducing the questionnaire for the academics. However, this factor may only explain the low response rates, not the low opening rates, because the time needed to complete the questionnaire was not mentioned in most of the messages posted on the wallboards of the VCs.

7) Ease of refusing to participate in the survey. Web-based surveys rely greatly on respondents' initiative to participate in them, but the Internet mode of survey makes it easy to refuse. A potential respondent could simply delete the message received, not clicking on the link provided, with no explanation needed. Mail-based surveys are more engaging, as the potential respondent receives a (personal) printed questionnaire at a specific physical address, including a prepared and prepaid envelope for returning the completed form.

8) Mistrust/fear of breach of anonymity. Perhaps some people mistrust online surveys or fear that their identity may be revealed, although reality is just the opposite. As mentioned above, although the software showed the ID of the computer used to open and complete the questionnaire, respondents remained completely anonymous.

Other problems encountered during the survey and specific recommendations for how to solve them in future surveys include:

• Impossibility of identifying the exact number of contacted people, for the following reasons:

- Double-counting – Many people have membership in more than one VC. It is possible that some potential respondents had been contacted more than once through different VCs. This problem arises from the fact that several VCs were used in the study. If only one community had been used in the research (as in Chalkiti & Sigala, 2008), the problem of double-counting would not have occurred.

- Dynamic nature of VC membership – If research involves a few small communities on Facebook.com or MySpace.com (e.g., with a combined total membership of not more than 500 members), it is possible to overcome the double-counting problem by checking the membership roster for each community surveyed. However, membership in a particular VC may change over time – new people join, others leave the group. This, along with the enormous number of members of some VCs chosen for this study (see Table 2), poses a serious challenge for a membership check.

- Confidentiality of contact details of members in e-mail-based VCs – It is impossible to check the identity of members in a VC, thus blocking the researcher's ability to know with certainty whether the same respondent completed the process more than once.

The impossibility of identifying the exact number of contacted people when using multiple VCs does not allow the calculation of reliable response rates. A possible solution may be to send the link to the web-based questionnaire to only one very large community, to several communities using time as a separator between samples, or to create a dedicated VC in Facebook.com or MySpace.com.

• A researcher cannot check the validity of demographic data provided by the respondent. In face-to-face surveys the interviewer can visually check some of the provided data (gender, age). However, this is not possible in web-based surveys of unknown members of VCs. A person can also create a false profile in the virtual community, which further hinders the demographic data check. Unfortunately, this drawback of a web-based survey cannot be overcome.

• The researcher cannot react or follow up on incomplete questionnaires as in a face-to-face survey. Omitted responses decrease the validity of subsequent statistical analysis. In face-to-face surveys, the interviewer can immediately follow up after the interview if a question has not been completely answered. In e-mail-based surveys, the researcher can send an e-mail message to a respondent to clarify an answer. These possibilities remain out of reach for the web-based surveyor as respondents remain anonymous. The absolute number of incomplete questionnaires is not as important as the percentage of returned questionnaires that are fully completed. If a large proportion of the returned questionnaires is incomplete, the researcher must then ask him/herself about the data collection design and the questionnaire. The incompleteness may be a result of poor wording of questions or of the length and time required to complete the questionnaire, thus discouraging respondents from answering all of the questions.

In order to overcome the above-mentioned problem, a contacts section could be added to the questionnaire (to include an e-mail address, for example), but there is no guarantee that respondents will complete this section or that they will provide valid e-mail addresses.

• Lack of knowledge/opinion on specific questions can also be a reason for non-response. In a web-based survey, the percentage of incomplete questionnaires returned can be reduced to zero if successful submission requires the respondent to answer all questions; software running in the Internet browser can ensure that this occurs. This strategy may lead to a higher quality of responses at the expense of a lower overall response rate.

• Posting e-mails or messages to members of an online community may require prior permission. For this study, the American Automobile Association (AAA) refused to send the link to the online survey to its members with no explanation, though it had initially been approved by one company representative. The survey link uploaded on the website of the American Association of Retired Persons (AARP) was probably interpreted as a sales promotion instrument and subsequently removed from the website, without a representative of AARP ever communicating this decision directly to the academics. While it may be argued that examining the quality of highway maps could be a sales promotion, experience shows that when prior permission is required, an organisation's decision to reject or not distribute a questionnaire may be arbitrary. Therefore, an intended population may not be reached, or only a few of its members may be contacted before the decision to reject/remove the link. This will definitely lead to a biased sample. In this regard, a good relationship with, and a detailed explanation to, the VC administrator about the goals, methodology, confidentiality and usage of final results becomes crucial to the success of a web-based survey, and only one administrator must exist. Unfortunately, it is impossible to know the complete story behind the supervision of a randomly chosen VC. Virtual community administrators often seem to do their best to retain anonymity, probably to discourage high levels of incoming messages.

• Several responses may be detected from the same computer ID. It may be assumed that a library or public access computer was used, but this can never be certain. This issue may never be resolved or controlled, though academics may attempt to do so by instruction. However, the authors considered that the probability of multiple responses was low because of the time-consuming nature of the survey – it took 8–10 min to complete.

5. Conclusion

Unlike other tourism studies involving virtual communities that report response rates higher than 20% (Kim et al., 2004), the survey for this study yielded many fewer completed questionnaires. As the previous discussion reveals, the results can be attributed to objective (survey-related) and subjective (respondent-related) factors including questionnaire design, the organisation of questionnaire distribution, specific characteristics of web-based surveys, and the selection of VCs. As subjective reasons for the low response rates, the authors identified virtual community members' interest in the topic, frequency of visits to VC websites, information overload and mistrust in the research. All these factors may play a significant role in response rate, although future research should examine their degree of influence in greater detail.


Conducting scientific research through websites using surveys or questionnaires has become more and more popular among academics. In the tourism field, surveys are one of the major methods academics use to collect data. Online surveys offer several advantages compared to traditional paper-and-pencil surveys or face-to-face interviews. They are convenient for survey participants, and researchers can administer them more easily. Survey participants do not have to travel to certain locations to participate; they can take part in a survey in the comfort of their home or office. Online surveys can also reach much more geographically diverse regions than traditional surveys can. There are no mailing expenses required for either participants or academics.

On the other hand, to increase the response rate and obtain reliable and valid results from online surveys, great effort is required. The survey needs to be carefully designed, and the length and number of open-ended questions should be limited. On the one hand, it is important to include all the necessary questions in order to collect the information needed. On the other hand, it is important to control the length of the questionnaire, as longer surveys may yield a lower response rate.

To control the validity of the demographic data provided by respondents, it is important to post survey links to relatively closed virtual communities, such as listservs for people who belong to certain organizations. Posting links on open access websites, such as popular social networking locations like Facebook.com or MySpace.com, makes it extremely difficult for the researcher to control the source of the information. In addition, potential respondents may confuse scientific research surveys with commercial surveys and refuse to participate. People are swamped with information on the Internet nowadays, and it is easy for them to ignore a survey in their mailbox or one posted online to a group to which they belong. To convince people to participate, a survey must be recognized as credible.

Good relationships with professional organizations are important. Online surveys often require mass e-mailing to people who belong to certain organizations. Assistance from professional organizations will definitely help to obtain a sample.

Surveyors may consider offering incentives, which are proven to be an effective method of increasing response rates. However, survey participants must relinquish their anonymity in order to enter a contest. Still, if participants are convinced that a survey is for scientific purposes, they may be more likely to participate even if they cannot remain anonymous. Although anonymity cannot be guaranteed, confidentiality can be offered to survey participants for assurance.

Posting surveys on open access websites such as Facebook.com or MySpace.com can increase the exposure of a survey to potential respondents because of the large amount of traffic these websites are assumed to generate. However, this study showed that with these means it is very hard to control who responds, and the survey actually produced very low participation. Future research could compare response patterns between such websites and listservs.

Appendix 1. Supplementary data

Supplementary data associated with this article can be found in the online version at doi:10.1016/j.tourman.2009.03.012.

References

Casaló, L., Flavián, C., & Guinalíu, M. (2007). The impact of participation in virtual brand communities on consumer trust and loyalty: the case of free software. Online Information Review, 31(6), 775–792.

Chalkiti, K., & Sigala, M. (2008). Information sharing and idea generation in peer to peer online communities: the case of "DIALOGOI". Journal of Vacation Marketing, 14(2), 121–132.

Converse, P. D., Wolfe, E. W., Huang, X., & Oswald, F. L. (2008). Response rates for mixed-mode surveys using mail and e-mail/web. American Journal of Evaluation, 29(1), 99–107.

Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or Internet-based surveys. Educational and Psychological Measurement, 60(6), 821–836.

Dholakia, U. M., Bagozzi, R. P., & Pearo, L. K. (2004). A social influence model of consumer participation in network- and small-group-based virtual communities. International Journal of Research in Marketing, 21(3), 241–263.

Donath, J. S. (1998). Identity and deception in the virtual community. In P. Kollock, & M. Smith (Eds.), Communities in cyberspace (pp. 27–57). London: Routledge.

Duffy, M. E. (2002). Methodological issues in web-based research. Journal of Nursing Scholarship, 34(1), 83–88.

Fricker, R. D., & Schonlau, M. (2002). Advantages and disadvantages of Internet research surveys: evidence from the literature. Field Methods, 14(4), 347–367.

Gosling, S. D., Vazire, S., Srivastava, S., & John, O. P. (2004). Should we trust web-based studies? A comparative analysis of six preconceptions about Internet questionnaires. American Psychologist, 59(2), 93–104.

Jansen, K. J., Corley, K. G., & Jansen, B. J. (2006). E-survey methodology. In R. A. Reynolds, R. Woods, & J. D. Baker (Eds.), Handbook of research on electronic surveys and measurements (pp. 1–8). Information Science Reference.

Kardaras, D. K., & Karakostas, B. (2007). Exploring the potential of virtual communities as a business model in banking: the customers' view. International Journal of Web Based Communities, 3(3), 316–331.

Kiernan, N. E., Kiernan, M., Oyler, M. A., & Gilles, C. (2005). Is a web survey as effective as a mail survey? A field experiment among computer users. American Journal of Evaluation, 26(2), 245–252.

Kim, W. G., Lee, C., & Hiemstra, S. J. (2004). Effects of an online virtual community on customer loyalty and travel product purchases. Tourism Management, 25(3), 343–355.

Kotler, P., Armstrong, G., Saunders, J., & Wong, V. (2002). Principles of marketing (3rd ed.). Harlow: Prentice Hall.

Laat, P. (2005). Trusting virtual trust. Ethics and Information Technology, 7(3), 167–180.

Porter, C. E. (2004). A typology of virtual communities: a multi-disciplinary foundation for future research. Journal of Computer-Mediated Communication, 10(1). Retrieved on 13th January 2008 from http://jcmc.indiana.edu/vol10/issue1/porter.html.

Rothaermel, F. T., & Sugiyama, S. (2001). Virtual Internet communities and commercial success: individual and community-level theory grounded in the atypical case of TimeZone.com. Journal of Management, 27(3), 297–312.

Schmitz-Justen, F. J., & Wilhelm, A. F. X. (2006). An empirical study of factors impacting on knowledge processes in online forums: factors of interest and model outline. International Journal of Web Based Communities, 2(3), 318–338.

Schmitz-Justen, F. J., & Wilhelm, A. F. X. (2007). An empirical study of factors impacting on knowledge processes in online forums: structural equation modelling analysis and results. International Journal of Web Based Communities, 3(3), 252–270.

Snyder, J., Carpenter, D., & Slauson, G. J. (2007). MySpace.com – a social networking site and social contract theory. Information Systems Education Journal, 5(2). Retrieved on 14th May 2008 from http://isedj.org/5/2/ISEDJ.5(2).Snyder.pdf.

Sooryamoorthy, R., & Shrum, W. (2007). Does the Internet promote collaboration and productivity? Evidence from the scientific community in South Africa. Journal of Computer-Mediated Communication, 12(2), Article 20. Retrieved on 23rd March 2008 from http://jcmc.indiana.edu/vol12/issue2/sooryamoorthy.html.

Thomas, J. B., Peters, C. O., & Tolson, H. (2007). An exploratory investigation of the virtual community MySpace.com: what are consumers saying about fashion? Journal of Fashion Marketing and Management, 11(4), 587–603.

Trochim, W. M. (2006). The research methods knowledge base (2nd ed.). Retrieved on 14th January 2009 from http://www.socialresearchmethods.net/kb/.

Truell, A. D., Bartlett, J. E., & Alexander, M. W. (2002). Response rate, speed, and completeness: a comparison of Internet-based and mail surveys. Behavior Research Methods, Instruments, & Computers, 34(1), 46–49.

Usoro, A., Sharratt, M. W., Tsui, E., & Shekhar, S. (2007). Trust as an antecedent to knowledge sharing in virtual communities of practice. Knowledge Management Research & Practice, 5(3), 199–212.

de Valck, K., Langerak, F., Verhoef, P. C., & Verlegh, P. W. J. (2007). Satisfaction with virtual communities of interest: effect on members' visit frequency. British Journal of Management, 18(3), 241–256.

de Vries, S., & Kommers, P. (2004). Online knowledge communities: future trends and research issues. International Journal of Web Based Communities, 1(1), 115–123.

Wang, Y., & Fesenmaier, D. R. (2004). Modeling participation in an online travel community. Journal of Travel Research, 42(3), 261–270.

Wang, Y., Yu, Q., & Fesenmaier, D. R. (2002). Defining the virtual tourist community: implications for tourism marketing. Tourism Management, 23(4), 407–417.

Zhang, Y., Wang, C. C., & Chen, J. Q. (2001). Chinese online consumers' responses to web-based data collection efforts: a comparison with American online consumers. Journal of Database Marketing, 8(4), 360–369.