
Personal information privacy and emerging technologies



Sue Conger,* Joanne H. Pratt† & Karen D. Loch‡

*University of Dallas, Irving, USA, email: [email protected], †Joanne H. Pratt Associates, Dallas, USA, email: [email protected], and ‡Georgia State University, Atlanta, USA, email: [email protected]

doi:10.1111/j.1365-2575.2012.00402.x

Abstract. This research presents a model of personal information privacy (PIP) that includes not only transactional data gathering, but also interorganisational data sharing. Emerging technologies are used as a lens through which the discussion of PIP management is extended. Research directions are developed for aspects of privacy, privacy-preserving technologies, interorganisational data sharing and policy development.

Keywords: personal information privacy, data sharing, data leakage, data integration, data collection, emerging technologies

INTRODUCTION

Current technologies enable data collection and integration on a scale previously unimagined, with both benefits and unintended consequences (Kontzer & Greenemeier, 2006). One unintended consequence is that daily we read of some new loss of individuals' organisation-held personal information. As data losses amass, the realisation that personal information privacy (PIP) is no longer manageable by individuals becomes clearer. Research to date proposes that PIP is primarily the responsibility of individuals (cf. Smith et al., 1996); this legacy interpretation of the Warren & Brandeis (1890) opinion posits that privacy is the right to be left alone and that it is the individual's responsibility to maintain that right. However, privacy as a concept is not that simple. Privacy is contextually based, a tapestry of dynamic interrelationships balancing between personal needs and wants and societal needs and wants (Mason et al., 2005; Smith et al., 2011).

Westin's (1970) definition of privacy as the right to define for oneself when, how and to what extent information is released gives a perspective that more closely fits today's reality. Thus, PIP relates to information an individual wishes to keep private but not to how that information is managed. The individual trades private information about himself as a kind of currency in exchange for anticipated goods and services, such as online banking (Mason et al., 2005; Smith et al., 2011). Problems arise when personal information is stolen, co-opted without permission, or otherwise compromised. Two research questions are addressed by this research. The first is whether current privacy conceptualisations accommodate modern interorganisational relationships. The second is the extent to which emerging technologies shift privacy concerns into new areas. This research is important because privacy preservation requires knowledge of all parties having access to personal information, and new issues need to be addressed as they are found, ideally before new technology is fully commercialised.

Globally, privacy law is not a settled issue. The darkest areas of Figure 1 are parts of the world currently covered by privacy laws; the white areas are countries not governed by general privacy laws (e.g. USA, most of Africa, most of Asia and parts of Latin America); the grey areas show countries that have legal discussions ongoing (e.g. the Middle East, India and parts of South America).

Figure 1. Global privacy laws reach less than half of the world population (Banisar, 2011).

The European Union (EU) and the Asia-Pacific Economic Cooperation (APEC) take a comprehensive view that individual privacy is everyone's responsibility such that all vendors/providers are responsible for its safeguarding (EU, 1995; APEC, 2005). In contrast, the USA has a patchwork of privacy laws that seek to protect the individual in specific situations (Solove, 2004). Regardless of privacy laws, there is a need for better controls to preserve privacy (Smith, 2004). A problem with the individual responsibility perspective of privacy is that understanding relationships between consumers and vendors, while paramount, has not helped to stem data breaches and excludes many issues (cf. DatalossDB.org, 2011).

In the next sections, we summarise PIP research. Then, we present an expanded model of PIP that includes vendor/provider interactions to accommodate the full scope of data sharing and data breaches. Then, we discuss emerging technologies that exemplify current and nascent privacy issues. Finally, we discuss areas for further research.

PRIOR VIEWS OF PIP

Three overlapping eras of privacy research are described in this section. Research up to the late 1990s constituted early privacy research. When Web transaction technology matured in 1998, it fostered further development of privacy research. Modern privacy research includes transactional interrelationships in social networking and other milieus.

Past PIP research

Early research through the 1990s sought to determine the scope of the privacy problem and an appropriate frame for addressing privacy issues (Culnan, 1993; Loch & Conger, 1996; Smith et al., 2011). Privacy, at a minimum, concerns data collection, secondary use, ownership, accuracy and access (Culnan, 1993; Loch & Conger, 1996; Smith, 2004). Privacy research during this period focused on information ownership by the consumer, disclosure of uses, the need to prevent secondary uses and opt-in vs. opt-out issues (cf. EU, 1995; Dillon & Torkzadeh, 2006). Further, most research and past directives implicitly assumed that the types of data gathered were limited to demographic (e.g. name, address, date of birth) and transaction data (cf. Cheung et al., 2005; Smith et al., 2011). Thus, the early research perspective on personal information concentrates on vendor/provider data usage and access practices, seeking to articulate the issues relating to PIP. This body of research had few references to internet or general data-gathering practices, which were still developing.

Web transaction capabilities matured around 1998. Post-Web maturity PIP research shifted focus to Web transactions that provided new consumer benefits but generated more data, used different methods of data collection and gave data indefinite life (Wright et al., 2008). The research confirms that the Web enables novel methods of obtaining and integrating information on individuals, many of which are unrelated to transactions between consumers and vendors/providers. Issues relating to internet privacy include where and how information is collected, whether or not the collection is known to the consumer, vendor/provider trust, the life and breadth of information collected, perceived risks of information sharing, information use and corporate privacy policies (McKnight et al., 2004; Dinev & Hart, 2006; Drennan et al., 2006). In addition, when specifically referencing internet purchase transactions, the research includes characteristics of consumer, product, medium, merchant and environment (McKnight et al., 2004). Further, these issues apply to every emerging technology as ways to transgress privacy evolve.

Current privacy research includes social networking but does not identify the data collected as part of interactions (cf. McKnight et al., 2004). Some research has begun to evaluate the effects of interorganisational data sharing (Wakefield & Whitten, 2006). However, when multiple parties to transaction data are considered, they are classified as 'secondary use' (cf. Wakefield & Whitten, 2006). Further, technology-enabled information fungibility changes many PIP issues, including access and abuse of data that are beyond the individual's management ability (Wright et al., 2008).


Assessment of past privacy research

Several current practices are missing from privacy research to date. PIP models currently relate a number of factors to the individual who executes a decision calculus to develop sufficient trust to transact, thus sharing personal information with a vendor (Culnan & Armstrong, 1999; Cheung et al., 2005). If the individual is the first party to the transaction, current models of PIP stop at the second party vendor/provider, assuming that privacy protection lies in vendor/provider data management. That would be an incorrect assumption (Culnan, 2000).

Data shared between second parties and others also carry implied assumptions that there is limited collection, limited distribution and limited life (Belanger & Crossler, 2011; Tsai et al., 2011). None of these assumptions appears to be valid at present. Web identification, click streams, PC files and more are collected by unknown parties (Bucklin & Sismeiro, 2009). Once data leave an individual, they take on a life of their own, passing to the second party of the transaction, to credit companies, the government and any number of data integrators. If breached, the data are shared with fourth parties as well. As a result of this unknown breadth of data sharing, data life appears to be indefinite, as there is at present no attempt to limit data life in most situations.

As new technologies mature, the issues above will be exacerbated. PIP issues require a new way of viewing privacy in the context of different types of transactions to better manage the new data collected, data-sharing partnerships and personal information life cycles.

AN EXPANDED MODEL OF INFORMATION PRIVACY

The current conceptualisation of Web transactions as limited to a consumer and a business organisation no longer fits the transaction and information sharing milieu. This section presents a model for thinking about individual–organisation interactions that accommodates all parties to an interaction, thus opening privacy discussions to include networks of relationships.

First parties: individuals

Figure 2 depicts all of the parties to transactions. The lower part of Figure 2 up to the second party vendor/provider incorporates the research summarised in Culnan & Armstrong (1999) and Cheung et al. (2005). The first party is the individual with his personal information. The second party is a vendor/provider with whom the first party engages in a transaction to obtain benefits. In the course of the transaction, the first party gives permission to the second party to collect personal data. Thus, the second party is known to the first party.

Part of the individual's decision implicitly includes which data to share with the second party based on expectations relating to benefits, the data collected, expected life and uses of the collected data (Awad & Krishnan, 2006). Ultimately, the first party perceives that the data requested are reasonable and relevant to the transaction (Cheung et al., 2005).

Perceived reasonableness of data is a key construct in the decision calculus. The decision calculus results in an assessment of trust and risk, to either consummate or cancel the transaction and, if consummated, which data to share and the sharing duration (Dinev & Hart, 2006; Pratt & Conger, 2009). The provision of personal data as part of a transaction is a trade-off in which the individual gives up some amount of privacy in order to obtain benefits (Pratt & Conger, 2009). However, the decision calculus is likely to change if individuals become aware of second party interorganisational data sharing, or collection of click streams or other identifying information (Tsai et al., 2011).

Second parties: vendors/providers

Vendors/providers offer goods and services in exchange for money and information about the purchaser. Although the individual transacts for a service or product, he assumes that data collected relate only to the business transaction, including name, address, contact, transaction and payment information (cf. Culnan & Armstrong, 1999; Cheung et al., 2005). Yet, vendors collect data before, during and after an actual business transaction, combining these data with other transactional and post-transactional data and enabling the building of a consumption history for individuals and families (cf. Hitlist Analytics software, Marketwave.com, 2011). Combined with the other purchase information, a decision history for individuals and households might be used, for example, for discriminatory practices such as denial of insurance (Mason et al., 2005). It is important to distinguish an individual's history from a profile. A profile infers behaviour based on psychographic and demographic trends, whereas history records facts that disclose life activities.

Figure 2. Expanded privacy model.


Consumers appear ignorant of corporate privacy policies and rely on organisations that vouch for the trustworthiness of vendors/providers (McKnight et al., 2004). Further, consumers may not be aware that once a transaction is completed, transaction records are automatically shared with third party partners such as credit bureaus. Once shared, the data have a life of their own, meaning that they are no longer controlled by their first and second party donors and their further use may be unknown and unknowable.

Third parties: data-sharing partners

After a transaction is complete, the vendor/provider shares information with third parties. Some third parties are legal external data-sharing entities that collect and integrate data for government-regulated credit reporting and marketing purposes. For example, data integrator Experian matches consumer information to transaction information, profiling creditworthiness of consumers and reselling the integrated information. These third parties may be known to individuals but probably are not factored into transaction decisions because they fade into the ubiquity of transaction processes.

Other third party organisations operate in a legally untested area, appropriating and using data without permission with largely unknown consequences (Solove, 2004; Albrecht & McIntyre, 2005). For these third parties, data collection may not relate to any transaction and may be unknown to consumers. Ambiguity arises with arguable legality both for and against such actions. Legally ambiguous methods used by some third parties include pretexting, that is, posing as a customer to obtain information, using spyware to collect data and click streams, and repurposing collected data without permission (Wright et al., 2008). Third party tracking of movements and click streams, aggregated with other lifestyle, legal, medical, psychographic and demographic information, may provide more information than a consumer wants known (Tsai et al., 2011).

Government pre-emption of data for national security or other uses may also fall in the legally unresolved area (DARPA, 2002; Wright et al., 2008). For instance, US government interception of transactions to and from other countries from the SWIFT electronic funds transfer network is an unresolved global issue (Lichtblau, 2006). Similarly, growth of surveillance in the UK, Australia and other countries exemplifies government data pre-emption that can infringe personal privacy (Michael & Michael, 2007).

At issue in the private sector is the need to balance individual privacy with corporate use of data to gain the societal benefits of economic growth and development (Culnan & Bies, 2003). In the government sector, individuals are obligated to relinquish personal data for their own security as well as for the collective good. Problems arise when there appear to be no bounds on data collection and use; the challenge is to define those boundaries (Albrecht & McIntyre, 2005).

Fourth parties: illegal entities

Fourth party data users are illegal hackers, thieves and third party employees who violate company policy (DatalossDB.org, 2011). Criminal and terrorist activity examines the benefits and drawbacks of not using encryption to control second party interactions with first and third parties (Denning & Baugh, 1999). 'Hacktivism', defined as hacking fused with activism, has become a global problem (Denning & Baugh, 1999; Gupta et al., 2009). Intentional fourth party hackers use increasingly sophisticated techniques such as spyware that self-installs on user machines and can obtain, for example, passwords or credit information (Wang et al., 1998). Globally, fourth parties join with organised crime and/or terrorist organisations to wreak digital havoc as a more effective means of fighting perceived threats than war (Gupta et al., 2009).

Data loss to or theft by fourth parties also may be due to poor management. For example, a second party bank lost personal information about 90 000 individuals due to insecure transport of unencrypted data (DataLossDB.org, 2011). Sony Corporation has been hacked over 40 times since 2002 with a global loss of over 100 million consumer records (DatalossDB.org, 2010).

Data loss is a global phenomenon. Watchdog groups report compromised 'data elements useful to identity thieves, such as social security numbers, account numbers, and driver's license numbers' (DatalossDB.org, 2010). More than 550 million records including personal identifying information have been breached since 2005 in the USA, while reported global losses add another 200 million records (DataLossDB.Org, 2011). Without improving organisational practices and increasing the use of privacy-preserving technologies, no identities are safe.

Thus, the expanded privacy model includes the initial transaction interactions between a first party individual and a second party vendor/provider. Before, during and after a transaction, the second party collects information about the first party including data that may not relate directly to the transaction. The second party then shares information with third parties, some of whose actions are legally untested. Fourth-party, illegal users repurpose data from all other parties by theft, hacking or collusion. Once the data have been shared, they take on a life of their own.

By expanding privacy to include vendor/provider information sharing with any number of other parties, gaps in current organisational PIP responsibilities become accessible. The first party individual to second party vendor/provider transaction is well understood. However, vendor/provider sharing of data is less understood and is becoming more important to privacy control (Albrecht & McIntyre, 2005; Wright et al., 2008).

We argue that second party vendors/providers that include corporations, governments and other public organisations do not operate in isolation. Rather, they routinely share data with third party entities that, in turn, share those data with others. This data-sharing environment is a key source of vulnerability to fourth party illegal hackers. Once a transaction is complete, the focus shifts to the data themselves, which take on a life of their own, becoming the currency of a complex pattern of interchanges (Solove, 2004; Mason et al., 2005).

Once the full extent of data movement is understood, it is clear that individuals have no control over personal data; nor can individuals manage corporate policies and procedures, the breach of which has led to huge numbers of data leakages (DatalossDB.org, 2011; Wang et al., 1998). This is more than a single-nation issue, as closed-circuit TV, national identity cards, internet activity monitoring and other forms of privacy boundary erosion are implemented around the globe (Wright et al., 2008).


EMERGING TECHNOLOGIES CHANGE PIP ISSUES

This section discusses types of emerging technologies that demonstrate the value of the expanded privacy model and privacy concerns. By 'emerging' technologies, we mean technologies coming into existence or maturity (Wheatley & Wilemon, 1999). The relevance of emerging technologies to issues of privacy reflects the understanding that technologies are not just innocuous artefacts; rather, technologies carry consequences (Naisbitt et al., 2001). This concept is magnified by emerging technology but applies equally to any technology through which individuals interact with organisations.

The emerging technologies discussed in this section are global positioning systems (GPS), radio frequency identification (RFID) systems, smart motes as representative of nanotechnologies, and engineered bio-organisms. All promise to improve many aspects of how we live; however, they also currently offer unprotected and unlimited access to personal information on an unprecedented scale. These technologies were selected because each is in a different stage of development and adoption, and all promise to expand privacy issues as they mature. In the next sections, each technology is described. Privacy issues relating to the technologies follow this discussion.

GPS technology

A GPS receiver is a device that locates satellites to determine its distance to each of three known points, using this information to deduce its own location (Giorgi et al., 2010). GPS is available in stand-alone devices or embedded in devices such as cell phones and autos (Giorgi et al., 2010). The global market for GPS-based services is expected to reach $13 billion by 2014 (ieMarketResearch.com, 2010). As there are roughly 5 billion cell phones for the world's 7 billion people, most with GPS capability, GPS represents a relatively mature technology (ieMarketResearch.com, 2010).
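To make the positioning step concrete, the following sketch illustrates distance-based positioning (trilateration) by least squares. It is a simplified illustration only, not drawn from the cited work: the satellite coordinates and measured ranges are invented, and real receivers additionally estimate clock error using a fourth satellite.

```python
# Illustrative trilateration: estimate a receiver position from known
# satellite positions and measured ranges (all values are invented).
import numpy as np
from scipy.optimize import least_squares

satellites = np.array([          # assumed satellite positions (km)
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
])
ranges = np.array([21110.0, 22280.0, 23020.0])   # assumed measured distances (km)

def residuals(position):
    # Difference between modelled and measured range for each satellite.
    return np.linalg.norm(satellites - position, axis=1) - ranges

fit = least_squares(residuals, x0=np.zeros(3))
print("Estimated receiver position (km):", fit.x)
```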

RFID

Developed in the 1940s, RFID uses wireless computer chips to track items at a distance (Finkenzeller, 2003). An RFID system requires two basic elements: a chip (or transponder) and a reader (or transceiver). An RFID chip is composed of an antenna or coil, a capacitor to capture and use energy from a transceiver, and a radio tag containing unique information (Finkenzeller, 2003).

RFID readers range from passive to active. Passive transceivers use radio waves to gain access to RFID chips and are slow, reading about 20 items within a 10-foot radius in about 3 seconds (Finkenzeller, 2003). Active RFID transceivers continuously monitor, record and transmit sensor inputs for thousands of tags located within about 90 feet. RFID tags are used in products including metals, liquids, textiles, plastics, pharmaceuticals, clothing, robots guarding playgrounds and even shaving gear (Finkenzeller, 2003). While 1-inch RFID tags are still common, dust-sized RFID tags have been developed for a variety of applications.


RFID is still in its adoption and growth phase. The size of the market was approximately $5.4 billion in 2010, up from $500 million in 2004, and is expected to grow to $26 billion by 2016 (RFID World Canada, 2011).

Smart motes

Smart motes, a form of micro-electro-mechanical systems, are compact, ultra low power, wireless network devices that form self-organising, intelligent networks programmed to perform some task (Jotterand, 2010). Smart motes are a form of nanotechnology that are now two nanometres in diameter, less than 1/40 000 the width of a strand of hair (Jotterand, 2010). Smart motes will be sprayable, paintable, absorbable and ingestible (Jotterand, 2010). In addition to obvious problems with organisational security, motes remove any vestige of personal privacy as well. If inhaled, motes might report on the inner health of individuals (Jotterand, 2010). Smart mote nanotechnology is predicted to become self-replicating and capable of spanning the globe in less than 2 hours (Robert, 2009). Smart motes are a developing technology with approximately $2 billion in sales in 2010; the worldwide nanotechnology market is estimated to grow to $1.6 trillion by 2013 (Bouchaud, 2010).

Bio-organisms

While smart motes are inanimate, bio-organisms are viruses or bacteria that are trained to perform intelligent functions (Bulles, 2006). In a beneficent setting, this could mean a welcome end to invasive procedures such as a colonoscopy. In a malevolent setting, a bacterium or virus might alter the functioning of its host cells in unique ways and then provide a 'photo' of the altered site to confirm the change (Robert, 2009). Smart bio-organisms are in development with few commercialised offerings, but the field promises to grow into a multi-trillion dollar business.

PRIVACY THREATS FROM EMERGING TECHNOLOGIES

Characteristics of the emerging technologies that pose threats to privacy relate to their ubiquity, invisibility, invasiveness, collectability of heretofore uncollectible information, programmability and wireless network accessibility (Ohkubo et al., 2005; OECD, 2007). Detecting embedded nano-sized RFID and GPS, smart motes or bio-organisms is not humanly possible without specialised equipment (Giorgi et al., 2010). Mote and GPS reporting to a central site also are undetectable (Warneke & Pister, 2004).

Ubiquity and invisibility are demonstrated by RFID readers embedded in buildings, roads, trucks, ships, aircraft or other infrastructures (Albrecht & McIntyre, 2005). RFID chips are thus being read everywhere. Nanotechnologies will be even more invasive, including biological embeddedness (Jotterand, 2010). There are positive, even lifesaving, benefits from these technologies; for instance, time and location data retrieved from a New York City MetroCard proved the innocence of a murder suspect (Weiser, 2008). However, embedded devices can be re-read every time a reader comes into range (Albrecht & McIntyre, 2005). Readers do not discriminate between smart cards, toll tags and so on; thus, every action and all digital data become knowable (Albrecht & McIntyre, 2005).

The magnitude of potential privacy loss becomes clearer when collected data are reported to a third or fourth party. Data aggregators such as ChoicePoint and Acxiom match credit and demographic information with purchases, personal movement, travel routes, length of time at stops, interior health, tax, immigration, click stream, medical, biological, genetic, financial and household information (Albrecht & McIntyre, 2005). All of this access and integration increases the potential of illegal fourth party data acquisition.

For the individual, maintaining personal privacy requires control of information, which is not possible when data gathering and sharing practices are unknown and uncontrollable, as will happen with nano and bioengineered devices (Albrecht & McIntyre, 2005; Jotterand, 2010). Further, the management of information collected by any of the emerging technologies is beyond the time and financial resources of most individuals. Governments may develop regulations regarding nano-device data collection, but monitoring and compliance will be problematic for the foreseeable future (Albrecht & McIntyre, 2005; Wright et al., 2008). Therefore, organisations must consider privacy management beyond their borders, if for no other reason than that they, too, will be subjected to the same potential harm as individuals.

EMERGING TECHNOLOGY AND EXPANDED MODEL OF PIP

The privacy model shows that there is a distinct break between what happens during the individual decision process leading to a transaction with a vendor/provider and what happens after the transaction and concomitant data sharing occur. The portion of the PIP model below the second party (Figure 2) is reasonably well understood for transaction processing because of past research. However, the emerging technologies can remove the first party decision from the data collection process. Concern over which data are collected will become moot as readings by nano-sized devices or bio-organisms will be untraceable and ubiquitous (Wright et al., 2008). It is unlikely that the readings of RFID, GPS and sensors will be conducted solely by the organisation with which the transaction takes place. Rather, with the right equipment, anyone anywhere can read what is on, and in, an individual person to pinpoint their location and activities (Albrecht & McIntyre, 2005).

The authors believe PIP is worth preserving but, because of emerging technologies, the methods and issues thus far identified through research no longer define the PIP landscape. The top portion of the model addresses this problem, relating second, third and fourth parties to describe the full scope of information movement. At present, these relationships and the extent to which data take on a life of their own are not well understood. As discussed in the following section, understanding data-sharing relationships is the first step to being able to manage them.

Novel technologies will continue to erode personal privacy (Wright et al., 2008). As we benefit from RFID chips built into pets and purchased goods, use GPS-enabled cell phones as a necessity of modern life and desire the latest in bionanotechnology for health, our lives also become increasingly transparent. In addition to making life transparent, emerging technologies remove organisational security perimeters, thus increasing risks of fourth party breaches (Albrecht & McIntyre, 2005). These continuing challenges to privacy management require different approaches to management and security than current practice (Wright et al., 2008).

FUTURE RESEARCH

Emerging technologies will change the dynamics of privacy in ways we do not now understand. Some privacy has certainly been lost, never to be regained (Wright et al., 2008). However, we may not be ready to cede all information about everything, everywhere (Tsai et al., 2011). Assuming that is the case, research plays a key role in determining the set of solutions needed to provide humans some control over sharing of their own information.

Research on data collection and its life as data propagate throughout the various parties will help in devising appropriate policy and legal protections. Another outcome of this research should be new courses of action to better detect and control unwanted collection, data leaks and breaches. The expanded PIP model helps reveal data transfer issues, contingencies and characteristics for each party accessing personal information that need study.

Data importance

Individuals appear to be concerned about what information is collected and used by vendors/suppliers (Mason et al., 2005; Belanger & Crossler, 2011; Tsai et al., 2011). There also appear to be significant differences, even for the same person over time, in what information is considered private (Mason et al., 2005). For instance, parental medical history may be guarded only to be made 'public' if the individual is away for an extended period and the parent needs medical attention (Mason et al., 2005). In addition, generational differences in personal information importance appear to be considerable (Australian Government, 2008). The decision calculus and the impacts of different data issues are not well understood in first–second party transactions. Therefore, research is needed on collection and sharing of different types of information, generational differences or life cycle differences to better understand their impacts on transaction and information-sharing decisions.

Another type of research can evaluate the data characteristics that cause issues. By understanding data characteristics, companies should be better able to develop emerging privacy designs that match consumer expectations. Emerging technologies especially need this research. Development of consumer applications embedded in emerging technologies could be proactive rather than reactive in safeguarding privacy. Making PIP manageable requires not only an understanding of data importance, but also an understanding of issues relating to data collection and the data life cycle, along with technology solutions. These are discussed next.


Data collection

Research to understand the circumstances and implications of unknown data collection and sharing practices is needed. For present technologies, analysis is needed to understand 'leech' software that piggybacks on legitimate Web sites, monitors click streams and takes files from user PCs (Bucklin & Sismeiro, 2009). Technical research on third and fourth parties might decompose Web site activities to expose leeching techniques. The goal of this research would be to surface leeching practices and to develop counter-practices to sense and destroy such attempts, thus freeing transactional sites from leeching activities. This research might also lead to better PC management practices that could carry over to emerging technologies. Similar research relating to emerging technologies is needed, as Dutch and US passports were hacked almost as soon as they were released, showing that RFID is prone to similar problems (Finkenzeller, 2003).

Another area for research is technology to solve privacy problems. Less than 10% of the general public is aware of nanotechnology, including technologies ranging from RFID to motes (Finkenzeller, 2003). As a result, they would also not be expected to comprehend the potential magnitude of PIP losses via these technologies. Yet, in keeping with a desire to control personal information, it would be expected that individuals would want to control the extent to which they share information through emerging technologies. If technology can create privacy problems, perhaps technology can also solve them. Research might develop technology to limit data life cycles, for instance, technology that automatically erases data after a specified period. Similarly, research on anonymising technology and fast, foolproof encryption is needed to ensure that the network transport of data is useful only to the intended user (Kobsa, 2007). Privacy enhancing technologies offer promise in this area and are diverse in addressing interfaces, web programs, HTML and other aspects of Web technology (Kobsa, 2007). The concept of privacy by design is gaining adherents and provides for embedded privacy during design of applications, infrastructure systems (e.g. electrical grids) and other proactive measures to ensure privacy preservation (Kobsa, 2007).
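To illustrate what a technically enforced data life cycle might look like, the sketch below deletes personal records once they exceed a retention threshold. It is a minimal assumed illustration, not a proposal from the cited literature; the record structure and the 30-day retention period are invented.

```python
# Minimal data-expiry sketch: purge personal records older than a
# retention period (records and the 30-day period are assumed).
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)

records = [
    {"name": "A. Customer", "collected": datetime(2012, 1, 5, tzinfo=timezone.utc)},
    {"name": "B. Customer", "collected": datetime.now(timezone.utc)},
]

def purge_expired(records, now=None):
    """Keep only records still within the retention period."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected"] <= RETENTION]

records = purge_expired(records)
print(len(records), "record(s) retained")
```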

Another potential research area is the development of automated compliance with legal directives (Knapp et al., 2007). EU Directives and US privacy laws appear to be largely ineffective against emerging technologies and the complexities of action they enable. By developing automated compliance or compliance evaluation, PIP becomes more manageable (Kobsa, 2007). Research involving scientists working in bionanotechnology, and their solutions to potential harmful uses of their creations, could lead to novel IT solutions for self-enforcing privacy. For instance, development of technology that, once programmed, has a limited range of changeability, or that shuts off or does not self-replicate, may preserve privacy. If building PIP technology into bionanotechnology can circumvent privacy transgressions, technology solutions could become viable for solving many PIP issues.
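As one way to picture automated compliance evaluation, the sketch below checks stored records against a declared policy stating which purposes are allowed and how long data may be kept. This is an assumed illustration only; the policy fields and records are invented and are not drawn from any directive or from the authors' work.

```python
# Assumed compliance-evaluation sketch: flag records that violate a
# declared purpose list or retention limit (policy and records invented).
from datetime import datetime, timedelta, timezone

policy = {"allowed_purposes": {"billing", "shipping"}, "max_age": timedelta(days=365)}

records = [
    {"id": 1, "purpose": "billing",   "collected": datetime(2011, 6, 1, tzinfo=timezone.utc)},
    {"id": 2, "purpose": "marketing", "collected": datetime(2010, 1, 1, tzinfo=timezone.utc)},
]

def violations(records, policy, now=None):
    """Return (record id, reason) pairs for records breaching the policy."""
    now = now or datetime.now(timezone.utc)
    problems = []
    for r in records:
        if r["purpose"] not in policy["allowed_purposes"]:
            problems.append((r["id"], "purpose not permitted"))
        if now - r["collected"] > policy["max_age"]:
            problems.append((r["id"], "retention limit exceeded"))
    return problems

for record_id, reason in violations(records, policy):
    print(f"record {record_id}: {reason}")
```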

Second and third party data life cycle management

The portion of the model relating to second and third party data sharing requires research to fully support model relationships. No known research focuses on the data themselves, that is, their existence, movement and life cycle once released by the individual. To effectively research second–third party relationships, empirical research could be developed, for instance, tracking data from one entity to another. Other research might evaluate policies relating to organisational interactions between second and third parties. For instance, the extent to which second parties negotiate, manage and monitor partners' data management would help develop the scope and magnitude of the issues.

Emerging technologies also make it imperative to account for collection of novel types of data and their life cycles (Dinev & Hart, 2006). Developing such an understanding will allow policy and legislative decisions to be developed and will allow second party vendors/providers to better address users' privacy concerns. In addition, research could develop technology solutions to constrain each stage of data life cycles for anonymising collected data, restricting integration of some data, or erasing data as they reach an age threshold (Kobsa, 2007).
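As a simple picture of the anonymising step, the sketch below pseudonymises direct identifiers with a keyed hash before records leave the collecting organisation. It is an assumed, minimal illustration; the field names and secret key are invented, the technique is not attributed to the cited work, and keyed hashing alone does not guarantee anonymity against re-identification.

```python
# Assumed pseudonymisation sketch: replace direct identifiers with a keyed
# hash before sharing (field names and key are invented; illustrative only).
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-real-secret"   # assumed key, held by the data owner
IDENTIFIERS = {"name", "email"}              # assumed direct-identifier fields

def pseudonymise(record):
    out = {}
    for field, value in record.items():
        if field in IDENTIFIERS:
            digest = hmac.new(SECRET_KEY, str(value).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]   # stable pseudonym, irreversible without the key
        else:
            out[field] = value
    return out

print(pseudonymise({"name": "A. Customer", "email": "[email protected]", "purchase": "book"}))
```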

Fourth party relationship research

Analysing fourth parties is more difficult than researching transaction activities because of the clandestine and mostly illegal nature of the activities. To effectively research fourth parties requires a qualitative and sociological approach to hacker research. Using techniques such as grounded research and snowball sampling, one might analyse hacker culture and motivations. This research might develop a historic point–counterpoint of hacker activities and the social responses to those activities. The goal of the research could be to develop guidelines for companies on responses that do not lead to escalation of hacking activity.

Public awareness

The public remains largely ignorant of spyware that resides on their own PCs, monitoring their activities (Paine et al., 2006). However, public concern can be raised by accounts of privacy transgressions and, once concern is raised, measures to change organisations' behaviours can be taken. The outcry over Facebook's 2010 'instant personalization' program is one such public objection that caused changes to Facebook practices (Debatin et al., 2009). Although Facebook's shifting privacy policies are a concern, there is some evidence that personal information disclosure overrides privacy concern (Belanger & Crossler, 2011; Pratt, 2011).

Research to raise public awareness of issues and potential workarounds or solutions would aid the general public by providing them information for action. This research could both inform and define the degree of resistance to the new technology-enabled privacy losses, recommending adjustments to individuals' activities that address concerns (Malhotra et al., 2004). Further, research on mechanisms to allow 'pay for privacy' also may be effective (Tsai et al., 2011).

Policy formation and global commerce

Philosophical and practical differences exist between control of privacy and regulations that pertain to second, third and fourth parties. For instance, organisational self-regulation is an oxymoron in most countries, especially as relating to Fair Information Practices (Culnan, 2000). Research is needed to catalogue the parties' differences and to identify laws regulating third party transactions, inhibiting fourth party access and identifying issues relating to untested aspects of third party data collection and integration. Through a comprehensive review of laws and related inconsistencies across countries, lawmakers and policy makers can more easily evaluate their country's privacy needs and provide safeguards and laws to implement those needs at home and with international trading partners (cf. Australian Government, 2008). Further, appeal to firms' motivations for maintaining their customers' privacy can promote adoption of good practice (Greenaway & Chan, 2005). Perhaps the triple bottom line (Savitz, 2006) should be extended from economic, social and environmental foci to include a fourth bottom line for customer privacy. In any case, cataloguing privacy laws and their coverage would facilitate the discussion to coordinate global action by both multinational corporations and governments.

SUMMARY

Past privacy models integrate relationships of individuals to organisations with which they transact and share personal information. This paper presents an expanded model of PIP, accounting for data from the time data leave the individual and all of the interorganisational relationships relating to that data.

Existing privacy research analyses transactions between individuals and organisations. The expanded model presented in this paper includes the other organisations that are parties to those transactions. The model also allows for a new aspect of PIP: that personal data have a life of their own. After their movement from first party individual to second party vendor/provider, data move to third party integrators who develop an individual history that incorporates significant public and private data. The model highlights interorganisational data sharing and enables discussion of shortcomings of current privacy practices. Emerging technologies demonstrate how new nano-sized technologies for location awareness and programmable remote action continue to evolve privacy issues.

Several areas of research are relevant to articulating further the issues relating to the expanded privacy model. The areas of research discussed include the decision calculus, the nature of interorganisational relationships, data importance, data collection, data life cycles, the need for privacy-preserving technologies to be embedded in new digital artefacts and policy formation. These research topics cover areas of privacy concern for which little PIP-related research can be found.

Our perspective is that personal privacy is important but that it must be balanced against the realities of escalating terrorism and a need for some personal privacy erosion in the interest of the social good. However, maintaining a balance between individual control of personal information and protection of societal needs should be a public discussion informed by further privacy research.


REFERENCES

Albrecht, K. & McIntyre, L. (2005) RFID: the big brother bar code. ALEC Policy Forum, 6, 49–54.
APEC (2005) APEC Privacy Framework, Asia-Pacific Economic Cooperation (APEC). [WWW document]. URL http://www.apec.org/Groups/Committee-on-Trade-and-Investment/~/media/Files/Groups/ECSG/05_ecsg_privacyframewk.ashx (accessed 15 February 2012).
Australian Government (2008) Australian Privacy Law and Practice (ALRC Report 108). Australian Government, Sydney, Australia. [WWW document]. URL http://www.alrc.gov.au/publications/report-108 (accessed 15 February 2012).
Awad, N.F. & Krishnan, M.S. (2006) The personalization privacy paradox: an example evaluation of information transparency and the willingness to be profiled online for personalization. MIS Quarterly, 30, 13–28.
Banisar, D. (2011) Privacy laws map. [WWW document]. URL https://www.privacyinternational.org/global-data-protection-map (accessed 1 June 2008).
Belanger, F. & Crossler, D. (2011) Privacy in the digital age: a review of information privacy research in information systems. MIS Quarterly, 35, 1017–1041.
Bouchaud, J. (2010) MEMS Market Rebounds in 2010 Following Two-Year Decline. isuppli.com. [WWW document]. URL http://www.isuppli.com/MEMS-and-Sensors/MarketWatch/Pages/MEMS-Market-Rebounds-in-2010-Following-Two-Year-Decline.aspx (accessed 15 February 2012).
Bucklin, R. & Sismeiro, C. (2009) Click here for internet insight: advances in clickstream data analysis in marketing. Journal of Interactive Marketing, 23, 35–48.
Bulles, K. (2006) Nanomedicine. MIT Technology Review, 58–59.
Cheung, C., Chan, G. & Limayem, M. (2005) A critical review of online consumer behavior: empirical research. Journal of Electronic Commerce in Organizations, 3, 1–19.
Culnan, M. (1993) How did they get my name? An exploratory investigation of consumer attitudes toward secondary information use. MIS Quarterly, 17, 341–363.
Culnan, M. (2000) Protecting privacy online: is self-regulation working? Journal of Public Policy and Marketing, 19, 20–26.
Culnan, M. & Armstrong, P. (1999) Information privacy concerns, procedural fairness, and impersonal trust: an empirical investigation. Organization Science, 10, 104–115.
Culnan, M.J. & Bies, R.J. (2003) Consumer privacy: balancing economic and justice considerations. Journal of Social Issues, 59, 323–342.
DARPA (2002) DarpaTech 2002 symposium: transforming fantasy, US Defense Advanced Research Projects Agency. [WWW document]. URL http://www.earthpulse.com/epulseuploads/articles/DarpaII.pdf (accessed 15 February 2012).
DataLossDB.Org (2011) Data loss database. Open Security Foundation. [WWW document]. URL http://DataLossDB.Org (accessed 1 December 2008).
Debatin, B., Lovejoy, J., Horn, A. & Hughes, B. (2009) Facebook and online privacy: attitudes, behaviors, and unintended consequences. Journal of Computer-Mediated Communication, 15, 83–108.
Denning, D. & Baugh, W. (1999) Hiding crimes in cyberspace. Information, Communication and Society, 2, 251–276.
Dillon, G. & Torkzadeh, G. (2006) Value-focused assessment of information system security in organizations. Information Systems Journal, 16, 293–314.
Dinev, T. & Hart, P. (2006) An extended privacy calculus model for E-commerce transactions. Information Systems Research, 17, 61–80.
Drennan, J., Mort, G. & Previte, J. (2006) Privacy, risk perception, and expert online behavior: an exploratory study of household end users. Journal of Organizational and End User Computing, 18, 1–22.
EU (1995) Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to processing of personal data and on the free movement of that data, Council of the European Union (EU). [WWW document]. URL https://www.cdt.org/privacy/eudirective/EU_Directive_.html (accessed 15 February 2012).
Finkenzeller, K. (2003) RFID Handbook: Fundamentals and Applications in Contactless Smart Cards and Identification, 2nd edn. John Wiley & Sons, Hoboken, NJ, USA.
Giorgi, G., Teunissen, P., Verhagen, S. & Buist, P. (2010) GNSS remote sensing: testing a new multivariate GNSS carrier phase attitude determination method for remote sensing platforms. Advances in Space Research, 46, 118–129.
Greenaway, K. & Chan, Y. (2005) Theoretical explanations for firms' information privacy behaviors. Journal of the Association for Information Systems, 6, 171–198, Article 7. [WWW document]. URL http://aisel.aisnet.org/jais/vol6/iss6/7 (accessed 14 October 2010).
Gupta, D.K., Horgan, J. & Schmid, A.P. (2009) Terrorism and organized crime: a theoretical perspective. In: The Faces of Terrorism: Multidisciplinary Perspectives, Canter, D. (ed.), pp. 123–136. Wiley-Blackwell, Oxford, UK.
ieMarketResearch.com (2010) 3Q.2010 global GPS navigation and location based services forecast, 2010–2014: global market for GPS navigation and location based mobile services to rise to $13.4 billion in 2014, a CAGR of 51.3%. ieMarketResearch.com. [WWW document]. URL http://www.researchandmarkets.com/reports/1284865/3q_2010_global_gps_navigation_and_location_based (accessed 14 September 2011).
Jotterand, F. (2010) Emerging Conceptual, Ethical and Policy Issues in Bionanotechnology. Springer Publishing, NY, NY, USA.
Knapp, K.J., Marshall, T.E., Ratner, K. Jr & Ford, F.N. (2007) Information security effectiveness: conceptualization and validation of a theory. International Journal of Information Security and Privacy, 1, 37–60.
Kobsa, A. (2007) Privacy-enhanced personalization. Communications of the ACM, 50, 24–33.
Kontzer, T. & Greenemeier, L. (2006) Sad state of data security. Information Week, January 2, 19–22.
Lichtblau, E. (2006) Europe panel faults sifting of bank data. The New York Times, September 25.
Loch, K. & Conger, S. (1996) Evaluating ethical decision making and computer use. Communications of the ACM, 39, 74–84.
Malhotra, N., Kim, S. & Agarwal, J. (2004) Internet users' information privacy concerns (IUIPC): the construct, the scale, and a causal model. Information Systems Research, 15, 336–355.
Mason, R.O., Mason, F., Conger, S. & Pratt, J.H. (2005) The connected home: poison or paradise. Proceedings of the Academy of Management Annual Meeting, Honolulu, HI, August 5–10.
McKnight, H., Choudhury, V. & Kacmar, C. (2004) Dispositional and distrust distinctions in predicting high and low risk internet expert advice site perceptions. E-Service Journal, 3, 35–59.
Michael, M.G. & Michael, K. (2007) A note on überveillance: from dataveillance to überveillance and the realpolitik of the transparent society. Proceedings of the Second Workshop on Social Implications of National Security, Wollongong, Australia, 9–26, October 26.
Naisbitt, J., Philips, D. & Naisbitt, N. (2001) High Tech/High Touch: Technology and Our Search for Meaning. Nicholas Brealey Publishing, London, UK.
OECD (2007) Privacy online: policy and practical guidance, Organisation for Economic Co-operation and Development (OECD). [WWW document]. URL http://www.oecdbookshop.org/oecd/display.asp?K=5LMQCR2K3BS2&lang=EN&sort=sort_date%2Fd&stem=true&sf1=Title&st1=privacy+online&sf3=SubjectCode&st4=not+E4+or+E5+or+P5&sf4=SubVersionCode&ds=privacy+online%3B+All+Subjects%3B+&m=1&dc=2&plang=en (accessed 15 February 2012).
Ohkubo, M., Suzuki, K. & Kinoshita, S. (2005) RFID privacy issues and technical challenges. Communications of the ACM, 48, 66–71.
Paine, C., Joinson, A.N., Buchanon, T. & Reips, U.-D. (2006) Privacy and self-disclosure online. Proceedings of the ACM SIGCHI Conference, 1187–1192.
Pratt, J.H. (2011) Privacy loss: an expanded model of legal and illegal data exchange. In: Security and Privacy Assurance in Advancing Technologies: New Developments, Nemati, H.R. (ed.), pp. 30–44. Information Science Reference, Hershey, PA, USA.
Pratt, J.H. & Conger, S. (2009) Without permission: privacy on the line. International Journal of Information Security and Privacy, 3, 31–46, January–March.
RFID World Canada (2011) RFID market continues to grow as demand increases in 2011. RFID World Canada. [WWW document]. URL http://www.rfidworld.ca/rfid-market-continues-to-grow-as-demand-increases-in-2011/394 (accessed 14 September 2011).
Robert, J. (2009) Nanoscience, nanoscientists, and controversy. In: Nanotechnology & Society: Current and Emerging Ethical Issues, Alihoff, F. & Lin, P. (eds), pp. 225–239. Springer Publishing, NY, NY, USA.
Savitz, A. (2006) The Triple Bottom Line: How Today's Best-Run Companies Are Achieving Economic, Social and Environmental Success – and How You Can Too. John Wiley & Sons, Inc., San Francisco, CA, USA.
Smith, H. (2004) Information privacy and its management. MIS Quarterly Executive, 3, 201–213.
Smith, H., Milberg, S. & Burke, S. (1996) Information privacy: measuring individuals' concerns about organizational practices. MIS Quarterly, 20, 167–196.
Smith, H., Dinev, T. & Xu, H. (2011) Information privacy research: an interdisciplinary review. MIS Quarterly, 34, 989–1015.
Solove, D.J. (2004) The Digital Person. NYU Press, NY, NY, USA.
Tsai, J., Egelman, S., Cranor, L. & Acquisti, A. (2011) The effect of online privacy information on purchasing behavior: an experimental study. Information Systems Research, 22, 254–268.
Wakefield, R.L. & Whitten, D. (2006) Examining user perceptions of third-party organization credibility and trust in an E-retailer. Journal of Organizational and End User Computing, 18, 1–19.
Wang, H., Lee, M. & Wang, C. (1998) Consumer privacy concerns about internet marketing. Communications of the ACM, 41, 63–70.
Warren, S.D. & Brandeis, L. (1890) The right to privacy. Harvard Law Review, 4, 193–220.
Warneke, B. & Pister, K. (2004) An ultra-low energy microcontroller for smart dust wireless sensor networks. Proceedings of the International Solid-State Circuits Conference 2004 (ISSCC 2004), Feb. 16–18, San Francisco. [WWW document]. URL www-bsac.eecs.berkeley.edu/archive/users/warneke-brett/pubs/17_4_slides4.pdf
Weiser, B. (2008) Murder suspect has witness. The New York Times. [WWW document]. URL http://www.nytimes.com/2008/11/19/nyregion/19metrocard.html?_r=1&pagewanted=all (accessed 1 June 2008).
Westin, A. (1970) Privacy and Freedom. The Bodley Head Ltd, London, UK.
Wheatley, K.K. & Wilemon, D. (1999) From emerging technology to competitive advantage. Proceedings of the Portland International Conference on Management of Engineering and Technology (PICMET '99), July 25–29.
Wright, D., Gutwirth, S., Friedewald, M. & Vildjiounaite, E. (eds) (2008) Safeguards in a World of Ambient Intelligence: The International Library of Ethics, Law, and Technology. Springer, London, UK.

Biographies

Sue Conger, Ph.D., is on the faculty of the University of Dallas as Director of the Information & Technology Management program. She publishes extensively, speaks at conferences globally and is a Visiting Professor at Rhodes University in South Africa. Dr Conger is active in academic and industry associations. She is on five editorial boards and the program and planning committees for several conferences. Her research interests are IT service management, IT-related privacy, emerging technologies and innovative uses of IT in organisations. She authored The New Software Engineering (1994), Planning & Designing Effective Web Sites (1997) and Process Mapping & Management (2011).

Joanne Pratt is President of her consulting firm, Joanne H. Pratt Associates. She is a futurist recognised for research on new work patterns enabled by technology. Her research includes the impacts of remote work on corporate work patterns, work/life and privacy issues. She was the only US member of the European Union funded team studying telework and information-age employment. She has conducted research for the US Small Business Administration, the Bureau of Transportation Statistics and other federal agencies, and has led research projects for private sector clients. Her research publications include E-Biz.com: Strategies for Small Business Success, Counting the New Mobile Workforce, Impact of Location on Net Income and Homebased Business: the Hidden Economy. Her degrees are from Oberlin College and Harvard University.

Karen D. Loch, Professor in the Robinson College of Business, Georgia State University, holds a PhD in MIS and an MA in French language and literature from the University of Nebraska-Lincoln. Her research interests include international IT transfer, social and ethical concerns of ICT, and CSR and sustainability initiatives in MNCs. Loch has published in journals such as MIS Quarterly, Information Systems Journal, Communications of the ACM, Journal of Global Information Management, Academy of Management Executive and Journal of Business Ethics. She is a board member for the World Trade Center Atlanta and the Japan American Society of Georgia.
