
CHI 98 • 18-23 APRIL 1998 • PAPERS

Persuasive Computers: Perspectives and Research Directions

    BJ Fogg

Stanford University, Box 8333, Stanford, CA 94309 [email protected], [email protected]

    www.captology.org

    ABSTRACT

The study of computers as persuasive technologies (called captology) was introduced at CHI 97 as a new area of inquiry. This paper proposes definitions, perspectives, and research directions for further investigation of this field. A persuasive computer is an interactive technology that attempts to change attitudes or behaviors in some way. Perspective 1 describes how computers can inherit three types of intentionality: endogenous, exogenous, and autogenous. Perspective 2 presents the Functional Triad, which illustrates that computers can function as persuasive tools, media, and social actors. Perspective 3 presents a levels-of-analysis approach for captology, which includes varying levels from individual to societal. Perspective 4 suggests a simple method for exploring the design space for persuasive computers. Perspective 5 highlights some ethical issues inherent in persuasive computing. The paper concludes by proposing seven directions for further research and design.

    Keywords

persuasion, captology, media, computers as social actors, ethics, design methods, computers as persuasive technologies

INTRODUCTION

At CHI 97, a special interest group meeting gathered a number of participants who were interested in exploring the domain of computers and persuasion [a]. We agreed to call this area captology (built from an acronym for Computers As Persuasive Technologies), with the graphic in Figure 1 serving as a reference point for this domain. The discussion on captology at the CHI 97 SIG proved fruitful and enlightening, with participants concurring that captology was an intriguing area for further research and design. The group also agreed that this domain had not yet been adequately defined or addressed by researchers and practitioners of human-computer interaction. We found that our discussion suffered at times because we lacked key definitions and frameworks for understanding captology. The purpose of this paper, therefore, is to contribute to the CHI community's understanding of persuasive computing by proposing definitions, perspectives, and research directions for the field of captology.

Permission to make digital/hard copies of all or part of this material for personal or classroom use is granted without fee provided that the copies are not made or distributed for profit or commercial advantage, the copyright notice, the title of the publication and its date appear, and notice is given that copyright is by permission of the ACM, Inc. To copy otherwise, to republish, to post on servers or to redistribute to lists requires specific permission and/or fee. CHI 98, Los Angeles, CA, USA. Copyright 1998 ACM 0-89791-975-0.

Sun Microsystems, 901 San Antonio Road, MPK17-105

    Palo Alto, CA 94303 [email protected]

    Figure 1: Captology describes the shaded area where computingtechnology and persuasion overlap.

This paper first presents five perspectives on computers and persuasion. They include the following:

Perspective 1: Definition of persuasive computers
Perspective 2: A functional view of persuasive computers
Perspective 3: Levels of analysis for captology
Perspective 4: The design space for persuasive technologies
Perspective 5: Ethics of computers that persuade

Each of these five perspectives provides a different way to view persuasive computers, while also describing examples of relevant technologies. After setting forth the five perspectives on persuasive technologies, this paper concludes by outlining seven profitable directions for further research in the area of persuasive computers. To be clear, this paper makes a contribution by articulating a range of approaches to captology. It is the role of later work to expand and revise the ideas proposed in this paper.

PERSPECTIVE 1: DEFINITION OF PERSUASIVE COMPUTERS

    What is a persuasive computer?

Simply put, a persuasive computer is an interactive technology that changes a person's attitudes or behaviors. This definition works well in many cases, but a more thorough definition gives a better understanding of persuasive computing. The psychology literature suggests many definitions for the word persuasion [e.g., 24, 36]. After reviewing the work of persuasion scholars, I've synthesized the various definitions to define persuasion as "an attempt to shape, reinforce, or change behaviors, feelings, or thoughts about an issue, object, or action."


Persuasion and intentionality

One key point in this definition is that true persuasion implies an intent to change attitudes or behaviors; in other words, persuasion requires intentionality. Therefore, not all behavior or attitude change is the result of persuasion. For example, a rain storm may cause people to buy umbrellas, but the storm is not a persuasive event because it has no intentionality associated with it. (However, if an umbrella manufacturer could somehow cause rain, then the rain storm might qualify as a persuasive tactic.) Because machines do not have intentions [5], a computer qualifies as a persuasive technology only when those who create, distribute, or adopt the technology do so with an intent to affect human attitudes or behaviors. To be clear, the persuasive nature of a computer does not reside with the object itself; instead, a computer being classified as persuasive depends on the context of creation, distribution, and adoption. I propose that if an intent to change attitudes or behaviors is a factor in the creation, distribution, or adoption of a technology, then that technology inherits a type of intent from human actors.

Three types of intent: endogenous, exogenous, and autogenous

For the purposes of captology, I propose three kinds of inherited persuasive intent: endogenous, exogenous, and autogenous. According to my definitions, a computer technology inherits endogenous intent when a designer or producer creates a technology with intent to persuade users in some way. A computer technology inherits exogenous intent when one person provides another person with a computer technology in an attempt to change that person's attitudes or behaviors. A computer technology inherits autogenous intent when a person chooses to use or adopt a technology in order to change his or her own attitudes or behaviors. Table 1 makes this idea clearer.

Type of intent | Where intent comes from | Example
endogenous | Those who create or produce the interactive technology | Health-Hero video games are designed to persuade children to develop good health habits [17].
exogenous (external factors) | Those who give access to or distribute the interactive technology to others | A mother may give her son a Palm Pilot PDA in hopes that he will become more organized.
autogenous (self-produced) | The person adopting or using the interactive technology | A person may buy and use a calorie-counting computer device to help change his or her own eating behavior.

Table 1: Three types of intent with examples
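The three intent types can be modeled as a small data structure. The sketch below is my own hypothetical rendering of Table 1's logic, not anything from the paper itself: the role names, function, and examples are illustrative assumptions. It shows that a technology inherits one intent type per human actor who holds persuasive intent, and can inherit several at once.

```python
# Sketch: modeling the three types of inherited persuasive intent.
# The actor roles mirror Table 1; the function name and examples
# are illustrative assumptions, not part of the original paper.

# Which actor role confers which type of inherited intent.
INTENT_BY_ROLE = {
    "creator": "endogenous",     # designer/producer builds persuasion in
    "distributor": "exogenous",  # someone gives the technology to another
    "adopter": "autogenous",     # user adopts it to change him- or herself
}

def inherited_intents(actors_with_intent):
    """Return the set of intent types a technology inherits, given the
    roles of the human actors who held an intent to persuade."""
    return {INTENT_BY_ROLE[role] for role in actors_with_intent}

# Health-Hero games (Table 1): persuasive intent comes from the creators.
print(inherited_intents(["creator"]))   # {'endogenous'}
# A calorie counter bought to change one's own habits: adopter intent.
print(inherited_intents(["adopter"]))   # {'autogenous'}
# Categories are not mutually exclusive; a technology may inherit several:
print(inherited_intents(["creator", "distributor"]))
```

Because the function returns a set, the sketch also captures the paper's caveat that a given technology may fall into more than one category.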

Although the above categories aim to distinguish among types of persuasive technologies, I recognize that these categories are not always precise, and they are not always mutually exclusive. Indeed, making inferences about intentions is tricky business: we may infer intent where there is none, or we may fail to perceive intent when intent does indeed exist. Furthermore, it is quite possible that a given interactive technology may fall into more than one category. Despite potential ambiguities, I find these three categories helpful in better understanding the range and roles of persuasive computing technologies.

PERSPECTIVE 2: A FUNCTIONAL VIEW OF PERSUASIVE COMPUTERS

While Perspective 1 provides an intentional framework for persuasive computers, Perspective 2 presents what I call a functional view. To explain Perspective 2 clearly, I first describe the basics of the Functional Triad. I then show how this framework can provide key insights into the study of persuasive computers.

The Functional Triad

I propose that today's computers function in three basic ways: as tools, as media, and as social actors. Researchers and designers have often discussed variants of these functions [e.g., 18, 22, 33], usually as metaphors for computer use. However, I suggest that these three categories are more than metaphors; they are basic ways that people view or respond to computing technologies. I refer to these three areas as functions.

As a tool, the computer (or the computer application or system) provides humans with new ability or power, allowing people to do things they could not do before, or to do things more easily [28, 29].

Computers also function as media [13, 30, 32], a role that has become more apparent and important in recent years [33, 34]. As a medium, a computer can convey either symbolic content (e.g., text, data graphs, icons) or sensory content (e.g., real-time video, simulations, virtual worlds).

Computers can also function as social actors [12, 16, 23]. Users seem to respond to computers as social actors when computer technologies adopt animate characteristics (physical features, emotions, voice communication), play animate roles (coach, pet, assistant, opponent), or follow social rules or dynamics (greetings, apologies, turn taking).

Mapping the functions

One way to view these three functions simultaneously is to map the three categories into two dimensions. Of course, in all but the most extreme cases, these functions mix and blur in any one given interactive technology. However, consciously mapping interactive technologies into a triangular space I call the Functional Triad generates insight into the roles and relationships of different computer technologies. Figure 2 represents the Functional Triad with some prototypical examples.

Persuasion and the functional view of computers

Although the Functional Triad is a useful framework for understanding computer technologies in general, adopting this functional view of computers yields specific insights for analyzing persuasive interactive technologies.


Figure 2: The Functional Triad with examples (corners: Tool, Medium, Social Actor)

By viewing a computer technology as a tool, one can then ask how tools can be persuasive devices. In other words, how do tools change attitudes or behaviors? While this question deserves more detailed exploration, one preliminary answer is that tools can be persuasive by (1) reducing barriers and thereby increasing the likelihood of a certain behavior [2, 11], (2) increasing self-efficacy by making a certain behavior seem achievable [2, 17], (3) providing information that allows informed decisions [19], and (4) shaping a person's mental model by channeling behavior in a certain pattern.

One could also pose similar questions about the other two functional areas for computers: What makes computers as media persuasive? What makes computers as social actors persuasive? [4, 5, 20]. While this paper will not fully answer these questions, Table 2 highlights some persuasive affordances in each of the three functional areas:

Function | Essence | Persuasive affordances
computer as tool or instrument | increases capabilities | reduces barriers (time, effort, cost); increases self-efficacy; provides information for better decision making; changes mental models
computer as medium | provides experiences | provides first-hand learning, insight, visualization, resolve; promotes understanding of cause/effect relationships; motivates through experience, sensation
computer as social actor | creates relationship | establishes social norms; invokes social rules and dynamics; provides social support or sanction

Table 2: Three computer functions and their persuasive affordances

Examples in each functional area

As Table 2 shows, computers functioning as tools, media, or social actors can change attitudes and behaviors through different means. Examples of persuasive computers in each category follow:

Example of computer as persuasive tool

Polar Heart Rate Monitor [www.polar.fi]: This exercise device sounds an alarm when a person's heart rate falls outside a pre-set zone. The device not only can motivate a person to change behavior during exercise, but it may also increase self-efficacy about one's ability to exercise effectively (thus increasing likelihood of adherence to an exercise program).
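The alarm rule described here is simple enough to sketch in a few lines. The following is an illustrative reconstruction under my own assumptions (the zone bounds, function name, and sample readings are invented), not Polar's actual device logic:

```python
# Sketch of the alarm rule above: flag any heart-rate reading that
# falls outside a pre-set training zone. Zone bounds are assumed
# values for illustration; a real device derives them per user.

def out_of_zone(heart_rate_bpm, zone=(120, 160)):
    """True if the heart rate is outside the pre-set zone (low, high)."""
    low, high = zone
    return heart_rate_bpm < low or heart_rate_bpm > high

readings = [110, 125, 150, 170]   # hypothetical readings during a workout
alarms = [bpm for bpm in readings if out_of_zone(bpm)]
print(alarms)  # [110, 170]
```

The persuasive element is not the comparison itself but the immediate feedback it triggers, which is what the paper credits with motivating behavior change during exercise.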

Example of computer as persuasive medium

HIV Roulette [www.exploratorium.edu]: A computerized exhibit at the San Francisco Exploratorium allows visitors to make hypothetical choices about sexual behavior and then vicariously experience how those choices would affect their chances of contracting HIV. This exhibit attempts to motivate people to avoid unsafe sexual behaviors.

Example of computer as persuasive social actor

5-A-Day Adventures [www.dole5aday.com]: This CD-ROM, designed for children, features a character named HB who guides users through the application. HB encourages users to eat more fruits and vegetables, and he praises them for compliance.

Functional view illuminates affordances

The functional view of persuasive computers offers key insights into the different affordances of persuasive computers. Not only does this framework help differentiate among persuasive computers (as well as among different aspects of a single persuasive technology), but this perspective can also provide researchers with guidance for primary and secondary research efforts.

PERSPECTIVE 3: LEVELS OF ANALYSIS FOR CAPTOLOGY

The third perspective I propose for understanding computers as persuasive technologies is one that directs specific attention to various levels of analysis for both computer use and behavior/attitude change.

    Levels of analysis: Not just individuals

Most people think of computer use as it applies to an individual: an individual interacts with or through a computer. The same holds true for notions of persuasion: an individual changes his or her attitudes or behaviors. In both cases, this level of analysis focuses on the individual. Even though both computer use and persuasion are often thought to be most germane to individuals, other levels of analysis besides the individual level can generate important insights in researching and designing computing technologies. In recent years, HCI researchers have done admirable work in conceptualizing computer use on a level of analysis other than the individual [see reference 10 for resources; see also www.acm.org/sigchscw98]. Usually referred to as


computer-supported cooperative work, this area examines computing as it pertains to groups or organizations (group and organizational levels of analysis). It is important for the HCI community to continue defining and exploring the various levels of analysis related to computer use. The fields of public information campaigns and health promotion interventions have also done notable work in conceptualizing different levels of analysis [e.g., see 1]. The levels of analysis from these fields include:

- intraindividual level
- interindividual level (dyads, couples, friends)
- family level
- group level
- organizational level
- community level
- societal level

To be sure, previous HCI work has examined computing at all of the above levels, but our field can benefit from more clearly differentiating these different levels and from deliberately using these levels more often as a driving force in researching or designing computing technologies. HCI and captology have much to gain from viewing computing from these various levels of analysis.

Different levels of analysis cause different variables to be salient, which generates new insights in research or design. The different levels also draw on different theoretical frameworks, methodologies, and measures. For example, persuasion on the individual level is conceptualized and measured differently than persuasion on the community level. Although examining each level as it applies to persuasive computers is beyond the scope of this paper, Table 3 gives some examples to help illustrate this point.

Level of analysis | Computer artifact, application, or system | Behavior change of interest
individual | Baby Think It Over (a computerized doll that requires care, designed to teach teens about the difficulties of parenting [www.btio.com]) | To motivate an individual to be more sexually responsible so he or she doesn't become a teenage parent.
family | Home video game system (a parent may want her family to interact more with each other rather than passively watching TV) | To increase family interactions (possibly measured by number of conversations, frequency of eye contact, etc.).
organizational | Remote work system (a computer system that allows people to work effectively from home) | To reduce absenteeism; to increase employee retention.
community | Ride-sharing system (a community paging network that allows people to coordinate ride sharing with minimal prior planning) | To reduce the community's use of private cars; to reduce traffic congestion and pollution.

Table 3: Levels of analysis, technologies, and behavioral outcomes

The table shows that, at least in principle, certain technologies are best suited for different levels of analysis: some are individual-level technologies and some are community-level technologies. (Admittedly, today's default view is biased toward computers as individual-level devices; however, advances in online technology are making community and societal interactive technologies more common. To be sure, the rise of the Internet has helped shift our focus to larger levels of analysis.) Furthermore, the table implies that behavior change is not just an individual-level variable: families change, organizations change, and communities change.

Although this section of the paper only begins to explore the benefits of conceptualizing persuasive computers at different levels of analysis, the potential benefit to HCI and captology is substantial. Public information campaign researchers have accomplished much by conceptualizing interventions at different levels of analysis [1, 26]. The same should be true for those of us working in HCI-related fields, especially those interested in researching or designing interactive technologies that promote beneficial social changes [27].

PERSPECTIVE 4: THE DESIGN SPACE FOR PERSUASIVE TECHNOLOGIES

The previous three perspectives propose frameworks for better understanding persuasive computing technologies. Although the ideas from the previous sections serve as analytical tools to study existing persuasive computing technologies, these frameworks can also serve a useful generative purpose. In other words, the previous frameworks can help create ideas for new types of persuasive computing technologies. When used for their generative powers, the three previous perspectives contribute to my fourth perspective: the design space for persuasive technologies.

The captology design space is large and relatively unexplored. Because captology presents a new perspective on the role of computers, it is relatively easy to conceptualize new technologies, or to add persuasive enhancements to existing technologies. To approach this design space methodologically, I propose a two-step process: (1) identify domains and issues of interest, and (2) use the above frameworks to gain new perspectives.

Identifying domains and issues

My searches in the academic literature indicate that most existing computer systems which attempt to change attitudes or behaviors fall into just a few categories: health, mental health, and education [e.g., see 31]. My efforts to identify persuasive interactive technologies in today's marketplace also showed health as a central domain. Of course, health and education are still excellent areas for research and design of persuasive technologies, but many other domains remain relatively unexplored. When my students and I set out to identify appropriate domains for captology, we first made an extensive list of pervasive social or personal problems. We then clustered the problems and nested them under broader headings. In this exercise we developed a long (though not exhaustive) list of domains in which persuasive computer technologies may serve


a useful role. Table 4 contains a brief excerpt from our list (see www.captology.org for more domains).

Domain | Issues
safety | safe driving; wearing bike helmets; substance abuse
environment | recycling; conservation; bicycle commuting
personal management | time management; study habits; personal finance

Table 4: A sample of domains for persuasive computers

Using captology frameworks

Once a person selects a domain or issue, the next step in exploring the design space for persuasive computers is to use the captology frameworks described in this paper to generate a variety of perspectives and ideas about the selected domain or issue. For example, one might ask the following questions:

Levels of analysis: What do different levels of analysis offer to this issue or problem? Which level of intervention is likely to be most productive?

Functional Triad: What functional areas will the persuasive technology leverage? Should it focus on computer as tool, medium, social actor, or combinations of the three?

Intentionality: For this issue or domain, what are the possibilities for endogenously, exogenously, or autogenously persuasive interactive technologies?

An example of using these frameworks in design exploration follows.

Example: Recycling paper products

For this example, I have chosen recycling as the issue to address using persuasive technology. Specifically, this example focuses on recycling paper products.

Levels of analysis perspective: One can profitably conceptualize recycling paper at various levels: individual, family, organization, or community. Of these levels, I have chosen the organizational level for this example. The goal, then, is to explore the possibilities for creating a persuasive technology for increasing paper recycling behavior at the organizational level. (At this point one might then study various issues for more insight: organizational culture, barriers to recycling, etc.)

Functional Triad perspective: One would also explore how the three different functions of computers (tools, media, social actors) might profitably motivate paper recycling behavior. Even on the organizational level, many ideas are possible. For this example, one might propose the following: A computer acting as a tool could weigh the recycled paper in the organization and calculate the number of trees saved from recycling efforts. Other types of calculations are also possible (trees saved per week, oxygen produced by the saved trees, etc.). Knowing this information might motivate people in the organization to recycle their paper products more.

The computer acting as a medium could then render an image of a tree growing as paper is added to the recycling container. (The image might be on the recycling container itself.) Seeing the tree grow in size might provide motivation to recycle because the image helps people visualize a cause/effect relationship in recycling.

The computer as social actor could audibly thank people each time they added paper to the recycling container. A more extreme idea is to have an anthropomorphic tree that smiles or tells humorous stories in response to recycling efforts.

Intentionality perspective: The recycling technology described above would inherit both endogenous persuasive intent (from the designers of the device) and exogenous persuasive intent (from the management team that provides the technology for the rest of the organization). To further explore the design space, one might profitably ask, "How might an autogenously persuasive technology function in this organizational setting?" Because this design example focuses on the organizational level, autogenous intent would imply that the organization as a whole (not just the management) would choose to adopt a technology to change their collective recycling behavior.

Design explorations generate insight

Exploring the design space for persuasive computers is often an enlightening process. Especially if one has no intentions of turning ideas into marketable products, one can push the envelope of the design space to illuminate new theoretical areas of persuasive computing. As the above example shows, the two-step method of (1) identifying domains/issues and then (2) applying the captology frameworks is a simple process that yields rich and often surprising results.
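One way to see how large this design space is: crossing a single issue with the levels of analysis, the Functional Triad, and the three intent types already yields dozens of candidate design cells. The sketch below is my own mechanization of the two-step method; the three lists follow the paper's frameworks, but the function name and output format are assumptions:

```python
# Sketch: enumerating the captology design space for one issue by
# crossing the paper's three frameworks. The function and output
# format are illustrative assumptions.
from itertools import product

LEVELS = ["intraindividual", "interindividual", "family", "group",
          "organizational", "community", "societal"]
FUNCTIONS = ["tool", "medium", "social actor"]       # the Functional Triad
INTENTS = ["endogenous", "exogenous", "autogenous"]  # inherited intent

def design_cells(issue):
    """Yield every (issue, level, function, intent) combination."""
    for level, function, intent in product(LEVELS, FUNCTIONS, INTENTS):
        yield (issue, level, function, intent)

cells = list(design_cells("recycling paper products"))
print(len(cells))   # 7 levels x 3 functions x 3 intents = 63
print(cells[0])     # ('recycling paper products', 'intraindividual', 'tool', 'endogenous')
```

Most of the 63 cells will be dead ends, which is the point: the enumeration is a checklist for brainstorming, not a generator of finished designs.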

PERSPECTIVE 5: ETHICS OF COMPUTERS THAT PERSUADE

Ethics is yet another perspective from which to view computers as persuasive technologies. Adopting an ethical perspective on this domain is vital because the topic of computers and the topic of persuasion both raise important issues about ethics and values [7, 15, 36]. Therefore, when the domains of computers and persuasion merge (as they do in captology), the ethical issues are sure to play a crucial role. This section presents examples of some ethically questionable technologies, proposes ethical responsibilities for designers of persuasive computers and captology researchers, and discusses the importance of educating about persuasion.

Persuasive technologies that raise ethical questions

Most examples of persuasive computers in this paper are technologies that promote widely held conceptions of "the good" [23]: a computerized doll motivates responsible sexual behavior, a CD-ROM encourages eating fruits and vegetables, a ride-sharing technology cuts down on pollution in a city. However, persuasive technologies can also serve ignoble purposes. Table 5 contains examples of persuasive computing that may be ethically questionable, along with a brief gain/loss analysis for stakeholders (which is, admittedly, highly subjective).


Summary of persuasive technology in question | Stakeholder analysis
A computerized slot machine uses animation and narration to make the gambling experience more compelling. | gain: manufacturer and casino; loss: individuals (money, time)
A computer system records employees' activities on the Web. | gain: company; loss: employees (privacy, trust)
A computer system monitors restaurant employees' hand washing behavior after using the restroom. | gain: restaurant patrons; loss: employees (privacy, trust)
A software installation seems to require registration with the company in order to complete the installation. The software automatically dials the company to download personal information. | gain: company; loss: individual users (personal information)

Table 5: A simple stakeholder analysis of some persuasive technologies

I believe that the simple gain/loss analysis in Table 5 helps show why the above technologies could be ethically questionable. In most cases above, companies stand to gain profit or information, while individuals stand to lose money, privacy, or freedom. In contrast, a simple gain/loss analysis for the other persuasive computers in this paper would likely show gains for all stakeholders, including individuals.
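The pattern just described (companies gain while individuals lose) can be written down as a crude flagging rule. The sketch below is my own hypothetical data model of that rule, not a formal method from the paper, and the stakeholder labels are illustrative:

```python
# Sketch: a gain/loss stakeholder analysis in the spirit of Table 5.
# The data model, flagging rule, and labels are my illustration of the
# paper's (admittedly subjective) analysis, not a formal method.

COMPANY_SIDE = ("company", "manufacturer", "casino")
INDIVIDUAL_SIDE = ("individuals", "individual users", "employees")

def ethically_questionable(gains, losses):
    """Flag a technology when company-side stakeholders stand to gain
    while individual users stand to lose (money, privacy, freedom)."""
    companies_gain = any(s in COMPANY_SIDE for s, _ in gains)
    individuals_lose = any(s in INDIVIDUAL_SIDE for s, _ in losses)
    return companies_gain and individuals_lose

# The slot-machine row of Table 5:
slot_machine_gains = [("manufacturer", "profit"), ("casino", "profit")]
slot_machine_losses = [("individuals", "money"), ("individuals", "time")]
print(ethically_questionable(slot_machine_gains, slot_machine_losses))  # True
```

As the hand-washing monitor in Table 5 shows, the rule is deliberately crude: a technology can burden employees while benefiting patrons rather than a company, so any such flag is a prompt for human judgment, not a verdict.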

Ethics of distributing or creating computers to change attitudes or behaviors

The ethical implications for those who design persuasive technologies are similar to the ethical implications for other persuaders in society (e.g., salespeople, coaches, counselors, religious leaders, etc.) [15]. Because values vary widely, no single ethical system or set of guidelines will serve in all cases, so the key for those designing persuasive interactive technologies is gaining a sensitivity to the ethical implications of their efforts. High-tech persuaders would do well to base their designs on a defensible ethical standard. At the very least, a few core values should apply to all persuasive computing designs, such as avoiding deception, respecting individual privacy, and enhancing personal freedom.

    Ethics of studying persuasive computers

The ethical implications for those who study persuasive computers are somewhat different from those for designers. I propose that those who study persuasive technologies have a responsibility to play a watchdog role for the HCI community, in particular, and for technology users, in general. Ethical actions for those who study persuasive computers fall into four categories:

1. Identify artifacts and techniques. By using the various frameworks suggested in this paper, a researcher can better identify persuasive technologies, as well as the persuasive strategies a technology uses.

2. Examine effectiveness and effects. Researchers should also assess the effectiveness (the intended impact) and the effects (the unintended side effects) of persuasive technologies or strategies.

3. Disclose findings. Those who study persuasive computers then have an obligation to disclose their findings.

4. If needed, take or advocate social action. Finally, if a computing artifact is deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.

    Education is the key

I propose that the best approach regarding the ethics of computers and persuasion is to educate widely about this new area of research and design. Education about captology helps people in two important ways: First, increased knowledge about persuasive computers allows people more opportunities to adopt such technologies to enhance their own lives, if they choose. Second, knowledge about persuasive computers helps people recognize when technologies are using tactics to persuade them. In the end, the best path for captology is the same as the path taken by various persuasion researchers [24, 36]: Educate widely about persuasion. This paper represents one effort to bring such issues to light.

    RESEARCH DIRECTIONS FOR CAPTOLOGY

Because captology is a relatively new field of inquiry, many questions about computers and persuasion remain unanswered. To help move the field forward, I propose seven directions for research and design that will yield the greatest understanding of persuasive computers in the shortest amount of time.

    Seven profitable directions for captology

    Direction A: Captology should focus on interactivetechnologies that change behaviors.

As stated earlier, my definition of persuasion is "an attempt to shape, reinforce, or change behaviors, feelings, or thoughts about an issue, object, or action." Although this is a good definition, it is too broad to be useful in focusing research and design in the early stages of captology. Therefore, at this point people involved in captology would do well to pursue behavior change as the test and metric for persuasive computers.

Behavior change is a more compelling metric than attitude change for at least three reasons: (1) behavior change is thought to be more difficult to achieve than attitude change [15, 36], (2) behavior change is more useful to people concerned with real-world outcomes [9, 31, 36], and (3) one can measure behavior change without relying on self-reports [9] (attitude change measures hinge on self-reports).

Direction B: Captology should follow the well-established CHI tradition of adopting and adapting theories and frameworks from other fields.

While captology has the potential to generate new theories and frameworks, as demonstrated to some degree in this paper, those of us who study computers as persuasive technologies would do well to find extant persuasion theories and frameworks and investigate how they apply to captology. For example, Aristotle certainly did not have computers in mind


when he wrote about the art of persuasion, but the ancient field of rhetoric can apply to captology in interesting ways. The field of psychology, both cognitive and social, has a tradition of examining different types of persuasion and influence. The theories and methods from psychology transfer well to captology, enriching the field. In addition, the field of communication has a history of examining the persuasive effects of media and other types of message sources. Specifically, the applied domain of public information campaigns has a set of theories and practices that give insight into the study of persuasive computers.

    Direction C: Captology should examine and inform the design of interactive technologies that are specialized, distributed, or embedded.
    While the typical computer of today lives on top of a desk and requires a keyboard and a monitor, people involved in captology would do well to focus mainly on technologies that are specialized, distributed, or embedded [for more on the concept of ubiquitous computing, see references 3 and 34]. From my vantage point, the most interesting interactive technologies today seem to fall in at least one of these three categories. And it seems that most persuasive technologies of the future will not be associated with desktop computers; they will be specialized, distributed, or embedded. If this is true, then it would be a relatively poor use of time to examine existing desktop applications or to design persuasive programs solely for desktop computers. Persuasive situations occur most frequently in the context of normal life activities, not when people are seated at a desktop computer.

    Direction D: Captology should focus on endogenously persuasive technologies.
    I've proposed three types of persuasive technologies: endogenously persuasive (those designed with persuasive goals in mind), exogenously persuasive (those used to persuade another person), and autogenously persuasive (those used to persuade oneself). Understanding endogenously persuasive technologies seems more essential to understanding captology than the other two types. According to my definition, endogenously persuasive technologies were created with an intent to change attitudes or behaviors. As a result, the strategies and techniques to persuade are embedded in the technology itself. In contrast, the other two types of persuasive technologies, exogenous and autogenous, rely heavily on external factors for their persuasive power.
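To illustrate the distinction with a hypothetical example (the application, goal values, and messages below are invented for illustration, not taken from the paper), an endogenously persuasive artifact carries its persuasive strategy, here goal-setting with tailored feedback, in its own logic rather than relying on an outside persuader:

```python
# Minimal sketch of an endogenously persuasive design: the persuasive
# strategy (goal-setting plus tailored feedback) is built into the
# artifact itself. All names and messages here are illustrative.

def feedback(steps_today: int, goal: int) -> str:
    """Return an encouraging message chosen by the program itself;
    the intent to persuade lives in this code, not in an external actor."""
    if steps_today >= goal:
        return "Goal met! Consider raising tomorrow's goal."
    if steps_today / goal >= 0.75:
        return "Almost there: a short walk would close the gap."
    return "Still early in the day. A ten-minute walk earns quick progress."

print(feedback(8200, 8000))   # at or above goal
print(feedback(6500, 8000))   # within reach of goal
```

By contrast, a plain step counter used by a physician to persuade a patient would be exogenously persuasive: the same display code, but the persuasive intent supplied by a person outside the artifact.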

    Direction E: Captology can learn from other media but should steer clear of comparing computer persuasion with persuasion from other media.
    Although captology has much to gain from other media research (such as effects of video, print, radio, etc.), comparing computer persuasion with persuasion from other sources seems to be a dead end. Many studies have attempted to do this, and virtually all of them contain serious weaknesses (for a longer discussion of this claim, see [14]). Although a researcher can clearly determine that computer program X is more persuasive than video Y or pamphlet Z, these results hold only for artifacts X, Y, and Z, not for comparing computers, videos, and pamphlets in general. One problem is that too many variables are at play in cross-media studies; as a result, no useful theory comes from this type of research [21]. In order to avoid this well-traveled dead-end road, those involved in captology research should compare the persuasive effects of one aspect of computing technology with other aspects of computing technology.

    Direction F: Captology should focus on both "what is" and "what could be."
    Captology should focus both on existing persuasive technologies and on potential persuasive technologies. A fairly good number of persuasive technologies already exist. Research into the impact, uses, and implications of these technologies can yield valuable insight for the field. But captology also has a strong generative component. Because this field provides a new lens for viewing computing technology, captology generates insights about novel possibilities for computers, as well as new ways to bring about attitude and behavior changes.

    Direction G: Captology should be pursued with ethical issues and implications in mind.
    The study of computers as persuasive technologies raises important ethical questions. This is understandable and inescapable, given the fact that persuasion has long been an area of ethical debate, and that computer technology has raised recent ethical questions. As with any other means of persuasion, one could compromise values and ethics intentionally or unintentionally. Therefore, those who study persuasive technologies should have a sound understanding of the ethical implications of this field. A subfield on the ethics of persuasive computing is a worthy and important endeavor.

    SUMMARY

    This paper has defined captology and has articulated five different perspectives for studying computers as persuasive technologies. This paper also proposed seven directions for future research. While these perspectives and research directions may be superseded by better ideas and new perspectives, this paper lays the groundwork for taking significant steps forward in understanding this new field.

    ACKNOWLEDGMENTS

    Cliff Nass, Byron Reeves, Terry Winograd, and Phil Zimbardo have been instrumental in my work on persuasive computing.

    REFERENCES

    [1] Atkins, C., & Wallace, L. Mass Communication and Public Health. Newbury Park, CA: Sage, 1990.
    [2] Bandura, A. Social Foundations of Thought and Action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall, 1986.
    [3] Buxton, W. Living in Augmented Reality: Ubiquitous media and reactive environments. Web URL: www.dgp.utoronto.ca/OTP/papers/bill.buxton/augmentedReality.html


    [4] Fogg, B.J., & Nass, C.I. How Users Reciprocate to Computers: An experiment that demonstrates behavior change. In Extended Abstracts of CHI 97, ACM Press, pp. 331-332.
    [5] Fogg, B.J., & Nass, C.I. Silicon Sycophants: The effects of computers that flatter. International Journal of Human-Computer Studies, 1997, 46, 551-561.
    [6] Fogg, B.J. Captology: The Study of Computers as Persuasive Technologies. In Extended Abstracts of CHI 97, ACM Press, p. 129.
    [7] Friedman, B. Human Values and the Design of Computer Technology. New York: Cambridge University Press, 1997.
    [8] Friedman, B., & Kahn, P. Human Agency and Responsible Computing: Implications for Computer Systems Design. Journal of Systems Software, 1992, 17, 7-14.
    [9] Graeff, J., Elder, J., & Booth, E. Communication for Health and Behavior Change. San Francisco, CA: Jossey-Bass, 1993.
    [10] Honeycutt, L. Resources on Computer-Supported Cooperative Work. Web URL: http://www.dcr.rpi.edu/cscw.html
    [11] Huddy, C., Herbert, J., Hyner, G., & Johnson, R. Facilitating changes in exercise behavior. Psychological Reports, 1995, 76(3), 867-875.
    [12] Kay, A. Computer software. Scientific American, 1984, 251, 53-59.
    [13] Kozma, R. Learning with media. Review of Educational Research, 1991, 61, 179-211.
    [14] Kuomi, J. Media comparison and deployment: A practitioner's view. British Journal of Educational Technology, 1994, 25(1), 41-57.
    [15] Larson, C. Persuasion: Reception and responsibility, 7th ed. Belmont, CA: Wadsworth Publishing, 1995.
    [16] Laurel, B. The Art of Human-Computer Interface Design. Reading, MA: Addison-Wesley, 1990.
    [17] Lieberman, D. Interactive Video Games for Health Promotion. In R. Street, W. Gold, & T. Manning (eds.), Health Promotion and Interactive Technology. Mahwah, NJ: Lawrence Erlbaum, 1997.
    [18] Marcus, A. Human communication issues in advanced UIs. Communications of the ACM, 1993, 36(4), 101-109.
    [19] Mutschler, E. Computer assisted decision making. Computers in Human Services, 1990, 6(4), 231-250.
    [20] Nass, C.I., Fogg, B.J., & Moon, Y. Can computers be teammates? International Journal of Human-Computer Studies, 1996, 45, 669-678.

    [21] Nass, C.I., & Mason, L. On the Study of Technology and Task: A variable-based approach. In J. Fulk & C. Steinfield (eds.), Organizations and Communication Technology, 46-67. Newbury Park, CA: Sage, 1990.
    [22] Patton, P. User interface design: Making metaphors. I.D. Magazine, April 1993, 62-65.
    [23] Rawls, J. A Theory of Justice. Cambridge, MA: Harvard University Press, 1971.
    [24] Reardon, K. Persuasion in Practice. Newbury Park, CA: Sage, 1991.
    [25] Reeves, B., & Nass, C. The Media Equation: How people treat computers, television, and new media like real people and places. New York: Cambridge University Press, 1996.
    [26] Rice, R., & Atkin, C. Public Communication Campaigns. Newbury Park, CA: Sage, 1989.
    [27] Salmon, C. Information Campaigns: Balancing social values and social change. Newbury Park, CA: Sage Publications, 1989.
    [28] Shneiderman, B. Designing the User Interface. New York: Addison-Wesley, 1987.
    [29] Sheridan, T. Telerobotics, Automation, and Human Supervisory Control. Cambridge, MA: MIT Press, 1992.
    [30] Steuer, J. Defining virtual reality: Dimensions determining telepresence. Journal of Communication, 1992, 42(4).
    [31] Street, R., Gold, W., & Manning, T. Health Promotion and Interactive Technology. Mahwah, NJ: Lawrence Erlbaum, 1997.
    [32] Turkle, S. Who am we? Wired Magazine, January 1996, 148-202.
    [33] Verplank, B., Fulton, J., Black, A., & Moggridge, B. Observation and Invention: Use of Scenarios in Interaction Design. Tutorial at INTERCHI 93, Amsterdam, 1993.
    [34] Weiser, M. The computer for the 21st century. Scientific American, 1991, 265(3), 94-104.
    [35] Winograd, T. The design of interaction. In P. Denning & R. Metcalfe (eds.), Beyond Calculation: The next fifty years of computing (pp. 149-161). New York: Springer-Verlag, 1997.
    [36] Zimbardo, P., & Leippe, M. The Psychology of Attitude Change and Social Influence. New York: McGraw-Hill, 1991.