Beyond User-Centered Design and User Experience: Designing for User Performance


Preprint of article appearing in Cutter IT Journal, 17(2), February 2004. © 2004, L. L. Constantine


Larry L. Constantine, IDSA
Chief Scientist, Constantine & Lockwood, Ltd.

Director, Laboratory for Usage-centered Software Engineering
University of Madeira, Funchal, Portugal

User-centered design is everywhere in the IT world. Just as it was once fashionable to tout "user friendly" interfaces, these days nearly everyone has jumped on the user-centered bandwagon or is running to catch up. The bandwagon is a roomy one, and user-centered design can be almost anything in practice so long as it adheres to the core philosophy of keeping users at the center of the software development process. This focus on users as the central subject certainly seems to be a step forward from the technology-centered focus of bygone days, when users were all too often regarded as an annoyance to be ignored as much as possible. However, the frustrations of everyday experience with even the best of modern software products and Web sites tell us that something is still badly wrong with the picture. You need only reflect on how many times a day you click on the wrong object or miss a step in a sequence or forget where a function is buried or curse the way some feature works to recognize how much modern user interfaces fall short of their potential.

Putting users in the center of the picture and using techniques that focus on them and their experience may look to be reasonable, decent, and proper things to do, but despite the good intentions and noble efforts of designers, progress in usability remains unremarkable. Instead of getting breakthroughs in performance and leaps forward in what can be accomplished with computers, users are often left with a level of tolerable mediocrity marked by missing functions, frequent and irrelevant interruptions by modal messages that belabor the obvious, and multi-click detours to complete the most mundane of tasks.

Breakthroughs in usability are possible, however. Consider these examples.

• In a debriefing at the end of the first day of live use of a newly designed medical records application, a nursing professional was moved almost to tears because, despite a new system and only the briefest training, she found she was already getting more time with her patients.

• After seeing a new classroom information system, a veteran teacher declared that for the first time he felt that designers had understood what classroom teachers really do and what they needed from a computer system.


• Using a radically redesigned integrated development environment, programmers of automation applications were able to complete a typical mix of tasks in half as many steps as previously.

To produce such outstanding results consistently may require that designers and developers radically rethink how they approach the process of visual and interaction design. It may require them to consider the unthinkable: that there could be something wrong with user-centered design and its preoccupation with users and user experience.

Among usability professionals, user-centered design is so established that even to hint at problems in its premises or practices is regarded as sacrilege. As a co-inventor of usage-centered design (Constantine and Lockwood, 1999; Constantine and Lockwood, 2002a), I have more than once been the target of such accusations. After one session at a SIGCHI conference, an audience member suggested on an evaluation form that I never be allowed to speak at the conference again because I had questioned some of the received wisdom regarding the role of user studies and usability testing in user-centered design!

    User-Centered Approaches

Fully forewarned that I may be treading the path to design apostasy, I want to explore what user-centered design gets right and where it goes wrong, and to suggest some ways to fix what is wrong with the process. Where user-centered design gets it right is the easy part. Involving end users and learning about their real needs is a good idea, no two ways about it. Spending time upfront to understand user requirements is an absolute prerequisite for sound design practice, irrespective of your approach or philosophy. Absent the goad of a user-centered approach, many projects would plunge too quickly into software design and construction. The result is the illusion of progress ("We're in the first week and we're already coding!") purchased at the price of premature commitment to particular solutions that invariably compromise utility and usability. ("Too late to fix that, it's already hard-coded.")

To understand what might be wrong with user-centered design and what needs to be done about it, we first need to understand better just what user-centered design is. Beyond its requisite focus on users, user-centered design gets a bit fuzzy. Crisp textbook definitions aside, user-centered design in practice is a rather cluttered collection of loosely related techniques and approaches having in common little more than a shared focus on users, user input, and user involvement. While it may be different things in the hands of different practitioners, at its core, user-centered design is distinguished by a few common practices: user studies, user feedback, and user testing.

Through various techniques and tools ranging widely in formality and sophistication, user-centered design seeks to understand users as thoroughly as practical. Initial user studies provide the essential input for iterative prototyping driven by user feedback, which is followed by user testing and, one hopes, further refinement of the product. That's it. Partisan adherents of particular variants of user-centered design may argue that this characterization omits some cherished technique, tool, or activity central to their preferred approach, but this admittedly oversimplified view highlights what user-centered design is really about and where it goes wrong. Let's start with design.

    Design in User-Centered Design

A skeptical analysis might conclude that none of its core practices (user studies, user feedback, and user testing) really have very much to do with design itself. Despite its name, there is not much design in user-centered design. Indeed, books on user-centered design often have much to say about users, user studies, human perception and cognition, human-machine interaction, user interface standards and guidelines, and usability testing but relatively little to say about design or design processes.

The dirty secret that few advocates and practitioners will admit to is that user-centered design in practice is largely trial-and-error design. Stripped of semi-scientific rhetoric and professional self-justification, most user-centered design involves little more than good guessing followed by repeated corrections and adjustments guided by checking with users. Once the mandatory user studies are out of the way, a potentially workable solution is quickly sketched as a paper prototype. Little has been written about how this initial idea is conceived, and few designers can articulate the mental legerdemain involved in its creation, but once you have something, you put it in front of one or more users to find out what is wrong with it.

Basically, iterative refinement based on paper prototyping relies on users to tell the designers what is wrong and how to get it right. Done well, it certainly helps users to feel involved and empowered. It can also be reassuring to designers, particularly if they are unsure about their own guesses or lack complete confidence in their design skills. After all, the end product is the result of real feedback from real users. ("Well, we did the right thing and got user feedback even if the result didn't work.") For similar reasons, repeated refinement through iterative prototyping is reassuring to clients and management. They get to see early evidence of apparent progress; it may not be real code but at least they get screen mockups. Furthermore, designers can defend the ultimately delivered design as being based on real data, real information from real users.

So, what's wrong with iterative prototyping with user feedback? Here are the basic flaws and failings.

• It contributes to the illusion of progress.
• It encourages premature preoccupation with detail.
• It discourages courage.
• It relies excessively on users.

As already hinted at, the flurry of paper prototypes can contribute to the illusion of progress. Decisions are being made and design artifacts are being generated, but this does not mean that real progress is being made toward a first-rate solution. For one thing, at the stage that realistic or representative paper prototypes are typically produced (prototypes that are recognizable and make sense to users), it is usually too early in the game to be worrying about what the screens will look like and what functions they will present. The early involvement of users and the need for them to be able to interpret and react to the prototype forces premature investment in realism and the details of the design.

Early paper prototyping encourages hurried decisions about details at the level of individual screens and widgets without first considering what screens in what arrangement and supporting what functions best support user needs. Designers are frequently seduced away from the more abstract and often less exciting work of first mapping out a sound overall architecture for the user interface. When paper prototypes are constructed early, often ahead of or even instead of a full analysis of user requirements, detailed decisions can rely too heavily on unverified assumptions.

Unlike a properly behaved iterative computer algorithm, the cycle of feedback and change in iterative prototyping does not necessarily converge toward a better solution. Users change their minds, and different users will have different views and perspectives to offer. I have seen designs oscillate between solutions as one user rejects an approach and lobbies for its alternative, followed by another user whose feedback is the reverse. Any number of times I have seen designers reinvent discarded design ideas, blithely oblivious to the fact that they are going around in circles. Because it depends so heavily on the comments and contribution of users, iterative prototyping is vulnerable to the peculiarities of particular users. If the same user or users are involved repeatedly, the design can end up excessively tailored to their biases. Alternatively, varying the users from round to round can lead the design to jump from variation to variation without significant improvement. If early prototypes are also shown to clients to demonstrate progress or garner support, the design becomes vulnerable to the whims and biases of vocal or powerful influences. In one case, an egregiously bad Web design nearly went into production because an early design concept had been favored and championed by the company president.

Perhaps most damning and least recognized among the limitations of user-centered design is the way it subtly discourages courage. Courage is one of the central tenets of extreme programming and agile development methods (Beck, 2001). Cooper (2003) advocates courageous programming that is decisive in response to user actions instead of saddling users with a plethora of irrelevant alerts and ineffective confirmation messages that reflect hesitant design.

User-centered design, however, makes it too easy for designers to abdicate responsibility in deference to user preference, user opinion, and user bias. In truth, it is hard to stick with something you know works when users are screwing up their faces at it. What if you are wrong? What if you are not as good a designer as you thought you were? It takes real courage and conviction to stand up for an innovative design in the face of users who complain that it is not what they expected or who want it to work just like some other software or who object to certain sorts of features as a matter of course. It takes responsible judgment to know when to listen to users and when to ignore them.

Designers and developers need to keep in mind that anything genuinely new and improved entails risk, and real progress invariably provokes resistance. Most users are inherently conservative when it comes to user interface design. Many would prefer a bad but familiar design to a better but unfamiliar one. That so many systems have been so difficult to learn and so nearly impossible to master makes users all the more reluctant to take on the burden of learning to use anything new. Assigning too much weight to user input, as user-centered design is prone to do, helps sustain the tyranny of legacy systems and legacy users. Without the overthrow of this tyranny, established but ineffective designs are perpetuated and progress in usability is impeded.

I have seen designers back off from clever and effective solutions because of a single round of user feedback. All too often they settle for something not only less creative but also less effective. This is not to imply that user feedback is a bad idea or that users should be ignored altogether. In the absence of feedback from the real world, designers are left in a vacuum where design fantasies can send them spinning off into outer space. User feedback helps designers avoid really stupid mistakes or really bad design, but it also tends to put the brakes on creativity. Taken together, putting users at the center and making user feedback pivotal in a process of early prototyping with paper designs that are intended to resemble or represent real interfaces tends to favor solutions that are acceptable but uninspired at best. User-centered design thus can hamper real progress.

    Studying Users

User feedback on designs may be the pivotal process in user-centered design, but it is not the whole story. Although some designers on some projects may plunge into paper prototyping on day one, the tenets of user-centered design call for user studies to kick off the process.

Here, too, making users themselves the primary focus is one of the problems with user-centered design in practice. As with user feedback on designs, user studies are a good idea. You cannot do a good design without knowing something about your users. The question is a matter of just what you need to know. Field techniques are varied and may incorporate many things: structured and unstructured observations, formal and informal interviews, surveys and questionnaires, analysis of artifacts, ethnographic investigation, requirements gathering, and the like. Unfortunately, thoroughgoing user studies can generate an overwhelming volume of data of many kinds, all of which must be organized, digested, and understood. In the interest of deep understanding, more information is assumed to be better. However, in the midst of such bounty, it can be all too easy for key information to be missed or lost. Ultimately, only some of the findings and conclusions based on some portion of the gathered data will be relevant for design of an effective user interface. The dilemma is to figure out what to focus on and what to ignore.

User-centered approaches typically represent users in suitably user-centered ways, through detailed profiles of user types or as recognizable stereotypes. Personas (Cooper, 2003) are a popular form for representing the results from user studies. A persona comprises a realistic and believable constructed description of a single archetypal user. Typically, personas incorporate considerable detail (background, personality, personal history and experience, attitudes, or habits) that makes them seem more real and understandable but can become a gratuitous distraction in the context of creating a well-designed user interface.

From the standpoint of effective visual and interaction design, a few questions stand out as most important regarding users. What are they trying to accomplish, and what do they need from the system in order to accomplish it? What is or will be their relationship with the system? Other things about users (their personality, socio-economic status, personal preferences, work environment, and so forth) may be interesting and of some relevance, but are of distinctly lesser importance.

To the extent that user-centered approaches do attend to the real work of users and what it is really about, they tend to do so in decidedly user-centered ways that depict hypothetical but realistic users and make them central characters in a story that is easily understood by users. They employ realistic scenarios, customer stories, user narratives, and even movie-style storyboards that seem authentic and are accessible to users. Unfortunately, while these may appeal to and serve some of the interests of users, stories in their several forms may have some disadvantages for designers. In the interest of verisimilitude, scenarios are often fleshed out with superfluous detail that enhances their appeal but can also obscure essentials. Scenarios often incorporate detours into exceptional or unusual cases in order to be more comprehensive in describing work, but this practice can also give undue attention to such uncommon or special cases.

What is wrong with user studies as part of user-centered design is not that they do not deliver but that they are prone to delivering too much and to casting it in the wrong form. Focusing on users broadly necessarily entails not focusing more narrowly or sharply. The most critically important information is too easily missed amidst mountains of data or obscured by fascinating but less-important findings. Moreover, it costs extra to gather a surplus of information and takes extra time to analyze it and extract the crucial bits. Indeed, the most commonly heard management objection to upfront field investigation is that it costs too much and delays the start of the real work of design and construction. Arguments that building the wrong system is even more costly and time consuming usually fall on deaf ears. Consequently, user studies are often rushed or shortchanged for lack of time and resources. However useful they may be, ambitious and thorough ethnographic methods, such as contextual inquiry (Beyer and Holtzblatt, 1998), are apt to be regarded as luxuries that are beyond the budget of many projects.

Like user feedback, user studies may offer little to help designers to distinguish user wants from user needs. Indeed, it is common in both contexts to ask users what they want or would like to see. If the interviewer or designer then asks whether users really need something, the answer tends to be yes, regardless of actual importance or demonstrable impact. In addressing actual users and actual situations, field techniques also tend to concentrate more on how things are done than on why. Why something is done the way it is or why it is done at all gets closer to the core of what users truly need. Good questions from a good interviewer or evaluator certainly help, but user studies in themselves do not distill genuine needs from the mish-mash of user wishes and fantasies and the mess of current practice.

    User Testing

Raising questions about the practice and value of user feedback and user studies may border on sacrilege, but even to look skeptically askance at user testing is tantamount to professional heresy. User testing is the absolute centerpiece of usability efforts in many organizations. The good part, once again, is the easy part. User testing is useful. In many cases it can uncover subtle but serious usability problems that are apt to be missed by designers as well as expert evaluators. Some formal usability testing with real or representative users is always a good idea. However, it is not a good idea to put too many eggs in the user testing basket and depend on it as the primary means for improving usability.

The problems with usability testing are the problems with testing of any form. Testing comes too late. By the time software is available for live testing with users, it is typically too late to change many things. Relative to some important alternatives, like usability inspections, usability testing is relatively expensive for the information it yields. Then there is the coverage issue. It is impossible to do enough tests with enough scenarios to thoroughly exercise all the interaction paths in any interestingly complex system.

Perhaps the biggest problem with usability testing is that it reinforces some of the problems with user studies and user-driven design. Particularly as currently practiced under the influence of "discount usability" (Nielsen, 1993), with only a small number of user subjects, user testing can make the experiences of particular users, who may or may not be representative, unduly important in shaping the outcome. While it can be quite effective at uncovering localized design flaws and suggesting directions for correction, it is less effective at exposing problems in the fundamental organization of the user interface, even when this is actually the root cause of user difficulties.

User testing as typically carried out can also tend to favor pedestrian solutions over innovation. Nearly all user testing is based on a single encounter with the system by any one user. Since user subjects are seldom trained on the system and rarely given opportunities to practice or build skills, standard solutions usually fare better, even if a non-standard design might ultimately lead to greater productivity or fewer errors. Testers also tend to interpret user hesitation or uncertainty as indicative of usability problems. In the widely used think-aloud test protocol, if a user says "What's that?" or "I'm not sure," it can count against a design feature, even if the user performs well or would have no difficulty on a second attempt. Because usability tests are focused more on user experience than user performance, they seldom if ever look at a second or third encounter with a given part of the user interface.

This shortcoming is particularly significant in light of what has been learned about making user interfaces self-teaching through so-called instructive interaction (Constantine and Lockwood, 2002b). Observation of users interacting with novel user interfaces has shown that certain design practices can enable users to learn and remember how to use a completely unconventional user interface after only a single trial. However, designs based on instructive interaction may not fare well in conventional usability testing even in cases where they work extremely well in actual use.

    From User Experience to User Performance

It would be unfair to point out flaws and limitations without suggesting some potential fixes. Here is a series of well-defined practices that are readily incorporated into what most practitioners regard as user-centered design and that can substantially improve the process and its results. Some may already be in the repertoire of forward-thinking user-centered designers. All are relatively easy to learn.


• Exploratory modeling to make user studies more efficient and tightly focused.
• Comprehensive task modeling to capture and represent essential user needs.
• Deriving initial designs from models rather than by magic.
• Abstract prototyping to defer introduction of realistic details.
• Usability inspections to multiply the effectiveness of user testing.
• Elevating user performance over user experience.

    Model-driven exploration.

Exploratory modeling (Windl, 2002a) is a technique for speeding up and simplifying the process of user study and requirements definition. Where common ethnographic approaches to user study begin with observation and data gathering followed by building models based on the data, exploratory modeling reverses the order, beginning with preliminary modeling on which to base subsequent user study. Provisional models of users and user tasks are first constructed in order to help identify holes in understanding, formulate questions, and to highlight areas of ambiguity and uncertainty or outright confusion. These admittedly incomplete models are used to clarify priorities and guide the investigation and observation by sharpening the focus onto key areas. Provisional models are then refined based on findings from a more rapid and pointed data gathering process. As originally proposed and most commonly practiced, exploratory modeling constructs condensed and simplified inventories of essential needs: the roles that users will play in relation to the planned system and the tasks that must be supported in order for users to successfully perform those roles.

Instead of a protracted investigation producing a potentially overwhelming surplus of data, model-driven exploration quickly and efficiently delivers answers to the most important questions. In principle, model-driven exploration entails a risk that modelers will not realize where information is missing or will not recognize what is unknown or misunderstood, but in practice this does not seem to be a problem. In any event, it is almost invariably cheaper and easier to go back and fill in a few blanks missed on a tightly focused initial inquiry than to gather great quantities of superfluous data that must nevertheless be processed and understood before it can be discarded.
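To make the idea concrete, here is a minimal sketch of what a provisional model might look like before any field work begins. The role, the tasks, and the open questions are purely hypothetical illustrations invented for this sketch, not drawn from any actual project.

    # A provisional model for exploratory modeling: a rough inventory of
    # a user role and the tasks it needs supported, with open questions
    # flagged so that field inquiry can be aimed squarely at them.
    # All names and questions below are hypothetical examples.

    provisional_model = {
        "classroom teacher": {
            "tasks": [
                "taking attendance",
                "recording grades",
                "reviewing a student's history",
            ],
            "open_questions": [
                "Is attendance taken while students are present or afterward?",
                "Which tasks must be possible from the front of the room?",
            ],
        },
    }

    # Field observation then targets the open questions; the answers refine
    # the model rather than adding to an undigested pile of raw data.
    for role, model in provisional_model.items():
        for question in model["open_questions"]:
            print(f"{role}: {question}")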

    Complete task modeling.

Of all the things that visual and interaction designers and other usability professionals need to understand about users, none is more important than their intentions. What are the tasks that users intend to accomplish with the product? Guided by a task model that maps out all the user tasks and how they are interconnected, designers are in a better position to put features that support user tasks in the most appropriate places. Task cases and essential use cases (Constantine, 1995; Constantine and Lockwood, 2001) offer a more precise and less wordy alternative to user-centered models like scenarios and customer stories. These models have a structure that focuses on the intentions of users and the responsibilities of the system in support of those intentions. Their fine-grained and succinct format helps promote compact but comprehensive modeling of task essentials. They have a long record of success in leading to world-class designs that enhance user performance (Windl, 2002b).
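As one illustration, a task case can be written as little more than a dialogue of user intentions and system responsibilities. The sketch below expresses that structure as a simple data structure; the frame-capture task and its steps are hypothetical inventions for this article, and the structure, not the content, is the point.

    # A task case (essential use case) reduced to its essentials: what the
    # user intends, paired with what the system is responsible for doing.
    # No widgets, screens, or other interface details appear at this level.
    # The task name and steps are hypothetical illustrations.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TaskCase:
        name: str
        user_intentions: List[str] = field(default_factory=list)
        system_responsibilities: List[str] = field(default_factory=list)

    capture_frame = TaskCase(
        name="capturing the current drawing as a frame",
        user_intentions=[
            "request capture of the current drawing",
        ],
        system_responsibilities=[
            "record the drawing as a new frame in the working file",
            "confirm that the frame has been saved",
        ],
    )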


    Model-driven abstract prototyping.

As most widely practiced, good user interface design is often a mix of widget wizardry and software sorcery tempered by the fire of trial-and-error refinement. The most talented and skilled designers can almost always pull a rabbit out of the hat and produce a good approximation with the first paper prototype, but lesser wizards and apprentices are often left in the dark as to how to get from user data to user interface design. Tools and techniques that help designers derive initial designs directly from well-formed task models not only take the magic out of the process but lead toward better designs. Abstract prototypes (Constantine, 1998) based on canonical abstract components (Constantine, 2003) are one proven approach that does precisely that.

Instead of a rough sketch intended to resemble or suggest an actual user interface, abstract prototypes represent the tools and materials to be presented by a user interface, stripped of details of appearance and behavior. An abstract view selector, for instance, represents a needed capability that might ultimately be realized as a menu, a dropdown list, or even a dialog box. The important thing initially is that a view selector is needed in particular places in the user interface to enable users to perform certain well-defined tasks. Abstract prototypes make it easier for designers to get the important aspects of the content and organization of the user interface right while deferring details about what the bits and pieces will look like and exactly how they will operate. Abstract prototypes in canonical form offer designers a toolkit of standard components from which to build. Standardization facilitates comparisons and the recognition of common problems and solutions that can contribute to the compendium of design patterns (Constantine, 2003).

Abstract prototypes are easily derived directly from good task models. The steps within well-constructed task cases clearly imply the need for particular abstract tools or materials. Abstract components suggest particular real components and design solutions. One large leap of magical transformation that can require advanced training in widget wizardry is thus replaced by two small steps of direct translation that are much easier to learn and to master.
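Continuing the hypothetical frame-capture example, the two translation steps might look something like the following sketch. The component and widget names are illustrative assumptions rather than a definitive catalog of canonical abstract components.

    # Step 1: each intention or responsibility in the task case implies an
    # abstract tool or material, with no commitment yet to concrete widgets.
    abstract_prototype = [
        ("action",        "capture the current drawing as a frame"),
        ("notification",  "confirm that the frame has been saved"),
        ("view selector", "move among the frames already captured"),
    ]

    # Step 2: only after the abstract content and organization are settled
    # is each abstract component realized as a concrete interface element.
    realization = {
        "action":        "toolbar button with a clear 'capture frame' label",
        "notification":  "brief on-screen confirmation",
        "view selector": "dropdown list of frames",  # could as easily be a menu
    }

    for component, purpose in abstract_prototype:
        print(f"{purpose} -> {realization[component]}")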

    Usability inspections.

For identifying software defects, code inspections and structured walkthroughs have repeatedly proved to be more efficient and cost-effective than testing. Based on similar principles and premises, collaborative usability inspection (Constantine and Lockwood, 1999; Lockwood and Constantine, 2003) is a systematic technique developed and refined specifically for identifying usability defects. Through a highly structured procedure with explicit rules and highly refined roles, collaborative usability inspections identify more usability problems more quickly and at an earlier stage than usability testing. Collaborative inspections have a number of advantages. Experienced inspection teams can identify upwards of 100 usability defects per hour. Inspections require only one or two users or user surrogates to be effective. Inspections are good at identifying numerous small problems that in combination can substantially degrade user performance.

Usability inspections do not replace user testing, but, by leading to a cleaner and more refined product through inspection of designs and prototypes, they can markedly reduce the amount of testing needed.


    Designing for use.

So far, the suggestions for process improvement represent additional or alternative techniques, but simple changes in practice are not the whole story. All of the proposals are grounded in a common theme that is more radical than mere technique. Ultimately, designers need to begin designing for use rather than designing for users. As long as users are center stage, designers will be distracted from the real action where dramatic improvement is possible.

Although it may be tempting to try, one cannot have it both ways. If your attention is on users, then it is not on use. If users are at the center, then other matters are made peripheral. If some things are in focus, others are blurred. The choice of focus matters. It is well established that designers and developers tend to optimize for whatever factor is the focus of their attention at the expense of whatever is not. For usability and usefulness, uses are more important than users, and supporting effective user performance is more important than promoting good user experience. If your designers concentrate their efforts on creating a good user experience, they all too easily fail to support good user performance, which is arguably the most important contributor to the experience. No matter how amusing the descriptions or pleasant the graphics, if users fail to find the product they seek on a Web site, it's a bad experience. If an elegant design prevents me from getting my work done on time, it's a bad experience. Regardless of how many features a product might offer, if most of them are irrelevant and important ones are missing or impossible to find, the user experience is negative.

User performance, not user experience, is also the design outcome that translates most directly into real value. It means increased sales, more work completed, fewer errors, faster learning, and better retention of skills. The seemingly nobler but decidedly fuzzier goal of good experience may or may not have anything to do with the real value of a system. Good experience in e-commerce shopping, for instance, means nothing if users repeatedly fail to perform correctly in entering credit card details. It is precisely such mundane tasks that tend to get short shrift when attention is on the broad landscape of users and the global goal of good user experience. Many user experience designers seem to forget that the single most important factor in user satisfaction is goal satisfaction: Did the user accomplish what they intended or needed to accomplish?

Dramatic improvements in support of user performance are made possible by concentrating on essentials and downplaying non-essentials, by conducting efficient and tightly targeted inquiry, by building compact models that highlight the most important elements and forego flowery verbiage, and by following systematic processes that translate models into results. A decade of experience on diverse projects demonstrates that these practices of conceptual economy lead to better systems that actually cost less to design and develop.

Perhaps the most persuasive reason for constructing a comprehensive task model that represents all the real needs of the user is that it not only can save you from creating features that are not needed or will just get in the way, but it can also keep you from missing vital functions. For example, in my work as a designer I often use a portable electronic whiteboard attachment that captures in digital form the work my team does on any whiteboard. The system has a reasonably straightforward user interface marred by one or two really stupid design choices. The button to save a snapshot frame of the current whiteboard is marked by an obscure icon intended to represent the completely non-intuitive idea of a "tag." That button is adjacent to the button to start afresh with a blank board. This minor defect becomes a fundamental design flaw because an absolutely critical function is missing from the software. If the user ever inadvertently presses "new" instead of "tag," there is no way to recover because the software allows you to add frames to a file but not to delete them. A client team recently wasted more than an hour trying to find a workaround that would enable them to resume work on the design on one board after having moved the "capture bar" to another board to work on an alternative drawing. They ended up having to manually trace over a complex drawing in order to re-enter it in a fresh file. It is inconceivable that a workable task model based on essential use cases would have missed this absolutely essential function. Ignorant of the design flaw, I have repeatedly recommended this digital whiteboard; I will not recommend it again.

    Remarks

There, it is done. The heretical theses have been posted in hopes of stirring a reformation. User-centered design is a good idea in need of improvement. The needed improvement is found in practices that put uses rather than users at the center of design and in changing the prime objective from enhancing user experience to enhancing user performance. For the record, this is the basis of usage-centered design, the approach responsible for the breakthrough examples given earlier.

    References

    Constantine, L. L. (1995) Essential Modeling: Use Cases for User Interfaces, ACM interactions 2(2), March/April.

Constantine, L. L. (1998) Rapid Abstract Prototyping. Software Development, 6(11), November. Reprinted in S. Ambler and L. Constantine, eds., The Unified Process Elaboration Phase: Best Practices in Implementing the UP. CMP Books: Lawrence, KS, 2000.

Constantine, L. L. (2003) Canonical abstract prototypes for abstract visual and interaction design. In J. Jorge, N. Jardim Nunes, and J. Falcao e Cunha, eds., Interactive Systems: Design, Specification, and Verification. Proceedings, 10th International Workshop, DSV-IS 2003, Funchal, Madeira Island, Portugal, 11-13 June 2003. Lecture Notes in Computer Science, Vol. 2844. Springer-Verlag. ISBN: 3-540-20159-9.

Constantine, L. L., and Lockwood, L. A. D. (1999) Software for Use: A Practical Guide to the Models and Methods of Usage-Centered Design. Boston: Addison-Wesley.

Constantine, L. L., and Lockwood, L. A. D. (2001) Structure and Style in Use Cases for User Interfaces. In M. van Harmelan, ed., Object Modeling and User Interface Design. Boston: Addison-Wesley.

Constantine, L. L., and Lockwood, L. A. D. (2002a) Usage-Centered Engineering for Web Applications. IEEE Software, 19(2), March/April.

Constantine, L. L., and Lockwood, L. A. D. (2002b) Instructive Interaction. User Experience, 1(3), Winter.

Cooper, A., and Reimann, R. M. (2003) About Face 2.0: The Essentials of Interaction Design. New York: Wiley.

Lockwood, L. A. D., and Constantine, L. L. (2003) Usability by Inspection: Collaborative Techniques for Software and Web Applications. In L. Constantine, ed., Performance by Design: Proceedings of forUSE 2003, Second International Conference on Usage-Centered Design. Rowley, MA: Ampersand Press.


    Nielsen, J. (1993) Usability Engineering. Boston: Academic Press.

Windl, H. (2002a) Usage-Centered Exploration: Speeding the Initial Design Process. In L. Constantine, ed., forUSE 2002: Proceedings of the First International Conference on Usage-Centered, Task-Centered, and Performance-Centered Design. Rowley, MA: Ampersand Press.

Windl, H. (2002b) Designing a Winner: Creating STEP 7 Lite with Usage-Centered Design. In L. Constantine, ed., forUSE 2002: Proceedings of the First International Conference on Usage-Centered, Task-Centered, and Performance-Centered Design. Rowley, MA: Ampersand Press.