24 Performance Improvement APRIL 2001
You are creating a distance learning or a computer-based training environment. You would like it to be as learner centered and user friendly as possible. How do you make that happen?

First, consider what is meant by usability. The term usability typically refers to technical issues, whether a product or system is comfortable, bug-free, and intuitively operable. We have found this connotation is suitable for computer programs and information-seeking environments where performance of specific tasks and access to information are the key objectives.
However, usability in instructional environments relates to more than information
access and technical concerns. User interaction with an instructional environment to
some extent involves the cognitive processes of comprehension, rehearsal, and retrieval
(Grabowski, 1995). While it is not practical to suggest that usability testing go so far
as to address measures of learning, it is feasible to suggest that usability tests consider
whether the learner recognizes and accesses instructional elements as the designer of
the learning environment intended.
This article shares a three-step approach to learner-centered usability testing. In a nutshell, these three steps will help you or your usability expert focus on the critical elements of an instructional interface, develop an observation/interview learning-based usability checklist, and implement a one-on-one user-testing methodology. Just three overall questions guide you through this process:
1. Is there a teacher in the interface?
2. Are there particular elements in the environment you have concerns about?
3. Have you put the instructional environment to the real test?
1. Is There a Teacher in the Interface?
One of the easiest ways to evaluate usability is to imagine that the instructional interface is a teacher. An instructional interface is defined as the elements in any product or system that support the tasks of the learner while he or she is learning (Lohr, 2000). By this definition, an interface can be almost anything: a paper-based product, a computer-based environment, or a combination of several training approaches.
Tools for Creating a Learner-Friendly Instructional Environment
by Linda L. Lohr and Carol Eikleberry
Regardless of format, the important question to ask is whether your interface is performing the many functions of a responsive teacher. Does it anticipate the types of questions learners typically have when taking part in any type of training environment? For instance, does the interface answer questions such as, Am I being graded? What am I supposed to do? Am I doing things the right way? Lohr (2000, pp. 161-182) suggests that designers and usability experts quickly do a run-through of an interface to see if it is addressing some of the most basic types of learner questions (see Figure 1). If there are clearly visible elements that address the basic questions in Figure 1, your interface should be fairly functional.
2. Are There Particular Elements in the Environment That You Have Doubts About?
This step of testing requires you or your usability expert to specifically identify elements in the learning environment about which you are not certain. This is the case when you are trying a new type of menu design that most learners are unlikely to have ever encountered before. Or you may be wondering if the organization of your learning environment is memorable. For instance, you might want to put important email addresses and phone numbers in the orientation section of the learning environment, but are wondering if learners would actually remember that email addresses and phone numbers are located in that section. You might be thinking that it would be better to put email addresses and phone numbers in a more visible location. The only way you are really going to know is to test out your ideas with the learners.
In situations where we doubt the usability of a particular element, we develop a check-sheet matrix to guide our usability testing (see Figure 2). The matrix design consists of two
Figure 1. Some Anticipated Learner Questions.
Figure 2. Observation/Interview Usability Matrix.
overall sections: user actions we can observe and questions we can ask the user (see section A in Figure 2). Within each of these sections, we use the same categories identified above: Welcome and Directions, Learning Help, and Feedback (see section B in Figure 2). We add three usability categories to each section as well: Effectiveness, Efficiency, and Appeal (see section C in Figure 2). We define effectiveness as a measure of whether something is technically working or accurate. Efficiency is a measure of how easy something is to use. Appeal is a measure of how much people like the element, feel confident about it, or are comfortable with it. Once we have this grid structure set up, we are ready to fill it in with criteria or questions that are of particular interest to us. When we insert a question onto a line in the chart,
A. SET UP YOUR TESTING ENVIRONMENT
BRING: __ two pencils/pens (one for you and one for the interviewee), __ this packet, __ someone to take notes (if possible), __ alternative storyboards (if applicable), __ watch, __ tape recorder
SET UP: learning environment (load software, assemble any other instructional artifacts)
SET: your chair at a 45-degree angle to the user. (This way the user won't feel like you are staring at them, but you will be able to see what they are doing and what they are looking at.)

B. FILL IN THIS INFORMATION
Name of interviewer: _________________________________ Name of interviewee: _________________________________
Date: Time started: Time completed:

C. SIGN PARTICIPANT RELEASE FORM (optional)
I understand that the purpose of this testing is to provide feedback on the usability of an instructional product. I understand that I will be providing feedback related to the usability of this tool during its development. I understand that my interactions with the tool will be observed, and that I will take part in interviews with one of the designers. I understand that my participation in the data collection is fully voluntary, and I may withdraw from the study at any time. I realize that there are no risks attached to my participation, nor will there be any effects if I choose not to participate at a later date. I understand that all information gathered will remain confidential.

Participant's Signature _________________________________ Date ______________
Researcher's Signature _________________________________ Date ______________

D. START THE INTERVIEW
Break the Ice: Exchange names and ask the user about themselves (what they do for a living, what their hobbies are). You want them to feel comfortable talking to you. Be sure to listen carefully and ask questions related to what they are saying. They will feel like you are interested in them, and by answering your questions they will get more familiar with the interviewing format.
Explain the Purpose: "For the next hour I'm going to be asking your opinion on some training materials my group is designing. The product we are designing is (insert your description here). We want to make these materials as easy as possible to use. Your opinion is very important in helping us design something that learners like and can use. You will not hurt our feelings when you tell us that you don't like something. Remember that we are evaluating our learning product, not evaluating you. When you have trouble knowing what to do, or get mixed up, it means to us that our training product isn't as good as it should be. In other words, we think our design is the problem, not you. I'd like you to think aloud while you are looking at this product. Tell me what you are doing and thinking about every step of the way. Tell me what you like and don't like. Tell me if what you see seems relevant to you. Does it make you curious? Does it bore you? Share all of these things with me. I'll give you an example of thinking aloud. (Demonstrate the think-aloud process on the first screen or page of your product.) If you feel self-conscious with this process, just imagine that you are talking with a friend, explaining each step of what you are doing. Go ahead and fully explore. Say everything you are thinking, even if it seems irrelevant. I may interrupt you from time to time if I need more information about what you are doing."

E. RECORD YOUR OBSERVATIONS
Keep These Points in Mind: Try to keep yourself busy taking notes so that the user doesn't feel too self-conscious. Give the user feedback from time to time, perhaps reminding them that we want to find the problems with these materials, and they are helping us do this. Prompt the user to fully explore the interface. Prompt with questions in a non-threatening way, such as "I see that you chose to click on the arrow there instead of here. Can you share with me why you didn't click here instead?" This is less threatening than, "Why didn't you click this?" Be sure to note how the user begins. Does the user have problems? Does the user freely explore the site?

F. CONCLUDE YOUR INTERVIEW
"This concludes our interview today. Thank you for your time."
Figure 3. Learner-Test Protocol.
we then place an X into each applicable column on that line as well. For example, if the question addresses efficiency, we place an X in the efficiency column. We consider ourselves ready to test with the learner when we have a nice distribution of Xs among all the columns.
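For readers who prefer to build their check sheet electronically, the matrix described above can be sketched as a small data structure. This is our own illustrative sketch, not part of the published instrument; the example item texts and the ready_to_test rule of thumb are hypothetical.

```python
# Hypothetical sketch of the observation/interview usability matrix:
# two sections (actions we observe, questions we ask), three instructional
# categories, and three usability columns that receive the Xs.

SECTIONS = ("observe", "ask")
CATEGORIES = ("Welcome and Directions", "Learning Help", "Feedback")
COLUMNS = ("Effectiveness", "Efficiency", "Appeal")

# Each row: (section, category, item text, set of columns marked with an X).
matrix = [
    ("observe", "Welcome and Directions",
     "Finds the orientation section unaided", {"Efficiency"}),
    ("observe", "Learning Help",
     "Uses the glossary when a term is unclear", {"Effectiveness", "Efficiency"}),
    ("ask", "Feedback",
     "Did the practice feedback feel helpful?", {"Appeal"}),
    ("ask", "Welcome and Directions",
     "Could you find the instructor's email again?", {"Effectiveness"}),
]

def column_counts(rows):
    """Count the Xs per usability column."""
    counts = {col: 0 for col in COLUMNS}
    for _section, _category, _item, cols in rows:
        for col in cols:
            counts[col] += 1
    return counts

def ready_to_test(rows):
    """One reading of the article's rule of thumb: every column has an X."""
    return all(n > 0 for n in column_counts(rows).values())
```

With the sample rows above, `column_counts(matrix)` yields two Xs each for Effectiveness and Efficiency and one for Appeal, so `ready_to_test(matrix)` is true; a column left at zero would signal a gap in the check sheet before sitting down with a learner.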
3. Have You Put the Learning Environment to the Real Test?
Up to this point in the usability testing procedure, the process has been directed or structured around specific criteria of interest. This final step, perhaps the most important step in the entire process, is unstructured. If we do this step correctly, we will get the chance to see problems or learner confusion we had not even thought about, perhaps because we are too close to the design to see the problem, or we just do not have the unique perspective of the learner.
During this step we ask learners to speak their thoughts while they work through an instructional experience. A typical talk-aloud procedure requires the learner to start and progress through an instructional program without any intervention from the tester or observer. The purpose of this approach is to understand the environment from the learner's perspective. If possible, this step should be conducted through videotaping or some other type of nonintrusive observation, so the learner is not self-conscious. Most often, though, we simply sit and observe the procedure, recording the learner's comments and behaviors on a notepad. Figure 3, derived from Tessmer (1993), shows the script and procedure we follow when conducting a talk-aloud protocol.
There are, however, limitations to the talk-aloud approach. Many learners can be shy and blame themselves rather than the instruction for their inability to understand what to do. When this is the case, we try one of two approaches. We suggest that the learner work through the learning environment with another learner, with both learners talking their thoughts aloud, good and bad. If another learner is not present, we sometimes do the talk-aloud for the learner and ask the learner to interrupt us when he or she does not agree with something we say or do. This last approach is the least ideal, but there have been times we have had reasonably successful results.
While this three-step procedure may seem lengthy, it can take place quite quickly, requiring no more than one to two hours per learner to implement. One remaining question for many readers is likely to be, How many learners do I need to test? To that we have a practical reply: As many as possible. Keep in mind that Nielsen and Landauer (1993, pp. 206-213) suggest a minimum of three to five users. While this falls short of sample size requirements learned in basic research methods and statistics courses, it is realistic and fits with the demands of most development environments, where time and money are always key drivers of design. Consider as well that usability testing for technical errors is an important precursor to learner-centered usability testing.
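For readers curious why such small samples are defensible, Nielsen and Landauer's cited paper models the proportion of usability problems found by n test users as 1 - (1 - L)^n, where L is the chance that a single user uncovers any given problem. The sketch below uses L = 0.31, the average single-user discovery rate commonly reported from their data; treat the exact figures as illustrative rather than a guarantee for your own project.

```python
# Nielsen & Landauer (1993) problem-discovery model:
# proportion of problems found by n users = 1 - (1 - lam)**n.
# lam = 0.31 is the often-quoted average discovery rate from their
# data; an individual project's rate may be higher or lower.

def proportion_found(n, lam=0.31):
    """Expected proportion of usability problems uncovered by n test users."""
    return 1 - (1 - lam) ** n

# Diminishing returns set in quickly, which is why three to five
# users is a practical minimum for a development-cycle test.
for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
```

With these assumptions, five users surface roughly 84% of the problems, and doubling the sample to ten adds comparatively little, which is the usual argument for testing a few learners early and often rather than many learners once.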
We hope we have suggested useful tools for moving usability testing to its next level. You are well on your way to creating environments that are both user friendly and learner friendly.
References

Grabowski, B.L. (1995). Message design: Issues and trends. In G.J. Anglin (Ed.), Instructional technology: Past, present and future (pp. 222-231). Englewood, CO: Libraries Unlimited, Inc.

Lohr, L. (2000). Designing the instructional interface. Computers in Human Behavior, 16(2), 161-182.

Nielsen, J., & Landauer, T.K. (1993). A mathematical model of the finding of usability problems. Proceedings of the ACM/IFIP INTERCHI '93 Conference (Amsterdam, The Netherlands, April 24-29), 206-213.

Tessmer, M. (1993). Planning and conducting formative evaluations: Improving the quality of education and training. London: Kogan Page Limited.
Linda L. Lohr, EdD, is an assistant professor at the University of Northern Colorado and is currently writing Creating Visuals for Learning and Performance, a Prentice-Hall textbook. She enjoys the challenge of making performance materials effective, attractive, and easy to use. She is continually looking for examples of performance materials that need to be improved. Her research interests focus on learner-centered design. Linda may be reached at
Carol Eikleberry, PhD, is a licensed psychologist and the author of The Career Guide for Creative and Unconventional People. In her professional role, she values helping people apply their unique style of creativity to real-world problems. She is most interested in the research area of individual differences. To enhance effectiveness, she promotes the idea of matching the person to the task. On or off the job, she practices her own version of the three Rs: reading, writing, and relating. Carol may be reached at email@example.com.