Learner-Centered Usability: Tools for Creating a Learner-Friendly Instructional Environment



  • 24 Performance Improvement APRIL 2001

You are creating a distance learning or a computer-based training environment. You would like it to be as learner centered and user friendly as possible. How do you make that happen? First, consider what is meant by usability. The term usability typically refers to technical issues: whether a product or system is comfortable, bug-free, and intuitively operable. We have found this connotation is suitable for computer programs and information-seeking environments where performance of specific tasks and access to information are the key objectives.

However, usability in instructional environments involves more than information access and technical concerns. User interaction with an instructional environment engages, to some extent, the cognitive processes of comprehension, rehearsal, and retrieval (Grabowski, 1995). While it is not practical to suggest that usability testing go so far as to address measures of learning, it is feasible to suggest that usability tests consider whether the learner recognizes and accesses instructional elements as the designer of the learning environment intended.

This article shares a three-step approach to learner-centered usability testing. In a nutshell, these three steps will help you or your usability expert focus on the critical elements of an instructional interface, develop an observation/interview learning-based usability checklist, and implement a one-on-one user-testing methodology. Just three overall questions guide you through this process:

    1. Is there a teacher in the interface?

    2. Are there particular elements in the environment you have concerns about?

    3. Have you put the instructional environment to the real test?

    1. Is There a Teacher in the Interface?

One of the easiest ways to evaluate usability is to imagine that the instructional interface is a teacher. An instructional interface is defined as the elements in any product or system that support the tasks of the learner while he or she is learning (Lohr, 2000). By this definition, an interface can be almost anything: a paper-based product, a computer-based environment, or a combination of several training approaches.

Learner-Centered Usability: Tools for Creating a Learner-Friendly Instructional Environment
by Linda L. Lohr and Carol Eikleberry


Regardless of format, the important question to ask is whether your interface is performing the many functions of a responsive teacher. Does it anticipate the types of questions learners typically have when taking part in any type of training environment? For instance, does the interface answer questions such as, Am I being graded? What am I supposed to do? Am I doing things the right way? Lohr (2000, pp. 161-182) suggests that designers and usability experts quickly do a run-through of an interface to see if it is addressing some of the most basic types of learner questions (see Figure 1). If there are clearly visible elements that address the basic questions in Figure 1, your interface should be fairly functional.
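The run-through described above can be organized as a simple checklist. The Python sketch below is a hypothetical illustration: the three questions are the examples named in the text, while the interface element names are invented placeholders, not items from Figure 1.

```python
# Hypothetical run-through checklist: for each anticipated learner
# question, record which visible interface element (if any) answers it.
# Element names here are invented placeholders for illustration.

learner_questions = {
    "Am I being graded?": "assessment policy note on the orientation page",
    "What am I supposed to do?": "task directions panel",
    "Am I doing things the right way?": None,  # no visible element yet
}

def run_through(checklist):
    """Print each question and collect any that lack a visible answer."""
    unanswered = []
    for question, element in checklist.items():
        status = element if element else "NOT ADDRESSED"
        print(f"{question:40s} -> {status}")
        if element is None:
            unanswered.append(question)
    return unanswered

gaps = run_through(learner_questions)
print(f"{len(gaps)} question(s) still need a visible interface element.")
```

A gap in this list is exactly the signal the run-through is meant to surface: a basic learner question with no clearly visible element answering it.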

2. Are There Particular Elements in the Environment That You Have Doubts About?

This step of testing requires you or your usability expert to specifically identify elements in the learning environment about which you are not certain. This is the case when you are trying a new type of menu design that most learners are unlikely to have ever encountered before. Or you may be wondering if the organization of your learning environment is memorable. For instance, you might want to put important email addresses and phone numbers in the orientation section of the learning environment, but are wondering if learners would actually remember that email addresses and phone numbers are located in that section. You might be thinking that it would be better to put email addresses and phone numbers in a more visible location. The only way you are really going to know is to test out your ideas with the learners.

In situations where we doubt the usability of a particular element, we develop a check-sheet matrix to guide our usability testing (see Figure 2). The matrix design consists of two

    Figure 1. Some Anticipated Learner Questions.

    Figure 2. Observation/Interview Usability Matrix.


overall sections: user actions we can observe and questions we can ask the user (see section A in Figure 2). Within each of these sections, we use the same categories identified above: Welcome and Directions, Learning Help, and Feedback (see section B in Figure 2). We add three usability categories to each section as well: Effectiveness, Efficiency, and Appeal (see section C in Figure 2). We define effectiveness as a measure of whether something is technically working or accurate. Efficiency is a measure of how easy something is to use. Appeal is a measure of how much people like the element, feel confident about it, or are comfortable with it. Once we have this grid structure set up, we are ready to fill it in with criteria or questions that are of particular interest to us. When we insert a question onto a line in the chart,

LEARNER-TEST PROTOCOL

A SET UP YOUR TESTING ENVIRONMENT
BRING: __ two pencils/pens (one for you and one for the interviewee), __ this packet, __ someone to take notes (if possible), __ alternative storyboards (if applicable), __ watch, __ tape recorder

    SETUP: learning environment (load software, assemble any other instructional artifacts)

SET: your chair at a 45-degree angle to the user. (This way the user won't feel like you are staring at them, but you will be able to see what they are doing and what they are looking at.)

B FILL IN THIS INFORMATION
Name of interviewer: _________________________________ Name of interviewee: _________________________________

    Date: Time started: Time completed:

C SIGN PARTICIPANT RELEASE FORM (optional)
I understand that the purpose of this testing is to provide feedback on the usability of an instructional product. I understand that I will be providing feedback related to the usability of this tool during its development. I understand that my interactions with the tool will be observed, and that I will take part in interviews with one of the designers. I understand that my participation in the data collection is fully voluntary, and I may withdraw from the study at any time. I realize that there are no risks attached to my participation, and that there will be no consequences if I choose not to participate at a later date. I understand that all information gathered will remain confidential.

Participant's Signature _________________________________ Date ______________

Researcher's Signature _________________________________ Date ______________

    D START THE INTERVIEW

Break the Ice: Exchange names and ask the user about themselves (what they do for a living, what their hobbies are). You want them to feel comfortable talking to you. Be sure to listen carefully and ask questions related to what they are saying. They will feel like you are interested in them, and by answering your questions they will get more familiar with the interviewing format.

Explain the Purpose: For the next hour I'm going to be asking your opinion on some training materials my group is designing. The product we are designing is (insert your description here). We want to make these materials as easy as possible to use. Your opinion is very important in helping us design something that learners like and can use. You will not hurt our feelings when you tell us that you don't like something. Remember that we are evaluating our learning product, not evaluating you. When you have trouble knowing what to do, or get mixed up, it means to us that our training product isn't as good as it should be. In other words, we think our design is the problem, not you. I'd like you to think aloud while you are looking at this product. Tell me what you are doing and thinking about every step of the way. Tell me what you like and don't like. Tell me if what you see seems relevant to you. Does it make you curious? Does it bore you? Share all of these things with me. I'll give you an example of thinking aloud. (Demonstrate the think-aloud process on the first screen or page of your product.) If you feel self-conscious with this process, just imagine that you are talking with a friend, explaining each step of what you are doing. Go ahead and fully explore. Say everything you are thinking, even if it seems irrelevant. I may interrupt you from time to time if I need more information about what you are doing.

    E RECORD YOUR OBSERVATIONS

Keep these Points in Mind: Try to keep yourself busy taking notes so that the user doesn't feel too self-conscious. Give the user feedback from time to time, perhaps reminding them that we want to find the problems with these materials, and they are helping us do this. Prompt the user to fully explore the interface. Prompt with questions in a non-threatening way, such as, "I see that you chose to click on the arrow there instead of here. Can you share with me why you didn't click here instead?" This is less threatening than, "Why didn't you click this?" Be sure to note how the user begins. Does the user have problems? Does the user freely explore the site?

F CONCLUDE YOUR INTERVIEW
This concludes our interview today. Thank you for your time.

    Figure 3. Learner-Test Protocol.


we then place an X into each applicable column on that line as well.
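Set up as data, the matrix amounts to lines of observation criteria and interview questions crossed with the three usability columns. The Python sketch below is one hypothetical way to represent it: the section, category, and column names come from the article, while the two sample criterion lines are invented for illustration.

```python
# Hypothetical check-sheet matrix: two overall sections (observe / ask),
# three instructional categories, and three usability columns.
# An "X" marks each usability column that applies to a given line.

SECTIONS = ("User actions we can observe", "Questions we can ask the user")
CATEGORIES = ("Welcome and Directions", "Learning Help", "Feedback")
USABILITY = ("Effectiveness", "Efficiency", "Appeal")

# Each entry: (section, category, criterion or question, applicable columns).
# The criteria below are invented examples, not the article's.
lines = [
    (SECTIONS[0], CATEGORIES[0], "Finds the orientation section unprompted",
     {"Efficiency"}),
    (SECTIONS[1], CATEGORIES[2], "Did the feedback messages feel helpful?",
     {"Effectiveness", "Appeal"}),
]

def render(matrix_lines):
    """Return the matrix as rows with an X in each applicable column."""
    rows = []
    for section, category, text, applicable in matrix_lines:
        marks = ["X" if col in applicable else "" for col in USABILITY]
        rows.append([section, category, text] + marks)
    return rows

for row in render(lines):
    print(" | ".join(row))
```

During a test session, filling in the grid is then just a matter of adding a line per criterion and ticking the columns it speaks to, which keeps observation notes and interview answers organized under the same categories.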