
Formative and Summative Evaluations

Instructional Design For Multimedia

Evaluation Phases

Formative evaluation spans the Analysis, Design, Development, and Implementation phases; summative evaluation follows implementation.

Formative Evaluation

Occurs before implementation
Determines the weaknesses in the instruction so that revisions can be made
Makes instruction more effective and efficient

Formative Evaluation Is Especially Important When…

Designer is a novice
Content area is new
Technology is new
Audience is unfamiliar
Task performance is critical
Accountability is high
Client requests/expects evaluation
Instruction will be disseminated widely
Opportunities for later revision are slim

Formative Evaluation Phases

Design Reviews

Expert Reviews

One-to-One Evaluation

Small-Group Evaluation

Field Trials

Ongoing Evaluation

(One-to-one evaluations, small-group evaluations, and field trials together constitute Learner Validation.)

Design Reviews

Should take place after each step of the design process:
Goal Review
Environment and Learner Analysis Review
Task Analysis Review
Assessment Specifications Review

Design Reviews: Goal Review

Question
Does the instructional goal reflect a satisfactory response to the problems identified in the needs assessment?

Possible Methods
Have the client review and approve the learning goals

Design Reviews: Environment and Learner Analysis Review

Question
Do the environment and learner analyses accurately portray these entities?

Possible Methods
Collect survey or aptitude data
Give a reading test to sample learners
Survey managers to confirm attitudes

Design Reviews: Task Analysis Review

Question
Does the task analysis include all of the prerequisite skills and knowledge needed to perform the learning goal, and is the prerequisite nature of these skills and knowledge accurately represented?

Possible Methods
Test groups of learners with and without prerequisite skills
Give a pretest to a sample audience on the skills to be learned

Design Reviews: Assessment Specification Review

Question
Do the test items and resultant test blueprints reflect reliable and valid measures of the instructional objectives? (a reliability sketch follows)

Possible Methods
Have experts review assessment items
Administer assessment instruments to skilled learners to determine practicality
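The slides ask whether the items are "reliable measures" but do not show how reliability might be checked. As one hedged illustration (not from the source), Cronbach's alpha is a standard internal-consistency index; here it is computed over invented draft-test data, with learners as rows and items as columns (1 = correct, 0 = incorrect).

```python
# Illustrative sketch: Cronbach's alpha for a draft assessment.
# All scores are invented for the example.
import statistics

scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 1, 0, 0],
]

k = len(scores[0])  # number of items
item_vars = [statistics.pvariance(col) for col in zip(*scores)]
total_var = statistics.pvariance([sum(row) for row in scores])

alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")  # 0.73 for these invented data
```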

Expert Reviews

Should take place when instructional materials are in draft form

Experts include:
Content Experts
Instructional Design Experts
Content-Specific Educational Experts
Learner Experts

Expert Reviews: Content Experts

Subject matter experts (SMEs) review for accuracy and completeness
Is the content accurate and up-to-date?
Does the content present a consistent perspective?

Example: Physics expert

Expert Reviews: Instructional Design Experts

Review for instructional strategy and theory
Are the instructional strategies consistent with principles of instructional theory?

Example: Instructional designer

Expert Reviews: Content-Specific Educational Expert

Reviews for pedagogical approach in the content area
Is the pedagogical approach consistent with current instructional theory in the content area?

Example: Science education specialist

Expert Reviews: Learner Expert

Reviews for appropriateness of vocabulary, examples, and illustrations
Are the examples, practice exercises, and feedback realistic and accurate?
Is the instruction appropriate for the target learners?

Example: 6th grade teacher

Expert Reviews: Process

1. Distribute draft materials to experts
2. Collect comments and prioritize into categories such as:
   Critical: revisions should be made immediately
   Non-critical: disregard or address at a later date
   More Info: find more data or information

Learner Validation

Try the instruction with representative learners to see how well they learn and what problems arise as they engage with the instruction

1. One-to-One Evaluation
2. Small-Group Evaluation
3. Field Trials

Learner Validation: One-to-One Evaluation

Present materials to one learner at a time

Typical problems that might arise:
Typographical errors
Unclear sentences
Poor or missing directions
Inappropriate examples
Unfamiliar vocabulary
Mislabeled pages or illustrations

Make revisions to the instruction
Conduct more evaluations if necessary

Learner Validation: One-to-One Evaluation Process

1. Present materials to the student
2. Watch the student interact with the material
3. Employ the “Read-Think-Aloud” method
4. Continually query the student about problems they face and what they are thinking
5. Assure the student that problems in the instruction are not their fault
6. Tape record or take notes during the session
7. Reward participation

Learner Validation: Small-Group Evaluation

Present materials to 8-12 learners
Administer a questionnaire to obtain general demographic data and attitudes or experiences

Problems that might arise:
Students have more or fewer entry-level skills than anticipated
Course was too long or too short
Learners react negatively to the instruction

Make revisions to the instruction
Conduct more evaluations if necessary

Learner Validation: Small-Group Evaluation Process

1. Conduct entry-level tests and pretests with students
2. Present instruction to students in a natural setting
3. Observe students interacting with the materials
4. Take notes and/or videotape the session
5. Intervene only when instruction cannot proceed without assistance
6. Administer a posttest (a simple item-analysis sketch follows this list)
7. Administer an attitudinal survey or hold a discussion
8. Reward participation
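The slides do not prescribe how posttest results are examined, but a common formative tactic is a quick item analysis: the proportion of the small group answering each item correctly. A minimal sketch, with invented responses for an 8-learner group:

```python
# Illustrative item analysis on small-group posttest results.
# 1 = correct, 0 = incorrect; all data are invented for the sketch.
posttest = {
    "item_1": [1, 1, 1, 0, 1, 1, 1, 1],
    "item_2": [0, 1, 0, 0, 1, 0, 0, 1],  # hard item: candidate for revision
    "item_3": [1, 1, 1, 1, 1, 1, 0, 1],
}

for item, responses in posttest.items():
    difficulty = sum(responses) / len(responses)  # proportion correct
    flag = "  <-- review instruction/item" if difficulty < 0.5 else ""
    print(f"{item}: {difficulty:.2f} correct{flag}")
```

Items that most learners miss point either to a flawed item or to instruction that needs revision.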

Learner Validation: Field Trials Evaluation

Administer instruction to 30 or more students

Problems that might arise:
Instruction is not implemented as designed
Students have more or fewer entry-level skills than anticipated
Assessments are too easy or too difficult
Course is too long or too short
Students react negatively to the instruction

Make revisions
Conduct more field trials if necessary

Learner Validation: Field Trials Evaluation Process

1. Administer instruction to students in a normal setting, in various regions and with varying socioeconomic status
2. Collect and analyze data from pretests and posttests
3. Conduct follow-up interviews if necessary
4. Administer a questionnaire to the instructors who deliver the training

Formative Evaluation: Ongoing Evaluation

Continue to collect and analyze data
Collect all comments/changes made by teachers who deliver the instruction
Keep track of changes in the learner population
Revise instruction or produce new material to accompany the instruction as needed

Formative Evaluation Summary

Conduct design reviews after each stage of design, including goals, environment and learner analysis, task analysis, and assessment specifications
Conduct expert reviews with content, instructional design, content-specific educator, and learner experts
Conduct one-to-one evaluations with students
Conduct small-group evaluations with 8-12 students
Conduct field trials with 30 or more students
Conduct ongoing evaluations

Summative Evaluation

Occurs after implementation (after the program has completed a full cycle)
Determines the effectiveness, appeal, and efficiency of the instruction
Assesses whether the instruction adequately solves the “problem” identified in the needs assessment

Summative Evaluation Phases

Determine Goals

Select Orientation

Select Design

Design/Select Evaluation Measure

Collect Data

Analyze Data

Report Results

Summative Evaluation: Determine Goals

Identify questions that should be answered as a result of the evaluation:
Does implementation of the instruction solve the problem identified in the needs assessment?
Do the learners achieve the goals of the instruction?
How do the learners feel about the instruction?
What are the costs of the instruction, and what is the return on investment (ROI)? (a worked example follows this list)
How much time does it take for learners to complete the instruction?
Is the instruction implemented as designed?
What unexpected outcomes result from the instruction?
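The slides raise ROI as a question but show no computation. As a hedged illustration, ROI is commonly taken as net benefit over cost; every figure below is invented purely to make the arithmetic concrete:

```python
# Hypothetical ROI arithmetic; all figures are invented for illustration.
development_cost = 40_000        # design and production of the course
delivery_cost_per_learner = 50
learners = 200
benefit_per_learner = 400        # e.g., estimated savings from improved performance

total_cost = development_cost + delivery_cost_per_learner * learners  # 50,000
total_benefit = benefit_per_learner * learners                        # 80,000

roi_percent = (total_benefit - total_cost) / total_cost * 100
print(f"ROI: {roi_percent:.1f}%")  # ROI: 60.0%
```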

Summative Evaluation: Determine Goals

Select indicators of success

If the program is successful, where will we observe it:
Instructional materials?
Learners’ activities?
Teachers’ knowledge, practice, and attitudes?
Learners’ understanding, processes, skills, and attitudes?

Summative Evaluation: Select Orientation

Come to an agreement with the client on the most appropriate orientation for the evaluation:

Objectivism: observation and quantitative data collected to determine the degree to which the goals of the instruction have been met

Subjectivism: expert judgment and qualitative data not based on the instructional goals

Summative Evaluation: Select Design of Evaluation

What data will be collected, when, and under what conditions?
Instruction, Posttest
Pretest, Instruction, Posttest (a gain-score sketch follows this list)
Pretest, Instruction, Posttest, Posttest, Posttest
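To illustrate what a pretest buys you (this is not from the source), one common summary of a Pretest, Instruction, Posttest design is the average normalized gain, which scales each learner's improvement by the room they had to improve. The sketch assumes scores are percentages out of 100, and all numbers are invented:

```python
# Sketch: average normalized gain, g = (post - pre) / (100 - pre).
# Scores are hypothetical percentages for five learners.
pre  = [40, 55, 30, 60, 45]
post = [70, 80, 55, 85, 75]

gains = [(b - a) / (100 - a) for a, b in zip(pre, post)]
avg_gain = sum(gains) / len(gains)
print(f"average normalized gain: {avg_gain:.2f}")  # about 0.52
```

The repeated-posttest design in the last option additionally shows whether gains persist over time.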

Summative Evaluation: Design or Select Evaluation Measures

Payoff Outcomes: review statistics that may have changed after the instruction was implemented
Learning Outcomes: measure for an increase in test scores (see the sketch after this list)
Attitudes: conduct interviews, questionnaires, and observations
Level of Implementation: compare the design of the program to how it is implemented
Costs: examine the costs to implement and continue the program, including personnel, facilities, equipment, and materials
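The slides say only to "measure for an increase in test scores"; a paired t-test is one standard way to check that an observed increase is unlikely to be chance. A minimal sketch with invented scores (assumes SciPy is installed):

```python
# Sketch: is the posttest reliably higher than the pretest? Data are invented.
from scipy import stats

pre  = [40, 55, 30, 60, 45, 50, 35, 65]
post = [70, 80, 55, 85, 75, 72, 50, 90]

t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}")  # a small p suggests a genuine score increase
```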

Summative Evaluation: Collect Data

Devise a plan for the collection of data that includes a schedule of data collection periods

Summative Evaluation: Analyze Data

Analyze the data so that it is easy for the client to see how the instructional program affected the problem presented in the needs assessment.

Summative Evaluation: Report Results

Prepare a report of the summative evaluation findings that includes:
Summary
Background
Description of the Evaluation Study
Results
Discussion
Conclusions and Recommendations

Summative Evaluation Summary

Determine the goals of the evaluation
Select an objective or subjective orientation
Select the design of the evaluation plan
Design or select evaluation measures
Collect the data
Analyze the data
Report the results