How Do We Know It’s Working? Creating Evaluations for Technology Projects and Evaluations (part I)

Page 1

How Do We Know It’s Working?
Creating Evaluations for Technology Projects and Evaluations (part I)

Page 2

Contact Information

[email protected] 978-251-1600 ext. 204

www.edtechevaluation.com
This presentation will be linked to that site (on the Tools page)

Page 3

Where Do We Stand?

Who’s working on an actual project? Current? Anticipated?

Your expectations for today

Page 4

Workshop Goals

To review the key elements of effective program evaluation as applied to technology evaluations

To consider evaluation in the context of your actual projects

Page 5

Why Evaluate?

To fulfill program requirements: NCLB, and hence Title IID, carry evaluation requirements
To realize your investment in technology: What sort of “difference” has all of this technology made?

Page 6

Basis in NCLB

“The application shall include:…

A description of the process and accountability measures that the applicant will use to evaluate the extent to which activities funded under this subpart are effective in integrating technology into curricula and instruction, increasing the ability of teachers to teach, and enabling students to meet challenging State academic content and student academic achievement standards.”

NCLB Act, Title II, Part D, Section 2414(11)

Page 7

One consistent thread in NCLB is evaluation and assessment: How can you document that this “intervention” is making a difference?

All funded work must be based in reflection and data-driven decision-making

Naturally, this translates to local district proposals

Page 8

A Framework for Review

From Designing Professional Development for Teachers of Science and Mathematics, Loucks-Horsley, Hewson, Love, and Stiles. Corwin Press, Inc., 1998

Page 9

Evaluation

Helps clarify project goals, processes, and products
Must be tied to indicators of success written for your project’s goals
Not a “test” or checklist of completed activities
Qualitatively, are you achieving your goals?
What adjustments can be made to your project to realize greater success?

Page 10

The Basic Process

Evaluation Questions: Tied to original project goals
Performance Rubrics: Allow for authentic, qualitative, and holistic evaluation
Data Collection: Tied to indicators in the rubrics
Scoring and Reporting: Role of this committee (the evaluation committee)

[Flowchart: Creating a District-wide Technology Evaluation]
Boxes: Generate leadership support; Determine scope of the evaluation effort; Formulate evaluation questions; Appoint committee; Review questions; Develop indicator rubrics; Data collection; Data analysis; Scoring the rubrics; Recommendations; Dissemination of report findings; Initiating the next review cycle; Orient and train in-district evaluation committee
Stage 1: Committee orientation, evaluation framing, and training
Stage 2: Data collection and analysis
Stage 3: Findings, recommendations, and reporting

Page 11

Who Evaluates?

Committee of stakeholders (pg 12)
Outside facilitator?
Data collection specialists?
Task checklist
Other issues:
Honesty
Perspective
Time-intensive

Page 12

Evaluation Starts with Goals

Evaluation should be rooted in your goals for how you are going to use or integrate that technology
Is more than an infrastructure plan
Focuses on technology’s impact on teachers and students
Has clear goals and objectives for what you want to see happen

Page 13

Evaluation Logic Map

Page 14

Project Sample

Page 15

Your Project?

Using the Evaluation Logic Map, map your:
Project purpose/vision
Goals
Objectives
Actions
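For those who prefer to capture the map digitally, here is a minimal sketch of a logic map as nested data. The vision, goal, objective, and action text shown are hypothetical placeholders drawn loosely from the sample project later in this deck, not part of the workshop handout.

# A minimal, hypothetical sketch of an Evaluation Logic Map as nested data.
# Replace the placeholder vision, goals, objectives, and actions with your own.
logic_map = {
    "vision": "Technology-rich, inquiry-based science learning",  # project purpose/vision
    "goals": [
        {
            "goal": "Improve student achievement through authentic science experiences",
            "objectives": [
                {
                    "objective": "Integrate technology-enhanced inquiry units in grades 5-8",
                    "actions": [
                        "Develop river-study curriculum units",
                        "Provide teacher professional development",
                    ],
                }
            ],
        }
    ],
}

# Walk the map to confirm every objective has at least one action behind it.
for goal in logic_map["goals"]:
    for objective in goal["objectives"]:
        assert objective["actions"], f"No actions listed for: {objective['objective']}"
    print(goal["goal"], "->", len(goal["objectives"]), "objective(s)")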

Page 16

Goals Lead to Questions

What do you want to see happen? These are your goals
Rephrase goals into questions

Achieving these goals requires a process that can be measured through a formative evaluation

Page 17

We Start with Goals…

To improve student achievement through their participation in authentic and meaningful science learning experiences.

To provide advanced science and technology learning opportunities to all students regardless of learning styles or abilities.

To produce high quality science and technology curriculum in which the integration of technology provides “added value” to teaching and learning activities.

To increase students’ knowledge of the Connecticut River’s history and geology, and to gain an understanding of its past, present, and possible future environmental issues.

Page 18

…and move to questions

Has the project developed technology-enhanced science learning experiences that have been instrumental in improving student mastery of the Skills of Inquiry, understanding of the history/geology/ecology of the Connecticut River, and of the 5-8 science curriculum in general?

Has the project offered teacher professional development that has resulted in improved teacher understanding of universal design principles and technology integration strategies?

Page 19

…And Then to Indicators

What is it that you want to measure?
Whether the projects have enhanced learning
The relationship between the units and:
The selected curriculum
The process by which they were developed
Increases in teacher technology skills (in relation to particular standards)
Whether the professional development model met with its design expectations:
Collaborative and sustainable
Involves multiple subjects and administrators

Page 20

Indicators should reflect your project’s unique goals and aspirations
Rooted in proposed work
Indicators must be indicative of your unique environment...what constitutes success for you might not for someone else
Indicators need to be highly descriptive and can include both qualitative and quantitative measures

Page 21

Try a Sample Indicator

Going back to the Logic Map, try to develop a few indicators for your sample project
Keep it simple
Qualitative and quantitative
Will you be able to see the indicator?
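As a worked illustration only (the indicator wording, evidence sources, and the "observable" flag below are assumptions, not workshop content), an indicator can be written down alongside the qualitative and quantitative evidence you expect to be able to see:

# Hypothetical sample indicators; the wording and evidence sources are placeholders.
indicators = [
    {
        "goal": "Improve student mastery of the Skills of Inquiry",
        "indicator": "Students design and carry out their own river-study investigations",
        "qualitative_evidence": "Classroom observations and student work samples",
        "quantitative_evidence": "Percent of units including a student-designed investigation",
        "observable": True,  # can you actually see this in classrooms or artifacts?
    },
    {
        "goal": "Increase teacher technology integration skills",
        "indicator": "Teachers use data-collection tools in regular instruction",
        "qualitative_evidence": "Teacher interviews and lesson plans",
        "quantitative_evidence": "Number of lessons per term using the tools",
        "observable": True,
    },
]

# Quick self-check: flag any indicator you would not be able to observe directly.
for item in indicators:
    if not item["observable"]:
        print("Rework this indicator so it can be seen:", item["indicator"])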

Page 22

To Summarize...

Start with your proposal or technology plan

From your goals, develop indicators and a performance rubric
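One way to hold indicators and a performance rubric together is sketched below. The four level labels, the descriptors, and the sample score are assumptions made for illustration, not the rubric format prescribed in the workshop.

# Hypothetical performance rubric sketch: one row per indicator, four levels.
LEVELS = ["Beginning", "Developing", "Proficient", "Exemplary"]

rubric = {
    "Technology-enhanced units improve student inquiry skills": {
        "Beginning": "Units exist but technology is an add-on",
        "Developing": "Some units use technology to collect or analyze data",
        "Proficient": "Most units embed technology in student investigations",
        "Exemplary": "Students routinely choose and justify the tools they use",
    },
}

# During scoring, the committee assigns each indicator a level based on the evidence.
scores = {"Technology-enhanced units improve student inquiry skills": "Developing"}

for indicator, level in scores.items():
    print(f"{indicator}: {level} ({LEVELS.index(level) + 1} of {len(LEVELS)})")
    print("  Descriptor:", rubric[indicator][level])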

Page 23

Coming in Part II

Data Collection
Reporting

Page 24

Page 25

How Do We Know It’s Working?
Creating Evaluations for Technology Projects and Evaluations (part II)

Page 26

[Flowchart repeated from Page 10: Creating a District-wide Technology Evaluation]
Stage 1: Committee orientation, evaluation framing, and training
Stage 2: Data collection and analysis
Stage 3: Findings, recommendations, and reporting

Page 27

A Basic Process

Evaluation Questions: Must be tied to original planning goals
Performance Rubrics: Allow for authentic, qualitative, and holistic evaluation
Data Collection: Tied to indicators in the rubrics
Scoring and Reporting

Page 28

Measures?

Classroom observation, interviews, and work-product review: What are teachers doing on a day-to-day basis to address student needs?
Focus groups and surveys: Measuring teacher satisfaction
Triangulation with data from administrators and staff: Do other groups confirm that teachers are being served?
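A minimal sketch of what triangulation across respondent groups can look like in practice; the group names, the survey statement, and all ratings below are invented for illustration.

# Hypothetical triangulation check: compare how different respondent groups
# rate the same statement ("Teachers are well supported in using technology").
# Ratings are on a 1-5 agreement scale; all numbers are made up.
from statistics import mean

responses = {
    "teachers":       [4, 5, 3, 4, 4],
    "administrators": [5, 4, 4],
    "support_staff":  [3, 4, 3, 3],
}

group_means = {group: round(mean(vals), 2) for group, vals in responses.items()}
print(group_means)

# A large gap between groups is a signal to dig deeper in interviews or focus groups.
if max(group_means.values()) - min(group_means.values()) > 1.0:
    print("Groups disagree noticeably; follow up before drawing conclusions.")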

Page 29

Data Collection

Review existing data:
Current technology plan
Curriculum
District/school improvement plans
www.sun-associates.com/eval/sample
Create a checklist for data collection
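The checklist itself can be as simple as a table or a few lines of code. The sources, owners, and status values below are hypothetical placeholders, not items from the sample linked above.

# Hypothetical data-collection checklist; sources and owners are placeholders.
checklist = [
    {"source": "Current technology plan", "owner": "Tech director", "collected": True},
    {"source": "Curriculum documents", "owner": "Curriculum coordinator", "collected": False},
    {"source": "District/school improvement plans", "owner": "Principal", "collected": False},
    {"source": "Teacher survey results", "owner": "Evaluation committee", "collected": False},
]

# Print what is still outstanding before the committee meets.
for item in checklist:
    if not item["collected"]:
        print(f"Still needed: {item['source']} (owner: {item['owner']})")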

Page 30

Surveys

Creating good surveys:
Length
Differentiation (teachers, staff, parents, community, etc.)
Quantitative data
Attitudinal data
Timing/response rates (getting returns!)
www.sun-associates.com/eval/samples/samplesurv.html
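Response rates and attitudinal items are easy to tally once the returns are in. A minimal sketch follows; the survey counts and the 1-5 Likert coding are assumptions for illustration, not data from any real district.

# Hypothetical survey tally: response rate plus a simple attitudinal summary.
surveys_sent = 120          # e.g., all certified teachers in the district
surveys_returned = 78
response_rate = surveys_returned / surveys_sent
print(f"Response rate: {response_rate:.0%}")   # 65%

# Attitudinal item coded 1 (strongly disagree) to 5 (strongly agree); counts invented.
item_counts = {1: 4, 2: 10, 3: 20, 4: 30, 5: 14}
total = sum(item_counts.values())
weighted_mean = sum(score * n for score, n in item_counts.items()) / total
agree_share = (item_counts[4] + item_counts[5]) / total
print(f"Mean rating: {weighted_mean:.2f}; agree or strongly agree: {agree_share:.0%}")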

Page 31

Surveys

Online:
Profiler
LoTi
Zoomerang

Page 32

Survey Issues

Online surveys produce high response rates

Easy to report and analyze data
Potential for abuse
Depends on access to connectivity

Page 33

Focus Groups/Interviews

Teachers
Parents
Students
Administrators
Other stakeholders

Page 34

Classroom Observations

Using an observation template
Using outside observers

Page 35

Other Data Elements?

Artifact analysis: A rubric for analyzing teacher and student work?
Solicitation of teacher/parent/student stories: This is a way to gather truly qualitative data
What does the community say about the use and impact of technology?

Page 36

Dissemination

Compile the report
Determine how to share the report:
School committee presentation
Press releases
Community meetings
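If the committee's scored findings live in a spreadsheet or script, assembling the shareable summary can be partly automated. The findings, ratings, and recommendations below are placeholders, not real results.

# Hypothetical report assembly: turn scored findings into a short shareable summary.
findings = [
    ("Technology-enhanced science units", "Proficient", "Expand the units to grade 8"),
    ("Teacher professional development", "Developing", "Add follow-up coaching sessions"),
]

lines = ["District Technology Evaluation: Summary of Findings", ""]
for area, rating, recommendation in findings:
    lines.append(f"- {area}: rated {rating}; recommendation: {recommendation}")

report = "\n".join(lines)
print(report)  # share at a school committee meeting, in a press release, or at a community meeting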

Page 37

Conclusion

Build evaluation into your technology planning effort

Remember, not all evaluation is quantitative

You cannot evaluate what you are not looking for, so it is important to develop expectations of what constitutes good technology integration

Page 38

More Information

[email protected] 978-251-1600 ext. 204

www.sun-associates.com/evaluation
www.edtechevaluation.com

This presentation is linked to that page

Page 39