
The 2010 User Friendly Handbook of Project Evaluation

Chapter 4: The Evaluation Process: Carrying Out the Study and Reporting

Presented by
M. Bess-Frazier
T. Flemming
V. Lee
A. Smith
N. Wooten

Introduction

The process of evaluation and reporting consists of four general steps:

• 1. Data collection
• 2. Data analysis
• 3. Reporting the findings
• 4. Publishing (disseminating) the information

Step 1: Data Collection

Step 2: Analyze the Data

Step 3: Report the Findings

Step 4: Publish (Disseminate) the Information

Step 1: Data Collection

Conducting the Data Collection

Things to consider:

• Get permission
• Consider the needs and sensitivities of participants
• Make sure data collectors are properly trained, unbiased, and objective
• Get data from as large a sample as possible
• Disrupt as little as possible

(Frechtling, 2010, p. 39)

The Necessary Clearances and Permissions

Think about the setting of your data collection!

• Talk to teachers, administrators, bosses, or parents where applicable
• Consider permission forms
• Consider who needs a copy of the final data collection and conclusions

It may help with cooperation if you offer to share the results after you’re done!

Consider the Needs and Sensitivities of Participants

• Be honest with participants
• Assure participants that no personal repercussions will result from any information provided

As an example, consider a survey on what percentage of the population uses a seatbelt when in a car. Do you think the results would differ if a police officer gave the survey?

Make Sure Data Collectors are Properly Trained

• Provide data collectors with information about the community or culture where they will be collecting data

• Data collectors need to ask the same questions and use the same prompts

• They should be supervised to avoid “coaching” and a distortion of data

• Data collectors should be unbiased

Collect Data from as Many Members as Possible

• More members will decrease the margin of error (Moore & McCabe, 1993); a rough sketch of this relationship follows below
• A response rate of 70% or higher within the sample is considered to be high quality
• Follow up on non-participants
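
The relationship between sample size and margin of error can be made concrete with the standard large-sample formula for a proportion found in introductory statistics texts such as Moore & McCabe (1993). The Python sketch below is illustrative only; the observed proportion, the sample sizes, and the 95% z-value are assumptions for the example, not figures from the handbook.

import math

# Approximate 95% margin of error for a proportion: ME = z * sqrt(p * (1 - p) / n)
def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

# Larger samples shrink the margin of error (roughly in proportion to 1/sqrt(n)).
for n in (50, 100, 200, 400):
    print(f"n = {n:4d}  margin of error ~ {margin_of_error(0.5, n):.3f}")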

Avoid Disruption

Consider the setting of your data collection! Consider the schedules of the participants and the project as well.

Step 2: Analyze the Data

Data analysis checklist

✓ Check the raw data and prepare them for analysis.
✓ Conduct initial analysis based on the evaluation plan.
✓ Conduct additional analyses based on the initial results.
✓ Integrate and synthesize findings.

✓ Check the raw data and prepare them for analysis.

Check the data for unusual responses, for example:
– selecting more than one answer instead of the one required
– choosing the same response for every question (e.g., all “c”)
– inconsistent answers

Such responses may need to be eliminated from the data that will be analyzed (a minimal screening sketch follows below). The data are then prepared for coding and entry for computer analysis. Verification and quality-control procedures should already be in place.
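
A minimal sketch of this kind of screening, assuming the survey responses have already been entered into a table; the column names (q1 through q5) and the tiny dataset are hypothetical. The checks flag multiple answers on a single item and identical responses across all items.

import pandas as pd

# Hypothetical raw responses; "a,b" marks a respondent who selected two answers on one item.
raw = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "q1": ["a", "c", "a,b", "b"],
    "q2": ["b", "c", "a", "d"],
    "q3": ["a", "c", "b", "d"],
    "q4": ["c", "c", "a", "b"],
    "q5": ["d", "c", "b", "a"],
})
items = ["q1", "q2", "q3", "q4", "q5"]

# Flag rows where more than one answer was selected on any item.
multi_answer = raw[items].apply(lambda col: col.str.contains(",")).any(axis=1)

# Flag straight-lining: the same response chosen for every item (e.g., all "c").
straight_line = raw[items].nunique(axis=1) == 1

flagged = raw[multi_answer | straight_line]    # set aside for review or removal
clean = raw[~(multi_answer | straight_line)]   # goes on to coding and analysis
print(flagged)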

✓ Conduct initial analysis based on the evaluation plan.

– The analysis may raise more questions as results are calculated. Be careful to stay with the set of analyses that were originally of interest.
– Up-to-date software is helpful: it offers support to analysts and a way to manage large sets of narrative data.
– Quantitative data collection should be consistent, to avoid unnecessarily invalidating results or compromising the integrity of the evaluation.

✓ Conduct additional analyses based on the initial results.

“It is very likely that the initial analyses will raise as many questions as they answer.”

– The second round of analyses will address these.
» Example: “The first analysis looked at teacher performance; a second analysis might subdivide the total group into subunits of particular interest, e.g., more experienced versus less experienced… examine if any significant differences were found between them.” One way to run such a comparison is sketched below.
» “These cycles can continue as resulting data suggest other interesting avenues to explore, even if they were not planned” (Frechtling, 2010, p. 42).
– Available time and money determine which tasks can be completed.
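
The subgroup comparison quoted above could be carried out in many ways; one common choice is a two-sample t-test. The scores below are invented purely to show the mechanics, and the handbook does not prescribe this particular test.

from scipy import stats

# Hypothetical performance scores for the two teacher subgroups.
more_experienced = [78, 85, 90, 82, 88, 91, 79, 84]
less_experienced = [72, 80, 75, 70, 78, 82, 74, 77]

# Welch's two-sample t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(more_experienced, less_experienced, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
print("Significant at the 0.05 level" if p_value < 0.05 else "No significant difference detected")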

✓ Integrate and synthesize findings.

Integration of the separate analyses and the development of conclusions from the findings are the last steps of data analysis. The results may not produce completely consistent findings or fit neatly into a report that explains every ambiguity. Some questions raised in the report may remain unanswered.

Step 3: Report the Findings

● Requires:
o pulling together the collected data
o condensing the findings so they address the original evaluation questions
o publicizing the findings

Main Sections of Formal Reports

“1. Background

2. Evaluation study questions

3. Evaluation procedures

4. Data analyses

5. Findings

6. Conclusions (and recommendations)” (Frechtling, 2010, p. 44)

Background

This section can include:

• the need or focus of the project and its objectives
• possibly a journal or literature review
• the stakeholders and their background
• the study participants
• the activities and components
• the project’s expected timeframe and venue
• the project resources used
• the expected measurable outcomes

Evaluation Study Questions

• Lists the questions that the study addressed.
• Also includes variables that may affect the project outcome, for instance issues with data collection or time constraints.

Evaluation Procedures

● This section describes the project participants, the participants’ selection method, and whether sampling was used.
● It also describes the types of data collected and the instruments used for the data collection activities.
● It is helpful at the end of this section to include a matrix or table that summarizes the evaluation questions, the variables, the data collection approaches, the respondents, and the data collection schedule (a hypothetical example follows below).
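
One hypothetical row of such a summary matrix, simply to illustrate the suggested columns; the question, variable, instrument, and schedule shown here are invented, not taken from the handbook.

Evaluation question: Did teachers’ classroom practice change after the workshop?
Variable(s): Frequency of inquiry-based activities
Data collection approach: Classroom observation protocol
Respondents: Participating teachers
Schedule: Fall and spring of the project year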

Data Analyses

This section explains:

• Data analyses methods

• Data analyses stages and how they were applied

• Data integrity safeguards to exclude study confounds

• Description of measures to ensure the study participants represented the subject population

• A table summarizing the data analyses, if feasible

Findings

• Includes the results of the analyses
• Lists the study/project questions asked, even if no answer was produced from the study
• Visual displays such as graphs, if applicable
• A final project summary that explains the major conclusions

Conclusions and Recommendations

• Summarizes the findings at a more general level
• Addresses the project’s findings, evaluation questions, and program goals

Other Sections

• If applicable, formal reports can also include:
– a study abstract with a summary of findings
– an executive summary that gives a more detailed synopsis of the study and its conclusions, including implications for relevant practices

Developing an Evaluation Report

• Ask colleagues or stakeholders for a critical review of the report and any suggestions for improvement prior to final publication
• The report should be presented in a way that is suitable for the intended audience

Step 4: Disseminating the Information

● Dissemination (spreading your information) is the last part of project evaluation.
● It should include the project’s funding source or prospective funding sources.

The dissemination plan should address:

• what each audience needs to know and the best way to communicate the information to them
• in what ways the project accomplished its goals, according to the National Science Foundation

Sharing your Findings

• Make a list of the various audiences with whom to share findings.
• It is important to share both the projects that worked and those that were less fruitful.

Conclusion

An effective study evaluation should include collecting data, analyzing the data, and reporting and publishing the study information. There are various elements to consider in order to maintain integrity and avoid bias in collecting study data. Objectivity in reporting the study evaluation findings to stakeholders, school district administrators, and study funding sources will enhance the ongoing development of educational research for all professionals.

References

Frechtling, J., Mark, M., Rog, D., Frierson, H., Thomas, V., Hood, S., & Hughes, G. (2010). The 2010 User Friendly Handbook of Project Evaluation. Arlington, VA: National Science Foundation, Directorate for Education & Human Resources, Division of Research, Evaluation and Communication.

Moore, D., & McCabe, G. (1993). Introduction to the practice of statistics (2nd ed.). New York: Freeman.

Open-Ended Questions for the Class

● In data collection, we must consider the needs and sensitivities of the participants. For example, it would be biased to have a police officer ask people if they broke the law. What other examples can you think of where the data collector might affect the information received? How might this affect us in the classroom as teachers?

● For reporting the findings and study dissemination, what are the most important factors to consider in how and to whom the findings are reported?