

This chapter offers insights related to the value of involving evaluation early in the program development process and helping funders understand the benefits of evaluation.

7

Another Perspective: An Interview with David Smith

Ruth A. Bowman, Kelli Johnson

To provide another perspective on evaluation within nonformal settings, New Directions for Evaluation recently interviewed David Smith, the coordinator of the Professional Learning to Close the Achievement Gap program for the Kansas City, Kansas, Public Schools, who has an extensive background in education and educational research. He formerly held positions with the Partnership for Children (a child advocacy organization), the Annenberg Institute for School Reform, and the Kettering Foundation. We asked him to reflect on his experiences with evaluating these programs from the perspective of program staff, evaluator, and funder.

NEW DIRECTIONS: What are the critical aspects of nonformal education that influence the evaluation of these programs?

DAVID SMITH: Everything matters, but one aspect that gets less attention than it should is building the knowledge base among nonformal education practitioners about the field of evaluation. Frequently, programs seek evaluation services only after they are up and running—in a manner akin to designing the brakes while the car is speeding down the highway.

I worked for a program that supported the creation of after-school programs for middle school students. They obtained funding to begin the program, but before they could expand, they needed to quickly demonstrate impressive program impact. So they brought in evaluators at that point rather than at the beginning, during the program design phase.

In addition, many nonformal programs feel pressure to use evaluation to “prove” something to individuals outside the field. In youth development work, for example, many evaluators recognize that the important outcomes are whether the youth achieve the developmental milestones toward becoming well-functioning, productive adults. We know that access to supports and opportunities (for example, high-quality after-school programs) makes them more likely to reach these milestones.

Unfortunately, many policymakers are focused on how many acts of violence were prevented or how much grades and test scores improved. Especially in times of scarce funding, this situation results in many well-meaning program directors promising things that evaluation really can’t deliver.

NEW DIRECTIONS: What advice do you have for novices or other evaluators new to nonformal settings? What is the most important characteristic for an evaluator of a nonformal program?

DAVID SMITH: Evaluators who find themselves in unfamiliar nonformal education settings should be sure to learn about the field they are working in rather than counting on the local program to have that knowledge. Such evaluators will also be well served by getting in on the ground floor of planning the program design rather than being brought in after the program is up and running. Evaluators who are conversant with the state of research and evaluation in that field and who bring new information to the table will have substantially more credibility and a stronger voice.

NEW DIRECTIONS: What specific evaluation practices have been particularly effective in working with nonformal programs and settings?

DAVID SMITH: It is particularly effective for evaluators to build relationships with the program funders by educating them about best practices in both the program area and the evaluation of that area. Educated funders can be extremely helpful in ensuring that a thoughtful evaluation component is built into the program design. If funders have a deep understanding of evaluation and what it can tell them about the work they are being asked to fund, they will be better able and much more likely to build resources into their grant making to support quality evaluations.

NEW DIRECTIONS: Describe the best practices in data collection for evaluation of nonformal programs. What methods work well?



DAVID SMITH: First, the methods for evaluating nonformal education programs and settings are very program and field specific. Next, the best evaluations employ a strong degree of standardization in data collection, and they provide training before the program starts. Finally, the top programs also provide funding for data collection as a part of the grants.

NEW DIRECTIONS: Why is this volume of New Directions needed?

DAVID SMITH: I think this is a critical area of evaluation growth, in part because of what we are learning about the important role that nonformal education programs play in youth development—information that has come from the field of evaluation.

NEW DIRECTIONS: What should be next in the evolution of evaluation in nonformal settings?

DAVID SMITH: As I mentioned earlier, it is critically important for evaluators to build relationships with the program funders. This can be done by taking the time to educate funders about program and evaluation best practices in a particular issue area. Funders who understand the important contributions evaluation offers a program are more likely to support resources for evaluation as part of their grant-making process.

RUTH A. BOWMAN is an evaluation studies doctoral student in the Department of Educational Policy and Administration at the University of Minnesota.

KELLI JOHNSON is an evaluation studies doctoral student in the Department of Educational Policy and Administration at the University of Minnesota.
