
How do we evaluate how effective we are in helping our students master course material?


Page 1

How do we evaluate how effective we are in helping our students master course material?

In promoting our broader educational agendas?

- Prelim and final exam performance
- % F, D, and C- grades
- Course evaluations

The problem of silent evidence.

Page 2

How do we evaluate how effective we are in helping our students master course material?

In promoting our broader educational agendas?

- % of initial enrollees who complete the course.

- Course enrollment trends (e.g., elective enrollees)

- Standardized pre- and post-testing

Page 3

Idea:

• Generate and validate a test to evaluate student understanding of key concepts

• Give the test to the students at the beginning of the semester.

• Give the same test to the students at the end of the semester.

• Compare pre- and post-test scores for each student to evaluate learning gains.
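
A minimal sketch of that per-student comparison, assuming score records keyed by student ID and a hypothetical 30-point test; the data layout and max_score are illustrative assumptions, not the format of any actual testing system:

    # Match each student's pre- and post-test scores and compute a gain.
    # Data layout and max_score are assumptions for illustration only.
    def learning_gains(pre, post, max_score=30):
        """pre, post: dicts mapping student ID -> raw score on the same test."""
        gains = {}
        for sid in pre.keys() & post.keys():      # students who took both tests
            pre_frac = pre[sid] / max_score
            post_frac = post[sid] / max_score
            # Normalized gain: improvement as a fraction of the possible improvement.
            if pre_frac < 1.0:
                gains[sid] = (post_frac - pre_frac) / (1.0 - pre_frac)
        return gains

    # Example: a student moving from 12/30 to 22/30 has a normalized gain of ~0.56.
    print(learning_gains({"s1": 12}, {"s1": 22}))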

Page 4

Two kinds of tests:

• Concept Tests: Evaluate conceptual understanding of core course content

Force Concept Inventory (FCI)

Force and Motion Conceptual Evaluation (FMCE)

Conceptual Survey of Electricity and Magnetism (CSEM)

Page 5

Page 6

Page 7

Two kinds of tests:

• Attitude Tests: Evaluate attitudes and beliefs about science and learning science.

Epistemological Beliefs Assessment for Physical Science (EBAPS)

Colorado Learning Attitudes about Science Survey (CLASS)

Maryland Physics Expectations Survey (MPEX)

Page 8

2. When it comes to understanding physics or chemistry, remembering facts isn’t very important.

13. If physics and chemistry teachers gave really clear lectures, with plenty of real-life examples and sample problems, then most good students could learn those subjects without doing lots of sample questions and practice problems on their own.

Page 9

20. In physics and chemistry, how do the most important formulas relate to the most important concepts? Please read all choices before picking one.

(a) The major formulas summarize the main concepts; they’re not really separate from the concepts. In addition, those formulas are helpful for solving problems.

(b) The major formulas are kind of "separate" from the main concepts, since concepts are ideas, not equations. Formulas are better characterized as problem-solving tools, without much conceptual meaning.

Page 10

25. Anna: I just read about Kay Kinoshita, the physicist. She sounds naturally brilliant.

Emily: Maybe she is. But when it comes to being good at science, hard work is more important than “natural ability.” I bet Dr. Kinoshita does well because she has worked really hard.

Anna: Well, maybe she did. But let’s face it, some people are just smarter at science than other people. Without natural ability, hard work won’t get you anywhere in science!

(a) I agree almost entirely with Anna.
(b) Although I agree more with Anna, I think Emily makes some good points.

. . .

Page 11

1990s: FCI in P207 (Littauer, Holcomb)

2008: PhysTEC-mandated testing in intro courses

2009: Online pre- and post-testing

Physics 1112, 1116: FMCE, EBAPS

Physics 2207: FCI, EBAPS

Physics 2213, 2217: CSEM

Page 12

Fall 2009 Pre-Test Response Rate

P1112: 155/195 (79%)*
P1116: 29/71 (41%)
P2207: 123/319 (39%)
P2213: 159/397 (40%)
P2217: 23/47 (49%)

* P1112 offered a tiny amount of course credit for taking the test.
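
The percentages are simply pre-test responses divided by enrollment; a quick check of the figures above (counts copied from the slide):

    # Response rate = students who completed the pre-test / students enrolled.
    counts = {"P1112": (155, 195), "P1116": (29, 71), "P2207": (123, 319),
              "P2213": (159, 397), "P2217": (23, 47)}
    for course, (took, enrolled) in counts.items():
        print(f"{course}: {took}/{enrolled} = {took / enrolled:.0%}")
    # Reproduces the slide's 79%, 41%, 39%, 40%, 49%.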

Page 13

P2207
Average: 14.8 (49%)
St Dev: 5.9 (20%)
[histogram of pretest scores]

Page 14

P1112
Average: 20.5 (62%)
St Dev: 8.5 (26%)
[histogram of pretest scores]

Page 15

P1116
Average: 29.2 (88%)
St Dev: 7.3 (22%)
[histogram of pretest scores]

Page 16

P2213
Average: 12.8 (43%)
St Dev: 4.9 (16%)
[histogram of pretest scores]

Page 17

P2217
Average: 20.0 (65%)
St Dev: 4.3 (14%)
[histogram of pretest scores]
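
A sketch of how each histogram's summary line can be produced from a raw score list; the 30-point maximum is a placeholder, since the actual maximum differs by test (FCI, FMCE, CSEM):

    import statistics

    # Summarize a list of raw pretest scores, both in points and as a percent
    # of the maximum possible score. max_score is an assumed placeholder.
    def summarize(scores, max_score=30):
        mean = statistics.mean(scores)
        sd = statistics.stdev(scores)             # sample standard deviation
        return (f"Average: {mean:.1f} ({mean / max_score:.0%}), "
                f"St Dev: {sd:.1f} ({sd / max_score:.0%})")

    # e.g. summarize(p2207_scores) would reproduce "Average: 14.8 (49%), St Dev: 5.9 (20%)"
    # for the P2207 FCI data shown above.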

Page 18

Using Pretest Results

- Average level and distribution of student preparation. (Are you attracting the most diverse class possible?)
- Identify students who may need help.
- Identify common misconceptions, and give homework problems and exam questions to rectify them.
- Disaggregate student completion/performance based on pretest score/prior preparation (a sketch follows below).
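
One way to do the last item, sketched with pandas; the column names (pretest, completed, final_exam) are invented for illustration and would come from whatever the testing and registrar exports actually contain:

    import pandas as pd

    # df: one row per enrolled student.
    #   pretest    - pretest score
    #   completed  - True if the student finished the course
    #   final_exam - final exam score (NaN for non-completers)
    def disaggregate(df, n_bins=4):
        # Bin students by pretest quartile, then compare outcomes across bins.
        prep = pd.qcut(df["pretest"], n_bins,
                       labels=["Q1 (lowest)", "Q2", "Q3", "Q4 (highest)"])
        return df.groupby(prep)[["completed", "final_exam"]].mean()

    # disaggregate(students) -> completion rate and mean exam score per preparation quartile.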

Page 19

FCI Results for Physics 207

Semester    N    Pre-Test Average    Post-Test Average    Normalized Gain    Standard Error
Fall 2007   142  0.442               0.682                0.436              0.025
Fall 2008   244  0.475               0.693                0.419              0.021
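
For reference, normalized gain here follows the standard (Hake) definition: the improvement achieved as a fraction of the improvement possible. Computed directly from the class averages it comes out slightly below the tabulated values (0.430 vs 0.436, 0.415 vs 0.419), a difference that would be expected if the table averages per-student gains rather than taking the gain of the class averages:

    # Normalized gain g = (post - pre) / (1 - pre), with scores as fractions correct.
    def normalized_gain(pre_avg, post_avg):
        return (post_avg - pre_avg) / (1.0 - pre_avg)

    print(normalized_gain(0.442, 0.682))   # ~0.430 (Fall 2007; table reports 0.436)
    print(normalized_gain(0.475, 0.693))   # ~0.415 (Fall 2008; table reports 0.419)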