IDEA Student Ratings of Instruction Update Carrie Ahern and Lynette Molstad Selected slides reproduced with permission of Dr. Amy Gross from The IDEA Center www.idea.ksu.edu


Page 1: IDEA Student Ratings of Instruction Update

IDEA Student Ratings of Instruction Update

Carrie Ahern and Lynette Molstad

Selected slides reproduced with permission of Dr. Amy Gross from The IDEA Center

www.idea.ksu.edu

Page 2: IDEA Student Ratings of Instruction Update

04/21/23

Presentation

Process at DSU for online IDEA surveys

Review the IDEA Student Ratings of Instruction system: Forms and Reports

Questions

Page 3: IDEA Student Ratings of Instruction Update

Process for IDEA Surveys

Faculty receive an e-mail for each course with a link to the FIF (new copy feature)

Faculty receive a unique URL for each course and must provide it to students

Faculty receive status updates on how many students have completed the survey

Questions

Page 4: IDEA Student Ratings of Instruction Update

IDEA as a Diagnostic to Guide Improvement

And as a Tool to Evaluate Teaching Effectiveness

Page 5: IDEA Student Ratings of Instruction Update

IDEA Student Ratings of Instruction

The Student Learning Model

Page 6: IDEA Student Ratings of Instruction Update

Student Learning Model

Types of learning must reflect instructor’s purpose

Effectiveness determined by student progress on objectives stressed by instructor

Page 7: IDEA Student Ratings of Instruction Update

IDEA Student Ratings of Instruction Overview

Faculty Information Form

Student Survey - Diagnostic Form

Page 8: IDEA Student Ratings of Instruction Update

IDEA: FIF

Faculty Information Form

Page 9: IDEA Student Ratings of Instruction Update

Faculty Information Form

Some thoughts on selecting objectives

http://www.theideacenter.org/SelectingObjectives

Video for Faculty on completing the FIF

http://www.theideacenter.org/FIFVideo

Page 10: IDEA Student Ratings of Instruction Update

Faculty Information Form

One FIF per class being evaluated

Course Information
• IDEA Department Codes (extended list: http://www.idea.ksu.edu/StudentRatings/deptcodes.html)

12 Learning Objectives

Course Description Items
• Optional
• Best answered toward end of semester

Page 11: IDEA Student Ratings of Instruction Update

FIF: Selecting Objectives

Select 3-5 objectives as "Essential" or "Important":

Is it a significant part of the course?

Do you do something specific to help students accomplish the objective?

Does the student's progress on the objective influence his or her grade?

In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3).

Page 12: IDEA Student Ratings of Instruction Update

Best Practices

Multi-section courses

Curriculum committee review

Prerequisite-subsequent courses

Discuss the meaning of objectives with students

Incorporate objectives into the course syllabus

Page 13: IDEA Student Ratings of Instruction Update

New feature (as of 2/2010)

Copy FIF objectives from one course to another

Previous FIFs will be available in a drop-down menu (linked by faculty e-mail address)

Page 14: IDEA Student Ratings of Instruction Update

Page 15: IDEA Student Ratings of Instruction Update

Student Survey

Diagnostic Form

http://theideacenter.org/sites/default/files/Student_Ratings_Diagnostic_Form.pdf

Page 16: IDEA Student Ratings of Instruction Update

Student Survey: Diagnostic Form

Teaching Methods: Items 1-20
Learning Objectives: Items 21-32
Course Management/Content: Items 33-35
Student Characteristics: Items 36-39, 43
Global Summary: Items 40-42
Experimental Items: Items 44-47
Extra Questions: Items 48-67
Comments

Page 17: IDEA Student Ratings of Instruction Update

False Assumptions

Effective instructors effectively employ all 20 teaching methods.

The 20 teaching methods items are used to make an overall judgment about teaching effectiveness.

Students should make significant progress on all 12 learning objectives.

Page 18: IDEA Student Ratings of Instruction Update

Resources: Administering IDEA

www.idea.ksu.edu > Client Resources > IDEA Resources

Best practices
Directions to Faculty
Using Additional Questions
Some Thoughts on Selecting IDEA Objectives
Disciplinary Selection of Learning Objectives
Guide to Administering IDEA
Team Teaching

All resources are on our website.

Page 19: IDEA Student Ratings of Instruction Update

Report Background

Comparison Groups

Converted Scores

Page 20: IDEA Student Ratings of Instruction Update

The Report: Comparative Information

Comparison Groups: IDEA, Discipline, Institution

Page 21: IDEA Student Ratings of Instruction Update

Comparison Groups (norms)

IDEA Comparisons (Diagnostic Form)

Exclude first-time institutions

Exclude classes with fewer than 10 students

No one institution comprises more than 5% of the database

128 institutions; 44,455 classes

Updated only periodically
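The per-class exclusion rules above amount to a simple filter. As an illustrative sketch (the record fields are hypothetical, not IDEA's actual data schema; the 5% cap on any one institution applies to the pool as a whole and is omitted here):

```python
def in_idea_comparison_pool(cls):
    """Apply the per-class exclusion rules to one class record.

    `cls` is a hypothetical dict; the key names are illustrative only.
    """
    return (
        cls["form"] == "Diagnostic"            # Diagnostic Form classes only
        and not cls["first_time_institution"]  # exclude first-time institutions
        and cls["enrollment"] >= 10            # exclude classes with < 10 students
    )

# A 25-student Diagnostic Form class at an established client institution qualifies:
print(in_idea_comparison_pool(
    {"form": "Diagnostic", "first_time_institution": False, "enrollment": 25}))
```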

Page 22: IDEA Student Ratings of Instruction Update

Comparison Groups (norms)

Discipline Comparisons

Updated annually (September 1)

Most recent 5 years of data
• Approximately July 1-June 30

Exclusions same as IDEA Comparisons
• Also exclude classes with no objectives selected

Minimum of 400 classes

Page 23: IDEA Student Ratings of Instruction Update

Comparison Groups (norms)

Institutional Comparisons

Updated annually (September 1)

Most recent 5 years of data
• Approximately July 1-June 30

Includes Short and Diagnostic Forms

Exclude classes with no objectives selected

Minimum of 400 classes

Page 24: IDEA Student Ratings of Instruction Update

Norms: Converted Averages

Method of standardizing scores with different averages and standard deviations

Able to compare scores on the same scale

Uses T scores:
• Average = 50
• Standard Deviation = 10

They are not percentiles
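The T-score conversion described above is a standard standardization: subtract the comparison group's average, divide by its standard deviation, then rescale so converted averages have a mean of 50 and an SD of 10. A minimal sketch (the example numbers are made up, not from an actual IDEA report):

```python
def t_score(raw_avg, group_mean, group_sd):
    # Standardize the raw average against the comparison group,
    # then rescale to a distribution with mean 50 and SD 10.
    z = (raw_avg - group_mean) / group_sd
    return 50 + 10 * z

# A raw progress rating of 4.2 in a comparison group averaging 4.0
# (SD 0.4) sits half a standard deviation above average: about 55.
print(round(t_score(4.2, 4.0, 0.4), 1))
```

This is why converted averages are not percentiles: a score of 55 means half a standard deviation above the comparison group's mean, not the 55th percentile.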

Page 25: IDEA Student Ratings of Instruction Update

Report Background

Adjusted Scores

Page 26: IDEA Student Ratings of Instruction Update

Adjusted Scores

Control for factors beyond instructor’s control

Regression equations

Link to video clip explaining Adjusted Scores

http://theideacenter.org/taxonomy/term/109

Page 27: IDEA Student Ratings of Instruction Update

Adjusted Scores: Diagnostic Form

Student Work Habits (#43)
Student Motivation (#39)
Class Size (Enrollment, FIF)
Student Effort (multiple items)
Course Difficulty (multiple items)
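IDEA's actual regression equations are published in its technical documentation; the underlying idea can be sketched as reporting the raw score relative to what the extraneous factors alone would predict. The numbers below are purely illustrative:

```python
def adjusted_score(raw, predicted_from_extraneous, overall_mean):
    # If factors outside the instructor's control (work habits,
    # motivation, class size, etc.) predict a higher-than-average
    # rating, that advantage is subtracted; a predicted
    # disadvantage is added back.
    return raw - (predicted_from_extraneous - overall_mean)

# A class whose circumstances predict a 4.2 rating against an
# overall mean of 4.0 had a 0.2 advantage, so a raw score of 4.3
# adjusts down to about 4.1.
print(round(adjusted_score(4.3, 4.2, 4.0), 1))
```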

Page 28: IDEA Student Ratings of Instruction Update

IDEA...The Report

Page 29: IDEA Student Ratings of Instruction Update

The IDEA Report

Diagnostic Form Report

What were students' perceptions of the course and their learning?

What might I do to improve my teaching?

Page 30: IDEA Student Ratings of Instruction Update

Questions Addressed: Page 1

What was the response rate and how reliable is the information contained in the report?

What overall estimates of my teaching effectiveness were made by students?

What is the effect of “adjusting” these measures to take into consideration factors I can’t control?

How do my scores compare to other comparison groups?

Page 31: IDEA Student Ratings of Instruction Update

Summary Evaluation of Teaching Effectiveness

Page 32: IDEA Student Ratings of Instruction Update

Questions Addressed: Page 2

How much progress did students report on the learning objectives that I identified as "Essential"?

How does this progress compare to the available comparison groups?

How much progress did students report on the “Important” objectives?

How does this progress compare to the available comparison groups?

Do conclusions change if “adjusted” rather than “raw” ratings are used?

Page 33: IDEA Student Ratings of Instruction Update

Progress on Specific Objectives

[Chart: average progress ratings on the selected objectives, ranging from about 3.8 to 4.1]

Page 34: IDEA Student Ratings of Instruction Update

Questions Addressed: Page 3

Which of the 20 teaching methods are most related to my learning objectives?

How did students rate my use of these important methods?

What changes should I consider in my teaching methods?

Do these results suggest some general areas where improvement efforts should focus?

Page 35: IDEA Student Ratings of Instruction Update

Improving Teaching Effectiveness

Page 36: IDEA Student Ratings of Instruction Update

Improving Teaching Effectiveness IDEA Website: http://theideacenter.org/

IDEA Papers

http://www.theideacenter.org/category/helpful-resources/knowledge-base/idea-papers

Page 37: IDEA Student Ratings of Instruction Update

Questions Addressed: Page 2

How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter?

How distinctive is this class with regard to student self-ratings?

Page 38: IDEA Student Ratings of Instruction Update

Description of Course and Students

Page 39: IDEA Student Ratings of Instruction Update

Questions Addressed: Page 4

What was the average rating on each of the questions on the IDEA form?

How much variation was there in these ratings?

Are the distributions of responses relatively "normal" (bell-shaped), or is there evidence of distinctive subgroups of students?

What are the results for the additional questions I used?

Page 40: IDEA Student Ratings of Instruction Update

Statistical Detail

Page 41: IDEA Student Ratings of Instruction Update

Statistical Detail

Page 42: IDEA Student Ratings of Instruction Update

Questions & Discussion