Evaluating Student Learning Experiences
Michael Prosser, Higher Education Academy


Page 1

Evaluating Student Learning Experiences

Michael Prosser, Higher Education Academy


Page 2

Academy Mission

The Academy’s mission is to help institutions, discipline groups and all staff to provide the best possible learning experience for their students.

Page 3

Presentation based upon:

• Over 20 years research into the student learning experience in higher education in the United Kingdom, Australia, Sweden and Hong Kong

• Experience in Australia with the Course Experience Questionnaire

Interpretations of the results of evaluations of student learning experiences are not value- or theory-free:

• Interpretations in terms of student satisfaction

• Interpretations in terms of student learning experiences

Page 4

  Overview of the student learning perspective

Figure 1: Model of Student Learning

CHARACTERISTICS OF THE STUDENT (e.g. previous experiences, current understanding)

STUDENTS' PERCEPTIONS OF CONTEXT (e.g. good teaching, clear goals)

COURSE AND DEPARTMENTAL LEARNING CONTEXT (e.g. course design, teaching methods, assessment)

STUDENTS' APPROACHES TO LEARNING (how they learn, e.g. surface/deep)

STUDENTS' LEARNING OUTCOMES (what they learn, quantity/quality)

Page 5

STUDENT APPROACHES TO LEARNING

 

Surface Approach

Intention to reproduce

- rote memorise information needed for assessment

- failure to distinguish principles from examples

- treat tasks as external impositions

- focus on discrete elements without integration

 

Deep Approach

Intention to understand

- meaningfully memorise information for later use

- relate new ideas to previous knowledge

- relate concepts to everyday experiences

- relate evidence to conclusions 

Approaches vary between tasks and modules

Page 6

STUDENTS’ EXPERIENCES OF THE LEARNING CONTEXT

Early research by Entwistle and Ramsden (1983), using both interviews and questionnaires, identified a number of student experiences relating to the way they approached their studies.

Student experiences of:

Quality of teaching – including quality of feedback

(NSS: Teaching, Assessment and Feedback)

Clarity of the goals of the course and the standards of assessment

(NSS: Assessment and feedback)

Workload so high that it was not possible to understand everything

Assessment measuring reproduction and not understanding

were found to relate to how they approached their studies and to their learning outcomes (exam results and other indicators).

Page 7

RELATIONSHIP BETWEEN COURSE EXPERIENCES AND APPROACHES

Study of over 8000 students in first year subjects around Australia

Amongst the data collected were students' responses to:

a contextualised version of Ramsden's Course Experience Questionnaire, and

a contextualised version of Biggs' Study Process Questionnaire

1994-1996: Australian Research Council; Academic Departments and the Quality of Teaching and Learning; Paul Ramsden, Griffith University; Elaine Martin, RMIT; Michael Prosser, La Trobe University; Keith Trigwell, UTS

Page 8

Approaches to Study

Surface Approach

32. Although I generally remember facts and details, I find it difficult to fit them together into an overall picture

35. The best way for me to understand what technical terms mean is to remember the textbook definitions

 

Deep Approach

28. I try to relate ideas in this subject to those in other subjects, wherever possible

34. In trying to understand new ideas, I often try to relate them to real life situations to which they might apply.

 

Biggs Study Process Questionnaire

Page 9

Student Experiences Of Learning Environment

Good Teaching

15. The staff made a real effort to understand difficulties students might be having with their work.

Clear Goals and Standards

1. It was always easy to know the standard of work expected

6. I usually had a clear idea of where I was going and what was expected of me in this subject.

Appropriate Workload

25. The sheer volume of work in this subject meant that it couldn't all be thoroughly comprehended (-).

Appropriate Assessment

8. To do well in this subject, all you really need is a good memory (-).

Ramsden’s Course Experience Questionnaire
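The items above are answered on Likert-type agreement scales, and items marked "(-)" are negatively worded, so they are reverse-scored before scale scores are computed. Here is a minimal sketch of that scoring convention, assuming a 5-point scale and using made-up responses; the data and item groupings are illustrative only, not the actual CEQ scoring materials.

```python
import numpy as np

# Made-up responses on a 1-5 agreement scale: each row is a student, each
# column one of the example CEQ items on this slide, in the order
# 15 (Good Teaching), 1 and 6 (Clear Goals), 25 (Workload), 8 (Assessment).
responses = np.array([
    [4, 5, 4, 2, 1],
    [3, 4, 4, 4, 5],
    [5, 3, 2, 1, 2],
], dtype=float)

# Items 25 and 8 are marked "(-)": reverse-score them (6 - response on a
# 1-5 scale) so that a higher score always indicates a more positive
# experience (i.e. appropriate workload, appropriate assessment).
responses[:, [3, 4]] = 6 - responses[:, [3, 4]]

# Scale scores are then simple means over each scale's items.
scales = {
    "Good Teaching": [0],
    "Clear Goals and Standards": [1, 2],
    "Appropriate Workload": [3],
    "Appropriate Assessment": [4],
}
for name, columns in scales.items():
    print(name, responses[:, columns].mean(axis=1))
```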

Page 10

Factor Analysis of Experiences of T & L Context and Approach to Study

_________________________________________________________________
                                           Factors
                                   ________________________
Scale                                   1             2
_________________________________________________________________
Experiences of Context
  Good teaching                        .80
  Clear Goals and Standards            .67
  Appropriate Workload                              -.69
  Appropriate Assessment                            -.65
Approach to Study
  Surface Approach                                   .81
  Deep Approach                        .73
_________________________________________________________________
Principal Components, Varimax Rotation, n=8837
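For readers who want to see how a pattern matrix like this is produced, here is a minimal sketch of a two-component principal components extraction with varimax rotation. The scale scores are randomly generated placeholders (the original analysis used the contextualised CEQ and SPQ responses of the 8837 students), and the varimax helper is a standard textbook implementation, not code from the study.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Varimax rotation of a loading matrix (standard Kaiser algorithm)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        gradient = loadings.T @ (
            rotated ** 3 - rotated @ np.diag(np.mean(rotated ** 2, axis=0))
        )
        u, s, vt = np.linalg.svd(gradient)
        rotation = u @ vt
        new_criterion = s.sum()
        if new_criterion < criterion * (1 + tol):
            break
        criterion = new_criterion
    return loadings @ rotation

# Placeholder scale scores: rows are students, columns are the six scales
# (Good Teaching, Clear Goals, Workload, Assessment, Surface, Deep).
rng = np.random.default_rng(0)
scores = rng.normal(size=(500, 6))

# Principal components of the correlation matrix, keeping two components.
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)          # eigenvalues in ascending order
top_two = np.argsort(eigvals)[::-1][:2]
loadings = eigvecs[:, top_two] * np.sqrt(eigvals[top_two])

print(np.round(varimax(loadings), 2))            # rotated factor loadings
```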

Page 11

 In each subject:

 

1. A deep approach is associated with experiencing the teaching as good and the goals and standards as clear (NSS: Teaching, Assessment and Feedback)

2. A surface approach is associated with experiencing the workload as too high and the assessment as testing reproduction

That is, variation in students’ experiences of the learning environment within subjects is associated with their approaches to study within subjects – within-subject variation in perception is not measurement error.

1994-1996: Australian Research Council; Academic Departments and the Quality of Teaching and Learning; Paul Ramsden, Griffith University; Elaine Martin, RMIT; Michael Prosser, La Trobe University; Keith Trigwell, UTS

Page 12

Comments on NSS

• A substantial body of research links results on surveys such as the NSS with student approaches to study and learning outcomes

• Student experiences are a function of both their prior experiences and understandings and the course design and teaching

• Focusing on satisfaction itself – the individual items or scales – does not necessarily improve student satisfaction; we need to better understand why students responded the way they did

• The spread of results – the proportions responding in certain ways – is more informative than the mean of responses

• Changes in scores over time on surveys such as the NSS represent broad cultural change – departmental and institutional experiences

• Can only expect small effect sizes in changes in scores over time – about .2 of an SD – which over 3-5 years means changes of the order of .1 to .2 points (3.5 to 3.6 or 3.7); see the worked sketch after this list

• Follow up with more qualitative studies – interviews, open-ended written statements, focus groups to better understand experiences
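As a rough check of that effect-size arithmetic, the sketch below converts a 0.2 SD effect into points on a 5-point survey scale. The standard deviations used are assumptions chosen for illustration, not NSS figures.

```python
# Convert a small effect size (in SD units) into points on a 5-point scale.
# The assumed SDs below are illustrative only, not NSS statistics.
effect_size_sd = 0.2
for assumed_sd in (0.5, 0.7, 1.0):
    change_in_points = effect_size_sd * assumed_sd
    print(f"SD = {assumed_sd:.1f}: a 0.2 SD shift is {change_in_points:.2f} points, "
          f"e.g. 3.5 -> {3.5 + change_in_points:.2f}")
```

With an SD anywhere in that range, a 0.2 SD change corresponds to roughly 0.1 to 0.2 points, consistent with the figures quoted above.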


Page 14

  Overview of the student learning perspective

Figure 1: Model of Student Learning

CHARACTERISTICS OF THE STUDENT (e.g. previous experiences, current understanding)

STUDENTS' PERCEPTIONS OF CONTEXT (e.g. good teaching, clear goals)

COURSE AND DEPARTMENTAL LEARNING CONTEXT (e.g. course design, teaching methods, assessment)

STUDENTS' APPROACHES TO LEARNING (how they learn, e.g. surface/deep)

STUDENTS' LEARNING OUTCOMES (what they learn, quantity/quality)

Page 15

Recent Review of Research and Evaluation on the Student Learning Experience in E-learning

Rhona Sharpe, Greg Benfield, Ellen Lessner and Eta DeCicco:

• Much research on design of learning experiences – teacher, course or program focused

• Much research on student observable behaviours

• Little or no research on the student learning experiences of those designs (intentions, perceptions, beliefs, outcomes etc)

• Little research on the student experience of the integration of online with face-to-face – how e-learning relates to their whole learning experience and to their lives and jobs

Page 16

Learning Through Blended Discussion

Peter Goodyear, Rob Ellis, Mike Prosser

(acknowledge Australian Research Council Support)

Work in Progress

Student experience of learning through online and face-to-face discussion

Using a phenomenographic approach to analysing student learning experiences

• Key aspects of differences in the way something is experienced – not rich, thick descriptions

• Second-order description and analysis – not describing what it is but how it is experienced.

• Categories of description

Page 17

Investigate associations amongst:

• how the students approached their discussion within blended learning context,

• how this variation was related to the quality of what they thought they were learning through discussion, and

• their performance in the course as a whole

In-depth, semi-structured interviews; short open-ended questionnaire; assessment results

Interviews: 30 minutes, transcribed, the main data for construction of the categories and outcome space

Short open-ended questionnaire: same questions as the leading questions used in the interviews

Page 18

Questions:

1. What did you learn through discussion in your course? This includes all of the discussions that you were involved in in the course.

2. How did you approach engaging in face-to-face discussion in your course? What sorts of things did you do to engage in the discussions? Why did you use these strategies to engage?

3. How did you approach engaging in online discussion in your course? What sorts of things did you do to engage in the discussions? Why did you use these strategies to engage?

Sample:

51 second-year undergraduate students studying social work at a large, research-intensive metropolitan Australian university – 19 completed semi-structured interviews

Page 19

Categories of Description of Conceptions of Learning through Discussions

A: Checking Ideas (Fragmented)

"Getting the teacher’s point of view … it’s good being able to talk and make sure you are really learning what you are supposed to be learning"

B: Acquiring Ideas (Fragmented)

"It elaborates the readings even more, like it sort of expands the readings a bit … Like you sort of remember it a bit more"

C: Developing Ideas (Cohesive)

"It sort of gives you different views of what people are getting out of the readings … I guess it gives me an appreciation that people do see things differently, that it’s not clear cut"

D: Challenging Ideas (Cohesive)

"It challenges my belief, which is always good … because a belief is something that is based on knowledge and experience and your understanding of the world, and if it is being challenged you are testing it … If my beliefs are challenged I believe that my understanding of concepts is more complete"

Page 20

Categories of Description of Approaches to Learning through Online Discussion

A (Surface): Engaging in online discussions to read all postings to avoid repetition

"I tend to read all of them first. Because I tend to want to write something a bit different … and sort of stand out a little bit because I thought I would get good marks for that … But then if I read all of them as well I can reply to some of them …"

B (Surface): Engaging in online discussion to use postings to add to my ideas

C (Deep): Engaging in online discussions to evaluate postings to challenge ideas

D (Deep): Engaging in online discussions to evaluate postings to reflect on key ideas

"It just makes me think … It wasn’t really that original but it was something which I hadn’t thought before. So I mean, I didn’t respond to it because I didn’t have much to say … It is something for me to think about and reply later."

Page 21

_________________________________________________________________
                                Approaches to Discussion Online
                                _______________________________
Conceptions of Learning         Surface (A&B)   Deep (C&D)   Totals
through Discussion
_________________________________________________________________
Fragmented (A&B)                     26              0          26
Cohesive (C&D)                        9             16          25
Totals                          35 (69%)       16 (31%)         51
_________________________________________________________________
Chi-square=24.2, phi=.69, p<.001 (statistically significant and a large effect size)
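A minimal sketch of how the chi-square and phi values for this 2x2 table can be reproduced (scipy, without Yates' continuity correction, which matches the reported figures):

```python
import numpy as np
from scipy.stats import chi2_contingency

# 2x2 table from the slide: rows are conceptions (Fragmented, Cohesive),
# columns are approaches to online discussion (Surface, Deep).
table = np.array([[26, 0],
                  [9, 16]])

chi2, p, dof, expected = chi2_contingency(table, correction=False)
phi = np.sqrt(chi2 / table.sum())  # phi coefficient for a 2x2 table

print(f"chi-square = {chi2:.1f}, phi = {phi:.2f}, p = {p:.2g}")
# Matches the reported chi-square = 24.2, phi = .69, p < .001
```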

There were no cases of a Fragmented conception of learning combined with a Deep approach to discussion online.

Students' understanding of why they are engaging in discussion is fundamental to adopting deep approaches to discussion online.

Page 22

Relationship between Conceptions, Approaches and Performance
___________________________________________________________________________
Aspects of the experience of                 Final Mark
learning through discussion          _______________________________
                                     Mean    Standard Deviation    Effect Size
___________________________________________________________________________
Conceptions
  Fragmented                         63.5           8.4
  Cohesive                           70.0           8.4
  (T test: t=2.8, p<.05)                                            d=.77 (large)
Approaches online
  Surface                            64.9           8.6
  Deep                               70.6           8.7
  (T test: t=2.2, p<.05)                                            d=.66 (medium to large)
___________________________________________________________________________
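As a quick illustration, the reported Cohen's d values can be recovered from the group means and standard deviations on the slide. Group sizes are not reported, so the sketch below assumes a simple pooled SD that weights the two groups equally.

```python
import math

def cohens_d(mean1, sd1, mean2, sd2):
    """Cohen's d with an equally weighted pooled SD (group sizes unknown)."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (mean2 - mean1) / pooled_sd

# Final marks reported on the slide
print(f"Conceptions (Fragmented vs Cohesive): d = {cohens_d(63.5, 8.4, 70.0, 8.4):.2f}")
print(f"Approaches (Surface vs Deep):         d = {cohens_d(64.9, 8.6, 70.6, 8.7):.2f}")
# Matches the reported d = .77 (large) and d = .66 (medium to large)
```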

Page 23

• Focus on the student experience rather than satisfaction – better understand their experiences

• Comprehensive and aligned set of student evaluation instruments designed and interpreted in terms of student experiences

• Conceptual model of teaching and learning based upon the student experience underlying all developments

• Follow up with more in-depth qualitative studies

• Focus on the student experience