Collecting feedback on quality indicators of the higher education student experience

Sonia Whiteley

The Social Research Centre

Student Experience Conference, Brisbane, Australia. 9th-10th December, 2014.


About the Social Research Centre

• We are a private, for-profit company owned by ANU Enterprise, a subsidiary of the Australian National University, and a co-founder of the Australian Centre for Applied Social Research Methods (AusCen).

• Our resources include 60 professional staff, a 125-station call centre, a panel of 250 interviewing staff and qualitative interviewing facilities.

• Typical services include survey design and execution (sampling, questionnaire design, survey administration and interviewer training), qualitative research, survey data management, statistical consulting, and analytical and interpretative reporting.


Context for the development of quality indicators

• A series of reviews conducted by the Australian Quality in Higher Education Reference Group concluded that indicators should be:

– Fit for purpose

– Consistent

– Auditable

– Transparent

– Timely

• With the ultimate goal of providing a robust and reliable measure of teaching performance throughout the Student Life Cycle.


The Student Life Cycle


Graduate employment

Source: The Department of Education - Australian Graduate Survey


Student satisfaction

Source: The Department of Education - Australian Graduate Survey


Our role in Quality Indicators for Learning & Teaching

• The Upholding Quality – Quality Indicators for Learning and Teaching (QILT) measure was part of the 2014-15 Budget.

• The Social Research Centre was commissioned by the Department of Education as the independent administrator of QILT which involves:

– Undertaking data collection for the University Experience Survey (UES), the Graduate Outcomes Survey (GOS) and the Employer Satisfaction Survey (ESS)

– Reporting on the outcomes of each of these surveys

– Supporting all higher education institutions (universities and private providers) in relation to QILT

– Creating, monitoring and updating the QILT website.

QILT in the context of the student life cycle


The University Experience Survey

• The University Experience Survey has been undertaken annually since 2012 with:

– Undergraduate university students

– Who are onshore in Australia, and

– Commencing their studies or in the final year of their degrees.

• Graduate Careers Australia and the Social Research Centre conducted the UES in 2013 and 2014.

• Around 100,000 surveys are completed by students each year.

• Data is collected through an online platform.


The University Experience Questionnaire

• The UEQ was developed in 2011 and 2012 through research review, consultation and experience.

• The instrument focuses on aspects of the university experience that are measurable, linked to learning and development outcomes, and amenable to institutional influence.

• The current UEQ domains include:

– Skills Development

– Learner Engagement

– Teaching Quality

– Student Support, and

– Learning Resources.


Examples of UEQ items

Domain and example item:

– Skills Development: To what extent has your course/program developed your ability to solve complex problems?

– Learner Engagement: To what extent have you felt prepared for your study?

– Teaching Quality: Overall, how would you rate the quality of your entire educational experience this year?

– Student Support: To what extent have you experienced efficient enrolment and admissions processes?

– Learning Resources: Thinking of this year, overall how would you rate the online learning material provided for your course/program?


Overall UES results

• 81% of commencing students were satisfied with the quality of teaching they received, compared with 76% of later-year students.

Domain (% satisfied): Commencing / Later year

– Skills Development: 77 / 82

– Learner Engagement: 56 / 58

– Teaching Quality: 81 / 76

– Student Support: 57 / 48

– Learning Resources: 86 / 79


Annual comparisons

• In 2012, 83% of students were satisfied with the quality of teaching, declining to 79% in 2013.

Domain (% satisfied): 2012 / 2013

– Skills Development: 82 / 79

– Learner Engagement: 58 / 57

– Teaching Quality: 81 / 79

– Student Support: 53 / 53

– Learning Resources: 82 / 83


UES & selected student characteristics

• Males are less satisfied with skills development.

• Younger students are more satisfied with learner engagement.

• Indigenous students are more satisfied with student support.

• International students are less satisfied with teaching quality.

• Students with a disability are less satisfied with learning resources.


The UES & subject area / field of education

• There is considerable variation across and within subject areas on all of the UES domains.

• Student support showed the lowest variation (16 percentage points), but also the lowest scores overall: satisfaction is low irrespective of field of education.

• Learner engagement had the widest range (33 percentage points), so the overall rating of 57% masks substantial variation.
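A percentage-point range like the one described above is simply the gap between the highest- and lowest-scoring fields of education. A minimal sketch, using invented field names and scores (the actual UES per-field figures are not reproduced here):

```python
# Hypothetical per-field learner engagement scores (% satisfied).
# These values are illustrative only, not actual UES results.
field_scores = {
    "Engineering": 48,
    "Education": 63,
    "Health": 71,
    "Science": 55,
    "Business": 38,
}

lowest = min(field_scores.values())
highest = max(field_scores.values())
range_pp = highest - lowest  # spread in percentage points

print(f"Range: {range_pp} percentage points ({lowest}% to {highest}%)")
```

The point of reporting the range alongside the national average is visible here: a single overall figure sits somewhere between the extremes and reveals nothing about how far apart the best- and worst-rated fields are.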


But what does this all mean?

• The national figures obscure important differences and insights at both an institutional and a faculty level.

• At an institutional level UES data can be used to support benchmarking and service improvement.

• Institutions need to identify their own internal benchmarks and appropriate ‘like’ institutions for external benchmarking activities.

– All institutions have access to the UES National Data File (unit record data) for this specific purpose.

• UES data is provided in a timely manner and action can be taken to improve the university experience for current and future students.
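To illustrate the kind of benchmarking the unit record data supports, here is a hedged sketch of computing an institution-level satisfaction rate from per-respondent records. The column names, response coding (a 1-5 scale with 4-5 counted as satisfied) and the toy data are assumptions for illustration, not the actual layout of the UES National Data File:

```python
import csv
import io

# Toy stand-in for unit-record data: one row per respondent.
# "teaching_quality" holds an assumed 1-5 agreement score.
records = io.StringIO(
    "institution,teaching_quality\n"
    "Uni A,5\nUni A,4\nUni A,2\nUni A,4\n"
    "Uni B,3\nUni B,5\n"
)

counts = {}  # institution -> [satisfied respondents, total respondents]
for row in csv.DictReader(records):
    sat, total = counts.setdefault(row["institution"], [0, 0])
    satisfied = int(row["teaching_quality"]) >= 4  # 4-5 counted as satisfied
    counts[row["institution"]] = [sat + satisfied, total + 1]

for inst, (sat, total) in counts.items():
    print(f"{inst}: {100 * sat / total:.0f}% satisfied with teaching quality")
```

Running the same aggregation over a chosen set of 'like' institutions gives the external benchmark; running it over an institution's own faculties gives the internal one.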


The UES in 2015

• The UES will be ‘rebadged’ as the SES, primarily to accommodate the inclusion of Non-University Higher Education Providers (NUHEPs).

• Online data collection will commence in August 2015 with 40 universities and up to 105 NUHEPs.

• The online survey is open for approximately 30 days.

• Raw data is available to institutions throughout fieldwork.

• Final, coded data will be available to participating institutions in November 2015.

• The National Report is scheduled for release in January 2016.


Implications for higher education institutions

• With a move towards a more deregulated higher education system for undergraduate students, competition between institutions for students is anticipated to be a driver of quality.

• To ensure competition does drive quality, prospective students and their families require clear information about the quality of courses and institutions.

• The UES/SES will be a key source of data about students' views on the quality of their higher education learning and teaching experience.


Opportunities

• Access to ‘live data’ allows institutions to take immediate action with vulnerable or at-risk students.

• Institutions are able to tailor the UES/SES by adding their own questions and extending coverage to additional student groups of interest.

• Benchmarking can be undertaken for institutions, faculties and occasionally courses.


Questions?

[email protected]
