
Preparing for an inter-institutional Benchmarking activity using the ACODE Benchmarks for TEL

Associate Professor Michael Sankey, Vice-President ACODE

Tuesday March 7, 2017

Introduction

In 2014 and 2016 ACODE ran two major Benchmarking Summits, in Sydney and Canberra, using the ACODE Benchmarks for TEL. These were unique events in Australasian higher education, with 35 institutions from 5 countries engaged so far. Each institution first undertook a self-assessment of its capacity in TEL against the performance indicators, then shared this with the other institutions involved. Each assessed a minimum of two benchmarks, with many doing far more. To maximise the value of this activity you need to start engaging with the tool well before the formal benchmarking activity. This workshop will help those new to the ACODE benchmarks:

understand what is required when using the tool;
provide a guide to ensure participation is undertaken in a rigorous way; and
find a practical way to run an internal activity before being involved inter-institutionally.

We will work through a number of different scenarios to help you understand the many facets that need to be considered in undertaking this activity, and look at a plan of action for your institution to be involved.

ACODE

ACODE is a representative body; a council, not a society or guild. It exists to:

disseminate and share knowledge and expertise;
support professional development and provide networking opportunities;
investigate, develop and evaluate new approaches;
advise and influence key bodies in higher education; and
promote best practice.

Well used e-learning quality & benchmarking tools

ACODE Benchmarks for TEL
Description: A set of benchmarking statements designed to assist institutions in improving the quality of technology enhanced learning. Statements of good practice are provided along with a ranking scale; the focus is on a team-based self-assessment. CC licensed.
Change theory: Collaborative benchmarking.
Validation: Face validity supported by expert review; revised following experience in implementation.
References: Sankey et al. (2014, 2016)

EADTU E-xcellence Next
Description: A framework operated by the European Association of Distance Teaching Universities. A set of quality indicators/benchmarks is provided for self-assessments, which may be referenced by external QA schemes. CC licensed.
Change theory: Collaborative benchmarking.
Validation: Face validity supported by expert review; revised following experience in implementation.
References: Ehlers (2012); EADTU (2012)

EFMD Certification of E-learning (CEL)
Description: A three-year accreditation scheme for e-learning management operated by the European Foundation for Management Development (EFMD). It includes a mix of self-assessment and a detailed accreditation audit, with an 18-month review.
Change theory: None.
Validation: Face validity supported by expert review.
References: Ehlers (2012)

EFQUEL UNIQUe Certification
Description: A European Foundation for Quality in E-Learning quality certification for courses, programmes and institutional systems that certifies the whole institution. The guidelines are supplied with supporting questions. A formalised process of self-assessment and peer review, similar to an accreditation audit. Restricted to eligible institutions.
Change theory: None.
Validation: Face validity supported by literature review and extensive reviews undertaken by experts and quality assurance bodies.
References: EFQUEL (2011); Ehlers (2012)

e-Learning Guidelines (eLG)
Description: A guide to designing, implementing and enhancing eLearning. A framework of questions to encourage reflection by a range of stakeholders; no detailed good-practice guidance is provided. CC licensed.
Change theory: None.
Validation: Face validity supported by expert review and literature review; revised following experience in implementation.
References: Suddaby and Milne (2008)

e-Learning Maturity Model (eMM)
Description: A quality improvement framework incorporating a benchmarking process and an extensive knowledgebase. An extensive set of processes broken down into detailed organisational practice statements. CC licensed.
Change theory: Maturity model.
Validation: Process and practice revised after three rounds of expert consultation conducted internationally, an extensive set of cases and a peer-reviewed framework.
References: Marshall (2012a, 2012b); Neal & Marshall (2008)

Taking the Lead
Description: Not a quality framework, but rather a tool for identifying the strategic goals for e-learning to be improved.
Change theory: None.
Validation: Face validity supported by literature review and case studies.

Quality Matters (QM)
Description: A quality checklist designed to improve individual online courses through a form of audit process. Checklist items are supported by descriptions of good practice applied by reviewers, after training. The focus is on professional development of staff for online teaching and quality assurance of courses. A not-for-profit framework requiring a licence to use.
Change theory: None.
Validation: Face validity supported by literature review and case studies.
References: Varonis (2014)

The Benchmarks

Originally developed back in 2004
Used many times since
In 2014 we shifted the focus away from eLearning to Technology Enhanced Learning (TEL)
Developed the self-assessment templates
Provided guidelines for using the instrument

The 8 Benchmarks for TEL

1. Institution-wide policy and governance for technology enhanced learning
2. Planning for institution-wide quality improvement of technology enhanced learning
3. Information technology systems, services and support for technology enhanced learning
4. The application of technology enhanced learning services
5. Staff professional development for the effective use of technology enhanced learning
6. Staff support for the use of technology enhanced learning
7. Student training for the effective use of technology enhanced learning
8. Student support for the use of technology enhanced learning

Extension

The methodology provides institutions with:

a platform to self-assess their standing against some or all of the 8 benchmarks, and to stimulate meaningful conversations, at a local level, around how they are using technology to support learning and teaching;
an opportunity to share and learn from each other, based on their individual institutional responses (via an inter-institutional event every two years); and
an enduring record of this via the newly developed online tool (or site).

This resulted in:

Benchmarks assessed, by institution:

Asia Pacific International College: 2
Auckland University of Technology: 2
Australian Catholic University: 3
Australian National University: 0
Charles Sturt University: 0
Christchurch Polytechnic: 2
Curtin University: 2
Edith Cowan University: 0
Federation University: 4
Flinders University: 2
La Trobe University: 0
Lincoln University: 2
Macquarie University: 2
Monash College: 0
Open University - UK: 4
Queensland University of Technology: 2
RMIT University: 0
Swinburne University: 0
University of Auckland: 1
University of Canberra: 2
University of Melbourne: 0
University of New England: 4
University of Notre Dame: 0
University of Otago: 5
University of Southern Queensland: 4
University of South Africa: 3
University of the South Pacific: 2
University of the Sunshine Coast: 0
University of Tasmania: 0
University of Technology Sydney: 2
University of Western Australia: 2
University of Western Sydney: 3
University of Wollongong: 4
Victoria University (Melbourne): 2
Victoria University Wellington: 8

Totals per benchmark:
              BM1  BM2  BM3  BM4  BM5  BM6  BM7  BM8
Total (2014)   11    8    8   10   12    9    5    6
Total (2016)   12   12   14   16   19   13    6    8
Total          23   20   22   26   31   22   11   14
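As a quick arithmetic check, the combined totals follow directly from the 2014 and 2016 per-benchmark counts:

```python
# Per-benchmark totals from the two summits (BM1 through BM8).
totals_2014 = [11, 8, 8, 10, 12, 9, 5, 6]
totals_2016 = [12, 12, 14, 16, 19, 13, 6, 8]

# Element-wise sum reproduces the combined "Total" row.
combined = [a + b for a, b in zip(totals_2014, totals_2016)]
print(combined)  # [23, 20, 22, 26, 31, 22, 11, 14]
```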


But to get to this point: managing the logistics

We first had to do a self-assessment
Pull people together from different sections
Agree on where we stood
Provide a rationale and evidence as to why

Short activity: Benchmark 1
Institution-wide policy and governance for technology enhanced learning

Performance indicators:

1. Institution strategic and operational plans support and promote the use of technology enhanced learning.
2. Specific plans relating to the use of technology enhanced learning are aligned with the institution's strategic directions and operational plans.
3. Planning for the ongoing use of technology enhanced learning is aligned with the institution's budget process.
4. Institution policies, procedures and guidelines provide a framework for how technology enhanced learning should be used at both a course and program level.
5. Policies, procedures and guidelines on the use of technology enhanced learning are well communicated and integrated into processes and systems.
6. The institution has established mechanisms for the governance of technology enhanced learning that include representation from key stakeholders.
7. Authority and responsibility for the operational management of the technologies used to enhance learning and teaching are clearly articulated.
8. The institution uses a clearly articulated policy framework and governance structure when deciding on the adoption of new technologies.


Worked with the Word templates
Set up in SharePoint
Had a person at 0.2 of a workload to wrangle it
Each shared their individual ratings
Then collated the feedback on the Word docs
Then entered the collated data into the online tool
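The collation step could be sketched roughly as below; the participant names, indicator labels and the use of a median as a discussion starting point are all illustrative assumptions, since the actual collated ratings were agreed by discussion rather than computed.

```python
# Hypothetical sketch: gather each participant's 1-5 rating per performance
# indicator (PI), then summarise per PI as a starting point for the group
# to agree a final collated rating.
from collections import defaultdict
from statistics import median

individual_ratings = [
    # (participant, performance indicator, rating 1-5) - made-up values
    ("staff_a", "PI 1.1", 4),
    ("staff_b", "PI 1.1", 3),
    ("staff_a", "PI 1.2", 2),
    ("staff_b", "PI 1.2", 3),
]

# Group the ratings by performance indicator.
by_indicator = defaultdict(list)
for _, pi, rating in individual_ratings:
    by_indicator[pi].append(rating)

# Print a suggested starting rating per PI; the group then discusses.
for pi, ratings in sorted(by_indicator.items()):
    print(pi, ratings, "suggested starting rating:", median(ratings))
```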

The online tool

Split indicators

Victoria University Wellington

They used Google Docs as the main tool for collaborating on the assessments, created by a simple copy/paste of the Word template. The goal was to use something that made it easy for a range of staff to edit the files and add content from a range of sources directly.

VUW staff made individual assessments based on local knowledge and evidence (policy and other documentation). Assessments were completed by groups of 2-3 staff, with a sub-group leader responsible for collating and reporting them. The staff involved were drawn from across the university:

Centre for Academic Development
Information Technology Services
Vice-Provost Research
Library
Faculty of Science
Faculty of Education
Student Learning

The initial assessments were then workshopped with all participants, in a session held in an active learning space that encouraged collaborative editing.

This information was uploaded into the online system prior to the Canberra workshop. Edits and revisions were made back into the Google Docs during and immediately after the workshop to reflect the feedback and discussion at Canberra. The resulting information was then edited by CAD and ITS staff to produce an internal report with 13 recommendations arranged in four major themes:

Digital governance;
Digital teaching;
Digital learning; and
Digital understanding.

The draft report was shared with the complete team to ensure accuracy of the content and agreement that the recommendations reflected a consensus on the important elements needing improvement.


Each benchmark performance indicator result was presented in context for Victoria against the complete data set:

These were accompanied by a narrative outlining the rationale and evidence for the assessment, and a section labelled "possibilities for improvement" outlining what could be done to improve the individual indicator. These were used to identify the recommendations, which typically addressed a range of indicators across the benchmarks.
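The idea of presenting an indicator result "in context" against the complete data set can be sketched as below; the ratings are made up for illustration and the text histogram is an assumption, not the actual report format.

```python
# Sketch: show one institution's rating for a single performance indicator
# against the distribution of ratings across all participating institutions.
from collections import Counter

all_ratings = [2, 3, 3, 3, 4, 4, 5, 3, 2, 4]  # hypothetical ratings for one PI
own_rating = 4                                 # this institution's rating

dist = Counter(all_ratings)
for r in range(1, 6):
    marker = " <- this institution" if r == own_rating else ""
    print(f"rating {r}: {'#' * dist[r]}{marker}")
```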

Key to success

Early involvement with a diverse team of staff. Two workshop sessions, each running for three hours, were needed to introduce the tool and complete the initial evidence gathering. Engaged team leaders made a significant difference in the extent to which evidence was gathered and synthesised, but all teams provided a comprehensive body of evidence. Project leaders needed to spend a significant amount of time turning the information into a consistent report, but this was useful as a means of summarising progress to date against our Digital Strategy, which is due for renewal in 2017.

RMIT University

We evaluate each BM by:

Forming groups of 6-10 RMIT staff representing different areas to evaluate each BM
Each BM group participant completes and submits a self-assessment template (via Google form)
Group evaluation meetings are held for each BM, where participants discuss their performance indicator ratings and evidence. The group must then agree on an overall rating for each performance indicator
Group evaluations and recommendations for improvement are collated into a report for the VCE

Process for group members

Step 1. Complete the self-assessment form

The Google form link is in "BM Group Member Information" or "Benchmark Self Assessment Forms" on the site
Select a rating of 1-5 for each performance indicator (PI)
Note your rationale and evidence to support each rating
Note suggestions for improvement at the end of the form
If you cannot answer one of the sections on the form, leave it blank and move to the next PI (the reason we have groups)
You should not spend more than 2 hours completing the form, or 10 minutes per PI
The self-assessment must be submitted prior to group evaluation meetings


Please note your email, so we don't keep pestering you
Read the Scoping and Good Practice statements; these inform your consideration of how you will rate an indicator




Step 2. Group evaluation meeting

Each group member briefly describes their PI ratings and supporting evidence/rationale
The group will discuss and choose overall PI ratings for RMIT
Final ratings are based on group consensus, where possible
BM group leaders will facilitate discussions
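The consensus step in the meeting could be sketched roughly as follows; the one-point spread threshold and the rounding rule are illustrative assumptions for the sketch, not an RMIT rule, since the final rating is agreed by discussion.

```python
# Hypothetical sketch: where individual ratings already cluster closely,
# suggest a group rating; widely spread ratings are flagged for discussion.
def suggest_group_rating(ratings):
    """Return a suggested group rating, or None if the spread warrants discussion."""
    if max(ratings) - min(ratings) <= 1:      # ratings cluster within one point
        return round(sum(ratings) / len(ratings))
    return None                               # group must discuss and agree

print(suggest_group_rating([3, 4, 4]))  # clustered, prints 4
print(suggest_group_rating([2, 5, 3]))  # spread, prints None: discuss
```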

Group evaluations will be included in the final report
Self-assessment forms will not be reported


Then we have the conversation

In 2016 there were on average 15 participants per institution, with some 401 people participating overall.

Head of TEQSA

The beauty of the beast

The beauty of benchmarking is not about which tool or set of standards you are using; the dialogue that emerges and the sharing of practice are the real winners for all concerned.
It opens the door for further collaboration.
It serves as a mechanism to facilitate discussion at senior leadership level.

Conclusion

Many of the issues we face can be remediated by simply taking the time to self-assess against a set of quality performance indicators. We then extend this by sharing our current practice with those in similar circumstances. This builds stronger ties and provides our institutions with the wherewithal to meet the unique challenges of building a strong digital future. The ACODE Benchmarks provide a catalyst to help make this happen. We are not alone.

This June