Evaluating sport psychology teaching through action research

  • Published on
    28-Nov-2016


<ul><li><p>Contents lists available at SciVerse ScienceDirect. Journal homepage: www.elsevier.com/locate/jhlste</p><p>Journal of Hospitality, Leisure, Sport &amp; Tourism Education 11 (2012) 125–130</p><p>1473-8376/$ - see front matter © 2012 Elsevier Ltd. All rights reserved. doi:10.1016/j.jhlste.2012.02.014</p><p>* Corresponding author. Tel.: +44 151 2913715. E-mail addresses: Wakec@hope.ac.uk (C.J. Wakefield), AdieJ@hope.ac.uk (J.W. Adie). 1 Tel.: +44 151 2913442.</p><p>Practice Papers</p><p>Evaluating sport psychology teaching through action research</p><p>C.J. Wakefield *, J.W. Adie 1</p><p>Liverpool Hope University, Liverpool, L16 9JD, UK</p><p>Keywords: Sport; Action research; Students; Psychology</p><p>Abstract: In recent years, there has been a growing emphasis on action research (Norton, 2009), accompanied by an increasing focus on staff evaluation. This paper aimed to evaluate a single teaching session of a new member to the profession. Forty-three second year undergraduate students responded to a standard teaching evaluation form and the Stop, Start, Continue method (Angelo &amp; Cross, 1993) of evaluation. The results revealed that students were particularly concerned with issues surrounding interaction, relevance to assessment and practical work. The findings are discussed in terms of their implications for teaching practice. © 2012 Elsevier Ltd. All rights reserved.</p><p>1. Introduction</p><p>Over recent years, the concept of pedagogical action research has become prevalent, with many Higher Education Institutions attempting to gain feedback on the student experience. Action research is "a dual focus on practice and theory" (Norton, 2009, p. 44) with practitioners completing the research. In the case of Higher Education, this would mean the University teacher collating evidence in reference to their own teaching practice. By doing so, action research can directly benefit the teacher by serving as a powerful tool to encourage modification of one's own practice with the view to promoting optimal learning experiences for his or her students (Biggs &amp; Tang, 2007; Moreira, 2009). Drawing from an action research approach (Norton, 2009), the current study aimed to discern the quality of, and potential modifications to, the teaching practices used by a new teacher to the profession.</p><p>According to Biggs and Tang (2007), the promotion of a deeper level of learning among student populations (i.e., developing critical thinkers) is a function of teachers (and especially that of new staff) continually monitoring, evaluating and reflecting upon their own practice. To corroborate this assumption, D'Andrea and Gosling (2005) pointed out that a daily requirement for all educators is to critically reflect on their own teaching practice. One evaluative method to facilitate reflection and ensuing teaching quality is to obtain regular student feedback.</p><p>Student evaluations have emerged as an appropriate (and to some degree expected) strategy for gaining feedback to indicate teaching quality (Neumann, 2000). A review by Wachtel (1998) highlighted evidence both in support of and in opposition to the use of student evaluations. This review indicated that continual improvement in teaching can be supported by regular student evaluations. A number of variables have been used to indicate teaching quality, including student achievement, student satisfaction, student enjoyment, and promotion of shared attitudes between teacher and students (cf. Wachtel, 1998). For the purposes of this study, teaching quality refers to enhancement of the student experience and perceived satisfaction of the teaching quality amongst students.</p></li><li><p>
It is important to note here that teaching quality and satisfied students are not necessarily closely related, and instances may occur where large discrepancies are apparent between the two; for example, with tasks which are not enjoyable but are necessary to learning and development. However, offering students an opportunity to voice their opinions and addressing such concerns can facilitate subsequent teaching practices (Dunrong &amp; Fan, 2009) by allowing future teaching styles and content to be driven, in part, by feedback surrounding student satisfaction. Several methods of teaching evaluation exist in the feedback literature, with student feedback questionnaires the most widely used (Kember, Leung, &amp; Kwan, 2002). Standard student evaluations have seldom shown support for the expected relationship between perceptions of teaching quality and student achievement (Pounder, 2008). Another way of conducting student evaluation is by employing the Stop, Start, Continue method, based on the one minute paper (Angelo &amp; Cross, 1993; Wilson, 1986). Thus, the present study employed a combination of conventional and contemporary student evaluation methods (i.e., a standard University evaluation form and the Stop, Start, Continue approach) to promote learning in action.</p><p>According to Biggs and Tang (2007), learning in action (i.e., receiving feedback during actual teaching) refers not only to student learning, or even learning about teaching, but rather to learning about oneself as a teacher and learning how to use reflection to become a better teacher. With this in mind, the present study served two purposes. First, it applied two types of student evaluation method to a teaching session in order to ascertain which elements of teaching were effective and which were not. Second, the study aimed to provide a critical reflection on perceived teaching areas that warrant changes for future practice.</p><p>2.
Method</p><p>2.1. Participants</p><p>Forty-three male and female students volunteered for the study from a University in the Northwest of England, UK. Participants completed the evaluation measures following a Level 2 undergraduate Sport Psychology teaching session. All participants provided informed written consent and were assured that all data would remain anonymous and confidential.</p><p>2.2. Measures</p><p>Standard Evaluation Form. A standardised module evaluation form of the authors' institution was adapted to relate specifically to the teaching session in question. Participants rated the teaching session across eight dimensions (including session organisation, content of teaching, content of practical work, relevance of practical work to the topic, developed understanding of the topic, relevance to assessment, and overall satisfaction with the session) along a scale of "not satisfied", "satisfied" or "very satisfied".</p><p>Stop, Start, Continue Form (Angelo &amp; Cross, 1993). This qualitative method of evaluation asks participants to indicate which elements of teaching (including style, pace and delivery) they would benefit from the teacher stopping, starting or continuing. According to Norton (2009), this form of evaluation is a useful supplement in action research, providing expansion on conventional close-ended questionnaire evaluations (i.e., the standard evaluation method).</p><p>2.3. Procedure</p><p>The chosen teaching session focussed on the Sport Psychology topic of attribution and was delivered with promoting student engagement in mind. Following the session, interested participants were invited to complete both evaluation measures (i.e., the standard evaluation form and the Stop, Start, Continue form) and return them to a box at the front of the room. Aligned with the recommended time frame for completing (teacher) evaluations (Angelo &amp; Cross, 1993), participants completed both forms inside the allotted 20 min.</p><p>The procedure was repeated by an independent observer.
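</p><p>As an illustrative aside (not part of the original paper), the conversion of the three-point evaluation scale into the percentage satisfaction figures reported in Section 3 can be sketched in a few lines of Python. The response counts below are hypothetical, chosen only to reproduce the kind of split reported for one dimension; they are not the study's raw data.</p>

```python
from collections import Counter

def satisfaction_percentages(ratings):
    """Return the percentage of responses in each category, to 1 d.p."""
    counts = Counter(ratings)
    total = len(ratings)
    return {level: round(100 * n / total, 1) for level, n in counts.items()}

# Hypothetical counts for a single dimension: 23 "very satisfied" and
# 19 "satisfied" returns out of 42 yield a 54.8% / 45.2% split.
ratings = ["very satisfied"] * 23 + ["satisfied"] * 19
print(satisfaction_percentages(ratings))  # {'very satisfied': 54.8, 'satisfied': 45.2}
```

<p>The same helper applies unchanged to each of the eight dimensions, so per-dimension splits and overall means can be tabulated in one pass over the returned forms.</p><p>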
Consistent with the pedagogical literature (Backer, 2008), a colleague in the same subject area (i.e., Sport Psychology) was recruited to observe the session and provide feedback to enable and assist with reflection. After the analysis was completed, the interpretation of the findings was checked for consistency with both participant groups (i.e., the students and the observer).</p><p>3. Results</p><p>3.1. Standard evaluation</p><p>Students. Fig. 1 presents the percentage satisfaction scores for each teaching dimension from the standard evaluation form. Participants reported being satisfied (mean percentage = 28.57%) or very satisfied (mean percentage = 71.42%) across all eight dimensions. In other words, no students reported being "not satisfied" with the teaching session. The least positive response was for the evaluation dimension relevance to assessment, scoring 54.8% very satisfied and 45.2% satisfied.</p></li><li><p>Observer. The session was rated as very satisfactory on all of the elements except relevance to assessment, which was rated as satisfactory. This largely mirrored the views of the students, who also rated relevance to assessment lowest (54.8% very satisfied, 45.2% satisfied).</p><p>3.2. Stop, Start, Continue evaluation</p><p>Students. The qualitative Stop, Start, Continue results were collated and are summarised with a traffic-light coding system in Fig. 2. Recurring themes were extracted until the point of saturation.</p><p>Observer. The results revealed positive comments from the observer with respect to the quality of the teaching session. General positive comments included "slides were clear and the key points were expanded upon" and "good structure: moves from basics, through to the theory and measurement, and lastly, application to real world settings".
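</p><p>As a further illustrative aside, the collation of the qualitative Stop, Start, Continue comments into recurring themes (Section 3.2) can be sketched as a simple tally. The comments and keyword list below are hypothetical stand-ins rather than the study's coded data, and a keyword count is only a crude proxy for thematic analysis.</p>

```python
from collections import Counter

# Illustrative Stop, Start, Continue returns (hypothetical), one list of
# free-text comments per category of the evaluation form.
comments = {
    "stop": ["going through the slides so quickly"],
    "start": ["ask us more questions",
              "have breaks in the sessions",
              "interact more with the students"],
    "continue": ["deliver fun and informative lectures",
                 "use the same combination of theory and practical"],
}

def collate(comments, keywords=("slides", "practical", "questions", "interact")):
    """Count comments per category and tally recurring keywords,
    mimicking a first pass of the traffic-light coding."""
    per_category = {category: len(items) for category, items in comments.items()}
    keyword_hits = Counter()
    for items in comments.values():
        for comment in items:
            for keyword in keywords:
                if keyword in comment:
                    keyword_hits[keyword] += 1
    return per_category, keyword_hits

per_category, keyword_hits = collate(comments)
print(per_category)  # {'stop': 1, 'start': 3, 'continue': 2}
print(keyword_hits.most_common())
```

<p>In the study itself the coding was done by hand to saturation; a tally like this would serve only as a quick cross-check on which comments recur.</p><p>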
However, there were also points to improve upon, such as "some interaction with the class could be more effective around individual tasks given during lecture when working at their seats". During a subsequent discussion, the observer felt that a greater amount of interaction with the students could have been achieved during this time. In the continue section, the observer reported that the tutor should continue to develop a variety of tasks to maintain student interest and allow their involvement to contribute to their own understanding of the topic.</p><p>Fig. 1. Percentage satisfaction scores based on the standard evaluation.</p><p>3.3. Summary of findings</p><p>Overall, the results suggest that the session contained a considerable amount of information on the topic and that the practical element assisted in consolidating this information. The component of the standard evaluation that scored lowest was relevance to the assessment. Additionally, findings derived from the qualitative results indicate that the key elements were the inclusion of practical work, relevance to assessment and interaction with students.</p><p>4. Discussion</p><p>This study aimed to use an action research approach (Norton, 2009) by evaluating a teaching session of a new member of staff and allowing an opportunity for critical reflection on their current practice. The results suggest that the teaching session contained a considerable amount of information on the topic and that the practical element assisted in consolidating this information.</p><p>4.1. Practical work</p><p>One of the most prominent factors apparent from the results of the study is the use of practical work in teaching sessions. Bligh (1998) points out that the most common method when teaching adults is to lecture, due to constraints such as large class sizes and the physical structure of the teaching room.
However, Biggs (1999) points out that lectures can be delivered in such a way that the student is actively participating, and active methods of teaching have been shown to be more effective than passive methods (Bligh, 1998). The results of the present study indicate that students appear to enjoy practical tasks, allowing a shift from the teacher-centred approach (i.e., prescribed learning outcomes) to the student-centred approach (i.e., facilitating learning outcomes), as recommended by Prosser and Trigwell (1998). This is in line with previous research showing that the majority of students would like the inclusion of a greater proportion of practical work (Hay &amp; Van der Merwe, 2007).</p></li><li><p>(Fig. 2, a traffic-light schematic, grouped the students' comments under the headings "Stopped", "Started to" and "Continued to"; comments included "Going through the slides so quickly", "Ask us more questions", "Interact more with the students", "Have breaks in the sessions", "Include even more practical sessions", "Keep the slides on for longer for note taking", "Have short concise powerpoints that are easy to understand", "Deliver fun and informative lectures", "Do what she has been doing in previous lectures", "Explain the topic in depth" and "Use the same combination of theory and practical".)</p><p>In order to address this in future teaching, greater consideration should be given to what the student does, rather than what the teacher does (Biggs, 1999). We will attempt to consider the time students spend completing different tasks within the lecture (i.e., making notes, listening, engaging in activities) to ensure that we encourage an increase in enjoyment of, and curiosity about, the topic area. However, we believe that the practical elements of the session need to be carefully considered.
Whilst the students might enjoy practical activities as a respite from a more didactic approach, practical work should feature at the core of the learning outcomes rather than acting as an add-on to the session. Follow-up teaching evaluations are necessary to understand whether the integration of more relevant practical activities (e.g., problem-based learning tasks) is useful and should therefore be extended across other teaching sessions run in the department to promote engagement.</p><p>4.2. Relevance to assessment</p><p>Another key finding was the relevance of the session to the assessment. There was a subsequent assessment on the topic taught in this session. However, the students were not made aware of the assessment at that stage, as priming students towards a particular assessment will inevitably lead to a disjointed contextual picture (i.e., biased responses on this dimension), rather than the students learning about the entire topic area, which would allow them to apply this learning more effectively to future situations. This view is supported by Norton (2004, p. 687), who explains that making assessment criteria more explicit in higher education may have a deleterious effect on students' learning. As such, a meaningful learning experience may be sacrificed in order to focus on attempting to retain the information snippets that students believe will gain them a high mark.</p><p>Schelfhout, Dochy and Janssens (2004) pointed out that a balance is needed between allowing students to problem-solve and ensuring that every student has an optimal learning opportunity, which can be achieved by devising an assessment strategy. Additionally, issues regarding the availability and detail of the assessment criteria may also need to be addressed: an issue highlighted and discussed by Norton (2004).</p><p>Fig. 2.
Schematic representation of the Stop, Start, Continue evaluations.</p></li><li><p>References</p><p>Biggs, J., &amp; Tang, C. (2007). Teaching for quality learning at University (3rd ed.). Berkshire: McGraw-Hill.</p></li></ul>
