
Addressing the Proficiency Gap in Maritime Training

Kalyan Chatterjea
EMAS Training Academy & Simulation Centre

Abstract

To be proficient at sea, we need a combination of underpinning knowledge, relevant technical skills and the necessary soft skills that make us good shipboard team players, capable of managing tasks in a safe manner. During maritime training, it is important to assess these three areas to establish the proficiency gaps relating to the learning objectives/goals. These identified deficiencies can then guide and encourage us, in more effective ways, to tweak our learning artifacts to fill the gaps. This paper presents some of the tools that have been used successfully in classrooms and in simulator-based training, in both formative and summative situations, at the EMAS Academy.

Background

Traditional knowledge-based training (KBT) in maritime education was supplemented, with the advent of the STCW Convention, by skill-based training (SBT) in simulators (Bjorklund, Rodahl, and Robertson 1987; Cross 2014). With the appearance of simulators for training, there was a tendency among maritime instructors to downplay the knowledge-based components in classrooms; the emphasis was placed on simulator-based skill acquisition, which was perceived to improve performance. Yet in KBT, learners process the facts, figures and basic information that become important for informed decision-making later, during performance at the workplace. It is argued that the learner applies the knowledge and information acquired during the KBT phase in a real situation (or in a near-authentic simulation scenario) and gets an opportunity to reflect on this application of knowledge during the performance stage. Eventually, this reflection should help the learner assimilate the domain knowledge and associated skills. It can be argued further that the link between knowledge, activity and the learning objective grows stronger when adequate emphasis is placed on both theory and practice.

In aviation, it was established that mastering theory and practice was not enough: non-technical skills (NoTechs) also form an important component required to accomplish task goals (Airbus 2012; Helmreich and Merritt 2000). This is now being advocated in maritime practice (Grech, Horberry, and Smith 2002; Barnett 2005; Gregory and Shanahan 2010) and is mandated in the STCW 2010 revision.


Hence, it can perhaps be claimed that successful shipboard task performance requires learners to have, as prerequisites, a combination of underpinning knowledge, relevant technical skills and the necessary soft skills.

In adult learning, the concept of assessment may carry negative connotations, as it can be associated with anxiety and awkwardness in front of peers or instructors. Yet with the demand for accountability at every learning centre, assessment has become an integral part of each curriculum. Additionally, assessments are credited with improving both learning for learners and teaching for instructors.

Pupils need to know how their learning is progressing. Teachers also need to know how their pupils are progressing, to guide both their own teaching and the pupils’ further learning. (Assessment Reform Group 2002)

The concept of 'assessment for learning', emphasising the positive aspects of testing both during lessons and for final grading, has been proposed by a number of proponents (Black and Wiliam 1998; Bohemia and Harman 2009). Bohemia and Harman suggested the following six assessment scenarios, which can enhance learning environments and help measure the gaps in learning in the areas of KBT, SBT and NoTechs.

1. An emphasis on authenticity and complexity in the content and methods of assessment, rather than reproduction of knowledge and reductive measurement

2. Using high-stakes summative assessment rigorously but sparingly, rather than as the main driver for learning

3. Offering students extensive opportunities to engage in the kinds of tasks that develop and demonstrate their learning, thus building their confidence and capabilities before they are summatively assessed

4. Providing rich feedback through formal mechanisms, e.g. tutor comments on assignments and student self-review logs

5. Providing rich informal feedback, e.g. peer review of draft writing and collaborative project work, which gives students a continuous flow of feedback on 'how they are doing'

6. Developing students' abilities to direct their own learning, evaluate their own progress and attainments, and support the learning of others

In the next section, we describe the tools and methods used to conduct these assessments at the EMAS Academy.


Knowledge-based Assessments

These are traditional assessments, where we use objective testing with a Classroom Response System (CRS; sometimes called a personal response system, student response system, or audience response system). A CRS is a set of hardware and software that facilitates teaching activities such as the following.

➢ The instructor poses an objective question (multiple-choice, true-false, multiple-response, mapping, etc.) to the learners via a computer/projector.

➢ Each learner submits an answer to the question using a hand-held transmitter (a "clicker") that beams the student response to a receiver attached to the instructor's computer.

➢ Software on the instructor's computer collects the students' answers and produces a bar chart showing how many students chose each of the answer choices (a minimal code sketch of this step follows the list).

➢ The instructor makes "on the fly" instructional choices in response to the bar chart by, for example, leading students in a discussion of the merits of each answer choice or asking students to discuss the question in small groups.
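To make the collection step concrete, the following Python sketch is a vendor-neutral illustration: the Question class, its record and bar_chart methods, and the sample data are assumptions for illustration, not the API of any real CRS product. It tallies clicker responses and renders the per-choice counts as a text bar chart of the kind shown to the class.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Question:
    stem: str
    choices: list[str]
    responses: Counter = field(default_factory=Counter)

    def record(self, learner_id: str, choice: str) -> None:
        # One vote per submission; a real CRS would deduplicate
        # re-clicks per learner_id, which is omitted here for brevity.
        self.responses[choice] += 1

    def bar_chart(self) -> str:
        # Render per-choice counts as rows of '#' marks, the text
        # equivalent of the bar chart projected to the class.
        rows = []
        for choice in self.choices:
            n = self.responses[choice]
            rows.append(f"{choice} | {'#' * n} ({n})")
        return "\n".join(rows)

q = Question("Which items belong in a pre-departure brief?", ["A", "B", "C", "D"])
for learner, answer in [("s1", "B"), ("s2", "B"), ("s3", "D"), ("s4", "B")]:
    q.record(learner, answer)
print(q.bar_chart())
```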

Figure 1. Showing the CRS – a clicker and a multiple-response objective question

Figure 1 shows a multiple-response question, which is used in the Resource Management Course at the EMAS Academy.


Figure 2. Showing a clicker & a multiple-response objective question from BRM Course

Clicker questions can cover the following types:

➢ Recall questions – ask learners to recall facts, figures and information. They rarely generate discussion; no higher-order thinking is required.

➢ Conceptual questions – check students' understanding of concepts and help to identify misconceptions. These can generate discussion among learners and clarification from the instructor.

➢ Application questions – ask learners to make a decision or to choose an action in a given scenario. Real-world scenarios can be given, with the learners asked to choose the appropriate action. This can generate further interaction among students and with the instructor.

➢ Higher-order questions – ask learners to analyse relationships among multiple concepts. This can generate further interaction among students and with the instructor. (A minimal sketch of tagging a question bank by these types follows the list.)
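To make the taxonomy concrete, here is a small, purely illustrative Python sketch of a question bank tagged by the four types so a session can be balanced across them. The TaggedQuestion type, the by_type helper and the sample stems are all assumptions, not part of any EMAS tooling.

```python
from dataclasses import dataclass

QUESTION_TYPES = ("recall", "conceptual", "application", "higher-order")

@dataclass(frozen=True)
class TaggedQuestion:
    stem: str
    qtype: str  # one of QUESTION_TYPES

def by_type(bank: list[TaggedQuestion], qtype: str) -> list[TaggedQuestion]:
    # Filter the bank so a lesson can draw evenly from all four types.
    if qtype not in QUESTION_TYPES:
        raise ValueError(f"unknown question type: {qtype}")
    return [q for q in bank if q.qtype == qtype]

bank = [
    TaggedQuestion("State the SI unit of pressure.", "recall"),
    TaggedQuestion("Why does squat increase in shallow water?", "conceptual"),
    TaggedQuestion("Given this traffic picture, choose the next action.", "application"),
    TaggedQuestion("Relate fatigue, workload and error rate in this case.", "higher-order"),
]
print([q.stem for q in by_type(bank, "conceptual")])
```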


Figure 3. Showing the results to the participants immediately after conducting the test

It is claimed that, for learners to receive maximum benefit from feedback, it should be supplied as soon as possible after a test. Positive feedback is important, but negative feedback is equally significant, since a learner may otherwise go on applying a misconception over and over before discovering its nature. Immediate feedback is often the most important characteristic of a drill or tutorial.
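As a concrete illustration of this principle, here is a minimal Python drill loop, with invented questions and remediation strings, that surfaces feedback immediately after each response rather than only in a final score:

```python
DRILL = [
    # (question stem, correct choice, remediation shown on a wrong answer)
    ("SI unit of pressure?", "pascal", "Pressure is force per area: 1 Pa = 1 N/m^2."),
    ("Colour of the starboard sidelight?", "green", "Starboard is green; port is red."),
]

def run_drill(answers: list[str]) -> int:
    # Feedback is printed immediately after each response, so a
    # misconception is corrected before it can be rehearsed again.
    score = 0
    for (stem, correct, remediation), given in zip(DRILL, answers):
        if given == correct:
            print(f"{stem} -> correct")
            score += 1
        else:
            print(f"{stem} -> incorrect. {remediation}")
    return score

print("score:", run_drill(["pascal", "red"]))
```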

Skill-based Assessments on Simulators

In order to evaluate a trainee's performance a criterion or standard is required against which the achievements can be measured. Setting this criterion value is essential but at the same time difficult and complex. Many factors will influence the criterion value and they can possibly change in time as well. Furthermore the criterion for certain phenomena might be quite different for the various levels of training performed on the simulator system. (Cross 2011)

As related above, the authentic environment of a maritime simulator does not always lend itself to objective assessment. However, some simulator manufacturers do produce objective assessment tools, which can be programmed to automate the assessment and, additionally, to provide coaching messages during the simulation exercise; using branching techniques, these messages can be altered to give appropriate instructions as the exercise unfolds. Figures 4 to 6 show the sequence of an objective assessment on the Kongsberg Big View Simulator at the EMAS Academy.
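The mechanism behind Figures 4 to 9 can be pictured with a short, hypothetical Python sketch. This is not Kongsberg's actual assessment tooling; the AssessmentPoint class, the signal names and the thresholds are assumptions. An assessment point fires once when a logic gate over simulator signals becomes true, marks are awarded, and the exercise can branch to a coaching message when the gate never opens:

```python
from dataclasses import dataclass
from typing import Callable

Signals = dict[str, float]  # one time-step of simulator readings

@dataclass
class AssessmentPoint:
    name: str
    condition: Callable[[Signals], bool]  # the "logic gate"
    marks: int
    coach_on_fail: str = ""
    fired: bool = False

    def evaluate(self, signals: Signals) -> int:
        # Award the marks only the first time the gate opens.
        if not self.fired and self.condition(signals):
            self.fired = True
            return self.marks
        return 0

# AND-gate: drain valve open AND receiver pressure below 1.0 bar.
drain_check = AssessmentPoint(
    name="Drain the compressed air system",
    condition=lambda s: s["drain_valve_open"] > 0.5 and s["receiver_bar"] < 1.0,
    marks=10,
    coach_on_fail="Open the air receiver drain valve and blow down the system.",
)

score = 0
for snapshot in [  # successive time steps from the exercise
    {"drain_valve_open": 0.0, "receiver_bar": 7.0},
    {"drain_valve_open": 1.0, "receiver_bar": 3.5},
    {"drain_valve_open": 1.0, "receiver_bar": 0.6},
]:
    score += drain_check.evaluate(snapshot)

if not drain_check.fired:  # branch to a coaching message on the failing path
    print("COACH:", drain_check.coach_on_fail)
print(f"{drain_check.name}: {score} marks")
```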

Figure 4. Showing the assessment points in the exercise & the use of logic gates to trigger assessment.

Figure 5. Showing the draining of the compressed air system in the engine room on the Kongsberg Big View Simulator.


Figure 6. Showing the draining of the compressed air system in the engine control room and the associated logic circuit.

Figure 7. Showing the trigger activating assessment.


Figure 8. Showing allocation of marks.

Figure 9. Showing the final result sheet from the simulator.


Soft-skill Assessments on Simulators

Soft skills were first highlighted in the aviation industry when even experienced pilots were seen to commit errors. These skills (sometimes referred to as the human element) are also known as non-technical skills, which define behavioural competencies covering personal effectiveness, communication skills, creative problem-solving, strategic planning, leadership and team-building skills. Soft skills relate to a person's ability to interact effectively with team members. They are now finding acceptance in maritime practice (included in the STCW 2010 revision) and in other safety-critical industries, e.g. medical, nuclear power, process and even railways, where they are covered in Resource Management courses.

Assessing these skills is not easy; in aviation, behavioural markers are used for these assessments.

At the EMAS Academy, these non-technical skills are categorised into the groupings shown in the following diagram (Chatterjea, Labor, and Vidal 2013).

Figure 10. Showing the groupings and categories for behavioural markers (developed at the EMAS Academy).

Figure 11. Showing examples of behavioural markers for good practice and for poor practice.


Figure 12. Showing ratings of behavioural markers.

Figure 13. Showing a marking sheet for behavioural markers.


Figure 14. Showing actual recordings of behavioural markers for two runs.

Figure 15. Showing comparison of behavioural markers for two runs.
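The marker sheets and run comparisons in Figures 12 to 15 suggest a simple underlying structure. The following Python sketch is illustrative only: the marker names and the assumed 1-5 rating scale are stand-ins, not the EMAS rating scheme itself. Each behavioural marker receives a rating per simulator run, and two runs are compared to show where the team improved:

```python
MARKERS = ["communication", "leadership", "situation awareness", "decision making"]

def compare_runs(run1: dict[str, int], run2: dict[str, int]) -> None:
    # Print each marker with both ratings and the change between runs,
    # mirroring the side-by-side comparison in Figure 15.
    for m in MARKERS:
        delta = run2[m] - run1[m]
        sign = "+" if delta > 0 else ""
        print(f"{m:<20} run1={run1[m]} run2={run2[m]} change={sign}{delta}")

# Illustrative ratings on an assumed 1-5 scale (5 = consistently good practice).
run1 = {"communication": 2, "leadership": 3, "situation awareness": 2, "decision making": 3}
run2 = {"communication": 4, "leadership": 3, "situation awareness": 3, "decision making": 4}
compare_runs(run1, run2)
```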


Conclusions

Assessment in maritime training is a complex area, and assessment validity and reliability will remain subjects for continuous research and deliberation. This paper has shared some of the efforts being carried out at the EMAS Academy on various aspects of assessment. A publication sharing our experience in this area is now available from Amazon.com (Chatterjea, Labor, and Vidal 2013). [http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Dstripbooks&field-keywords=Kalyan%20Chatterjea.]

References

Airbus. 2012. “Human Performance: Error Management.” Flight Operations Briefing Notes. Accessed October 7, 2012. http://www.airbus.com/fileadmin/media_gallery/files/safety_library_items/AirbusSafetyLib_-FLT_OPS-HUM_PER-SEQ07.pdf

Assessment Reform Group. 2002. “Testing, Motivation and Learning”. University of Cambridge Faculty of Education. http://assessmentreformgroup.files.wordpress.com/2012/01/tml.pdf.

Barnett, Michael L. 2005. “Searching for the Root Causes of Maritime Casualties.” WMU Journal of Maritime Affairs 4 (2): 131–45. Accessed June 26, 2012. http://www.solent.ac.uk/research/mhfr/resources/humanerror.pdf

Bjorklund, R, K Rodahl, and B J Robertson. 1987. “Effects of Maritime Simulator Training on Performance.” In Trondheim, Norway: MARSIM 1987. Accessed April 1, 2014. http://trid.trb.org/view.aspx?id=396688

Black, Paul, and Dylan Wiliam. 1998. “Inside the Black Box: Raising Standards Through Classroom Assessment.” Phi Delta Kappan 80 (2): 139–44. http://faa-training.measuredprogress.org/documents/10157/15652/InsideBlackBox.pdf.

Chatterjea, Kalyan, Captain Alex G. Labor, and Captain Francisco J. Vidal. 2013. Bridge Resource Management: Teamwork and Leadership. Cengage Learning Asia Pte Ltd. http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Dstripbooks&field-keywords=Kalyan%20Chatterjea.

Cross, Stephen J. 2011. “Quality MET Through Quality Simulator Applications.” In Rijeka. http://www.pfri.uniri.hr/imla19/doc/015.pdf.

Cross, Stephen J. 2014. “STCW and Simulators.” Maritime Institute Willem Barentsz. Accessed April 1, 2014. http://www.nhl.nl/nhl/7448/miwb/mstc/stcw-and-simulators.html


Grech, Michelle R., Tim Horberry, and Andrew Smith. 2002. “Human Error in Maritime Operations: Analyses of Accident Reports Using the Leximancer Tool.” Proceedings of the Human Factors and Ergonomics Society Annual Meeting 46 (19): 1718–21. Accessed March 8, 2013. https://www.leximancer.com/wiki/images/4/44/HFES2002_MGRECH.pdf

Gregory, Dik, and Paul Shanahan. 2010. The Human Element: A Guide to Human Behaviour in the Shipping Industry. [London]: TSO for the Maritime and Coastguard Agency. Accessed June 26, 2013. http://www.dft.gov.uk/mca/the_human_element_a_guide_to_human_behaviour_in_the_shipping_industry.pdf

Helmreich, Robert L., and Ashleigh C. Merritt. 2000. “Safety and Error Management: The Role of Crew Resource Management.” Aviation Resource Management 1: 107–19. Accessed October 10, 2012. http://homepage.psy.utexas.edu/homepage/group/helmreichlab/publications/pubfiles/pub250.pdf.

Lincoln, Mary. 2009. “Aligning ICT in Assessment with Teaching and Learning: Enhancing Student Achievement in the Middle Years.” In Canberra, Australia. http://www.acsa.edu.au/pages/images/Mary%20Lincoln%20-%20Alignment.pdf.
