
Source: globalmet.org/services/file/gen memo/gen memo 42.13.pdf

Gen Memo 42/13: Philippines / AGM 11/13 Minutes / Alert! / MCQs / Flipped Classrooms

Dear Members,

PLEASE ENSURE THIS GEN MEMO IS WIDELY DISTRIBUTED WITHIN YOUR INSTITUTION

1 Expression of Deep Concern

The following was sent to GlobalMET members in the Philippines on 11 November:

Dear Members in the Philippines

Having just enjoyed the warmth and friendliness of a visit to Manila, it is even more appropriate to express on behalf of GlobalMET deep concern about the appalling destruction and loss of life caused by Typhoon Haiyan, and our hope that GlobalMET members in the Philippines are coping well with the aftermath. We think particularly of the National Maritime Polytechnic in Tacloban, where the full impact of the storm was so devastating.

With so much death and destruction, it appears appropriate to wish all involved 'mabuhay' in its 'to life' meaning.

Mabuhay ang Pilipinas

Rod Short

Given the resilience of the people of the Philippines and their ability to cope with natural disasters, the recruitment, training and certification of seafarers, and the provision of manning for some 25% of the global fleet, will continue despite this terrible disaster.

2 AGM 11/13

Please find attached the draft minutes of the meeting held in Manila on 30 October.

3 Alert!

The centrespread of Issue 33 of this very useful human element bulletin, headed 'An A to Z of maritime education and training', contains brief explanations of MET terms and links to further information. Alert! can be accessed through www.he-alert.org or through the link on the www.globalmet.org home page.


4 MCQs

Attached please find the paper MULTIPLE CHOICE MYSTERIES: The Effectiveness of Multiple Choice Question Assessment in Maritime Education and Examination, by Dennis Drown of St John's, Newfoundland, and colleagues, presented at IMLA 21. The study attempts to identify the degree of major influences on MCQ effectiveness. The paper's objective is to encourage discussion about the place of MCQ within maritime education and examination, and the ways MCQ assessment reliability may be improved.

5 Flipped Classroom

Maritime Training Issues has a new article, The 'Flipped Classroom' and its Implications for Maritime Training. The full blog post can be viewed at www.marinels.com/about/blog.html. The concept is summarised as: 'Every once in a while a new idea on how to improve learning makes the rounds in education circles. This article discusses a significant new training trend called a "Flipped Classroom" and its implications for the maritime industry.'

Kind regards

Rod Short

Executive Secretary

GlobalMET Limited


AGM 11-13 (2)

GlobalMET Limited Australian Company Number 103 233 754

DRAFT UNCONFIRMED MINUTES OF 11th ANNUAL GENERAL MEETING 30 OCTOBER 2013, MANILA YACHT CLUB, MANILA, PHILIPPINES

1 Adoption of Agenda, Attendance and Apologies

The meeting commenced at 1915 hours with 22 present, including delegates from 10 financial Members. 21 proxies with apologies were received from financial Members. It was confirmed that a quorum existed. Details of attendance were duly recorded. The following Directors and the Executive Secretary were present:

Capt Tim Wilson – New Zealand Maritime School – Chairman
VAdm Eduardo Santos – Maritime Academy of Asia and the Pacific – Vice Chairman
Prof Takahiro Takimoto – Tokyo University of Marine Science & Technology
Mr David Fredrick – Malaysian Maritime Academy

Apologies were received from 11 Members (including 5 Directors), 3 Associate Members and 7 Individual Members. The Agenda as proposed was accepted. It was agreed that, because of time constraints, papers would be taken as read and that items not discussed would be agreed.

2 Confirmation of Minutes of AGM 10-12

The Minutes of Annual General Meeting 10-12 were confirmed and adopted as a true and correct record of the meeting.

Moved: VAdm Eduardo Santos – Maritime Academy of Asia and the Pacific
Seconded: Mr George Hoyt – Seagull AS

3 Matters Arising

It was agreed that any matters arising would be discussed under the relevant agenda item.


4 Chairman's Report

Board of Directors

The need to prepare for the election of the five elected directors, whose three-year terms will expire in 12 months' time at AGM 12-14, was stressed. As the four co-opted directors:

Capt Richard Teo – Seafood and Maritime Industries Training
Prof Takahiro Takimoto – Tokyo University of Marine Science & Technology
Capt Mhd Salleh – Singapore Maritime Academy (replaced Mr Roland Tan 01/01/13)
Mr Swapan Das Sarma – American Digital University

had expressed willingness to serve until AGM 12-14, they were duly appointed.

Moved: VAdm E Santos – Maritime Academy of Asia and the Pacific
Seconded: Capt Tim Wilson – New Zealand Maritime School

With 31 of the current 43 Financial Members present or represented by proxy (about 72%), short of the constitutional requirement that 75% of Financial Members support an amendment to the Constitution, it was agreed that the proposed amendment to enable the number of co-opted directors to be increased could not be passed.

Proposed Asian Development Bank Project

Following expressions of satisfaction with the session Human Resource Development in Asia Pacific: Outlining a proposed new industry initiative at the 14th Asia Pacific Manning & Training Conference earlier in the day, which included presentations by GlobalMET Chairman Tim Wilson, ADB Senior Education Adviser for S E Asia Norman La Rocque and Assistant Vice President, AJ Centre for Excellence, Terence Uytingban, there was extensive discussion about development of the project. With respect to the Outputs and Activities recommended in the well received June 2013 consultancy report by Fisher Associates, the result of their strategic review of maritime education and training in Asia commissioned by the ADB, it was agreed that GlobalMET should continue to develop a project proposal, with plans for specific activities formulated and submitted to the ADB in the near future. The Chairman's Report was received, on the basis that further consideration be given to the furtherance of The TK Foundation Proposal by the Board of Governors.


5 Financial Report

The audited Financial Statements for the financial year to 30 June 2013 were received, but referred to the Board of Directors for further consideration and decision following review of the accounts and reconciliation of income and expenditure with the bank statements.

Moved: VAdm E Santos – Maritime Academy of Asia and the Pacific
Seconded: David Fredrick – Malaysian Maritime Academy

6 Appointment of Auditor

In view of the changes in the auditing company, it was agreed to delegate to the Board of Directors the appointment of the auditor for the audit of the accounts for the current financial year.

7 Other Matters

Because of the lack of time for further consideration, it was agreed to delegate to the Board of Directors the other matters in the tabled papers for consideration and decision.

8 Closure

The Chairman closed the meeting at 2000 hours with an expression of thanks to VAdm Eduardo Santos for the excellent arrangements for the transport to the venue and for the meeting, and also to the Manila Yacht Club for providing the venue and services.

Draft Resolution 1

That these minutes be signed as a true and correct record of Annual General Meeting 11-13 held on 30 October 2013 at the Manila Yacht Club, Manila, Philippines.

Chairman's Signature __________________________ Date ____________________


30 – IMLA 21 October 2013: Drown et al. Page 1 of 13

In Proceedings of the 21st International Maritime Lecturers Association Conference (IMLA 21), Fisheries and Marine Institute of Memorial University of Newfoundland, October 9-12, 2013

MULTIPLE CHOICE MYSTERIES: The Effectiveness of Multiple Choice Question Assessment in Maritime Education and Examination

by Denis Drown Ex.C., F.N.I.
Robert Mercer M.M., M.Ed.
Gary Jeffery Ph.D.
Stephen Cross M.M., M.Sc., Ph.D., F.N.I.

3 Bideford Pl., St. John's, NL, Canada, A1B 2W5; Tel: 709-753-9173
Email: [email protected] (the authors are presenting privately, without affiliation)

Keywords: multiple choice question assessment; competence; STCW examination.

Abstract

Competency certificates are issued to ships' officers and ratings following an education and examination process that includes multiple choice question (MCQ) assessment. Over the last seven years the authors have conducted original research into the use and effectiveness of MCQ, presenting the ongoing survey and studies at IMLA and IMEC conferences. MCQ are used to assess both knowledge and competence to complete onboard tasks safely. The authors' studies show that evaluating knowing/not knowing is only an element of MCQ, since there are other factors influencing assessment. There are concerns about MCQ effectiveness and reliability, especially where significant scores can be obtained without subject knowledge or understanding.

The paper overviews the authors' previous survey and studies involving 1,480 participants from 55 countries. The paper follows on from work presented at IMLA-20. New literature available since 2012, relevant to maritime education, is reviewed, continuing a debate about effectiveness ongoing since MCQ assessment became popular in the 1970s. Media reports are presented, indicating shifts in attitudes towards and away from MCQ assessment.

The paper demonstrates the authors' study methods, and presents a preliminary report on the latest study with participants from the IMEC community, attempting to identify the degree of major influences on MCQ effectiveness. The paper's objective is to encourage discussion about the place of MCQ within maritime education and examination, and the ways MCQ assessment reliability may be improved.

1. Introduction and Research Objectives

The authors' surveys and studies started in 2006 with the objectives of presenting information and encouraging dialogue between maritime educators on the way in which MCQ are developed and utilised, and offering insights into factors influencing MCQ effectiveness so that such influences may be minimised. In total there are 1,480 survey and study participants from 55 countries. The authors' publications are listed in Figure 1 and in this paper are referred to as IMLA-14, IMEC-19, etc.

The paper follows on from work presented at IMLA-20. New material available from academia since IMLA-20 is reviewed, continuing the debate about MCQ effectiveness that has been ongoing since the 1970s. Current reports from media and other sources are presented, indicating shifts in attitudes towards and away from MCQ assessment. The paper reviews the situation and presents a further report on the authors' latest study attempting to identify the degree of major influences on MCQ effectiveness.


IMLA-14, Marseille, 2006: "One, All or None of the Above – Multiple-choice question techniques and usage in education and examination for marine certification: A Survey"

IMEC-19, Rotterdam, 2007: "The Influence of Language and Question Structure on Multiple-Choice Test Scores: An Exploratory Study"

IMEC-21, Szczecin, 2009: "Language and Effectiveness of Multiple-Choice Questions in Maritime Education and Certification: A Study"

MHRS-4, St. John's, 2010: "Disparate Measures in Examinations for STCW Certificates of Competency: The Use and Effectiveness of Multiple Choice Questions"

IMLA-20, Terschelling, 2012: "Multiple Choice Question Assessment: A Question of Confidence"

Seaways, January 2007: "STCW Competence: Disparate exam methods, does it matter?"

Seaways, July 2011: "Matters of Difference (on Multi-National Ships)"

Seaways, October 2012: "Multiple Choice Question Assessment: A Question of Confidence" (edited version of the IMLA-20 paper)

Figure 1: Authors’ Surveys and Studies

2. Situation, Concerns and MCQ Influences: A Review

Situation: Maritime administrations and colleges have varying multi-dimensional examination methods for STCW certificates, including MCQ. Although MCQ are harder to construct than open (essay) questions, they are popular with administrators and instructors, as they are quick to set and mark. MCQ can provide uniformity and fairness, and cover considerable factual matter, but have limited value in assessing knowledge or competence, being inherently unreliable.

Concerns: MCQ restrict independent thought, measuring recognition rather than understanding. MCQ are unreliable because of random elements and lack of learning depth, and are not appropriate for formal examination attesting to competence. The differences in MCQ use reflect disparity in international examination methods, lowering confidence in assessment dependability.

MCQ Influences: Assessment methods, including MCQ, will reflect factors other than subject knowledge. As described in IMLA-20 and earlier papers, for MCQ these include instructor training; question structure; testwiseness; culture and pedagogic regimes; language and language comprehension; chance, guessing and intuition; gender; age; previous MCQ experience; examination medium (computer/pen-paper); wrong learning, and other influences. For any one standard MCQ or test it cannot be readily determined whether the response or responses result from the influencing factors or from subject knowledge or lack thereof.

3. Maritime Education and the Literature: 2011 to 2013

3.1 Relevance

From the 1930s the volume of literature has increased, reflecting Berk's observation that MCQ are "most popular, most unpopular, most used, most misused, most loved, and most hated." An attempt at reviewing it has been made in the authors' previous IMLA and IMEC papers. This section reviews literature available since IMLA-20, primarily from 2011 to 2013. The pro and con debate continues.

There is little in maritime literature about MCQ except general advice in an IMO Model Course and in a Nautical Institute publication. There are no quantitative studies regarding MCQ other than the authors' efforts starting in 2006. Now, however, there is interest in a 2011 survey by Sampson [1], and Goldberg's 2013 treatise on MCQ [2]. For maritime education the benefit of MCQ assessment experience can be found in the literature of other disciplines mentioned in this review, including:

Accountancy, Agriculture, Asset Management, Biology, Business, Chemistry, Computers, Economics, Education, Engineering, History, Humanities, Language, Law, Mathematics, Medicine, Nursing, Pediatrics, Pharmacy, Psychology, and the Social Sciences.


Much is relevant to maritime students. For example, the same MCQ construction advice familiar to maritime instructors is found in such diverse documents as guides for Primary Education Teachers [3] and Staff Instructions for Civil Aviation Safety Inspectors [4]. The common aim is that MCQ assessment be readily understandable, consistent and effective. These commonalities are reflected in the following statement: "As assessment drives learning, making accurate pass/fail decisions largely affects the effectiveness of medical education programmes. Failing competent students or passing incompetent ones is an error which could have serious implications to the community, student, and institution." [5] This statement is as relevant to mariners as it is to physicians.

3.2 The Continuing Debate

A Seaways Editorial [6] comments that the proliferation of MCQ in STCW courses is particularly worrying and may partly explain the reduction in competency levels that are consistently reported to the Nautical Institute. The Editorial references the authors' "interesting research" and their conclusion that MCQ have limited value in assessing either knowledge or competence [7]. Also in Seaways is criticism of using MCQ in certain examinations such as for COLREGS, where it may be possible to learn the answers by rote; where the pass mark is only 90%, and where there is no oral follow-up [8]. MCQ dominate assessment in all subjects, being quick and inexpensive, with a number-right score providing a seductive (if false) sense of precision. However, expecting MCQ to measure cognition is like using a pocket-knife for surgery, because they are ineffective for gauging higher-order thinking [9]. Tests of knowledge are relatively easy with good MCQ, but to ensure higher levels of competence it is necessary to set project work which needs careful and wider-ranging research, analysis and presentation: costly to administer, but a better test of competence than box ticking [10].

MCQ examinations may save time (after the initial question creation) but are an inappropriate substitute for a free-response format, which requires comprehension and application instead of only recognition. Free-response and forced-choice questions (i.e. MCQ) assess different types of knowledge and/or learning, although use of both may help to create a more balanced assessment [11].

Much is written about MCQ assessment, with less research on the consequences for long-term knowledge. Studies suggest that MCQ tests can change as well as assess knowledge, for example where initial tests increase performance on later tests. MCQ testing can also have negative effects because it exposes students to incorrect answers [12]. MCQ are commonly used in maritime education and examination as they are perceived as objective, being easy to grade with minimal error. However, MCQ typically do not give as much insight into students' fundamental understanding as do hand-graded, worked-out questions [13].

Publisher test bank questions need careful selection to ensure they are field tested, analyzed, discriminated and refined for reliability [14]. Dickinson [15] analysed MCQ banks on the basis of question difficulty and three taxonomy level comparisons, with less than 50% of instructors' expectations being met. In one country MCQ are used to evaluate about 1,800 medical students every year, requiring 10,000 MCQ items; there are no secure web-based item banks in any of the medical institutions, and access to international banks is limited by language and costs [16].

The capacity for MCQ to effectively evaluate practical competencies and higher level skills is questionable. One maritime administration maintains a central question bank created by officials from government and training colleges, from which instructors generate random tests. There are concerns that the bank contains inappropriate and poorly constructed questions designed by unqualified assessors, with MCQ Certificate of Competency examinations being passed by guessing, without evidence of safety-critical competence. MCQ test only basic knowledge recall, and not higher levels of cognitive skills including comprehension, evaluation and analysis, and application [17]. While MCQ are derided as incapable of truly measuring understanding, with exam performance improved by learning a few tricks, MCQ have a larger, less obvious flaw, creating an illusion of right and wrong, a binary condition that ignores the present fluid nature of information [18].


MCQ tests are one of the oldest and most highly maligned assessment techniques, yet they continue to be used pervasively in maritime training. MCQ may be used despite their obvious shortcomings, but should not be the sole assessment method. Combining assessment techniques in the overall assessment strategy can take advantage of the benefits of each [19]. As Van Der Vleuten states: "… the assessment of competence in any assessment situation is inevitably a compromise between what is desirable and what is achievable. There are no fixed and firm strategies that guarantee the perfect compromise. Things strongly depend on specific assessment contexts and local conditions." [20]

3.3 Survey by the Seafarers International Research Centre, Cardiff

The 2011 survey "Watertight or Sinking?" by Sampson (recommended reading) looks at the assessment practices in six seafarer labour supply countries [21]. The survey shows that maritime administrations have very different assessment practices, using combinations of essay questions, oral examination and MCQ/simulator testing. Devolved colleges use their own assessment procedures, with consequent possible variations in standards across institutions. A seafarer in one college may be tested more rigorously than a seafarer in another college, allowing for an exam pass in one institution but an exam fail in another, leading to an overall lack of trust in graduate quality. Significant variations will persist so long as assessment standards remain internationally unregulated.

Front-line ship owner and manager employers have different opinions about MCQ effectiveness. Employers make use of commercial MCQ to assess their own potential employees and applicants, with perceived objectivity balancing corruption and fraud. Some employers are vehemently opposed to MCQ in licence examinations, on the grounds of rote learning and the use of cramming centres. Employers are concerned about poor MCQ design, where it is very easy to pick the wrong answer or guess correctly. The limitations of MCQ are reiterated by a number of employers, with concerns about guesswork, luck, security, evaluation, and being unable to test anything other than basic knowledge. Employers feel that MCQ should be only part of the licence examination, and suggest that some knowledge-based MCQ tests, such as those relating to collision regulations, should have a pass mark of 100% rather than the more usual 60% or 70%, and/or that there should be inclusion of some mandatory questions where incorrect answers automatically lead to a failing mark.
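The employer suggestions above (a 100% pass mark for safety-critical tests, or mandatory questions whose incorrect answer fails the candidate outright) can be sketched as a grading rule. This is an illustrative sketch only: the function names, option letters and thresholds are assumptions for demonstration, not any administration's actual scheme.

```python
# Illustrative grading rule combining an overall pass mark with mandatory
# (safety-critical) questions; names and thresholds are assumptions.

def grade(answers, key, mandatory=(), pass_mark=0.6):
    """Return True (pass) or False (fail).

    answers   -- candidate's chosen options, indexed by question
    key       -- correct options
    mandatory -- question indices that must be correct regardless of score
    pass_mark -- fraction of questions that must be correct overall
    """
    # An incorrect answer to any mandatory question fails the candidate outright.
    if any(answers[q] != key[q] for q in mandatory):
        return False
    score = sum(a == k for a, k in zip(answers, key)) / len(key)
    return score >= pass_mark

key = ["B", "A", "D", "C", "A"]
# Candidate scores 4/5 (80%) but misses mandatory question 2: fails.
print(grade(["B", "A", "A", "C", "A"], key, mandatory=(2,)))   # False
# Candidate also scores 4/5, with the mandatory question correct: passes.
print(grade(["B", "A", "D", "C", "B"], key, mandatory=(2,)))   # True
```

Setting `pass_mark=1.0` would reproduce the suggested 100% requirement for collision-regulation tests.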

3.4 Gender, Language and Culture

Any testing program must be sensitive to gender, language and culture differences, since they influence MCQ assessment [22]. Gender difference studies attribute male advantage to physiological and social factors. Studies suggest the influence of pedagogic circumstances; for example, Arthur [23] found female students out-performed males in both MCQ and constructed response (CR) question formats, with female superiority in MCQ diminished in comparison with CR, suggesting that MCQ favour males more than females.

Evereart [24] finds genders responding to MCQ differently. Females change their answers about twice as much as males, and tend to run out of time on a MCQ exam. Males more often go from incorrect to correct answers relative to females. The most likely source of the gender effect is in males' superior guessing ability or females' tendency to change their answers. Hudson [25] studied the effectiveness of MCQ and short-answer questions, finding males achieved higher scores than females in both formats. However, when student abilities were considered, males and females of equal abilities performed equally in each test comparison. This is supported by Hoang [26], finding that on the basis of grade point average males perform better than females, but this relationship disappears when Item Response Theory (IRT) or Rasch Theory are used to estimate ability and question difficulty levels. Yu [27] offers a conceptual and practical guide to IRT and Rasch, which though similar have different philosophical foundations. There are also pedagogic differences in the multi-cultural classroom between those coming from a highly structured, passive (e.g. uniformed) system that favours recall over application, and those from a more laissez-faire (e.g. casual dress) learning environment.


Language is important in the construction of MCQ and in the ability to comprehend, especially for students with English as a Second Language (ESL) [28]. Studies find ESL (international) students have problematic English-language proficiency, needing more time to complete a MCQ test, with lower scores unreflective of demonstrated classroom ability. In one country's institution, classes sometimes have more than 50% ESL students, who take three times as long to answer a MCQ test because of comprehension difficulties [29]. The authors' studies indicate that persons more proficient at language and deductive reasoning, with a little subject knowledge, are as able to answer MCQ correctly as a person having less language skill but extensive subject knowledge. There are methods to assist ESL students, for example the Maritime English Test of Language (MarTEL) system, using MCQ as assessment [30], noting that a Chinese study found MCQ assessment may not be appropriate for reading comprehension [31]. Langah [32] describes a multi-lingual online MCQ examination system in three languages suitable for institutions that use English as the primary medium of instruction.

3.5 MCQ and the Instructor

The literature emphasises the instructor's importance. Shuhidan [33] finds that instructors believe good quality MCQ are valid, provided the essay format is included in overall assessment. MCQ are regarded as easy summative assessments to keep weak students on track, to test basic knowledge, and to motivate, even though their use is sometimes equated to distributing free marks. Some instructors believe MCQ are seen as easy questions, although students may not share this view. Many teachers believe that MCQ are limited to summative assessment of the ability to recall factual information. With the increased emphasis on preparing for state (USA) testing and the availability of clicker technology, many middle grade educators use MCQ items for more than a summative check, with 'clickers' a useful tool to enhance student engagement and performance [34]. MCQ items can be used to guide instruction, communicate expectations, and develop student dispositions. Carefully crafted MCQ can push students beyond rote memorization and promote a challenging classroom environment [35].

Tarrant [36] finds that item-writing violations are common in teacher-developed examinations in many disciplines, with consequences for both borderline and high-achieving students. Teachers must be provided with training in writing high-quality test items, with examinations subjected to review both prior to and after administration. Students sit standardised questions that are set centrally, with more MCQ than written essays [37]. Institutional reforms are needed in teacher recruitment and in teacher evaluation and assessment systems. Goldberg [38], commenting on Cox's IMLA-20 paper [39], notes that the teachers of children are required to undergo anywhere from two to four years of specialized formal education, yet the training of teachers of adults is often completely unregulated. Most post-secondary instructors are not formally trained in MCQ testing, and only about one-third understand terms such as item discrimination and reliability, not knowing how to rewrite and improve their own MCQ items. It is the responsibility of post-secondary institutions to provide the training and support needed to construct high-quality MCQ items and to make effective use of item analysis [40].

There are descriptions of many educational assessment objectives; suggestions on constructing good MCQ; opinions about efficiency at some cognitive levels, and different subject-matter areas for MCQ assessment. In mathematics it is possible, although more difficult, to construct MCQ for higher-level skills, where it is important that more than one teacher takes part in item and test construction [41].

Constructing well-written MCQ is fraught with difficulty, and item writing flaws (IWF) are common in both test banks and instructor-developed tests, most at the knowledge/comprehension level. In one test bank Nedeau-Cayo [42] found 76.7% of 2,913 questions had IWF, with 47.3% written at Bloom's lowest cognitive level. Another bank of 2,770 instructor-developed test questions had 46.2% with at least one IWF, with 91% written at a knowledge level. The Nedeau-Cayo study includes a questionnaire to test an instructor's knowledge of assessing IWF [43].

DiBattista's44 study of undergraduates' responses to 1,198 MCQ found that more than 30% of items had unsatisfactory discrimination coefficients. Of the 3,819 distractors, 45% were flawed either because fewer than 5% of examinees selected them or because their selection was positively rather than negatively correlated with test scores. The quality of MCQ tests can be improved by item analysis and by modifying distractors that impair discriminatory power, which reflects the extent to which the more knowledgeable will select the correct option. An MCQ's discriminatory power can be measured by its discrimination coefficient, the correlation between examinees' overall scores and the scores that they have obtained on the item under consideration. Abdalla45 describes the relation between reliability and other components of MCQ item analysis, used to judge the difficulty, discrimination and reliability of test items. Tasdemir46 compares the difficulty levels and discrimination powers of MCQ testing.

30 – IMLA 21 October 2013: Drown et al. Page 6 of 13
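The item analysis described above can be sketched in a few lines. This is an illustrative reading only (the function names, data layout and code are mine, not the paper's): the discrimination coefficient is the correlation between examinees' 0/1 scores on an item and their overall test scores, and a distractor is flagged when fewer than 5% of examinees choose it, or when choosing it correlates positively with the total score.

```python
# Illustrative item analysis: discrimination coefficients and the two
# distractor flaws described by DiBattista. Names are hypothetical.
from statistics import mean

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

def discrimination(item_scores, total_scores):
    """Correlation between 0/1 item scores and overall test scores."""
    return pearson(item_scores, total_scores)

def flawed_distractors(responses, key, total_scores, floor=0.05):
    """Flag distractors chosen by fewer than `floor` of examinees, or whose
    selection is positively rather than negatively correlated with totals."""
    flagged = []
    for option in sorted(set(responses) - {key}):
        chose = [1 if r == option else 0 for r in responses]
        rate = sum(chose) / len(responses)
        if rate < floor or pearson(chose, total_scores) > 0:
            flagged.append(option)
    return flagged
```

A distractor that passes both checks draws low scorers and repels high scorers, which is exactly what gives an item its discriminatory power.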

3.6 MCQ and the Student

MCQ testing is based on a psychometric-structuralist approach (breaking down subject complexities into isolated segments). Thus testing can be objective, precise, reliable and scientific, provided that MCQ are constructed in such a way that students obtain the correct option by direct selection rather than by the elimination of incorrect options.47

Students 'game' the MCQ assessment system through websites such as 'Bored of Studies', which give surprisingly accurate testwise tips on how to answer MCQ by eliminating poorly written distractors and identifying easily guessed questions.48 Some students tend to use a surface approach to study, reproducing what was taught to meet minimum requirements, even though surface learning negatively impacts MCQ performance, whereas a deep learning approach benefits it.49 In several disciplines students are prepared for MCQ assessment; for example, the University of the District of Columbia Law School hires specialists for two-day workshops on how to answer MCQ and essay questions, with more time spent on the former than the latter.50

Tweed51 studied MCQ confidence-rating scoring systems, concerned that high-stakes, number-correct scoring MCQ examinations imply that guessing is acceptable, leaving it uncertain whether an incorrect response reflecting an unsafe decision is a true belief or a random guess. Students gave fewer unsafe responses when scored using the confidence system. In another study, students marked using the logical choice weight method did better than those marked using the number-right scoring method.52

Students often have negative attitudes toward testing, perceiving instructor-written exams as irrelevant and autocratic, leading to lower trust in evaluation and decreased motivation. Corrigan53 describes a new approach in which students write their own exam questions, making the assessment more challenging, improving relevance and increasing student involvement, an approach supported by Denny54 and Luxton-Reilly.55

3.7 Different Approaches to MCQ

There are ways of varying the standard MCQ format to increase reliability. Reliability depends on many factors: internal, such as item construction and the scoring system, and external, related to cultural, personal or administrative issues. Reliability also depends on discrimination, reflecting the extent to which the more knowledgeable are more likely than the less knowledgeable to select the correct option. Campbell56 describes forms of closed questions, including extended matching questions (EMQ), which are relatively easy to write and superior to the standard single best answer in the assessment of reasoning. EMQ are theme-structured, with a list of possible options, a lead-in question or statement, and at least two to three short scenarios which are 'matched' to the correct option. The examinations for the Institute of Asset Management use both MCQ and scenario-based questions.57 The MCQ 'cloze' questions described by Iwata58 may be appropriate in maritime education.
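The EMQ structure described above (a theme, a shared option list, a lead-in, and short scenarios each matched to one option) can be rendered as a small data model; the class and field names here are my own illustration, not Campbell's.

```python
# Illustrative data model for an extended matching question (EMQ).
from dataclasses import dataclass

@dataclass
class Scenario:
    text: str
    match: str          # the option this scenario is 'matched' to

@dataclass
class EMQ:
    theme: str
    options: list       # shared option list for all scenarios
    lead_in: str
    scenarios: list     # at least two to three Scenario items

    def score(self, answers):
        """Count answers (one chosen option per scenario) that match."""
        return sum(a == s.match for a, s in zip(answers, self.scenarios))
```

Because every scenario draws on the same option list, each scenario effectively carries many plausible distractors, which is part of why the format is easy to write.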

The debate continues about the relative effectiveness of MCQ and short essay questions (SEQ). Mujeeb59 finds a strong correlation between MCQ and SEQ, with the former assessing the ability to recall isolated pieces of quantitative information, and the latter allowing flexibility and individuality of approach in which interpretative skills are developed. Kettler60 indicates that shortening the question stem may be an effective modification, and that adding graphics may be a poor one.

Kim61 describes how Bloom's Taxonomy can incorporate critical-thinking skills in MCQ. The difficulty level of MCQ is higher when they are associated with multiple factors, suggesting that well-planned MCQ using Bloom may be an effective alternative to essays. Tiemeier62 studied MCQ at Bloom's levels to find ways to promote recall knowledge for more advanced topics while developing the ability to apply and analyze information.

3.8 New Ideas on MCQ Assessment

New advice follows old advice in MCQ construction. Salend63 reminds us that MCQ stems should contain only one major point and provide the context for the answer. The item's alternatives should all be viable choices that are shorter than the stem and that share common elements, such as the same grammatical structure, and should not include key words from the stem or categorical words like always. Responses should be presented vertically in a logical sequence, with the number of choices limited to four. Choices such as all of the above or none of the above should not be used. Yaman64 compared the psychometric properties of MCQ tests with different numbers of choices, finding the reliability of 3 or 5 choices higher than 4, with no significant differences among other properties. Three choices proved the most effective, better than 4 or 5, being easier to prepare and analyze.
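Some of Salend's guidance lends itself to mechanical checking. The sketch below renders a few of the rules as lint checks; the function and rule wording are my own, not a tool from the paper, and rules such as "logical sequence" or "shared grammatical structure" are left to human review.

```python
# Illustrative lint rules for MCQ items, loosely following Salend's guidance.
def lint_item(stem, choices, max_choices=4):
    """Return the guideline violations found in one MCQ item."""
    problems = []
    if len(choices) > max_choices:
        problems.append("too many choices")           # limit choices to four
    for c in choices:
        low = c.lower()
        if low in ("all of the above", "none of the above"):
            problems.append(f"avoid: {c!r}")          # catch-all options are not used
        if any(w in low.split() for w in ("always", "never")):
            problems.append(f"categorical word in: {c!r}")
        if len(c) >= len(stem):
            problems.append(f"not shorter than stem: {c!r}")
    return problems
```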

Sinha65 compared the effects of immediate versus delayed feedback on learning, finding that taking a test generally improves retention of the material tested, called the testing effect. Providing the correct answer as feedback soon after the student's response further improves performance. Park66 describes a computerized constructive MCQ testing system combining short answer (SA) and MCQ formats that asks examinees to respond to the same question twice, first in the SA format and then in the MCQ format. The unique finding was that some students chose a wrong MCQ option even though they had given the correct answer in the SA part.

MCQ are often regarded as testing knowledge recall only, with Modified Essay Questions (MEQ) used to test higher order cognitive skills. However, Palmer's67 research found that MEQ developed for a higher education assessment primarily tested lower order cognitive skills, while MCQ in the same assessment tested higher order cognitive skills. Many items in the MEQ and MCQ assessments suffered from item writing flaws (IWF), more significantly in the MEQ, with inadequate marking guides. The MEQ was deemed a less suitable assessment than the script concordance format.

4. Media Review: Relevance and Synopsis

4.1 Relevance

Media reports are a counterpoint to the Literature Review: an expression of public feeling as opposed to educational discourse, and a reflection of interest by a general population who are also stakeholders. Parents want reliable examination procedures in secondary and tertiary education, because pass or fail can heavily influence their children's futures; employers want qualifications to be a true measure of ability. The review covers the period 2011-2013 and primarily concerns secondary education. However, in a maritime education context, the transition from secondary (high school) to tertiary (maritime college) regimes cannot be said to be accompanied by a greatly changed student characterisation. Full media transcripts are available on request.

4.2 Synopsis

Media reports reflect the academic pro and con debate, overlaid with parents' belief that exams should be more rigorous, and employers' concerns that MCQ cannot measure real-world skills and should only be part of assessment. There is recognition that MCQ can disadvantage the brightest students, and that flawed assessments cannot measure learning. MCQ are used in many countries and can test large amounts of information; nevertheless there is a trend away from closed questions generally. Use of MCQ is a significant economic driver, on the basis that MCQ exams are quicker and cheaper to mark, as well as being a more accurate test of knowledge and ability. Scanning MCQ by computer is not perceived to affect student marks. In one US state it costs $52.3 million, or $19.93 per student, to develop and field-test an annual MCQ summative test. This compares to $146.1 million, or $55.67 per student, for a high-quality comprehensive assessment system with half as many MCQ items and more essay questions. However, with a consortium of states using teachers instead of vendors to score test items, the price can be reduced to $10 per student. In a Canadian province, eliminating essay questions saves about $1.5 million, reducing the annual cost of processing diploma exams to $10.5 million. In the United Kingdom it is proposed to replace written tests in primary schools with MCQ to avoid the repeated failures of exam marking, because of the difficulty in accurately marking essays.
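The per-student figures quoted above are internally consistent: both US totals imply the same tested population, which a two-line check confirms.

```python
# Consistency check of the reported costs: both totals should imply the
# same number of tested students.
students_mcq = 52.3e6 / 19.93    # $52.3M at $19.93 per student
students_full = 146.1e6 / 55.67  # $146.1M at $55.67 per student
assert abs(students_mcq - students_full) / students_mcq < 0.01  # ~2.62 million either way
```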

5. The New Study

5.1 Strength and Limitation

As mentioned in the authors' previous papers, the strength of this work lies in the originality of an approach that has not been attempted before in maritime education, shedding new light on a subject that frequently occupies the thoughts of maritime educators and providing opportunity for dialogue. Its limitation is the challenge of gathering data as private individuals. In general, an in-depth survey of maritime examination and teaching methods is needed, with greater numbers of participants and more specific study and survey instruments, requiring an international and institutional approach.

5.2 Language and MCQ Assessment

The IMLA-14 survey indicates that lack of English comprehension is a significant factor in MCQ assessment. The IMEC-19 study uses a database of 1,500 MCQ contributed by 22 countries, of which three are nominally English first language, indicating the extent to which English is used in maritime education. The IMEC-21 study shows that students with English as a first language and/or good English comprehension scored best on MCQ tests, independent of subject knowledge. In other words, the test is as much about language as it is about technical knowledge.

For maritime students with English as a second language (ESL), comprehension is a factor in assessment systems, including MCQ, and may influence the measure of knowledge or ability. Also, if ESL is the language of instruction, there is a question of whether the level of knowledge assessed for a first language student is the same as for an ESL student in any given examination. Other studies show that MCQ assessment does not always properly reflect an ESL student's knowledge or ability. Deficiencies in language comprehension need not be restricted to ESL maritime students. For example, in countries where English is nominally the first language, such as the UK, USA, Australia or Canada, indigenous (English native-tongue) maritime students are sometimes referred to English language classes for new immigrants to improve or remedy their language skills.68

5.3 Study Overview

The new study considers the perspectives of language. IMEC-21 compared the MCQ test scores of student mariners to novices (students without maritime technology knowledge). Novice scores frequently mirrored and sometimes equalled or exceeded mariner scores. The new study was partly presented at IMLA-20. The study has 132 participating English Language faculty from sectors in maritime colleges around the world. Participants have no formal training or experience in maritime operations, and address the same MCQ tests used in the IMEC-21 study.

The study presumption is the same as in the authors' earlier studies: if persons have no subject knowledge, their MCQ scores must be due to other factors. English Language teachers were selected because previous research highlighted the importance of language comprehension, hence the supposition that English Language teachers, as linguists, must score better than mariners and novices, recognising that other factors such as age and education also influence MCQ performance. The study's objective is to investigate the relative significance of influencing factors, particularly language comprehension, providing insights helpful when evaluating confidence in MCQ assessment.


The study also attempts to identify the reasons why MCQ are answered correctly or incorrectly independent of whether the answer is respectively known or not known, and, contrary to probability, why certain questions are predominantly checked right or wrong. The study uses the same 20-question MCQ tests (navigating and engineering) given to mariners and novices in previous research. The tests are randomly selected from a database of 1,500 standard 4-response English language questions designed to test factual knowledge, contributed by faculty and officials from METs and administrations in 22 countries. The 20 MCQ randomly selected in each test are arranged so that the first 10 questions are more basic and the last 10 more advanced. This arrangement separates possibly general knowledge questions from those of a more technical nature. Participants are in four similar groups, A, B, C and D.
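The test-assembly step above can be sketched as follows; the bank record format and function name are hypothetical, not the study's actual tooling.

```python
# Hypothetical sketch of assembling one 20-question test from the question
# bank: 10 basic questions first, then 10 advanced, each drawn at random.
import random

def assemble_test(bank, n_basic=10, n_advanced=10, seed=None):
    rng = random.Random(seed)
    basic = [q for q in bank if q["level"] == "basic"]
    advanced = [q for q in bank if q["level"] == "advanced"]
    return rng.sample(basic, n_basic) + rng.sample(advanced, n_advanced)
```

Fixing the seed makes a drawn test reproducible, which matters when the same papers are reused across participant groups.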

5.4 Presentation of Data

The data collected is explored in three ways: first, determining the proportions attributed to factors influencing test scores; second, comparing participant scores with the mariners and novices who took the same tests; third, focusing on individual test questions where scores are anomalous. The first was presented at IMLA-20, showing the significant influence of Deduction, as well as the wide range of scores. The second compares the performance of participants, mariners and novices; Figure 2 is a graphical representation of the results for Group B (English Language teachers). The third focuses on individual test questions, tabulated for Group B in Figure 3.

Figure 2 – Group B: Graphical Representation and Interpretation

Study B6(2): Rankings for Each Test Question. The original graph is reproduced here as a table of percentage scores per question; First/Second/Third are the highest, middle and lowest of the three group scores on each question. Questions 1-10 are basic, 11-20 advanced.

Question #   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15  16  17  18  19  20
First      100  90  55  60 100  91  73  91  27  36  94  90  15  79  45  45  50  30  42  52
Second      85  67  36  42  45  90  70  90  10  30  82  85   9  73  40  30  36  30  40  50
Third       36  36  20   9  36  70  45  61   9  18  60  55   0  40  39  30  27   9  36  36

Times placed First/Second/Third:

                     All (1-20)    Basic (1-10)    Advanced (11-20)
Study Participants     9/7/4          3/4/3             6/3/1
Novices                5/4/11         3/1/6             2/3/5
Mariners               6/9/5          4/5/1             2/4/4

Interpretation examples:

For individual questions, e.g. Question #1: Mariners, Participants and Novices scored 100%, 85% and 36% respectively.

For All Questions (1-20), First (place): Participants scored higher than Novices or Mariners on 9 out of 20 occasions; Novices scored lower than Participants and Mariners on 5 out of 20 occasions; Mariners scored lower than Participants and higher than Novices on 6 out of 20 occasions.

For Advanced Questions (11-20), First (place): Participants scored higher than Novices or Mariners on 6 out of 10 occasions; Novices scored lower than Participants and equal to Mariners on 2 out of 10 occasions; Mariners scored lower than Participants and equal to Novices on 2 out of 10 occasions.
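The First/Second/Third tallies above can be reproduced with a short routine. The data below are hypothetical; note also that the study records ties as 'equal' placings, whereas this sketch breaks ties by listing order.

```python
# Illustrative tally of placings: for each question the three groups'
# percentage scores are ordered and each group's placings are counted.
def rank_counts(scores_by_group):
    """{group: [score per question]} -> {group: [firsts, seconds, thirds]}"""
    groups = list(scores_by_group)
    n_questions = len(next(iter(scores_by_group.values())))
    counts = {g: [0, 0, 0] for g in groups}
    for q in range(n_questions):
        ordered = sorted(groups, key=lambda g: scores_by_group[g][q],
                         reverse=True)
        for place, g in enumerate(ordered):
            counts[g][place] += 1
    return counts
```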


Figure 3 – Group B: Tabular Representation and Interpretation

[The original table records, for each of the 20 questions, the number of the 33 Participants choosing each response (A-D), the attribution of correct answers to Knowledge (K), Deduction (D), Word/Concept (W), Language/Grammar (L), Guessing (G) or Intuition (I), and the average test score. Its summation rows are reproduced below.]

Correct answers by attribution (percentages of the 660 possible answers):

                   Knowledge  Deduction  Word/Concept  Lang/Gram  Guessing  Intuition   Total
All (1-20)         139 (21%)   66 (10%)     13 (2%)     12 (2%)   76 (12%)    36 (5%)  342 (52%)
Basic (1-10)        72 (11%)   38 (6%)       5 (1%)      2 (0%)   39 (6%)     20 (3%)  176 (27%)
Advanced (11-20)    67 (10%)   28 (4%)       8 (1%)     10 (2%)   37 (6%)     16 (2%)  166 (25%)

For Qu. 1 the 33 Participants attributed their 28 correct answers (C) to Knowledge (23), Deduction (2) and Guessing (3). Responses A, B and D are incorrect. The total number of correct answers (28) divided by the number of Participants (33) gives the average test score (85%) for Qu. 1.

For Qu. 20 the 33 Participants attributed their 17 correct answers (B) to Knowledge (1), Word/Concept (2), Language/Grammar (2), Guessing (5) and Intuition (7). Responses A, C and D are incorrect. The total number of correct answers (17) divided by the number of Participants (33) gives the average test score (52%) for Qu. 20.

Numbers: for Knowledge there were 72 correct answers for the Basic Questions (1-10) and 67 for the Advanced Questions (11-20), a total of 139 for the whole test, expressed as percentages of 11%, 10% and 21% respectively. Summation is for the total number of correct answers (342) out of a total possible 660 (33 Participants x 20 questions/test), contributing to the average test score of 52%.
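The Figure 3 summation arithmetic can be checked directly; the dictionary keys and the helper below are my shorthand for the figure's categories, not the paper's own code.

```python
# Consistency check of the Figure 3 summation: correct-answer counts per
# attribution category, with percentages against the 660 possible answers.
PARTICIPANTS, QUESTIONS = 33, 20
POSSIBLE = PARTICIPANTS * QUESTIONS  # 660

basic    = {"Knowledge": 72, "Deduction": 38, "Word/Concept": 5,
            "Lang/Gram": 2, "Guessing": 39, "Intuition": 20}
advanced = {"Knowledge": 67, "Deduction": 28, "Word/Concept": 8,
            "Lang/Gram": 10, "Guessing": 37, "Intuition": 16}

totals = {k: basic[k] + advanced[k] for k in basic}

def pct(n):
    """Percentage of all possible answers, rounded as in the figure."""
    return round(100 * n / POSSIBLE)

print(sum(totals.values()), pct(sum(totals.values())))  # -> 342 52
```

Every row and column reconciles: 176 basic plus 166 advanced correct answers total 342, i.e. the 52% average test score.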

5.5 Further Investigation

The study again indicates that persons with no subject knowledge can obtain significant scores, with wide ranges, attributed to chance, guessing, intuition and other factors. A statistical analysis is planned to characterise the influencing factors, focusing on individual test questions and anomalous scores.

6. Conclusion

The latest research supports previous conclusions that MCQ are useful classroom tools if there is dialogue between instructor and student, but are not appropriate in formal examinations leading to a STCW qualification, because standard MCQ assessment is inherently unreliable due to random influences and is not designed to evaluate competence.

MCQ variants suitable for maritime education may improve reliability, but should still be a minor element in a multi-dimensional process favouring competency-based assessment. As with all MCQ forms, variants require resources for formal instructor training, as well as resources for determining levels of validity and reliability. However, a proliferation of variant types addressing different learning objectives would exacerbate the present situation of disparate international examination methods for STCW competency.



The Authors

Denis Drown Ex.C., F.N.I. is retired from the Marine Institute, Memorial University of Newfoundland, where he worked as an Instructor and Department Head, Nautical Science, and was the first Director of the Centre for Marine Simulation and the Offshore Safety & Survival Centre. He is a Master Mariner with 50 years' experience in the marine transportation industry as mariner, educator, manager, and consultant for national and international projects.

Robert Mercer M.M., M.Ed. is a Master Mariner and Instructor in the School of Maritime Studies, Memorial University of Newfoundland, developing and delivering customized marine training programs. He teaches courses in the Faculty of Education relating to Curriculum and Instructional Development, and has 40 years' experience in the marine industry.

Gary Jeffery Ph.D. is retired from the Faculty of Education, Memorial University of Newfoundland, with 40 years' experience in university teaching and research. He is a licensed psychologist with extensive experience in both standardised psychometric and classroom assessment.

Stephen Cross M.M., M.Sc., Ph.D., F.N.I. is the Director of Projects, Maritime Institute Willem Barentsz (MIWB). He is a Master Mariner with service on cargo ships, tankers and crane/pipelaying vessels. He taught for four years at the World Maritime University and was MIWB Director for nine years. He is an international consultant on simulator training, and a participant in EU R&D projects on ship safety, communications, crisis management and maritime education.

References

1 Sampson, H., Gekara, V. and Bloor, M. "Water-tight or sinking? A consideration of the standards of the contemporary assessment practices underpinning seafarer licence examinations and their implications for employers", Maritime Policy & Management, Vol. 38, No. 1, pp 81-92, January 2011.
2 Goldberg, M. "Using Multiple Choice Tests In Maritime Training Assessments", Maritime Professional, March 18, 2013. http://www.maritimeprofessional.com/Blogs/Maritime-Training-Issues/March-2013/Using-Multiple-Choice-Tests-in-Maritime-Training-A.aspx. Accessed 22 March 2013.
3 Salend, S.J. "Creating Student Friendly Tests", Educational Leadership, November 2011.
4 Transport Canada. "Staff Instruction: Multiple-Choice Examination Question Development", Document SI 566-001, Issuing Office, Civil Aviation, 2009.
5 Hussein, A., Abdelkhalek, N. and Hamdy, H. "Setting and maintaining standards in multiple choice examinations: Guide supplement 37.3 - practical application", Medical Teacher, Vol. 32, pp 610-612.
6 "MARS, Colregs and Competence: Editorial", Seaways, the Journal of the Nautical Institute, October 2012.
7 Drown, D.F., Mercer, R., Jeffery, G. and Cross, S. "Multiple Choice Question Assessment: A Question of Confidence - Edited", Seaways, the Journal of the Nautical Institute, October 2012.
8 Vallance, K. "Captain's Column: COLREGs - education and examination", Seaways, the Journal of the Nautical Institute, May 2013.
9 Breakstone, J., Smith, M. and Wineburg, S. "Beyond the bubble in history/social studies assessments", Phi Delta Kappan Magazine, Feb. 2013.
10 Holder, L. "Quality Assurance in Computer-Based Training", Seaways, the Journal of the Nautical Institute, January 2013.
11 Heyborne, W.H., Clarke, J.A. and Perrett, J.J. "A Comparison of Two Forms of Assessment in an Introductory Biology Laboratory Course", Journal of College Science Teaching, Vol. 40, No. 5, 2011.
12 Fazio, L.K., Agarwal, P.K., Marsh, E.J. and Roediger III, H.L. "Memorial Consequences of Multiple-Choice Testing on Immediate and Delayed Tests", Memory & Cognition, Vol. 38 (4), 2010, pp 407-418.
13 Hartman, J.R. and Lin, S. "Analysis of Student Performance on Multiple-Choice Questions in General Chemistry", Journal of Chemical Education, Vol. 88, 2011, pp 1223-1230.
14 Moncada, S.M. and Moncada, T.P. "Assessing Student Learning with Conventional MC Exams: Design and Implementation Considerations", International Journal of Education Research, Vol. 5, No. 2, 2010.
15 Dickinson, J.R. "The Difficulty and Discriminating Ability of a Consumer Behavior Multiple-Choice Question Bank", American Marketing Association, Winter 2011, p 25.
16 Caliskan, S.A., Durak, H.I., Elif, S. and Karabilgin, S. "Developing a web-based multiple-choice question item bank", Medical Education, 2010, Vol. 44, pp 489-526.
17 Gekara, V.O., Bloor, B. and Sampson, H. "Computer-based assessment in safety-critical industries: the case of shipping", Journal of Vocational Education and Training, Vol. 63, No. 1, March 2011, pp 87-100.
18 Heick, T. "The Real Problem with Multiple Choice Tests", Edutopia, reported in the Washington Post, 25 January 2013.
19 Supra note 2.
20 Van Der Vleuten, C. "Setting and maintaining standards in multiple choice examinations: Guide supplement 37.1 - Viewpoint", Medical Teacher, Vol. 32, 2010, pp 174-176.
21 Supra note 1.
22 Supra note 2.
23 Arthur, N. and Everaert, P. "Gender and Performance in Accounting Examinations: Exploring the Impact of Examination Format", Accounting Education: an International Journal, Vol. 21, No. 5, pp 471-487, October 2012.
24 Everaert, P. and Neal, A. "Constructed-response versus multiple choice: the impact on performance in combination with gender", Ghent University, Faculty of Economics and Business Administration, March 2012 (Working Paper).
25 Hudson, R.D. "Is there a relationship between chemistry performance and question type, question content and gender?", Science Education International, Vol. 23, No. 1, March 2012, pp 56-83.
26 Hoang, V., May, L. and Tang, T. "The Effects of Linguistic Factors on Student Performance on Economics Multiple Choice Questions", July 17, 2012. Available at SSRN: http://ssrn.com/abstract=2121816 or http://dx.doi.org/10.2139/ssrn.2121816 (accessed 30 Mar 2013).
27 Yu, C.H. "A Simple Guide to the Item Response Theory (IRT) and Rasch Modeling", July 24, 2012, http://www.creative-wisdom.com (accessed 10 April 2013).
28 Cameron, M.P., Calderwood, R., Cox, A., Lim, S. and Yamaoka, M. "Factors Associated with Financial Literacy among High School Students", Waikato Department of Economics, Working Paper in Economics 13/05, University of Waikato, Hamilton, New Zealand, March 2013.
29 Personal Communication, October 2012.
30 Ziarati, M., Yi, J., Ziarati, R. and Sernikli, S. "Validation of the MarTEL Test: the Importance of Validity of the Test and the Procedure for Validation in MarTEL", Proceedings of the International Maritime English Conference, IMEC 24, 2012.
31 Brantmeier, C., Callender, A. and Xiucheng, Y. "Textual enhancements and comprehension with adult readers of English in China", Reading in a Foreign Language, October 2012, Vol. 24, No. 2, pp 158-185.
32 Langah, R.A.K., Munshi, P. and Langah, R.A.K. "Multilingual Online Examination System", Interdisciplinary Journal of Contemporary Research in Business, August 2012, Vol. 4, No. 4, pp 820-828.
33 Shuhidan, S., Hamilton, M. and D'Souza, D. "Instructor perspectives of multiple-choice questions in summative assessment for novice programmers", Computer Science Education, Vol. 20, No. 3, September 2010, pp 229-259.
34 Beggs, B. Editorial, International Journal of Electrical Engineering Education, Vol. 48, No. 3, 2011, Glasgow University.
35 Barlow, A.T. and Marolt, A.M. "Effective use of multiple-choice items in the mathematics classroom", Middle School Journal, January 2012, p 50.
36 Tarrant, M. "The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments", Nurse Education Today, 2006.
37 Zulfikar, T. "The Making of Indonesian Education: An Overview on Empowering Indonesian Teachers", Journal of Indonesian Social Sciences and Humanities, Vol. 2, 2009, pp 13-39.
38 Goldberg, M. "Should Training for Maritime Instructors be Mandatory?", Blog, July 23, 2012.
39 Cox, Q.N. "Is it time to require the Training for Instructors course to become mandatory under STCW?", Proceedings of the International Maritime Lecturers Association, IMLA 20, 2012.
40 DiBattista, D. and Kurzawa, L. "Examination of the Quality of Multiple-choice Items on Classroom Tests", The Canadian Journal for the Scholarship of Teaching and Learning, Vol. 2, Issue 2, 2011, Article 4.
41 Torres, C., Lopes, A.P., Babo, L. and Azevedo, J. "Improving Multiple-Choice Questions", US-China Education Review, Vol. B 1, 2011, pp 1-11.
42 Nedeau-Cayo, R., Laughlin, D., Rus, L. and Hall, J. "Assessment of Item-Writing Flaws in Multiple-Choice Questions", Journal for Nurses in Professional Development, Vol. 29, No. 2, 2013, pp 52-57.
43 CE Connections. "CE Test: Assessment of Item-Writing Flaws in Multiple-Choice Questions", Journal for Nurses in Professional Development, March/April 2013 (see note 42, Nedeau-Cayo 2013).
44 Supra note 40.
45 Abdalla, M.E. "What does Item Analysis Tell Us? Factors affecting the reliability of Multiple Choice Questions (MCQs)", Gezira Journal of Health Sciences, Vol. 7, No. 1, June 2011.
46 Tasdemir, M. "A Comparison of Multiple-Choice Tests and True-False Tests Used in Evaluating Student Progress", Journal of Instructional Psychology, Vol. 37, No. 3, 2010, pp 258-266.
47 Xu, Y. "Principles of Constructing Multiple-choice in Reading Comprehension of CET-4 and Their Enlightening to General College English Teaching", International Journal of English Linguistics, Vol. 1, No. 1, March 2011.
48 Campbell, D.E. "How To Write Good Multiple-Choice Questions", Journal of Paediatrics and Child Health, Vol. 47, 2011.
49 Yonker, J. "The relationship of deep and surface study approaches on factual and applied test-bank multiple-choice question performance", Assessment & Evaluation in Higher Education, Vol. 36, No. 6, October 2011, pp 673-686.
50 "Yes We Can, Pass The Bar. University of the District of Columbia, David A. Clarke School of Law Bar Passage Initiatives and Bar Pass Rates - from the Titanic to the Queen Mary!", University of the District of Columbia Law Review, 10 March 2011.
51 Tweed, M.J., Thompson, M., Schwartz, P. and Wilkinson, T.J. "A Confidence And Safety Approach To Multiple-choice Question Scoring", Focus On Health Professional Education: A Multi-Disciplinary Journal, Vol. 13, No. 3, 2012.
52 Ajayi, B.K. and Omirin, M.S. "The Effect of Two Scoring Methods on Multiple Choice Agricultural Science Test Scores", Review of European Studies, Vol. 4, No. 1, p 255, March 2012.
53 Corrigan, H. and Craciun, G. "Asking the Right Questions: Using Student-Written Exams as an Innovative Approach to Learning and Evaluation", Marketing Education Review, Vol. 23, No. 1, Spring 2013, pp 31-35.
54 Denny, P., Hamer, J., Luxton-Reilly, A. and Purchase, H. "PeerWise: Students Sharing their Multiple Choice Questions", Fourth International Computing Education Research Workshop (ICER 2008), September 6-7, 2008, Sydney, Australia.
55 Luxton-Reilly, A. "The Design and Evaluation of StudySieve: A Tool That Supports Student-Generated Free-Response Questions, Answers and Evaluations", PhD thesis, Computer Science, The University of Auckland, 2012.
56 Supra note 48.
57 Institute of Asset Management. "Writing MC and scenario-based questions: Guidance note", Ver. 4, 30 July 2012.
58 Iwata, T., Kojiri, T., Yamada, T. and Watanabe, T. "Recommendation for English MC cloze questions based on expected test scores", International Journal of Knowledge-based and Intelligent Engineering Systems, Vol. 15, 2011.
59 Mujeeb, A.M., Pardeshi, M.L. and Ghongane, B.B. "Comparative Assessment of Multiple Choice Questions Versus Short Essay Questions in Pharmacology Examinations", Indian Journal of Medical Sciences, Vol. 64, No. 3, March 2010.
60 Kettler, R.J., Rodriguez, M.C., Bolt, D.M., Elliott, S.N., Beddow, P.A. and Kurz, A. "Modified Multiple-Choice Items for Alternate Assessments: Reliability, Difficulty, and Differential Boost", Applied Measurement in Education, Vol. 24, 2011, pp 210-234.
61 Kim, M.K., Patel, R.A., Uchizono, J.A. and Beck, L. "Incorporation of Bloom's Taxonomy into Multiple-Choice Examination Questions for a Pharmacotherapeutics Course", American Journal of Pharmaceutical Education, Vol. 76 (6), 2012.
62 Tiemeier, A.M., Stacy, Z.A. and Burke, J.M. "Using Multiple Choice Questions Written at Various Bloom's Taxonomy Levels to Evaluate Student Performance across a Therapeutics Sequence", Innovations in Pharmacy, Vol. 2, No. 2, 2011.
63 Supra note 3.
64 Yaman, S. "The Optimal Number of Choices in Multiple-Choice Tests: Some Evidence for Science and Technology Education", The New Educational Review, 2011, pp 228-249.
65 Sinha, N. "The Effects of Immediate Versus Delayed Feedback after Multiple-Choice Questions on Subsequent Exam Performance", Thesis, Graduate School - New Brunswick, Rutgers, The State University of New Jersey, October 2012.
66 Park, J. "Constructive multiple-choice testing system", British Journal of Educational Technology, Vol. 41, No. 6, 2010.
67 Palmer, E.J., Duggan, P., Devitt, P.G. and Russell, R. "The modified essay question: Its exit from the exit examination?", Medical Teacher, Vol. 32, 2010, pp e300-e307.
68 Personal Communication, March 2006.