NATIONAL BOARD OF MEDICAL EXAMINERS® Subject Examination Program

Clinical Science Examination

Score Interpretation Guide

New Score Scale Information (Effective August 2015)

GRPID:CLNEURO$$

NBME® subject examinations provide medical schools with a tool for measuring examinees' understanding of the clinical sciences. Items on this examination were written and reviewed by national test committees. Prior to publication, test forms are reviewed by a panel of course directors from this discipline. Although these examinations are designed to be broadly appropriate as part of overall examinee assessment, course objectives vary across schools, and the congruence between subject examination content and course objectives should be considered when interpreting test scores and determining grading standards. Specifically, subject examination scores should not be used alone, but rather in conjunction with other indicators of examinee performance in determination of grades.

Subject Examination Scores

The subject examination score is an equated percent correct score that represents mastery of the content domain assessed by the examination. It is calculated as the percentage of items in the total content domain that would be answered correctly based on an examinee’s proficiency level. The subject examination scores are equated across test administrations and are statistically adjusted for variations in test form difficulty. Consequently, these scores can be used to compare and track school and examinee performance over time. The subject examination scores are placed on a classic percent correct metric (0 – 100%) to facilitate interpretation and use. This scale can easily be incorporated into local assessments and grading schemes and provides a useful tool for comparing the scores of your examinees with those of a large, nationally representative group taking the examination as an end-of-course or end-of-clerkship examination. Note: Equated percent correct scores cannot be directly compared to scaled scores reported prior to August 2015. Please consult the score equivalency table for instruction on how to translate old scores to the new scale.
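As a conceptual illustration of the definition above, the equated percent correct can be read as the average, across all items in the content domain, of the chance that an examinee at a given proficiency level answers each item correctly. The sketch below only restates that definition; it is not NBME's scoring algorithm, and the function name and inputs are hypothetical.

```python
def equated_percent_correct(p_correct):
    """Conceptual sketch: given model-based probabilities of a correct response
    to each item in the content domain at an examinee's proficiency level, the
    equated percent correct is their average expressed on a 0-100 scale."""
    return 100 * sum(p_correct) / len(p_correct)

# Hypothetical examinee expected to answer about 3 of 4 domain items correctly
print(round(equated_percent_correct([0.9, 0.8, 0.7, 0.6])))  # -> 75
```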

Precision of Scores

Measurement error is present on all tests, and the standard error of measurement (SEM) provides an index of the (im)precision of scores. The SEM indicates how far an examinee’s score on the examination might stray from his/her “true” proficiency level across repeated testing using different sets of items covering the same content. Using the SEM, it is possible to calculate a score interval that will encompass about two thirds of the observed scores for a given true score by adding and subtracting the SEM from that score. For this examination, the SEM is approximately 4 points. If an examinee’s true proficiency on the examination is 75, the score he/she achieved on the examination will usually (two times out of three) fall between 71 and 79 (75 - 4 and 75 + 4).
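The interval described above is simple arithmetic. The sketch below merely restates it using the SEM of approximately 4 points quoted for this examination; the function name is hypothetical.

```python
def sem_interval(true_score: float, sem: float = 4.0) -> tuple[float, float]:
    """Score band (true_score - SEM, true_score + SEM); about two thirds of
    observed scores for an examinee at this true proficiency fall inside it."""
    return (true_score - sem, true_score + sem)

# Example from the text: true proficiency of 75 with an SEM of about 4 points
low, high = sem_interval(75)
print(f"Observed scores will usually fall between {low:.0f} and {high:.0f}")  # 71 and 79
```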

Score and Performance Feedback

Summary information on the examinee group tested, examination purpose, and number of items scored is provided on each page of the feedback. The Roster of Equated Percent Correct Scores reports a total test equated percent correct score for each examinee. Reported scores also appear in a comma-separated text file that can be downloaded. An Examinee Performance Profile, which graphically displays content areas of strength and weakness, is provided for each examinee.

If there were at least 2 examinees, Equated Percent Correct Score Descriptive Statistics for reported scores are provided along with a Frequency Distribution of the total test equated percent correct score. If there were at least 10 examinees for a single form administration, a detailed Content Area Item Analysis Report summarizing the general content of each item on the exam along with group item performance is provided. Content area item descriptors and group item performance also appear in a file that can be downloaded. If there were at least 10 examinees for a single form administration or 20 examinees for a multiple form administration, a Summary Content Area Item Analysis Report is provided.

If examinees were tested at your school in the previous academic year, a Year-End Report is provided. The report summarizes the performance of first-time takers and is posted annually in November to the NBME Services Portal (NSP).

Grading Guidelines

Grading guidelines for this exam have been developed by a nationally representative group of clerkship directors to assist schools and institutions in setting fair and valid passing and honors standards. An abbreviated summary of the grading guidelines is provided below; the full study, with a list of participating schools, is reported on NSP.

Norms

Total academic year and quarterly norms are provided to aid in the interpretation of examinee performance. The norms reflect the performance of first-time takers who took a paper or web form as an end-of-course or end-of-clerkship examination, reported across an entire academic year and by quarterly testing period. Quarterly norms are provided because scores on some subject examinations are progressively higher for examinees of equivalent ability who take the relevant clerkship later in the academic year. This information may have particular relevance to schools that use the norm tables in the development of grading guidelines. The two most recent sets of norms developed for this examination are provided for your convenience and are reported on the equated percent correct score scale. Norms will be updated in November to reflect the most recent academic year of data. Norms can also be found on NSP.

Score Equivalency Table

This table is intended to aid in your transition to the new subject examination score scale. The new subject examination score is an equated percent correct score that represents mastery of the content domain assessed by the examination. It is calculated as the percentage of items in the content domain that would be answered correctly based on an examinee’s proficiency level. The new subject examination scores are placed on a classic percent correct metric (0 – 100%) to facilitate interpretation and use. The table below provides approximate performance equivalents from the old scaled score to the new equated percent correct score. To use the table, locate an examinee’s score in the column labeled “Old Scaled Score” and look across to the column labeled “Equated Percent Correct Equivalent”. This number indicates the examinee’s equivalent equated percent correct score. This table is also available on the NBME Services Portal (NSP) as a file that can be downloaded.

Approximate Subject Examination Scaled Score to Equated Percent Correct Score Equivalency Table

| Old Scaled Score | Equated Percent Correct Equivalent | Old Scaled Score | Equated Percent Correct Equivalent | Old Scaled Score | Equated Percent Correct Equivalent | Old Scaled Score | Equated Percent Correct Equivalent |
|---|---|---|---|---|---|---|---|
| 99 | 98 | 74 | 75 | 49 | 36 | 24 | 10 |
| 98 | 94 | 73 | 73 | 48 | 35 | 23 | 9 |
| 97 | 94 | 72 | 72 | 47 | 33 | 22 | 8 |
| 96 | 93 | 71 | 71 | 46 | 32 | 21 | 8 |
| 95 | 93 | 70 | 69 | 45 | 31 | 20 | 7 |
| 94 | 92 | 69 | 68 | 44 | 29 | 19 | 7 |
| 93 | 92 | 68 | 66 | 43 | 28 | 18 | 7 |
| 92 | 91 | 67 | 65 | 42 | 27 | 17 | 6 |
| 91 | 90 | 66 | 63 | 41 | 25 | 16 | 6 |
| 90 | 90 | 65 | 62 | 40 | 24 | 15 | 5 |
| 89 | 89 | 64 | 60 | 39 | 23 | 14 | 5 |
| 88 | 88 | 63 | 59 | 38 | 22 | 13 | 5 |
| 87 | 88 | 62 | 57 | 37 | 21 | 12 | 4 |
| 86 | 87 | 61 | 55 | 36 | 20 | 11 | 4 |
| 85 | 86 | 60 | 54 | 35 | 18 | 10 | 4 |
| 84 | 85 | 59 | 52 | 34 | 17 | 9 | 4 |
| 83 | 84 | 58 | 51 | 33 | 16 | 8 | 3 |
| 82 | 83 | 57 | 49 | 32 | 16 | 7 | 3 |
| 81 | 83 | 56 | 47 | 31 | 15 | 6 | 3 |
| 80 | 81 | 55 | 46 | 30 | 14 | 5 | 3 |
| 79 | 80 | 54 | 44 | 29 | 13 | 4 | 2 |
| 78 | 79 | 53 | 43 | 28 | 12 | 3 | 2 |
| 77 | 78 | 52 | 41 | 27 | 12 | 2 | 2 |
| 76 | 77 | 51 | 39 | 26 | 11 | 1 | 1 |
| 75 | 76 | 50 | 38 | 25 | 10 |  |  |
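For schools converting historical records programmatically, the lookup described above reduces to a dictionary keyed by the old scaled score. The sketch below is illustrative only: it hard-codes a handful of entries from the table, whereas a local script would load the complete mapping from the downloadable file on NSP, and all names are hypothetical.

```python
# A few (old scaled score -> equated percent correct equivalent) entries
# copied from the table above; a production script would load the full
# mapping from the downloadable NSP file rather than hard-coding it.
OLD_TO_EQUATED = {99: 98, 90: 90, 85: 86, 80: 81, 75: 76, 70: 69, 65: 62, 60: 54}

def to_equated_percent_correct(old_scaled_score: int) -> int:
    """Translate an old scaled score to its equated percent correct equivalent."""
    if old_scaled_score not in OLD_TO_EQUATED:
        raise ValueError(f"no table entry loaded for old scaled score {old_scaled_score}")
    return OLD_TO_EQUATED[old_scaled_score]

print(to_equated_percent_correct(75))  # -> 76
```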

Grading Guidelines

(Reported on the Equated Percent Correct Score Scale)

In 2005, the NBME began conducting webcast standard setting studies for the Clinical Science Subject Examination with medical school faculty from across the United States. For each study, medical school faculty participated as expert judges in webcast sessions that used the internet and conference calling to train participants in the standard setting procedure. Judges reviewed the content and rated the difficulty of each item on a current form of the examination. Each study employed both a Modified Angoff content-based procedure and the Hofstee Compromise standard setting method. Together, these two procedures provide proposed passing standards based on an in-depth item-by-item analysis of the examination content as well as a more global analysis of that content. The results were summarized and the proposed standards were expressed as the proportion of the content required for a candidate to pass and to receive honors status. Table 1 summarizes the medical school faculty who served as expert judges, and their school information, for each of the webcast studies conducted by the NBME; a simplified numerical sketch of both procedures follows the table.

Table 1 – Demographics of Expert Judges and Schools Participating in Webcast Studies

| Standard Setting Study | Number of Judges | Years of Experience | % Clerkship Directors | Number of Schools | Traditional School Curriculum | School Clerkship Length |
|---|---|---|---|---|---|---|
| 2005 | 20 | 1–17 | 100% | 20 | 43% | 2–6 weeks |
| 2010 | 14 | 1–23 | 79% | 14 | 21% | 2–6 weeks |
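The sketch below illustrates, in generic textbook form, the arithmetic behind the two procedures named above: a Modified Angoff cut score as the average of judges' item-level ratings, and a Hofstee compromise as the cut score whose observed fail rate comes closest to the diagonal defined by the judges' acceptable ranges. This is not NBME's implementation, and all inputs shown are hypothetical.

```python
from statistics import mean

def modified_angoff_cut(ratings):
    """ratings[j][i]: judge j's estimated probability that a borderline examinee
    answers item i correctly. The cut score is the overall mean rating,
    expressed as a percent correct score."""
    return 100 * mean(mean(judge) for judge in ratings)

def hofstee_cut(scores, c_min, c_max, f_min, f_max, step=0.5):
    """Hofstee compromise: judges supply lowest/highest acceptable cut scores
    (c_min, c_max) and fail rates (f_min, f_max, as proportions). The compromise
    is the cut score in [c_min, c_max] whose observed fail rate is closest to
    the diagonal running from (c_min, f_max) to (c_max, f_min)."""
    def observed_fail_rate(cut):
        return sum(s < cut for s in scores) / len(scores)
    def diagonal(cut):
        return f_max + (cut - c_min) * (f_min - f_max) / (c_max - c_min)
    candidates = [c_min + i * step for i in range(int((c_max - c_min) / step) + 1)]
    return min(candidates, key=lambda c: abs(observed_fail_rate(c) - diagonal(c)))

# Hypothetical toy inputs: three judges rating four items, and a small score set
print(modified_angoff_cut([[0.6, 0.5, 0.7, 0.4], [0.5, 0.5, 0.6, 0.5], [0.7, 0.6, 0.6, 0.5]]))
print(hofstee_cut(scores=[48, 55, 58, 62, 66, 70, 74, 78, 82, 90],
                  c_min=52, c_max=65, f_min=0.0, f_max=0.3))
```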

The data shown below represent a compilation of the opinions of the medical school faculty who participated in the webcast studies. The results reported have been converted from the old scaled score to the new subject examination Equated Percent Correct score scale effective August 2015. The study results are provided to assist you in setting fair and valid standards for this examination.

Table 2 provides a summary of the results for passing scores from the Modified Angoff and Hofstee Compromise procedures. The recommended minimum passing score based on the 2010 Angoff results is an equated percent correct score of 55, which is slightly higher than the recommended standard in 2005. This score fell well within the acceptable range of minimum passing scores (52 to 65) computed from the 2010 Hofstee results, which suggests that any standard selected within this range would be reasonable. The recommended minimum passing score based on the 2010 Hofstee results is a subject examination score of 60, which is also higher than the 2005 Hofstee result.

Table 2 – Grading Guidelines for Passing (Mean Equated Percent Correct Scores)

| Standard Setting Study | Number of Judges | Modified Angoff Recommended Passing Score | Hofstee Compromise Range of Acceptable Minimum Passing Scores | Hofstee Compromise Recommended Passing Score |
|---|---|---|---|---|
| 2005 | 20 | 52 | 52 to 63 | 57 |
| 2010 | 14 | 55 | 52 to 65 | 60 |

Table 3 provides a summary of the Hofstee results for honors. The 2010 study results indicate that the minimum acceptable score for honors should fall between 75 and 85. This range has shifted slightly downward relative to the 2005 study, and its lower bound is one point lower (75 versus 76).

Table 3 – Grading Guidelines for Honors (Mean Equated Percent Correct Scores)

| Standard Setting Study | Number of Judges | Hofstee Compromise Range of Acceptable Minimum Honors Scores |
|---|---|---|
| 2005 | 20 | 76 to 85 |
| 2010 | 14 | 75 to 85 |

YYY1-YYY2 Academic Year Norms

Percentile Ranks

| Equated Percent Correct Score | Academic Year (n=13,203) | Quarter 1 (n=3,577) | Quarter 2 (n=3,104) | Quarter 3 (n=3,205) | Quarter 4 (n=3,005) |
|---|---|---|---|---|---|
| 100 | 100 | 100 | 100 | 100 | 100 |
| 99 | 100 | 100 | 100 | 100 | 100 |
| 98 | 100 | 100 | 100 | 100 | 100 |
| 97 | 100 | 100 | 100 | 100 | 100 |
| 96 | 100 | 100 | 100 | 100 | 100 |
| 95 | 100 | 100 | 100 | 100 | 100 |
| 94 | 100 | 100 | 100 | 100 | 100 |
| 93 | 99 | 99 | 99 | 99 | 99 |
| 92 | 99 | 99 | 99 | 99 | 98 |
| 91 | 98 | 99 | 98 | 98 | 97 |
| 90 | 97 | 98 | 97 | 96 | 96 |
| 89 | 96 | 97 | 96 | 95 | 95 |
| 88 | 94 | 96 | 94 | 93 | 93 |
| 87 | 92 | 94 | 93 | 91 | 91 |
| 86 | 90 | 92 | 90 | 89 | 88 |
| 85 | 86 | 89 | 87 | 85 | 83 |
| 84 | 83 | 86 | 85 | 81 | 80 |
| 83 | 80 | 83 | 82 | 78 | 76 |
| 82 | 76 | 79 | 77 | 75 | 72 |
| 81 | 72 | 76 | 73 | 70 | 68 |
| 80 | 68 | 72 | 70 | 66 | 63 |
| 79 | 64 | 68 | 65 | 62 | 59 |
| 78 | 58 | 64 | 60 | 56 | 54 |
| 77 | 55 | 60 | 56 | 53 | 51 |
| 76 | 50 | 55 | 51 | 47 | 45 |
| 75 | 45 | 51 | 47 | 43 | 41 |
| 74 | 41 | 47 | 42 | 38 | 37 |
| 73 | 38 | 43 | 38 | 35 | 34 |
| 72 | 34 | 39 | 34 | 31 | 30 |
| 71 | 30 | 36 | 30 | 26 | 26 |
| 70 | 26 | 32 | 26 | 23 | 23 |
| 69 | 23 | 28 | 23 | 20 | 20 |
| 68 | 20 | 25 | 20 | 17 | 17 |
| 67 | 17 | 22 | 17 | 14 | 14 |
| 66 | 15 | 19 | 14 | 13 | 12 |
| 65 | 13 | 17 | 13 | 11 | 10 |
| 64 | 11 | 14 | 10 | 9 | 8 |
| 63 | 9 | 12 | 8 | 8 | 7 |
| 62 | 7 | 10 | 7 | 6 | 6 |
| 61 | 6 | 8 | 6 | 5 | 5 |
| 60 | 5 | 7 | 5 | 4 | 4 |
| 59 | 4 | 6 | 4 | 4 | 3 |
| 58 | 3 | 5 | 3 | 3 | 2 |
| 57 | 3 | 4 | 3 | 2 | 2 |
| 56 | 2 | 4 | 2 | 2 | 2 |
| 55 | 2 | 3 | 2 | 1 | 1 |
| 54 or below | 1 | 2 | 1 | 1 | 1 |

The table provides norms to aid in the interpretation of examinee performance. These norms reflect the performance of examinees from LCME-accredited medical schools who took a form of this examination as an end-of-course or end-of-clerkship examination for the first time during the academic year from mm/dd/yyy1 through mm/dd/yyy2.

The percentile ranks for each quarter are defined using the school-reported start date of the first rotation for this subject. Using the start date of the first rotation, examinees are assigned to the appropriate quarter based on the assumption that their test date would be at least four weeks later. For example, if a school’s start date for the first rotation is March, then the performance of examinees from that school who tested in April, May, or June would be represented in the first quarter. Since quarterly norms are based only on schools that supplied the start date of the first rotation for this subject, the number of examinees reported across quarters may not add up to the total norm group for the academic year.

To use the table, locate an examinee’s score in the column labeled “Equated Percent Correct Score” and note the percentile rank in the column for the Academic Year or Quarterly testing period of interest. This number indicates the percentage of examinees that scored at or below the examinee’s equated percent correct score. The mean and standard deviation of the norm group scores for each testing period are listed below.
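Reading a percentile rank off the table is a direct lookup, as the sketch below shows. It is illustrative only: it hard-codes a few academic-year entries from the norms table above, whereas a local script would load the full norms file from NSP, and the names are hypothetical.

```python
# A few (equated percent correct score -> percentile rank) entries from the
# academic-year column of the norms table above; a local script would load
# the complete norms file from NSP instead of hard-coding values.
ACADEMIC_YEAR_NORMS = {85: 86, 80: 68, 75: 45, 70: 26, 65: 13, 60: 5}

def percentile_rank(score: int, norms: dict) -> int:
    """Percentage of the norm group scoring at or below the given score."""
    if score not in norms:
        raise ValueError(f"no norm entry loaded for score {score}")
    return norms[score]

print(percentile_rank(75, ACADEMIC_YEAR_NORMS))  # -> 45
```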

Equated Percent Correct Scores

| | Academic Year | Quarter 1 | Quarter 2 | Quarter 3 | Quarter 4 |
|---|---|---|---|---|---|
| N | 13,203 | 3,577 | 3,104 | 3,205 | 3,005 |
| Mean | 75.8 | 74.5 | 75.7 | 76.5 | 76.8 |
| SD | 8.8 | 9.1 | 8.6 | 8.6 | 8.6 |

YYY2-YYY3 Academic Year Norms

Percentile Ranks

| Equated Percent Correct Score | Academic Year (n=13,854) | Quarter 1 (n=3,630) | Quarter 2 (n=3,095) | Quarter 3 (n=3,199) | Quarter 4 (n=3,203) |
|---|---|---|---|---|---|
| 100 | 100 | 100 | 100 | 100 | 100 |
| 99 | 100 | 100 | 100 | 100 | 100 |
| 98 | 100 | 100 | 100 | 100 | 100 |
| 97 | 100 | 100 | 100 | 100 | 100 |
| 96 | 100 | 100 | 100 | 100 | 100 |
| 95 | 100 | 100 | 100 | 100 | 100 |
| 94 | 99 | 100 | 99 | 99 | 99 |
| 93 | 99 | 99 | 99 | 99 | 99 |
| 92 | 98 | 99 | 98 | 98 | 98 |
| 91 | 97 | 98 | 97 | 97 | 97 |
| 90 | 96 | 97 | 96 | 95 | 95 |
| 89 | 94 | 96 | 95 | 94 | 93 |
| 88 | 92 | 94 | 93 | 91 | 90 |
| 87 | 90 | 92 | 91 | 89 | 87 |
| 86 | 87 | 89 | 88 | 85 | 83 |
| 85 | 83 | 86 | 84 | 80 | 79 |
| 84 | 79 | 83 | 81 | 76 | 75 |
| 83 | 76 | 80 | 77 | 72 | 71 |
| 82 | 71 | 76 | 73 | 68 | 67 |
| 81 | 67 | 72 | 69 | 63 | 62 |
| 80 | 62 | 68 | 64 | 58 | 57 |
| 79 | 58 | 63 | 59 | 54 | 52 |
| 78 | 53 | 58 | 54 | 49 | 48 |
| 77 | 49 | 54 | 50 | 45 | 43 |
| 76 | 44 | 49 | 46 | 41 | 38 |
| 75 | 39 | 44 | 41 | 36 | 34 |
| 74 | 35 | 40 | 37 | 32 | 30 |
| 73 | 32 | 36 | 33 | 28 | 27 |
| 72 | 28 | 32 | 29 | 25 | 24 |
| 71 | 24 | 28 | 25 | 22 | 21 |
| 70 | 21 | 25 | 22 | 19 | 18 |
| 69 | 19 | 22 | 19 | 17 | 15 |
| 68 | 16 | 19 | 16 | 15 | 13 |
| 67 | 13 | 16 | 14 | 13 | 10 |
| 66 | 11 | 13 | 11 | 10 | 8 |
| 65 | 9 | 11 | 9 | 8 | 7 |
| 64 | 7 | 9 | 7 | 7 | 6 |
| 63 | 6 | 8 | 6 | 5 | 5 |
| 62 | 5 | 7 | 5 | 4 | 4 |
| 61 | 4 | 5 | 4 | 3 | 3 |
| 60 | 3 | 4 | 3 | 3 | 3 |
| 59 | 3 | 4 | 3 | 2 | 2 |
| 58 | 2 | 3 | 2 | 2 | 1 |
| 57 | 2 | 3 | 2 | 2 | 1 |
| 56 | 2 | 2 | 1 | 1 | 1 |
| 55 | 1 | 2 | 1 | 1 | 1 |
| 54 or below | 1 | 1 | 1 | 1 | 1 |

The table provides norms to aid in the interpretation of examinee performance. These norms reflect the performance of examinees from LCME-accredited medical schools who took a form of this examination as an end-of-course or end-of-clerkship examination for the first time during the academic year from mm/dd/yyy2 through mm/dd/yyy3.

The percentile ranks for each quarter are defined using the school-reported start date of the first rotation for this subject. Using the start date of the first rotation, examinees are assigned to the appropriate quarter based on the assumption that their test date would be at least four weeks later. For example, if a school’s start date for the first rotation is March, then the performance of examinees from that school who tested in April, May, or June would be represented in the first quarter. Since quarterly norms are based only on schools that supplied the start date of the first rotation for this subject, the number of examinees reported across quarters may not add up to the total norm group for the academic year.

To use the table, locate an examinee’s score in the column labeled “Equated Percent Correct Score” and note the percentile rank in the column for the Academic Year or Quarterly testing period of interest. This number indicates the percentage of examinees that scored at or below the examinee’s equated percent correct score. The mean and standard deviation of the norm group scores for each testing period are listed below.

Equated Percent Correct Scores

| | Academic Year | Quarter 1 | Quarter 2 | Quarter 3 | Quarter 4 |
|---|---|---|---|---|---|
| N | 13,854 | 3,630 | 3,095 | 3,199 | 3,203 |
| Mean | 77.2 | 76.1 | 76.9 | 77.9 | 78.3 |
| SD | 8.4 | 8.7 | 8.3 | 8.4 | 8.3 |