

The effectiveness of brief, spaced practice on student difficulties with basic and essential engineering skills

Brendon D. Mikula, Andrew F. Heckler Department of Physics

The Ohio State University Columbus, OH, 43210

[email protected], [email protected]

Abstract—Through extensive testing and interviews of sophomore, junior, and senior engineering students in a Materials Science Engineering course at The Ohio State University, we found that these students struggle with many skills necessary for their coursework. Often these “essential skills” were prerequisite to the course and little to no instruction time was spent on them. Online training was developed to attempt to improve these skills. Students participated in the training several times over the term, with each assignment taking 10-20 minutes and consisting of 10 questions. Students were allowed unlimited attempts on each assignment and were required to achieve mastery (80% or better) for full credit. Training covered a wide range of topics: interpreting log plots and log scales, using metric prefixes for various conversions, estimating typical values of common material properties, employing dimensional analysis, and operating equations when given variables in mixed units. Unlike the success achieved by the log plots training, most of the topics saw little improvement as a result of training, despite the basic nature of the skills. Future improvements to the training will focus on determining which factors will help to convince students of the importance of mastering these prerequisite skills.

Keywords—engineering, computer training, online homework, mastery learning, student understanding, essential skills

I. INTRODUCTION AND THEORETICAL BACKGROUND

Many engineering students at The Ohio State University are required to take an introductory course in Materials Science Engineering. Students are expected to leave the course with mastery of certain categories of knowledge which are utilized frequently during coursework, some of which are considered prerequisite to the course. An example of this knowledge is the ability to read and interpret log plots, training on which has been successful across multiple terms [1]. Through interviews with instructors and exploratory pilot testing, we found that students have significant difficulties with a number of other basic skills essential to a functional understanding of materials science engineering and engineering in general.

The “essential skills” studied here consist both of prerequisite skills to the course—e.g., metric prefixes and conversions, dimensional analysis, and operating equations when given variables with mixed units—as well as skills the instructor expects to impart to the students—e.g., order of magnitude estimates and patterns of common material properties.

One critical aspect of the essential skills is that they are necessary for solving the types of problems posed in exams—even the simpler ones. Therefore, students are expected to be near 100% accurate with these skills. Even if students are 80% accurate with these essential skills, this lack of mastery is a critical bottleneck for successful performance. Here we demonstrate that a worrisome 20-50% of students performed poorly in many of these categories. Despite this, instructional time was typically not dedicated to the prerequisite skills. In this study, we developed a series of computer-based training tasks, assigned as homework, to attempt to address these issues. The training employs the method of mastery learning, in which time on task is allowed to vary so that each student can reach a required level of mastery.

The two most influential versions of mastery learning are Bloom’s Learning For Mastery [2] and Keller’s Personalized System of Instruction [3]. Though these strategies vary in many ways, Block and Burns describe their similarities in [4]: (1) they prespecify a set of course objectives that students will be expected to master at some high level, (2) they break the course into a number of smaller learning units so as to teach only a few of the course’s objectives at one time, (3) they teach each unit for mastery--all students are first exposed to a unit’s material in a standard fashion; then they are tested for their mastery of the unit’s objectives, and those whose test performance is below mastery are provided with additional instruction, (4) they evaluate each student’s mastery over the course as a whole, on the basis of what the student has and has not achieved rather than on how well he has achieved relative to his classmates.

In a meta-analysis of courses utilizing mastery criterion for learning, Kulik showed in [5] that mastery learning is effective in improving student performance on exams at all levels of learning. This improvement is greater for students with weaker content knowledge, making mastery learning a useful remediation tool.

Another successful, and more recent, approach to learning is to use computer training as part of coursework. This has been successful in many forms: Multimedia learning modules

978-1-4673-5261-1/13/$31.00 ©2013 IEEE. 2013 IEEE Frontiers in Education Conference.


viewed before lecture improved performance on immediate and delayed post-tests when compared to reading static text and figures alone [6]; deficiencies in math are largely remediated by the adaptive ALEKS tutor program [7]; and integration of physics simulations, such as circuit building, into lab have been successful in improving content knowledge as well as proficiency at actual lab tasks [8]; to name a few.

The prerequisite nature of many of the knowledge gaps we observed in Materials Science Engineering students suggests that computer-based training graded for mastery may be able to provide successful remediation of these difficulties.

This paper aims to investigate and describe student difficulties with engineering essential skills. A more complete understanding of the knowledge state of the current student population is invaluable to future steps in correcting these difficulties. Also presented are the results of mastery-based training, which can be used as a model or starting place for future corrective measures.

II. METHODS

Exploratory data taken during Autumn quarter 2011 showed poor performance at the essential skills described in this paper. During Autumn semester 2012, computer-based training was given to N = 271 students in an introductory Materials Science Engineering course at The Ohio State University. Most of the students in this class are sophomore and junior engineering majors. For 6 weeks during the term, beginning in week 6, students completed the “Essential Skills Quizzes”, each worth approximately the same amount of credit as one homework assignment. There were 6 quizzes, each remaining open to the students for one week. Students all followed the same training pattern; they were allowed unlimited attempts on each quiz and were required to achieve mastery--80% or better--to obtain full credit. Content in the six Essential Skills Quizzes was divided approximately evenly between interpreting log scales and the “essential skills” mentioned above. Time spent on each week’s quiz varied due to the mastery grading criterion, but averaged around 15 minutes per quiz. The “essential skills” comprised the entirety of Essential Skills Quizzes 2 and 4, and half of Essential Skills Quiz 6. The remainder of the six Essential Skills Quizzes covered log scales; these results are not considered here.

In addition to the Essential Skills Quizzes, students were again awarded approximately one homework’s worth of points to attend a “FLEX homework” session, where the students completed the essential skills assessment, which took about 30 minutes; cumulative time spent during the term on essential skills training and assessment averaged around 70 minutes per student. In order to better assess the impact of both the essential skills quizzes and instruction, the population was randomly split into two conditions: 127 pre-Essential Skills Quiz (week 4) participants and 144 post-Essential Skills Quiz (week 13) participants. Twenty students per condition were selected for interview data; these students were video recorded while taking the essential skills assessment and were asked to think out loud and respond to prompts from the proctor. An overview of the experimental design can be seen in Fig. 1.
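The mastery grading pattern described above (unlimited attempts, with 80% or better on any attempt earning full credit) can be sketched as below. The proportional-credit rule for sub-mastery scores is an assumption for illustration; the paper only specifies the full-credit threshold.

```python
MASTERY_THRESHOLD = 0.80  # "80% or better" earns full credit

def quiz_credit(attempt_scores, full_credit=1.0):
    """Credit for one quiz under mastery grading.

    Unlimited attempts are allowed; reaching the mastery threshold on any
    attempt earns full credit. Awarding proportional credit below mastery
    is an assumed policy, not one stated in the paper.
    """
    if not attempt_scores:
        return 0.0
    best = max(attempt_scores)
    return full_credit if best >= MASTERY_THRESHOLD else full_credit * best
```

Because only the best attempt matters, students are rewarded by their mastery alone, not by how many attempts it took.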

Fig. 1. Experimental conditions used in this study. Students in both conditions took the same assessment but at different times; all students completed the Essential Skills Quizzes concurrently.

Training with the Essential Skills Quizzes was administered on Carmen, a course management website used by The Ohio State University and developed by Desire2Learn. Each quiz draws randomly from a categorized question bank, so that the content each student sees is the same for a given quiz, but the specific questions may not be. This also means that consecutive attempts at a single quiz by a single student will not yield the same 10 questions, but rather a new set of 10 on the same topics. This pseudo-randomization was included to attempt to reduce the number of students simply copying down solutions from earlier attempts or from other students.

Training was consistent with the four common traits of mastery learning described by Block as follows: (1) the “essential skills” to be learned and their importance to the students were specified to the students prior to training, (2) the topics covered by the Essential Skills Quizzes were divided into 6 separate quizzes, with one quiz given per week, (3) each Essential Skills Quiz was mastery graded and all students were given quizzes over the same content. Upon completing an attempt, Carmen provided the correct answers to all questions on the quiz, allowing the students the opportunity to learn from their errors, (4) As the entire course was not taught in mastery learning style--just the Essential Skills Quizzes--student mastery over the course as a whole could not be evaluated. However, students were awarded more points for mastering more of the quizzes. In this sense, students were rewarded by their mastery alone, not by the time it took to achieve.

Thus, the computer-based Essential Skills Quizzes fit within the standard framework of mastery learning, as described above.

III. RESULTS

All comparisons were between-student. Correctness between pretest and posttest was compared using a Chi-squared test and numerical values were compared using appropriate t tests. Effect sizes are reported using Cohen’s d such that a positive d means improvement, not merely an increase in proportion. Results are presented by category:

A. Metric Conversions

Engineering students are expected to be proficient in converting between metric units (micrograms to kilograms, centimeters to nanometers, megapascals to gigapascals, etc.). Pretest students averaged only 74.8% correct on simple metric conversion problems--surprisingly far from ceiling, given the simplicity of the skill and its constant use in the engineering courses the student had already taken.
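Conversions of the kind tested here reduce to shifting by the difference of the prefix exponents, with that shift multiplied by the power of the unit for areas and volumes. A minimal sketch (the prefix table and example values are illustrative, not taken from the quiz items; "u" stands in for micro):

```python
# Powers of ten for some common metric prefixes.
PREFIX_EXP = {"G": 9, "M": 6, "k": 3, "": 0, "c": -2, "m": -3, "u": -6, "n": -9}

def convert(value, from_prefix, to_prefix, power=1):
    """Convert between prefixed units; power=3 handles cubic (volume) units."""
    shift = (PREFIX_EXP[from_prefix] - PREFIX_EXP[to_prefix]) * power
    return value * 10.0 ** shift
```

For example, `convert(2.5, "k", "c")` gives 250000.0 (km to cm), while a volumetric conversion must cube the length factor: `convert(1, "c", "", power=3)` gives 1e-6 (cm³ to m³). Leaving `power=1` reproduces the length-conversion error discussed below.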



Training on metric conversions comprised 6 of the 10 questions in Essential Skills Quiz #2 plus 2 of the 10 questions in Essential Skills Quiz #6, and involved three problem types (Fig. 2).

Fig. 2. Metric conversion training questions came in three types: scientific notation questions (top); decimal form questions (middle); and “find the unit” questions (bottom), which were given with multiple choice options.

Despite the fact that students had to achieve mastery during the Essential Skills Quizzes, we found that training on metric conversions resulted in no significant change in student performance. One of the three metric conversion questions on the essential skills assessment asked students to make a volumetric conversion from cubic centimeters to cubic meters. There was no significant difference in scores from pretest (73%) to posttest (71%). The most common error was to treat the problem as a length conversion (centimeters to meters), ignoring the volumetric nature of the problem (Fig. 3). It should be noted, however, that volumetric problems were not included in the training, thus students failed to utilize any skills they may have acquired from this quiz.

Fig 3. Common answer patterns on an example metric conversion problem. Students were asked to convert from cubic meters to cubic centimeters. The most common error was to treat the problem as a length conversion, as though converting from meters to centimeters. Neither of these changes was statistically significant.

The remaining two metric conversion questions involved only linear conversions and gave conflicting results. One question asked students to convert from micrograms to megagrams, working in scientific notation; this question significantly improved from 62% to 74% (χ²(1) = 4.053, p = 0.044, d = 0.25). The other question asked students to convert from kilometers to centimeters in decimal form; this question showed a marginal decrease from 89% to 82% (χ²(1) = 2.652, p = 0.103, d = -0.20).

Overall, student performance on metric conversions yielded rather limited and inconsistent results. Given the importance and relative simplicity of the task, student performance is unacceptably low. On average, students scored just 74.8% correct on the three metric conversion questions before training and 75.5% after (t = 0.193, p = 0.847).

B. Dimensional Analysis

Students in engineering are expected to understand units, meaning they need to be able to analyze the dimensional relationships between variables in an equation with minimal effort.
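The unit bookkeeping this kind of task requires can be mechanized by tracking an exponent per unit symbol; multiplication adds exponents and division subtracts them. A small sketch (the density example is illustrative, not a quiz item):

```python
def units_mul(a, b):
    """Multiply two quantities' units: exponents add (zeros are dropped)."""
    out = dict(a)
    for unit, exp in b.items():
        out[unit] = out.get(unit, 0) + exp
    return {u: e for u, e in out.items() if e != 0}

def units_div(a, b):
    """Divide two quantities' units: exponents subtract."""
    return units_mul(a, {u: -e for u, e in b.items()})

# Solving rho = m / V for the units of V, given rho in g/cm^3 and m in g:
rho_units = {"g": 1, "cm": -3}
m_units = {"g": 1}
V_units = units_div(m_units, rho_units)  # {"cm": 3}
```

The same arithmetic makes the "exponent must be unitless" constraint discussed later checkable: the units of an exponent's numerator and denominator must cancel to the empty dict.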

For example, students should be able to determine the units of a given quantity in an equation based on the other variables involved (e.g., Fig. 4). Students would also be expected to know whether to use the gas constant or the Boltzmann constant—identical constants in different units—depending on the units of the other variables. Training on dimensional analysis involved three problem types and consisted of 7 of the 10 questions in Essential Skills Quiz #4, plus 3 of the 10 questions in Essential Skills Quiz #6.

Fig 4. Dimensional analysis training questions came in three categories: unit breakdown questions (top); find the missing unit questions (left), where students were given an equation and units for all but one of the variables, then asked to solve for the units of the remaining variable; and specialized find the missing unit questions (right) dealing with the difference between the gas constant, R, and the Boltzmann constant, k. Each category consisted of both multiple choice questions and directed fill-in-the-blank questions where the student indicated the powers of respective units. An example of multiple choice responses is shown (right).

In contrast to other essential skills categories, training on dimensional analysis resulted in significant gains in student performance for the category as a whole, improving average scores on the dimensional analysis portion of the assessment from 43% to 57% (t = 3.922, p = 0.0001, d = 0.48); it is worth noting that this is still far from ceiling, despite a relatively large effect size. Significant gains were also seen on many individual questions. One item on the assessment gave the students the equation for the theoretical density of a crystalline metallic solid as well as appropriate units for all but one of the variables. The students were then tasked with finding the units for the remaining variable. Significant improvement was shown after training (χ²(1) = 6.781, p = 0.009, d = 0.32), with correctness increasing from 71% to 84%.

Another similar question involved determining the units of a variable in the exponent of the equation for the equilibrium number of vacancies in a material. Through interviews, it became apparent that some students were merely balancing the sides of the equation, ignorant of the fact that the exponent itself must be unitless. Part of the way through the pretest, this question was updated to the form shown in Fig. 5. This updated question showed significant improvement in terms of correctness (χ²(1) = 7.500, p = 0.006, d = 0.49) from 33% to 56%, while the most common error--the belief that more information was needed to answer the question--decreased significantly (χ²(1) = 4.545, p = 0.033, d = 0.38) from 42% to 25%. Most students claiming to need more information indicated that the rest of the equation was required.
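The pre/post comparisons reported throughout (a χ²(1) test on correctness plus an effect size) can be sketched as below. The Pearson formula is standard; the paper does not state how its Cohen's d was computed for proportions, so the arcsine-based Cohen's h is shown here as an analogous effect-size measure, not the authors' formula.

```python
from math import asin, sqrt

def chi_square_2x2(correct_pre, n_pre, correct_post, n_post):
    """Pearson chi-squared statistic (df = 1) comparing two proportions."""
    rows = (n_pre, n_post)
    cols = (correct_pre + correct_post,
            (n_pre - correct_pre) + (n_post - correct_post))
    obs = ((correct_pre, n_pre - correct_pre),
           (correct_post, n_post - correct_post))
    n = n_pre + n_post
    # Sum of (observed - expected)^2 / expected over the four cells.
    return sum((obs[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
               for i in range(2) for j in range(2))

def cohens_h(p_pre, p_post):
    """Arcsine effect size for a change in proportion (positive = improvement)."""
    return 2 * asin(sqrt(p_post)) - 2 * asin(sqrt(p_pre))
```

With identical pre and post proportions the statistic is 0, and it grows as the two correctness rates diverge.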



Fig. 5. An example dimensional analysis question from the essential skills assessment and patterns of right and wrong answers. Asterisks indicate statistical significance at the p < 0.05 level. Students committing an inverse error submitted an answer that was the reciprocal of the correct answer.

Interestingly, students who believed that the problem could be solved with the given information showed an improvement in correctness from 56% to 72%, though this result fell just outside the range of significance (χ²(1) = 3.670, p = 0.055, d = 0.43). The lack of statistical significance--despite a respectable effect size--is likely due to small pretest sample sizes, a deficiency owed to the change in pretest versions of the problem and the larger proportion of students believing more information was needed in the pretest condition.

C. Mixed Unit Equations

Engineers in the field use equations to calculate values from measurements, and the instruments providing these measurements don’t always do so in consistent units. As such, it is essential that engineering students be able to operate equations--to find the value of the dependent variable--when presented with independent variables in mixed units.

Fig 6. An example mixed unit equation training question is shown. All questions in this category had the same form, but many dealt with different equations.
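Operating an equation with mixed-unit inputs amounts to converting everything into one consistent system before evaluating, then converting the result into the requested target units. A sketch using P = F/A with illustrative numbers (not an actual quiz item):

```python
def pressure_kpa(force_newtons, area_mm2):
    """Evaluate P = F / A with mixed-unit inputs by converting to SI first.

    1 mm^2 = 1e-6 m^2, so dividing N by mm^2 directly would be off by 10^6;
    the result in Pa is then converted to the requested target unit, kPa.
    """
    area_m2 = area_mm2 * 1e-6          # mm^2 -> m^2
    pascals = force_newtons / area_m2  # P = F / A, in Pa
    return pascals / 1e3               # Pa -> kPa
```

For instance, 50 N over 25 mm² is 2.0 MPa, so `pressure_kpa(50.0, 25.0)` returns 2000 kPa (up to float rounding).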

Fig. 7. An example mixed unit equations question from the assessment and patterns of right and wrong answers. Asterisks indicate statistical significance at the p < 0.05 level.

As a part of training, students were presented with an equation as well as values for the independent variables involved, which were purposefully given in mixed units (Fig. 6). Students were then asked to calculate the dependent variable in specified “target units.” Training on mixed unit equations consisted of 3 of 10 questions in Essential Skills Quiz 4. Training was somewhat successful in improving average assessment scores on mixed unit equation questions from 36% to 48% (t = 2.765, p = 0.006, d = 0.34), leaving posttest scores still below 50%.

While overall scores in the mixed unit equations category showed significant improvement in student performance, two of the three individual questions in the essential skills assessment failed to do so. Scores on the diffusion flux question increased from 40% to 46%, but this improvement was not statistically significant (χ²(1) = 1.144, p = 0.285, d = 0.13). The question statement and common errors--explicitly converting to incorrect target units, forgetting the mole to atoms conversion, and erring exactly by some power of 10--are shown in Fig. 7. For this item, the only error significantly affected by training was students giving an answer in different target units than specified in the problem statement.

A similar question involving a conversion from psi to gigapascals using Hooke’s law also showed no significant change in student performance (χ²(1) = 2.279, p = 0.131, d = 0.18) from pre (35%) to post (44%).

The simplest question provided students with a force in newtons and an area in square millimeters, then asked for a pressure in kilopascals; this was the only mixed unit equation question to show significant improvement (χ²(1) = 8.992, p = 0.003, d = 0.37), with correctness increasing from 35% to 53%. It is worth emphasizing that this was the simplest of the three questions of this type presented in the essential skills assessment, and 47% of students are still submitting incorrect answers after training.

D. Typical Values of Material Properties

Engineers are expected to know approximate values for various material properties and other physical constants. For example, an engineer should know that melting points and Young’s moduli are usually higher for metals than polymers. They should also be able to provide a rough value for a specific material that is within a reasonable range. Training on typical values contained two problem types (Fig. 8) and consisted of 4 of the 10 questions in Essential Skills Quiz #2.
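The relative-value expectation (metals above the polymer) can be checked mechanically. The modulus figures below are rough textbook ballparks supplied for illustration, not data from the study:

```python
def respects_order(estimates, expected_high_to_low):
    """True if the estimated values strictly decrease along the expected order."""
    vals = [estimates[material] for material in expected_high_to_low]
    return all(a > b for a, b in zip(vals, vals[1:]))

# Illustrative Young's moduli in GPa (rough ballpark values, not study data).
estimates = {"copper": 110.0, "aluminum": 69.0, "HDPE": 0.8}
respects_order(estimates, ["copper", "aluminum", "HDPE"])  # True
```

The same check applies to the melting-point ranking question discussed later, with ceramics, metals, and polymers as the expected ordering.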



Fig 8. Typical values training questions had two categories: multiple choice (top); and ranking questions (bottom), which contained 2-5 materials to be appropriately ranked by some material property.

Training on estimates of typical values of material properties produced a range of results from significant gains to marginal losses. Cumulative student performance on questions in this category showed a nonsignificant change (t = 1.461, p = 0.145, d = 0.18) from pre (35%) to post (40%).

A series of three questions asked students to estimate the Young’s modulus of three materials: copper, aluminum, and high-density polyethylene (HDPE). Since student responses to these questions spanned such a broad range, we analyze the results in a number of ways. One metric was to use an “acceptable range” for the numerical answers, which was set by a course instructor (see Fig. 9). By this metric, student responses to the question on copper showed significant improvement (χ²(1) = 5.377, p = 0.020, d = 0.30), with the percentage of students in the acceptable range increasing from 34% to 49%. Student estimates of the Young’s modulus of aluminum showed no significant improvement (χ²(1) = 0.805, p = 0.370, d = 0.11) from pre (40%) to post (45%). HDPE started and remained at 25% within the acceptable range, showing no change at all. Note that none of the posttest values are higher than 50%, despite the fact that training contained problems dealing with Young’s modulus.

A second metric was to determine whether student responses were in any of four increasingly large “ballparks”: within 20% of the correct answer, within 2x the correct answer, within 5x the correct answer, and within 10x the correct answer. This metric revealed a slightly more detailed picture, which can be seen in Fig. 10.

Fig 9. “Acceptable ranges” for the Young’s modulus of materials used on the essential skills assessment, specified by the instructor.
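The four ballparks are nested (an estimate within 20% is also within 2x, 5x, and 10x), so a natural implementation assigns each response the tightest range it satisfies:

```python
def ballpark(estimate, actual):
    """Return the tightest of the four nested ranges an estimate falls in."""
    if estimate <= 0 or actual <= 0:
        return "outside 10x"  # treat nonpositive estimates as misses
    if abs(estimate - actual) <= 0.20 * actual:
        return "within 20%"
    ratio = max(estimate, actual) / min(estimate, actual)
    if ratio <= 2.0:
        return "within 2x"
    if ratio <= 5.0:
        return "within 5x"
    if ratio <= 10.0:
        return "within 10x"
    return "outside 10x"
```

Using the symmetric ratio means an estimate of one tenth the true value counts as "within 10x" just as an estimate of ten times does; whether the study treated over- and underestimates symmetrically is not specified, so this is an assumption.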

While student performance on the copper question showed some improvement across the board, most of this came from significant increases in the larger ranges--students in the “within 10x” range increased from 48% to 75% (χ²(1) = 18.531, p < 0.0001, d = 0.57). Conversely, students in the “within 20%” range did not significantly change (χ²(1) = 1.988, p = 0.159, d = 0.18) from pre (9%) to post (15%). Student responses for aluminum displayed strange behavior; students in the “within 10x” range significantly increased (χ²(1) = 10.349, p = 0.001, d = 0.41) from 55% to 74%, while those in the “within 20%” range actually saw a significant decrease (χ²(1) = 5.395, p = 0.020, d = -0.30) from 21% to 10%. In essence, student responses settled into something resembling orbit around the correct answer. This approach yielded no significant change in student responses for the question about HDPE for all ranges.

The final metric attempts to focus more on the relative values of student responses, rather than the magnitude of the value itself. Through discussions with the course instructor, it was expected of the students that they at least give higher Young’s modulus values for the metals than for the polymer. Students giving a higher value for HDPE than for aluminum decreased significantly (χ²(1) = 9.280, p = 0.002, d = 0.40) from 55% to 36%; students giving a higher value for HDPE than for copper decreased significantly (χ²(1) = 6.250, p = 0.012, d = 0.32) from 59% to 43%. The percentage of students incorrectly giving a higher Young’s modulus for HDPE than for both aluminum and copper fell significantly (χ²(1) = 4.314, p = 0.038, d = 0.27) as well, from 49% to 36%. These values can be seen in Fig. 11. Despite statistical significance, it is again worth noting that posttest students still make at least one of these errors about 40% of the time.

Fig 10. Classification of student estimates of the Young’s modulus of copper, aluminum, and high-density polyethylene (HDPE) in relation to the actual value. Asterisks indicate significant change at the p < 0.05 level, though not all of these changes were improvements.

Fig 11. Relative value error frequencies for typical value problems asking students to estimate the Young’s modulus of aluminum, copper and HDPE. As these are errors, a decrease in percentage corresponds to increasing correctness.

Students were also asked to rate their confidence in their answers. Student confidence increased significantly for copper (t = 2.477, p = 0.014, d = 0.43) and HDPE (t = 2.410, p = 0.017, d = 0.43), but decreased non-significantly for aluminum (t = -1.159, p = 0.248, d = -0.20). Neither pretest nor posttest confidence rankings exceeded 2.2 out of 5 for any of the three materials involved, and changes in student confidence did not match student improvement in terms of correctness.

Finally, posttest students were asked to give approximate melting points for metals, polymers, and ceramics—this question was not included in the pretest. Student responses were judged based on the correct relative order of melting points; 71% of students ordered the melting points correctly. This leaves almost 30% of posttest students unable to correctly rank typical melting points of three distinct material classes.

E. Interview Data

Twenty students each from the pretest and posttest conditions were subjected to “think-aloud” interviews as they completed the FLEX assessment. The most striking feature of these interviews was that many students not only admitted that they lacked certain knowledge and skills, but seemed content with that fact. Some excerpts from interviews are shown below:

• “Usually I look [the metric prefixes/conversions] up.”

• “[Metric prefixes are] readily available on the internet and textbook.”

• When asked if they felt it was important to memorize: “I feel like there’s always a table for it.”

• Some students described their lack of memorization of these topics as a conscious choice: “I’ve always been able to look them up. So, as of now, I haven’t decided to memorize them.” (Emphasis is the author’s.)

The prevailing view for certain knowledge components seems to be something along the lines of “Why memorize it when I can always look it up?” Perhaps the whole experience is best described by one student in particular. When this student struggled on metric conversions, the proctor stated “I can answer any questions you have [about metric prefixes and conversions] once you’re done,” to which the student replied “Or I can just go on Wikipedia.”

IV. DISCUSSION

Computer-based, mastery-graded training on engineering “essential skills” has been effective for some knowledge, while failing to be effective for other knowledge. 20-30% of students were unable to perform simple metric conversions, and neither instruction nor training was able to alter these numbers. Almost 30% of posttest students were unable to correctly rank the typical melting points of polymers, metals, and ceramics. More than 50% of students were unable to correctly operate

Fig 12. Training effectiveness as measured passessment, averaged by category. Only dimequations showed statistically significant imp

equations when given variables ininstruction and training did little to some limited success was seen in stumaterial properties, 50% of postteestimate the Young’s moduli oinstructor-specified acceptable rainvolving a polymer saw 75% acceptable range. Not only were t40% of students gave a higher polymer than for the metals.

An overview of student performin the four categories considered inFig. 12. Even where training was eanalysis, posttest scores are still farperforming question still saw 16% to solve for the units of a givenequation and units for all other varistudents leave the course not knowunitless and, of the ones that do, solve for the units of the variable 50% of student could not succesNewtons and an area in mm2 to a str

Though training was more effethan for others, it was not as effectisuch essential and fundamentalperformance of greater than 90% cof possible explanations for these ob

• The training was too short (≈

• Posttest scores were alreadyfor the unit conversion).

• The students did not exert ththe post test questions.

• The students did not learnbecause they believe that thnot relevant to them.

• The students did not learn tbecause they perceive the into be learned, but looked up o

It is likely that some combinatiorelevant, but as stated earlier,

performance on the essential skills mensional analysis and mixed unit provement at the p<0.05 level.

n mixed units; once again, alter these numbers. While udent estimates of common st students were unable to f two metals within an

ange; a similar question of students outside the

the values incorrect, about Young’s modulus for the

mance as a result of training n this study can be seen in effective, as in dimensional r from ceiling. The highest-of posttest students unable

n variable when given an iables. As many as 25% of

wing that an exponential is still 25% cannot correctly in question. Further more,

ssfully convert a force in ress in kilopascals.

ective for some categories ive overall as necessary for l skills, i.e. a posttest orrect. There are a number

bservations. These include:

≈20 minutes for each topic).

y approaching ceiling (71%

heir full efforts in answering

n some of the categories e knowledge and skills are

the some of the categories formation as something not online or in a text.

on of these explanations are we have evidence from


interviews that the last bullet is important for at least some of the topics. The interview data hint at an explanation for this lack of consistent success: "Or I can just go on Wikipedia." Unlike training on log plots, whose questions cannot easily be "googled," many of the essential skills can be performed with the aid of computer tools and text references—unit conversions can be typed into the Google search bar, the metric prefixes are inside the front cover of the text, and Wolfram Alpha will operate equations and perform conversions. Because these references are so readily available, many students do not see committing these essential skills to memory as a task worthy of their time. In the process of maximizing points earned per time spent, these tasks simply fall by the wayside. The relative success of the dimensional analysis training is in line with this explanation; one cannot "google" dimensional analysis very easily, so knowledge in this category was better retained from training.
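Two of the "googleable" skills at issue here, metric-prefix conversion and the mixed-unit stress computation, amount to a few lines of arithmetic once the conversion factors are written down. The sketch below is illustrative only: the function names and numeric inputs are invented for this example and are not items from the assessment.

```python
# Illustrative arithmetic for two of the "essential skills" discussed
# above. The numbers are invented for the example, not assessment items.

# Metric prefixes as powers of ten (the kind of table students said
# they would rather look up than memorize).
PREFIX = {"G": 1e9, "M": 1e6, "k": 1e3, "m": 1e-3, "u": 1e-6, "n": 1e-9}

def to_base_units(value: float, prefix: str) -> float:
    """Convert a prefixed value to base units, e.g. 70 GPa -> 7.0e10 Pa."""
    return value * PREFIX[prefix]

def stress_in_kpa(force_n: float, area_mm2: float) -> float:
    """Stress = F / A for a force in N and an area in mm^2, in kPa.

    1 mm^2 = 1e-6 m^2, 1 N/m^2 = 1 Pa, 1 kPa = 1e3 Pa.
    """
    area_m2 = area_mm2 * 1e-6        # mm^2 -> m^2
    stress_pa = force_n / area_m2    # N / m^2 = Pa
    return stress_pa / 1e3           # Pa -> kPa

print(to_base_units(70, "G"))      # 70 GPa expressed in Pa
print(stress_in_kpa(500.0, 25.0))  # 500 N over 25 mm^2, in kPa
```

Note that the area conversion alone spans six orders of magnitude (1 mm² = 10⁻⁶ m²), so a student who skips the dimensional bookkeeping is not off by a little but by a factor of a million.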

As for students not exerting full effort on the posttest, this likely plays some role, but from a number of years of experience with tests in this context we have found that students do answer these questions with a reasonable amount of effort.

Note that one threat to the validity of this study is that the pre-to-post training gains on the test may not have been due to the training but rather to the instruction in the course. That is, all effects reported here may be the result of a combination of training and course instruction, and it is not clear which (if not both) caused the gains. Therefore, further research in this area would benefit from a more controlled design in order to isolate the effects of training and instruction. In any case, it is clear from an instructional point of view that significantly higher gains are desired.

V. INSTRUCTIONAL IMPLICATIONS AND NEXT STEPS

The results of this study offer valuable insights into the knowledge state of undergraduate engineering students. Instructors should be aware of these fundamental deficiencies in their classrooms and should take measures to ensure students are made aware of their shortcomings as well. Simple mastery-based training has been shown to be effective with some skills; future training can use this study as a starting point or a model upon which to build. At the very least, some form of intervention—instructional or otherwise—seems necessary to prevent students with such critical deficiencies from slipping through the academic cracks.
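For instructors wanting to reuse the grading scheme, the mastery rule described earlier (10-question quizzes, unlimited attempts, full credit at 80% or better) is straightforward to implement. The sketch below is a minimal illustration, not the authors' actual system; in particular, awarding the best attempt as partial credit below mastery is an assumption of this sketch rather than a detail reported in this study.

```python
# Minimal sketch of the mastery grading rule described in this study:
# unlimited attempts at a 10-question quiz, full credit once any
# attempt reaches 80%. Awarding the best attempt as partial credit
# below mastery is an assumption of this sketch, not a reported detail.

MASTERY_THRESHOLD = 0.80

def quiz_credit(attempt_scores, n_questions=10):
    """Credit earned (0.0 to 1.0) from a list of per-attempt correct counts."""
    if not attempt_scores:
        return 0.0
    best = max(attempt_scores) / n_questions
    return 1.0 if best >= MASTERY_THRESHOLD else best

print(quiz_credit([5, 7, 9]))  # best attempt 9/10 clears mastery
print(quiz_credit([5, 6]))     # best attempt 6/10 falls short
```

Requiring a mastery threshold rather than a single fixed-deadline score is what allows time on task to vary per student, in line with the Bloom/Keller mastery-learning framing in the introduction.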

The continued poor performance of students suggests that it may be useful to focus on determining which factors might help convince them of the importance of mastering these essential skills, thus motivating students to achieve mastery. In this study, the essential skills were somewhat separate from the activities of the lecture and recitations. One way to mitigate this effect is to increase the direct involvement of the instructors, helping to create a dialogue with students as to why mastery of these skills is important.

Another possible approach to further improving mastery and fluency is to progressively limit the time allowed for the Essential Skills Quizzes, particularly those containing knowledge that can easily be looked up online. This will help compel students to internalize and automate the essential skills and related knowledge.

Finally, this study suggests that instructors should consider two practical categories of essential skills and knowledge. The first is the category of skills and knowledge that lend themselves well to short, spaced training. The second category is comprised of simple knowledge or skills that one can quickly access via other resources such as the internet. This study suggests that the latter category seems to be resistant to brief training, as students often recognize easier methods for success in place of committing the skills to memory. In this second case, the relevant instructional goal would be to improve students’ mastery in efficiently accessing this information.

ACKNOWLEDGMENTS

This work has been supported in part by the Center for Emergent Materials at The Ohio State University, an NSF MRSEC (Award Number DMR-0820414).

The authors would also like to thank Katharine Flores and Alison Polasik for their comments and generous cooperation with us in implementing the assignments in their courses.

REFERENCES

[1] Heckler, A. F., Mikula, B. D., & Rosenblatt, R. (submitted for publication). Student accuracy in reading logarithmic plots: the problem and how to fix it. FIE 2013 Conference Proceedings.

[2] Bloom, B. S. (1968). Mastery Learning. Evaluation comment, 1(2). Los Angeles: University of California at Los Angeles, Center for the Study and Evaluation of Instructional Programs.

[3] Keller, F. S. (1968). "Good-bye, teacher…" Journal of Applied Behavior Analysis, 1, 79-89.

[4] Block, J. H., & Burns, R. B. (1976). Mastery Learning. Review of Research in Education, 4, 3-49. http://www.jstor.org/stable/1167112. Accessed: 27 March 2013.

[5] Kulik, C. C., Kulik, J. A., & Bangert-Drowns, R. L. (1990). Effectiveness of Mastery Learning Programs: A Meta-Analysis. Review of Educational Research, 60(2), 265-299.

[6] Stelzer, T., Gladding, G., Mestre, J. P., & Brookes, D. T. (2009). Comparing the efficacy of multimedia modules with traditional textbooks for learning introductory physics content. American Journal of Physics, 77(2), 184-190.

[7] Hu, X., Luellen, J. K., Okwumabua, T. M., Xu, Y., & Mo, L. (n. d.). Intelligent tutoring systems help eliminate racial disparities. http://www.aleks.com/highered/behavior/AERA_Paper.pdf. Accessed: 28 July 2012.

[8] Podolefsky, N., Reid, S., & LeMaster, R. (2005). When learning about the real world is done better virtually: a study of substituting computer simulations for laboratory equipment. Physical Review Special Topics - Physics Education Research, 1(1), 010103.
