
AC 2012-4442: IDENTIFICATION OF COMMON STUDENT ERRORS IN SOLVING FUNDAMENTAL MECHANICS PROBLEMS

Dr. Shawn P. Gross, Villanova University

Shawn Gross is an Associate Professor of civil and environmental engineering at Villanova University, where he teaches undergraduate and graduate courses in structural engineering and engineering mechanics.

Prof. David W. Dinehart, Villanova University

© American Society for Engineering Education, 2012


Identification of Common Student Errors in Solving Fundamental Mechanics Problems

Abstract

Sophomore-level mechanics courses, by nature, focus on developing students' problem solving abilities. Students are challenged with numerous typical problems in which they must interpret given information, determine what is required as an answer, set up a structured solution methodology, and execute that methodology without error. In these types of fundamental problems there is always "one right answer." Experience has shown that a large percentage of students do not obtain that correct answer when solving a problem on a quiz or examination. In some cases, students are unable to set up a problem correctly due to major conceptual issues. In other cases, students make errors that are more execution related, such as using a wrong moment arm, making sign errors in equilibrium equations, and using incorrect forces or areas. Finally, in many cases students demonstrate a clear understanding of concepts but fail to obtain the correct answer due to mistakes related to mathematics (algebra, trigonometry, etc.), general carelessness (calculator entry or transcription errors), or the improper use of units (conversion errors, weight/mass errors, failing to state units, etc.).

For the past five years, faculty at Villanova University teaching an introductory mechanics course that combines elements of statics and mechanics of solids, and a previous course that covered statics alone, have methodically collected data on the mistakes students make in solving these types of problems. For every quiz and exam problem, a detailed gradesheet was used that identifies every error made by every student. Over five years, this grading approach has been applied to over 150 assigned quiz and examination problems, and errors have been identified on over 8000 student problem submissions.

Using the data collected, this paper summarizes the errors students made in solving common mechanics problems, including the topics of equivalent force systems, 2-D rigid body equilibrium, truss analysis, and centroids of composite areas. Typical problems are presented, and the grading and data collection methodology is outlined in detail. The types of errors made by students are grouped for discussion into those that are major conceptual, those that are minor execution type, and those that are non-conceptual and unrelated to mechanics altogether. Conclusions are drawn, and potential uses for the data in improving teaching and student learning are discussed.

Importance and Innovation in Introductory Mechanics Courses

Many engineering educators believe that the first introduction to engineering mechanics, statics at most universities, is of paramount importance in a student's journey to becoming an engineer. The course serves as the foundation for numerous other courses (mechanics of solids, materials, dynamics, fluid mechanics, structural analysis, transportation, etc.), but perhaps more importantly this course is the first introduction to developing rigorous problem solving skills.


Recent work has focused on the evaluation of problem solving skills in statics courses and on common areas of difficulty (Newcomer and Steif [1], Newcomer [2]). Additionally, evaluations of conceptual understanding and problem solving skills have been studied; concept inventories have been developed, and pre-tests and multiple-choice questions have been analyzed to highlight typical student shortcomings (Douglas et al. [3], Steif and Dantzler [4], Steif and Hanson [5]). Follow-up studies focused on improving problem solving performance by assessing pre- and post-tests and written and verbal protocols (Steif et al. [6]). It was shown that problem solving could improve if students develop strategies for recognizing when and how to apply techniques.

There has been significant educational effort in recent years focused on bringing new techniques to the teaching of engineering mechanics. This work has included placing traditional statics topics in a heavily design-oriented backdrop (Russell [7], Condoor [8], Klosky et al. [9]), focusing on application to real artifacts (Seif and Dollar [10]), and combining statics concepts with those from mechanics of solids and machine design (Chaphalkar [11]). Recent efforts document successes with an inverted classroom (Papadopoulos et al. [12]) and other innovative pedagogies. Improving educational outcomes via a highly interactive classroom has proven successful in formats where lectures and laboratories are combined and problem-based active learning techniques are implemented (O'Neill et al. [13], Gross et al. [14], Glynn et al. [15]).

The innovations in topic delivery, the facilitation of active and diverse learning environments, and the evaluation of problem solving skills are all positive steps toward producing better engineers. Problem solving skills are, of course, essential to successfully navigating an engineering curriculum and becoming a good engineer. While faculty assist students in developing these skills, it is essential not to lose sight of the fact that the goal is not the skill development or the concept understanding in itself. The ultimate goal is that the engineers we are charged to educate consistently produce the correct answer. It is therefore important to develop an assessment method that provides insight into why students do not produce the correct answer. Such a methodology should be able to identify deficiencies in problem solving skills as well as in execution. Unfortunately, this type of methodology requires rigorous assessment of real problems, divorced from the multiple-choice, easy-to-grade world that testing agencies and many educators prefer.

Assessment

Problem-solving practice reinforces students' learning while providing the teacher with opportunities to identify students' misconceptions or incomplete understanding. Homework is clearly a valuable resource in the assessment process; however, there are practical limitations on faculty time to thoroughly assess and document each problem, and the homework may not have been the product of an individual effort. Shortcomings in students' learning can only be identified if the instructor rigorously assesses each student's individual and independent performance with respect to the established learning outcomes. Assessment of quiz and exam problems guarantees that individual efforts are being reviewed. Ultimately, the goal is to use the data from this performance assessment to improve learning by giving constructive feedback oriented toward correcting the identified deficiencies. The methodology detailed herein can be used for both assessment and evaluation purposes.


Welch et al. [16] defined assessment as the process of measuring performance for the purpose of improving future performance, and evaluation as the process of measuring performance against a defined standard, usually for the purpose of reward or punishment. Many professors choose to assess students' performance with examinations only, even though exams are better suited for evaluation than for assessment. Exams inevitably occur at the end of the learning process, when there are few opportunities for students to apply feedback to improve future performance. Effective assessment of weekly quizzes is formative rather than summative: the immediate and specific feedback provides students with formal performance review throughout the learning process, rather than solely at its end. As noted by Welch et al. [16], several iterations of performance and feedback are often necessary to achieve high-quality learning outcomes.

Methodology

For the past five years, faculty in an introductory mechanics course combining elements of statics and mechanics of solids, and in a previous course that covered statics alone, have methodically collected data on the mistakes students make in solving these types of problems. For every quiz and exam problem, a detailed grade sheet (as shown in Figure 1) was used that identifies every error made by every student. Students are given between 10 and 15 minutes to complete a quiz focused on one specific learning outcome. Exam problems typically address one or two student learning outcomes, and students have 20 to 25 minutes on average to complete them. Examples of the learning outcomes evaluated via quizzes and exams that are reviewed in this paper include the following:

1. Determine the magnitude, location, and direction of a single force to represent a system of parallel forces.
2. Determine the centroid of a composite area.
3. Solve for unknown forces in a truss by the method of joints or method of sections.
4. Solve for unknown forces and moments acting on a 2-D rigid body in equilibrium.

To address a specific learning outcome, pre-class reading is assigned on the topic, a short PowerPoint lecture is presented, and two or three example problems are worked in class. For some topics a brief demonstration may be performed, or a more in-depth laboratory may be executed. Two homework problems are assigned and collected at the next class. Homework is graded and returned the following class period, so that students have feedback on their understanding prior to taking a quiz. The quiz is usually given within the two class periods following the return of the homework. Additionally, three exams and a cumulative final exam are distributed throughout the semester. These exams are weighted heavily and account for about two-thirds of the final grade.

All quiz and exam problems are graded out of 10 points, and the specific student errors are noted in the spreadsheet and on the student submission, as is the number of points deducted for each mistake. Typically 1 point (sometimes 2) is deducted for non-conceptual errors, such as a math error, a transcription error, neglecting units, a geometry mistake, or not showing a direction on a force. For minor execution type errors, 2 or 3 points are typically deducted, while 3 or more points are lost for major conceptual errors. Half points are not deducted in any case.

The grading process is far more time intensive than simply marking up the quiz and exam paper; however, the largest time commitment is the development of the spreadsheet and the initial year of converting from a traditional grading technique. Although the methodology requires additional time, the effort becomes second nature. The assessment work yields data that are invaluable in dissecting concept-oriented problem solving and execution errors. Over five years, this grading approach has been used on over 150 assigned quiz and examination problems, and errors have been identified on over 8000 student problem submissions.
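To illustrate how the grade sheet data lend themselves to this kind of tabulation, the short sketch below (in Python, purely for illustration) scores a single hypothetical submission from a list of recorded errors and tallies the errors by category. The error names, deduction values, and data structures are assumptions for the example, not the authors' actual spreadsheet.

```python
# Minimal sketch: score one submission from a list of recorded errors and
# tally the errors by category. Error names and deductions are illustrative
# assumptions, not the authors' actual grade sheet.
from collections import Counter

# (category, typical point deduction) for a few example error types
ERROR_CATALOG = {
    "wrong or missing units":       ("non-conceptual", 1),
    "calculation error":            ("non-conceptual", 1),
    "wrong moment arm":             ("minor execution", 2),
    "switched x and y coordinates": ("minor execution", 2),
    "no coherent solution":         ("major conceptual", 6),
}

def score_submission(errors, max_points=10):
    """Return (score, per-category error counts) for one graded problem."""
    deducted = sum(ERROR_CATALOG[e][1] for e in errors)
    categories = Counter(ERROR_CATALOG[e][0] for e in errors)
    return max(max_points - deducted, 0), categories

if __name__ == "__main__":
    score, cats = score_submission(["wrong moment arm", "wrong or missing units"])
    print(score)       # 7
    print(dict(cats))  # {'minor execution': 1, 'non-conceptual': 1}
```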

Figure 1 – Sample gradesheet for a single problem identifying all student errors


Results

This section highlights representative findings for the four learning outcomes identified previously. For the work reported in this paper, results were tabulated for 28 problems (nearly 1500 individual student problem submissions). Over 1000 errors were methodically recorded and categorized for these problems. For each type of problem, a typical problem is first presented as it would appear on an examination or quiz. The grade distribution is then presented, and the student errors made on the problem are identified.

The first problem evaluated involves the determination of a single force to represent a system of parallel forces. An example problem statement is given in Figure 2. Nine problem offerings were considered in this study, including five examination problems from 2007 through 2011 and four quiz problems from 2007 through 2010. All problems are extremely similar and involve three or four parallel forces.
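For reference, the expected solution procedure for this type of problem can be sketched in a few lines: the equivalent single force equals the sum of the parallel forces, and its location follows from equating moments about the coordinate axes. The minimal Python sketch below uses assumed force magnitudes and coordinates, not an actual course problem.

```python
# Minimal sketch: single equivalent force for a system of parallel (vertical)
# forces. Force magnitudes and coordinates are assumed example values.

# Each force acts downward at the given (x, y) location.
forces = [  # (magnitude, x, y)
    (10.0, 1.0, 2.0),
    (20.0, 3.0, 1.0),
    (15.0, 4.0, 4.0),
]

R = sum(F for F, x, y in forces)              # resultant magnitude
x_bar = sum(F * x for F, x, y in forces) / R  # from moments about the y-axis
y_bar = sum(F * y for F, x, y in forces) / R  # from moments about the x-axis

print(f"R = {R} (same direction as the applied forces)")
print(f"acting at x = {x_bar:.3f}, y = {y_bar:.3f}")
```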

Figure 2 – Typical problem involving a system of parallel forces.

The grade distribution for the parallel force problems is given in Table 1. Students typically perform very well on this problem, with nearly three out of every five students getting the problem perfectly correct. Three-quarters of students earn a score of at least 9 on the problem.

A detailed list of the mistakes made by students on the parallel forces problem is given in Table 2. A total of 263 mistakes were recorded from the 500 student problem submissions. The most common mistakes were minor execution errors, including switching coordinate directions in the answer (i.e., reporting the x-coordinate of the resultant's location as the y-coordinate and vice versa) and using a wrong moment arm for a force in a moment calculation about a coordinate axis. Many students also made non-conceptual errors unrelated to mechanics, such as using wrong units (or leaving units off of an answer), calculation errors (e.g., providing a correct equation but obtaining a wrong result due to a calculator error), or expressing their final answer incorrectly.

Table 1 – Grade distribution on parallel forces problem

                   Frequency of each score (0-10)
Problem        0    1    2    3    4    5    6    7    8    9   10      n   Mean  Median
F'07 Quiz 4    0    0    0    5    1    5    7    2    9    6   26     61    8.0     9
F'07 Exam 1    0    1    0    0    0    0    3    3    8    8   39     62    9.1    10
F'08 Quiz 4    0    0    1    5    3    4    3    2   10    5   29     62    8.0     9
F'08 Exam 1    0    0    0    1    0    0    4    0    9    2   48     64    9.3    10
F'09 Quiz 9    0    0    0    1    0    1    5    1    5   13   30     56    9.0    10
F'09 Exam 3    0    0    0    0    0    0    0    1    2   14   39     56    9.6    10
F'10 Quiz 9    1    0    0    2    0    3    1    1    3   12   22     45    8.6     9
F'10 Exam 3    0    0    0    0    0    0    0    1    3   11   31     46    9.6    10
F'11 Exam 3    0    0    0    0    0    0    0    1    5   11   31     48    9.5    10
Total (#)      1    1    1   14    4   13   23   12   54   82  295    500    8.9    10
Total (%)      0    0    0    3    1    3    5    2   11   16   59

Table 2 – Mistakes made by students on parallel force problem

Error description                                                       Typical point deduction    n
Non-Conceptual Errors
  Wrong or missing units                                                          1              34
  Calculation or transcription error                                              1              28
  Final answer expressed incorrectly (missing direction, not clear, etc.)         1              24
Minor Execution Errors
  Switched x and y coordinates                                                    1-2            53
  Wrong moment arm for a force                                                    1-2            51
  Negative sign error due to inconsistent sign convention                         2              14
  Miscellaneous error                                                             1-3             6
  Incorrect resultant force                                                       2               4
  Missing force in moment calculation                                             2               3
Major Conceptual Errors
  Correct resultant force but no coherent solution for location                   5-7            21
  Inverted calculation (M = Fd taken as F = Md)                                   2               9
  Has moments computed but did not complete solution by solving
    for coordinates                                                               3-4             8
  Did not treat x, y coordinates separately; computed all moments as F*d
    (d taken from origin)                                                         5               4
  Essentially no clue (not even resultant force computed correctly)               7-10            4
Total Number of Errors Recorded                                                                  263


A summary of the types of errors made on the parallel force problem is presented in Figure 3. Almost half of the errors made by students on this problem were minor execution errors, though it is worth noting that a full third of the errors made were of the non-conceptual type.

Figure 3 – Summary of error types for parallel force problem.

The second problem evaluated involves the determination of the centroid of a composite shape. An example problem statement is given in Figure 4. Eight problem offerings were considered in this study, including four examination problems from 2009 through 2011 and five quiz problems from 2007 through 2011. All problems are extremely similar and involve an area that can be broken up into three or four subareas for calculation of the centroid. In most cases, the problem includes an area that is best considered as a void (i.e., a negative area) to facilitate the solution, although the problem can usually be solved in a variety of ways.

The grade distribution for the centroid of a composite area problems is given in Table 3. Students typically perform very well on this problem, with a mean score of 9.1. Over 60 percent of students get the problem perfectly correct, and over 70 percent of students earn a score of at least 9 on the problem.

A detailed list of the mistakes made by students on the centroid problem is given in Table 4. A total of 209 mistakes were recorded from the 415 student problem submissions. The most common error by far was the execution error of using an incorrect value (area or centroidal coordinate) to represent a subarea. A significant number of non-conceptual mistakes, especially calculation and transcription errors, were also made.

A summary of the types of errors made on the centroid of a composite area problem is presented in Figure 5. Almost two-thirds of the errors were classified as minor execution errors, and an additional 29% were non-conceptual errors. Only 6% of the errors for this problem were classified as major conceptual errors.
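A minimal sketch of the expected solution procedure for this outcome is shown below: the composite area is split into simple subareas, a hole is counted as a negative area, and the centroid is the area-weighted average of the subarea centroids. The dimensions are assumed example values, not an actual quiz problem.

```python
# Minimal sketch: centroid of a composite area built from rectangles, with a
# rectangular hole treated as a negative area. Dimensions are assumed
# example values, not an actual quiz problem.

# Each subarea: (area, x of its centroid, y of its centroid)
subareas = [
    ( 8.0 * 2.0, 4.0, 1.0),   # 8 x 2 base rectangle
    ( 2.0 * 6.0, 1.0, 5.0),   # 2 x 6 vertical rectangle on the left
    (-1.0 * 1.0, 4.0, 1.0),   # 1 x 1 hole in the base (negative area)
]

A_total = sum(A for A, x, y in subareas)
x_bar = sum(A * x for A, x, y in subareas) / A_total  # sum of first moments / total area
y_bar = sum(A * y for A, x, y in subareas) / A_total

print(f"A = {A_total}, centroid at ({x_bar:.3f}, {y_bar:.3f})")
```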


Figure 4 – Typical problem involving the centroid of a composite area.

Table 3 – Grade distribution on centroid of a composite area problem

                    Frequency of each score (0-10)
Problem         0    1    2    3    4    5    6    7    8    9   10      n   Mean  Median
F'07 Quiz 10    0    0    0    0    2    2    3    2   10    8   21     48    8.6     9
F'08 Quiz 10    0    0    0    0    1    0    6    1   12    4   23     47    8.7     9
F'09 Quiz 10    0    0    0    0    0    0    2    2    1    3   38     46    9.6    10
F'09 Exam 3     0    0    0    0    0    2    1    4   10    8   21     46    8.8     9
F'10 Quiz 11    0    0    0    0    0    0    0    1    4    3   48     56    9.8    10
F'10 Exam 3     0    0    0    0    0    0    2    0    5    5   42     54    9.6    10
F'11 Quiz 11    0    0    1    0    3    2    2    4   16    3   33     64    8.6    10
F'11 Exam 3     0    0    1    0    1    0    5    2    9    3   33     54    8.9    10
Total (#)       0    0    2    0    7    6   21   16   67   37  259    415    9.1    10
Total (%)       0    0    0    0    2    1    5    4   16    9   62


Table 4 – Mistakes made by students on centroid of a composite area problem

Error description                                                       Typical point deduction    n
Non-Conceptual Errors
  Calculation or transcription error                                              1              43
  Wrong or missing units                                                          1              13
  Final answer expressed incorrectly (missing direction, not clear, etc.)         1-2             4
  Failing to submit quiz on time (after "pencils down" called)                    4               1
Minor Execution Errors
  One subarea value (area or centroidal coordinate) wrong                         1-2           112
  Corner area of a C-shape double counted                                         2              15
  C-shape treated as a line (thickness ignored in solution)                       2               4
  x and y coordinates flipped throughout solution                                 2               2
  Miscellaneous error                                                             2               2
Major Conceptual Errors
  No coherent solution, numerous strategy mistakes                                5-7             4
  Divided sum of first moments by sum of x or y coordinates instead of
    total area                                                                    3               3
  Missing void completely (did not consider negative areas)                       3-4             2
  Incorrect answer by inspection (misunderstanding of symmetry)                   4               2
  Essentially no clue                                                             8               2
Total Number of Errors Recorded                                                                  209

Figure 5 – Summary of error types for centroid of composite area problem.


The third problem evaluated involves the analysis of a small truss using the method of joints or the method of sections. An example problem statement is given in Figure 6. Five problem offerings were considered in this study, including examination problems from 2007 through 2011. All problems involve solving for the member forces and reactions in a five- to seven-member truss. In almost every case, students use the method of joints, since they are required to perform a complete analysis and solve for all member forces.

The grade distribution for the truss analysis problem is given in Table 5. The mean score is 8.1 and the median score is 9. Approximately one-third of the students earn a perfect score of 10. The spread of scores on this problem is greater than for the parallel forces or centroid problems, with only 54% scoring 9 or above (compared to 75% and 71%, respectively). Almost 20% of students receive scores of 6 or lower on this problem.

A detailed list of the mistakes made by students on the truss analysis problem is given in Table 6. A total of 263 mistakes were recorded from the 279 student problem submissions. The most common error was getting tension or compression wrong on one member or reaction force, usually due to a misinterpretation of correct equilibrium calculations. A significant number of non-conceptual mistakes, especially calculation and transcription errors, were also made. Other common errors included trigonometry errors, rigid body equilibrium mistakes, and joint free body diagram mistakes.

A summary of the types of errors made on the truss analysis problem is presented in Figure 7. The majority of errors were classified as minor execution errors, while about one-third of the errors were classified as non-conceptual errors.
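A minimal sketch of the method of joints at a single pin is shown below: with two unknown member forces at a joint, the equations ΣFx = 0 and ΣFy = 0 determine both, and a positive result indicates tension. The geometry and load are assumed example values, not an actual exam problem.

```python
# Minimal sketch: method of joints at one pin with two unknown members.
# Geometry: member AB rises at angle theta up and to the right of joint A,
# member AC runs horizontally to the right of A, and a downward load P acts
# at A. Values are assumed examples.
import math

P = 10.0                    # kN, downward load at joint A
theta = math.radians(30.0)  # inclination of member AB above horizontal

# Unknowns assumed in tension (pulling away from the joint):
#   Sum Fy = 0:  F_AB * sin(theta) - P = 0
#   Sum Fx = 0:  F_AC + F_AB * cos(theta) = 0
F_AB = P / math.sin(theta)
F_AC = -F_AB * math.cos(theta)

def label(F):
    """Report magnitude with (T) for tension or (C) for compression."""
    return f"{abs(F):.2f} kN {'(T)' if F > 0 else '(C)'}"

print("F_AB =", label(F_AB))  # 20.00 kN (T)
print("F_AC =", label(F_AC))  # 17.32 kN (C)
```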

Figure 6 – Typical problem involving truss analysis.


Table 5 – Grade distribution on truss analysis problem

                   Frequency of each score (0-10)
Problem        0    1    2    3    4    5    6    7    8    9   10      n   Mean  Median
F'07 Exam 1    0    3    2    3    4    4    2    2   13    0   29     62    7.6     8
F'08 Exam 1    0    0    0    0    4    2    5   11   11   15   16     64    8.1     8
F'09 Exam 2    0    0    0    0    0    2    1    4    9   14   26     56    9.0     9
F'10 Exam 2    0    0    0    1    0    2    7    3    7   18    8     46    8.1     9
F'11 Exam 2    0    1    0    1    2    2    4    9    8   13   11     51    7.8     8
Total (#)      0    4    2    5   10   12   19   29   48   60   90    279    8.1     9
Total (%)      0    1    1    2    4    4    7   10   17   22   32

Table 6 – Mistakes made by students on truss analysis problem

Error description                                                       Typical point deduction    n
Non-Conceptual Errors
  Calculation or transcription error                                              1              44
  Trigonometry error (wrong angle or sin/cos error)                               1              23
  Final answer expressed incorrectly (missing direction or T/C, not clear,
    significant figures, etc.)                                                    1              12
  Wrong or missing units                                                          1               7
Minor Execution Errors
  T/C wrong on one member, or reaction direction wrong                            1              62
  Rigid body equilibrium mistake                                                  1-3            24
  Error on joint FBD                                                              1-2            22
  Negative sign substitution error                                                1-2            18
  Miscellaneous error                                                             1              13
  Incomplete solution; missing answer for one member force                        2               2
  Missed zero-force member                                                        1               7
Major Conceptual Errors
  Major FBD errors (missing forces, wrong forces, etc.)                           3-4            10
  Incomplete solution; missing answer for one member force                        3-6             9
  Essentially no clue                                                             9               1
Total Number of Errors Recorded                                                                  263


Figure 7 – Summary of error types for truss analysis problem.

The final problem evaluated involves rigid body equilibrium analysis for a simple 2-D structure. An example problem statement is given in Figure 8. Six problem offerings were considered in this study, including two examination problems each from 2009 through 2011. All problems require the student to solve for the reaction forces at simple supports. In every problem, one or more distributed loadings act on the structure, and the student must resolve these into equivalent concentrated forces as part of the solution.

The grade distribution for the rigid body equilibrium problem is given in Table 7. The mean score is 8.3 and the median score is 8. Thirty-eight percent of the students earn a perfect score of 10, and just under half of the students earn scores of 9 or above. Only 7% of students receive scores of 6 or lower on this problem.

A detailed list of the mistakes made by students on the rigid body equilibrium problem is given in Table 8. A total of 276 mistakes were recorded from the 299 student problem submissions. The most common error by far was a trigonometry error, typically related to breaking a force on a diagonal member into components. Many students switch the horizontal and vertical components or compute the wrong angle for the member. The next most common mistakes were minor transcription and calculation errors, followed by significant free body diagram errors. In most cases, these free body diagram errors were related to improper representation of the support reactions at a roller or pin.

A summary of the types of errors made on the rigid body equilibrium analysis problem is presented in Figure 9. The types of errors on this problem follow a different trend than for the other types of problems. The majority of errors were classified as non-conceptual errors, while nearly 20% were classified as major conceptual errors.
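A minimal sketch of the expected solution steps for this outcome is shown below: the distributed load is first replaced by an equivalent concentrated force acting at the centroid of the load diagram, and the support reactions then follow from the equilibrium equations. The span, load intensity, and support layout (a simply supported beam) are assumed example values, not an actual exam problem.

```python
# Minimal sketch: reactions for a simply supported beam (pin at A, roller at
# B) carrying a uniformly distributed load over its full span. Span and load
# intensity are assumed example values.

L = 6.0  # m, span
w = 2.0  # kN/m, uniform distributed load (downward)

# Step 1: replace the distributed load with an equivalent concentrated force.
W = w * L      # resultant magnitude, kN
x_W = L / 2.0  # acts at the centroid of the rectangular load diagram

# Step 2: equilibrium equations.
# Sum M about A = 0:  B_y * L - W * x_W = 0
B_y = W * x_W / L
# Sum Fy = 0:  A_y + B_y - W = 0
A_y = W - B_y
# Sum Fx = 0 (no horizontal loads):
A_x = 0.0

print(f"Equivalent load W = {W} kN acting at x = {x_W} m")
print(f"A_x = {A_x} kN, A_y = {A_y} kN, B_y = {B_y} kN")
```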


Figure 8 – Typical problem involving rigid body equilibrium.

Table 7 – Grade distribution on rigid body equilibrium problem

                      Frequency of each score (0-10)
Problem           0    1    2    3    4    5    6    7    8    9   10      n   Mean  Median
F'09 Exam 3       0    0    0    0    0    0    4    8   20    4   19     55    8.5     8
F'09 Final Exam   0    0    0    0    0    0    3   13    5    5   30     56    8.8    10
F'10 Exam 3       0    0    0    1    1    0    4    6   10    4   20     46    8.5     9
F'10 Final Exam   0    0    0    0    1    1    1    4   12    8   19     46    8.7     9
F'11 Exam 3       0    2    1    0    3    4    4   10   13    4    7     48    7.1   7.5
F'11 Final Exam   0    0    1    0    0    3    4    4   10    7   19     48    8.4     9
Total (#)         0    2    2    1    5    8   20   45   70   32  114    299    8.3     8
Total (%)         0    1    1    0    2    3    7   15   23   11   38


Table 8 – Mistakes made by students on rigid body equilibrium problem

Error description                                                       Typical point deduction    n
Non-Conceptual Errors
  Trigonometry error (wrong angle or sin/cos error)                               2              73
  Calculation or transcription error                                              1-2            49
  Final answer expressed incorrectly (missing directions, not clear,
    significant figures, etc.)                                                    1              14
  No separate FBD of structure provided                                           1              12
  Wrong or missing units                                                          1               5
Minor Execution Errors
  Error in computing magnitude of distributed load                                1-2            22
  Wrong moment arm for one force in moment equilibrium equation                   2              21
  Missing term in moment equilibrium equation                                     1              11
  Error in force equilibrium equation                                             2              10
  Error in locating resultant of distributed load                                 1-2             4
  Wrong direction given on a reaction                                             1               3
Major Conceptual Errors
  Major FBD errors (incorrect representation of reaction or missing force)        2-3            42
  Did not convert distributed load to an equivalent concentrated load             3               6
  Essentially no clue                                                             9               2
  Treated diagonal distributed load as a horizontal load                          4               1
  Incomplete solution; did not solve equilibrium equations                        4               1
Total Number of Errors Recorded                                                                  276


Figure 9 – Summary of error types for rigid body equilibrium problem. [Non-Conceptual Errors: 55%; Minor Execution Errors: 26%; Major Conceptual Errors: 19%]

Summary

Figures 10 and 11 summarize the grade distribution and the types of errors made for all four problems considered together. Note that the authors have collected data on nine other topic areas, but have not yet consolidated all of that data. Interestingly, just over half of all problems (758 of 1493) were solved completely correctly. Only 12% of all problems received scores of 6 or below.

Approximately half of the 1011 errors identified in this study were classified as minor execution errors. These tend to be the errors that most people think of as "silly mistakes" in mechanics problems, such as using a wrong moment arm or missing a term in an equilibrium equation. Approximately one-eighth of the mistakes were major conceptual, or "understanding", issues; these relate to the fundamental concepts that allow a problem to be set up and solved correctly. It is quite interesting that 38% of all tabulated errors were non-conceptual errors unrelated to mechanics. These errors generally involve calculation, mathematics, or units mistakes and usually reflect a general lack of problem solving discipline. Altogether, the data suggest that students have a better understanding of overall mechanics concepts than they have an ability to reliably obtain the correct solution. As noted earlier, to become strong practicing engineers, students must be able to do both.
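As a check, the aggregate figures quoted above follow directly from re-adding the per-problem counts reported in Tables 1 through 8 (sketched below in Python for convenience).

```python
# Re-adding the per-problem counts from Tables 1-8 as a check on the
# aggregate figures quoted above.
submissions  = [500, 415, 279, 299]   # Tables 1, 3, 5, 7 (n)
perfect_10s  = [295, 259,  90, 114]   # frequency of score 10
total_errors = [263, 209, 263, 276]   # Tables 2, 4, 6, 8

print(sum(submissions))                      # 1493 problem submissions
print(sum(perfect_10s), "solved perfectly")  # 758 of 1493 (just over half)
print(sum(total_errors), "errors recorded")  # 1011
```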


Figure 10 – Summary of grades for all problems considered together.

Figure 11 – Summary of error types for all problems considered together. [Non-Conceptual Errors: 38%; Minor Execution Errors: 49%; Major Conceptual Errors: 13%]

Uses for Data

The data presented can be used in many ways to benefit the students and improve the course offering. Recently, the data have been used in two main applications. The authors have implemented a quiz preparation sheet that is reviewed in class the period prior to a quiz; this sheet includes a prioritized listing of the common mistakes that have been made historically. Additionally, the authors now highlight the common mistakes in their presentation of notes, demonstrations, and example problems. While students now know which common mistakes to avoid, there does not yet appear to be a significant improvement in quiz performance.


Another benefit of collecting these data is that it facilitates consistency of grading between co-teachers. As a result of this grading system, we have never been accused of having one faculty member grade harder or easier than another. One final observation is that all students, whether they are doing very well or poorly in the class, feel that they have been treated extremely fairly throughout the course. They admit that they have been warned repeatedly about the common mistakes their predecessors have made; unfortunately, so far they seem destined to repeat them.

In general, performance (as indicated by grades) improves from quiz to exam to final exam, indicating that the direct feedback from the quizzes aids in preparation for the exams and the final exam. Another future parsing of the data will include tracking the three error categories over the course of the semester. Based on observation of the final exam data, it appears that major conceptual errors are drastically reduced by the end of the course, and the majority of remaining errors are minor execution and non-conceptual.

Moving forward, there are plans to develop a macro for the spreadsheet that would generate tabular and graphical data for each student on their personal performance at three points throughout the semester (following each exam). Students would get a breakdown of the specific mistakes made on all quizzes and exams, as well as a pie chart similar to Figure 11 that presents the percentage of mistakes within the categories of Major Conceptual, Minor Execution, and Non-Conceptual Errors. The next step to accompany this personalized report would be to develop specific remediation exercises for each type of error. For example, if the main error being made is switching the x and y components of a force due to sine and cosine confusion, a non-graded worksheet could be prepared that focuses on that specific non-conceptual error. Some thought has been given to developing a booklet of problems that specifically identifies the common errant paths; however, there is still a lot of data to analyze before executing that idea.

Bibliography

1. Newcomer, J. L. and Steif, P. S. (2008) "What Students 'Know' About Statics: Specific Difficulties Common Among Students Entering Statics", Proceedings – 38th Annual Frontiers in Education Conference, ASEE/IEEE.

2. Newcomer, J. L. (2010) "Inconsistencies in Students' Approaches to Solving Problems in Engineering Statics", Proceedings – 40th Annual Frontiers in Education Conference, ASEE/IEEE.

3. Douglas, T. C., Santiago-Roman, A., and Streveler, R. (2009) "Does Conceptual Understanding Matter: Patterns of Error in Senior Engineering Students' Problem-Solving in Statics", Proceedings of the ASEE Annual Conference and Exposition.

4. Steif, P. S. and Dantzler, J. A. (2005) "A Statics Concept Inventory: Development and Psychometric Analysis", Journal of Engineering Education, 94:4, October.

5. Steif, P. S. and Hanson, M. A. (2007) "New Practices for Administering and Analyzing the Results of Concept Inventories", Journal of Engineering Education, 96:3, July.

6. Steif, P. S., Lobue, J. M., Kara, L. B., and Fay, A. L. (2010) "Improving Problem Solving Performance by Inducing Talk About Salient Problem Features", Journal of Engineering Education, 99:2, April.

7. Russell, J. (2002) "Statics and Engineering Design – A New Freshman Engineering Course at the US Coast Guard Academy", Proceedings – 32nd Annual Frontiers in Education Conference, ASEE/IEEE.

8. Condoor, S. S. (2009) "Reformulating the Statics Course – A Design-Based Approach", Proceedings of the ASME International Mechanical Engineering Congress and Exposition, IMECE, November.

9. Klosky, J., Hains, D., Johnson, T., Bruhl, J., Erickson, J. B., and Richards, J. (2007) "An Integrated Approach for Engineering Mechanics and Design", Proceedings of the 2007 ASEE Annual Conference & Exposition, Austin, TX.

10. Seif, P. S. and Dollar, A. (2005) "Reinventing the Teaching of Statics", International Journal of Engineering Education, 21:4.

11. Chaphalkar, P., Fleishmann, S., Pawloski, J., and Jack, H. (2009) "Preparation for Innovative Statics and Solid Mechanics Course Sequence", Proceedings of the ASEE Annual Conference and Exposition.

12. Papadopoulos, C., Santiago-Roman, A., and Portela, G. (2010) "Work in Progress – Developing and Implementing an Inverted Classroom for Engineering Statics", Proceedings – 40th Annual Frontiers in Education Conference, ASEE/IEEE.

13. O'Neill, R., Geiger, R. C., Csavina, K., and Orndoff, C. (2007) "Making Statics Dynamic! Combining Lecture and Laboratory into an Interdisciplinary, Problem-Based, Active Learning Environment", Proceedings of the ASEE Annual Conference and Exposition, June.

14. Gross, S. P., Dinehart, D. W., Yost, J. R., and Radlinska, A. (2011) "Overarching Problems in Sophomore Mechanics Courses", Proceedings of the 2011 ASEE Annual Conference and Exposition, June.

15. Glynn, E. F., Dinehart, D. W., Gross, S. P., Hampton, F. P., and Wadzuk, B. W. (2007) "Teaching Engineering Mechanics in a Problem-Structured Environment", Proceedings of the Fall 2007 ASEE Middle Atlantic Section Conference, Philadelphia, PA.

16. Welch, R. W., Ressler, S. J., and Estes, A. C. (2005) "A Model for Instructional Design", Journal of Professional Issues in Engineering Education and Practice, 131:3, 167; doi:10.1061/(ASCE)1052-3928(2005)131:3(167).