
Evaluation of Student Learning in Organic Chemistry Using the SOLO Taxonomy

Linda C. Hodges*
McGraw Center for Teaching and Learning, Princeton University, Princeton, NJ 08544; *[email protected]

Lilia C. Harvey
Department of Chemistry, Agnes Scott College, Decatur, GA 30030


In the Classroom

JChemEd.chem.wisc.edu • Vol. 80 No. 7 July 2003 • Journal of Chemical Education 785

Chemistry faculty concerned with helping students develop conceptual understanding may find that assessing students’ learning is not a trivial undertaking. Standard instruments are useful for quantifying student learning on the basis of specific facts or concepts learned, such as the American Chemical Society’s series of course-specific standardized exams, but these may not thoroughly probe the depth of students’ understanding of complex ideas. Recent articles in this Journal have explored different ways to assess the quality of students’ learning in organic chemistry classes (1, 2). Given the complexity of the undertaking, we may often rely on our intuitive ability to “recognize it when we see it”. Establishing clear criteria for distinguishing the level of student work and sharing these criteria with students, however, can save a great deal of time and trouble. Students are clearer about our expectations for them and better able to understand the basis for grading decisions when we clearly articulate our evaluation guidelines. Faculty may also find that developing and using evaluation criteria can help them clarify their own teaching goals (3).

In this article we describe our experience in adapting a standardized instrument, the Structure of Observed Learning Outcomes (SOLO) taxonomy (4), for use in a two-semester organic chemistry course sequence at a small liberal arts college for women. Although there are examples in the literature of this instrument being used to evaluate levels of learning in college-level biology classes (5, 6), we have not seen its use discussed in college-level chemistry classes. This instrument may be used both in a formative manner to assess students’ understanding of key concepts throughout a course, and in a summative manner as part of a plan to assess the effectiveness of certain pedagogical strategies or curricular approaches in promoting student learning. This method may also be adapted for use as a template or rubric for assigning grades on essay questions, especially when grading is done by more than one person, as is the case when using teaching assistants.

The SOLO taxonomy describes student learning in five hierarchical levels related to a student’s ability to apply appropriate concepts in answering questions, connect concepts together coherently, and relate concepts to new ideas.

• Prestructural: No recognition of appropriate concepts or relevant processing of information

• Unistructural: Preliminary processing but question not approached appropriately

• Multistructural: Some aspects of question addressed but no relationship of facts or concepts

• Relational: Several concepts are integrated so coherent whole has meaning

• Extended Abstract: Coherent whole is generalized to a higher level of abstraction

Faculty can use the SOLO taxonomy to rank student responses to open-ended questions in terms of increasing structural complexity that reflects depth of understanding (7).
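In practice, the five levels can be treated as an ordered rubric against which each response is recorded and tallied. The short Python sketch below is only an illustration of that bookkeeping, not part of the published method; the level names follow the list above, and the summarizing function and sample data are hypothetical.

    # Minimal sketch: recording SOLO classifications for a set of responses.
    # Level names follow the SOLO taxonomy; the tallying logic is illustrative only.
    from collections import Counter

    SOLO_LEVELS = [
        "Prestructural",      # no appropriate concepts recognized
        "Unistructural",      # preliminary processing, question not approached appropriately
        "Multistructural",    # some aspects addressed, concepts not related to one another
        "Relational",         # concepts integrated into a coherent whole
        "Extended Abstract",  # coherent whole generalized to a higher level of abstraction
    ]

    def summarize(classifications):
        """Return the percentage of responses assigned to each SOLO level."""
        counts = Counter(classifications)
        total = len(classifications)
        return {level: 100 * counts[level] / total for level in SOLO_LEVELS}

    # Hypothetical example: levels assigned to five exam answers.
    print(summarize(["Relational", "Multistructural", "Relational",
                     "Unistructural", "Prestructural"]))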

We found this method to be a powerful tool for analyzing points of difficulty or confusion in student learning and following students’ progress in their understanding of particular ideas. Faculty can use this tool to identify and document students’ lack of understanding or alternate conceptions more clearly, and thus design more effective intervention strategies. This instrument is adaptable across institutional and discipline contexts, making it a valuable assessment method for anyone interested in scholarly and reflective teaching.

Analyzing Points of Difficulty

Each author taught a two-semester organic chemistry course sequence for science majors using collaborative learning methods. The class sizes ranged from 19 to 28 students over the course of the two semesters. In one application we were interested in probing the depth of student understanding of how structure affects the physical properties of organic molecules. We asked students on an examination early in the first term to rank the boiling points of three compounds of similar molecular mass but different polarities: methanol, methyl chloride, and ethane. The two instructors defined criteria that corresponded to each level of learning in the SOLO taxonomy and then evaluated students’ responses. Since the evaluation process is subjective, each student’s work was initially evaluated by both instructors using these criteria to validate the instructors’ individual judgments. An individual instructor may meaningfully use this taxonomy, however, as long as the instructor maintains consistent criteria for classifying student responses.
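For reference, approximate literature boiling points make the expected ranking explicit. The values and the small sorting sketch below are added here for illustration and were not part of the exam question.

    # Approximate literature boiling points (1 atm, degrees Celsius) for the three
    # compounds in the exam question; expected ranking: ethane < methyl chloride < methanol.
    compounds = {
        "ethane (CH3CH3)":         -88.6,
        "methyl chloride (CH3Cl)": -24.2,
        "methanol (CH3OH)":         64.7,
    }

    for name, bp in sorted(compounds.items(), key=lambda item: item[1]):
        print(f"{name}: {bp} °C")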

There were recurring themes in the students’ answers representative of different levels of their understanding of these concepts. Recognizably correct answers were ones at the Relational or Extended Abstract level, but we were also interested in viewing what might be described as the gradient of student conceptual understanding. We therefore assigned students’ answers to the other levels based on whether they tried to apply inappropriate concepts (Prestructural or Unistructural) or whether they used appropriate concepts in wrong ways (Multistructural) in answering the question.

Assigning students’ responses as Prestructural or Unistructural was fairly straightforward since these answers demonstrate that students do not know what concepts to draw from to approach the question. In this question, for example, students’ answers that did not include the concepts of molecular size, polarity, or hydrogen bonding were ranked as Prestructural. If students included one of these concepts but not another, their answer was classified as Unistructural. We rated responses as Multistructural if they indicated that students knew most of the general concepts needed to answer the question, but the students did not have the depth of understanding necessary to apply the concepts correctly and coherently to answer the question, or the responses showed only partial understanding of all the concepts needed. For example, typical student responses at this level showed that students realized that polarity was a key factor affecting boiling point, but they ascribed this property of the molecules to the lone pairs of electrons rather than looking at polarity of bonds and overall geometry of the molecule. Others neglected to take into account that the alcohol could participate in hydrogen bonding. We saw no examples that illustrated extended abstract thinking on this question.

Examples of typical student responses that we assigned to the different levels and the percentage of student answers ranked at that level are shown below.

Prestructural (5%): “The CH3CH3 will have the lowest because it will be the least likely to latch onto other molecules because it is saturated and has no lone pairs. The CH3OH will be somewhat interested in bonding because of the single lone pair on the oxygen, but the CH3Cl will be the most interested in bonding with other molecules due to its many lone pairs and therefore, it will have a relatively high boiling point.”

Unistructural (21%): “CH3CH3 is the most ‘branched’ molecule. So the intermolecular forces are weak. CH3Cl is not as branched as CH3CH3 so the boiling point must be higher due to stronger intermolecular forces. CH3OH is not as branched as much as CH3CH3 and also, the OH at the end is used when hydrogen ‘bonding’ occurs in between two or more molecules. And hydrogen bonding is strong and since the force is strong the boiling point is high.”

Multistructural (32%): “CH3CH3, CH3OH, CH3Cl. The Cl group allows strong polar interactions and is highest. CH3OH is more polar than CH3CH3 so it must take more energy to boil it.”

Relational (42%): “Methanol is capable of hydrogen bonding due to its OH group. This means that the bonds between molecules of methanol are stronger, therefore methanol has the highest boiling point. Because of the geometry CH3Cl are polar molecules. The polarity of a molecule has an affect on the boiling point. The more polar a molecule is the higher the boiling point.”

When we classified student responses on this question we found that over half were Prestructural to Multistructural in nature. This close examination of student work alerted the instructors to a lack of understanding of an important concept in organic chemistry: polarity as it relates structure to function. The question then became, are students able to progress in their understanding of this key concept as the course proceeds? The instructors followed students’ progress throughout the year by asking and assessing additional questions on that topic using the SOLO taxonomy.
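The “over half” figure follows directly from the percentages quoted for each level; the two-line check below is our own arithmetic, added only to make that explicit.

    # Reported percentages of responses at each level on the early exam question.
    levels = {"Prestructural": 5, "Unistructural": 21, "Multistructural": 32, "Relational": 42}
    below_relational = levels["Prestructural"] + levels["Unistructural"] + levels["Multistructural"]
    print(below_relational)  # 58 -> "over half" of responses were Prestructural to Multistructural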

Following Student Progress

Recognizing from responses on this early examination question that many students had only a partial understanding of how a molecule’s structure relates to its physical properties, we deliberately designed assignments that asked students to discuss polarity of molecules or how a molecule’s polarity affected some aspect of its structure or function. In addition to class “mini-lectures” on the topic, students were asked questions on this topic on worksheets throughout the semester. The focus of these exercises was descriptive, not quantitative. They answered these questions individually and discussed their answers in groups.

We designed questions on the midterm and final exams related to the concept of polarity and assessed the responses using the SOLO taxonomy. One question on each exam dealt with effects of polarity on reactivity of the carbonyl functional group, specifically the effect of the presence or absence of electron-withdrawing groups on the susceptibility of the carbonyl compounds toward nucleophilic addition (Figure 1). We did not ask the identical question on both the semester and final exams since students had their earlier exam and could simply memorize their prior exam answer. This difference in the questions on a particular topic posed between the semester exam and the final exam made interpretation of results somewhat more complicated. Using the SOLO taxonomy, however, one is able to pick out the key ideas that demonstrate understanding.

Figure 1. Midterm and final exam questions that ask students to rank compounds according to their susceptibility to nucleophilic attack. (The structures of the carbonyl compounds appear in the original figure.) Midterm: “List the compounds shown below in increasing order of reactivity in a nucleophilic addition reaction to the carbonyl. That is, list the least reactive compound first and the most active compound last. Explain your reasoning.” Final: “Which of the following molecules contains the carbonyl group more susceptible to nucleophilic attack and why?”

We were, of course, most interested in responses at the Relational or Extended Abstract level. Common features that we looked for to assign a student’s response as Relational were: recognition of the greater reactivity of the aldehyde compared to the ketone because of the greater polarity of the carbonyl in the absence of the alkyl substituent; recognition that the ketone was less reactive than the aldehyde because of the presence of the alkyl; recognition that this effect was offset somewhat by the electron-withdrawing effects of the halide substituent; and recognition that the effect of the halide was stronger the closer to the carbonyl that it occurred—the extreme example, of course, being the acid halide. Although steric effects on reactivity are important, students were not required to include this discussion for their response to be ranked as Relational. We have found in general that students can recognize these spatial constraints on molecular reactivity much more easily than they can understand electronic effects.

Students showed overall improvement on the final exam in understanding the concept of polarity: 28% of student responses were at the Relational level on the midterm, 53% on the final (N = 28); 6 out of 8 of the students who answered at the Relational level on the midterm maintained this level on the final; 9 students improved on the final in their understanding of the concept of polarity to the level of Relational thinking as determined from our assessment of their responses. The small number of students involved means that these results are suggestive but not statistically definitive in terms of a positive upward trend in responses on the final.
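Back-calculating head counts from the reported percentages may help the reader follow the comparison; the sketch below is our own bookkeeping and contains no data beyond what the text states.

    # Head counts implied by the reported percentages (N = 28); our arithmetic only.
    N = 28
    midterm_relational = round(0.28 * N)   # 8 students at the Relational level on the midterm
    final_relational = round(0.53 * N)     # 15 students at the Relational level on the final
    maintained, improved = 6, 9            # figures quoted in the text
    assert midterm_relational == 8
    assert maintained + improved == final_relational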

A key observation from these results is that even with additional exercises and class emphasis placed on the idea of polarity, more than a third of the students still had difficulty developing an in-depth understanding of this concept. This result may reflect both the complexity of this seemingly intuitive concept and the challenge that students have in transferring knowledge learned in a specific context to another similar, but not identical, application. Concepts in organic chemistry build continuously throughout the course and instructors rely on students’ ability to transfer information learned in one kind of situation to related contexts. By charting students’ progress in understanding specific key concepts, instructors may sometimes uncover the underlying basis for other points of difficulty.

Relating Learning and Assessment with Effective Use of Questions

Develop Questions Focused on a Key Learning Goal

Instructors will gain the most useful information from evaluation of the responses to questions that examine students’ understanding of key concepts on which later content builds. Assessing esoteric questions that deal with some of the idiosyncratic behavior in organic reactions using the SOLO taxonomy will likely reveal little about the range of student learning. Only those students who are extremely facile with the material are likely to understand these nuances.

Develop Questions Specific to a Concept

Evaluation of student learning is clearer when questions distinguish between related but different concepts since students may have some understanding of one issue and not the other. For example, a question that asks students to compare both boiling points and solubilities of a series of compounds may well yield confusing information about students’ depth of understanding of either concept as determined by the SOLO taxonomy.

Develop Questions That Are Multifaceted

The SOLO taxonomy is designed to help instructors examine students’ abilities to make coherent connections and formulate relationships between ideas. For example, items that ask students to rank and explain characteristics or activities of a certain series of compounds or reactions usually yield more information than asking for an explanation of a single feature of a single molecule or reaction. If an instructor wishes to examine students’ understanding of a certain feature, she may want to ask students to discuss the experimental basis of this idea or to place this concept in the context of an earlier principle, for example.

The SOLO taxonomy provides a way both to take a single-point snapshot of student conceptual understanding and to view the panorama of progress in learning as the course proceeds. The nature of the questions used is central to the effectiveness of adapting the SOLO taxonomy for a class.

Conclusions

The SOLO taxonomy is most useful in identifying points of difficulty and in following students’ progress in understanding specific content issues. Students may demonstrate an Extended Abstract level of understanding in some areas and yet remain Unistructural in their thinking on other topics. Comparing students’ level of understanding on isolated, unrelated concepts throughout a course may not yield particularly helpful information about their progress, unless the instructor wishes specifically to study these kinds of differences.

Reflective teachers are often aware of the concepts that students find particularly challenging. Using the SOLO taxonomy, however, can reveal student learning difficulties in a much clearer way than simply noting overall examination scores or anecdotally following student written responses. We could see patterns developing in the types of struggles that students were having with the course material. We were then better able to design particular pedagogical interventions to address these specific issues. After we shared this taxonomy with our students, they were better able to understand our goals for their learning and became more motivated toward learning for understanding and a little less focused on simply getting a desired grade.

Acknowledgments

We gratefully acknowledge the Pew National Fellowship Program for Carnegie Scholars and Agnes Scott College for their support of this work.

Literature Cited

1. Nash, J. G.; Liotta, L. J.; Bravaco, R. J. J. Chem. Educ. 2000, 77, 333–337.

2. Maroto, B.; Camusso, C.; Cividini, M. J. Chem. Educ. 1997, 74, 1233–1234.

3. Walvoord, B.; Anderson, V. A. Effective Grading: A Tool for Learning and Assessment; Jossey-Bass: San Francisco, 1998.

4. Biggs, J. B.; Collis, K. F. Evaluating the Quality of Learning: The SOLO Taxonomy; Academic Press: New York, 1982.

5. Hazel, E.; Prosser, M.; Trigwell, K. Research and Development in Higher Education 1996, 19, 323–326.

6. Lake, D. J. Biol. Educ. 1999, 33, 191–198.

7. Hattie, J.; Purdie, N. In Teaching and Learning in Higher Education; Dart, B., Boulton-Lewis, G., Eds.; The Australian Council for Educational Research Ltd.: Melbourne, Australia, 1998; pp 146–176.