Boise State University ScholarWorks

Kinesiology Faculty Publications and Presentations, Department of Kinesiology

2019

Assessment, Evaluation, Metacognition, and Grading in POGIL

Shawn R. Simonson, Boise State University, [email protected]

Follow this and additional works at: https://scholarworks.boisestate.edu/kinesiology_facpubs

Part of the Kinesiology Commons

Publication Information
Simonson, Shawn R. (2019). "Assessment, Evaluation, Metacognition, and Grading in POGIL". POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners, 215-230.

This document was originally published in POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners by Stylus Publishing, LLC. Copyright restrictions may apply.

Assessment, Evaluation, Metacognition, and Grading in POGIL



10

ASSESSMENT, EVALUATION, METACOGNITION, AND GRADING IN POGIL

Shawn R. Simonson

I truly believe no matter the level of student, all students can participate and learn in this format. Set your expectations that all students can learn this way and don't underestimate them. By doing POGIL you can actually see your students learning and it is wonderful!

-A POGIL practitioner

How does POGIL fit into grading schemes for assignments, tests, and the course? POGIL activities are not designed to be graded as assessments; rather, they are intended as learning tools. However, one of the principal process skills targeted by POGIL is assessment, specifically self-assessment. Thus, much of the grading and assessment in a POGIL classroom helps students learn how to self-assess (metacognition) and, in turn, self-regulate. The instructor must model how to self-assess and emphasize its importance. Assessment is also more meaningful when it occurs in proximity to the learning. Common tools to accomplish this are enhancing metacognition, creating individual and group accountability via grading group work and peer grading, and frequent formative assessments.

Simply attending class improves quiz and test performance; however, instructors generally want students to not only do well on tests but also later recall and use the content (Shimoff & Catania, 2001). Active learning increases the number of cues that students have to aid information retrieval and helps them learn and/or retain content and concepts (Bransford et al., 2000; Crede, Roch, & Kieszczynka, 2010; Deci, Vallerand, Pelletier, & Ryan, 1991; Doyle, 2008; Karpicke & Roediger, 2008; McDaniel, Roediger, &


McDermott, 2007; Medina, 2008; M.D. Miller, 2011). Using assessments to require repeated retrieval and use of course content is more effective for improving retention than simple repetition (Karpicke & Roediger, 2008; McDaniel et al., 2007). Timely feedback, or correction of knowledge, also aids retention and later performance by aiding metacognition, the understanding of what is known and not known (McDaniel et al., 2007; Thomas & McDaniel, 2007).

Assessment improves retention by focusing the learner's attention on pertinent content and concepts, consolidating learning, and providing practice (Crooks, 2001; Karpicke & Roediger, 2008; McDaniel et al., 2007). However, according to Crooks (2001), it offers other effects: (a) It guides subsequent and/or additional instruction; (b) it influences motivation and self-efficacy; (c) it communicates, reinforces (or undermines) performance criteria and standards; (d) it modulates students' development of learning strategies; and (e) it influences students' decisions about what to (dis)continue to study and pursue as a career. Given these significant effects and the potential for negative outcomes, it is imperative that assessment be appropriate and provide accurate and meaningful results.

If a teacher is lecturing and the students are memorizing, then a standardized multiple-choice test may be the appropriate assessment tool (Gulikers, Bastiaens, & Kirschner, 2004). However, if the educational goal is that students grow as learners, develop the ability to build their own knowledge, and become reflective practitioners, then perhaps the multiple-choice test is not the only tool that should be used, and alternative assessments should be incorporated. Alternative assessments require students be responsible for their learning and for reflecting and collaborating with other students and the facilitator (Gulikers et al., 2004). Multiple assessment formats are used and are built around interesting and real-world problems (Gulikers et al., 2004).

Definitions

Assessment is one of those areas in which several terms are used interchangeably, so it is beneficial to clarify the discussion with agreed-on definitions:

Assessment: As stated in chapter 3, assessment is an activity designed to improve future performance. It is any activity that provides evidence of what the students and teacher are doing; how the students are changing; and what the students are accomplishing, learning, and thinking (Crooks, 2001). Assessment can be of the activity, the learning, and the teaching as it is performed by both the teacher and the students.


Formative assessment, also referred to as assessment for learning, is the collection of instantaneous, often informal, data about student learning to support learners and help instructors make improvements in teaching and learning (Angelo & Cross, 1993; Crooks, 2001; Education Reform, 2014; Taras, 2010). In this chapter, the term assessment will refer to formative assessment.

Evaluation for our purposes is synonymous with summative assessment and is also referred to as assessment of learning. It is the analysis of data and comparison to standards to judge performance and determine passing or failing, and it is the assigning of grades to determine what students have learned as well as allowing appraisal of the course, teacher, and program performance (Angelo & Cross, 1993; Crooks, 2001; Education Reform, 2013). In this chapter, the term evaluation will refer to summative assessment.

Grading is the process of applying standardized measurements of varying levels of achievement in a course. Grading is not just giving students a rubric or answer key to assign a score. Grading and self-assessment are used as a technique to allow students to realize and identify what they do or do not know and how they must transform their learning to acquire the skills or knowledge necessary to learn and master the content.

Self-assessment is the process of individuals gathering evidence about their own abilities and performance and reflecting on that information with the intent to improve subsequent performance (Baird, 1986). It is critical to metacognition.

Metacognition entails awareness of one's own understanding of what one knows and does not know. It requires reflection and performance monitoring (self-assessment); being aware of one's personal abilities, knowledge, and learning; and planning for learning (McDaniel et al., 2007; Schraw, 1998; Thomas & McDaniel, 2007; Tobias & Everson, 2009).

Assessment, Evaluation, and Grading

Assessment for learning is a process that most instructors do reflexively. We often "take the temperature" of a class or contemplate how well the students are grasping the material. Formative assessment can "supplement and complement" evaluations (Angelo & Cross, 1993, p. 26). Making this process intentional and transparent, as well as mapping it to course outcomes and student performance, can enhance both teaching and learning.

A few specific techniques commonly used in POGIL classrooms will be discussed here. Others can be found in chapter 6. Angelo and Cross's (1993) Classroom Assessment Techniques: A Handbook for College Teachers, and the


newer companion book by Barkley and Major (2016a), Learning Assessment Techniques: A Handbook for College Faculty, present many more excellent suggestions that are applicable at any grade level.

Facilitators should determine the key concepts in an activity based on the course and lesson learning objectives and the activity itself. Facilitation guides, provided with most published POGIL activities, indicate what the activity author suggests as the key concepts. In addition, some POGIL activities, particularly for high school, are designed with the key concepts indicated by a symbol in the activity, often a picture of a key. It is only these questions that facilitators need to verify as correct in some way. If students can answer these key questions correctly, then the preceding answers were also correct. This verification can occur via various modes of student reporting or of the instructor asking a similar question that requires the students to have successfully completed the preceding portion of an activity. For example, in the economics activity Credit Default Swap, used in the introductory POGIL workshop, participants are asked to determine how much money the pension fund would earn under conditions not previously described in the activity model. If participants understand the model, they will correctly answer this novel question.

Application questions that require students to use their freshly constructed knowledge in new ways or unique combinations are often included at the end of POGIL activities. Solving a real-world problem by using the newly acquired content provides an opportunity to assess student understanding and higher-order thinking, enhances understanding, and provides an opportunity to help students develop thinking patterns similar to experts' (Gulikers et al., 2004). Real-world tasks beyond the POGIL activity can also enhance student motivation and help them identify future opportunities to use the content and skills developed (Fink, 2003). These tasks should be scaled to student ability and kept as similar to what professionals in the field routinely wrestle with as possible. For example, a series of earth science units over geology, watersheds, and pollution might end with student teams deciding where to place another sewage treatment plant in their local community. Solving real-world problems can also be used to model and foster self-assessment and regulation, and these two skills will be discussed more completely later in the metacognition discussion.

Many teachers will agree that most students are not going to work as hard, or even complete an assignment, unless there is a grade attached. This pay-for-play attitude can be improved in the POGIL classroom, but it requires scaffolding, and that scaffolding can be via providing points for student work on the POGIL activities. A common first-level activity point-awarding mechanism involves simply giving students credit for completing the activity. This can be ramped up and foster team and individual accountability by


moving to randomly collecting and reviewing a single student's paper from the team and assuming the whole team has the same level of understanding, and assigning all team members the same points. Another version of this is to ask each team spokesperson to collect all the previous day's activities. Then, the spokesperson turns each copy to a specific question as directed by the facilitator. If all the team members' answers are the same, everyone in the team earns full points for reaching and recording a consensus answer. If even one member's answer differs, all team members receive zero credit. A third level of this scaffolding is to then move to the recorder's report, turned in at the end of each class, as a log of the important concept that the team has learned. The final scaffold level is then no collection of evidence that students have completed the activity.

Numerous POGIL facilitators start each class session with a short quiz based on the content mastered in the previous class. Depending on how the results of these quizzes are used, these can be assessments or evaluations. They serve to identify misconceptions and/or gaps in understanding and to provide encouragement for the students to continue to work with the material outside of class. Unit tests are another obvious evaluation/grading opportunity. Taking the quizzes and tests a step further to encourage both individual and team accountability is the two-stage test used in some POGIL and other collaborative learning classrooms. In the first stage, students take the test individually. This can be turned in or kept for reference, based on the instructor's preference. In the second stage, students retake the test in their teams. Scores on the two tests can be recorded separately, averaged, or weighted per the instructor's preference (in some of my courses, at the beginning of the semester, the students determine how these scores will be weighted) (Michaelsen, Knight, & Fink, 2004; Nowak, Miller, & Washburn, 1996).
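The two-stage weighting described above amounts to a simple weighted average. The sketch below assumes a hypothetical 75/25 individual-to-team split; the chapter leaves the actual weighting to the instructor (or, in some courses, to the students).

```python
def two_stage_score(individual: float, team: float, w_individual: float = 0.75) -> float:
    """Combine individual and team test scores with a chosen weighting.

    The 0.75 default is illustrative only; Michaelsen et al. (2004)
    leave the split to the instructor or the class.
    """
    if not 0.0 <= w_individual <= 1.0:
        raise ValueError("weight must be between 0 and 1")
    return w_individual * individual + (1.0 - w_individual) * team

# A student who scores 80 alone and 95 with their team:
print(two_stage_score(80, 95))  # 83.75
```

Because the team stage usually produces the higher score, the weighting determines how much the collaborative retake can lift an individual grade.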

For multiple-choice tests, instructors can use Immediate Feedback Assessment Technique (IF-AT) forms (Epstein, n.d.). (An Internet search for "how to make scratch-off cards" also nets several do-it-yourself methods for making cards.) These tests not only save time by having the students grade their tests and identify the correct answers as they complete them but also correct errors in thought. IF-AT forms are preprinted scratch-off test forms that indicate the correct answer as students are taking the test. Students score higher when they make fewer scratches to find the correct answer. See Figure 10.1 for an example. Cognalearn (intedashboard.com) has an online version of this testing format as well.
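Scoring a scratch-off form like this is mechanical. The sketch below uses the example weighting shown in Figure 10.1 (1 scratch = 4 points, 2 scratches = 2 points, 3 scratches = 1 point, nothing after that); the specific point values are one instructor's choice, not a fixed feature of IF-AT forms.

```python
def ifat_points(scratches: int) -> int:
    """Points earned on one IF-AT question, based on how many
    scratches it took to uncover the correct answer. Weights follow
    the example in Figure 10.1; other weightings are possible."""
    weights = {1: 4, 2: 2, 3: 1}
    return weights.get(scratches, 0)

# A five-question test where the correct answers were found in
# 1, 1, 2, 3, and 4 scratches respectively:
print(sum(ifat_points(s) for s in [1, 1, 2, 3, 4]))  # 4+4+2+1+0 = 11
```

Partial credit for later scratches is what distinguishes this from ordinary multiple choice: a wrong first guess still rewards the student for working toward the correct answer.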

There are also evaluation methods that simultaneously encourage metacognition. Two examples of this are weighting confidence and accuracy credit. Weighting confidence can be performed in multiple ways. One, used


Figure 10.1. Sample IF-AT form.

[Image: a preprinted IF-AT scratch-off answer form with spaces for name and test number, numbered questions, and scratchable answer columns A through D.]

Answers weighted: 1 scratch = 4 points, 2 scratches = 2 points, 3 scratches = 1 point.

on multiple-choice tests, is to assign each question a value of four points. Students can then distribute those points across the four answer options as they see fit: four points on an answer option if they are very confident that they are correct, two and two on two answer options if they are split, three and one, or even ones across all of the answer options if they have no idea. They then earn the points assigned to the correct answer (Michaelsen et al., 2004). Another confidence-weighting method is to have students rate their


short or longer answers by how confident they are in their correctness, from very confident that the answer is correct to very confident that the answer is incorrect. Accurate confidence ratings are used as a multiplier of the problem scores to generate a test score that encourages student contemplation of confidence (Pett, 2001). Another method for providing encouragement to develop and demonstrate problem-solving skills is awarding credit for accuracy. This is the traditional approach of giving students (partial) credit for correctly setting up and solving problems.
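The four-point distribution scheme from Michaelsen et al. (2004) can be sketched as follows; the option labels and the validation rule are assumptions for illustration.

```python
def confidence_score(allocation: dict[str, int], correct: str, total: int = 4) -> int:
    """Score one multiple-choice question where the student spreads
    `total` points across the answer options; they earn whatever they
    placed on the correct option (after Michaelsen et al., 2004)."""
    if sum(allocation.values()) != total:
        raise ValueError(f"points must sum to {total}")
    return allocation.get(correct, 0)

# Very confident: all four points on B, and B is correct -> 4
print(confidence_score({"B": 4}, "B"))
# Split between A and C when B was correct -> 0
print(confidence_score({"A": 2, "C": 2}, "B"))
# Fully hedged: one point per option always earns 1
print(confidence_score({"A": 1, "B": 1, "C": 1, "D": 1}, "B"))
```

The scheme rewards calibrated confidence: hedging guarantees a small return, while overconfidence in a wrong answer costs everything, which is exactly the metacognitive contemplation the text describes.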

Metacognition

Monitoring knowledge is the foundation of metacognition and the higher-level metacognitive skills: Selecting strategies, evaluating learning, planning, and controlling require accurate knowledge monitoring (Serra & Metcalfe, 2009; Tobias & Everson, 2009). However, metacognition does not come naturally to most learners, and it is not routinely promoted in education (Winne & Nesbit, 2009). Yet, improving metacognition is possible and requires that students take responsibility for their learning and intentionally practice metacognitive and decision-making skills (Baird, 1986; Baird & White, 1982). Additionally, minimally related to IQ, metacognitive skills are transferable. They are not content specific and, once learned, can be applied in a variety of situations (Schraw, 1998).

First, to promote metacognition, learners must be aware of metacognition: that it is different from content knowledge and understanding and that it will enhance success (Schraw, 1998). Second, learners must believe that they can be self-regulated learners and that they do have control (Dweck & Leggett, 1988; Winne & Nesbit, 2009). Third, strategies to encourage and enhance metacognition must be presented and practiced (Baird, 1986; Schraw, 1998). Learning and using metacognition is like learning any other concept or skill: scaffolding and multiple approaches enhance uptake and internalization (Baird & White, 1982). Direct instruction, modeled by both the instructor and other students; reflection; and group activities all fit into the scaffold (Baird, 1986; Schraw, 1998). Creating a classroom that helps students identify improvement, encourages mastery and increased effort, and rewards persistence also enhances metacognitive development (Schraw, 1998). Fourth, making mistakes may have been discouraged in earlier learning environments, and students may have learned to avoid and/or be demotivated by them. Thus, they need to develop the appreciation that mistakes are learning opportunities to be taken advantage of (Winne & Nesbit, 2009).

The beginning of class or an activity is an excellent opportunity to enhance metacognition by explicitly activating prior learning or knowledge.


In 1987, the Biological Sciences Curriculum Study (BSCS) elaborated on the Atkins and Karplus three-phase learning cycle, on which POGIL is based, to add two phases: engagement in the beginning and evaluation at the end (Bybee et al., 2006). Engaging students before starting a new activity by piquing their curiosity and helping them identify what they already know about a topic improves metacognition and understanding, can be accomplished in numerous ways, and is limited only by the facilitator's imagination and skill set (Baird, 1986; K.A. Miller, Lasry, Chu, & Mazur, 2013; Tanner, 2010). See Table 10.1 for suggestions.

Misconceptions can be very persistent and may require significant energy and repeated efforts to correct (Baird & White, 1982). Inquiry learning, such as POGIL, is an important method for helping students identify and shift from their current knowledge and misunderstandings to the concepts and theories held by content experts (Tanner & Allen, 2005). Inquiry-based learning encourages students to think and ask questions in the habits of mind used by scholars: to challenge preconceptions and current models in an effort to advance new and better ideas (Tanner & Allen, 2005). Thus, it is a significant opportunity when selecting or writing POGIL activities to include models that address and challenge common misconceptions. It can also be beneficial to call out this concept transition so that students are aware that it occurred.

The end of a class or an activity is another chance to develop metacognition by asking students to assess their learning. An obvious tool is to include metacognition opportunities or questions at the end of the POGIL activity. During this additional evaluation phase of the learning cycle, students reflect on their learning and reveal their skill or content proficiency, thus providing the instructor the opportunity to assess students' progress (Bybee et al., 2006). The evaluation phase can take on many forms and is limited only by the instructor's imagination and repertoire (Baird, 1986; Davis, 1993; Isaacson & Was, 2010; Schraw, 1998; Tanner, 2010). See Table 10.1 for suggestions.

Postactivity knowledge reflections and content organizers seem to be more effective if there is a time delay between activity completion and the implementation of these tools. The delay forces use of long-term memory rather than working memory (Serra & Metcalfe, 2009). Daily quizzes at the start of class work well to provide an appropriate time delay and improve content retention (Karpicke & Roediger, 2008; McDaniel et al., 2007).

Observing others engaged in metacognition helps students develop their own metacognitive skills. This observation can be of the instructor, other students, and themselves. Teachers should explicitly model their own metacognition by calling out their problem-solving, decision-making, and regulatory techniques (Butler & Winne, 1995).


TABLE 10.1 Sample Methods for Engaging and Assessing Students

Engaging prior to learning

Use a preliminary or the initial model to engender curiosity in the POGIL activity.

Ask students to predict the outcome of a demonstration and then respond to the results of that demonstration (K.A. Miller et al., 2013).

Ask students to explain their prior knowledge about the content of the subject.

Assign prereadings from the popular press or Internet.

Give preassessment questions.

Use discrepant events (unexpected examples) of a phenomenon.

Establish process goals to be targeted during the activity.

Assessing after learning

Give application questions at the end of the POGIL activity.

Have students/teams complete a minute paper to identify muddiest points (questions) and most important concepts. This encourages learners to reflect on the state of their knowledge prior to leaving the class (Angelo & Cross, 1993; Davis, 1993).

Give quizzes and tests, in or out of class. Asking students to rate their confidence in their answers on daily quizzes, and compounding points when confidence matches correctness, further enhances metacognition (Isaacson & Was, 2010).

Ask for knowledge reflection in which the students are asked to summarize and share the key concepts learned in the activity.

Use content organizers that demonstrate relationships (i.e., concept maps or flow charts, poster presentations, pamphlets, papers).

Ask students what they learned or what contradicted their prior knowledge.

Predict the outcome of another demonstration.

Encourage students to reflect on their learning and share these reflections with other students.

Note. Baird, 1986; Isaacson & Was, 2010; Schraw, 1998; Tanner, 2010.

This can start with the instructor determining what abilities and tools are critical within their content area and recalling how they developed these abilities (Schraw, 1998). Teachers can explicitly describe these skills as they are using them. For example, when demonstrating problem-solving, do not simply demonstrate the steps; share the thought processes that you are going through to make choices and move from one step to the next


(Schraw, 1998). Working in groups can aid metacognition as peer observation may be as good as or better than observing the instructor. Students often closely observe their classmates and feel that mimicking their peers is more possible, reasonable, likely, and comfortable than mimicking the instructor (Schraw, 1998).

Self-observation and reflection are critical in developing and improving metacognitive skills (Schraw, 1998). There are a variety of reflection prompts that can be used here. Asking students to reflect on their exam performance, study habits, and preparation effectiveness helps students explore the success of their preparation strategies and make plans to improve them. "What, so what, now what" journals help students frame their learning process. They identify what happened and how it was different from what they already knew. Next, students identify why what they learned matters and how it aligns with what they have learned elsewhere. Last, they plan for how they will use what they learned, what they will share with others, and what they want to learn next (Barkley & Major, 2016b). Mary Jarratt Smith (2016) at Boise State University provides her differential equation students with metacognition cards, printed on card stock, that detail steps and/or questions they can use when solving problems. Mare Sullivan, now at Seattle Pacific University, used a similar KNAPSACK strategy with her junior and senior high students. Shown in Figure 10.2, supports like these can be used in a variety of settings. Students can also be encouraged to contemplate what has worked well and what has not. In addition, helping students identify their strengths, opportunities for improvement, time-management tendencies, and study strategies are just a few examples of metacognitive strategies.

Promoting Teamwork: Team and Peer Assessment

As indicated in chapter 6, a component of helping students value group work is assessing and/or evaluating the group work. This can be done by the facilitator and/or by the students themselves. Some POGIL facilitators use participation grades for each team's work, while others assign content grades. Individual activities can be collected to indicate that all students are responsible for their own learning. Or, as mentioned previously, one copy of the activity can be collected from each team: one team member's activity randomly reviewed and a team grade assigned based on that individual's response, operating on the assumption that the team has worked together, that they have come to a consensus, and that they all have completed the activity. Some facilitators may collect completed activities and assign a content grade to individuals or whole teams.


Figure 10.2. Sample strategies for aiding student problem-solving and prompting metacognition.

Metacognition card

1. Reflect before solving
- What is the problem asking me to do?
- What concept is the problem asking me to use?
- How is this problem similar to ones that I have done before? How is it different?
- What strategies can I use to solve the problem?

2. Monitor during solving
- Am I on the right track?
- Do I need a new plan or strategy?
- Am I closer to my goal?
- How should I proceed?

3. Evaluate after solving
- Did I get the results I expected?
- What worked? What didn't work?
- What could I have done differently?
- Do I need to go back and fill in gaps in my understanding?

Always bring your KNAPSACK with you

K: Write down everything that you already KNOW that might help you.
N: Identify what you NEED to know. How will you know when you have arrived at the answer?
A: Describe how you will ATTACK the problem. What steps will you take? What subproblems will you solve?
P: PREDICT your answer. What do you expect, based on logical thinking? A huge number? A tiny number? A number near one? What units should the answer be in?
S: SOLVE the problem.
A: AND
C: CHECK your answer against your prediction.
K: KISS the problem good-bye and move on!

Individual and team accountability can be encouraged by assessing, evaluating, and grading teamwork and team contributions. Peer grading should be included at some level in all collaborative learning environments, and there are myriad tools available. Initially, students may not assess their peers with much enthusiasm or accuracy. One of the most common student complaints


about group work is the uneven distribution of effort. Remind students that assessing their peers is their opportunity to encourage positive change by calling attention to loafing as well as exceptional effort. Again, transparency and scaffolding are beneficial. If the instructor communicates to the students that their input is important and will be seriously considered, students are more likely to put effort and thought into peer assessments and evaluation. This is also a situation where maintaining the same teams and using roles for a period of time is beneficial, as students are more likely to honestly review their peers when they have observed their performance over a longer period.

What to do with the peer assessments and evaluations? In my classes, I have a separate grade category dedicated to team contribution. Somewhere from 10% to 15% of a student's grade is determined by their peers. Another method of using peer grading is to use it as a multiplier for grades on teamwork. This awards the highest grades to the students whom peers identify as making the most significant contribution to the team.
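The multiplier method might look like the sketch below. The normalization used here (dividing each member's average peer rating by the team mean, so the team-average multiplier is 1.0) is an assumption for illustration; the chapter does not specify how the multiplier is computed.

```python
def peer_multiplier_grades(team_grade: float, peer_ratings: dict[str, float]) -> dict[str, float]:
    """Scale a shared team grade by each member's average peer rating,
    normalized to the team mean. This is one plausible scheme; the
    exact formula is left to the instructor."""
    mean_rating = sum(peer_ratings.values()) / len(peer_ratings)
    return {name: round(team_grade * rating / mean_rating, 1)
            for name, rating in peer_ratings.items()}

# Team earned 90; Dana was rated well above the team mean:
print(peer_multiplier_grades(90, {"Alex": 8, "Blake": 10, "Casey": 9, "Dana": 13}))
```

Note that a strong contributor can end up above the raw team grade under this scheme, which is precisely how the multiplier rewards the members peers identify as carrying the team.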

Scaffolding peer assessment and evaluation is necessary to help students develop confidence and skill. Familiar to POGIL practitioners, initially an SII of the team and its members can be used: S asks for strengths and why they are strengths, the first I asks for opportunities for improvement and how those improvements might be made, and the second I asks for insights about the team/individual. Students may initially earn completion credit for this peer assessment, with anonymous feedback provided to the assessed team members. A next level can be ranking students from most to least valuable contributors. An averaged distribution is then shared with the team members. This assessment strategy decouples the peer review from grades, making it informative without grade pressure. A following step is to ask students to assign a percentage of effectiveness score to each team member. A rationale for each score is required. Evaluated team members then receive an average of their assigned scores and anonymous feedback. The final level can then be asking team members to grade each other without assigning the same grade to any of their teammates and requiring that the overall score average to a set standard (Michaelsen et al., 2004).
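The constraints of that final level (no tied grades among teammates, and an overall average pinned to a set standard) can be checked mechanically. This validator is a sketch under those stated rules; the float tolerance is an implementation assumption.

```python
def validate_peer_grades(grades: list[float], required_average: float) -> bool:
    """Check the final-level peer grading constraints: every teammate
    receives a distinct grade, and the grades average to the set
    standard (within a small float tolerance, assumed here)."""
    all_distinct = len(set(grades)) == len(grades)
    average_ok = abs(sum(grades) / len(grades) - required_average) < 1e-9
    return all_distinct and average_ok

print(validate_peer_grades([85, 90, 95], 90))  # True: distinct, averages to 90
print(validate_peer_grades([90, 90, 90], 90))  # False: tied grades
```

Forcing distinct grades prevents the common dodge of rating every teammate identically, which is what makes this the final, most discriminating level of the scaffold.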

While some do not care for rubrics because of their rigidity and inherent imperfection, rubrics can be helpful in teaching students to assess and evaluate each other's contribution to the team. The Association of American Colleges & Universities (AAC&U) has several excellent VALUE rubrics, one of which is for teamwork (AAC&U, n.d.). Karen Franker at the University of Wisconsin, Stout has rubrics available for assessing teamwork at the primary through high school levels (University of Wisconsin, Stout, n.d.). Suzanne Ruder (2014) at Virginia Commonwealth University has a series of short rubrics that encourage

Page 14: Assessment, Evaluation, Metacognition, and Grading in POGIL

ASSESSME T, EVALUATION, METACOGNITION, AND GRADING 227

students co race each ocher on che POGIL cargeced areas of critical thinking, info rmacion processing, problem-solving, and teamwork. Ruder's rubrics even-tually led to the ELI PSS project and the newer, modified rubrics being devel-oped by chat ream (see chapter 3 for more information).

High-stakes assessments, such as exams, can also be used to support the importance of effective teamwork by rooting exam questions in the type of thinking required by POGIL activities. As such, there should be exam questions that go beyond rote learning to include application, analysis, and other high-level cognitive skills. The POGIL approach effectively includes all six of Fink's (2003) taxonomic categories of foundational knowledge, application, integration, learning how to learn, caring, and human dimension, many of which can be incorporated into well-designed assessments. Thus, even the grading of content skills can be used to emphasize and reward effective work in POGIL teams.

It was toward the end of the semester in my undergraduate exercise physiology course and we were finally learning about metabolic pathways and the contribution of each to physical activity. (Most exercise physiology courses start there, but I prefer to end there as it is some of the least familiar content. I like to start with muscle, something that most kinesiology students have some prior knowledge about and are somewhat interested in.) The students were in their teams working on the POGIL activity Metabolism: Cellular Respiration: Part 2 in which the reasons and pathways for lactic acid production and clearance are discovered. As the activity was winding down, a student called me over and asked for clarification about how lactate formation and clearance related to cardiopulmonary function and acid/base regulation. I answered that she was on the right track. She then took it several steps further and tied it all to muscle contraction, fiber typing, and the energy demands of physical activity.

I could not contain my enthusiasm as I responded, "Exactly!"

Her response to my "exactly" was to jump up out of her seat, throw her arms (and her activity) up in the air, and shout, "I get it!" The whole class came to a grinding halt and she proudly repeated her description of how energy for muscle contraction, fiber types, acid/base regulation, and the metabolic pathways all tied together.

Her peers applauded!

-Shawn R. Simonson, Professor, Boise State University


Summary

• Frequent assessment and evaluations, individual and group accountability, and peer grading enhance learning and retention.

• Self-assessment is one of the principal process skills targeted by POGIL.

• Grading and assessment in a POGIL classroom are designed to help students learn how to self-assess and self-regulate.

• Scaffolding and modeling of self-assessment emphasizes its importance and promotes its development.

• Metacognition is a skill that requires intentional practice.

Listening to my students while they work on POGIL activities is the single greatest insight into how they think and learn! Don't miss a chance to hear them talking, thinking, etc. It is gold.

- A POGIL practitioner of eight years

References

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey-Bass.

Association of American Colleges & Universities. (n.d.). Teamwork VALUE rubric. Retrieved from https://www.aacu.org/value/rubrics/teamwork

Baird, J. R. (1986). Improving learning through enhanced metacognition: A classroom study. European Journal of Science Education, 8(3), 263-282.

Baird, J. R., & White, R. T. (1982). Promoting self-control of learning. Instructional Science, 11(3), 227-247.

Barkley, E. F., & Major, C. H. (2016a). Learning assessment techniques: A handbook for college faculty. San Francisco, CA: Jossey-Bass.

Barkley, E. F., & Major, C. H. (2016b). What? So what? Now what? Journals. In E. F. Barkley & C. H. Major (Eds.), Learning assessment techniques: A handbook for college faculty (pp. 388-392). San Francisco, CA: Jossey-Bass.

Bransford, J. D., Brown, A. L., Cocking, R. R., Donovan, M. S., Bransford, J. D., & Pellegrino, J. W. (Eds.). (2000). How people learn: Brain, mind, experience, and school: Expanded edition. Washington, DC: National Academies Press.

Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-281.

Bybee, R. W., Taylor, J. A., Gardner, A., Van Scotter, P., Carlson Powell, J., Westbrook, A., & Landes, N. (2006). BSCS 5E instructional model: Origins and effectiveness. Retrieved from https://bscs.org/bscs-5e-instructional-model

Crede, M., Roch, S. G., & Kieszczynka, U. M. (2010). Class attendance in college: A meta-analytic review of the relationship of class attendance with grades and student characteristics. Review of Educational Research, 80(2), 272-295.


Crooks, T. (2001). The validity of formative assessments. Paper presented at the British Educational Research Association Annual Conference, University of Leeds, Leeds, UK. Retrieved from http://www.leeds.ac.uk/educol/documents/00001862.htm

Davis, B. G. (1993). Tools for teaching. San Francisco, CA: Jossey-Bass.

Deci, E. L., Vallerand, R. J., Pelletier, L. G., & Ryan, R. M. (1991). Motivation and education: The self-determination perspective. Educational Psychologist, 26(3-4), 325-346.

Doyle, T. (2008). Helping students learn in a learner-centered environment. Sterling, VA: Stylus.

Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review, 95(2), 256-273.

Education Reform. (2014, April 29). Formative assessment. Retrieved from http://edglossary.org/formative-assessment/

Education Reform. (2013, August 29). Summative assessment. Retrieved from http://edglossary.org/summative-assessment/

Epstein. (n.d.). What is the IF-AT? Retrieved from http://www.epsteineducation.com/home/about

Fink, L. D . (2003) . Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass.

Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(3), 67-86.

Isaacson, R. M., & Was, C. A. (2010). Building a metacognitive curriculum: An educational psychology to teach metacognition. National Teaching and Learning Forum, 19(5), 1-4.

Jarratt Smith, M. (2016). Training students to be better critical thinkers. Paper presented at the Colloquium, Augustana University, Sioux Falls, SD.

Karpicke, J. D., & Roediger, H. L., III. (2008). The critical importance of retrieval for learning. Science, 319(5865), 966-968.

McDaniel, M. A., Roediger, H. L., III, & McDermott, K. B. (2007). Generalizing test-enhanced learning from the laboratory to the classroom. Psychonomic Bulletin and Review, 14(2), 200-206.

Medina, J. (2008). Brain rules: 12 principles for surviving and thriving at work, home, and school. Seattle, WA: Pear Press.

Michaelsen, L. K., Knight, A. B., & Fink, L. D. (2004). Team-based learning: A transformative use of small groups in college teaching. Sterling, VA: Stylus.

Miller, K. A., Lasry, N., Chu, K., & Mazur, E. (2013). Role of physics lecture demonstrations in conceptual learning. Physics Education Research, 9(020113), 1-5.

Miller, M. D. (2011). What college teachers should know about memory: A perspective from cognitive psychology. College Teaching, 59(3), 117-122.

Nowak, L. I., Miller, S. W., & Washburn, J. H. (1996). Team testing increases performance. Journal of Education for Business, 71(5), 253-256.

Petr, D. W. (2001). Confidence scores measure (and enhance?) student perceptions of their work. Journal of Engineering Education, 90(4), 469-475.


Ruder, S. M. (2014). Assessing process skills in a large POGIL classroom. Paper presented at the Biennial Conference on Chemical Education, Grand Valley State University, Allendale, MI.

Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science, 26(1-2), 113-125.

Serra, M. J., & Metcalfe, J. (2009). Effective implementation of metacognition. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Handbook of metacognition in education (pp. 278-298). New York, NY: Routledge.

Shimoff, E., & Catania, A. C. (2001). Effects of recording attendance on grades in introductory psychology. Teaching of Psychology, 28(3), 192-195.

Tanner, K. D. (2010). Order matters: Using the 5E model to align teaching with how people learn. Life Sciences Education, 9(3), 159-164.

Tanner, K. D., & Allen, D. (2005, Summer). Approaches to biology teaching and learning: Understanding the wrong answers-teaching toward conceptual change. Cell Biology Education, 4(2), 112-117.

Taras, M. (2010). Assessment for learning: Assessing the theory and evidence. Procedia-Social and Behavioral Sciences, 2(2), 3015-3022.

Thomas, A. K., & McDaniel, M. A. (2007). Metacomprehension for educationally relevant materials: Dramatic effects of encoding-retrieval interactions. Psychonomic Bulletin and Review, 14(2), 212-218.

Tobias, S., & Everson, H. T. (2009). The importance of knowing what you know. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Handbook of metacognition in education (pp. 107-128). New York, NY: Routledge.

University of Wisconsin, Stout. (n.d.). Examples of rubrics. Retrieved from https://www2.uwstout.edu/soe/profdev/rubrics.cfm#cooperative

Winne, P. H., & Nesbit, J. C. (2009). Supporting self-regulated learning with cognitive tools. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Handbook of metacognition in education (pp. 259-277). New York, NY: Routledge.
