
How Do You Know When They Know It?

A Capstone Project

Submitted in Partial Fulfillment

of the Requirements for the Degree

of Master of Education

Colleen Hodenfield

Department of Education and Human Performance

College of Education and Health Sciences

Graduate School

Minot State University

Minot, North Dakota

Summer 2011


This capstone project was submitted by

Colleen Hodenfield

Graduate Committee:

Dr. Laurie Geller, Chairperson

Dr. Rebecca Anhorn

Dr. Deanna Klein

Dean of Graduate School

Dr. Linda Cresap

Date of defense: June 30, 2011


Abstract

This project focused on formative assessment and its place in the classroom. In

order for learning to take place, students need to be given consistent and

informative feedback on their progress in a timely fashion. Daily entrance slips

were given to students and one-on-one feedback was given on those entrance slips

the following day. Students were surveyed and overwhelmingly agreed that this

method of assessment helped them understand the concepts being taught better

than previously used assessment methods had. Throughout this project, the

researcher kept a journal and concurred that this method of assessing students is

superior to the techniques used before. It was determined that for

this researcher, daily entrance slips effectively and efficiently helped answer the

question, “How do you know when they know it?”


Acknowledgements

Once again, thank you Scott, Cody, Braden, and McKenna for all the

patience and support you give me as I take on my professional projects. Your love

and understanding are boundless!

A huge thank you also goes out to the chairman of my committee, Dr.

Laurie Geller. The time you have invested, the patience you have shown, and your

commitment to the teaching profession are amazing.

Another big thank you goes out to Dr. Becki Anhorn. You are always

giving support and advice, and you truly care about your students. I have learned so

much from you. You are an inspiration to me.

Lastly, thank you Dr. Deanna Klein for being on my committee. I have so

much respect for you and value your opinion.


Table of Contents

Page

Abstract .................................................................................................................. iii

Acknowledgements ................................................................................................ iv

List of Tables ......................................................................................................... ix

List of Figures ......................................................................................................... x

Chapter One: Introduction ...................................................................................... 1

Motivation for the Project ........................................................................... 3

Background on the Problem........................................................................ 5

Statement of the Problem ............................................................................ 6

Statement of Purpose .................................................................................. 7

Research Questions ..................................................................................... 8

Summary ..................................................................................................... 9

Chapter Two: Review of Literature ...................................................................... 11

Formative Assessment .............................................................................. 12

Essential Learnings and Professional Learning Communities .................. 19

Feedback ................................................................................................... 24

Frequency of Assessments ........................................................................ 27

Summary ................................................................................................... 28

Chapter Three: Research Design and Method ...................................................... 31


Setting ....................................................................................................... 32

Intervention/Innovation............................................................................. 34

Design ....................................................................................................... 35

Description of Methods............................................................................. 38

Expected Results ....................................................................................... 41

Timeline for the Study .............................................................................. 41

Summary ................................................................................................... 41

Chapter Four: Data Analysis and Interpretation of Results .................................. 42

Data Analysis ............................................................................................ 42

Daily entrance slips ....................................................................... 42

Survey ........................................................................................... 47

Quiz, test, and homework completion .......................................... 51

Journal ........................................................................................... 55

Future study .................................................................................. 56

Interpretation of Results ............................................................................ 61

Summary ................................................................................................... 63

Chapter Five: Conclusions, Action Plan, Reflections, and Recommendations .... 64

Conclusions ............................................................................................... 64

Daily entrance slips ....................................................................... 64

Survey ........................................................................................... 65


Journal ........................................................................................... 68

Quiz, test, and homework completion .......................................... 71

Action Plan................................................................................................ 73

Reflections and Recommendations for Other Teachers............................ 75

Summary ................................................................................................... 77

References ............................................................................................................. 78

Appendices ............................................................................................................ 83

Appendix A: Parental/Guardian Consent Form.........................................84

Appendix B: Youth Assent Letter............................................................. 87

Appendix C: Principal Letter .................................................................... 90

Appendix D: Assistant Superintendent Letter .......................................... 91

Appendix E: Student Survey ..................................................................... 93

Appendix F: Entrance Slip #1 ................................................................... 96

Appendix G: Entrance Slip #2 .................................................................. 97

Appendix H: Entrance Slip #3 .................................................................. 98

Appendix I: Entrance Slip #4 .................................................................... 99

Appendix J: Entrance Slip #5 ................................................................. 100

Appendix K: Entrance Slip #6 ................................................................ 101

Appendix L: Entrance Slip #7................................................................. 102

Appendix M: Entrance Slip #8 ............................................................... 103


Appendix N: Quiz Sections 10.1-10.3 .................................................... 104

Appendix O: Test Chapter 10 ................................................................. 106

Appendix P: IRB Approval Letter .......................................................... 109


List of Tables

Table Page

1. Entrance Slip Score Totals ........................................................................ 43

2. Survey Results for Questions 7-19 ........................................................... 48

3. Survey Coding Results .............................................................................. 50

4. Average Scores ......................................................................................... 52


List of Figures

Figure Page

1. Scatter plot of entrance slip averages versus test scores ........................... 46

2. Scatter plot of total completed homework averages versus test scores .... 55

3. Boxplots of entrance slip averages by time of day ................................... 57

4. Boxplots of quiz scores by time of day ..................................................... 58

5. Boxplots of test scores by time of day ...................................................... 58

6. Boxplots of entrance slip averages by grade level .................................... 60

7. Boxplots of quiz scores by grade level ..................................................... 60

8. Boxplots of test scores by grade level....................................................... 61

Chapter One

Introduction

If a teacher teaches but no students have learned, has the teacher taught?

This question brings up an important point: learning is integral to the act of

teaching (Gareis & Grant, 2008). It seems like an obvious statement, but it is not

as obvious as it seems. Teaching is not about “covering” the material; it is about

students uncovering the material. Just because a teacher has taught a concept does

not mean the student actually learned it. Gone are the days when teachers can say,

“I have taught it so it is up to the students to learn it.” The No Child Left Behind

(NCLB) Act has stopped that philosophy in its tracks. NCLB was put in place by

the government to provide incentives and create accountability for states to

improve instruction in an effort to prepare students to succeed at school and in the

workplace.

Since teaching involves learning, teachers need to know what their

students have learned. Teachers need a way of seeing learning (Gareis & Grant,

2008). Formative assessment can help.

Formative assessment is used while students are learning. The results of

formative assessment are used for developing knowledge. On the other hand,

summative assessments are used at the end of an instructional unit to judge the

outcome of the development of that knowledge (Marzano, 2010).


There are many ways to assess student learning. The problem is that one

of the most effective ways to formatively assess learning is to correct and give

feedback on every problem that a student does. That process takes more time than

most heavily burdened teachers have. In 20 years of teaching, I have still not

found the best solution to the problem of seeing what my students know each day

on a given topic without overwhelming myself to the point of burnout.

Early in my career I saw two extremely good math teachers quit the

profession because they were simply “burned out.” I told myself that I didn’t want

to follow in their footsteps. I love my job too much to let assessment tasks keep

me from the profession in which I believe so strongly. I wanted to find a way to

assess my students as quickly and accurately as I could.

Students and teachers need feedback immediately. That is how they learn.

“Timely feedback is a critical element in any process to promote continuous

improvement and ongoing learning” (Eaker, DuFour, & DuFour, 2007, p. 97). As

the saying goes, if you keep doing what you have been doing, you will keep

getting what you have been getting. If all is well, students will keep getting

correct answers. If all is not well, students will keep getting the answers wrong.

Teachers need the feedback students give so adjustments can be made while

students are engaged in their work, rather than when that work is completed.


Students need the teacher’s feedback to know if they understand the concept

while they are engaged in learning it.

Motivation for the Project

I found myself too many times in the following predicament. Students

were taking a quiz and completing a homework check over the first three to four

sections in a chapter. As I graded them, I began to realize how many students had

not taken the time to get their work done and checked. Until that quiz and

homework check day, I had only the questions I asked during the lecture and any

other formative assessments I did in class to inform me about what students

understood. I had a general idea of what they knew, but nothing specific. At times

I found myself frustrated that I did not know what was going on in their brains.

Our school district believes in the Professional Learning Community

(PLC) philosophy. The premise of a PLC is that a school’s purpose is to ensure

that all students learn. The learning process must happen through collaboration

between all staff. Continual tangible evidence that the students are acquiring

knowledge is essential for a student’s future success (Eaker et al., 2007).

The North Dakota state mathematics standards encompass many topics,

too many to be covered within one course. The job before teachers in my school’s

mathematics department was to “unpack” the standards. Essential learnings were

the result of that “unpacking.” They are the state standard targets determined to be


necessary for student success in each course. My school’s mathematics

department has worked hard to create essential learnings based on the state

standards and the district’s curriculum for each of the classes offered at the

school.

Once the essential learnings were in place, common formative assessments

were created. Data from these assessments drove how teachers taught. The

problem was that these assessments were given after three to four sections were

taught. I wanted to devise a way to immediately determine whether my students

had mastered those essential learnings, rather than wait until after I gave a quiz. I

wanted to know what each student knew daily, but without being overwhelmed

with correcting papers.

I have used a variety of formative assessments throughout my Algebra I

course. I have enjoyed employing many differentiated instruction techniques to

reach as many students as possible. I wanted to put formative assessment and

daily feedback together in a friendly way that would not be overwhelming. An

entrance slip is a type of formative assessment where the students respond to

questions relating to the prior day’s lesson. I used entrance slips that focused on

current essential learnings and one past essential learning to determine whether

students retained the material. I also continued to use other formative


assessments, some formal and some informal, to give me additional feedback on

what my students knew.

My hope was that this plan would benefit me and my students. Because of

this project, I hoped to be better informed of my students’ progress, and I hoped

my students would be better informed of their progress.

Background on the Problem

I have attempted to change and improve my assessment techniques. I tried

to correct all the assignments for all my students and quickly became

overwhelmed with the sheer amount of work it involved. Burnout is inevitable for

me when I try to correct everything the students need to do to master a skill.

I have also tried giving completion points to students for having made an

attempt to do all the problems. This technique helps me “catch” the students who

are not working, but does not tell me or the students what they know.

Not knowing until mid-chapter what students know results in some

students “falling between the cracks” and not succeeding. Finding out that a

student does not know how to do a skill two to three sections after it was taught is

frustrating. Unfortunately, many students do not confess that they do not

understand; therefore, they fall further and further behind. Catching some of these

students before they are too far gone may increase the likelihood of their future

success in mathematics.


The quizzes I currently use do not break down the skills into essential

learnings. Thus, it becomes difficult to determine whether a particular student has

mastered the essential learnings. The quiz score gives an overall grade for the

sections it covers. I hoped frequently assessing students on small chunks of

information and recording the results according to the essential learning would help

give a better picture of what the students know.

After speaking at many workshops, I have found that when given the

chance, most teachers will bring up the subject of assessment and grading

homework. It is a hot topic, especially in mathematics. I bought an “Easy

Button©” to push when difficult topics such as this come up, but it doesn't work!

There is no easy answer.

Statement of the Problem

I have had a difficult time in the past deciding the best way to formatively

assess my students on a consistent basis. Formative assessment is frequently

discussed in education. Homework is often an issue with teachers and students.

Without constant feedback that is specific and time-bound, the student and the

teacher do not know what needs to be done next to ensure success. Developing

essential learnings within our school has been rewarding and time consuming.

The hope was that pinpointing those essential learnings using more frequent

formative assessments would help identify the mathematical strengths and


weaknesses of my students. The current method of assessment used does not

identify those strengths and weaknesses in a timely manner. For this project, the

challenge was to see if daily entrance slips are the solution to this problem.

Statement of Purpose

“Assessment of a student’s work should provide a rich array of

information on his or her progress and achievement” (Eaker et al., 2007, p. 59).

The purpose of this project was to use daily entrance slips based on

essential learnings from my school to determine whether entrance slips were more

beneficial than homework checks for my Algebra I students.

I designed formative assessments known as daily entrance slips that were

based on the essential learnings of the topics taught in my high school Algebra I

classes. Three questions assessed the current essential learnings to diagnose the

current learning status of students, and one question assessed a past essential

learning to determine whether the earlier material was retained. I recorded the

results by identifying the essential learnings assessed rather than by the section of

the book. These formative assessments were graded for accuracy but were only a

small percentage of each student’s grade.

Students need a say in their education. I gave the students a survey to

determine whether the students found the entrance slips beneficial to their

learning. I looked at the results of the mid-chapter quiz grades and the summative


test grades to determine whether these grades correlated with entrance slip grades.

I reflected in a journal about whether the entrance slips gave me the feedback I

needed so I could adjust my teaching tactics in order to improve learning.
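The correlation check described above can be illustrated with a Pearson correlation coefficient, the statistic underlying the scatter plots reported in Chapter Four. The sketch below is hypothetical: the helper function and the sample scores are mine for illustration, not data from this study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical entrance slip averages and chapter test scores (percent).
slip_averages = [72, 85, 90, 78, 95, 60]
test_scores = [70, 82, 94, 75, 97, 65]

r = pearson_r(slip_averages, test_scores)
print(f"r = {r:.2f}")  # a value near +1 suggests slips track test performance
```

An r near +1 would indicate that students who did well on entrance slips also did well on the test; values near 0 would suggest the slips carry little information about later test performance.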

Research Questions

Would daily entrance slips provide sufficient information to adjust

instruction to meet the needs of Algebra I students better than past assessments

(specifically homework checks) had?

In the past, I have been frustrated, as a teacher, at the untimely access to

data about what students know. Due to the timeliness of the entrance slips, I was

better prepared to address student difficulties on a daily basis rather than on a two

to three week basis. The question of whether entrance slips provided sufficient

information to adjust instruction to meet the needs of Algebra I students better

than past assessments (specifically homework checks) was answered by my

thoughts and feelings written in my journal. The study was qualitative in nature.

Would frequent feedback give my Algebra I students a better sense of

what they know than past assessments (specifically homework checks) had?

In the past, students only knew if they were doing the problems correctly

if they got up and checked their answers with an answer key posted in the

classroom. Many students did not check their answers on a regular basis, but

rather waited until they got the results from the homework check. This homework


check was given after three or four sections had passed. Due to the layered

learning that takes place in mathematics, a student may end up doing multiple

sections incorrectly, repeating inaccurate methods throughout those sections,

which makes this form of assessment rather untimely for students. I

determined whether the students had a better sense of what they knew by

gathering their thoughts in the survey provided at the end of the unit. This

approach was qualitative in nature, and the results were only used to help me

guide future instruction of my classes; the results were not meant to be

generalized. Using entrance slips provided students with more timely feedback,

and as a result, improved the learning for many students.

I believed the entrance slips would benefit my students and me, but I was

not without concerns. I was concerned that this type of formative assessment

would be an overwhelming task for me. A burned-out teacher is not an effective

teacher. Another concern was the issue of class time. I worried that this new form

of assessment would take too much time out of classroom instruction and

practice. Both of these concerns had to do with the passage of time. As Benjamin

Franklin once said, “Lost time is never found again.”

Summary

“Intentions are fine, but they will not impact results unless and until they

are translated into collective commitments and specific concrete actions” (Eaker


et al., 2007, p. 17). Finding a solution to the question, “How do you know when

they know it?” was a difficult task. This project was the action I planned to take in

an effort to improve my students’ learning.

Constant and specific feedback is a critical aspect of learning. Finding a

way to frequently, efficiently, and effectively assess students would be a great

feat. My goal was to use daily entrance slips based on the essential learnings from

my school as one way to use formative assessment strategies to improve the

learning process.

Research on formative assessment, feedback, Professional Learning

Communities, mathematics standards, and frequency of assessments and how

these topics might work together in a harmonious way is included in the next

chapter.

Chapter Two

Review of Literature

The purpose of this project was to use daily entrance slips based on

essential learnings from my school to determine whether entrance slips were more

beneficial than homework checks for my Algebra I students. This project focused

on daily entrance slips as a way of seeing what the students knew on a regular

basis. The research questions related to (1) formative assessment as a way to

guide instruction, (2) state standards and the need to “unpack” them to ensure

success, (3) frequency of assessments, and (4) timely feedback and its importance

to students and teachers.

Knowing that multidimensional student assessments are important to get a

clear picture of what a student knows (National Council of Teachers of

Mathematics [NCTM], 1991), this study focused on one of the many forms of

formative assessments that can be used in the classroom.

Keeping in mind that assessment drives many educational decisions,

NCTM (2005) stated that educators need to use effective assessment materials as

important tools in the teaching and learning process. Items designed to assess

specific standards and expectations should be incorporated into the “classroom

repertoire” of assessment tasks. According to NCTM (2005), assessment should

be an open, coherent process which is fair


to all students and encourages students to make inferences about what they know.

Assessments should enhance what students have learned.

Formative Assessment

Students learn in multiple ways. Teachers must find ways to see students

learn. Mathematics can be described as layered learning. The layers include

essential knowledge, application, and complex thinking (Doty, 2008). Algebra is

often thought of as the foundation of all other mathematics, the gateway to higher

level mathematics courses. A solid foundation in mathematics begins with a solid

foundation in algebra. Development of the routine skills, or essential knowledge,

is necessary before applications and complex thinking become involved. Without

a solid background of the essential skills, the differentiated instruction needed in

order to reach the many types of learners in the classroom cannot take place. It is

those necessary skills that need the formative assessment that this project focused

upon.

There has been plenty of debate as to the definition of formative

assessment. According to Marzano (2010), Scriven began the discussion in 1968

with his program evaluation approach which contrasted formative and summative

evaluation. Then, according to Marzano, in the early 1970s, Bloom, Hastings, and

Madaus stated that the main purpose of formative observations was to determine

the degree of mastery of a given learning task. It was not meant to be graded, but


rather a way for the learner and teacher to focus on the specific learning necessary

to move toward mastery. This assessment was to be done with more frequent,

smaller units. The instrument used to collect data for the assessment should focus

on narrow components of proficiency. It was to take place during learning in an

effort to improve it. The intent of the assessment was to learn from it, not strictly

to be graded. All these years later, their definition still represents an excellent way

to describe formative assessment.

Formative assessments are educational tools, and like any tool, their

success depends upon how they are used (Haigh, 2007). To be formative,

assessments need to provide “on the way” information to guide instruction in

response to the needs of diverse learners (Tomlinson & McTighe, 2006, p. 71).

Teachers’ opinions of their students’ abilities formulated through formative

assessment often justify the differentiated instruction used “on the way” (Watt,

2005). Students need to be allowed to “show what they know” in multiple ways.

When it comes to the fundamental concepts, consistent formative assessment is

necessary to make sure students can eventually make extensions beyond the

basics.

Fluckiger, Tixier, Vigil, Pasco, and Danielson (2010) stated the following

about formative assessment:


Consistent use of formative assessment transforms a traditional,

competitive classroom, where the main purpose of assessment is to assign

grades, into a non-traditional learning-dominated classroom, where the

main purpose of assessment is for students and teachers to self-reflect in

an effort to improve learning. (pp. 136-137)

Reflection leads teachers to judgments about the quality of student responses, and

as a result, these judgments shape pedagogical action to improve students’

competence. When teachers think through students’ misconceptions and then

share those misconceptions with students, students can eliminate randomness and

inefficient trial-and-error learning (Newton, 2007). Students can learn from their

mistakes sooner instead of later. Harlen (2005) stated, “The same information,

gathered in the same way, would be called formative if it were used to help

learning and teaching” (p. 208).

Formative assessments provide the chance for students to practice, take

risks, learn from their mistakes, and correct their work (Tomlinson & McTighe,

2006, p. 131). These assessments are a form of student monitoring to determine

whether students are making acceptable progress over time (Newton, 2007). The

reporting of this progress can be done either verbally or on paper, which

determines whether individual or group intervention is necessary. The


interventions that take place become the scaffolding for subsequent instructional

activities.

Paul Black and Dylan Wiliam (1998) summarized the findings from more

than 250 studies on formative assessment. According to their analysis, formative

assessment had an effect size (ES) of 0.7 on student achievement, the largest ever

reported for educational interventions. An effect size of 0.7 told Black and

Wiliam that across all the individual studies examined, formative assessment

improved student achievement by 26 percentile points. The research-based

concept of formative assessment is essentially one of the most effective tools in

an instructor’s tool belt for improving student achievement.
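The 26-percentile-point figure follows from the standard normal distribution: an effect size of 0.7 moves a student at the 50th percentile to roughly the 76th. A minimal sketch of that conversion (the function is mine, for illustration; it assumes normally distributed scores):

```python
import math

def percentile_gain(effect_size):
    """Percentile-point gain for a student starting at the 50th percentile,
    given an effect size in standard-deviation units (assumes normal scores)."""
    # Standard normal CDF expressed through the error function.
    cdf = 0.5 * (1 + math.erf(effect_size / math.sqrt(2)))
    return (cdf - 0.5) * 100

print(round(percentile_gain(0.7)))  # prints 26
```

The same conversion explains why even modest effect sizes matter: an effect size of 0.4, the low end of Black and Wiliam's range, still corresponds to roughly a 16-point percentile gain.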

There are three types of formative assessments: on-the-fly, planned-for-

interaction, and formal and embedded in curriculum (Young & Kim, 2010). They

range from spontaneous “teachable moments” that arise in the classroom, to

deliberately planned questioning designed to improve students’ knowledge, to

formal assessments embedded in the curriculum to create such moments. No

matter the type, formative assessments should be short and focused with

immediate feedback so students and teachers can attend to the conceptions and

misconceptions of students (Alayla, Shavelson, Ruiz-Primo, Brandon, Yin,

Furtuk, & Young, 2008).


Data analysis can indicate the areas on which teachers need to focus more

effort, but it cannot tell them how to do it. Teachers need support in developing

the content knowledge and pedagogical tools to respond to data analysis results.

Achieving a balance between supporting the student and allowing appropriate

struggle requires experience (Young & Kim, 2010).

How formative assessments are created and eventually implemented by

teachers is sometimes misunderstood (Alayla et al., 2008). According to Young

and Kim (2010), using assessments formatively in the classroom is not a

beginner’s skill:

When a teacher’s knowledge of subject matter is both deep and flexible,

she can break down concepts, find different entry points for different

students, and repackage topics to match students’ apparent understanding

and misconceptions as evidenced in their work, oral responses, or other

assessments. (p. 9)

Many state certification systems and teacher education programs have few

or no requirements that teachers and administrators receive training in assessment

(Frey & Schmitt, 2007; Stiggins, 2002). Textbooks used in teacher education

programs tend to provide very little instruction in the assessment methods

relevant for the current classroom (Stiggins & Bridgeford, 1985). Due to the drain

of resources used for standardized testing, few resources are available to train


teachers to create and conduct appropriate classroom assessments (Stiggins,

2002). Research has indicated that few teachers explicitly used formative

assessments as part of their instructional practice (Young & Kim, 2010). Teachers

perceived assessments as primarily summative and failed to use assessments for

formative purposes.

Many teachers tend to rely heavily on their own mental recordkeeping to

store and retrieve information while assessing their students (Stiggins &

Bridgeford, 1985). Although teachers’ interactions with students over time

contribute to impressions of individual students, it is difficult to keep accurate

records of each student that way.

Bligh (2001) acknowledged the gap between theory and practice as it

relates to classroom assessment:

It appears that assessment is an example of a subject where there are two

camps: one full of well meaning, earnest teachers and researchers

immersed in the language and culture of assessment practice; the other full

of well meaning, earnest teachers and researchers facing day to day

practical problems of running assessments. (p. 312)

NCLB has used phrases such as “evidence-based decisions” and “scientifically

based research” in its push to improve education. Embedding formative assessment into

curriculum not only helps guide teachers toward better instruction, it also leads to


greater student learning (Alayla et al., 2008). By frequently embedding formative

assessments into a curriculum, the teacher and the student are given a snapshot of

what the student knows and is able to do in the critical moments when the

learning takes place (Alayla et al.). Collecting a series of those snapshots can

create a more complete picture of what each student knows.

National and international assessments across multiple years have

highlighted the need for more effective teaching and learning of mathematics in

general, and algebra in particular (Foegen, 2008). Progress monitoring is a

research-driven approach to formative assessments that relies on frequent

assessments. It uses smaller assessments that serve as indicators of general

proficiency in a content area. The intent is to provide teachers with specific data

on student performance that can be used to track progress and indicate the need

for pedagogical changes when students are not progressing at acceptable levels

(Foegen). Numerous research projects exist in the area of mathematics on

progress monitoring at the elementary level, but few studies at the high school

level have been conducted.

Research shows that formative assessment improves learning and those

assessments should be tied to the mathematics standards (NCTM, 1991). Next, the

teacher must decide which standards to teach.


Essential Learnings and Professional Learning Communities

A school’s cultural shift to a Professional Learning Community means

shifting from a focus on teaching to a focus on learning (DuFour, DuFour, Eaker,

& Karhanek, 2004). A focus on learning requires purposeful assessment.

A strong proponent of PLCs and president of the Assessment Training

Institute, Inc., Stiggins (2002) contended that assessment for learning happens

when teachers do the following:

- understand and articulate achievement targets in advance of teaching,

- inform students in advance about those learning targets,

- develop assessments that accurately reflect student achievement,

- use classroom assessments to build student confidence and help them to take responsibility for their learning,

- translate classroom assessment results into frequent descriptive feedback for students that provides them with specific insights as to how to improve,

- continuously adjust instruction based on the results of classroom assessment,

- engage students in regular self-assessment so they feel in charge of their own success,

- involve students in communicating with teachers and families about their achievement status. (pp. 761-762)

Instruction may or may not improve when a teacher engages in these steps alone, but when teachers belong to a Professional Learning Community, they engage in these steps collaboratively, which increases the likelihood of improvement. Feedback from teacher to teacher is invaluable; teachers talking with teachers about teaching is the power behind PLCs. At the heart of the PLC philosophy of continuous improvement is the notion that teachers are learners, learning from and with each other how to improve instruction. Formative assessment adds a new dimension to the list of people from whom teachers can learn: student feedback implies that teachers can learn from students.

The nation‟s drive toward educational reform came to a head with NCLB.

Properly executed classroom assessments of state standards should be a top

priority on every district’s list of strategies to ensure that no child is left behind (Marzano, 2006). Two barriers stand in the way of standards being the focus of effective classroom assessment: too much content and a lack of unidimensionality.

Unidimensionality means that a single score on a test represents a single

dimension or trait that has been assessed.


The Mid-continent Research for Education and Learning (McREL) found

that 71% more instructional time is needed in order to address the mandated

content in all the standards documents in the U.S. (Marzano, 2006). A comparison

of U.S. standards with those of other countries led to the conclusion that the U.S.

has identified far too much content for its K-12 education system. In mathematics

alone, research found that U.S. mathematics textbooks attempt to cover 175%

as many topics as do German textbooks and 350% as many topics as do Japanese

textbooks (Schmidt, McKnight, & Raizen, 1996).

Marzano (2006) found that, when unpacked, there are 741 different skills

embedded in the 241 mathematics standard benchmarks. If mathematics teachers

tried to teach all the skills within the standards, it would take 23 years!

Reorganizing the standards into clusters of power standards, or essential

learnings, is the first step to designing effective measurement topics. Teaching

standards one inch deep and one mile wide, versus one mile deep and one inch

wide, is the issue at hand. Trying to “cover” all the standards would lead to

teaching and reteaching year after year and lower scores on standardized tests.

The researcher attended the annual national assessment conference

sponsored by Solution Tree in Las Vegas, Nevada. At this conference, Chris

Jakicic, a Solution Tree representative, stated that it is better for students to be


proficient at 88% of what is on the state test rather than have them exposed to

100% of what could be on the test, but proficient at only a few of those items.

During our shift to PLCs at my school, essential learnings became the

term used for the “unpacked” state standards. All departments, including the

mathematics department, were asked to determine which standards needed to be

mastered for each course, a common task of PLCs. Once that task was

accomplished, student learning and student needs became the focus.

Attending to students‟ needs, rather than striving for curriculum coverage,

requires flexibility in teachers‟ instructional strategies (Young & Kim, 2010).

Rather than teachers “covering” the material, students need to be “uncovering”

the material. This approach may change the amount of time allocated for a

particular topic. Flexibility on the teacher‟s part is essential. Driscoll (1999) stated

that teachers should no longer view assessment as a separate summative activity

used to check knowledge gains after instruction, but rather as an ongoing

interactive process of instruction. Driscoll defined assessment as “the process of

gathering evidence about students’ knowledge, skills, and disposition toward

mathematics and of making inferences based on that evidence for a variety of

purposes, including better understanding of students’ needs and more appropriate

instructional goals and curriculum design” (p. 82).


One of the PLC’s goals is for schools to monitor each student’s learning

on a timely basis and to create additional time and support when students

experience difficulty (DuFour et al., 2004). To ensure the impact on students is

strong, teachers must ensure that students remain motivated to learn the essential

learnings for each subject. The assessments should be given on a unit-by-unit

basis with an opportunity for retesting (Newton, 2007). Retesting an assessment

may have two outcomes: (1) the teacher and student can get updated information

on content development after intervention; and (2) if a grade is recorded, then the

student can improve the grade on that skill if the teacher allows it (Newton).

Legend has it that Albert Einstein had a plaque on his wall that said, “Not

everything that counts can be counted and not everything that can be counted

counts.” This quote leads to another issue to be discussed: should formative

assessments be recorded? Some researchers such as O’Connor (2007) believed

that formative assessments should be reported but not recorded. Others believed

that formative assessment can support learning as well as measure it (Black &

Wiliam, 2003).

Assessments that yield formative scores can be recorded. Those scores can

be used to track student progress over time, which in turn can be used to generate

a summative score at the end of a particular interval of time. Those scores should

be reported for each essential learning to create a more complete profile of


individual student strengths and weaknesses (O’Connor, 2007).

Unidimensionality, or reporting on essential learnings separately, is important for

feedback to be interpreted correctly.

How much weight teachers assign to formative assessment data depends on the teachers’ beliefs about their educational significance (Young & Kim, 2010). Some teachers assign no grade to formative assessments; others believe some sort of grade is necessary for student motivation. The decision is ultimately in the hands of each teacher, which leads to another issue related to this project: feedback.

Feedback

Feedback is among the most critical influences on student learning (Hattie

& Timperley, 2007). As the legendary football coach Vince Lombardi once said,

“Feedback is the breakfast of champions.” Timely feedback is an essential

component of learning. In this technological age of instantaneous results, students

are accustomed to immediate feedback; it is what they have grown up with.

Timely feedback to students is crucial while they are acquiring their

essential learnings. The results of the formative assessment can provide feedback

that will be beneficial to the teacher as instruction is planned. Effective feedback

causes thinking (Black & Wiliam, 2003) not only for the student, but also for the

teacher.


Four qualities characterize an effective feedback system. The feedback must be (1) timely, because waiting until the end of a unit to give feedback is too late; (2) specific, because telling students “good job” is not specific feedback; (3) understandable to the receiver, because a score alone does not enhance learning; and (4) open to adjustment, because students need opportunities to act on the feedback: to refine, revise, practice, and retry (Tomlinson & McTighe, 2006).

Encouragement after an assessment makes a difference to students and is

one of the most challenging parts to implement (Marzano, 2006). Providing students with feedback that encourages them to keep track of their progress on the essential learnings could be one way to motivate them to learn. Students can then define success in terms of their own learning rather than by comparing themselves to other students in the class (Harlen, 2005; O’Connor, 2007). They need to make learning personal. When students see that the teacher is willing to self-reflect on what is being taught and revise, students may be more apt to do so

themselves (Fluckiger et al., 2010). The time the teacher takes to explain the

benefits of reflection to the student and to establish a trusting culture makes the

feedback practice a powerfully engaging system of instruction (Clark, 2008).

More assessments can mean more achievement. The frequency effect does

taper off over time, though. O’Connor (2007) stated that the more consistent a student is, the less evidence of learning is needed; an inconsistent student requires the teacher to gather more evidence that learning is taking place.

Learning is negatively influenced in classrooms where assessment simply

tells students whether their answers are correct or incorrect. Learning is also

negatively influenced when an assessment grade is given with no comments

(Marzano, 2006). Providing detailed, descriptive feedback noticeably enhances

the quality of students’ work; yet the time and energy this type of feedback

requires is often why some teachers avoid or are reluctant to provide it (Fluckiger

et al., 2010).

Teachers reported difficulty in record-keeping and time management

while implementing classroom assessments (Young & Kim, 2010). Hattie and

Timperley (2007) did extensive research on the power of feedback. They found

that to make feedback effective, teachers need to make appropriate judgments

about when, how, and at what level to provide appropriate feedback. The

teacher’s success in implementing formative assessment practices depends on the strength of the teacher’s classroom management skills. Giving timely and specific

feedback at regular intervals helps keep students involved and motivated (Doty,

2008). Frequent assessments increase positive attitudes toward mathematics

(Kika, McLaughlin, & Dixon, 1992). The time invested is time well spent.


Research has shown that timely and specific feedback has a proven track

record of success. Literature related to the last issue of this study, how often a

teacher should use formative assessment, is summarized below.

Frequency of Assessments

Results from a study done by Connor-Greene (2000) suggested that

students’ study habits are strongly influenced by assessments. The NCTM (1989,

1991) advocated the use of frequent assessments as a way for teachers to make

instruction purposeful, even as often as daily. Due to the constant flow of

information between teacher and student, assessments need to occur often. When

the frequency of assessments increases, positive effects appear on many levels.

Student engagement and response to questions and discussions increase when

frequency of assessments increases (Haigh, 2007). Frequent assessments

encourage students to better monitor their study time, which corresponds to

improved study habits and increased organization (Connor-Greene).

Shirvani (2009) researched frequent feedback extensively and found that it reduced both procrastination and student anxiety. Shirvani also found that frequent assessment had a positive impact on retention

of concepts previously taught.

Daily assessments are found to be more effective in lower level courses,

such as algebra, as opposed to upper level courses (Dineen, 1992). This result


may be due to the higher rate of improvement for lower ability students than

higher ability students as found in a study by Kika et al. (1992). Lower level

students have more room to grow.

Not all findings agree, of course. Shirvani (2009) noted many studies from the 1980s that found no significant differences between frequent and infrequent assessments. Shirvani’s own study, however, suggested otherwise. Using a control

group who were given weekly assessments and an experimental group who were

given daily assessments, Shirvani found that the experimental group outperformed

the control group on the final exam and homework assignment completion. Those

in the experimental group expressed a preference for daily assessments.

Shirvani (2009) also noted that intrapersonal students benefited from daily

quizzes because they could monitor their understanding internally using the

feedback given each day. Another study, done by Kika et al. (1992), also found

that students preferred more frequent assessments over less frequent assessments

and performed better when being assessed more frequently.

Summary

“How do you know when they know it?” Seeing is believing. Expectations

in education are vast. Teachers tend to get caught up in the intertwined details of

teaching and assessing. In spite of all this, teachers must keep student academic


achievement as a priority. Student success comes from purposeful, engaging

teaching with consistent assessment of student understanding (Doty, 2008).

Teachers using formative assessments for the purpose of guiding

instruction need to establish a classroom environment in which students

understand that the purpose of the assessments is learning, not grading. This

change requires effort on the part of the teacher to instill this new way of thinking

of assessments, as most students have only experienced assessments as judgments

and grades. Some of the pressures of grades have to be removed in order to

successfully implement this form of assessing students. Firm evidence indicates

formative assessment is an essential component of classroom work and its

development can raise standards of achievement.

Studies have shown that daily assessments are looked upon favorably by

students and improve student motivation and learning. Research also shows that

to appropriately modify learning, feedback has to be timely and describe features

of the work or performance relating directly to the essential learnings. The stakes

should be low, allowing time for adjustments before it “counts” (O’Connor,

2007).

In this study, daily entrance slips were consistent assessments of student

understanding. They were the diagnostic tool, or “dip stick” used to modify lesson

plans. Spontaneity was expected, as “on the way” learning would go in many directions. As Marzano put it at the Solution Tree assessment conference attended by the researcher,

“Assessment is not what teachers do TO students, it is what teachers do WITH

students.” The next chapter describes how daily entrance slips were incorporated

into the classroom and how the teacher saw the learning going on in the

classroom.

Chapter Three

Research Design and Method

The purpose of this project was to use daily entrance slips based on

essential learnings from my school to determine whether entrance slips were more

beneficial than homework checks for my Algebra I students. This project focused

on daily entrance slips as a way of seeing what the students knew on a regular

basis.

I designed formative assessments known as daily entrance slips that were

based on the essential learnings of the unit being taught in my high school

Algebra I classes. The daily entrance slip consisted of three questions that

assessed the current essential learnings and included one question that assessed a

past essential learning to determine whether earlier material was retained. I

recorded the results by identifying the essential learnings assessed rather than by

the section of the book. These formative assessments were graded for accuracy

but were only a small percentage of the students’ grades. A homework completion grade was reported but not used as part of the students’ grades. I was

curious whether students would continue to complete homework at the same rate

while daily entrance slips were being used.

Students need a say in their education. I gave students a survey to

determine whether the students found the entrance slips beneficial to their

learning and to determine how the entrance slips affected how they prepared for


class. I looked at the results of the mid-chapter quiz and the summative test at the

end of the unit to determine whether there was any correlation between the

entrance slip grades and the grades received on the quiz and test. I reflected on

whether the entrance slips gave me the feedback I needed to make informed

adjustments in my teaching to improve learning. Reflections were documented in

a journal I kept during the study.
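The correlation check described above can be illustrated with a simple Pearson calculation. This is a minimal sketch, not the analysis tool actually used in the study; the entrance-slip averages and unit-test percentages below are invented for illustration only.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient for two paired lists of scores."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance and variances computed from deviations about each mean.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical data: each student's average entrance-slip score (out of 4)
# paired with that student's summative unit-test percentage.
slip_averages = [3.5, 2.0, 4.0, 3.0, 2.5]
test_scores = [88, 61, 95, 80, 70]

r = pearson_r(slip_averages, test_scores)
```

A value of r near 1 would suggest that students who did well on the daily slips also did well on the quiz and test, which is the relationship the study set out to examine.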

Setting

I have taught mathematics in the same school for 21 years, plus one year at a rural school in the same state. I have incorporated many differentiated strategies

to try to reach all of my students. I have continually made an effort to keep

refining the art of teaching mathematics using past experience and new ideas to

help instruct students as best I can. Seeing what my students know has always

been a difficult task for me. Finding the “happy medium” between correcting and giving feedback on every problem students do and not correcting any of their work is difficult. I have had many discussions with

mathematics teachers from across the country about this issue. No one seems to

have the “perfect” answer, and most, if not all, of the teachers I have talked to

struggle with the same assessment issues that I do. I know that I am not alone in

this search for a more effective and efficient way to see what students know on a

consistent basis.


The public school involved in this project is in a mid-western community

with a population of about 37,000 people. This community is located near an air

force base and is in the middle of an oil boom which is leading to an influx of

people from diverse areas. The school is part of a split high school with grades

nine and ten at one building and grades eleven and twelve at another building. The ninth

and tenth grade high school was the location of this project. There were

approximately 1,100 students at this school. The four Algebra I classes involved

in this project ranged in class size from 18 students to 28 students per class. There

were 64 ninth grade students and 22 tenth grade students. Gender was fairly

evenly divided with 44 females and 42 males included in this project. The

students were predominantly white, with minorities represented, and included both regular and special education students; the majority, 89.5%, were regular education students.

This project was conducted during the spring in the fourth nine weeks.

Because of the time of year, waning motivation and poor attendance due to spring

activities and other issues were thought to possibly adversely affect the results.

Also, the students were accustomed to the previous manner in which assessments

had been done. This was thought to possibly affect students in opposite ways. Some may have been comfortable with the way assessments had been done and had no desire to see them done differently. Others might have appreciated a

change in pace and welcomed the new way of being assessed.

Intervention/Innovation

In the past, assessment was done using the following: (1) mid-chapter

homework checks in which I selected problems from each section for students to

show me what they had done; it was a spot check of completed work to see if it

was done accurately; (2) end of the chapter completion checks, where I checked

to see if the students had attempted to do all the problems assigned; (3) mid-

chapter quizzes to assess what the students knew “so far;” and (4) end of the

chapter summative tests. Amid all these were “on the fly” assessments made

through discussion, differentiated instruction tasks, and occasional entrance and

exit slips. A project was also assigned at the end of each unit so students could

“show what they know” in their own way. All assessments except for the mid-

chapter homework check continued to be used. The homework completion check

was only recorded, not included as part of the students’ grades during the project.

Consistent daily assessments have not been a part of my classes. I relied

on my memory to know what students did and did not know until the mid-chapter

assessments occurred. Creating daily assessments (known as entrance slips),

administering these assessments, and recording these assessments in a new way

using the essential learnings as titles in the grade book were all new to me and the


students. Reporting completion of homework but not including it in the student’s grade was also new.

Use of daily entrance slips to help me better diagnose my students’

individual and group needs was my personal goal. Getting students to benefit

from daily feedback in an effort to improve their learning was my ultimate goal.

Design

Before the project began, all involved participants were made aware of the

action research about to be undertaken. A Parent/Guardian Consent Form (see

Appendix A) was sent and a Youth Assent Letter (see Appendix B) was given to

the students. The researcher had discussions with the principal of the school and

the assistant superintendent and letters were signed to document their awareness

and acceptance of the project (see Appendices C and D).

I used a student survey, a journal of my reflections, and multiple forms of

assessments. I designed a survey (see Appendix E) to question my students as to

their thoughts on whether the entrance slips (see Appendices F-M) made a

difference in their learning and in what ways. My own opinions were brought to light in my journal, but it was ultimately the students’ voices that needed to be heard. I then determined whether the completion grade, mid-chapter quiz, and

summative test results at the end of the unit reflected their thoughts.


The study began during the fourth nine weeks in early April. I began my

unit on quadratic functions by explaining to the students that they would be

completing daily entrance slips that were based on the essential learnings of the

quadratic unit. The students were told that the entrance slips would be given to

them on a daily basis covering information from the previous day. The daily

process went as follows:

- I taught the new essential learnings and assigned homework for students to practice the skills involved.

- The next day, I went over questions and gave students an entrance slip on the essential learnings from the previous day.

- Answers were shared immediately after the entrance slip was handed in so students could quickly find out whether they got the problems correct. The slips were later graded by the teacher and recorded as a small part of students’ overall grades. The emphasis was on students realizing whether or not they knew the information, not on a number in the grade book.

- Individual feedback was given to those who did not get the problems correct while they were taking the new entrance slip the next day.

- Entrance slip grades replaced the homework check grades that had been given in the past.

- Intervention was assigned if the essential learnings were not understood.

- On days when new material was not covered, an entrance slip was not given the next day.

- In the middle of the unit, a mid-chapter quiz was given (see Appendix N) and homework was collected to check for completion. A homework completion grade was recorded to diagnose whether lack of effort played a part in students’ learning, but this grade was not included in their overall grades during the project.

- The daily entrance slips continued until the end of the unit, when a summative unit test was given (see Appendix O) and homework was collected again to check for completion. Once again, this score was reported but not included in the students’ grades.

- A survey was filled out by the students once the summative unit test results had been given.

Because this was an action research project instead of a thesis, the main

emphasis was to gather information from students and myself about our thoughts

and feelings regarding this method of assessment. Using a control group would

exclude students from the use of entrance slips. I believed all students should be

given the opportunity to experience the use of entrance slips. I also believed all

my students should be allowed to anonymously voice their opinion on whether

this form of assessment of their knowledge was found to be helpful in their


learning. Use of a pre-test and post-test to determine whether knowledge growth

occurred did not make sense in this situation since there would automatically be

an improvement due to students gaining knowledge as information was taught.

The purpose of this action research project was to determine whether entrance

slips were more beneficial than homework checks for my Algebra I students.

Description of Methods

The following methods and activities occurred as the study was

completed:

(1) The principal and assistant superintendent were made aware of the project

and permission to do this project was requested from both of them.

(2) IRB permission was sought and received (see Appendix P). All personal

rights of the students were observed. Confidentiality and anonymity were of

utmost importance.

(3) A parental/guardian consent letter was sent to parents/guardians informing

them of the project that I would undertake with their child and requesting

permission for their child to participate. Once again, confidentiality was

emphasized.

(4) The students were informed of the project and their role in it. They were

asked to complete a youth assent form prior to the study. As always,


confidentiality was explained. The daily process as listed in the design was

explained to the students.

(5) The students were given an entrance slip on a daily basis covering

information from the previous day. The entrance slip consisted of four

questions, three of which assessed the new information from the previous day

and one question was a review question to check for retention of previous

material, known as a flashback. When a problem was too difficult to do, a

note was required from the student to tell me when that student would be in

to get extra help. To save time, I returned each student‟s graded entrance slip

the next day while the student worked on the new entrance slip for the day.

Because I had already discussed the previous entrance slip immediately after

it was completed, I was able to devote less additional time in discussing the

graded entrance slips upon their return. Any additional time was spent

personally with those students who did not understand their mistakes.

(6) Results were recorded by essential learning in the grade book. Good record

keeping was important as well as tracking absent students to get assessments

made up. I corrected the entrance slips for accuracy, writing comments as to

mistakes made or areas done well. Each entrance slip was worth four points.

(7) Modifications to the entrance slips and lesson plans began as results from

the assessment data emerged. Flexibility was crucial here. A mid-chapter


quiz was given to assess knowledge attained in the first half of the unit. More

entrance slips followed.

(8) Upon completion of the unit, the end of the unit summative test was

administered. The unit this project focused on was Quadratic Functions.

(9) A survey was given to the students during class. The survey was given to

investigate the students‟ impressions about the formative assessment strategy

known as entrance slips. I was interested in knowing whether the students (a)

found the entrance slips helpful in the process of learning the essential

learnings, (b) increased or decreased their preparation time for class due to

the entrance slips, (c) found the individualized feedback that was provided

helpful, (d) became more aware of what the essential learnings are for the

class, (e) felt nervous about the entrance slips, and (f) thought that the mid-

chapter quiz and end of the unit summative test were affected by the entrance

slips.

(10) A journal was kept throughout the project by the researcher in order to self-reflect on concerns, celebrations, and any issues that arose.

(11) The results of the surveys, mid-chapter quiz scores, entrance slip scores, and

end of the unit summative test scores were interpreted.


Expected Results

I anticipated that the entrance slips would improve learning. I believed the

students would appreciate the extra attention they got from the feedback. I

believed knowing more about my students' abilities on a daily basis would be

extremely helpful.

Potential obstacles to using the entrance slips were expected to be time-related: tracking student attendance, correcting and handing back slips, and personally connecting with students who needed intervention could become overwhelming and time-consuming.

Timeline for the Study

The students began participating in this project in early April. It ended in

mid-May when the survey was given.

Summary

The action research study discussed in this chapter was done with my

Algebra I students. This study involved students completing a daily entrance slip

as a way for me and my students to formatively assess their understanding of the

essential learnings being discussed. I used entrance slip scores, surveys, self-

reflection, mid-chapter quiz and summative test scores to analyze the

effectiveness of the entrance slips to improve learning. The results of my study

are discussed in the next chapter.

Chapter Four

Data Analysis and Interpretation of Results

The purpose of this project was to use daily entrance slips, based on

essential learnings from my school, to determine whether entrance slips were

more beneficial than homework checks for my Algebra I students. Included in this

chapter are the results of the survey, data from the entrance slips, mid-chapter

quiz and test results, as well as the results of the journaling/self-reflection done by

me, the researcher.

Data Analysis

Data for this project were collected in four ways: (1) Daily entrance slip

scores were collected; (2) a survey was given; (3) quiz, test, and homework

completion scores were collected; and (4) a journal was kept by the researcher.

Daily entrance slips. The day after new material was taught, an entrance

slip was given. Daily entrance slips replaced the homework checks that had been

used in the past to assess student knowledge. The daily entrance slips consisted of

three questions that assessed the current essential learnings and one question that

assessed a past essential learning (called a flashback). The scores of the entrance

slips were recorded. Each entrance slip score included the three questions that

assessed the current essential learnings. One of the questions was a multiple point

question creating the four points recorded for the entrance slip. The fourth

question, called the flashback, was not included in the grade but was recorded as either correct or incorrect. Eight flashback problems were given, with

students averaging 4.88 correct.

Table 1 shows the total scores for each entrance slip; 0, 1, 2, 3, or 4 points

could be earned on each one. The mean was computed to compare each entrance

slip. Creating reliable assessments is important. Scores that appear much lower

than the others or have a higher standard deviation may mean there are reliability

issues. At the same time, lower entrance slip scores could also indicate that instruction needed to change to correct faulty thought processes. Student work had to be inspected closely to determine what was causing the lower scores.
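The mean (M) and sample standard deviation (SD) columns reported for each entrance slip can be reproduced directly from the score counts. A minimal Python sketch, assuming the sample (n − 1) standard deviation, which matches the reported values:

```python
from math import sqrt

def mean_sd(counts):
    """Mean and sample standard deviation of scores 0-4,
    given counts[i] = number of students who scored i points."""
    n = sum(counts)
    mean = sum(score * c for score, c in enumerate(counts)) / n
    var = sum(c * (score - mean) ** 2
              for score, c in enumerate(counts)) / (n - 1)
    return mean, sqrt(var)

# Score counts (0, 1, 2, 3, 4 points) for entrance slip #1
m, sd = mean_sd([0, 2, 44, 17, 22])
print(round(m, 2), round(sd, 2))  # 2.69 0.89
```

Running this on entrance slip #1's counts recovers the reported mean of 2.69 and standard deviation of 0.89.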

Table 1

Entrance Slip Score Totals

Entrance Slip     0     1     2     3     4     N     M     SD
1                 0     2    44    17    22    85   2.69   0.89
2                 9     5    10    20    41    85   2.93   1.34
3                 2     0    16    36    31    85   3.11   0.87
4                 5     1     8    10    61    85   3.42   1.11
5                19    10    19    24    13    85   2.02   1.39
6                 2     0     8    35    41    86   3.31   0.83
7                 9     4    20    31    22    86   2.62   1.22
8                12     1     8    21    42    84   2.95   1.40
Overall           -     -     -     -     -    86   2.89   0.68


Entrance slips #1-3 assessed Essential Learning 9.3, graphing a parabola

and labeling important parts. Entrance slip #1 became the baseline as it assessed

preliminary skills necessary for graphing parabolas. After entrance slip #1's

results were reviewed, I noticed a common mistake that the students were making

that I was unaware of before giving the entrance slip. The next day, the problem

was addressed, more information about how to graph was presented and more

practice was assigned. Entrance slip #2 assessed the issue that had been a problem

on entrance slip #1 along with the new information the students had just learned.

The mean for entrance slip #2 had a 0.24 point gain from the mean of entrance

slip #1. More practice and one-on-one assistance produced a mean of 3.11 on

entrance slip #3, a 0.18 point gain from entrance slip #2. Entrance slip #4 assessed

Essential Learning 8.6, stating the domain and range of a function. Included in

this assessment was Essential Learning 9.3 again. Once again, a gain occurred as

the mean rose to 3.42, a 0.31 point gain from entrance slip #3 and a 0.73 point

gain since entrance slip #1 when the skill was first assessed.

In the case of entrance slip #5, the lowest entrance slip score, Easter break

followed by a substitute teacher created a long gap from when the material was

last seen in class and the entrance slip. As an instructor, I took advantage of a "teachable moment," and the students reached a deeper understanding as I revisited the concepts in question with the class as well as individually with the students who made mistakes.

Entrance slips #6-8 assessed Essential Learning 11.5, solving quadratic

equations. Entrance slip #6 assessed easier types of problems and entrance slips

#7 and #8 involved more difficult problems that required using the Quadratic

Formula. A sign that feedback made a difference can be seen between entrance slips #7 and #8, which assessed the same skill: a gain of 0.33 points occurred.

“Would daily entrance slips provide me with sufficient information to

adjust instruction to fit the needs of my students better than past assessments

had?” can be clearly answered. Yes, the entrance slips guided my instruction.

The entrance slip mean scores were then crossed with the test scores to see

if the entrance slips had a positive correlation with the test scores. Linear

regression was used with the dependent variable represented by the test scores and

the independent variable represented by the average entrance slip scores. The

Pearson correlation of the entrance slip average and the test scores was 0.594 with

a p-value < 0.001. Figure 1 shows the scatter plot representing this comparison.
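The fitted line and Pearson correlation above can be computed from paired (entrance slip average, test score) data with ordinary least squares. A minimal sketch; the paired values below are hypothetical illustrations, not the study's data:

```python
from math import sqrt

def pearson_and_fit(xs, ys):
    """Pearson correlation r and least-squares line ys ~ a + b * xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    r = sxy / sqrt(sxx * syy)  # Pearson correlation coefficient
    b = sxy / sxx              # slope of the fitted line
    a = my - b * mx            # intercept of the fitted line
    return r, a, b

# Hypothetical entrance slip averages (0-4 scale) and test percents
es_avg = [2.0, 2.5, 2.5, 3.0, 3.5, 4.0]
tests = [62, 70, 78, 85, 90, 99]
r, a, b = pearson_and_fit(es_avg, tests)
```

Squaring r gives the R-Sq percentage shown in the fitted line plot (0.594² ≈ 35.3%).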

The outlier students who failed the test were the students who generally

scored a two as their average entrance slip score. In this Algebra class, 61% was a

failing grade. These students needed to address the fact that they were not

mastering the skills necessary for a passing grade. I should have placed more of


these students into assigned intervention after school to work with them on the

skills they lacked.

[Fitted line plot: Test = 48.60 + 13.15 × ES Average; S = 12.2252, R-Sq = 35.3%, R-Sq(adj) = 34.5%]

Figure 1. Scatter plot of entrance slip averages versus test scores.

The scatter plot in Figure 1 was created in an effort to answer the research

question, “Would frequent feedback give my Algebra I students a better sense of

what they know than past assessments (specifically homework checks) had?” In

response to that question, the scatter plot shows a positive trend between entrance

slip grades and test results. This trend along with the fact that the average on the

entrance slips was 72% and the test average was 86.6% shows that many of the

students "learned from their mistakes."


Survey. Students need a say in their education. At the end of the unit a

survey was given to the students to investigate their impressions about the

formative assessment strategy known as entrance slips. The survey included

Likert-scale items in which students indicated their level of agreement from

strongly agree to strongly disagree, as well as demographic information such as

gender and grade level.

The surveys were copied and handed out the day after the summative test

at the end of the unit. Students were instructed to answer the questions honestly

and were told to keep their names off the survey for anonymity reasons. Students

finished the survey in approximately 10 to 15 minutes.

Eighty-six surveys were filled out and returned to me. Some surveys were completely filled out; others left questions unanswered; still others gave multiple answers to a single question. Questions that received more than one response were not included in the results. Most students supplied at least one additional comment, with many supplying multiple additional comments.

Once the surveys were collected, the responses were entered into a

spreadsheet. The Likert-scale items were also coded and given a numerical value

where one corresponded to strongly disagree, two represented disagree, three

corresponded to agree, and four represented strongly agree. A mean was

computed from these numerical data. Table 2 presents the results of the survey.


Table 2

Survey Results for Questions 7-19

Question                                                          SA      A      D     SD     M

7.  The entrance slips took too much time out of class.          3.5%   4.7%  67.4%  24.4%  1.87
8.  The entrance slips made me learn the essential
    learnings better.                                           23.3%  60.5%  11.6%   4.7%  3.02
9.  The entrance slips made me more prepared for the
    mid-chapter quiz.                                           36.5%  49.4%  12.9%   1.2%  3.21
10. The entrance slips made me more prepared for the
    chapter 10 test.                                            37.7%  48.2%  11.8%   2.4%  3.21
11. The entrance slips made me nervous.                          4.7%  12.8%  40.7%  41.9%  1.80
12. The entrance slips helped me understand what I was
    doing better than homework checks.                          48.8%  46.5%   3.5%   1.2%  3.43
13. Mrs. Hodenfield's one-on-one help on what I did wrong
    on the entrance slips was more beneficial than
    homework checks.                                            57.7%  41.2%   1.2%     -   3.56
14. If I had a choice, I would stay with entrance slips
    instead of homework checks.                                 58.3%  31.0%   8.3%   2.4%  3.45
15. I check my homework assignments often to make sure I
    have done the problems correctly.                           12.1%  45.8%  30.1%  12.1%  2.58
16. I like doing homework better than doing entrance slips.      3.5%  14.1%  35.3%  47.1%  1.74
17. The homework assignments helped me prepare for the
    chapter 10 tests.                                            7.1%  64.3%  17.9%  10.7%  2.68
18. I did less homework during chapter 10 than I did in the
    past chapters in Algebra.                                   15.5%  36.9%  35.7%  11.9%  2.56
19. I am more aware of what the essential learnings are for
    Algebra I.                                                  26.2%  64.3%   7.1%   2.4%  3.14


The students disagreed (mean of 2.00 or less) with the following survey

statements: (7) that the entrance slip took too much time; (11) that the entrance

slip made them nervous; and (16) they liked doing homework better than doing

entrance slips.

The students agreed (mean of 3.00 or more) with the following survey

statements: (8) the entrance slips made them learn the essential learnings better; (9

& 10) the entrance slips made them more prepared for the quiz and test; (12) the

entrance slips helped them understand what they were doing better than

homework checks; (13) Mrs. Hodenfield's one-on-one help was more beneficial

than homework checks; (14) they would stay with entrance slips rather than

homework checks; and (19) they were more aware of what the essential learnings

are for Algebra. All these results answer the question “Would frequent feedback

give my Algebra I students a better sense of what they know than past

assessments (specifically homework checks) had?” with a resounding yes!

Each survey question had a space provided for students to comment. For

the purpose of coding the data, all additional student comments were collated into a document that stated each survey question and listed all the student comments under it. The students made 448 comments on the

survey. An extra section was provided for students to give additional comments in

general. Thirty-five of the 448 comments came from this section. As students


filled out the survey, the number of comments added declined as the survey went on. The first question had 48 comments and the last question had 17.

The document was read multiple times, and the following themes

emerged: (1) the entrance slip's helpfulness to the student, (2) the researcher's role in the student's learning (feedback), (3) the ease and quickness of the

entrance slips, (4) entrance slips were a review, (5) negative homework

comments, (6) positive homework comments, (7) mention of grades, (8) flashback

references, and (9) negative comments about the entrance slips. Table 3 shows the

results of the coding.

Table 3

Survey Coding Results

Comment                                            Number of comments

The entrance slip's helpfulness to the student                    128
The ease and quickness of the entrance slips                       52
Feedback from the teacher                                          47
Negative comments about the entrance slips                         34
Negative homework comments                                         32
Positive homework comments                                         17
Thought that the entrance slips were a review                      13
Mention of the word grades                                          8
Flashback references                                                5


Of the 33 comments on the survey question asking whether students would stay with entrance slips instead of homework checks, 88% stated in one way or another that they would stay with entrance slips.

When all the comments pertaining to entrance slips, both coded and un-coded, were counted, 90% reflected a positive attitude about this project. Students believed that the entrance slips gave them valuable information

about what they knew.

Quiz, test, and homework completion. A mid-chapter quiz and a

summative end of the chapter test were given. When each assessment was given,

homework was collected to determine how much homework was completed by

the students. The assignments from the first half of the chapter were collected the

day of the quiz. The assignments from the second half of the chapter were

collected the day of the test. The quiz and test scores were recorded and included in the student's grade; the homework was reported but not included in the student's grade.

The scores were placed into the data spreadsheet created for this project. A

total homework completed grade was also computed. Table 4 shows the averages

(M) of the entrance slips, the quiz, the test, and the two homework completion checks, as well as the flashback average.


Table 4

Average Scores

Assessment            N      M      SD

Flashback            86   4.88    1.77
Quiz                 85  90.07   10.24
Test                 86  86.62   15.10
Entrance slips       86   2.89    0.68
Homework 1st Half    85  48.93   14.69
Homework 2nd Half    86  36.60   15.56

The flashback average was based on eight flashback questions assessing

prior concepts. The prior concepts ranged from graphing lines to exponents to

solving equations. The recording system was to put an X next to the entrance slip

score if they had correctly answered the flashback. Students were told what

chapter the flashback came from and could look back at notes or assignments to

help them remember the concept. An average of 4.88 (out of 8.0) tells me that a

little over half the time the students could recall prior information. The results

would be more interesting once entrance slips have been used over the course of

the year to see if they help students retain knowledge.

The quiz and test averages are based on the percent correct on each

assessment. The quiz assessed the first half of the unit which included graphing

parabolas and maximum height situations. The test assessed the entire chapter


which included the quiz information as well as solving quadratic equations. The

second half of the unit was more challenging than the first half of the unit, in my

opinion. This difference in difficulty might be one reason for the lower average on

the test compared to the quiz. Both the quiz and the test had large standard

deviations due to some extreme outliers. One student in particular did very poorly

on the test due to family issues.

Homework was collected twice throughout the unit. A completion grade

was reported but not included in the students‟ grades. The first half of the

homework was collected after the quiz was taken. There were 60 problems

assigned. I was very “picky” to make sure all parts of the problems were done in

order for the student to earn credit for finishing the problem. Approximately 82%

of the first half of the homework assigned was completed. The second half of the

homework was collected after the test was taken. There were 51 problems assigned.

Approximately 72% of the second half of the homework assigned was completed.

Variables that influenced the decrease in the amount completed might relate to

time, as the students had more time to finish the first half of the homework.

Another factor might be difficulty, as the second half of the unit was more

difficult than the first half. A third factor might be student motivation and

attendance, as spring activities increased and the weather improved. Less


homework completion may also explain why the test scores were lower than the

quiz scores.

The entrance slip average of 2.89 (out of 4.0) corresponds to a grade of

approximately 72%. Due to the small number of problems, making one mistake

led to a grade of 75%. At the time of the entrance slips, I tried to get the students to fine-tune all their skills; absolutely no mistakes could be made. Thus, it was not

easy to earn 100% on the entrance slips. An average of 72% means that, on

average, students made one or two mistakes in their work. Because of the highly

critical way the entrance slips were graded, most students‟ quiz and test scores

were higher than their entrance slip averages as students learned from their

mistakes.

Figure 2 shows the scatter plot that was created to check for a positive

correlation between the total homework completed and the summative test score.

Linear regression was computed using the total homework completed grade as the

independent variable and the test scores as the dependent variable. The Pearson

correlation of the homework completion average percent and the average test

percent was 0.507 with a p-value < 0.001.

There always seem to be exceptional children who score far above or far

below the expected score based on the amount of homework completed. For the


most part, those students who completed over 60% of their homework scored

80% to 100% on the test.

[Fitted line plot: Test = 63.82 + 0.2979 × HW Percent; S = 13.0938, R-Sq = 25.7%, R-Sq(adj) = 24.8%]

Figure 2. Scatter plot of total completed homework averages versus test scores.

Journal. Journaling was done by the researcher throughout the project.

This included reflections on the process of collecting consent forms, creating the

entrance slips, administering, grading, writing feedback, returning the entrance

slips, giving feedback individually to students, frustrations, positive outcomes,

and thoughts for the future.

The journal was read multiple times. The following themes emerged while

coding my journal: (1) teacher preparation, (2) student and teacher reactions, (3)

time, (4) instruction, (5) intervention issues, and (6) missing or late entrance slips.


Actions speak louder than words. When it came time to actually DO this

project with the students, it was not without frustrations and excitement.

Seventeen journal entries were made, starting prior to the beginning of the unit as

the paperwork and creation of the entrance slips began and ending after the unit

was over as preparation for the next unit took place.

There were 20 comments about reactions from me or my students, 16

comments about intervention or students who did not get it, 15 references to

teacher preparation issues, 10 references to instructional issues, eight comments

about time, and seven references to missing or late entrance slips. The overall

tone of the journal was positive, with a few issues that needed to be addressed to move forward with this method of assessment.

The journaling was done to answer the research question, “Would daily

entrance slips provide sufficient information to adjust instruction to meet the

needs of Algebra I students better than past assessments (specifically homework

checks) had?" Due to the positive tone of the journal and self-reflection after the

project was completed with the students, the answer to this research question is a

resounding yes!

Future study. A last thought was to investigate whether there were any

differences between ninth and tenth grade classes and between the morning and

afternoon classes.


Boxplots were constructed to look at the spread of scores separated by the

time of day, before or after lunch. Figure 3 shows the entrance slip average

scores. Figure 4 shows the quiz scores. Figure 5 shows the test scores.

The students who were in Algebra I before lunch had less spread in their

scores and higher scores on all types of assessments than those in Algebra I after

lunch. Contributing factors might include attendance, as students involved in activities often tended to leave for those activities after lunch. As the day progressed, students got tired and restless, and this might also have affected the scores. Future

study could be done to delve into this subject more deeply.


Figure 3. Boxplots of entrance slip averages by time of day.


Figure 4. Boxplots of quiz scores by time of day.


Figure 5. Boxplots of test scores by time of day.


Boxplots were constructed to examine the assessment scores separated by

grade level, ninth grade and tenth grade. Figure 6 shows the entrance slip average

scores. Figure 7 shows the quiz scores. Figure 8 shows the test scores.

Grade level does not seem to affect the median much, but the spread of scores seems tighter for the ninth graders.

Ninth grade students taking Algebra I at this school are typically students

who are on the regular track in regard to their mathematics progression. Tenth

grade students taking Algebra I at this school have typically experienced some

difficulty and have taken an Introduction to Algebra math course as ninth graders

to work on skills prior to taking Algebra I.

The marks inside the boxes represent the means. The asterisks outside the

boxes represent outliers. One of the consistent outliers for the ninth grade

morning class was the student who suffered from family issues that occurred

sometime between the quiz and the test. The mean scores tend to be lower than

the median due to the low outliers.


Figure 6. Boxplots of entrance slip averages by grade level.


Figure 7. Boxplots of quiz scores by grade level.


Figure 8. Boxplots of test scores by grade level.

Interpretation of Results

Would daily entrance slips provide sufficient information to adjust

instruction to meet the needs of Algebra I students better than past assessments

(specifically homework checks) had?

The results of the journal and the survey indicate that not only did the

researcher believe that the entrance slips gave valuable input as to what the

students knew, but the students also reported that the help given in response to their errors gave them a better grasp of what they knew and guided them toward comprehension of the material.


I expected that the entrance slips would provide better insight than the

homework checks that were given in the past, and this was true as the project

unfolded. As time went on, the process became easier, including the administrative decisions I had to make. Students learned more than whether they got a problem wrong, as was the case with homework checks; rather, they learned what, and more importantly why, they got it wrong. Quality versus quantity was at play here: four quality questions asked frequently, with feedback, were better than 25 questions asked all at once later.

Even when I was gone two of the days during this project, the entrance

slips were administered correctly and the results helped me to better prepare for

the issues that needed to be addressed upon my return. While journaling, I did not

hold back any comments or thoughts as this project went on. In an effort to decide

if this would be a worthwhile form of assessment in the future, all thoughts, good

and bad, were written down and considered.

Would frequent feedback give my Algebra I students a better sense of

what they know than past assessments (specifically homework checks) had?

The results of the survey showed a positive student response about the

effectiveness of the entrance slips. The positive correlation between the

summative test scores and the entrance slips showed that high test scores were


related to high entrance slip averages. Those who did well on the entrance slips

successfully applied their knowledge and skills to the test.

I expected that the one-on-one feedback I provided in response to the entrance slip results would give students a better sense of what

they knew, but I was not sure if they would appreciate the feedback as much as

the survey comments reflected.

Due to the anonymity of the survey, the students could express their opinions freely, making the results of the survey more trustworthy. Even the

student who stated in the survey that she was absent many of the days of this unit

found the entrance slips to be beneficial.

Summary

The positive response by both the students and me leads to the conclusion

that this type of formative assessment with one-on-one feedback on a consistent

basis is beneficial to students' learning. Triangulation occurred as the entrance

slip score improvements, the survey results, and the journaling all led to positive

results for both research questions. The last chapter of this project includes final

conclusions and thoughts about this project.

Chapter Five

Conclusions, Action Plan, Reflections, and Recommendations

“Intentions are fine, but they will not impact results unless and until they

are translated into collective commitments and specific concrete actions” (Eaker

et al., 2007, p. 17).

Conclusions

How do you know when they know it? Due to the positive response to the entrance slips during this project, entrance slips are the concrete action I plan to

continue in the future to help me and my students “know when they know it.” All

forms of data revealed a positive attitude about the two research questions,

“Would frequent feedback give my Algebra I students a better sense of what they

know than past assessments (specifically homework checks) had?” and “Would

daily entrance slips provide sufficient information to adjust instruction to meet the

needs of Algebra I students better than past assessments (specifically homework

checks) had?” The researcher made the following conclusions based on each

method of data collection.

Daily entrance slips. Table 1 showed the total scores for each entrance

slip. The mean scores followed a trend consistent with an assessment that is

guiding instruction. When new essential learnings were introduced, a baseline grade was established. Each entrance slip that followed, assessing the same essential learning, produced a gain in mean points over the previous slip. Instructional changes and one-on-one feedback

contributed to the gain in mean points.

The overall mean score for the entrance slips was 2.89 points out of 4.0. I was "picky" while grading the entrance slips. Small details missed produced lower scores. I believed students needed to pay attention to all the details so that when the quiz and test were given, they had learned from their mistakes.

Figure 1 showed a positive correlation between the entrance slip grades

and the test scores. The higher the entrance slip grade, the higher the test score.

The outliers indicated that those students who scored a two or lower on the

entrance slips tended to fail the test.

Survey. One of my concerns before this project began was whether

entrance slips would take too much time out of an already busy class period. The

majority of my students, 91.8%, stated that it did not take too much time out of class.

At least for this unit, I too found myself feeling like it was not taking too much

time out of class. The time it did take was well worth it to me as well as to the

students. As one student put it on the survey, "I think it's definitely worth the time

it takes.”

The most overwhelming response came from the issue of one-on-one

feedback. A large majority of my students, 98.9%, felt that the one-on-one help

benefited them. As one student stated on the survey, “The entrance slip made me


realize what I am doing and I liked when Mrs. Hodenfield explained it to me one-

on-one.”

I believe this response, along with the 89.3% of students who stated they

would prefer to stay with entrance slips instead of homework checks, answers the

research question quite well: “Would frequent feedback give my Algebra I

students a better sense of what they know than past assessments (specifically

homework checks) had?” Yes, the students felt that the frequent feedback was

better than homework checks.

The entrance slips seemed to be viewed as a review or practice in many

students' eyes. Thirteen students mentioned that it was a good refresher, extra

practice, and a good review or recap of the material. “It reviewed what I learned

the other day and taught me what I didn't know" was how one student described

it.

A few students talked about what the entrance slips forced. “It forced you

to become more aware of the curriculum” and “We did them every day and were

always different questions so it forced me to be on my toes” were two of the

comments on this theme.

For me, reading the surveys had some “feel good” moments and some

“feel bad” moments. The following are some of the “feel good” comments: “They

were fun;” “They got my brain working. I did math every day!” and “They are


cool.” My favorite comment was, “I felt like I worked more in class, which was

enjoyable and more beneficial.”

I would be remiss if I did not mention some of the negative comments

made by students. It is hard to please everyone, and their voices were heard and

carefully considered. "They are kinda pointless." "I kinda don't like these." "I don't think they made a huge difference." "I think I could do fine without them."

The most thought-provoking negative comment for me came from a student who said, "The entrance slip didn't do much if I didn't understand it before." I like to think that comment followed the same theme as this one: "The slips only checked to see if I had learned the lesson; they didn't teach me anything." Only with good personalized feedback can the entrance slip benefit those who do not grasp the concepts being assessed. I believe the following student nailed the point of this discussion with this comment, "It wasn't

the slips themselves, but the feedback from Mrs. Hodenfield that I found helpful.”

On the subject of homework, many student comments showed great insight. “I would rush to get homework done and not check my answers so the slips with the one-on-one confrontation helped me understand.” My favorite homework quote came from a student who said, “Ironically, I actually bothered to do more of my homework this chapter in the hopes that I wouldn’t fail all the entrance slips.”


As for whether the entrance slips made students more prepared for the quiz and test, student responses varied. Because they were allowed to use the entrance slips while taking the quiz and test, many students had a good reference at hand. Others felt the entrance slips helped prepare them by giving them extra practice.

One student commented, “I came in feeling prepared.” Entrance slips provided awareness for students. Many students came to class knowing that they knew the material for the quiz or test. On the flip side, other students came to class knowing that they did not know the material, based on their entrance slip results. The test and quiz results should not have been a surprise to anyone.

Journal. Emotions ranged widely during this project. There was frustration trying to get parent forms back and excitement giving the first entrance slip. I worried about how the entrance slips were going, and I was irritated when students were not quiet while I provided one-on-one feedback. Reactions ranged from happy kids who enjoyed this new way of finding out what they knew to bored kids who didn’t want yet another entrance slip to show their lack of understanding, and from parents who did not understand the purpose of the entrance slips and gave students grief about not performing well on them to students who were frustrated that they could not “retake” an entrance slip. Students were cheering, yes, actually cheering, when they correctly answered the entrance slip questions, and one student made the huge admission that even though the entrance slips were a pain, they were helping him understand the concepts. As a teacher, I was exhausted some days as I tried to get around the room providing the feedback necessary for student success. This exhaustion was counterbalanced by a great feeling of satisfaction that I knew what my students understood. That knowledge led to concerns about what to do with those students who did not understand the concepts. I had questions as to whether I should put the students who did not understand into intervention now or later.

Intervention at this school is assigned to students who need additional

help. They are scheduled into after school intervention by the teacher or

counselor. If a student is failing, then intervention is mandatory. If a student is not

failing but is behind or struggling, a teacher may assign intervention. Once the

student is passing and/or the teacher feels that student is at an acceptable level of

understanding, intervention is then dropped for that student.

The intervention decisions were the toughest. Not all students confess that they do not understand the information being taught. I didn’t want to jump to conclusions too soon, as some students need more time to comprehend new material. Ignorance is bliss; awareness requires action. Some sort of rule or guideline needed to be established to help make those tough decisions about when intervention should be assigned to struggling students.


At the beginning of this unit, the amount of time invested was large as I decided how to incorporate the entrance slips. Grading decisions and decisions about what each entrance slip really needed to assess topped the list of time-consuming activities. As time went on, those decisions were made more quickly as I made the process work for me and my students. Every classroom is different, and a teacher must make this method “their own” in a way that is comfortable for all.

I was concerned about being overwhelmed by the amount of time that correcting these entrance slips on a daily basis would take. I quickly realized that the time required was very manageable.

Parents need to be brought on board right away. Parental education is crucial so kids aren’t getting in trouble at home for poor entrance slip grades. If the pressure of entrance slip grades is put on kids, cheating, stress, and other negative issues begin. During this project I entertained the idea of not reporting the entrance slip scores, but later decided I needed the support and awareness of parents.

I considered having some sort of recording document for students to use to monitor their entrance slip grades. One possibility was a grid on which a line graph could be created, using the essential learnings as the horizontal axis and the entrance slip score as the vertical axis. I decided that creating these efficiently would be a time-consuming task the first year, but once it is done, it will be easy to incorporate year after year.

The flashbacks that were included on the entrance slips proved to be a good review for the final test. Retention in math is important, and flashbacks were a good way to work on it. Stating the chapter from which the flashback question originated is a change that will have to be made on future entrance slips. When students were told the flashback’s chapter, they were able to go back in their notes and refresh their memories on the skill being assessed.

As I wound down my thoughts from the journal, I realized that instead of “entrance slips,” these assessments should be called “Awareness Slips.” With this new name, maybe parents and students would better understand the point of this formative assessment. How do I know when my students know it? How do I become more aware of what is going on in their brains? Consistent, frequent formative assessments such as entrance slips are a definite step in the right direction.

Quiz, test, and homework completion. It has always been my understanding that the amount of homework a student should do varies by student. The goal is to get a student to automaticity. A student knows how to perform a skill when it can be done automatically, without spending large amounts of time figuring it out. Some students are naturally gifted and reach automaticity quickly. Other students need more time and effort to get there. As with any skill, if you don’t use it, you lose it, so to retain automaticity, you must revisit the skill often. It is with this thought in mind that I chose not to include the homework completion grade in the students’ grades. Having data on how many problems were completed provided me with the information necessary to make an informed decision as to whether a student had done enough practice to be successful at a task. If homework is not getting done and quiz and test scores are below acceptable levels, then more practice is necessary. On the other hand, if homework is not getting done but quiz and test scores are satisfactory, then all is well, to my way of thinking, as that student does not need as much practice as others. If another student is completing the homework and is still not performing well on the quiz and test, then instruction needs to change. When homework completion was compared to test scores, a positive correlation was found between the amount of homework completed and the test scores.

The homework completion average was 77.05%, the quiz average was 90.07%, and the test average was 86.62%. The numbers tell me that for most students, even if they didn’t completely finish their homework, they performed well on the quiz and test.


Action Plan

I intend to continue the use of entrance slips as one of many ways to formatively assess my students. I will modify a few things in an effort to “tweak” this method of formative assessment. I will not force all entrance slips to be the same size, each having four problems. Keeping each entrance slip the same size sometimes required extra effort or was inconvenient; it was done for this project for the purpose of comparing data. I will continue to use the same number of points by making the questions cover four tasks that I want to assess. This may be accomplished by having one or two problems with multiple parts at times.

I will begin using entrance slips right away at the start of next school year and will discontinue the use of homework checks as a way to determine what my students know. Early student and parental education will be crucial. Many students will need to get used to this way of being assessed. Taking the focus off the grade on their homework and putting it on what they know and don’t know will take time for students and parents to get used to.

Homework checks were snapshots of what my students knew. Not only were they untimely, telling me what my students knew after too many days had passed, but they also did not tell me how much homework my students had done, which meant they did not tell me whether students were working as hard as they needed to in order to succeed. These two problems were the main reasons for trying this new way of assessing my students.

On the other hand, completion checks serve a different purpose. They are designed strictly to find out how much homework students have attempted. I will continue to exclude the homework completion scores from students’ grades but will continue to report them in order to identify any students who need to practice more due to lower quiz and test scores. I will not record the flashback score but will ask the students to track their progress on the entrance slip grades as well as the flashback questions. This will be done on a document provided to the students, which will be collected and used during parent-teacher conferences. Not only will self-tracking let students see any trends they may have, but it will also make them aware of any missing entrance slips that need to be made up.

Students who choose to come in for help on any essential learnings will be given another chance to take a new entrance slip over the same material. As the research indicated, struggling students need more documentation of the levels of their abilities. The new score will replace the old score. If this proves to be too overwhelming, this policy may need to change. A folder will be required for students to keep their entrance slips and tracking sheet. Once a unit is completed, the students will staple everything together and put it aside until the semester test. The entrance slips will become a good review for the semester test.


Because of this project, I have no doubt that entrance slips give me and

my students a better sense of what they know on a consistent basis. My new

addition to the action plan I used in this research will be a recording document

that students will use to track their progress on the entrance slips. I plan to use this

document at parent-teacher conferences as a tool to show parents what their child

knows.

Reflections and Recommendations for Other Teachers

Our district’s motto is “Empowering students to succeed in an ever changing world.” Improving as a teacher is a constant challenge for every instructor. As the years go by, the world changes, students change, and teachers must change with them. Forms of assessment used in the past may no longer work in the present and future. Formative assessment makes sense. The focus of assessment should be on learning, not grading. No one wants to be graded while trying to learn something new. We all want time to “figure it out” before a grade is given, whether in life skills or educational skills. Kids today are under so many more pressures than in the past. Taking some pressure off them while they are learning so they CAN learn has to be beneficial.

Consistency is another important issue here. Without consistent feedback, it is difficult to learn. Consistent feedback thrives on making personal connections with students so that trust can be built. All of these issues come into play with the use of entrance slips.

The use of entrance slips came with excitement and frustration. The

energy level necessary for this form of assessment is high. Some days I wanted to

blow off the entrance slips and just teach my lesson and let them work. To keep

me motivated and to remind me how much students appreciated the personal

connection, I am going to enlarge and laminate some of the positive student

responses regarding the one-on-one help I gave them during this project. I will put

them around my desk so I can see them every day.

My advice to anyone who wants to try this form of assessment is this: be flexible, be eager to make personal connections with kids, and stay organized. Correcting did not become an issue for me, as the small number of problems on the entrance slips did not take too long to grade. Take baby steps. Teachers could try it for one unit and see how they and their students feel. They could conduct their own action research projects and use my survey, or simply have an open discussion with the students about how they feel about being assessed this way. Of course there will be some students who do not like it; some would simply prefer to be left alone. My favorite comment came from a kid who told me that even though the entrance slips were a “pain,” they helped him learn. Step outside the box and try it!


Summary

Nothing is perfect, including this method of formatively assessing students’ Algebra I knowledge. Still, the benefits of entrance slips far outweigh the drawbacks. Seeing is believing, and as the results have shown, entrance slips made a positive difference for many of my students. As the teacher in all of this, I found that entrance slips also made a positive difference for me.

The most telling results in all of this were the survey results. How the students felt was the most important data to me. Many factors influence the scores—family issues, attendance, motivation—the list goes on, but the survey results and comments had nothing to do with any of that. The survey responses were students’ personal impressions of how this project impacted how they learned. Since 98% of my students told me that they benefited from my one-on-one help, I am going to continue doing just that. When students are comfortable and happy in a classroom setting, more learning can be accomplished. Maybe some scores were not awesome, but if I can create a happier classroom, I am going to do it. I believe entrance slips can contribute to that good atmosphere. As one of my students put it, “I felt like I worked more in class, which was enjoyable and more beneficial.” Enough said.


References

Ayala, C. C., Shavelson, R. J., Ruiz-Primo, M. A., Brandon, P. R., Yin, Y.,

Furtak, E. M., & Young, D. B. (2008). From formal embedded

assessments to reflective lessons: The development of formative

assessment studies. Applied Measurement in Education, 21, 315-334.

doi: 10.1080/08957340802347787

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment

in Education, 5(1), 7-75. doi: 10.1080/0969595980050102

Black, P., & Wiliam, D. (2003). In praise of educational research: Formative

assessment. British Educational Research Journal, 29(5), 623-637.

doi: 10.1080/0141192032000133721

Bligh, J. (2001). Assessment: The gap between practice and theory. Medical

Education, 35, 312. doi: 10.1046/j.1365-2923.2001.00933.x

Clark, I. (2008). Assessment is for learning: Formative assessment and positive

learning interactions. Florida Journal of Educational Administration &

Policy, 2(1), 1-16.

Connor-Greene, P. A. (2000). Assessing and promoting student learning: Blurring

the line between teaching and learning. Teaching of Psychology, 27(2),

84-88. doi: 10.1207/S15328023TOP2702_01


Dineen, F. N. (1992). The effect of testing frequently upon the achievement of students in high school mathematics courses. School Science and Mathematics, 89(3), 197-200. doi: 10.1111/j.1949-8594.1989.tb11910.x

Doty, G. (2008). Focused assessment: Enriching the instruction cycle. Bloomington, IN: Solution Tree.

Driscoll, M. (1999). Crafting a sharper lens: Classroom assessment in

mathematics. In M. Solomon (Ed.), The diagnostic teacher. Constructing

new approaches to professional development (pp. 78-103). New York:

Teachers College Press.

DuFour, R., DuFour, R., Eaker, R., & Karhanek, G. (2004). Whatever it takes:

How professional learning communities respond when kids don’t learn.

Bloomington, IN: Solution Tree.

Eaker, R., DuFour, R., & DuFour, R. (2007). A leader’s companion. Bloomington, IN: Solution Tree.

Fluckiger, J., Tixier y Vigil, Y., Pasco, R., & Danielson, K. (2010). Formative feedback: Involving students as partners in assessment to enhance learning. College Teaching, 58(4), 136-140. doi: 10.1080/87567555.2010.484031

Foegen, A. (2008). Algebra progress monitoring and interventions for students

with learning disabilities. Learning Disability Quarterly, 31(2), 65-78.


Frey, B. B., & Schmitt, V. L. (2007). Coming to terms with classroom assessment. Journal of Advanced Academics, 18(3), 402-423. doi: 10.4219/jaa-2007-495

Gareis, C. R., & Grant, L. W. (2008). Teacher-made assessments. Larchmont, NY: Eye On Education.

Haigh, M. (2007). Sustaining learning through assessment: An evaluation of the value of a weekly class quiz. Assessment & Evaluation in Higher Education, 32(4), 457-474. doi: 10.1080/02602930600898593

Harlen, W. (2005). Teachers’ summative practices and assessment for learning – tensions and synergies. The Curriculum Journal, 16(2), 208. doi: 10.1080/09585170500136093

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational

Research, 77(1), 81-112. doi: 10.3102/003465430298487

Kika, F. M., McLaughlin, T. F., & Dixon, J. (1992). Effects of frequent testing of

secondary algebra students. Journal of Educational Research, 85(3), 159-

162. doi: 10.1080/00220671.1992.9944432

Marzano, R. J. (2006). Classroom assessment & grading that work. Alexandria,

VA: Association for Supervision and Curriculum Development.

Marzano, R. J. (2010). Formative assessment & standards-based grading.

Bloomington, IN: Solution Tree.


National Council of Teachers of Mathematics. (1989). Curriculum and evaluation

standards for school mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics. (1991). Professional standards for teaching mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics. (2005). Mathematics assessment sampler, grades 9-12. Reston, VA: Author.

Newton, P. E. (2007). Clarifying the purposes of educational assessment. Assessment in Education, 14(2), 149-170. doi: 10.1080/09695940701478321

O’Connor, K. (2007). A repair kit for grading: 15 fixes for broken grades. Portland, OR: Educational Testing Service.

Schmidt, W. H., McKnight, C. C., & Raizen, S. A. (1996). A splintered vision: An investigation of U.S. science and mathematics education: Executive summary. Lansing, MI: U.S. National Research Center for the Third International Mathematics and Science Study, Michigan State University.

Shirvani, H. (2009). Examining an assessment strategy on high school

mathematics achievement: Daily quizzes vs. weekly tests. American

Secondary Education, 38(1), 34-45.

Stiggins, R. (2002). Assessment crisis: The absence of assessment for learning.

Phi Delta Kappan, 83(10), 758-767.


Stiggins, R., & Bridgeford, N. (1985). The ecology of classroom assessment. Journal of Educational Measurement, 22(4), 271-286. doi: 10.1111/j.1745-3984.1985.tb01064.x

Tomlinson, C. A., & McTighe, J. (2006). Integrating differentiated instruction & understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.

Watt, H. M. G. (2005). Attitudes to the use of alternative assessment methods in

mathematics: A study with secondary mathematics teachers in Sydney,

Australia. Educational Studies in Mathematics, 58(1), 21-44.

Young, V. M., & Kim, D. H. (2010). Using assessments for instructional

improvement: A literature review. Education Policy Analysis Archives,

18(19), 1-37.

Appendices


Appendix A

Parental/Guardian Consent Form

Invitation to Participate

Your student is invited to participate in a study being conducted by Colleen

Hodenfield, a Minot State University graduate student and teacher at Central

Campus High School. The study will incorporate the use of a daily entrance slip.

Entrance slips are a type of assessment in which students respond to questions relating to the prior day’s lesson. Students will be given a chance to ask questions at the beginning of the hour to clear up any misunderstandings prior to being given the entrance slip.

Basis for Subject Selection

Your student has been selected because he or she is in my Algebra I class and will

participate in my project by completing daily entrance slips, homework, a mid-

chapter quiz, a unit test, and a survey at the end of the unit. In the event you do

not want your child to participate in this project, the student’s data will not be

included in my results and your student will not complete the survey, but your

student will still be asked to complete the entrance slips, homework, quiz, and test

since these are regular parts of my course. The data collected from my Algebra I

students will be used to determine whether this method of tracking student knowledge helped my students learn better than previous methods.

Purpose of the Research

I am currently completing work towards my Master of Education degree through

Minot State University. For my final degree requirement, I am conducting an

action research project for approximately five weeks during the fourth quarter to

determine whether the use of entrance slips is more beneficial than homework checks. An entrance slip is a type of assessment in which your student responds to questions relating to the prior day’s lesson and gets feedback about their solutions.

Specific Procedures

This study involves students in my four Algebra I classes who will complete daily

entrance slips as a way for us to assess their understanding of the essential

learnings being discussed. Essential learnings are the state standards found to be

most important for Algebra I. I will use daily entrance slips in place of homework


checks to determine whether there is any relationship between the entrance slip

grades and the grades received on the quiz and test. I will also keep a journal of

my impressions of the effectiveness of this project. Students will be surveyed at

the end of the project to get their perceptions of entrance slips versus homework

checks. My results will be summarized and included in my research paper. None

of the students in my class will be identified by name in my results. The principal,

Mr. Altendorf of Minot High School Central Campus, and Mr. Marquardt,

assistant superintendent of Minot Public School District, have both approved this

research study.

Duration of Participation

Your student will participate in entrance slips for approximately five weeks

during quarter four of the academic school year. He or she will complete a survey

at the conclusion of this project.

Benefits to the Individual

There are no direct benefits in participating in this study. Not participating in this

project will only mean the data from your student will not be included in the final

results.

Risks to the Individual

The risks to your student are no more than he/she would encounter in a regular

classroom setting.

Confidentiality

All data will be treated confidentially by the researcher. Names of participants and their data sets will be kept in a locked filing cabinet in the researcher’s office or on a password-protected computer and will be destroyed once the project has been defended and approved. The researcher agrees to maintain strict confidentiality, which means your student’s name will not be discussed with or divulged to anyone outside of this research project. The researcher will also make sure confidential information is not discussed in an area where it could be overheard, which would allow an unauthorized person to associate or identify the student with such information.

Voluntary Nature of Participation

During this study, your student’s survey and scores do not have to be included in the final results. However, I hope you approve of your student being involved in this study because a large sample size improves the accuracy of the results. If you decide to allow your child to participate, you are free to withdraw your consent at any time. If you want to withdraw your consent, write me a short note about why you want to withdraw and sign it. This will in no way affect your student’s grade. If you do not consent or if you withdraw your consent, your student’s data will not be included in my results and your student will not complete the survey, but your student will still be asked to complete the entrance slips, quiz, and test since these are regular parts of my course.

Human Subject Statement

The Institutional Review Board of Minot State University has given me

permission to conduct this research. If you have questions regarding the rights of

human research subjects, please contact the Chairperson of the MSU Institutional

Review Board (IRB), Brent Askvig at 701-858-3051 or

[email protected].

Offer to Answer Questions

If you have any questions or concerns now or during the study, feel free to contact

me at 701-857-4660 or email me at [email protected], Mr.

Keith Altendorf at 701-857-4660, or Mr. Arlyn Marquardt at 701-857-4423.

Consent Statement

You are voluntarily making a decision whether or not your student can participate

in this study. With your signature below, you are indicating that upon reading and

understanding the above information, you agree to allow your student’s survey, entrance slip results, and end-of-chapter test results to be used in this study.

You will be given a copy of the consent form to keep.

______________________________________________________________

Participant (Please Print Student’s Name)

_____________________________________________ ________________

Signature of Parent or Guardian Date

_____________________________________________ ________________

Signature of Researcher Date


Appendix B

Youth Assent Letter

Invitation to Participate

You are invited to participate in a study being conducted by Colleen Hodenfield, a

Minot State University graduate student and teacher at Central Campus High

School. The study will incorporate the use of a daily entrance slip. Entrance slips

are a type of assessment in which students respond to questions relating to the prior day’s lesson. You will be given a chance to ask questions at the beginning of the hour to clear up any misunderstandings prior to being given the entrance slip.

Basis for Subject Selection

You have been selected because you are in my Algebra I class. You will complete

daily entrance slips, homework, a mid-chapter quiz, a unit test, and a survey at the

end of the unit. By choosing to participate, you are allowing me to use these

scores as part of my research. The data collected from you and the rest of your

classmates in my Algebra I classes will be used to determine if entrance slips

helped you learn better than previous methods. You will also be asked to

complete a survey at the end of the unit to determine any changes you have

noticed.

Purpose of the Research

I am currently completing work towards my Master of Education degree through

Minot State University. For my final degree requirement, I am conducting an

action research project for approximately five weeks during the fourth quarter to

determine whether the use of entrance slips is more beneficial than homework checks. An entrance slip is a type of assessment in which you respond to questions relating to the prior day’s lesson.

Specific Procedures

During this study you will be asked to complete daily entrance slips as a way for

you and me to assess your understanding of the essential learnings being

discussed. Essential learnings are the state standards found to be most important

for Algebra I. I will use daily entrance slips in place of homework checks to

determine which one helped you to be better prepared for your mid-chapter quiz

and unit test. I will also keep a journal of my impressions of the effectiveness of

this project. You will be surveyed at the end of the project to get your perceptions

of entrance slips versus homework checks. My results will be summarized and

included in my research paper. No one in the class will be identified in my results.


The principal, Mr. Altendorf of Minot High School Central Campus, and Mr.

Marquardt, assistant superintendent of Minot Public School District, have both

approved this research study.

Duration of Participation

You will participate in entrance slips during quarter four for approximately five

weeks of the academic school year. You will be expected to complete a survey at

the conclusion of this project. Your name will not be included on the survey.

Benefits to the Individual

There are no direct benefits in participating in this study, but participation will

give you quick results and explanations, which can be used to help you learn the

information better.

Risks to the Individual

The risks to you are no more than you would encounter in a regular classroom

setting.

Confidentiality

All data will be treated confidentially by the researcher. Names of participants and their data sets will be kept in a locked filing cabinet in the researcher’s office and will be destroyed once the project has been defended and approved. The researcher agrees to maintain strict confidentiality, which means your name will not be discussed with or given to anyone outside of this research project. The researcher will also make sure confidential information is not discussed in an area where it could be overheard, which would allow an unauthorized person to associate or identify you with such information.

Voluntary Nature of Participation

During this study you may decide not to have your survey and scores included in

the final results. However, I hope you approve of being involved in this study

because more students’ data improves the accuracy of the results of my study. If

you decide to participate, you are free to withdraw your consent at any time. If

you want to withdraw your consent, write me a short note about why you want to

withdraw and sign it. This will in no way affect you or your grade. If you do not

consent or withdraw your consent, your data will not be included in my results

and you will not complete the survey, but you will still be asked to complete the

entrance slips since these are a regular part of my course.


Human Subject Statement

The Institutional Review Board of Minot State University has given me

permission to conduct this research. If you have questions regarding the rights of

human research subjects, please contact the Chairperson of the MSU Institutional

Review Board (IRB), Brent Askvig at 701-858-3051 or

[email protected].

Offer to Answer Questions

If you have any questions or concerns now or during the study, feel free to contact

me at 701-857-4660 or email me at [email protected], Mr.

Keith Altendorf at 701-857-4660, or Mr. Arlyn Marquardt at 701-857-4423.

Consent Statement

You are voluntarily making a decision whether or not to participate in this

study. With your signature below, you are indicating that upon reading and

understanding the above information, you agree to allow your survey, entrance

slip results, and end of the chapter test results to be used in this study. You will be

given a copy of the consent form to keep.

______________________________________________________________

Participant (Please Print Student’s Name)

_____________________________________________ ________________

Signature of Student Date

_____________________________________________ ________________

Signature of Researcher Date

______________________________________________________________


Appendix C

Principal Letter

Dear Mr. Altendorf:

I am completing work toward the Master of Education degree through

Minot State University. As a degree requirement, I am to conduct a research

project in my classroom during the fourth quarter this year. I am planning to

implement daily entrance slips which will formatively assess essential learnings in

algebra. To accomplish this, I would like to work with the students in my algebra

classes. An entrance slip is a type of assessment where students respond to

questions relating to the prior day's lesson.

This study involves students in my four Algebra I classes who will

complete daily entrance slips as a way for me and my students to formatively

assess their understanding of the essential learnings being discussed. I will use

daily entrance slips in place of homework checks to determine which one has a

more positive impact on students' mid-chapter quiz grades and unit test grades. I

will also keep a journal of my impressions on the effectiveness of this project.

Students will be surveyed at the end of the project to get their perceptions of

entrance slips versus homework checks. My results will be summarized and

included in my research paper. Classroom and student confidentiality will be

observed regarding all data collected, and no individual will be identified by name.

Before the study begins, I will send home consent forms for

parents/guardians to notify them of this project and request their permission

allowing their student to participate in the research study. A copy of this letter is

attached for your inspection.

I am requesting that you permit me to carry out this research in my

classroom. Please contact me if you have any questions. Thank you for your

consideration.

_______ I grant permission for Colleen Hodenfield to conduct the above

mentioned research in her classroom.

_______ I DO NOT grant permission for Colleen Hodenfield to conduct the

above mentioned research in her classroom.

___________________________________________ ______________________

Signature of Mr. Keith Altendorf Date


Appendix D

Assistant Superintendent Letter

Dear Mr. Marquardt:

I am completing work toward the Master of Education degree through

Minot State University. As a degree requirement, I am to conduct a research

project in my classroom during the fourth quarter this year. I am planning to

implement daily entrance slips which will formatively assess essential learnings in

algebra. An entrance slip is a type of assessment where students respond to

questions relating to the prior day's lesson. To accomplish this, I would like to

work with the students in my algebra classes.

This study involves students in my four Algebra I classes who will

complete daily entrance slips as a way for me and my students to formatively

assess their understanding of the essential learnings being discussed. I will use

daily entrance slips in place of homework checks to determine which one has a

more positive impact on students' mid-chapter quiz grades and unit test grades. I

will also keep a journal of my impressions on the effectiveness of this project.

Students will be surveyed at the end of the project to get their perceptions of

entrance slips versus homework checks. My results will be summarized and

included in my research paper. Classroom and student confidentiality will be

observed regarding all data collected, and no individual will be identified by name.

Before the study begins, I will send home consent forms for

parents/guardians to notify them of this project and request their permission

allowing their student to participate in the research study. A copy of this letter is

attached for your inspection.

I am requesting that you permit me to carry out this research in my

classroom. Please contact me if you have any questions. Thank you for your

consideration.


_______I grant permission for Colleen Hodenfield to conduct the above

mentioned research in her classroom.

_______I DO NOT grant permission for Colleen Hodenfield to conduct the above

mentioned research in her classroom.

___________________________________________ ______________________

Signature of Mr. Arlyn Marquardt Date


Appendix E

Student Survey

Instructions: For each statement, check the most appropriate box.

1. My Algebra I class is at the following time of day.

□ Before lunch □ After lunch

2. Indicate your grade.

□ Grade 9 □ Grade 10

3. Indicate your gender.

□ Male □ Female

4. I am on an IEP (Individualized Education Plan).

□ Yes □ No

5. I am in High School Prep.

□ Yes □ No

6. The grade on my entrance slips was generally a:

□ 1 □ 2 □ 3 □ 4 □ Varied from day to day

Instructions: Please respond to each of the following statements. Indicate your

level of agreement by circling the appropriate response on the right and

explaining your answer.

SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree

Question SA A D SD

7. The entrance slips took too much time out of class.

Explain your answer: SA A D SD


8. The entrance slips made me learn the essential

learnings better.

Explain your answer: SA A D SD

9. The entrance slips made me more prepared for the mid-chapter quiz.

Explain your answer: SA A D SD

10. The entrance slips made me more prepared for the

chapter 10 test.

Explain your answer: SA A D SD

11. The entrance slips made me nervous.

Explain your answer: SA A D SD

12. The entrance slips helped me understand what I was

doing better than homework checks.

Explain your answer:

SA A D SD

13. Mrs. Hodenfield's one-on-one help on what I did

wrong on the entrance slips was more beneficial than

homework checks.

Explain your answer:

SA A D SD

14. If I had a choice, I would stay with entrance slips

instead of homework checks.

Explain your answer:

SA A D SD


15. I check my homework assignments often to make sure

I have done the problems correctly.

Explain your answer:

SA A D SD

16. I like doing homework better than doing entrance

slips.

Explain your answer:

SA A D SD

17. The homework assignments helped me prepare for the

chapter 10 test.

Explain your answer:

SA A D SD

18. I did less homework during chapter 10 than I did in

the past chapters in Algebra.

Explain your answer: SA A D SD

19. I am more aware of what the essential learnings are

for Algebra I.

Explain your answer: SA A D SD


Appendix F

Entrance Slip #1


Appendix G

Entrance Slip #2


Appendix H

Entrance Slip #3


Appendix I

Entrance Slip #4


Appendix J

Entrance Slip #5


Appendix K

Entrance Slip #6


Appendix L

Entrance Slip #7


Appendix M

Entrance Slip #8


Appendix N

Quiz Sections 10.1-10.3


Appendix O

Test Chapter 10


Appendix P

IRB Approval Letter