PHS CARP Stacey, Cameron, Matt, Cat, Brennan

Short and sweet


Page 1: Short and sweet

PHS CARP: Stacey, Cameron, Matt, Cat, Brennan

Page 2: Short and sweet

Introduction

Population:
• Patuxent High School in Lusby, MD (Calvert County)
• Biology, Social Studies, English

Need: According to Principal Highsmith and the SIP...

Students have difficulty taking tests because they don’t understand the language or intent of questions

Purpose:
• To determine if explicitly teaching test-taking strategies will improve students' test-taking skills

Page 3: Short and sweet

Research Questions

1. Did our strategies help students become better test takers?

2. Do students use the strategies that we suggest?

Page 4: Short and sweet

Strategy, Rationale, and Justification

1. Key Words: Circle the important words that show what the question is asking.

2. Rephrase Test Questions: The student puts the question in his or her own words.

Our strategies may improve test-taking ability across all disciplines, which may lead to improvement on standardized tests.

Peer-reviewed research supports the specific test-taking strategies that we have chosen (Chittooran & Miles, 2001).

Page 5: Short and sweet

Methods

• Administer a pre-test of test-taking strategies
• Explicitly teach the strategies (one per week)
• Continue using the strategies throughout the week(s)
• Observe classes and collect student work to look for strategy use
• Administer a post-test
• Run t-tests to check for significance

Page 6: Short and sweet

Data Collection Plan

• Pre/Post Assessment: a survey using a Likert scale, addressing "Did our strategies help students become better test takers?"

• Student Work: looking for evidence of strategy use in student work (e.g., circling, highlighting, underlining), addressing "Do students use the strategies that we suggest?"

• Shadowing: shadowing and taking observational notes as students work independently or take tests, looking for evidence of strategy use on tests and quizzes (e.g., circling, highlighting, underlining)

Page 7: Short and sweet

Pre/Post Assessment

Used a Likert-scale questionnaire and configured our survey results so that a "5" was always positive.

Sample question:
1. How often do you circle or underline parts of a test question to help you answer that question?

1 – Never
2 – Not often
3 – Sometimes
4 – Very often
5 – Always
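Configuring results so that a "5" is always positive amounts to reverse-scoring any negatively worded items before summing. A minimal sketch in Python; the item flags and responses below are hypothetical, not taken from the actual survey:

```python
# Reverse-score negatively worded Likert items so that a "5" is
# always the positive end of the scale.

def reverse_score(response, scale_max=5):
    """Map 1<->5, 2<->4, etc. on a 1..scale_max Likert scale."""
    return scale_max + 1 - response

# Hypothetical three-item survey: True flags a negatively worded item.
negatively_worded = [False, True, False]
raw_responses = [4, 2, 5]  # one student's raw answers

configured = [reverse_score(r) if neg else r
              for r, neg in zip(raw_responses, negatively_worded)]
print(configured)  # prints: [4, 4, 5]
```

Reverse-scoring first keeps each student's summed total comparable across items, which is what makes pre/post totals suitable for comparison.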

Page 8: Short and sweet

Pre/Post Analysis

• Used an unpaired t-test to analyze the significance of differences in student responses between pre- and post-tests

• Compared both sets of test results for each class using a t-test (alpha level 0.05)

• Compiled data across all surveyed classes and compared pre and post results using a t-test (alpha level 0.05)
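As a sketch of this analysis step, here is a hand-rolled unpaired (Welch) t statistic in Python, with a normal approximation to the t distribution for the two-tailed p-value (adequate at the larger sample sizes in this study). The pre/post totals below are hypothetical, not the study's data:

```python
import math

def welch_t(a, b):
    """Unpaired (Welch) t statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Sample variances (n - 1 in the denominator).
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(va / na + vb / nb)  # standard error of the difference
    return (mb - ma) / se

def two_tailed_p(t):
    """Two-tailed p-value via a normal approximation to the t
    distribution (a rough overestimate of significance for small n)."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

# Hypothetical pre/post survey totals for one class.
pre  = [15, 16, 17, 18, 16, 17, 15, 18]
post = [17, 18, 19, 18, 17, 19, 18, 20]

t = welch_t(pre, post)
p = two_tailed_p(t)
print(round(t, 2), p < 0.05)  # prints: 3.13 True
```

In practice a library routine such as `scipy.stats.ttest_ind` (with `equal_var=False` for the unpaired Welch variant) would give the exact t-distribution p-value; the approximation here just keeps the sketch dependency-free.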

Page 9: Short and sweet

Class Data – Ms. Meyer

1st period standard biology
n = 15
Pre-test average = 18.5
Post-test average = 17.8
p > 0.05, so the treatment effect was not significant.

This means that implementing implicit and explicit instruction with our two test-taking strategies did not produce a significant effect.

Page 10: Short and sweet

Observations/Student Work

I observed very few students using the strategies during class work.

Results were slightly better when I looked at students' quizzes and observed students while testing, which was encouraging (e.g., "organelle" vs. "process").

Page 11: Short and sweet

Class Data – Mr. Stone

Data from 4 periods of Academic English
n = 76
Pre-test average = 16.50
Post-test average = 17.42
p = 0.034

Since p < 0.05, this means that implementing implicit and explicit instruction in our two test-taking strategies did produce a significant effect.

Page 12: Short and sweet

Observations/Student Work

I observed a moderate number of students using the strategies during daily class work.

However, during exams, when students were reminded of the strategies and encouraged to use them, I observed a greater number of students applying them.

Page 13: Short and sweet

Class Data – Mr. Davis

Data from 3 periods of Honors/Pre-AP World History
n = 72
Pre-test average = 17.6
Post-test average = 18.92
p = 0.14529
p > 0.05, so the treatment did not produce a significant result.

Note: a significant effect was found in my 6th period (p = 0.00405).

Page 14: Short and sweet

Observations/Student Work

I observed a moderate number of students using the strategies during class work. However, some of these students were already using them before explicit instruction.

I was unable to observe student behavior on tests and quizzes, as none were taken during the project's 4-week timeline.

Page 15: Short and sweet

Class Data – Ms. Holland

4th and 7th period AP Literature and 5th/6th double-period English Standard
N = 37
Pre-test average = 17.702
Post-test average = 16.783
p > 0.05, so the findings were not significant.

Implementing two explicit test-taking strategies did not produce significant results.

Page 16: Short and sweet

Observations/Student Work

Some students used strategy #1, but it is unclear whether they used this strategy prior to our intervention. Students were reluctant to even attempt strategy #2.

I observed some students in the English 10 Standard class use strategy #1 while taking Benchmark #1, a county-wide multiple-choice test.

I observed some AP Literature students using strategy #1 on the practice AP tests that I distributed to the class, but they only took vocabulary tests during the 3-week period of our intervention.

Page 17: Short and sweet

Class Data – Mr. Leischer

Data from 3 periods of Academic English
n = 55
Pre-test average = 16.62
Post-test average = 16.56
p = 0.47

Since p > 0.05, implementing implicit and explicit instruction in our two test-taking strategies did not produce a significant effect.

Page 18: Short and sweet

Observations/Student Work

I observed a small number of students using the strategies we taught. However, most of these students were already using similar strategies before instruction.

Observation of strategy use took place during a district-level benchmark exam and HSA-style question practice sessions.

Page 19: Short and sweet

Conclusions

We found that the overall effect of our treatment was not significant.

n = 256; p = 0.0581

Page 20: Short and sweet

Conclusions

It is unclear whether our test-taking strategies improved student performance.

Evidence is mixed about whether students use the strategies that we taught.

Because of the way we collected our data and differences in sample sizes, we were unable to do the question-by-question analysis needed to tease apart the data.

Page 21: Short and sweet

Implications for Research

• Repeat with more time and a greater n
• Is there a difference between subjects?
• Introduce the strategies in an earlier grade
• Compare the test scores of students who were taught and used the strategies against those of a control group