Science: It’s Elementary
Year One Evaluation Report

Eric R. Banilower

June 2007 (Updated October 2007)

Submitted to:
Reeny Davison
ASSET Inc.
2403 Sidney Street – Suite 800
Pittsburgh, PA 15203

Submitted by:
Horizon Research, Inc.
326 Cloister Court
Chapel Hill, NC 27514




TABLE OF CONTENTS

Acknowledgements
Executive Summary
Introduction
Overview of Science: It’s Elementary and the Evaluation
Developing a Coherent, Cohesive, and Aligned System to Support Elementary Science Education Reform
  Baseline Status of School Science Programs
  The Strategic Planning Institute and Vision Conference
The Quality of SIE Professional Development
  Introducing Teachers to Science: It’s Elementary
  Familiarizing Teachers with the Modules
  Preparing Teachers to Use Science Notebooks
  Developing the Content Storyline
  Developing the Pedagogical Storyline
Impact of SIE on Teachers and Teaching
  Teacher Preparedness
  Module Implementation
  Student Opportunity to Learn
Impact of SIE on Students
  Instrumentation
  The Sample
  Analysis and Results
Summary and Recommendations
Appendices
  A: District Curriculum Alignment Questionnaire and Summary of Data
  B: Teacher Questionnaire
  C: Selected Teacher Curriculum Survey Results
  D: Composite Definitions
  E: Regression Tables from Student Achievement Analyses

Horizon Research, Inc. i October 2007


ACKNOWLEDGEMENTS A number of people contributed to data collection, data analysis, and preparation of this report. Horizon Research, Inc. employees: Alison S. Bowes, Kimberley W. Cohen, Sherri L. Fulp, Shireline Green, Susan B. Hudson, Joan D. Pasley, Sheila D. Richmond, Sharyn L. Rosenberg, Melanie J. Taylor, Aaron M. Weis, Iris R. Weiss, and Justin Wiley. In addition, the following individuals assisted with data collection and analysis:

Jill Feldman, Marlene Hilkowitz, Vonda Johnson, and Melanie Wills from Research for Better Schools; Cara Ciminillo, Debra Moore, Cindy Tannanis, and Keith Trahan of the Collaborative for Evaluation and Assessment Capacity of the University of Pittsburgh School of Education; and Charlotte Kresge and Kalyani Raghavan, independent education consultants.

Special thanks are due to the staff at the schools participating in the Science: It’s Elementary program who took the time from their busy schedules to provide information about their experiences with the program, and who helped facilitate the various evaluation activities.



EXECUTIVE SUMMARY

The Science: It’s Elementary (SIE) program is overseen by the Pennsylvania Department of Education (PDE) and managed by ASSET Inc. SIE is an initiative aimed at improving elementary science instruction across the commonwealth of Pennsylvania. The program is focused on helping schools and districts implement an inquiry-based, hands-on science education program, with the ultimate goal of improving student learning. To accomplish its goals, SIE provides participating schools with module-based science instructional materials; teacher professional development around those modules and inquiry-based science teaching; and opportunities for strategic planning to help create supportive systems for science education reform. The program has six main components:

1. Strategic Planning Institute: Based on the National Science Resource Center’s LASER model, these institutes are intended to help schools and districts understand, plan for, and successfully implement an inquiry-based elementary science program. Schools participating in SIE sent teams composed of administrators, teachers, and community partners to the Strategic Planning Institute to learn about science education reform and to develop a three-year strategic plan for their school.

2. Vision Conference: This one-day event for school district and community leaders is intended to help foster a common, long-term vision for science education among a larger and more diverse group of stakeholders than the Strategic Planning Institute. Attendees included superintendents, district administrators, school board members, principals, and community representatives.

3. Curriculum Showcase: The curriculum showcase provides an opportunity for school personnel to become familiar with the module-based instructional materials being provided by the program so they can select materials that fit into their school curriculum.

4. Teacher Professional Development: Each teacher responsible for teaching one of the SIE-provided science modules is expected to attend an initial three-day training focused on that module. This training provides teachers with an opportunity to familiarize themselves with the instructional materials, receive practical tips for using them in their classrooms, and learn teaching strategies such as inquiry, questioning for higher learning, and integrating science and literacy through the use of science notebooks. In the second year of the program, teachers will be expected to attend a two-day follow-up training on the same module to learn more about the related science content and to reflect upon their initial implementation of the module.

5. Delivery of Classroom Science Materials: In addition to the professional development, the SIE program delivers the science modules directly to schools for teachers to use in their classrooms. After instruction with the modules is finished, the modules are returned and refurbished for use again the following year.

6. Leadership Conference: This five-day conference is intended to help develop the capacity of teachers selected by schools participating in SIE to become teacher leaders who can mentor and coach other teachers in their schools and districts. The program also hopes that a number of these teacher leaders will be able to serve as module trainers in future years.

In its first year, 1,406 teachers in 75 schools across 67 districts participated in the Science: It’s Elementary program, serving approximately 37,000 students. This report uses data collected from district personnel, principals, teachers, and students participating in the program, as well as from program records. Data sources include questionnaires administered to teachers prior to and following their participation in SIE professional development, as well as end-of-year and curriculum surveys; a district curriculum alignment questionnaire; interviews with district science supervisors, principals, and teachers; observations of SIE professional development and of teachers implementing the SIE-provided science modules; and student assessments. Data were used to address six evaluation questions:

System Alignment
• How effectively is SIE assisting schools in the development of a coherent, cohesive, and aligned system to support elementary science education reform?

Professional Development
• What is the quality of SIE professional development provided to teachers?

Impacts on Teachers and Teaching
• What is the impact of SIE on teacher knowledge and skills?
• What is the impact of SIE on teachers’ implementation of the modules?
• What is the impact of SIE on student opportunity to learn the Pennsylvania science standards?

Impacts on Students
• What is the impact of SIE on student achievement in science?

Major Findings

In its first year, the SIE program can be credited with a number of accomplishments. The program has conducted two Strategic Planning Institutes (one for schools in the western part of the state and another for schools in the eastern half) and Vision Conferences; provided schools with opportunities to explore the various modules being offered by SIE; provided three days of professional development to nearly 1,300 teachers across the state; and delivered a ready-to-use science module to each teacher participating in the program.



Overall, teachers, principals, and district staff have reported positive experiences with each of the elements of the SIE program. The Strategic Planning Institutes provided an opportunity for school teams to become familiar with science education reform and gave them the rare opportunity to work together on a plan for improving their science program. The Vision Conferences provided schools the opportunity to broaden the base of stakeholder support for science education reform. The three days of professional development were extremely well received by teachers and generated a great deal of enthusiasm for teaching science from the modules, even among teachers who were skeptical of the program at the start. In addition, the professional development was successful at providing teachers with the knowledge and skills they needed to feel confident in implementing a science module for the first time.

All of the observed lessons included some elements of effective practice, though only a few lessons managed to “put it all together.” Common strengths of the lessons included involvement of students with key content, collegial interactions among students, and respectful contact between the students and the teacher. Areas of weakness included the extent of student intellectual engagement with, and sense-making of, the content. Students engaged in many hands-on activities, but the amount of intellectual rigor required of the students in most lessons was generally low.

Many teachers reported making modifications to the instructional materials during module implementation, often due to time constraints. Some teachers chose to skip entire activities; others cut or added pieces within individual lessons. These modifications, often in the form of deleting probing questions or wrap-up discussions, likely hindered students’ opportunity to make sense of the target concepts.

Analyses of student achievement data found that instruction by teachers participating in SIE professional development, based on the SIE-provided modules, can lead to increased student achievement on assessments designed to measure knowledge of the concepts targeted by the modules. On 4 of the 6 assessment scales (Rocks and Minerals; Electric Circuits; Levers and Pulleys; and Mixtures and Solutions), students receiving instruction using the SIE-provided module related to the scale scored significantly higher than students not receiving instruction from that module, controlling for pre-test scores and student demographics. Most of the effects were sizeable, ranging from 0.61 to 1.03 standard deviations. The student achievement analyses also found that the extent of both module use and participation in SIE professional development were positively correlated with student achievement.

Recommendations

As is the case with most programs in their early stages, a substantial amount of work remains to be done. In the spirit of a critical friend, HRI offers the following recommendations to assist the program in reflection and planning for the future.

Provide greater support to districts, schools, and teachers in developing articulated curricula that incorporate the SIE-provided modules and that also ensure that all state science standards will be addressed.



Although the majority of districts participating in SIE report having curriculum guides to direct teachers in the selection of science topics to teach (most of which were developed pre-SIE), the evaluation data indicate that the enacted curriculum may not align well with the Pennsylvania science standards. This lack of alignment is likely due to at least two factors. First, because of the emphasis on reading/language arts and mathematics driven by high-stakes testing, science (along with other non-tested subjects) has been deemphasized in schools (i.e., what gets tested is what gets taught). Second, elementary teachers often do not feel well prepared to teach science and consequently minimize the amount of science they teach; they also tend to cover the topics in which they feel comfortable and skip the others.

Although the addition of a state science assessment may encourage schools to include more science in their instructional programs, the fact that science does not yet count towards Adequate Yearly Progress may temper the impact of the science PSSA in this regard. Furthermore, teachers reported difficulties finding time to implement even a single module, and many indicated they had skipped portions of their module because of time constraints. Some indicated that science was taking time away from other subjects; others indicated that the module spent too much time on a relatively small subset of the state’s science standards. Adding modules in future years will likely exacerbate this situation, forcing teachers to pick and choose activities from the modules.

Continue to refine the initial three-day professional development provided to teachers.

The professional development sessions HRI observed varied widely in their focus on the content storyline of the modules. In some sessions, facilitators consistently made explicit connections between module activities and content goals, but in other sessions they did not. Thus, the program should consider building a structure into the training (e.g., a sense-making activity for each lesson in a module that all facilitators would be expected to use) to help ensure that all participants understand the intended content of each lesson. In addition, there was often a disconnect between the portions of the trainings that focused on instructional strategies (e.g., inquiry, the learning cycle) and those focused on the module itself. Better integrating these different aspects will not only help teachers better understand the instructional strategies, but will also help them apply those strategies at appropriate points when teaching their module.

The program’s plans for future professional development wisely include a focus on deepening teachers’ understanding of the content in their first module. In addition, the program should work to strengthen teachers’ vision of effective inquiry-based science instruction.

In order for teachers to effectively use inquiry-based teaching methods, they need to have a clear vision of what inquiry instruction is and how they can use the module in an inquiry-based manner. In the three-day professional development HRI observed, the concept of inquiry was discussed during the first day’s overview, but no clear vision of inquiry ever emerged. Furthermore, inquiry was rarely discussed explicitly after the first day, nor was it related specifically to the module teachers would be using.



Observations of teachers implementing the modules highlight the need to help teachers move from mechanical to purposeful use of the modules. To accomplish this goal, the program and the participating teachers need to develop a shared understanding of what effective science instruction looks like. Without such a vision of teaching and learning, professional development cannot be focused on helping teachers work towards that goal. The set of knowledge and skills teachers need to achieve this vision defines the objectives for professional development, and also provides teachers a “gold standard” for reflecting upon their practice. Developing this vision is not an easy or quick task; however, it will be essential if the program is to maximize its impact on science teaching and learning. Examining and discussing video of classroom instruction, using role-plays that provide examples and non-examples of effective teaching, and analyzing student work can all help foster this common vision.

With the necessary addition of a large number of trainers for the program’s second year, the program should consider additional ways to build structures that support these trainers’ work.

Providing effective professional development requires a great deal of knowledge and skill. First, professional development providers need to have a common vision of the goals of the program (e.g., what the program hopes science instruction will look like if it is successful). Second, future leaders need to have a strong understanding of the content in the modules and how the module activities address that content.

Given the number of modules the program will eventually support, and the program’s plan to use this year’s participants as future facilitators, the program should consider ways to help leaders develop the knowledge they need. One possibility would be to have an expert review each module and create annotations that would make the disciplinary and pedagogical content more explicit. Providing this sort of scaffolding would make it less important that each individual facilitator possess deep content knowledge of each module. In addition to helping prepare future leaders, these annotations could be shared with teachers to assist them in learning about and implementing the modules.



INTRODUCTION

This report summarizes the activities and findings of Horizon Research, Inc. (HRI) in its external evaluation of the Science: It’s Elementary (SIE) program in the period September 2006 through June 2007. In the program’s first year, HRI has:

• Developed and administered a baseline questionnaire to the first cohort of participating teachers and schools;
• Observed one of the two Strategic Planning Institutes conducted by SIE;
• Observed a sample of 11 professional development sessions to examine the quality of the training provided to teachers;
• Developed and administered a post-professional development questionnaire to gather teachers’ opinions of the quality and impact of the professional development;
• Developed six assessment scales related to the science curriculum modules being provided by the program and administered them to students in schools participating in the program;
• Developed a classroom observation protocol and conducted 30 classroom observations to examine the fidelity and quality with which teachers were implementing the SIE-provided modules;
• Interviewed a sample of 41 teachers participating in the SIE program;1
• Interviewed a sample of 20 principals whose schools are participating in the SIE program;
• Developed and administered an end-of-year questionnaire to examine the impacts of the SIE program on teachers and their teaching;
• Developed and administered a curriculum survey for teachers;
• Developed and administered a curriculum alignment survey for district science supervisors/curriculum coordinators;
• Interviewed 17 out of a random sample of 20 district science supervisors/curriculum coordinators; and
• Observed the leadership conference designed to prepare teacher leaders to provide professional development in future years of the program.

In addition, as part of the Year One evaluation, HRI plans to interview samples of the Resource Teachers who provided professional development to teachers and of the teacher leaders who participated in the leadership training. Data from these interviews will be included in an addendum to this report.

After a brief description of the SIE program and the evaluation plan, this report describes the initial status of the teachers, schools, and districts participating in SIE; the nature and quality of the services provided by the program; and initial evidence of the program’s impact on teachers, their teaching, and their students. The report concludes with a summary of major achievements and presents HRI’s recommendations for the program.

1 For each set of interviews, HRI drew a random initial sample and made repeated attempts to interview everyone in the sample. HRI randomly selected additional participants as needed to make up for non-response. To meet the target number of interviewees, 78 teachers and 29 principals were contacted (targets were 40 and 20, respectively).

OVERVIEW OF SCIENCE: IT’S ELEMENTARY AND THE EVALUATION

The SIE program is overseen by the Pennsylvania Department of Education (PDE) and managed by ASSET Inc. SIE is an initiative aimed at improving elementary science instruction across the commonwealth of Pennsylvania. The program is focused on helping schools and districts implement an inquiry-based, hands-on science education program, with the ultimate goal of improving student learning. To accomplish its goals, SIE provides participating schools with module-based science instructional materials; teacher professional development around those modules and inquiry-based science teaching; and opportunities for strategic planning to help create supportive systems for science education reform. The program has six main components:

1. Strategic Planning Institute: Based on the National Science Resource Center’s LASER model, these institutes are intended to help schools and districts understand, plan for, and successfully implement an inquiry-based elementary science program. Schools participating in SIE sent teams composed of administrators, teachers, and community partners to the Strategic Planning Institute to learn about science education reform and to develop a three-year strategic plan for their school.

2. Vision Conference: This one-day event for school district and community leaders is intended to help foster a common, long-term vision for science education among a larger and more diverse group of stakeholders than the Strategic Planning Institute. Attendees included superintendents, district administrators, school board members, principals, and community representatives.

3. Curriculum Showcase: The curriculum showcase provides an opportunity for school personnel to become familiar with the module-based instructional materials being provided by the program so they can select materials that fit into their school curriculum.

4. Teacher Professional Development: Each teacher responsible for teaching one of the SIE-provided science modules is expected to attend an initial three-day training focused on that module. This training provides teachers with an opportunity to familiarize themselves with the instructional materials, receive practical tips for using them in their classrooms, and learn teaching strategies such as inquiry, questioning for higher learning, and integrating science and literacy through the use of science notebooks. In the second year of the program, teachers will be expected to attend a two-day follow-up training on the same module to learn more about the related science content and to reflect upon their initial implementation of the module.

5. Delivery of Classroom Science Materials: In addition to the professional development, the SIE program delivers the science modules directly to schools for teachers to use in their classrooms. After instruction with the modules is finished, the modules are returned and refurbished for use again the following year.



6. Leadership Conference: This five-day conference is intended to help develop the capacity of teachers selected by schools participating in SIE to become teacher leaders who can mentor and coach other teachers in their schools and districts. The program also hopes that a number of these teacher leaders will be able to serve as module trainers in future years.

Figure 1 shows the program’s theory of action, i.e., how program activities are intended to fit together to produce the desired short- and long-term outcomes.

Figure 1: Science: It’s Elementary Theory of Action



The evaluation plan for SIE was developed by HRI in conjunction with key stakeholders with the goal of examining the major elements of this theory of action. The questions driving the evaluation focus on four main areas: (1) the development of an aligned system to support science education; (2) professional development for teachers; (3) the impact of the program on teachers and their teaching; and (4) the impact of the program on students. The key evaluation questions, by area, are:

System Alignment
1. How effectively is SIE assisting schools in the development of a coherent, cohesive, and aligned system to support elementary science education reform?

Professional Development
2. What is the quality and impact of SIE leadership training?2
3. What is the quality of SIE professional development provided to teachers?

Impacts on Teachers and Teaching
4. What is the impact of SIE on teacher knowledge and skills?
5. What is the impact of SIE on teachers’ implementation of the modules?
6. What is the impact of SIE on student opportunity to learn the Pennsylvania science standards?

Impacts on Students
7. What is the impact of SIE on student achievement in science?

This report provides initial answers to these questions based upon the program’s activities in Year One.

2 The quality and impact of SIE leadership training is not addressed in this report as the leadership conference for developing the capacity of teacher leaders was very recently held (June 11–15, 2007). Data regarding this component of the program will be presented in an addendum report.



DEVELOPING A COHERENT, COHESIVE, AND ALIGNED SYSTEM TO SUPPORT ELEMENTARY SCIENCE EDUCATION REFORM

In its first year, 1,406 teachers in 75 schools across 67 districts participated in the program, serving approximately 37,000 students. Table 1 shows the distribution of schools across the education regions in Pennsylvania.

Table 1
Distribution of Schools Participating in SIE

                Region    Number of Schools
East            1         12
                2          4
                3          4
Central         4          5
                5         12
                6         11
West            7         12
                8         16
Total                     76

As can be seen in Table 2, half of the students in these schools qualify for free or reduced-price lunch; 13 percent receive special education services and 6 percent are classified as English language learners.3

Table 2
Student Demographics

                                        Percent of Students
Race/Ethnicity
  American Indian/Alaskan Native               0
  Asian/Pacific Islander                       2
  Black (non-Hispanic)                        16
  Hispanic                                     9
  White (non-Hispanic)                        73
Free/Reduced-Price Lunch Eligible             50
Receive Special Education Services            13
English Language Learners                      6
Migrant Family                                 1

One goal of the SIE program is helping schools and districts develop an aligned system for science education reform. The major components of the program for accomplishing this goal are the Strategic Planning Institute (SPI) and Vision Conference. In order to assess the impact of SIE on school and district science programs, it is necessary to understand the status of those programs prior to their participation in SIE. District and teacher surveys, and interviews with district science supervisors, principals, and teachers, provide insight into the status of the schools and districts as they entered the SIE program.

3 Data are from PDE’s mid-year report on the SIE program.

Baseline Status of School Science Programs

On a district curriculum alignment questionnaire completed by 59 of the 67 districts participating in SIE, only 40 respondents indicated that their district had a curriculum guide for science in grades K–6. The vast majority of these guides were developed prior to 2007 (i.e., pre-SIE). (A copy of the questionnaire and responses to the closed-ended questions are included in Appendix A.) In nearly all of these districts, the curriculum guides are intended to guide teachers in the selection of topics to teach and instructional materials to use; however, other data indicate that the intended curriculum is not always being enacted.

A baseline questionnaire administered to teachers at the beginning of their SIE training and interviews with a random sample of principals indicate that prior elementary science instruction was inconsistent, at best. (Copies of the baseline, post-professional development, and end-of-year questionnaires are included in Appendix B.) When asked in an open-ended question to describe the strengths of their school’s science program, the most common response, given by nearly one-third of responding teachers, was some version of “our school does not have a science program.” Often, the lack of a science program was attributed to a heavy focus on reading/language arts and mathematics due to high-stakes testing. Other teachers indicated that, while science was taught, there was no set curriculum for science. Principals also acknowledged that the discretion given to teachers often resulted in science being deemphasized. Typical responses included:

We don’t have a science program! Science is not taught. [Teacher]

I can’t name a strength—we don’t have any standard curriculum for science. Most teachers are doing their own thing. We’re lucky if we can fit it in once a week. It’s hit and miss. [Teacher]

At the elementary level, since you teach all of the subjects and if you have to pick and choose and move it around, you’re going to focus less on the areas that are not tested by a standardized test, unfortunately. [Principal]

These concerns were echoed by district science supervisors. In interviews, many science supervisors stated that they thought teachers were spending the vast majority of their instructional time on mathematics and reading. Only a handful of districts had a district policy requiring that science be part of the instructional day. However, this policy was usually not enforced or monitored, and several interviewees believed that teachers were skipping science to focus on reading and mathematics.

The relative lack of emphasis on science instruction is also evident from quantitative teacher questionnaire data. As can be seen in Table 3, 17 percent of teachers indicated that they did not

Horizon Research, Inc. 6 October 2007


teach any science prior to SIE; only a slight majority taught science three or more times per week. When science was taught, nearly half of the lessons were 30 minutes or less in length. Taken together, these responses translate to an average of less than 20 minutes of science instruction per day.

Table 3
Time Devoted to Science Instruction Prior to SIE

                                Percent of Teachers (N = 616)
Science Lessons per Week
  0                                         17
  1                                         12
  2                                         21
  3                                         25
  4                                          9
  5                                         17
Minutes per Lesson
  10 or fewer                               15
  11–20                                      8
  21–30                                     25
  31–40                                     24
  41–50                                     21
  51 or more                                 7
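The “less than 20 minutes per day” figure can be sanity-checked from the Table 3 distributions. The sketch below is an illustrative reconstruction, not the report’s actual computation; the minutes-per-lesson midpoints assigned to each category are assumptions.

```python
# Back-of-the-envelope check of the "less than 20 minutes of science
# instruction per day" figure, using the Table 3 percentages.
# The minutes-per-lesson midpoints are assumptions, not report values.

lessons_per_week = {0: 17, 1: 12, 2: 21, 3: 25, 4: 9, 5: 17}  # lessons: % of teachers

# Assumed midpoints (minutes) for the reported categories: "10 or fewer" -> 5,
# "11-20" -> 15.5, "21-30" -> 25.5, "31-40" -> 35.5, "41-50" -> 45.5, "51 or more" -> 55
minutes_per_lesson = {5: 15, 15.5: 8, 25.5: 25, 35.5: 24, 45.5: 21, 55: 7}

mean_lessons = sum(k * v for k, v in lessons_per_week.items()) / 100    # 2.5 lessons/week
mean_minutes = sum(k * v for k, v in minutes_per_lesson.items()) / 100  # ~30.3 minutes/lesson

daily_minutes = mean_lessons * mean_minutes / 5  # averaged over a 5-day school week
print(round(daily_minutes, 1))  # about 15 minutes per day
```

Under these midpoint assumptions the estimate comes out near 15 minutes per day, consistent with the report’s characterization.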

When asked in what ways their school science program could be improved, the most common responses focused on allocating more time for science teaching, providing teachers with materials for teaching science, making science more hands-on, and developing a curriculum articulated across grades. Often, teachers listed several of these concerns. Typical responses included:

[We need] More time for teaching science, more hands-on material. Limiting the number of topics required to teach in a year, so we can get more in-depth with units. Many teachers are not aware of what other grade level teachers are doing. This creates a great deal of overlapping and gaps. We need a consistent science curriculum across the grade levels.

Still, teachers did mention some strengths of their pre-SIE science programs, though responses varied greatly and few strong themes emerged. About 1 in 5 teachers thought that the hands-on nature of their science program was a strength; about 1 in 10 wrote that their school had lots of materials available for teaching science. However, fewer than 10 percent indicated that their school had a high-quality science curriculum.

District science coordinators and principals also indicated a desire to improve the science program in their schools by making it more hands-on and inquiry based. The prospect of getting professional development for teachers, and the upcoming science portion of the Pennsylvania System of School Assessment (PSSA), provided motivation to many of the schools. Often, it was a combination of these factors:



I thought [SIE] was an opportunity to receive a grant and kind of move forward with science because I knew we were lacking in…professional development and curriculum in science. I thought if we were awarded this grant it would kind of catapult us forward, rather than sitting back and waiting until we do poorly on the PSSA for science. [Principal]

We were really lacking a shared vision of science K–12; everyone was doing their own thing and pulling their own resources. It was not very hands-on at the elementary level. I think that’s why when we applied for the SIE grant; we hoped to get a shared vision of science K–6. With the science curriculum we had before, we didn’t even have planned courses of study. Everyone did their own thing. There wasn’t much consistency within or across grade levels. Our goal now is to incorporate hands-on, inquiry science, but to also get consistency within a grade level and to be addressing the science standards. [District Science Supervisor]

I see SIE as a grant opportunity to give districts the resources and training, which is a missing component, to really carrying out your curriculum. Without it, we would be behind our goal a couple of years. We’re going to head in this direction to inquiry science K–6; we’ll get there, but without SIE it would be a slower process. [District Science Supervisor]

Another thing that has helped is No Child Left Behind for science is coming, and that new state test changed the willingness of people to hear about science programs in comparison with math and reading. [District Science Supervisor]

Although the PSSA is a motivator for schools to participate in SIE, it is also a concern for some. Specifically, some principals and teachers expressed concern about the extent of curricular (and module) alignment with the state science standards and the PSSA. Over half of the interviewed principals indicated that their school would use the modules only as a complement to their existing curriculum, in part because they think the modules are not sufficient to address all of the state science standards. Teachers expressed concerns about the amount of time it took to implement a module, time that could be spent covering additional standards. As one teacher said:

We field tested the [science] PSSA this year and there were only two questions on the test from this topic and it [the SIE module] keeps us from the other topics we used to cover.

In order to help examine curriculum alignment with the state standards, and to provide data that could assist schools, districts, and PDE with curriculum planning, a teacher curriculum survey was administered as part of an end-of-year questionnaire to all teachers in the SIE schools



responsible for teaching science.4 The survey was modeled after the Surveys of Enacted Curriculum,5 and asked teachers how much instructional time was devoted to the various Pennsylvania science standards as well as the nature of that instruction. Table 4 shows the amount of instructional time teachers reported devoting to the Science and Technology and the Environment and Ecology standards, disaggregated by grade level expectations. Across all teachers, just over 70 percent of instructional time is spent on Science and Technology standards, and slightly less than 30 percent of time is spent on Environment and Ecology standards. Interestingly, grades K–4 teachers reported spending over 40 percent of their instructional time on standards that the state designates as standards to be addressed by the 7th grade; grades 5–6 teachers reported devoting about 45 percent of their science instruction to 4th grade standards.

Table 4
Instructional Time Devoted to Various Standards, by Grade Level

                                            Percent of Instructional Time
Standards                             All Teachers    K–4 Teachers    5–6 Teachers
                                      (N = 1,015)     (N = 850)       (N = 165)
4th Grade Science and Technology           41              42              34
4th Grade Environment and Ecology          17              18              11
7th Grade Science and Technology           30              29              40
7th Grade Environment and Ecology          11              11              15
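The off-grade-level shares cited in the text follow directly from the Table 4 percentages. A minimal sketch (the published percentages are rounded, which is presumably why the text says “over 40 percent” while the rounded values sum to exactly 40):

```python
# Derive the off-grade-level instructional-time shares from the
# Table 4 percentages (nothing beyond the table is assumed).

k4 = {"4th S&T": 42, "4th E&E": 18, "7th S&T": 29, "7th E&E": 11}    # K-4 teachers
g56 = {"4th S&T": 34, "4th E&E": 11, "7th S&T": 40, "7th E&E": 15}   # grades 5-6 teachers

# For K-4 teachers, time on 7th-grade standards is off grade level;
# for grades 5-6 teachers, time on 4th-grade standards is off grade level.
k4_off_grade = k4["7th S&T"] + k4["7th E&E"]     # 40 percent
g56_off_grade = g56["4th S&T"] + g56["4th E&E"]  # 45 percent
print(k4_off_grade, g56_off_grade)
```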

Although one would expect some instructional time to be “off grade level” (e.g., for reinforcement, extensions, and making connections), these data seem to indicate that at this point the enacted curricula in many of the schools participating in the SIE program are not well aligned with the state standards. Additional data from this survey, disaggregated by major topic area (e.g., Inquiry and Design, Biological Sciences, Earth Sciences), are presented in Appendix C.

In addition to examining how instructional time is allocated, the survey asked teachers to indicate the performance expectations they had for students. As can be seen in Table 5, responses are remarkably similar across grades K–4 and 5–6 teachers, and are about equally distributed across the various expectation categories (though memorization is expected slightly less often than the other categories).

4 The questionnaire was conducted on-line, with each school’s Support on Site person (SOS) responsible for administering it to the teachers at the school. Of the 1,406 teachers targeted by the program, 1,015 completed the questionnaire, a response rate of 72 percent. Although this response rate is acceptable, HRI has heard from a number of schools that, because of the length of the curriculum survey, some teachers did not respond to this portion of the questionnaire carefully. Thus, these data should be interpreted with caution.

5 Blank, R. K., Porter, A., & Smithson, J. (2001). New tools for analyzing teaching, curriculum and standards in mathematics and science: Results from survey of enacted curriculum project, final report. Washington, DC: Council of Chief State School Officers.



Table 5
Performance Expectations for Students, by Grade Level

                                                  Percent of Instructional Time
                                            All Teachers    K–4 Teachers    5–6 Teachers
                                            (N = 1,015)     (N = 850)       (N = 165)
Memorize Facts/Definitions/Formulas              16              15              17
Conduct Investigations/Perform Procedures        21              21              21
Apply Concepts/Make Connections                  22              22              21
Analyze Information                              19              19              20
Communicate Understanding of Science Concepts    22              22              21

It will be important for schools, districts, and PDE to consider how well the data from this survey match their vision for science curriculum, and what steps, if any, should be taken to improve that alignment.6 In addition, it is HRI’s experience that most teachers do not have the time or training to develop high-quality curriculum materials. Thus, the SIE program may want to consider ways to assist schools and districts in developing curricula that are well aligned with the state science standards.

The Strategic Planning Institute and Vision Conference

The Strategic Planning Institute (SPI), and to a lesser extent the Vision Conference, are the program’s main activities for helping participating schools develop a common vision for science instruction, as well as a plan for making that vision a reality. The SPI is a six-day event modeled after the National Science Resource Center’s (NSRC) LASER Institute. Each SIE school was invited to send a team to the SPI, typically a mix of district administrators, school administrators, teachers, and community representatives. The SPI had six main goals:

1. Develop a shared vision about effective science teaching and learning;

2. Design models for developing and sustaining a corps of leaders for improving the learning and teaching of science;

3. Apply knowledge about the five components of an effective school-based infrastructure needed to support a research-based science program (i.e., standards-based materials, professional development, centralized materials support, assessment, and community/administrative involvement);

4. Develop a strategic plan for improving instruction in science that is informed by research, incorporates best practices, and leverages change through partnerships;

6 To facilitate such work, HRI has made each SIE school’s data available to them via the on-line system created for the curriculum survey. In addition, PDE has access to all schools’ data, individually and in aggregate, via the on-line system.



5. Assess and incorporate local, state, regional, and national resources that will contribute to the implementation and sustainability of a research-based science program; and

6. Become members of a network of informed professionals working to increase student achievement through the implementation of effective science programs.

To help accomplish these goals, the SPI offered a number of sessions led by a mix of personnel from the NSRC, ASSET Inc., school districts experienced in science reform, and a number of other experts. Sessions focused on topics such as systemic change, research about science teaching and learning, the FOSS and STC modules that are part of the SIE program, and research about effective professional development. The SPI also provided time for teams to generate a strategic plan for reforming science education in their schools and districts.

Interviews with principals, teachers, and district science supervisors indicate that attendees benefited from the SPI. Many of the interviewees said that the SPI helped them see the “big picture.” They also reported that having time to sit down as a team and think about their science program, including looking at what was taught in their schools and how it was taught, was a strength, as was having the opportunity to begin developing a plan for improving their science program.

I think that in the strategic planning, ASSET did a good job of providing the foundation for using the modules in science and provided us an opportunity to meet as a team to plan and talk about our approach for instituting and maintaining the inquiry approach to science. [District Science Supervisor]

A team of us were all there together for that many days, working through lunch and into dinner time because we were so excited about it. We were able to do so much good processing, so that when we brought it back we were truly ready for bringing it to everyone else and getting them on board too. [Principal]

It was good for me to know what was going on in the district and what they are trying to accomplish. [Teacher]

The major concern about the SPI, arising in all three sets of interviews, was that it required too great a time commitment. A number of participants thought the SPI could have been condensed into fewer days. As one principal said:

[The strategic planning institute] is a tremendous commitment, for people to be away from their families. I don’t know if it can be consolidated into three days; I think that would be great.

Thus, the program may want to carefully re-examine the agenda for the SPI to determine whether and how elements can be condensed or removed. However, care will need to be taken to ensure that the depth and coherence of the institute are not lost.



Interestingly, one of the major issues HRI identified in its observations of the SPI did not come up in the interviews, perhaps because the program had already addressed it by the time those interviews took place. At the SPI, it was evident that participants were having difficulty engaging with the content of the sessions because of their questions about SIE. For example, many participants wanted to know what would be required of their schools, which modules they would be allowed to use, and when and where the teacher professional development would take place. In addition, it was not always evident how the SPI fit into SIE. Given that the program had been funded shortly before the SPI was held, it is understandable that some of the details were not yet worked out. One would expect the program to be better able to anticipate and meet the information needs of new schools joining the program in future years.

The Vision Conference is a one-day event designed to help build support for a reformed science program among a larger and more diverse group of stakeholders than the SPI. Attendees included superintendents, district administrators, school board members, principals, and community representatives. According to interviewees, the Vision Conference was effective at building a broader base of support for the program.

I think you have to have the [vision conference] for administrators and I think superintendents and curriculum people have to go to that too so that they realize just how good the program is. [Principal]

The more people you have in your corner and understanding what’s going on, the better off you are. And that has paid many dividends, whether you’re looking at purchasing things or their understanding about what’s going on in our school with science education, everything. It paid dividends having all those people there. [Principal]

I learned a lot about what inquiry can do for you. I really liked the presentations on the research that indicated how this type of science model can improve science achievement in other academic areas like reading and math. Also the kits were new to me. So seeing the different aspects of the modules was good for us. [District Science Supervisor]

Participants who attended both the SPI and the Vision Conference thought that the Vision Conference was repetitive of the Strategic Planning Institute. Some interviewees also indicated that they were unable to convince key stakeholders to attend, which inhibited their district’s ability to garner support for the reforms. Thus, the program may want to consider clarifying expectations about who should attend which events, and disseminating those expectations as early as possible.



THE QUALITY OF SIE PROFESSIONAL DEVELOPMENT

Each teacher participating in the program was expected to attend a three-day training session, focused primarily on the SIE-provided module as well as on a variety of teaching strategies (e.g., the use of science notebooks, learning cycles, inquiry). Based on program records, across all of the schools participating in the program, 91 percent of eligible teachers (1,286 out of 1,406) attended professional development around their module. This section of the report describes the quality of the SIE professional development attended by these teachers, using data from HRI’s observations of SIE professional development, teacher questionnaires, and teacher interviews.

Overall, teachers had very positive opinions about the SIE professional development. A questionnaire administered at the end of each session asked teachers, in part, about the quality of the training. As can be seen in Table 6, participating teachers thought highly of the professional development; nearly all teachers thought the goals of the trainings were clear, believed that the sessions reflected careful planning and organization, and were confident in their ability to teach science using the SIE-provided module. In addition, when asked about the pace of the professional development, over three-quarters of the teachers indicated that it was appropriate (17 percent thought the sessions were too slow).

Table 6
Teacher Opinions† of the Quality of SIE Professional Development

                                                                        Percent of Teachers
                                                                            (N = 765)
The goals of the professional development were clear                            96
The professional development reflected careful planning and organization        95
The professional development increased my confidence to teach using
  the SIE module                                                                95
Adequate time, structure, and guidance were provided for participants
  to reflect individually on the substance of the professional development      93
Adequate time, structure, and guidance were provided for participants
  to discuss the SIE modules and pedagogical strategies with each other         92
† Represents those teachers agreeing or strongly agreeing with each statement.

When asked in an open-ended question what aspects of the professional development were most helpful or effective, nearly two-thirds of the responding teachers indicated working through the module as students. About one-quarter of the respondents described the modeling of science notebooks as a learning tool, and many mentioned the high quality of the facilitators, who shared practical tips for implementing the module. These findings mirror those from interviews with teachers conducted near the end of the school year. Typical responses included:

The “hands-on” approach was very effective in helping me understand how to teach science. Actually doing the lessons made me feel much more comfortable about teaching them (more so than if I had just read the lessons). The most helpful aspect, for me, was going through each lesson, step by step, very methodically, with our instructors mentioning where some potential pitfalls could be.



I think the notebook is a terrific teaching tool to pull together what is being taught and what the children are learning. The hands-on applications are totally kid friendly/age appropriate and can be modified/differentiated to the variety of levels in a normal classroom. I appreciated how our own instructors added things from their “real world” experience using the module. These tips gleaned from experience will help us as we implement the modules.

Similarly, when asked how the professional development could be improved, the most common response was that it did not need any changes. However, a relatively small number of teachers indicated that the training did not adequately convey the amount of preparation and instructional time the module would require. As one teacher said:

The experiments, materials wise, are overwhelming. It took a lot of preparation. All of my planning time was spent on setting up, and my paperwork is not getting done.

Preparing teachers to implement a new instructional program is a major undertaking. In the case of SIE, teachers are being asked to implement new instructional materials to teach science, a subject area that many teachers (in Pennsylvania and across the nation) have put on the back burner due to high-stakes testing in mathematics and reading/language arts. In addition, few elementary teachers have strong content backgrounds in science.7 Both of these factors make this undertaking that much more challenging.

A teacher’s implementation of an SIE-provided science module can progress along a trajectory from no or minimal adherence to the instructional materials, to mechanical use of the module (i.e., simply using all of the activities in the module), to purposeful use (i.e., using the activities and pedagogies in the materials to provide students a high opportunity to learn the intended content). In reflecting on the quality of SIE professional development, HRI considered what teachers need to know and be able to do in order to implement a module in a purposeful manner, and identified five major elements the program would likely have to provide to participating teachers:

1. The first element is an understanding of the SIE program, what services it will provide and what being part of the program means for them and their schools. If these basic informational needs are not met, it is unlikely that teachers will be able to engage with other aspects of the program.

2. The second element is familiarity with the module, including knowing what is in the box of materials provided with the module and how to manage both the students and the activities.

7 See, for example, Weiss, I. R., Banilower, E. R., McMahon, K. C., & Smith, P. S. (2001). Report of the 2000 national survey of science and mathematics education. Chapel Hill, NC: Horizon Research, Inc.



3. Third, given the program’s emphasis on science notebooks, teachers need to understand what a notebook is (and is not) and how it can be effectively used to promote student learning.

4. Fourth, teachers need to understand the content in the module, including what concepts and skills students should learn from each activity and how they relate to the big ideas and overall storyline of the module.

5. Lastly, teachers need to understand the pedagogical storyline in the modules, including the vision of inquiry learning and how the module embodies that vision.

Research on teacher concerns suggests that teachers using new instructional materials will first be concerned with the mechanics of the activities and their role in leading them, so initial training will necessarily be focused largely on familiarizing teachers with the basics of the module.8 Additionally, addressing all of these areas takes a good deal of time, in terms of both the amount of professional development needed and opportunities for teachers to implement and reflect upon the module. Thus, a reasonable goal for an initial training is to familiarize teachers with the basics of the module, helping them reach a mechanical implementation. However, it is also possible to lay the groundwork for facilitating teachers’ movement to more purposeful use of the module. An analysis of SIE professional development by each of the five elements described above follows.

Introducing Teachers to Science: It’s Elementary

Because many teachers came to the trainings with little understanding of what the program entails, and often on very short notice, answering their questions early on was especially important so that they could engage with the main goals of the module training. An icebreaker activity was used to identify participants’ burning questions about SIE. Participants were asked to write, on pieces of chart paper around the room, their current understanding of, and any questions they had about, inquiry, science notebooks, science education reform, and the SIE program. This activity allowed facilitators to see where the group stood in its understanding of these topics, and to tailor the information they provided throughout the session to the needs of the particular group.
Once the initial questions were answered, facilitators used a PowerPoint presentation to provide an overview of the SIE program that explained the elements of science education reform as envisioned by the NSRC, and how the SIE program addresses these elements. The sessions also gave teachers additional opportunities to ask questions about the program throughout the training, utilizing the “parking lot” strategy (teachers could post questions on chart paper to be discussed as time allowed). These strategies for dealing with teachers’ initial information needs appear to have been successful, allowing teachers to focus on the module-specific aspects of the training.

8 Hall & Loucks (1979). Implementing innovations in schools: A concerns-based approach. Austin, TX: Research and Development Center for Teacher Education, University of Texas.



This conclusion is supported by questionnaire data. As part of the baseline and post-professional development questionnaires, HRI administered a set of items developed as part of the Concerns Based Adoption Model (CBAM). The theory holds that when teachers adopt a new curriculum or instructional approach, they progress through a series of concerns about the “innovation.” As experience with the innovation grows, early concerns (e.g., being unfamiliar with the curriculum) are resolved and later ones emerge. The implication of this stage theory is that early concerns must be resolved before later ones can be effectively addressed. The theory suggests that effective professional development specifically targets the concerns that are likely to emerge at different points in the adoption process. The seven stages of concern are operationalized in a 35-item CBAM questionnaire9 developed to measure the intensity of each concern:

1. Awareness (e.g., I don’t even know what the SIE modules are.)
2. Informational (e.g., I would like to know what resources are available for implementing the SIE modules.)
3. Personal (e.g., I would like to know how my teaching is supposed to change using the SIE modules.)
4. Management (e.g., I am concerned about my inability to manage all that the SIE modules require.)
5. Consequence (e.g., I am concerned about how the SIE modules affect students.)
6. Collaboration (e.g., I would like to develop a working relationship with both our faculty and outside faculty using SIE modules.)
7. Refocusing (e.g., I would like to modify our use of the SIE modules based on the experiences of our students.)

The hypothetical concerns profile in Figure 2 illustrates how early concerns (e.g., informational) are the most intense early in an implementation, but decrease over time, to be replaced by other concerns. It is important to note that in the CBAM theory, the changes depicted in Figure 2 can take two to three years to be observed.

9 HRI modified the original CBAM questionnaire to tailor it for the SIE program.



[Figure 2 is a line chart, “Hypothetical Concerns Profile at Three Points in an Implementation,” plotting concern intensity (0–100) for each of the seven concern types (Awareness, Informational, Personal, Management, Consequence, Collaboration, and Refocusing) at early, middle, and late points in an implementation.]

Figure 2

Figure 3 shows CBAM data from teachers participating in SIE professional development. Based on these data, SIE appears to have been successful at addressing teachers’ initial concerns about the program and what the implications were for classroom practice.

[Figure 3 is a line chart, “SIE Teacher Concerns (CBAM),” plotting the intensity (0–100) of each of the seven concern types (Awareness, Informational, Personal, Management, Consequence, Collaboration, and Refocusing) before (Pre PD) and after (Post PD) professional development. The Post PD profile is significantly different from the Pre PD profile, p < 0.05 (MANOVA).]

Figure 3



Familiarizing Teachers with the Modules

Familiarizing teachers with the basic content, structure, and activities in the module is a necessary step in moving teachers along the continuum towards mechanical, and then purposeful, implementation of a module. Based on HRI’s observations, the trainings appeared to be successful at familiarizing teachers with the materials in the modules they would be using, as well as the mechanics of working through the various investigations.

As soon as teachers received the teacher manual for the module, most facilitators gave the teachers an overview of what is in the manual by walking them through the various sections, including the unit overview, background information for teachers, alignment to national standards, and assessment. The teachers then spent the bulk of each three-day training working through the modules as if they were students, so that they would have an operational knowledge of what the materials looked like and how each investigation was supposed to work.

Because the trainings were led by experienced teachers who had previously used these modules, participants received tips on managing the module materials and students throughout the training. For example, management tips included showing teachers the proper way to measure the length of a pendulum, suggesting that teachers test the K’NEX cars on different surfaces in the classroom before using them with students, diagramming the correct and incorrect ways to use a diode, and suggesting that teachers allow young students to explore new tools (e.g., hand lenses and pen lights) before using them as part of an investigation. Informal conversations with participants and data from the post-professional development questionnaire (presented later in this report) indicate that these strategies were successful in helping teachers become comfortable with the modules.
Preparing Teachers to Use Science Notebooks

The use of science notebooks is promoted by SIE as a learning tool and a way to connect science instruction to literacy goals. Notebooks are also useful for meeting meta-cognitive and sense-making goals, in that they help students become aware of their own learning. Overall, the use of notebooks was effectively modeled throughout the observed trainings, and analyses of post-professional development questionnaires suggest that teachers found this aspect of the trainings to be among the most useful.

Still, the nature and extent of facilitation surrounding the use of notebooks varied considerably across the observed sessions, and in one session they were not used at all. In some, but not all, observed sessions, facilitators presented research supporting the use of notebooks. In several sessions facilitators discussed the use of notebooks as an organizational tool, while other sessions stressed the importance of notebooks in helping to make students' learning visible. Furthermore, while working through module lessons, facilitators in different sessions varied in the amount of guidance they provided as to what should be written in the notebooks. Across the observed sessions, notebooks were used for recording data, building vocabulary, and responding to focus questions.

Many facilitators encouraged teachers to keep notes from a teacher's perspective on the left side of each page, and notes from a student's perspective on the right. This division provided teachers with a useful product to take back to their classrooms as a resource that could be used during their teaching of the module.

Horizon Research, Inc. 18 October 2007


In the observed sessions that used science notebooks, they were a part of nearly every activity. Integrating their use throughout every aspect of the training helped to build teachers' comfort with them, and allowed the facilitators to highlight many different ways notebooks can be used to enhance student learning. Because of the diversity of ways in which facilitators presented notebooks during trainings, however, teachers attending different trainings had very inconsistent experiences and likely came away with different ideas about the notebooks' purpose. To increase consistency, the program may want to provide more guidance to facilitators about SIE's vision for the use of notebooks in science lessons.

Developing the Content Storyline

Although teachers can implement a module mechanically with a limited understanding of the science content, purposeful implementation requires teachers to have a solid grasp of both the disciplinary and pedagogical content knowledge related to the module. Specifically, teachers need to understand: (1) what concepts and skills each lesson is intended to teach; (2) how each lesson contributes to the overall content storyline of the module; and (3) the initial ideas students are likely to have about the content and how those ideas can be addressed through the module activities. It is unlikely that any three-day training could address all of these areas in addition to the other goals of the training. However, opportunities do exist for addressing these goals, some of which were capitalized upon and some of which were missed in the observed SIE sessions.

The modules themselves provide some background information for teachers on the content they intend students to learn. For example, the STC modules provide an overall introduction to the module, a rough concept storyline, and a brief content overview for teachers before each lesson.
Similarly, the FOSS modules provide a unit overview and a teacher background section for each investigation. However, the modules vary in the extent to which they explicitly relate the student activities to this content information, or provide details on ideas students are likely to have prior to instruction and how these ideas could impact student learning. Because most elementary teachers do not already have this type of knowledge, and because the modules may not provide it, it is important for a professional development program to give teachers opportunities to engage with these ideas over time.

Facilitators in the observed SIE trainings attempted to address the content storyline of each module, though there was a great deal of inconsistency across the sessions. One method employed by some facilitators was to use the science notebooks as a reflection tool, periodically posing "focus questions" that encouraged teachers to consider the content in the investigations they had just completed. This approach provided valuable reflection time throughout the training, and would have been even more beneficial if used more systematically both within and across sessions, for example at the end of each investigation.

In addition to the use of focus questions in notebooks, some facilitators paused to reflect on the content teachers had learned after each individual module investigation. For example, after working through a lesson in the Motion and Design module, one facilitator asked teachers to
look at the data each group had collected and draw conclusions about relationships among the tested variables. He then went on to explain how their results were related to Newton's second law of motion. This sort of sense-making can be very effective in helping teachers see how each activity relates to the overall content goals of a unit.

In other cases, teachers were simply told what they "should have" learned, which may have short-circuited opportunities for teachers to build their own conceptual understanding. Although such mini-lectures can be effective at presenting important content and ensuring that critical points are not missed, it is important to relate them to teachers' work with the module so that teachers make sense of the content for themselves rather than having it done for them, modeling the approach teachers are expected to use with their own students.

Other strategies the program might consider for helping teachers focus on the content in the modules include highlighting how each lesson addresses the Pennsylvania science standards, and reflecting on the big ideas in a module after teachers have worked through all of the investigations. Given the current focus on the science PSSA, it is important that the content in the module training be connected to the standards. A few facilitators did refer to content standards throughout the training, such as by noting when a particular graphing activity could be used to meet a mathematics standard. However, this sort of explicit connection to the standards was very rare in the observed sessions, and in some cases the standards alignment seemed fairly tenuous. For example, one facilitator noted that an activity in which different types of fabric were observed to have absorbed water was aligned to a standard about "change over time," because most fabrics would absorb the water eventually.
More explicit and meaningful connections need to be made between the standards and module content if teachers are to understand how their instruction from the module relates to the Pennsylvania science standards. In most observed trainings that related the modules to standards, the emphasis seemed to be on process standards; these were posted at the front of the room in several sessions and referred to during the training. Disciplinary content standards appeared to be less of a focus. Most facilitators provided handouts listing the content standards considered to be aligned to the particular module, but at no observed training was there a discussion of how the module lessons were connected to these standards. In at least some cases an explanation was needed, because the connections were not immediately obvious. For example, the Rocks and Minerals module was listed on the handout as being aligned to several standards about living things. In at least one session the handout was presented as a means of assisting teachers with their school bureaucracy rather than as a tool to aid instruction: participants were told that it was being provided for them to give to their principals if they were challenged to demonstrate that they were teaching the standards.

To aid participants in understanding the overall content storyline of each module, the original design for the trainings included a concept mapping activity intended to help teachers see how the ideas in each unit connected to each other, and to build a sense of the "big picture." In this activity, vocabulary words were added to a "word wall" during each lesson, and teachers were asked to combine these words in a concept map at the end of the third day. After the first half of the trainings were complete, feedback from facilitators led SIE to eliminate this activity, as it proved unwieldy to train teachers on concept mapping while also helping them reflect on the
content in the unit. Given the inconsistent use of the concept maps during observed trainings, this decision made sense. However, eliminating the activity left a gap in the trainings, and SIE needs to consider another structure for overall sense-making of the content of each module. For example, facilitators at one observed training asked teachers on the last day to list and discuss what they believed to be the big ideas of the module they had just experienced. This method was an effective way for facilitators to assess teachers' understanding while also focusing them on the appropriate content goals of the module. Incorporating this type of reflection into the design of future trainings could help boost teachers' understanding of module content. The more structures for sense-making that are built into the design and materials for module training, the less dependent the sessions will be on the knowledge of individual facilitators.

The design of the SIE trainings called for one experienced teacher and one content expert to serve as co-facilitators of the professional development. The strength of this design is its potential to provide one leader who can alert participants to common prior ideas and problems students might have, and one leader who can elaborate on the big ideas of the module and discuss content at a level higher than what students would be expected to know. Unfortunately, this format was not always possible due to the limited number of trained leaders and the complications of scheduling trainings across the state. Several of the observed trainings were led by two experienced teachers with no designated content expert, and others were led by just one person. Moreover, even when the desired format was possible, the two leaders did not always take on the roles described above. More often, one leader (in most cases the experienced teacher) would lead the bulk of the training, with the other acting as a support person.
Regardless of who took on the greater leadership role, the focus of the trainings was usually on managing the logistics of the modules rather than reflecting on the big content ideas and how the module gets them across to students. One possibility for increasing both the consistency and the quality of the content training would be to have experts review each module in advance to identify: (1) the big ideas in the unit; (2) how each activity contributes to the overall storyline of the module; (3) common student misconceptions that need to be explicitly addressed during instruction; and (4) misconceptions that may be inadvertently promoted by the module, and how to avoid them. A written review of each module addressing these issues would be especially helpful to the new leaders who will be facilitating next year's module trainings. Including such a review in the training materials for facilitators would increase the probability that every facilitator understands the content associated with each activity.

Developing the Pedagogical Storyline

In addition to understanding the content, teachers need to understand the pedagogical strategies required to implement the module in a manner consistent with the vision of science instruction promoted by the SIE program. For a mechanical implementation of a module, teachers need to understand, at a minimum, how to manage students and materials. For a more purposeful implementation, teachers need a clear vision of what high-quality, inquiry-based science
teaching looks like, and how the module fits into that vision.10 As with the content storyline, developing these understandings and skills typically requires sustained and in-depth professional development.

Overall, the SIE trainings appear to have provided teachers with ample opportunity to learn the strategies they will need to manage and implement the module. Many of the facilitators had experience implementing the modules in their own classrooms, which allowed them to share tips they know work. These tips for managing the materials and students help teachers gain the confidence and skills they need to feel comfortable using the module. Without these strategies, it is unlikely teachers would reach even mechanical implementation of the module.

Another strength was that when a strategy was used, the leaders were typically very explicit about what they were doing and why. For example, facilitators in observed sessions often noted that they were circulating around the room to informally assess where teachers were in their understanding of a given concept. In one session, a facilitator asked a teacher a question, then followed it up by asking "how do you know?" This facilitator went on to explain that she was looking for the teacher to back up her claim with evidence. Another facilitator instructed teachers to add the word "property" to a vocabulary list in the back of their notebooks only after conducting an activity related to properties. This facilitator noted that research has shown that students should explain a concept in their own words before they are given scientific terms.

The explicit modeling of pedagogical and classroom management strategies was also applied on a broader scope, as time to reflect on these strategies was built into the design of the training. At the end of each day, teachers were asked to brainstorm what strategies they had used that day to support: (1) content goals, (2) process skill goals, and (3) classroom management.
Facilitators listed the strategies on chart paper at the front of the room as they were suggested by teachers. This brainstorming session had the potential to provide teachers with valuable reflection time and help them see the connections among the various pieces of the training. However, the implementation of this activity varied greatly. Due to timing constraints, the reflection was often rushed, and participants were given little time to consider the questions before the large-group discussion. Some facilitators added to the lists themselves rather than asking the teachers what they had experienced. Others accepted answers without making sure all participants understood how a strategy contributed to its intended goal. For example, when asked for strategies used to support process skills, a teacher in one session called out "candle," a reference to an activity in which teachers wrote 20 observations about a lit candle, intended to build their observation skills. The facilitator wrote "candle" on the chart paper and moved on to the next participant, missing an opportunity to distinguish between the specific activity and the general strategy, and to discuss how the strategy might be used in a classroom.

Although the trainings appear to have been effective at preparing teachers for at least a mechanical implementation of the module, there was not a consistent emphasis on developing teachers' ability to use the modules in a purposeful manner. Different facilitators introduced different advanced pedagogical strategies, often bringing handouts they had created or used in the past. For example, across the 11 observed sessions, one facilitator provided a handout with questions to aid in formative assessment, three provided a handout on "power conclusions" (a detailed conclusion statement that incorporates students' original predictions, the data collected, an error analysis, and ideas for future investigations), and two provided a handout on effectively using claims and evidence in science notebooks. These handouts often enhanced an individual training session; their consistent use across all sessions would have given a greater number of teachers the opportunity to engage with these ideas.

To implement a module purposefully, teachers need a clear vision of high-quality, inquiry-based instruction. In the observed sessions, a number of different visions were expressed by facilitators and participants. The concept of inquiry was introduced on the first day of each three-day training through a "gallery walk," in which participants were asked to read a variety of definitions of inquiry and choose the one that most "spoke" to them. The facilitators then led a discussion, asking teachers to share their perspectives. However, no consensus was reached through these discussions. In fact, across the sessions, the facilitators themselves expressed different perspectives on inquiry. Some likened inquiry to the scientific method; some noted that even "experts" and the module developers don't know for sure what inquiry is; some told participants that "there are no right or wrong answers" in inquiry learning because it is more about process than content. None of the facilitators highlighted how inquiry differs from a simple hands-on activity.

10 HRI's frame for thinking about the quality of instruction is based upon the National Research Council's How People Learn: Brain, Mind, Experience, and School (1999) and How Students Learn: Science in the Classroom (2005).
The term "inquiry" is used in many different ways by different people, ranging from open inquiry, in which students ask their own questions and develop their own procedures for answering them, to a much more guided approach, in which students participate in systematic investigations of phenomena and rely on data for drawing conclusions. Thus, it would be helpful for SIE to clarify, both for facilitators and participants, exactly what type(s) of inquiry the program is promoting. Because the modules provide a structured frame for teaching and learning, it appears that SIE is focusing mainly on guided inquiry. Communicating a clear and consistent vision of inquiry would avoid confusing teachers and provide them a concrete image on which to base their science teaching.

Further, the relationship between inquiry and the modules was not made consistently across the observed sessions, and inquiry was rarely mentioned after the first day's introduction. This omission likely contributed to a perceived disconnect between the first day and the rest of the training, as evidenced by teachers' comments in informal conversations and on surveys noting that the first day could have been shortened or omitted altogether. Thus, teachers were unlikely to leave the training with a clear understanding of inquiry learning, or an ability to implement it in their classrooms.

To use a module purposefully, teachers also need to understand the learning theory (in this case, the learning cycle) embedded in the materials. The training introduced teachers to both the "5 Es" (Engagement, Exploration, Explanation, Elaboration, Evaluation) learning cycle used in the FOSS modules and the "FERA" (Focus, Explore, Reflect, Apply) learning cycle used in the STC modules, through handouts given to participants as the modules were introduced. However, as with inquiry, the extent to which facilitators connected the learning cycle to the modules varied.
A few of the observed facilitators took time throughout the training to help participants step back and reflect on which part of the learning cycle they were experiencing (by
saying, for example, "This investigation helps students to reflect, which is the R part of FERA"), and some noted the similarities between the two learning cycles. In most cases, however, facilitators were not observed discussing the handouts in any depth, and most did not refer to the learning cycles after the first day, making this aspect of the training seem disconnected from the modules.

Based on HRI's observations of professional development sessions and data from teacher questionnaires and interviews, it appears that the SIE professional development likely succeeded in its main goals. The majority of teachers seemed to have enjoyed the training; they also showed enthusiasm for the SIE program, a necessary precursor for deeper engagement with the program. The professional development appears to have succeeded in orienting teachers to the SIE program, and in making them comfortable enough with science notebooks and the modules to implement the materials in their classrooms for the first time. In addition, the training has started to lay the groundwork for teachers to use the modules in a more purposeful way. Given the large number of teachers the program served in its first year, these are substantial accomplishments.

However, to help teachers move from mechanical to purposeful implementation of the modules, future professional development will likely need to help teachers develop a clear vision of high-quality, inquiry-based instruction, and make explicit connections between that vision and the modules. The program should also consider building more structures into the training (i.e., sense-making activities for each module that all facilitators would be expected to use) to help ensure that all participants understand the intended content in each lesson as well as the overall conceptual storyline of the module.


IMPACT OF SIE ON TEACHERS AND TEACHING

This section of the report focuses on how the SIE program has impacted teachers, including their perceptions of preparedness to use their SIE-provided module, and their classroom teaching. Data for this section come from teacher questionnaires and interviews, as well as from classroom observations during module implementation.

Teacher Preparedness

The post-professional development questionnaire included a series of questions aimed at assessing teachers' feelings of preparedness to use the teaching strategies promoted by SIE (e.g., how well prepared they felt to use the inquiry-based teaching strategies embedded in the SIE module, to use science notebooks to support student learning of the content in the SIE module, and to use the FERA and 5E learning cycles). To assess impacts, teachers were asked to rate both their current preparedness and their preparedness prior to the professional development. This "retrospective pre" approach is useful when respondents are likely to revise their perceptions of their initial knowledge or preparedness as they learn more about the topic (i.e., they didn't realize how much, or how little, they knew about a topic until after their participation in the program). In addition, the end-of-year questionnaire repeated these items to examine whether use of the module affected teachers' perceptions of preparedness.

These items were combined into a composite variable called "perceptions of pedagogical preparedness" to reduce the unreliability associated with individual survey items. (Definitions of this and other composites described in this report, a description of how the composites were created, and reliability information are included in Appendix D.) Each composite has a minimum possible score of 0 and a maximum possible score of 100.
A score of 0 would indicate that a teacher selected the lowest response option for each item in the composite, whereas a score of 100 would indicate that a teacher selected the highest response option for each item.

These longitudinal data have a nested structure, with time points nested within individual teachers. Statistical techniques that do not account for such nested data structures can lead to incorrect estimates of the relationships between independent factors and the outcome. Hierarchical regression modeling11 is an appropriate technique for analyzing nested data and was used to examine trends in teachers' composite scores.

As can be seen in Figure 4, teachers' perceptions of pedagogical preparedness increased dramatically as a result of the professional development, a very large effect size12 of 1.88

11 Bryk, A.S. & Raudenbush, S.W. (1992). Hierarchical linear models: Applications and data analysis methods. Newbury Park, CA: Sage Publications.

12 The effect size for the comparison of two means is calculated as the difference between the means divided by the pooled standard deviation. Effect sizes of about 0.20 are typically considered small, 0.50 medium, and 0.80 large. Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.


standard deviations. The slight decrease (-0.32 standard deviations) at the end of the school year may be due to a wearing off of a “halo effect” (i.e., teachers who enjoy a professional development session may overestimate the impacts of that session when asked immediately following it). The dip may also be due to teachers having the experience of trying these strategies in the classroom, and realizing they weren’t quite as well prepared as they thought they were. Still, the overall trend is positive and large, an indication that the professional development increased teachers’ feelings of preparedness to implement the modules in their classrooms.
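As a concrete illustration of the effect-size metric used throughout this section, the formula from footnote 12 (the difference between two means divided by their pooled standard deviation, i.e., Cohen's d) can be sketched as follows. The scores below are hypothetical composite values chosen for illustration, not data from this evaluation:

```python
# Sketch of the effect-size calculation described in footnote 12:
# Cohen's d = (mean1 - mean2) / pooled standard deviation.
from statistics import mean, stdev

def cohens_d(group1, group2):
    """Difference between group means divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)  # sample standard deviations
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(group1) - mean(group2)) / pooled_sd

post_pd = [95, 45, 80, 60, 85]  # hypothetical post-PD composite scores
pre_pd = [10, 50, 5, 40, 20]    # hypothetical pre-PD composite scores
print(round(cohens_d(post_pd, pre_pd), 2))  # prints 2.43
```

By the benchmarks cited in footnote 12, a value of this size, like the report's 1.88, falls well beyond the 0.80 threshold for a large effect.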

[Figure 4. Teacher Perceptions of Their Pedagogical Preparedness: percent of total possible points was 25 pre-PD, 73 post-PD, and 65 at the end of the year. * Teacher scores change significantly over time, p < 0.05 (HLM)]

Interviews support the questionnaire data. The vast majority of interviewed teachers reported that the training was effective in preparing them to use the pedagogies in the modules. Comments included:

I think they did a nice job with that. Just the idea that it was a little different…them [students] discovering the things. The one we had been doing, even though it was called "Discovery Science," you were presenting the ideas first and then doing an experiment and then asking questions. This was more, "here it is, let's look at it, now let's talk about it and see what happened. Let's come up with ideas about the science." [Teacher]

Just the whole way to go about the inquiry process really helped me. I had heard about it and I'd tried a couple little things, but to just go whole-heartedly into a program like that, I think they did a really good job explaining that. [Teacher]

The questionnaires also asked teachers to rate their understanding of the science content in their module. Items included their understanding of (a) the student learning goals in the SIE module; (b) the science content in the SIE module at a deeper level than what students are expected to
learn; and (c) the ideas (either correct or incorrect) that students are likely to have about the content prior to instruction. These items were also combined into a composite, in this case called "perceptions of pedagogical content knowledge."

The results for this composite parallel those for the perceptions of pedagogical preparedness composite, likely for the same reasons. Teachers' ratings of their pedagogical content knowledge were quite low initially and increased significantly as a result of the professional development, an effect size of 1.87 standard deviations (see Figure 5). There was also a slight decrease from just after the professional development to the end of the school year (-0.18 standard deviations).
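The "percent of total possible points" scaling used for these composites can be sketched as follows. The four-point item scale here is an assumption for illustration only; the actual items and response scales are defined in Appendix D of the report:

```python
# Minimal sketch of the 0-100 composite scaling described in this report:
# a teacher choosing the lowest option on every item scores 0; choosing
# the highest option on every item scores 100. The four-point response
# scale is assumed for illustration.

def composite_score(responses, low=1, high=4):
    """Rescale summed item responses to percent of total possible points."""
    n = len(responses)
    floor, ceiling = n * low, n * high
    return 100 * (sum(responses) - floor) / (ceiling - floor)

print(composite_score([1, 1, 1]))            # all lowest options, prints 0.0
print(composite_score([4, 4, 4]))            # all highest options, prints 100.0
print(round(composite_score([2, 3, 4]), 1))  # mixed responses, prints 66.7
```

Averaging such scores across teachers at each time point yields the group values plotted in Figures 4 and 5.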

[Figure 5. Teacher Perceptions of Their Pedagogical Content Knowledge: percent of total possible points was 20 pre-PD, 73 post-PD, and 68 at the end of the year. * Teacher scores change significantly over time, p < 0.05 (HLM)]

Most of the interviewed teachers indicated that the professional development had helped them understand the content in their module. As two teachers said:

They gave a lot of background beyond…more in depth than kids would need, but it was for us. [Teacher]

They had a content specialist there who could help us with the concepts if we didn't remember them from our own schooling and that was helpful. [Teacher]

Module Implementation

For the majority of participating schools, SIE provided their first foray into the use of modules for teaching science. In its first year, SIE offered support for 11 different modules. In grades K–2, SIE offered one module at each grade. In each of grades 3–6, SIE allowed schools to select one of
two modules being supported. However, some schools that had previously adopted science modules were allowed to substitute their own modules for those SIE offered. Table 7 shows the number of teachers using each module and the number of students in those teachers’ classes.13

Table 7
Teachers' and Students' Module Experience

Module                    Grade(s)    Number of Teachers    Number of Students
Fabric                    K           178                   4,003
Air and Weather†          1           4                     63
Weather                   1           202                   4,318
Balance and Motion†       2           7                     114
Changes                   K,1,2       183                   4,051
Structures of Life†       3           4                     105
Ideas and Inventions      3           95                    2,190
Rocks and Minerals        3           31                    737
Water                     3,4         71                    2,102
Earth Materials†          4           6                     118
Electric Circuits         4           102                   2,482
Environments†             5           10                    281
Levers and Pulleys        5           68                    2,443
Motion and Design         5           77                    2,572
Mixtures and Solutions    6           39                    1,523
Variables                 6           37                    1,504

† Module used by the school prior to SIE that PDE allowed the school to substitute for one the program provided.

The end-of-year questionnaire asked teachers about factors that may have affected, positively or negatively, their use of the module. As can be seen in Table 8, 91 percent of responding teachers thought that SIE professional development facilitated their use of the module, and 89 percent thought their school was supportive of their use of the module. However, 55 percent of respondents thought that there was not enough instructional time for science to effectively use the module, and 47 percent indicated that pressure to teach mathematics and/or reading/language arts inhibited their use of the module.

13 Data are from PDE’s mid-year report on the SIE program and are based on only 65 of the 67 districts participating in SIE.


Table 8 Factors Affecting Teacher Use of the SIE-Provided Module†

Factor                                                                                          Percent of Teachers (N = 1,100)
The training I received from SIE made it easier for me to use the module.                       91
My school was supportive of my use of the module.                                               89
Other teachers in my school provided a support system for use of the module.                    78
My own science background was helpful when I was teaching from the module.                      76
There is not enough instruction time for science to effectively use the SIE module.             55
My SOS facilitated my use of the module.                                                        51
The pressures to teach mathematics and/or reading inhibited my use of the module.               47
I did not have the module for long enough to use as much of it as I wanted to.                  32
I received assistance from SIE outside of the 3-day training that helped me to use the module.  20
My lack of experience in science made it more difficult for me to teach from the module.        10
† Represents those teachers agreeing or strongly agreeing with each statement.

As part of the student achievement study (described more fully later in this report),14 HRI asked teachers to complete a module use survey indicating which lessons from the module they used in their classroom and the extent to which they used other materials to teach the topic covered by the module. As can be seen in Table 9, teacher use of the SIE-provided modules varied greatly. On average, teachers using Electric Circuits covered 86 percent of the lessons in the module; teachers using Levers and Pulleys covered 67 percent of the module’s lessons.

Table 9 Teacher Use of Modules†

Percent of Module Lessons Taught

Module                   N    Mean   Standard Deviation   Minimum   Maximum
Electric Circuits        87   86     17                   35        100
Variables                57   81     18                   43        100
Motion and Design        31   78     27                   29        100
Rocks and Minerals       71   76     27                   6         100
Mixtures and Solutions   40   72     23                   33        100
Levers and Pulleys       39   67     30                   15        100
† Preliminary findings based on data from the 48 of 73 schools participating in SIE that returned data in time to be included in this report.

In interviews, the great majority of teachers indicated that a lack of time affected their module implementation, with both instructional time and preparation time mentioned as constraints. Some teachers reported receiving their modules late in the school year, resulting in additional time pressure. Others indicated that they were being asked to teach science at the expense of other disciplines (e.g., social studies, art). Several teachers mentioned that a barrier to using the modules was a lack of space/classroom configuration to carry out the activities, including a lack of room to store the numerous materials included with the module. A few teachers mentioned

14 The student achievement study focused on 6 of the 11 modules provided by SIE in Year One.

that their classrooms were not intended to be used to teach science and that they did not have equipment necessary for some of the activities (e.g., sinks for water, flat desktops). Typical responses included:

Time! I spent a lot more time teaching science, social studies was left out, handwriting was left out, creative writing was left out.

The module came late in the year which was good. I had reading and math well in hand and almost completed. With another module next year, I don’t know how I can do it.

There were a lot of little things prior to doing it that I had to do. There was a lot of clean up. A lot of liquids and solids and I had to wash all of the containers to be used again. And finding the space to let everything dry. Space was a real problem. I did buy some extra containers, dishpans and things, for around the room, but at my personal expense.

Time limitations were not the only reason teachers skipped parts of the module. Teachers reported a number of reasons for cutting lessons, including their own lack of interest in an activity or not liking how the activity was designed. In the words of three teachers:

We really only did about five lessons because we got it so late in the year and we didn’t want to disrupt the daily routine all that much. I kind of skipped around. I chose the ones that we actually did in the training and the ones with the materials in the kit.

I’m skipping the very last one because it’s called water from home and I’m just not interested in where the kids are bringing in water from home. I just don’t want to get into that.

We skipped the rain gauge lesson because, of course, we have to be flexible because it depends on the weather. And we talked about this as a team, we didn’t really like the rain gauge lesson because number one, they want you to use cubes to measure the collection of rain you get and we feel cubes are not something you use to measure.

In addition to some teachers cutting out entire lessons, about half of the interviewed teachers reported making modifications to the lessons. One of the most common reasons for the modifications was to save time. Teachers reported making the lessons more directed to shorten the investigation, or omitting parts of the activities that they judged to be redundant. Some teachers tried to save time by telling students what to write in their science notebooks. In contrast, some teachers thought that, because it was their first time implementing the module, it was important to follow each lesson as it was laid out in the module.

The time issue will be important to address. Given that a number of teachers raised concerns about having time to implement one module, this issue will likely become more pressing when they are asked to implement a second module next year. The implementation of the science portion of the PSSA may help convince teachers of the necessity of devoting more time to teaching science, but the breadth of the state science standards, coupled with the amount of time it takes to implement a module that covers a relatively small fraction of those standards, will likely force teachers to make difficult decisions about their instruction. Thus, SIE should

seek ways to proactively address this issue. If teachers feel it is necessary to skip portions of a module because of time constraints, the program may want to provide guidance as to which lessons and activities are critical to cover and which are candidates for omission while still maintaining the conceptual storyline of the module.

Student Opportunity to Learn
A critical indicator of the success of SIE is the extent to which participation in the program increases student opportunity to learn important science. Teachers were asked in interviews whether they had taught the content in their SIE-provided module in previous years; most said no. For some, the module was the only science instruction they included this year; for others, the use of the module required them to skip science topics they had typically covered in the past. Teachers who had covered the topic of their module in the past usually indicated that the incorporation of the module allowed them to cover topics in greater depth than they had previously. These teachers also reported that they had not used as many hands-on and inquiry-based strategies in the past. As two teachers said when asked how their teaching of the topic was different with the SIE-provided module:

The big difference is this one is hands on. This one was inquiry, this one was investigation. When I taught water in the first part of the year, we read the textbook and talked about it and had some worksheets on it. We did some experiments. But with the SIE module on water we didn’t take out a textbook and read at all. There was no reading of facts and knowledge. It was all discovery on their own.

To some extent I have taught some of the things in the kit [in previous years], but not the same design. Well not all of it, parts of it. It [the module] is more in depth. Before I used the science book and little science experiment books that I had bought myself.

Even for those who had not taught the topic in the past, the use of the SIE-provided modules appears to have increased the number of hands-on activities teachers use. Teachers cited having all of the lessons written out, having all of the materials included, and the modules’ sequential approach to the topic as critical to their ability to make this type of change in their science instruction. As one teacher said:

I always knew that inquiry was the best method for teaching science, but I just couldn’t always come up with the activities and supplies on my own to do those kinds of things with the inquiry method. I tried to do that, but it just isn’t realistic when you are trying to come up with those activities on your own and find the materials to do the activities.

These impacts have also been noted by principals and district staff. When asked what impacts the SIE program has had on their science program, the transition to hands-on science instruction through the use of the modules was the most common response given by district science supervisors and principals. Some noted that the use of the modules has increased the motivation and enthusiasm their students have for science, and others said that more science is being taught in their schools at the early elementary level.


We’ve noticed a real increase in the motivation of our students to want to learn science. Yesterday I received a letter that a parent wrote to their second grade science teacher. Their child said that after having science with this teacher their child now wants to be a scientist when they grow up. Teachers are impressed with the vocabulary the students are using in science too. The kids really like the hands on and look forward to science now. [District Science Supervisor]

The teachers have moved from a lecture approach to the teachers being more of a facilitator. [Principal]

To better understand how the modules and the professional development teachers attended were being translated into classrooms, members of the evaluation team observed the classroom instruction of a sample of 30 participating teachers. Trained observers visited each of these teachers between April and May 2007 while they were implementing the module. For each teacher, the evaluator observed a science lesson based on the module in which the teacher had received SIE professional development, and then conducted an extended interview with the teacher. The classroom observations, which focused on the six modules being examined in the student achievement study, assessed the quality of instruction in terms of the opportunity for students to deepen their understanding of the intended content. HRI randomly selected 15 schools to visit, with the goal of observing 2 randomly selected teachers within each school. Evaluators were able to observe 26 of the 30 teachers originally sampled; four backup teachers needed to be observed,15 including one teaching from a module outside the six focus modules. Table 10 shows the number of lessons observed in each module.

Table 10 Observed Lessons

Module                   Number of Lessons Observed
Electric Circuits        7
Motion and Design        6
Mixtures and Solutions   6
Rocks and Minerals       4
Levers and Pulleys       4
Variables                2
Water                    1

As mentioned previously, the framework HRI used for assessing opportunity to learn is based on research on how people learn. In short, research suggests that science instruction is most effective when instructional activities are well-aligned with the learning goal, engage students intellectually (i.e., “minds-on” rather than just “hands-on”) with important and accurate content

15 Backup teachers were selected for a variety of reasons. One original sample teacher had a student teacher and wasn’t leading the class. A couple had finished instruction with the module prior to the visit. The fourth was team teaching.


at an appropriate level of difficulty, reflect the nature of science (e.g., requiring conclusions to be supported by data/evidence), and help students make sense of the important ideas. The framework also specifies that learning is facilitated by a supportive classroom climate that encourages active participation of all students. Following is an analysis of the observed lessons on each element of this framework, as well as an analysis of how teachers’ modifications to instructional materials affected student opportunity to learn.

Extent to which the disciplinary content experienced by students was accurate
Reviewers examined the accuracy of the content students experienced, including the content presented by the teacher, the content in the instructional materials the students used, and the conclusions drawn from data/evidence. The science content in the observed lessons was generally accurate; although content errors were present in about one-quarter of the lessons, most were minor in nature. For example, in a lesson based on the Mixtures and Solutions module, one teacher incorrectly stated that when a solute dissolves in a solvent, the solute is “all gone.” Although relatively minor, this type of misstatement can foster the common misconception that the solute is no longer present in the solution.

However, there were a few instances in which the content error was more substantive. In a lesson from Electric Circuits, students were testing the conductivity of a variety of common objects. Although student data were consistent for many of the objects, a discrepancy about the conductivity of a pencil arose. Some students found that they could use a pencil to complete a simple electric circuit if they used the metal band around the eraser. Other students, who were testing just the graphite and wooden parts, could not complete the circuit with the pencil.
The teacher asked, “If it works sometimes and not other times what would it be?” A student responded that it would be “a semiconductor,” an answer that the teacher accepted as correct (even though it is incorrect), rather than trying to resolve the difference in the experimental procedure that led to the inconsistent data.

Extent to which the nature of science was accurately portrayed
Understanding the nature of science (e.g., how scientific knowledge is generated) is an important aspect of learning about science. Students need to know that science involves the drawing of conclusions based on gathered data, not the memorization of a series of facts. In addition, student data collection should be systematic, with evidence critically examined and interpreted. For some concepts, especially those that research has shown to be resistant to change, such critical analysis is essential. For example, children have observed that a moving object will eventually come to a stop. Many students believe that the object “runs out of force,” and telling them that in actuality it stops because another force is opposing the motion will not convince most students. Therefore, evaluators looked for the use of data and evidence to support and critique conclusions in the observed lessons. In addition, evaluators looked for the appropriate use of scientific language, careful recording of observations/data, controlling of variables, and appropriate attention to scientific uncertainty when present.

The nature of science was accurately portrayed in about half of the observed lessons. In these lessons, data were carefully recorded and variables controlled during experimentation. Scientific language was used appropriately, and students were expected to use evidence when sharing their
findings. Furthermore, the teachers modeled and structured the lessons so there was an expectation of using data and drawing conclusions from the data. For example:

In a fourth grade class using Electric Circuits, students were focused on applying what they had learned about complete circuits to identify hidden circuits within “puzzle boxes.” After a whole-class review of the concept of a complete circuit and how electricity flows through it, students were introduced to the use of a circuit tester. After small groups had assembled their own circuit testers, the class was presented with 20 different puzzle boxes and asked how they would test for a circuit within each box. The students generated a number of strategies to allow for systematic exploration. For example, one student suggested that it would be important to write down results for each test; another suggested that drawing the boxes or circuits in their notebooks could help; a third suggested that the class go through the experiment using “ordered pairs” (a concept that was part of an earlier mathematics lesson). As students generated ideas, the teacher recorded their strategies on the board. Following the class discussion on data collection, each group began testing the circuits within the puzzle boxes and recording their data. Each group was careful to record the box letter and the number of each circuit and whether it passed or failed the test with their circuit tester. Two attempts were completed for each circuit to help control for potential error.

The teacher also encouraged the use of scientific processes such as confirmation of findings, collaboration with others, review with an “expert,” and sharing one’s findings widely. For example, during the lesson when one group of students was unable to find a complete circuit in two of the boxes, the students carefully re-tried each circuit, and then asked another group to try as well.

Student 1 (group 1): I can’t get any of these to work.
Student 2 (group 1): Let me try (attempts each circuit). None are working, maybe it’s broken.
Student 1 (group 1), speaking to group 2: Did you get any of these to work … on Box D?
Student 3 (group 2): We didn’t do Box D. Give it to me … we’ll try it.

Groups 1 and 2 join while the students in group 2 explore Box D.

Student 4 (group 2): No, this box has to be broken … none of them are working at all.
Student 1 (group 1): Mr. X, this box is broken. Nothing is working at all.
Teacher (Mr. X): (tries each circuit) Uh, oh, you’re right, nothing is lighting! Let’s open it and see.

The students then open the box and the wire falls out. The students and the teacher laugh.

Teacher (Mr. X): Well I guess I screwed that one up!

The class was stopped, the box was shown to everyone, and the two groups were thanked for their discovery. A similar situation occurred later, and that group was encouraged to explore the inside of the box as these groups had. At the conclusion of the lesson, the students shared their observations with the class and a brief summary was made of the data.

Observers also noted several lessons in which students were carefully collecting data, but there was little to no expectation for critical thinking around the meaning of the collected data. About a quarter of the observed lessons included little, if any, use of data/evidence to draw conclusions. In most of these lessons, the investigation was presented more as a recipe of instructions to follow rather than as a systematic exploration. For example, in a fifth grade lesson from Levers and Pulleys on the advantages of a single-fixed pulley and a single-moveable pulley, the emphasis of the lesson was on accurate data collection techniques rather than the science content (i.e., to gain an understanding that a single-fixed pulley provides a directional advantage and that a single-moveable pulley provides a mechanical advantage). As the observer wrote:


The lesson began with the teacher reviewing in detail how a spring scale can be used to measure the effort force required to lift a load in a pulley system. The class also reviewed terminology related to single fixed and moveable pulleys. After the teacher demonstrated the procedures for the activity, students worked in groups to build the two pulley systems and use the spring scales to record measurements on their data collection sheets. At the end of the activity, the teacher asked students to report out their data findings, recording the data on an overhead. The idea of measurement error was addressed and discussed in relation to the use of the spring scale throughout the lesson. As students’ data were compared, differences were discussed and reasons suggested for the differences (e.g., lack of spring scale calibration, looking at the scale at an angle). The measurement lecture did reinforce the importance of multiple measures and accuracy with the instrument, and students seemed to know if their measurements needed to be adjusted. However, students were not asked to support/critique claims with their data. Rather, it appeared that there were “right” answers from the experiment that the teacher was looking for students to provide. If the students’ findings were not in line with what the teacher expected, he would look at the students questioningly and wait for them to correct their response, or tell them that the data were incorrect and ask them to do the experiment again. Thus, the emphasis of the lesson was on how to use the spring scale accurately rather than on the differences between the two pulley systems.

In the remaining one-fourth of the lessons, the importance of controlling variables and of consistency in data collection was not emphasized at all. For example, in a fifth grade lesson from Motion and Design, students were to apply what they had learned about motion in building, testing, and redesigning a vehicle to meet provided specifications (e.g., to travel a certain distance in a certain time). Although student groups had created an initial design and cost estimate for their vehicle, the testing and modification of their designs were not carefully conducted. None of the groups recorded their test results or the modifications they made to their vehicles, and several lost sight of the end goal they were working towards. Wrote the observer:

Students were building, testing, and modifying vehicles they had constructed during a Motion and Design lesson intended to have students apply what they had previously learned. The class began with the teacher asking one group of students about their plan to use rubber band energy in moving their car up an incline (e.g., could they think of alternative designs that would also work). After the teacher reminded students that they might need to change their plans to improve their designs, the remainder of the lesson was spent by students constructing and testing their vehicles.

Although students were busily testing their vehicles and modifying their designs, they were not keeping track of what they tested and the results of those tests. For example, the teacher noticed that one group was not recording the number of times they wound the rubber band powering their vehicle. The teacher asked, “How can you make your vehicle go faster? Is there a difference between small and big rubber bands? Sometimes it may help to count.” With different ideas expressed in the questioning, the idea of purposeful testing and making modifications based on previous experiments did not come through. In order to meet their design challenge, the group varied the number of times they wound the rubber band, switched rubber bands, and changed tires one after another in no systematic way.

In another lesson, the evaluator noted that the teacher was so focused on getting the students’ observations to match the expected results listed in the teacher’s manual for the activity, that the important elements of scientific experimentation, such as careful recording of data and scientific uncertainty, were ignored.

In this lesson, the students conducted investigations of four different pulley systems they had previously explored (i.e., single-fixed pulley system; single-movable pulley system; single-fixed/single-movable pulley system—effort down; single-fixed/single-movable pulley system—effort up). During the investigation, they recorded data for the weight of the load, the effort needed to lift the load, and the
number of ropes supporting the load. After each group reported out their data, a class mean was calculated and compared to the theoretical values. The teacher pointed out that for three of the systems, the class means were off by a small amount:

Teacher: These are the theoretical results; this is what you should be getting. Look here [pointing to the chart on the board], you are off by 0.1, and here by 0.3 and 0.1. That’s pretty good, but here you are off by 1.3. That’s not good. What system is that? We need to redo this one again.

The teacher did not explore any possible explanations for the measurements being so far off and indicated that the students must not have followed the directions or been careful with their measurements.
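The “theoretical results” in a lesson like this follow directly from the load and the number of rope segments supporting it, which is why the students recorded that count. A minimal sketch of the underlying physics (all loads and class means below are made-up illustrative values, not data from the report):

```python
def theoretical_effort(load: float, supporting_ropes: int) -> float:
    """Ideal (frictionless) effort force for a pulley system: the load
    divided by the number of rope segments supporting it. A single-fixed
    pulley has 1 supporting rope (directional advantage only); a
    single-moveable pulley has 2 (mechanical advantage)."""
    return load / supporting_ropes

load = 4.0  # newtons; hypothetical load
for name, ropes, class_mean in [
    ("single fixed", 1, 4.1),      # class means are made-up examples
    ("single moveable", 2, 2.3),
]:
    ideal = theoretical_effort(load, ropes)
    print(f"{name}: ideal {ideal:.1f} N, measured {class_mean:.1f} N, "
          f"off by {abs(class_mean - ideal):.1f} N")
```

Comparing a class mean against the ideal value, as in the vignette, is exactly this last subtraction; exploring why the two differ is where the scientific reasoning the observer found missing would come in.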

Extent to which all students were intellectually engaged with the targeted ideas in the lesson
Another indicator of a high quality lesson is the extent to which students are intellectually engaged with the ideas targeted by the module. In assessing this aspect, observers examined the extent to which students were thinking and talking about the science content and were engaged in accessible and challenging intellectual work. Students understanding what they were doing and why they were doing it also provided evidence of intellectual engagement.

Students were, for the most part, on task and cooperative in each of the observed lessons. Students were often observed following activity procedures; collecting and recording data; listening and taking notes; and completing assigned problems. Students were highly engaged with the important ideas in a small number of the observed lessons. These lessons were both structured and implemented in ways to facilitate intellectual engagement. For example, the observer reported that fourth graders in a classroom using the Electric Circuits module were very engaged in determining the connections in their mystery boxes:

In small groups, students were actively engaged in testing the circuits in each box. Students were also diagramming the boxes/circuits and where they thought wires were connected within the boxes. The teacher asked the students questions as he circulated among groups and allowed the students time to explore responses. The teacher asked a number of groups to share their data charts and drawings. In each case, the teacher asked students to identify at least one example of how they thought the wiring was connected in a sample puzzle box and to explain their reasoning. The students were very actively engaged and focused with the ideas in the lesson. Each group operated independently with no need to seek information beyond the small group exploration.

Although students were physically engaged with the activities in a lesson, observers often noted that students were not intellectually engaged with the ideas inherent in the activities (i.e., the lessons were “hands-on,” but not “minds-on”). In the majority of the observed lessons, students were focused on collecting data and/or completing questions related to an activity, but were not focused on the meaning behind the activity. Their work was often procedural, not conceptual, in nature. For example:

Students in a fifth grade class working in the Variables module were using a plane they had previously built to set up a controlled experiment. Students were challenged to figure out the minimum number of times the propeller needed to be wound to fly the plane the length of a designated line. Students worked in groups to conduct several trials and record all information about the flights of their planes on a worksheet. Next, students were challenged to find out how many winds of the propeller it would take to fly the plane halfway down the line. Once groups had investigated this question, the whole class discussed what variables might have affected the flights of their planes.


This lesson was implemented in a hands-on manner, but was not minds-on for the students. Students were observed setting up their flights and recording measured distances and number of propeller winds in their data tables, with student conversations focusing on what numbers to record. Although students were invested in meeting the challenge of finding the minimum number of winds to fly the plane the designated lengths, many students appeared to not understand what they were doing and why they were doing it. For example, two groups were sharing data and when they noticed that they had widely different results, they seemed lost as to how to explain the discrepancy. In addition, while students were recording the data, it was evident that they were not thinking about the data. One group noted that zero winds resulted in a flight distance of 123 cm, not considering the unreasonableness of this measurement or the fact that they had other trials with more winds that resulted in shorter flight distances. By the end of the experiment, two groups were obviously frustrated with their lack of success and expressed the sentiment that the task was “impossible … because it all makes no sense.” During the concluding activity only a few individuals shared their thoughts, and most seemed uninterested in the discussion.
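The minds-on version of this task amounts to reading a winds-to-distance table critically: a flight of 123 cm at zero winds fails a basic reasonableness check, and the “minimum winds to reach a target” question is answered from the recorded trials. A sketch with hypothetical trial data (none of these numbers come from the report):

```python
def min_winds_to_reach(flight_data: dict[int, float], target_cm: float):
    """Smallest number of propeller winds whose recorded flight distance
    meets or exceeds the target, from a table of winds -> distance (cm).
    Returns None if no recorded trial reached the target."""
    reached = [winds for winds, dist in sorted(flight_data.items())
               if dist >= target_cm]
    return reached[0] if reached else None

# Hypothetical recorded trials (winds -> distance in cm).
trials = {40: 150.0, 60: 320.0, 80: 470.0, 100: 600.0}
print(min_winds_to_reach(trials, 600.0))  # 100
print(min_winds_to_reach(trials, 300.0))  # 60

# A reasonableness check the observed group skipped: more winds should not
# fly the plane a shorter distance, and zero winds should fly it nowhere.
distances = [trials[w] for w in sorted(trials)]
assert distances == sorted(distances), "distance should grow with winds"
```

The point is not the code itself but the habit it encodes: recording every trial and asking whether the numbers make sense before drawing a conclusion.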

In a few lessons, students made observations and collected data while doing activities, but the teacher short-circuited the learning by telling students the correct answers. For example:

In a fourth grade lesson from Electric Circuits, the class was examining the predictions and observations they had made in a previous lesson as to whether various items were insulators or conductors. The teacher began by reviewing the terms insulator, conductor, and semiconductor. Next, the teacher went through each item, one by one, asking students what they had predicted and what they had found when they did their experiments. The teacher then tested each item's conductivity and told the students the "correct answer," without regard to what the students had found, having them make corrections to their notebooks. At the end of the lesson, students copied the definitions of insulator and conductor into their notebooks. Then the teacher "helped" the students write a power conclusion by synthesizing their predictions, experimental data, and error analysis into a paragraph, telling the students what to write.

In a handful of observed lessons, the level of intellectual engagement was mixed. For example, at the beginning of a Motion and Design lesson, students were highly engaged in making predictions about how adding a sail to a toy vehicle would affect its motion. Students enthusiastically generated and volunteered their ideas. However, later in the lesson when they were building their vehicles, students were often off task, playing with materials and not recording what did and did not work with their sail, as they had been instructed. The teacher had difficulty keeping the class working, and students were not engaged with the task.

Extent to which students were given the opportunity to "make sense" of the targeted ideas

The framework used to examine the sample lessons asserts that high-quality instruction involves the teacher and/or instructional materials helping students make meaningful and relevant connections between their experiences and key science concepts (i.e., "making sense" of the targeted ideas). Only about one-third of the observed lessons were judged to include some sense-making of the key science concepts in the modules. However, even in these lessons, observers reported that the sense-making opportunities were mixed in quality, or were provided for some of the targeted ideas but not others. The common element in lessons with sense-making activities was the teacher helping students make connections between the activities and conclusions. In these cases, the student notebooks were often used to refer back to experimental data. In some lessons, the teacher's facilitation of a group discussion helped students make sense of the content, as was the case in this lesson:


The lesson began with students reviewing vocabulary related to concentration using their science notebooks and a word wall. Next, students were shown three mystery solutions (one red, one blue, one green) and asked how they could determine if the concentrations were the same or different for each solution. Students worked in groups to discuss the query and develop a procedure for determining the concentration of each. Students were highly engaged in the task of developing a plan. After gathering their materials, students worked to implement their plans and gather data. The teacher walked around the room to answer questions and focused students on the important ideas of the lesson by referencing the word wall and notebooks. She also used probing and guiding questions to help students engage with key concepts, such as, “What can you tell me about the concentration if you’re getting those results on the balance?” and “What can you tell me about the solutions based on your results?”

Once all groups had completed the activity, the teacher led a class discussion in which groups shared their results and the teacher asked questions to ensure all students understood how to measure the concentration of a solution.

In some of the lessons that lacked sense-making, teachers appeared to assume that students would be able to identify the important ideas in the activities on their own. In other lessons, there was no structured time for students to make sense of the targeted ideas. In still other cases, the teacher attempted the sense-making but did not draw the students into the process; students were not asked to interpret their data or explain their understanding of any of the key concepts. The following lesson description illustrates inadequate sense-making:

In an Electric Circuits lesson, fourth grade students were introduced to the concept of using schematic symbols in constructing diagrams of simple circuits. After developing a definition of the term "decipher," students cut out pictures of symbols for circuit components and glued them into their journals. After the teacher modeled how to use the symbols to represent the parts of a circuit tester, the students were instructed to draw two different circuits using the symbols. After working on their own schematic drawings, the lesson ended with two students sharing their drawings on the board with the entire class. The students appeared to understand that the schematic symbols could be used in place of a literal drawing for items such as batteries and bulbs, but they did not understand how the schematic symbols should be connected in order to represent a complete circuit. Most students had not correctly connected the wire symbols to the symbols of the dry cells and bulbs to represent a complete circuit. For those students who included some type of wire symbol, the circuits were not closed (i.e., all of the parts connected so that electric current would flow). Also, students did not distinguish between the positive and negative sides of their dry cells by making the line on the positive side longer than the line on the negative side, as instructed. After the two students drew their schematics on the board, the teacher corrected their mistakes. However, it appeared that most students did not know how to draw a simple circuit using the schematic symbols. The students were not given time to reflect upon the corrections that the teacher made, nor were they given additional opportunities to practice making schematic diagrams. The wrap-up conversation included:

Teacher: (pointing to a student's diagram on the board, which contained only the symbols for two dry cells and one bulb) Why is this diagram not correct?
Students: (silence)
Teacher: Because these (pointing to the symbols) need to be connected. The way it is drawn, the bulb would not light, because it is an open circuit. We need to connect this. (The teacher drew in wire symbols.)
Teacher: (pointing to the next diagram, which contained the symbol for one dry cell and two bulbs) These all need to be connected. (She connected the symbols by drawing in symbols of the wires.) Okay, it's time to get ready for dismissal. You did a good job today.


The teacher acknowledged in the follow-up interview that the students were not making sense of the key idea of the lesson. During the interview, she said, “They got the symbols down. I think some of them didn’t understand that the symbols for the filament in the light bulb needed to be connected. Some of the kids took the actual sections of it and wrote it in a row and didn’t realize it until I told them they needed to connect it.” Although the teacher realized the students did not have a complete grasp of the lesson’s key concept, she nevertheless indicated she was going to move on to the next activity in the module in the following lesson.

Extent to which the classroom culture/learning environment facilitated students' opportunity to learn

Addressing important content, engaging students intellectually, and making sense of key ideas are all essential to student learning. In addition, a classroom culture that is respectful of students, encourages them to participate and share ideas, and maintains high expectations for all students facilitates learning. A collegial atmosphere, both among students and between students and teacher, that encouraged student participation was a strength of most of the observed lessons. For example:

In a sixth grade lesson from the Mixtures and Solutions module, students cooperated with the teacher and one another as they examined three types of crystals with hand lenses and shared their observations. The classroom was well managed, and many different students were called on to participate. If the teacher needed the students’ attention, he simply said, “give me five,” and the students looked to him for their next instructions. The students seemed comfortable asking questions of the teacher. For example, during the reading of an article about citric acid called “Sour Power,” a student asked, “In what country was he a chemist?” The teacher responded, “Read the rest and you’ll figure it out.” Also, the students interacted well with each other. They shared their materials when observing the crystals, and they took turns reading the “Sour Power” article with their partners. Students were eager to volunteer their answers when the questions from the article were reviewed as a class.

In another lesson, the teacher took special care to substantively involve all students in the learning:

This fourth grade lesson from the Electric Circuits module began with the teacher posing a focus question, “Does the bulbs’ brightness differ when the batteries and bulbs are arranged in different ways?” After making their predictions, students worked to build two different circuits, one with batteries in a series and one with batteries in parallel. The students worked collaboratively in pairs, assisting one another with their work.

The teacher spent time with the English language learners (ELL) in the classroom, assisting them with the lesson and indicating that expectations were the same for all students. The teacher used hand gestures and pointed to the light bulbs or diagrams on the board to help communicate the lesson and tasks. She used a dimmer lamp to illustrate "brighter" and "dimmer" for the ELL students, allowing them to understand the concept of brightness and to check for brightness in their circuits.

Overall, the teacher was highly respectful of all students, and moved about the classroom checking in with each pair.

In only a few of the observed lessons were there elements of the learning environment that created a barrier to student learning. These typically involved classroom management issues. For example, in one lesson the students immediately started to play with the materials on their desks and did not listen to the teacher's instructions, either before or during the activity. As a result, the teacher spent the majority of the lesson stopping to deal with behavioral interruptions and repeating the instructions for the activity.

Extent to which teachers made modifications to the lesson as described in the module and the extent to which those modifications affected students' opportunity to learn

As was described previously, a number of teachers reported making modifications to the lessons in their module. The observed lessons were no exception. Observers noted modifications in most of the observed lessons; only 8 of the 30 lessons adhered to the modules' lesson plans. In each case where a lesson was modified, evaluators indicated that the modifications likely reduced student opportunity to learn. Most often, the teacher omitted probing questions or the wrap-up session outlined in the instructional materials, reducing the sense-making opportunities in the lessons. Other teachers deleted some of the steps in an activity, for example, not requiring students to collect data or not having students share their data with the class. As one observer described:

In a sixth grade class, the goals of a lesson from Mixtures and Solutions were to review what had been learned in previous lessons and to introduce the terms concentration, dilute, and volume. Specifically, students were expected to learn that concentration expresses a relationship between the amount of dissolved material and the volume of the solvent; the more material dissolved in the liquid, the more concentrated the solution. In addition, they were expected to learn that a concentrated solution can be made more dilute by adding more solvent.

During the lesson, the teacher mixed up solutions with different concentrations of Kool-Aid (i.e., one scoop of Kool-Aid in a pitcher of water vs. three scoops in a pitcher; two scoops in 1,000 mL of water vs. two scoops in 500 mL) and asked students to compare their concentrations. When comparing the solutions, students were told to "think deep" and that "obviously they are different colors, but think of what they're made of. Think of all the ways they are the same and different." However, the teacher guide specifically states that the teacher should focus the students on the amount of ingredients, the appearance, and the taste of the solutions when making and recording their observations. This guidance is intended to help focus students on the idea of concentration and keep them from going off on non-productive tangents.

In addition, the teacher did not use the suggested discussion prompts such as “Which soft-drink solution tasted sweeter?” and “What happens to the soft-drink solution when you increase the amount of powder in a given amount of water?” Instead, the teacher asked one question that provided little insight into students’ thinking, “If I had a plain cup of water with six tablespoons of salt and one with two tablespoons of salt, which one is more concentrated?” This modification reduced the likelihood that students would connect the activity they conducted to the learning goals of the lesson.

In a number of lessons, the teacher omitted the reading provided with the module. Teachers may believe that these materials are intended solely to strengthen reading skills; however, the readings contain many of the important concepts the module is intended to teach that are not covered in the activities.

Overall extent to which the lessons provided students with an opportunity to learn the targeted ideas

All of the observed lessons included some elements of effective practice, though only a few lessons managed to "put it all together." Common strengths of the lessons included involvement of students with key content as well as collegial interactions among students and respectful contact between the students and the teacher. Areas of weakness included the extent of student intellectual engagement with and sense-making of the content. Students engaged in many hands-on activities, but the amount of intellectual rigor required of the students in most lessons was generally low. Modifications to the lessons, often in the form of deleting probing questions or wrap-up discussions, likely hindered students' opportunity to make sense of the target concepts.

IMPACT OF SIE ON STUDENTS

As part of the evaluation of the SIE program, HRI, with the assistance of the PDE, designed and implemented a study to examine the impact of SIE professional development and teachers' implementation of the science modules on student achievement in science. The study utilized a treatment/comparison group pretest/posttest design to compare assessment results for students receiving instruction on a topic using the SIE-provided modules to those not receiving instruction on that topic. The study sought to answer the following questions:

1. Do students of teachers who participated in SIE professional development and who implemented the SIE-provided module on a topic exhibit greater achievement in science after instruction than students of teachers who participated in SIE professional development but who implemented a module on a different topic?

2. Are there gender or race/ethnicity differences in student achievement, and if so, does use of an SIE-provided module reduce these differences?

3. Is there a relationship between the extent of use of the SIE-provided modules and student achievement?

4. Is there a relationship between the extent of teacher participation in SIE professional development and student achievement?

5. Is there a relationship between the extent of school/district participation in SIE events (i.e., the Strategic Planning Institute, Curriculum Showcase, and Vision Conference) and student achievement?

Instrumentation

HRI developed six assessment scales for this study, with each scale corresponding to a science module provided by the SIE program. The decision to develop assessments for only 6 of the 11 modules provided by SIE in Year One, made in conjunction with ASSET, Inc. and PDE, was based on two factors. First, in line with current state and federal assessment policies, it was decided not to assess students in Grades K–2. This decision reduced the candidate pool of modules to eight. Second, two of the modules provided by SIE in Year One focus mainly on science process skills, such as making observations, taking measurements, and using various scientific tools. Developing, administering, and scoring appropriate assessments for these modules, while possible, would be cost-prohibitive. Thus, it was decided to focus this study on the six modules for Grades 3–6 with a heavy focus on science concepts:

1. Grade 3: Rocks and Minerals (STC)
2. Grade 4: Electric Circuits (STC)
3. Grade 5: Levers and Pulleys (FOSS)
4. Grade 5: Motion and Design (STC)
5. Grade 6: Mixtures and Solutions (FOSS)
6. Grade 6: Variables (FOSS)

Because of the large number of students involved in the SIE program, and the timeframe in which results were needed, it was also decided to use only selected-response (i.e., multiple-choice) items rather than including open-ended or performance items.

For each module, HRI developed a pool of assessment items that covered the concepts included in both the module and the Pennsylvania Science and Technology Standards. All items went through a stringent internal review to ensure alignment with the targeted content and language accessibility. The items were then reviewed by Ph.D. scientists with expertise in the relevant topic area for content accuracy; any items with content issues were revised or removed from the item pool. HRI also sent the six pools of items to PDE for review and approval. For each module, HRI selected 20 items that covered various concepts in the content domain. These steps provide some assurance of the content validity of the assessments.

In addition, statistical analyses can be used to examine the validity and reliability of items. Factor and dimensionality analyses were used to determine whether a set of items formed a scale (i.e., a set of items that measure the same ability or trait, for example knowledge of levers and pulleys). Item response theory was then used to determine the extent to which individual items contribute to the scale. Together, these analyses identified a small number of items that did not function well. Table 11 shows the number of items out of the original 20 for each scale that were retained for these analyses, as well as the Cronbach's alpha reliability for each set of items.16

Table 11
Assessment Scale Reliabilities

Scale                    Number of Items Retained   Reliability (Cronbach's Alpha)
Rocks and Minerals                 18                           0.72
Electric Circuits                  20                           0.75
Levers and Pulleys                 18                           0.68
Motion and Design                  18                           0.75
Mixtures and Solutions             19                           0.73
Variables                          19                           0.81

16 Cronbach’s alpha ranges from 0 to 1; typically, a Cronbach’s alpha ≥ 0.60 is considered acceptable. The reliabilities reported in Table 11 are based on the post-test data.
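As a rough illustration of the reliability statistic reported in Table 11, Cronbach's alpha can be computed from a student-by-item score matrix as follows. This is a minimal sketch; the function name and the sample data are invented for illustration and are not taken from the study.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, n_items) matrix of item scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                                  # number of items
    item_variances = scores.var(axis=0, ddof=1).sum()    # per-item sample variances
    total_variance = scores.sum(axis=1).var(ddof=1)      # variance of students' totals
    return (k / (k - 1)) * (1 - item_variances / total_variance)
```

When students respond consistently across items, the totals vary much more than the individual items and alpha approaches 1; inconsistent responding drives it toward 0, which is why values above roughly 0.60 (as in Table 11) are taken as acceptable internal consistency.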


Race/ethnicity and gender data were also collected from students. Finally, teachers provided information about their classes, including how much of the module they covered and the extent to which they used other instructional materials to teach the topic.

The Sample

Of the 75 schools participating in the first year of SIE, 72 were expected to participate in the student achievement study (three schools were granted exemptions from the study by PDE as they were implementing modules other than those currently being provided by SIE). Of these 72 schools, 62 returned pre- and post-test data in time to be included in these analyses.17 The great majority of data returned were included in the analyses; however, data from a number of classes were excluded due to assessment administration errors (e.g., some teachers did not follow the instructions for distributing answer sheets, making it impossible to link student pre- and post-test data for those classes). Table 12 shows the number of classes that were expected to be part of the study for each assessment scale, as well as the number of classes included in the final analyses.

Table 12
Number of Classes in the Study

                             Using Module                Not Using Module
Scale                      Expected  Included in       Expected  Included in
                                     Analyses                    Analyses
Rocks and Minerals             91        73               124        77
Electric Circuits             135        96                89        71
Levers and Pulleys            111        91               125        89
Motion and Design             125        87               111        80
Mixtures and Solutions         76        48                87        56
Variables                      87        54                76        47

Table 13 provides demographic information for students in the classes included in these analyses. Overall, the classrooms contained slightly fewer females than males. Most students classified themselves as White; nearly all students indicated that English was their primary language.

17 Sixty-eight schools returned pre-test materials. Five schools returned post-test assessments after data collection and analysis ended.


Table 13
Student Demographics, by Assessment Scale

                                           Percent of Students
                                Rocks and    Electric     Levers and   Motion and   Mixtures and
                                Minerals     Circuits     Pulleys      Design       Solutions     Variables
                                (N = 2,604)  (N = 3,072)  (N = 3,435)  (N = 3,111)  (N = 2,089)   (N = 1,979)
Gender
  Female                            48           49           49           48           49            49
  Male                              52           51           51           52           51            51
Race/Ethnicity†
  American Indian/Alaskan Native     1            1            1            1            1             1
  Asian                              1            1            1            1            2             1
  Black/African-American            14           14           13           13            8             6
  Hispanic/Latino                    5            2            3            3            4             4
  Native Hawaiian/Other
    Pacific Islander                 0            0            0            0            0             0
  White                             82           84           85           86           89            91
English is Primary Language         97           99           98           99           97            98
† The total percent may add to more than 100 as students could select more than one category.

Analysis and Results

Item response theory (IRT) was used to compute difficulty and discrimination parameters for each item. These parameters were then used to calculate "true test scores" for each student, an IRT-based score that takes into account the relative difficulty of each item. Using the true test score removes the error associated with day-to-day fluctuations in student performance, and it provides a better estimate of student knowledge than a raw score such as the number or percent correct. The true test scores were scaled to have a minimum of 0 and a maximum of 100. Table 14 shows the true test scores for each assessment scale.

Table 14
Mean True Test Scores (and Standard Deviations), by Module Use

                             Not Using Module                   Using Module
Scale                      Pre-Test        Post-Test        Pre-Test        Post-Test
Rocks and Minerals         36.76 (11.23)   40.43 (13.30)    35.42 (10.72)   56.27 (16.20)
Electric Circuits          46.30 (12.70)   49.10 (13.00)    42.35 (10.58)   59.26 (12.47)
Levers and Pulleys         38.86 (8.73)    41.09 (9.86)     41.57 (10.05)   52.40 (11.27)
Motion and Design          40.34 (13.71)   44.15 (14.42)    39.61 (13.19)   45.91 (14.90)
Mixtures and Solutions     49.43 (11.88)   50.67 (12.83)    48.44 (10.82)   59.23 (13.79)
Variables                  47.56 (16.65)   50.66 (17.77)    49.95 (17.06)   53.29 (18.16)


Because these scores do not take into account initial differences among students, or variation in student demographics among classes, they should not be directly interpreted. Statistical techniques such as regression or analysis of variance can be used to control for these factors. However, the student assessment data have a nested structure, with students nested within classes. Statistical techniques that do not account for potential grouping effects (e.g., all students in a class having common instructional experiences) in nested data structures can lead to incorrect estimates of the relationship between independent factors and the outcome. Hierarchical (multilevel) regression modeling (HLM) is an appropriate technique for nested data18 and was used to examine student assessment scores. Results for each model follow, organized by research question. Regression coefficients and standard errors are presented in Appendix E.

Do students of teachers who participated in SIE professional development and who implemented the SIE-provided module on a topic exhibit greater achievement in science after instruction than students of teachers who participated in SIE professional development and who implemented a module on a different topic?

HRI examined two sets of models for each assessment scale to answer this research question. The first set compared the achievement of the treatment and comparison groups on the pre-test to examine initial equivalence of the groups. The second set of models examined post-test scores. The models also examined whether achievement gaps existed by gender and race/ethnicity. If achievement gaps were found, the models investigated whether use of the SIE-provided module reduced the gap. Because the number of students classifying themselves as any group other than White was small, the race/ethnicity data were collapsed into two categories: White/Asian vs. non-Asian minority.19
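A small simulation illustrates why this nesting matters. This sketch only motivates the multilevel approach rather than reproducing the HLM fit itself; the class counts and variance components below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated nested data: 40 classes of 25 students each (illustrative values,
# not the study's actual data). Each class shares a common "class effect,"
# mimicking students in a class having common instructional experiences.
n_classes, n_students = 40, 25
class_effect = rng.normal(0, 5, size=n_classes)
scores = 50 + class_effect[:, None] + rng.normal(0, 10, size=(n_classes, n_students))

flat = scores.ravel()

# Naive standard error of the grand mean: treats all 1,000 students as independent.
naive_se = flat.std(ddof=1) / np.sqrt(flat.size)

# Cluster-aware standard error: treats the 40 class means as the independent units.
class_means = scores.mean(axis=1)
cluster_se = class_means.std(ddof=1) / np.sqrt(n_classes)
```

Because the shared class effects make students within a class more alike, the cluster-aware standard error is substantially larger than the naive one; ignoring the nesting would overstate the precision of any group comparison, which is why HLM was used for these analyses.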

Treatment group students significantly and substantially outperformed comparison group students on 4 of the 6 assessment scales: Rocks and Minerals, Electric Circuits, Levers and Pulleys, and Mixtures and Solutions.

Rocks and Minerals
There was no significant difference between the treatment and comparison groups on the pre-test, indicating initial equivalence of the two groups. Controlling for pre-test scores and student demographics, students of teachers using the SIE-provided STC Rocks and Minerals module scored significantly higher than students of teachers not using that module (see Figure 6).20 The difference of 1.03 standard deviations represents a large effect.21

18 Bryk, A. S., & Raudenbush, S. W. (1992). Hierarchical linear models: Applications and data analysis methods. Newbury Park, CA: Sage Publications.
19 Asian students often outperform all other groups of students, and are often grouped with white students in these types of analyses.

Figure 6
Rocks and Minerals: Predicted Test Scores, by Group
[Bar chart of predicted pre- and post-test scores. Comparison: 36 (pre), 39 (post); Treatment: 36 (pre), 56 (post).]
* Controlling for pre-test scores and student demographics, treatment group post-test score significantly different than comparison group post-test score (HLM, p < 0.05).

Electric Circuits
There was a significant difference in pre-test score between the treatment and comparison groups, with comparison group students scoring higher than treatment group students. Controlling for student demographics and pre-test scores, students of teachers using the SIE-provided STC Electric Circuits module scored significantly higher than students of teachers not using that module, a large effect of 0.83 standard deviations. (See Figure 7.)

20 Pre-test score was used as a control variable when examining post-test scores, even when there were no initial differences between the groups. Doing so reduces the unexplained variance in post-test scores and increases the statistical power (i.e., the probability of detecting a difference if one truly exists) for examining other relationships. 21 The effect size for the comparison of two adjusted means is calculated as the difference between the means divided by the pooled standard deviation. Effect sizes of about 0.2 are typically considered small, 0.5 medium, and 0.8 large. Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.
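The effect-size calculation described in footnote 21 can be sketched as follows. The function and example values are illustrative only; the report's effect sizes were computed from HLM-adjusted means rather than raw group means.

```python
import math

def effect_size(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Difference between (adjusted) group means divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Illustrative only: a 16-point difference with a pooled SD of 16 yields d = 1.0,
# a "large" effect by the conventional 0.2 / 0.5 / 0.8 benchmarks cited in footnote 21.
d = effect_size(56.0, 40.0, 16.0, 16.0, 100, 100)
```

Expressing the difference in standard-deviation units makes effects comparable across the six assessment scales, which use different item sets and score distributions.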


Figure 7
Electric Circuits: Predicted Test Scores, by Group
[Bar chart of predicted pre- and post-test scores. Comparison: 45 (pre), 47 (post); Treatment: 42 (pre), 59 (post).]
* Controlling for pre-test scores and student demographics, treatment group post-test score significantly different than comparison group post-test score (HLM, p < 0.05).

Levers and Pulleys
There was no significant difference between the treatment and comparison groups on the pre-test. Students in classes using the SIE-provided FOSS Levers and Pulleys module scored significantly higher on the post-test than students in classes not using that module, controlling for student demographics and pre-test scores, a large effect of 0.90 standard deviations. (See Figure 8.)


Figure 8
Levers and Pulleys: Predicted Test Scores, by Group
[Bar chart of predicted pre- and post-test scores. Comparison: 40 (pre), 41 (post); Treatment: 40 (pre), 52 (post).]
* Controlling for pre-test scores and student demographics, treatment group post-test score significantly different than comparison group post-test score (HLM, p < 0.05).

Mixtures and Solutions
There was no significant difference between the treatment and comparison groups on the pre-test. Controlling for student demographics and pre-test scores, students in classes using the SIE-provided FOSS Mixtures and Solutions module scored significantly higher than students in classes not using that module, a medium effect of 0.61 standard deviations. (See Figure 9.)


Mixtures and Solutions

[Figure: predicted test scores (0–70 scale) for the comparison and treatment groups, pre and post, on the Mixtures and Solutions scale. Pre-test: 48 for both groups; post-test: 51 (comparison) and 59 (treatment).]

* Controlling for pre-test scores and student demographics, treatment group post-test score significantly different than comparison group post-test score (HLM, p < 0.05).

Figure 9

Motion and Design

Although post-test scores were, on average, higher than pre-test scores, there were no significant differences between the treatment and comparison groups on either the pre-test or the post-test. There are a number of possible explanations for this finding. One is that research has shown that students typically have misconceptions about the concepts dealt with in this module that are very resistant to change.22 Thus, the level of teacher knowledge and expertise required to implement this module effectively may be greater than that needed for the other modules. A second possibility is suggested by the fact that both the treatment and comparison group means were higher on the post-test. The comparison group for this scale received instruction using the FOSS Levers and Pulleys module, which may have provided students opportunities to learn some of the same concepts.

Variables

Although post-test scores were, on average, higher than pre-test scores, there were no significant differences between the treatment and comparison groups on either the pre-test or the post-test. As with the Motion and Design outcome, both the treatment and comparison group means were higher on the post-test than on the pre-test. Thus, it is possible that comparison group students learned the concepts measured on this assessment scale through the FOSS Mixtures and Solutions module.

22 See for example: Driver, R., Squires, A., Rushworth, P., & Wood-Robinson, V. (1994). Making sense of secondary science: Research into children’s ideas. New York: RoutledgeFalmer.


Are there gender or race/ethnicity differences in student achievement, and if so, does use of an SIE-provided module reduce these differences?

On each scale, white/Asian students outperformed non-Asian minority students. In addition, male students outperformed female students on three of the assessment scales: Electric Circuits; Levers and Pulleys; and Motion and Design. Table 15 shows the magnitudes of the pre-test and post-test achievement gaps. When gaps existed, they tended to be small. In addition, 2 of the 3 gender gaps and 2 of the 6 race/ethnicity gaps were smaller on the post-test than on the pre-test. However, the narrowing of these gaps was not related to instruction with the SIE-provided module.
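The gaps in Table 15 are expressed in standard deviations, i.e., standardized mean differences: the difference in group means divided by the pooled standard deviation. A minimal sketch of that computation (the score lists are made up for illustration):

```python
import numpy as np

def gap_in_sds(scores_a, scores_b):
    """Standardized gap: (mean A - mean B) / pooled standard deviation."""
    a, b = np.asarray(scores_a, float), np.asarray(scores_b, float)
    pooled_var = (((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                  / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical test scores for two groups.
males = [52, 48, 55, 47, 50, 53]
females = [49, 46, 51, 44, 48, 50]
print(round(gap_in_sds(males, females), 2))
```

A value of 0.20, for example, means the higher-scoring group averaged one-fifth of a standard deviation above the other group.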

Table 15
Achievement Gaps (in standard deviations), by Module

                              Gender Gap           Race/Ethnicity Gap
                              (males > females)    (white/Asian > non-Asian minority)
Module                        Pre-test  Post-test  Pre-test  Post-test
Rocks and Minerals              --        --         0.20      0.22
Electric Circuits               0.43      0.05*      0.22      0.28
Levers and Pulleys              0.27      0.17*      0.37      0.23*
Motion and Design               0.21      0.11       0.42      0.20
Mixtures and Solutions          --        --         0.19      0.21
Variables                       --        --         0.32      0.00*

* Post-test gap significantly different than pre-test gap in both treatment and comparison classes (p < 0.05).

Is there a relationship between extent of use of the SIE-provided modules and student achievement?

HRI also examined whether the extent of module use (see Table 9) was related to student achievement in treatment group classes. For 4 of the 6 modules (Electric Circuits; Levers and Pulleys; Motion and Design; and Mixtures and Solutions), the extent of use of the module was significantly and positively related to student achievement on the post-test (there were no differences on the pre-test associated with the percent of module used). In other words, on average, the greater the proportion of the module used, the larger the student achievement gains were. Table 16 summarizes the relationship between extent of module use and student achievement; effect sizes23 are relatively small, ranging from 0.11 to 0.19 standard deviations. In terms of test scores, students of teachers using one standard deviation more of the module would be expected to score about 2 points higher than students of teachers using the average amount of the module. For the other two modules (Rocks and Minerals; and Variables), the relationship was in the positive direction, but fell just outside the criterion used for determining statistical significance.

23 For continuous independent variables such as the percent of module lessons used, the effect size is calculated as the number of standard deviations change in the dependent variable for every one standard deviation change in the independent variable.
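This definition can be computed directly from a regression slope and the two standard deviations. The slope and standard deviations below are hypothetical, chosen so the result lands in the report’s 0.11–0.19 range and matches the "about 2 points per standard deviation of module use" illustration:

```python
# Effect size for a continuous predictor, per footnote 23: the number of
# standard deviations of change in the outcome per one-standard-deviation
# change in the predictor.
def continuous_effect_size(slope, sd_x, sd_y):
    return slope * sd_x / sd_y

# Hypothetical values: 0.08 test-score points per percentage point of the
# module used, SD(percent used) = 25, SD(test score) = 11.
# One SD more module use -> 0.08 * 25 = 2 more points on the test.
es = continuous_effect_size(0.08, 25.0, 11.0)
print(round(es, 2))   # ~0.18 standard deviations
```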


Table 16
Relationship Between Extent of Module Use and Student Achievement

Module                    Effect Size (standard deviations)
Levers and Pulleys          0.19
Electric Circuits           0.17
Mixtures and Solutions      0.17
Motion and Design           0.11

Is there a relationship between the extent of teacher participation in SIE professional development and student achievement?

Teacher professional development is a key component of the SIE program. In the first year of participation, teachers are expected to attend a three-day training on the module designated for their use. Although teachers are expected to attend all three days, not all teachers did. Thus, HRI was able to examine whether the extent of teacher participation in SIE professional development was related to student assessment scores. Based on program records, 91 percent of teachers for whom HRI had usable assessment data attended all three days of SIE professional development, ranging from 79 percent of teachers using the Levers and Pulleys module to 100 percent of teachers using Rocks and Minerals. The remaining teachers attended 2 of the 3 days; the data do not indicate which day teachers did not attend.

Because the relative number of teachers attending fewer than the full three days is small, HRI combined data across the six assessments to increase the statistical power (i.e., the probability of detecting a difference if one truly exists) of the analysis. Analysis of the pre-test data indicates that there was no significant difference in initial scores between the two sets of classes. On the post-test, students of teachers who attended all three days of SIE professional development outperformed students of teachers who attended only two days of the professional development by about 3 points (an effect size of 0.20 standard deviations). Full attendance at the professional development was not associated with a change in the gender and race/ethnicity achievement gaps. Interestingly, teachers who attended all three days of SIE professional development tended to use a greater proportion of the SIE-provided module (on average, 80 vs. 67 percent of the module).
However, whether full participation in the professional development leads to greater module implementation or whether there is a self-selection bias (e.g., teachers that are more enthusiastic about the program being more likely to attend the professional development and use the module) is not discernable from these data.
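The report does not describe exactly how scores were combined across the six assessments; one common approach, sketched here with made-up scores, is to standardize within each assessment before pooling, so that tests with different scales and difficulties can be analyzed together:

```python
import numpy as np

def pool_standardized(scores_by_assessment):
    """Z-score each assessment separately, then pool into one sample."""
    pooled = []
    for scores in scores_by_assessment:
        s = np.asarray(scores, float)
        pooled.extend((s - s.mean()) / s.std(ddof=1))   # within-test z-scores
    return np.array(pooled)

# Two hypothetical assessments on different scales.
pooled = pool_standardized([[40, 45, 50, 55], [60, 62, 64, 66]])
print(round(float(pooled.mean()), 2))   # mean is ~0 by construction
```

Pooling in this way increases the effective sample size, which is what raises the statistical power of the combined analysis.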


Is there a relationship between the extent of school/district participation in SIE events (i.e., the Strategic Planning Institute, Curriculum Showcase, and Vision Conference) and student achievement?

School and district support are important elements of successful reform. Without such support, teachers may be less likely to implement the reform, or may modify it in a way that impacts its efficacy. Although the evaluation did not attempt to measure teachers’ perceptions of support for SIE, school/district participation in the three events that might contribute to developing support for SIE can be viewed as a proxy for support. These three events were the Strategic Planning Institute, Curriculum Showcase, and Vision Conference.

HRI took two approaches to determine whether district participation in the three SIE events was related to student achievement. The first, and simpler, approach was to examine whether the number of events attended was related to student achievement. As can be seen in Table 17, over two-thirds of the districts from which HRI received assessment data attended all three events. Because very few districts attended fewer than two events, districts were categorized as having attended all three events vs. two or fewer events.

Table 17
District Attendance at SIE Events

Number of Events Attended    Percent of Districts (N = 56)
0                              2
1                              5
2                             25
3                             68

The second approach HRI took was to weight support by the length of the three events and the number of people attending each. The Strategic Planning Institute was five days long; in contrast, the Curriculum Showcase and Vision Conference were each one day long. In addition, districts were encouraged to send a larger number of people to the Strategic Planning Institute than to the other two events. Thus, HRI calculated the number of person-days each district committed to the three events. (See Table 18.)

Table 18
Person-Days Committed to SIE Events (N = 56)

Mean     Standard Deviation     Minimum     Maximum
23.77          8.45               0.00       49.00
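The person-days measure weights attendance by event length: five days for the Strategic Planning Institute and one day each for the Curriculum Showcase and Vision Conference. A minimal sketch (the attendee counts are hypothetical):

```python
# Event lengths in days, as described in the report.
EVENT_DAYS = {"Strategic Planning Institute": 5,
              "Curriculum Showcase": 1,
              "Vision Conference": 1}

def person_days(attendance):
    """attendance: event name -> number of district attendees.

    Returns total person-days: event length x attendees, summed over events.
    """
    return sum(EVENT_DAYS[event] * n for event, n in attendance.items())

print(person_days({"Strategic Planning Institute": 4,
                   "Curriculum Showcase": 2,
                   "Vision Conference": 3}))   # 5*4 + 1*2 + 1*3 = 25
```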

Neither measure was a significant predictor of student achievement. Because schools had to apply to participate in SIE, it may be that they are all supportive of the program. Another possibility is that variation in attendance at the three events is not a good indicator of variation in support. Given the quick start-up of the program, it is quite possible that some schools were unable to free up personnel to attend these events, which occurred early in the program year.


SUMMARY AND RECOMMENDATIONS

In its first year, the SIE program can be credited with a number of accomplishments. The program has conducted two Strategic Planning Institutes (one for schools in the western part of the state and another for schools in the eastern half) and Vision Conferences; provided schools with opportunities to explore the various modules being offered by SIE; provided three days of professional development to nearly 1,300 teachers across the state; and delivered a ready-to-use science module to each teacher participating in the program.

Overall, teachers, principals, and district staff have reported positive experiences with each of the elements of the SIE program. The Strategic Planning Institutes provided an opportunity for school teams to become familiar with science education reform and gave them the rare opportunity to work together on a plan for improving their science program. The Vision Conferences provided schools the opportunity to broaden the base of stakeholder support for science education reform. The three days of professional development were extremely well received by teachers and generated a great deal of enthusiasm for teaching science from the modules, even among teachers who were skeptical of the program at the start. In addition, the professional development was successful at providing teachers with the knowledge and skills they needed to feel confident in implementing a science module for the first time.

All of the observed lessons included some elements of effective practice, though only a few lessons managed to “put it all together.” Common strengths of the lessons included involvement of students with key content as well as collegial interactions among students and respectful contact between the students and the teacher. Areas of weakness included the extent of student intellectual engagement with and sense-making of the content.
Students engaged in many hands-on activities, but the amount of intellectual rigor required of the students in most lessons was generally low. Many teachers reported making modifications to the instructional materials during module implementation, often due to time constraints. Some teachers chose to skip entire activities; others cut or added pieces within individual lessons. Modifications to the lessons, often in the form of deleting probing questions or wrap-up discussions, likely hindered students’ opportunity to make sense of the target concepts.

Analyses of student achievement data provide evidence that the SIE program is having a positive effect on student learning in science. On 4 of the 6 assessment scales (Rocks and Minerals; Electric Circuits; Levers and Pulleys; and Mixtures and Solutions), students receiving instruction using the SIE-provided module related to the scale scored significantly higher than students not receiving instruction from that module, controlling for pre-test scores and student demographics. The effects were sizeable, ranging from 0.61 to 1.03 standard deviations. A few race/ethnicity and gender achievement gaps were found; in each case, the gap tended to be small, and when it did narrow from pre- to post-test, it narrowed in both treatment and comparison classes.

The student achievement analyses also found that the extent of both module use and participation in SIE professional development were positively related to student achievement. On 4 of the 6 assessment scales (Electric Circuits; Levers and Pulleys; Motion and Design; and Mixtures and


Solutions), the use of a higher percentage of lessons from the module was associated with higher student assessment scores, with effect sizes ranging from 0.11 to 0.19 standard deviations. Across all six assessments, students of teachers attending all three days of SIE professional development outperformed students of teachers attending only two days of professional development (an effect size of 0.20 standard deviations). Interestingly, teachers who attended all three days of professional development tended to use a greater proportion of lessons from the module. It is possible that the difference in professional development led to a difference in the amount of module used. However, it is also possible that a lack of buy-in or commitment to the SIE approach led to both less participation in the professional development and lower use of the module. Accordingly, the findings regarding the extent of module use and participation in professional development should be interpreted with caution; the results could be due to selection bias rather than a real effect of the program.

Although the project has accomplished a great deal in its first year, as is the case with most programs in their early stages, a substantial amount of work remains to be done. In the spirit of a critical friend, HRI offers the following recommendations to assist the program in reflection and planning for the future.

Provide greater support to districts, schools, and teachers in developing articulated curricula that incorporate the SIE-provided modules and that also ensure that all state science standards will be addressed.

Although the majority of districts participating in SIE report having curriculum guides to direct teachers in the selection of science topics to teach (most of which were developed pre-SIE), the evaluation data indicate that the enacted curriculum may not align well with Pennsylvania science standards. This lack of alignment is likely due to at least two factors. First, because of the emphasis on reading/language arts and mathematics due to high-stakes testing, science (and other non-tested subjects) has been deemphasized in schools (i.e., what gets tested is what gets taught). Second, elementary teachers often do not feel well prepared to teach science, consequently minimizing the amount of science they teach. Teachers also tend to cover the topics in which they feel comfortable and skip the others. Although the addition of a state science assessment may encourage schools to include more science in their instructional programs, the fact that science is not yet counted towards Adequate Yearly Progress may temper the impact of the science PSSA in this regard.

Furthermore, teachers reported difficulties finding time to implement even a single module, and many indicated they have skipped portions of their module because of time constraints. Some indicated that science was taking away from other subjects; others indicated that the module spent too much time on a relatively small subset of the state’s science standards. Adding modules in future years will likely exacerbate this situation, forcing teachers to pick and choose activities from the modules.

Continue to refine the initial three-day professional development provided to teachers.


The professional development sessions HRI observed varied widely with regard to their focus on the content storyline in the modules. In some sessions, facilitators consistently made explicit connections between module activities and content goals, but in other sessions they did not. Thus, the program should consider building a structure into the training (e.g., a sense-making activity for each lesson in a module that all facilitators would be expected to do) to help ensure that all participants understand the intended content in each lesson. In addition, there was often a disconnect between the portions of the trainings that focused on instructional strategies (e.g., inquiry, the learning cycle) and the module. Better integrating these different aspects will not only help teachers better understand the instructional strategies, but will also help them implement the strategies at appropriate places during their teaching of their module.

The program’s plans for future professional development wisely include a focus on deepening teachers’ understanding of the content in their first module. In addition, the program should work to strengthen teachers’ vision of effective inquiry-based science instruction.

In order for teachers to effectively use inquiry-based teaching methods, they need to have a clear vision of what inquiry instruction is and how they can use the module in an inquiry-based manner. In the three-day professional development HRI observed, the concept of inquiry was discussed during the first day’s overview, but no clear vision of inquiry ever emerged. Furthermore, inquiry was rarely discussed explicitly after the first day, nor was it related specifically to the module teachers would be using.

Observations of teachers implementing the modules highlight the need to help teachers move from mechanical to purposeful use of the modules. In order to accomplish this goal, the program and the participating teachers need to develop a shared understanding of what effective science instruction looks like. Without such a vision of teaching and learning, professional development cannot be focused on helping teachers work towards that goal. The set of knowledge and skills teachers need to achieve this vision becomes the objectives for professional development, and also provides teachers a “gold standard” for reflecting upon their practice. Developing this vision is not an easy or quick task; however, it will be essential if the program is to maximize its impact on science teaching and learning. Examining and discussing video of classroom instruction, using role-plays that provide examples and non-examples of effective teaching, and analyzing student work can all be used to help foster this common vision.

With the necessary addition of a large number of trainers for the program’s second year, the program should consider additional ways to build structures to support their work.


Providing effective professional development requires a great deal of knowledge and skill. First, professional development providers need to have a common vision of the goals of the program (e.g., what the program hopes science instruction will look like if it is successful). Second, future leaders need to have a strong understanding of the content in the modules and how the module activities address that content. Given the number of modules the program will eventually support, and the program’s plan to use this year’s participants as future facilitators, the program should consider ways to help leaders develop the knowledge they need. One possibility would be to have an expert review each module and create annotations that would make the disciplinary and pedagogical content more explicit. Providing this sort of scaffolding would make it less important that each individual facilitator possess deep content knowledge of each module. In addition to helping prepare future leaders, these annotations could be shared with teachers to assist them in learning about and implementing the modules.


Appendix A

District Curriculum Alignment Questionnaire and Summary of Data


District Curriculum Alignment Questionnaire http://www.horizon-research.com/sie/dist_curric/daq.php?quest=999

1 of 5 6/7/2007 3:31 PM

District Curriculum Alignment Questionnaire

The Pennsylvania Department of Education (PDE) has requested that each district participating in the Science: It's Elementary program complete this questionnaire. The responses will help PDE decide what support to offer districts in future years, so please be candid. If you have any questions about this survey, please email Eric Banilower at [email protected] or call (toll-free) 877-297-6829.

1. Does the district have a curriculum guide for science in grades K-6?

( ) Yes

( ) No

If you answered "no" to question #1 above, click here to skip to question #10.

2. What is this guide called in your district?

3. Was a process used to align the district K-6 science curriculum guide with the Pennsylvania Academic Standards for Science and Technology and Environment and Ecology?

( ) Yes

( ) No

If you answered "no" to question #3 above, click here to skip to question #7.

4. When was the most recent alignment process completed? (Select one.)

( ) 2001 or earlier

( ) 2002

( ) 2003

( ) 2004

( ) 2005

( ) 2006

( ) 2007

5. Who participated in the alignment process? (Select all that apply.)

[ ] Superintendent

[ ] Assistant superintendent

[ ] District science specialist/supervisor

[ ] Elementary school administrators

[ ] Elementary school teachers

[ ] Middle school science teachers

[ ] High school science teachers


[ ] University science educators

[ ] University/business/industry scientists

[ ] Other (please specify):

6. Briefly describe the process that was used to conduct the alignment:

7. What are the functions of the district science curriculum guide? (Select all that apply.)

[ ] To specify the grade(s) at which each Pennsylvania standard should be taught

[ ] To help the district select instructional materials

[ ] To help teachers decide what topics to cover/instructional materials to use

[ ] Other (please specify):

8. How was the district science curriculum guide for K-6 science communicated to teachers when it was developed? (Select all that apply.)

[ ] Each teacher responsible for teaching science in these grades received a copy of the district science curriculum.

[ ] The district provided staff development on the district science curriculum.

[ ] The district science curriculum was implicitly communicated through the selection of instructional materials.

[ ] Other (please specify):

9. How is the district science curriculum guide for K-6 science communicated to teachers that have joined the district since it was developed? (Select all that apply.)

[ ] New teachers receive a copy of the district science curriculum.

[ ] The district provides staff development to new teachers on the district science curriculum.

[ ] The district science curriculum is implicitly communicated through the instructional materials.

[ ] Other (please specify):

10. Which of the following best describes your district's policy regarding the selection of instructional materials for K-6 science?

( ) a. All schools in the district are required to use the same instructional materials.


( ) b. The district provides a list of materials from which individual schools may select.

( ) c. Individual schools select instructional materials without constraints by the district.

( ) d. Individual teachers select the instructional materials they want to use.

If you selected "c" or "d" in question #10 above, click here to skip to question #15.

11. Which of the following best describes the instructional materials used by your district to teach science in grades K-6?

( ) a. We predominantly use a series of materials from a single publisher.

( ) b. We use a series of materials from a publisher and some supplemental materials.

( ) c. We use lots of different materials.

If you selected "c" in question #11 above, click here to skip to question #13.

12. When were the current instructional materials for K-6 science adopted? (Select one.)

( ) 2001 or earlier

( ) 2002

( ) 2003

( ) 2004

( ) 2005

( ) 2006

( ) 2007

13. Who was involved in the selection of these materials? (Select all that apply.)

[ ] Superintendent

[ ] Assistant superintendent

[ ] District science specialist/supervisor

[ ] Elementary school administrators

[ ] Elementary school teachers

[ ] Middle school science teachers

[ ] High school science teachers

[ ] University science educators

[ ] University/business/industry scientists

[ ] Other (please specify):

14. Please describe the process by which these materials were selected, including how the alignment between the materials and the district science curriculum and/or the Pennsylvania Standards was assessed.


15. About how much professional development did a typical teacher receive on these instructional materials?

( ) None

( ) 6 or fewer hours

( ) Between 7 and 12 hours

( ) Between 13 and 18 hours

( ) Between 19 and 24 hours

( ) 25 or more hours

If you responded "none" to question #15 above, click here to skip to question #17.

16. Who provided the professional development? (Select all that apply.)

[ ] The publisher

[ ] District science specialist/supervisor

[ ] Other district personnel (please specify):

[ ] ASSET, Inc.

[ ] Other (please specify):

17. When will the next adoption for K-6 science instructional materials take place?

( ) 2007

( ) 2008

( ) 2009

( ) 2010

( ) 2011

( ) Other, please specify:

PDE has also asked that we interview a subset of the people responding to this survey. Please provide the following information so that we may contact you if you are selected for an interview.


Name:

Title:

Phone:

E-mail:

Best times to contact:



Number of valid questionnaires: 59

Q1
                                                     Valid N   Missing N   Percent No   Percent Yes
q1: Does the district have a curriculum guide
    for science in grades K-6?                          59         0           32            68

Q3*
                                                     Valid N   Missing N   Percent No   Percent Yes
q3: Was a process used to align the district K-6
    science curriculum guide with the Pennsylvania
    Academic Standards for Science and Technology
    and Environment and Ecology?                        39         1           10            90

* Including only those districts that have a curriculum guide for science in grades K-6.



Q4*
q4: When was the most recent alignment process completed?  (Valid N = 35, Missing N = 0)

    Response            Percent
    2001 or earlier        3
    2002                   6
    2003                  14
    2004                   9
    2005                  11
    2006                  37
    2007                  20

* Including only those districts that used a process to align the district K-6 science curriculum guide.

Q5*
Who participated in the alignment process?  (Valid N = 35, Missing N = 0 for each item)

                                                     Percent No   Percent Yes
q5a: Superintendent                                      94            6
q5b: Assistant superintendent                            63           37
q5c: District science specialist/supervisor              49           51
q5d: Elementary school administrators                    31           69
q5e: Elementary school teachers                           6           94
q5f: Middle school science teachers                      31           69
q5g: High school science teachers                        31           69
q5h: University science educators                        86           14
q5i: University/business/industry scientists             97            3
q5j: Other (please specify):                             71           29

* Including only those districts that used a process to align the district K-6 science curriculum guide.



Q7*
What are the functions of the district science curriculum guide?  (Valid N = 40, Missing N = 0 for each item)

                                                                     Percent No   Percent Yes
q7a: To specify the grade(s) at which each Pennsylvania standard
     should be taught                                                    23           78
q7b: To help the district select instructional materials                 45           55
q7c: To help teachers decide what topics to cover/instructional
     materials to use                                                    15           85
q7d: Other (please specify):                                             78           23

* Including only those districts that have a curriculum guide for science in grades K-6.

Q8: How was the district science curriculum guide for K-6 science communicated to teachers when it was developed?* (Valid N = 40, Missing N = 0 for each row; entries are Percent No / Percent Yes)

    q8a: Each teacher responsible for teaching science in these grades received a copy of the district science curriculum.   23 / 78
    q8b: The district provided staff development on the district science curriculum.                                        53 / 48
    q8c: The district science curriculum was implicitly communicated through the selection of instructional materials.      58 / 43
    q8d: Other (please specify):                                                                                            75 / 25

* Including only those districts that have a curriculum guide for science in grades K-6.



Q9: How is the district science curriculum guide for K-6 science communicated to teachers that have joined the district since it was developed?* (Valid N = 40, Missing N = 0 for each row; entries are Percent No / Percent Yes)

    q9a: New teachers receive a copy of the district science curriculum.                                 30 / 70
    q9b: The district provides staff development to new teachers on the district science curriculum.     80 / 20
    q9c: The district science curriculum is implicitly communicated through the instructional materials. 60 / 40
    q9d: Other (please specify):                                                                         70 / 30

* Including only those districts that have a curriculum guide for science in grades K-6.

Q10: Which of the following best describes your district's policy regarding the selection of instructional materials for K-6 science? (Valid N = 58, Missing N = 1)

    All schools in the district are required to use the same instructional materials:        47%
    The district provides a list of materials from which individual schools may select:       3%
    Individual schools select instructional materials without constraints by the district:   33%
    Individual teachers select the instructional materials they want to use:                 17%



Q11: Which of the following best describes the instructional materials used by your district to teach science in grades K-6?* (Valid N = 30, Missing N = 0)

    We predominantly use a series of materials from a single publisher:              37%
    We use a series of materials from a publisher and some supplemental materials:   47%
    We use lots of different materials:                                              17%

* Including only those districts that select the instructional materials for the whole district or provide a list of materials from which schools may select.

Q12: When were the current instructional materials for K-6 science adopted?* (Valid N = 25, Missing N = 0)

    2001 or earlier:  24%
    2002:             12%
    2003:              8%
    2004:             12%
    2005:              0%
    2006:             28%
    2007:             16%

* Including only those districts that used materials from a single publisher with or without some supplemental materials.



Q13: Who was involved in the selection of these materials?* (Valid N = 30, Missing N = 0 for each row; entries are Percent No / Percent Yes)

    q13a: Superintendent                            70 / 30
    q13b: Assistant superintendent                  57 / 43
    q13c: District science specialist/supervisor    50 / 50
    q13d: Elementary school administrators          10 / 90
    q13e: Elementary school teachers                 3 / 97
    q13f: Middle school science teachers            50 / 50
    q13g: High school science teachers              67 / 33
    q13h: University science educators              87 / 13
    q13i: University/business/industry scientists   97 / 3
    q13j: Other (please specify):                   83 / 17

* Including only those districts that select the instructional materials for the whole district or provide a list of materials from which schools may select.



Q15: About how much professional development did a typical teacher receive on these instructional materials? (Valid N = 59, Missing N = 0)

    None:                     22%
    6 or fewer hours:         31%
    Between 7 and 12 hours:   17%
    Between 13 and 18 hours:  22%
    Between 19 and 24 hours:   2%
    25 or more hours:          7%

Q16: Who provided the professional development?* (Valid N = 46, Missing N = 0 for each row; entries are Percent No / Percent Yes)

    q16a: The publisher                              43 / 57
    q16b: District science specialist/supervisor     63 / 37
    q16c: Other district personnel (please specify): 70 / 30
    q16d: ASSET, Inc.                                54 / 46
    q16e: Other (please specify):                    78 / 22

* Including only those districts where typical teachers received professional development on their instructional materials.



Q17: When will the next adoption for K-6 science instructional materials take place? (Valid N = 56, Missing N = 3)

    2007:   16%
    2008:   13%
    2009:   13%
    2010:    5%
    2011:   18%
    Other:  36%



Appendix B

Teacher Questionnaires

Horizon Research, Inc. June 2007


Science It's Elementary (SIE) Baseline Teacher Questionnaire

Instructions: Please complete this form using a #2 pencil or a blue or black ink pen. Darken circles completely, but do not stray into adjacent circles. Erase completely or white out any stray marks. Only group results will be reported from this questionnaire, and all responses are completely anonymous, so please be candid.

In an effort to track responses over time while protecting your identity, we ask that you create a unique ID number using the initials of your mother's maiden name (first and last name), your mother's birthday (2-digit month-day), and your 2-digit birth order. (For instance, if your mother's name is Mary Anderson, her birthday is April 10th, and you are the 2nd child in your family, you would bubble in MA 04 10 02.) Please use the boxes below to fill in the requested information, then darken the corresponding ovals.
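For analysis purposes, the ID scheme described above amounts to a simple string-construction rule. A minimal sketch follows (the function name is hypothetical; on the actual form the ID is bubbled in by hand):

```python
def make_tracking_id(mother_name: str, birth_month: int,
                     birth_day: int, birth_order: int) -> str:
    """Build the anonymous tracking ID: first and last initials of the
    mother's maiden name, her 2-digit birth month and day, and the
    respondent's 2-digit birth order."""
    parts = mother_name.split()
    initials = parts[0][0].upper() + parts[-1][0].upper()
    return f"{initials}{birth_month:02d}{birth_day:02d}{birth_order:02d}"

# The example from the instructions: Mary Anderson, born April 10th, 2nd child.
print(make_tracking_id("Mary Anderson", 4, 10, 2))  # MA041002
```

Because it uses only stable personal facts, the same respondent reconstructs the same ID on each questionnaire without ever supplying a name.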

[Response grid: ovals A-Z for the first and last initials of mother's maiden name; ovals 0-9 for mother's birth month and day and for your birth order.]

SIE Baseline Teacher Questionnaire, Horizon Research, Inc., December 2006


Page 81: Science: It’s Elementary Year One Evaluation Report · Melanie J. Taylor, Aaron M. Weis, Iris R. Weiss, and Justin Wiley. In addition, the following individuals assisted with data

I. Your Science Teaching

Questions 1-6 ask about your science teaching. If you teach science to more than one class of students, please answer for your first science class of the day in which you will be using a module provided by Science It's Elementary (SIE).

1. Please indicate the grade level of the students in this class. (Darken all circles that apply.)

    K 1 2 3 4 5 6 7 8

2. Do you teach in a self-contained classroom? (Darken one circle.)

    No Yes

3. How many lessons per week do you typically teach science to this class? (Darken one circle.)

    0 1 2 3 4 5

4. Approximately how many minutes is a typical science lesson? (Darken one circle.)

10 or fewer 11-20 21-30 31-40 41-50 51-60 61-70 71-80 81 or more

5. How many science units do you plan to cover over the course of this academic year? (We're defining "unit" as a series of related activities, often on a single topic such as sound, rocks, or genetics.) (Darken one circle.)

0 1 2 3 4 5 6 7 8 9 or more

6. How many weeks do your science units typically last? (Darken one circle.)

1 2 3 4 5 6 7 8 9 10 or more

II. Attitudes and Opinions

Question 7 refers to the FOSS and STC modules being provided by the Science It's Elementary (SIE) program.

7. Please indicate how true each of the following statements is of you at this time. (Darken one circle on each line; each statement is rated from 0, "Not true of me now," through "Somewhat true of me now," to 7, "Very true of me now," with an additional "Irrelevant" option.)

    a. I am concerned about students' attitudes toward the SIE modules.
    b. I now know of some other approaches that might work better.
    c. I don't even know what the SIE modules are.
    d. I am concerned about not having enough time to organize myself each day.
    e. I would like to help other faculty in their use of the SIE modules.

(Question 7 continued on next page.)


Page 82: Science: It’s Elementary Year One Evaluation Report · Melanie J. Taylor, Aaron M. Weis, Iris R. Weiss, and Justin Wiley. In addition, the following individuals assisted with data

7. Continued. (Darken one circle on each line.)

    f. I have a very limited knowledge about the SIE modules.
    g. I would like to know the effect of Science It's Elementary on my professional status.
    h. I am concerned about conflict between my interests and my responsibilities.
    i. I am concerned about revising my use of the SIE modules.
    j. I would like to develop working relationships with both our faculty and outside faculty using SIE modules.
    k. I am concerned about how the SIE modules affect students.
    l. I am not concerned about the SIE modules.
    m. I would like to know who will make the decisions regarding SIE modules.
    n. I would like to discuss the possibility of using the SIE modules.
    o. I would like to know what resources are available for implementing the SIE modules.
    p. I am concerned about my inability to manage all that the SIE modules require.
    q. I would like to know how my teaching is supposed to change using the SIE modules.
    r. I would like to familiarize other departments or people with the progress related to use of the SIE modules.
    s. I am concerned about evaluating my impact on students.
    t. I would like to revise the instructional approach of the SIE modules.
    u. I am completely occupied with other things.
    v. I would like to modify our use of the SIE modules based on the experiences of our students.
    w. Although I don't know about the SIE modules, I am concerned about issues in science education.
    x. I would like to excite my students about learning science from the SIE modules.
    y. I am concerned about time spent working with non-academic problems related to the SIE modules.
    z. I would like to know what the use of the SIE modules will require in the immediate future.
    aa. I would like to coordinate my efforts with others to maximize the effects of the SIE modules.
    bb. I would like to have more information on time and energy commitments required by the SIE modules.
    cc. I would like to know what other teachers are doing with regard to the SIE modules.

(Question 7 continued on next page.)


Page 83: Science: It’s Elementary Year One Evaluation Report · Melanie J. Taylor, Aaron M. Weis, Iris R. Weiss, and Justin Wiley. In addition, the following individuals assisted with data

7. Continued. (Darken one circle on each line.)

    dd. At this time, I am not interested in learning about the SIE modules.
    ee. I would like to determine how to supplement, enhance, or replace the SIE modules.
    ff. I would like to use feedback from students to change the SIE modules.
    gg. I would like to know how my role will change when I am using the SIE modules.
    hh. Coordination of tasks and people is taking too much of my time.
    ii. I would like to know how the SIE modules are better than what we have now.

8. What do you consider to be the main strengths of your school’s science program?

9. In what ways could your school’s science program be improved?

10. What do you hope to gain from your school’s participation in the Science It’s Elementary program?

11. What are you hoping to gain from participating in these three days of Science It’s Elementary professional development?


Page 84: Science: It’s Elementary Year One Evaluation Report · Melanie J. Taylor, Aaron M. Weis, Iris R. Weiss, and Justin Wiley. In addition, the following individuals assisted with data

III. About You

12. Are you:

    Male
    Female

13. Are you: (Darken all that apply.)

    American Indian or Alaskan Native
    Asian
    Black or African-American
    Hispanic or Latino
    Native Hawaiian or other Pacific Islander
    White

14. How many years have you taught prior to this school year? (Darken one circle.)

    0-2 3-5 6-10 11-15 16-20 21-25 26 or more

15. What grade level(s) do you currently teach? (Darken all that apply.)

    K 1 2 3 4 5 6 7 8

Thank you very much for participating in this survey!



Science It's Elementary (SIE) Post-Professional Development Questionnaire

Instructions: Please complete this form using a #2 pencil or a blue or black ink pen. Darken circles completely, but do not stray into adjacent circles. Erase completely or white out any stray marks. Only group results will be reported from this questionnaire, and all responses are completely anonymous, so please be candid.

In an effort to track responses over time while protecting your identity, we ask that you create a unique ID number using the initials of your mother's maiden name (first and last name), your mother's birthday (2-digit month-day), and your 2-digit birth order. (For instance, if your mother's name is Mary Anderson, her birthday is April 10th, and you are the 2nd child in your family, you would bubble in MA 04 10 02.) Please use the boxes below to fill in the requested information, then darken the corresponding circles.

[Response grid: ovals A-Z for the first and last initials of mother's maiden name; ovals 0-9 for mother's birth month and day and for your birth order.]

SIE Post-Professional Development Teacher Questionnaire, Horizon Research, Inc., December 2006



In this questionnaire, "SIE modules" refers to the FOSS and STC modules being provided by the Science It's Elementary (SIE) program.

1. Please indicate the extent to which you agree with each of the following statements about the 3 days of SIE professional development you just attended. (Darken one circle on each line; scale: Strongly Disagree, Disagree, No Opinion, Agree, Strongly Agree.)

    a. The goals of the professional development were clear.
    b. The professional development reflected careful planning and organization.
    c. Adequate time, structure, and guidance were provided for participants to reflect individually on the substance of the professional development.
    d. Adequate time, structure, and guidance were provided for participants to discuss the SIE modules and pedagogical strategies with each other.
    e. The professional development increased my confidence in my ability to teach using the SIE module.

2. The pace of the professional development was: (Darken one circle.)

    Much too slow   Somewhat too slow   Appropriate   Somewhat too fast   Much too fast

3. Please rate your understanding of each of the following a) prior to the 3-day SIE professional development and b) now. (Darken one circle in each column on each line; scale for both columns: No understanding, Minimal understanding, Moderate understanding, Proficient understanding, Expert understanding.)

    a. Student learning goals (big ideas) in the SIE module.
    b. Science content in the SIE module at the level that the students are expected to learn it.
    c. Science content in the SIE module at a deeper level than what students are expected to learn.
    d. Ideas (either correct or incorrect) that students are likely to have about the content in the SIE module before instruction.



4. Please rate your preparedness to do each of the following a) prior to the 3-day SIE professional development and b) now. (Darken one circle in each column on each line; scale for both columns: Not at all prepared, Minimally prepared, Moderately well prepared, Well prepared, Very well prepared.)

    a. Use the inquiry-based teaching strategies embedded in the SIE module.
    b. Use science notebooks to support student learning of the content in the SIE module.
    c. Use the FERA (Focus, Explore, Reflect, Apply) and 5 E (Engagement, Exploration, Explanation, Elaboration, Evaluation) Learning Cycles to teach using the SIE module.
    d. Manage the logistics of the SIE module.
    e. Handle classroom management issues with the SIE module.

5. What aspects of the professional development did you find the most helpful or effective in preparing you to teach using the SIE module?

6. In what ways could the professional development have been more effective for you?



7. Please indicate how true each of the following statements is of you at this time. (Darken one circle on each line; each statement is rated from 0, "Not true of me now," through "Somewhat true of me now," to 7, "Very true of me now," with an additional "Irrelevant" option.)

    a. I am concerned about students' attitudes toward the SIE modules.
    b. I now know of some other approaches that might work better.
    c. I don't even know what the SIE modules are.
    d. I am concerned about not having enough time to organize myself each day.
    e. I would like to help other faculty in their use of the SIE modules.
    f. I have a very limited knowledge about the SIE modules.
    g. I would like to know the effect of Science It's Elementary on my professional status.
    h. I am concerned about conflict between my interests and my responsibilities.
    i. I am concerned about revising my use of the SIE modules.
    j. I would like to develop working relationships with both our faculty and outside faculty using SIE modules.
    k. I am concerned about how the SIE modules affect students.
    l. I am not concerned about the SIE modules.
    m. I would like to know who will make the decisions regarding SIE modules.
    n. I would like to discuss the possibility of using the SIE modules.
    o. I would like to know what resources are available for implementing the SIE modules.
    p. I am concerned about my inability to manage all that the SIE modules require.
    q. I would like to know how my teaching is supposed to change using the SIE modules.
    r. I would like to familiarize other departments or people with the progress related to use of the SIE modules.
    s. I am concerned about evaluating my impact on students.
    t. I would like to revise the instructional approach of the SIE modules.
    u. I am completely occupied with other things.
    v. I would like to modify our use of the SIE modules based on the experiences of our students.
    w. Although I don't know about the SIE modules, I am concerned about issues in science education.
    x. I would like to excite my students about their part in the SIE modules.
    y. I am concerned about time spent working with nonacademic problems related to the SIE modules.
    z. I would like to know what the use of the SIE modules will require in the immediate future.

(Question 7 continued on next page.)



7. Continued. (Darken one circle on each line.)

    aa. I would like to coordinate my efforts with others to maximize the effects of the SIE modules.
    bb. I would like to have more information on time and energy commitments required by the SIE modules.
    cc. I would like to know what other teachers are doing with regard to the SIE modules.
    dd. At this time, I am not interested in learning about the SIE modules.
    ee. I would like to determine how to supplement, enhance, or replace the SIE modules.
    ff. I would like to use feedback from students to change the SIE modules.
    gg. I would like to know how my role will change when I am using the SIE modules.
    hh. Coordination of tasks and people is taking too much of my time.
    ii. I would like to know how the SIE modules are better than what we have now.

Thank you very much for participating in this survey!



Science: It's Elementary End-of-Year Questionnaire

Responses to questions regarding the SIE program will be kept strictly anonymous; no names or identifying information will be shared in any form, so please be candid.

If you have any questions about this survey, please contact Melanie Taylor via email at [email protected] or by calling (toll-free) 877-297-6829.

In an effort to track responses over time while protecting your identity, we ask that you create a unique ID number using the initials of your mother's maiden name (first and last name), your mother's birthday, and your birth order. (For instance, if your mother's name is Mary Anderson, her birthday is April 10th, and you are the 2nd child in your family, you would enter first initial: M; last initial: A; birth month: April; birthday: 10; birth order: 2.)

Mother's First Initial: ______ Mother's Last Initial: ______

Mother's Birth Month: ___________ Mother's Birth Day: ______

Your Birth Order: ______

In this questionnaire, "SIE modules" refers to the FOSS and STC modules being provided by the Science: It's Elementary (SIE) program.

1. Please rate your current preparedness to do each of the following. (Darken one circle on each line; scale: Not adequately prepared, Minimally prepared, Moderately well prepared, Well prepared, Very well prepared.)

    a. Use the inquiry-based teaching strategies embedded in the SIE module.
    b. Use science notebooks to support student learning of the content in the SIE module.
    c. Use the FERA (Focus, Explore, Reflect, Apply) and 5 E (Engagement, Exploration, Explanation, Elaboration, Evaluation) Learning Cycles to teach using the SIE module.
    d. Manage the logistics of the SIE module.
    e. Handle classroom management issues with the SIE module.

Horizon Research, Inc. SIE: End of Year Teacher Questionnaire - 1 May 2007


2. Please rate your current understanding of each of the following. (Darken one circle on each line; scale: No understanding, Minimal understanding, Moderate understanding, Proficient understanding, Expert understanding.)

    a. Student learning goals (big ideas) in the SIE module.
    b. Science content in the SIE module at the level that the students are expected to learn it.
    c. Science content in the SIE module at a deeper level than what students are expected to learn.
    d. Ideas (either correct or incorrect) that students are likely to have about the content in the SIE module before instruction.

3. Thinking back on your experience with the SIE program this year (including both the training and your use of the modules in the classroom), what was the single greatest benefit to you or your science teaching?

4. Thinking back on your experience with the SIE program this year (including both the training and your use of the modules in your classroom), what could the program have done to be more effective for you?



5. Please indicate the extent of your agreement with the following statements. (Darken one circle on each line; scale: Strongly disagree, Disagree, No opinion, Agree, Strongly agree.)

    a. There is not enough instructional time for science to effectively use the SIE module.
    b. My SOS facilitated my use of the module.
    c. Other teachers in my school provided a support system for use of the module.
    d. I did not have the module for long enough to use as much of it as I wanted to.
    e. The pressures to teach mathematics and/or reading inhibited my use of the module.
    f. The training I received from SIE made it easier for me to use the module.
    g. I received assistance from SIE outside of the 3-day training that helped me to use the module.
    h. My school was supportive of my use of the module.
    i. My own science background was helpful when I was teaching from the module.
    j. My lack of experience in science made it more difficult for me to teach from the module.

6. Please indicate how true each of the following statements is of you at this time. (Darken one circle on each line; each statement is rated from "Not true of me now" through "Somewhat true of me now" to "Very true of me now," with an additional "Irrelevant" option.)

    a. I am concerned about students' attitudes toward the SIE modules.
    b. I now know of some other approaches that might work better.
    c. I don't even know what the SIE modules are.
    d. I am concerned about not having enough time to organize myself each day.
    e. I would like to help other faculty in their use of the SIE modules.
    f. I have a very limited knowledge about the SIE modules.
    g. I would like to know the effect of Science It's Elementary on my professional status.
    h. I am concerned about conflict between my interests and my responsibilities.
    i. I am concerned about revising my use of the SIE modules.
    j. I would like to develop working relationships with both our faculty and outside faculty using SIE modules.
    k. I am concerned about how the SIE modules affect students.
    l. I am not concerned about the SIE modules.
    m. I would like to know who will make the decisions regarding SIE modules.

(Question 6 continues on next page.)



6. (continued)

    n. I would like to discuss the possibility of using the SIE modules.
    o. I would like to know what resources are available for implementing the SIE modules.
    p. I am concerned about my inability to manage all that the SIE modules require.
    q. I would like to know how my teaching is supposed to change using the SIE modules.
    r. I would like to familiarize other departments or people with the progress related to use of the SIE modules.
    s. I am concerned about evaluating my impact on students.
    t. I would like to revise the instructional approach of the SIE modules.
    u. I am completely occupied with other things.
    v. I would like to modify our use of the SIE modules based on the experiences of our students.
    w. Although I don't know about the SIE modules, I am concerned about issues in science education.
    x. I would like to excite my students about their part in the SIE modules.
    y. I am concerned about time spent working with nonacademic problems related to the SIE modules.
    z. I would like to know what the use of the SIE modules will require in the immediate future.
    aa. I would like to coordinate my efforts with others to maximize the effects of the SIE modules.
    bb. I would like to have more information on time and energy commitments required by the SIE modules.
    cc. I would like to know what other teachers are doing with regard to the SIE modules.
    dd. At this time, I am not interested in learning about the SIE modules.
    ee. I would like to determine how to supplement, enhance, or replace the SIE modules.
    ff. I would like to use feedback from students to change the SIE modules.
    gg. I would like to know how my role will change when I am using the SIE modules.
    hh. Coordination of tasks and people is taking too much of my time.
    ii. I would like to know how the SIE modules are better than what we have now.


Appendix C

Selected Teacher Curriculum Survey Results

Horizon Research, Inc. October 2007

Page 95: Science: It’s Elementary Year One Evaluation Report · Melanie J. Taylor, Aaron M. Weis, Iris R. Weiss, and Justin Wiley. In addition, the following individuals assisted with data

Figure C-1
Grade K-4 Teachers: 4th Grade Science and Technology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Unifying Themes                                8.2
Inquiry and Design                            10.5
Biological Sciences                            7.2
Physical Science, Chemistry, and Physics       3.9
Earth Sciences                                 7.2
Technology Education                           0.2
Technological Devices                          4.1
Science, Technology, and Human Endeavors       0.7


Figure C-2
Grade K-4 Teachers: 4th Grade Environment and Ecology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Watersheds and Wetlands                        2.4
Renewable and Non-renewable Resources          3.3
Environmental Health                           3.2
Agriculture and Society                        1.7
Integrated Pest Management                     0.2
Ecosystems and Their Interactions              2.5
Threatened, Endangered and Extinct Species     2.4
Humans and the Environment                     2.4
Environmental Laws and Regulations             0.1


Figure C-3
Grade K-4 Teachers: 7th Grade Science and Technology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Unifying Themes                                6.5
Inquiry and Design                             9.1
Biological Sciences                            2.5
Physical Science, Chemistry, and Physics       3.2
Earth Sciences                                 4.5
Technology Education                           0.2
Technological Devices                          2.8
Science, Technology, and Human Endeavors       0.5


Figure C-4
Grade K-4 Teachers: 7th Grade Environment and Ecology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Watersheds and Wetlands                        1.4
Renewable and Non-renewable Resources          2.3
Environmental Health                           1.6
Agriculture and Society                        0.6
Integrated Pest Management                     0.2
Ecosystems and Their Interactions              1.5
Threatened, Endangered and Extinct Species     1.6
Humans and the Environment                     1.6
Environmental Laws and Regulations             0.1


Figure C-5
Grade 5-6 Teachers: 4th Grade Science and Technology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Unifying Themes                                8.8
Inquiry and Design                             8.9
Biological Sciences                            7.3
Physical Science, Chemistry, and Physics       5.8
Earth Sciences                                 3.6
Technology Education                           0.7
Technological Devices                          3.7
Science, Technology, and Human Endeavors       1.1


Figure C-6
Grade 5-6 Teachers: 4th Grade Environment and Ecology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Watersheds and Wetlands                        2.1
Renewable and Non-renewable Resources          2.4
Environmental Health                           2.6
Agriculture and Society                        0.8
Integrated Pest Management                     0.3
Ecosystems and Their Interactions              2.9
Threatened, Endangered and Extinct Species     1.9
Humans and the Environment                     2.0
Environmental Laws and Regulations             0.1


Figure C-7
Grade 5-6 Teachers: 7th Grade Science and Technology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Unifying Themes                                8.6
Inquiry and Design                             9.5
Biological Sciences                            4.6
Physical Science, Chemistry, and Physics       4.6
Earth Sciences                                 2.9
Technology Education                           0.6
Technological Devices                          2.7
Science, Technology, and Human Endeavors       1.0


Figure C-8
Grade 5-6 Teachers: 7th Grade Environment and Ecology Standards
(Bar chart; values are percent of instructional time, paired with categories in their order of appearance.)

Watersheds and Wetlands                        1.7
Renewable and Non-renewable Resources          1.7
Environmental Health                           1.7
Agriculture and Society                        0.5
Integrated Pest Management                     0.2
Ecosystems and Their Interactions              1.7
Threatened, Endangered and Extinct Species     1.8
Humans and the Environment                     1.6
Environmental Laws and Regulations             0.1


Appendix D

Composite Definitions


To facilitate the reporting of large amounts of survey data, and because individual questionnaire items are potentially unreliable, groups of survey questions that measure similar ideas can be combined into “composites.” Each composite represents an important construct related to science teaching or professional development. Cronbach’s Coefficient Alpha is a measure of the reliability of a composite (i.e., the extent to which the items appear to be measuring the same construct). A Cronbach’s Alpha of 0.6 is considered acceptable, 0.7 fair, 0.8 good, and 0.9 excellent.

Each composite is calculated by summing the responses to the items associated with that composite and then dividing by the total points possible. So that the composites are on a 100-point scale, the lowest response option on each scale was set to 0. As a result, someone who marks the lowest point on every item in a composite receives a score of 0, someone who marks the highest point on every item receives a score of 100, and 50 is the true mid-point of the scale. The denominator for each composite is determined by computing the maximum possible sum of responses for a series of items and dividing by 100; e.g., a nine-item composite where each item is on a scale of 0–4 would have a denominator of 0.36.
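The scoring procedure above is straightforward to mechanize. The sketch below is illustrative rather than the evaluators’ actual code (the function names are invented here); it computes a 0–100 composite score using the denominator rule just described, plus Cronbach’s Coefficient Alpha from its standard formula.

```python
from statistics import pvariance

def composite_score(responses, scale_max):
    """0-100 composite: sum of 0-based item responses divided by the
    denominator (maximum possible sum / 100). Written as *100/max_sum
    to avoid floating-point noise from dividing by e.g. 0.36."""
    max_sum = len(responses) * scale_max
    return 100.0 * sum(responses) / max_sum

def cronbach_alpha(item_scores):
    """item_scores: one list of respondents' answers per item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(item_scores)
    totals = [sum(answers) for answers in zip(*item_scores)]
    item_variance = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_variance / pvariance(totals))

# A nine-item composite on a 0-4 scale has denominator 9 * 4 / 100 = 0.36:
print(composite_score([4] * 9, 4))  # all highest option -> 100.0
print(composite_score([2] * 9, 4))  # all scale mid-points -> 50.0
```

As a sanity check on the alpha formula, two perfectly correlated items yield an alpha of exactly 1.0, and uncorrelated items drive it toward 0.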

Table D-1
Composite: Teacher Perceptions of Pedagogical Preparedness

Preparedness to:                                               Post-PD   Post-PD   End-of-
                                                               (Prior)   (Now)     Year
Use the inquiry-based teaching strategies embedded in the
  SIE module                                                   Q4a-p     Q4a-n     Q1a
Use science notebooks to support student learning of the
  content in the SIE module                                    Q4b-p     Q4b-n     Q1b
Use the FERA (Focus, Explore, Reflect, Apply) and 5 E
  (Engagement, Exploration, Explanation, Elaboration,
  Evaluation) Learning Cycles to teach using the SIE module    Q4c-p     Q4c-n     Q1c
Manage the logistics of the SIE module                         Q4d-p     Q4d-n     Q1d
Handle classroom management issues with the SIE module         Q4e-p     Q4e-n     Q1e
Number of Items in Composite                                   5         5         5
Reliability (Cronbach’s Coefficient Alpha)                     0.87      0.90      0.89

Table D-2
Composite: Teacher Perceptions of Pedagogical Content Knowledge

Understanding of:                                              Post-PD   Post-PD   End-of-
                                                               (Prior)   (Now)     Year
Student learning goals (big ideas) in the SIE module           Q3a-p     Q3a-n     Q2a
Science content in the SIE module at the level that the
  students are expected to learn it                            Q3b-p     Q3b-n     Q2b
Science content in the SIE module at a deeper level than
  what students are expected to learn                          Q3c-p     Q3c-n     Q2c
Ideas (either correct or incorrect) that students are likely
  to have about the content in the SIE module before
  instruction                                                  Q3d-p     Q3d-n     Q2d
Number of Items in Composite                                   4         4         4
Reliability (Cronbach’s Coefficient Alpha)                     0.91      0.86      0.82


Appendix E

Regression Tables from Student Achievement Analyses


Table E-1
HLM Regression Coefficients (and standard errors) for Rocks and Minerals Models

                                                  Initial          Main             Achievement      Extent of
                                                  Differences†     Effects          Gaps             Module Use‡
Intercept                                         35.72 (0.48)     39.14 (1.04)     39.17 (1.16)     56.15 (1.16)
Used Rocks and Minerals Module                    -0.35 (0.91)     17.32* (1.40)    17.36* (1.40)    --
Percent of Lessons Taught                         --               --               --               0.07 (0.04)
Student Pre-test Score                            --               0.52* (0.02)     0.53* (0.02)     0.54* (0.04)
Female Student                                    -0.15 (0.41)     0.08 (0.47)      0.08 (0.47)      -0.32 (0.75)
Treatment Class x Female§                         --               --               --               --
Percent of Lessons Taught x Female§               --               --               --               --
Non-Asian Minority Student                        -2.39* (0.71)    -3.43* (0.87)    -0.40 (2.30)     -4.13* (1.28)
Treatment Class x Non-Asian Minority§             --               --               --               --
Percent of Lessons Taught x Non-Asian Minority§   --               --               --               --
Pre-test x Female Interaction~                    --               --               --               --
Pre-test x Non-Asian Minority Interaction~        --               --               -0.09 (0.06)     --

* p < 0.05. Note: All variables were grand-mean centered except “Used Rocks and Minerals Module,” which was uncentered.
† Outcome for this model was pre-test score.
‡ Includes only classes using the Rocks and Minerals module.
§ Variable was entered only if there was significant variation among classes in the size of the achievement gap.
~ Variable was entered only if an achievement gap was detected on pre-test scores; a statistically significant coefficient (flagged with a “*”) indicates a change in the size of the achievement gap from pre-test to post-test. Note that the female and/or non-Asian minority variable should not be interpreted if the corresponding interaction term is included in the model.
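The note beneath each table states that predictors were grand-mean centered, which makes the intercept interpretable as the expected score for a student at average levels of the predictors rather than for one scoring zero on all of them. A minimal sketch of that transformation (the function name is illustrative, not from the report):

```python
def grand_mean_center(values):
    """Subtract the overall (grand) mean from each value so the centered
    predictor has mean zero; the model intercept then corresponds to the
    expected outcome at average predictor levels."""
    grand_mean = sum(values) / len(values)
    return [v - grand_mean for v in values]

# Hypothetical pre-test scores; the grand mean here is 40.0:
pretest_scores = [35.0, 40.0, 45.0]
print(grand_mean_center(pretest_scores))  # [-5.0, 0.0, 5.0]
```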


Table E-2
HLM Regression Coefficients (and standard errors) for Electric Circuits Models

                                                  Initial          Main             Achievement      Extent of
                                                  Differences†     Effects          Gaps             Module Use‡
Intercept                                         46.63 (1.02)     47.30 (0.99)     46.24 (1.22)     58.79 (0.76)
Used Electric Circuits Module                     -3.28* (1.40)    11.37* (1.37)    11.28* (1.36)    --
Percent of Lessons Taught                         --               --               --               0.13* (0.04)
Student Pre-test Score                            --               0.56* (0.02)     0.61* (0.03)     0.51* (0.02)
Female Student                                    -2.56* (0.37)    -0.75* (0.33)    2.68* (1.31)     -0.90 (0.47)
Treatment Class x Female§                         --               --               --               --
Percent of Lessons Taught x Female§               --               --               --               --
Non-Asian Minority Student                        -5.01* (1.42)    -3.81* (0.63)    -1.15 (2.06)     -4.31* (0.79)
Treatment Class x Non-Asian Minority§             1.03 (1.83)      --               --               --
Percent of Lessons Taught x Non-Asian Minority§   --               --               --               0.00 (0.04)
Pre-test x Female Interaction~                    --               --               -0.08* (0.03)    --
Pre-test x Non-Asian Minority Interaction~        --               --               -0.07 (0.05)     --

* p < 0.05. Note: All variables were grand-mean centered except “Used Electric Circuits Module,” which was uncentered.
† Outcome for this model was pre-test score.
‡ Includes only classes using the Electric Circuits module.
§ Variable was entered only if there was significant variation among classes in the size of the achievement gap.
~ Variable was entered only if an achievement gap was detected on pre-test scores; a statistically significant coefficient (flagged with a “*”) indicates a change in the size of the achievement gap from pre-test to post-test. Note that the female and/or non-Asian minority variable should not be interpreted if the corresponding interaction term is included in the model.


Table E-3
HLM Regression Coefficients (and standard errors) for Levers and Pulleys Models

                                                  Initial          Main             Achievement      Extent of
                                                  Differences†     Effects          Gaps             Module Use‡
Intercept                                         39.73 (0.51)     41.00 (0.74)     40.73 (0.97)     52.01 (0.74)
Used Levers and Pulleys Module                    1.38 (0.98)      10.86* (1.02)    10.97* (1.04)    --
Percent of Lessons Taught                         --               --               --               0.08* (0.02)
Student Pre-test Score                            --               0.44* (0.02)     0.49* (0.02)     0.44* (0.03)
Female Student                                    -2.53* (0.28)    -2.09* (0.30)    0.44 (1.30)      -2.14* (0.50)
Treatment Class x Female§                         --               --               --               --
Percent of Lessons Taught x Female§               --               --               --               -0.02 (0.02)
Non-Asian Minority Student                        -3.53* (0.50)    -2.78* (0.53)    4.19* (1.85)     -3.82* (0.89)
Treatment Class x Non-Asian Minority§             --               --               --               --
Percent of Lessons Taught x Non-Asian Minority§   --               --               --               --
Pre-test x Female Interaction~                    --               --               -0.06* (0.03)    --
Pre-test x Non-Asian Minority Interaction~        --               --               -0.19* (0.05)    --

* p < 0.05. Note: All variables were grand-mean centered except “Used Levers and Pulleys Module,” which was uncentered.
† Outcome for this model was pre-test score.
‡ Includes only classes using the Levers and Pulleys module.
§ Variable was entered only if there was significant variation among classes in the size of the achievement gap.
~ Variable was entered only if an achievement gap was detected on pre-test scores; a statistically significant coefficient (flagged with a “*”) indicates a change in the size of the achievement gap from pre-test to post-test. Note that the female and/or non-Asian minority variable should not be interpreted if the corresponding interaction term is included in the model.


Table E-4
HLM Regression Coefficients (and standard errors) for Motion and Design Models

                                                  Initial          Main             Achievement      Extent of
                                                  Differences†     Effects          Gaps             Module Use‡
Intercept                                         39.24 (0.97)     43.58 (0.85)     43.57 (0.85)     45.52 (0.90)
Used Motion and Design Module                     0.12 (1.32)      2.00 (1.14)      1.99 (1.14)      --
Percent of Lessons Taught                         --               --               --               0.06* (0.03)
Student Pre-test Score                            --               0.57* (0.03)     0.59* (0.03)     0.58* (0.03)
Female Student                                    -2.77* (0.44)    -1.62* (0.49)    -0.59 (1.31)     -1.31* (0.57)
Treatment Class x Female§                         --               --               --               --
Percent of Lessons Taught x Female§               --               --               --               --
Non-Asian Minority Student                        -5.58* (0.76)    -2.93* (1.13)    -1.74 (2.13)     -4.37* (0.95)
Treatment Class x Non-Asian Minority§             --               --               -1.98 (1.48)     --
Percent of Lessons Taught x Non-Asian Minority§   --               --               --               --
Pre-test x Female Interaction~                    --               --               -0.03 (0.03)     --
Pre-test x Non-Asian Minority Interaction~        --               --               -0.03 (0.05)     --

* p < 0.05. Note: All variables were grand-mean centered except “Used Motion and Design Module,” which was uncentered.
† Outcome for this model was pre-test score.
‡ Includes only classes using the Motion and Design module.
§ Variable was entered only if there was significant variation among classes in the size of the achievement gap.
~ Variable was entered only if an achievement gap was detected on pre-test scores; a statistically significant coefficient (flagged with a “*”) indicates a change in the size of the achievement gap from pre-test to post-test. Note that the female and/or non-Asian minority variable should not be interpreted if the corresponding interaction term is included in the model.


Table E-5
HLM Regression Coefficients (and standard errors) for Mixtures and Solutions Models

                                                  Initial          Main             Achievement      Extent of
                                                  Differences†     Effects          Gaps             Module Use‡
Intercept                                         48.12 (0.68)     50.53 (1.01)     50.50 (1.02)     59.30 (0.99)
Used Mixtures and Solutions Module                -1.42 (1.18)     8.44* (1.18)     8.43* (1.18)     --
Percent of Lessons Taught                         --               --               --               0.10* (0.04)
Student Pre-test Score                            --               0.60* (0.02)     0.61* (0.02)     0.59* (0.04)
Female Student                                    -0.36 (0.46)     -0.09 (0.45)     -0.09 (0.45)     0.18 (0.72)
Treatment Class x Female§                         --               --               --               --
Percent of Lessons Taught x Female§               --               --               --               --
Non-Asian Minority Student                        -2.13* (0.88)    -2.98* (0.87)    1.32 (3.23)      -4.55* (1.49)
Treatment Class x Non-Asian Minority§             --               --               --               --
Percent of Lessons Taught x Non-Asian Minority§   --               --               --               --
Pre-test x Female Interaction~                    --               --               --               --
Pre-test x Non-Asian Minority Interaction~        --               --               -0.09 (0.07)     --

* p < 0.05. Note: All variables were grand-mean centered except “Used Mixtures and Solutions Module,” which was uncentered.
† Outcome for this model was pre-test score.
‡ Includes only classes using the Mixtures and Solutions module.
§ Variable was entered only if there was significant variation among classes in the size of the achievement gap.
~ Variable was entered only if an achievement gap was detected on pre-test scores; a statistically significant coefficient (flagged with a “*”) indicates a change in the size of the achievement gap from pre-test to post-test. Note that the female and/or non-Asian minority variable should not be interpreted if the corresponding interaction term is included in the model.


Table E-6
HLM Regression Coefficients (and standard errors) for Variables Models

                                                  Initial          Main             Achievement      Extent of
                                                  Differences†     Effects          Gaps             Module Use‡
Intercept                                         47.02 (1.35)     52.14 (1.07)     52.09 (1.06)     53.39 (1.54)
Used Variables Module                             2.67 (1.83)      -0.91 (1.38)     -0.78 (1.37)     --
Percent of Lessons Taught                         --               --               --               0.11 (0.06)
Student Pre-test Score                            --               0.68* (0.02)     0.69* (0.02)     0.64* (0.03)
Female Student                                    1.25 (0.68)      0.74 (0.56)      0.72 (0.56)      1.92* (0.81)
Treatment Class x Female§                         --               --               --               --
Percent of Lessons Taught x Female§               --               --               --               --
Non-Asian Minority Student                        -5.71* (2.18)    -1.74 (1.26)     3.91 (2.80)      -3.01 (1.58)
Treatment Class x Non-Asian Minority§             --               --               --               --
Percent of Lessons Taught x Non-Asian Minority§   --               --               --               --
Pre-test x Female Interaction~                    --               --               --               --
Pre-test x Non-Asian Minority Interaction~        --               --               -0.14* (0.06)    --

* p < 0.05. Note: All variables were grand-mean centered except “Used Variables Module,” which was uncentered.
† Outcome for this model was pre-test score.
‡ Includes only classes using the Variables module.
§ Variable was entered only if there was significant variation among classes in the size of the achievement gap.
~ Variable was entered only if an achievement gap was detected on pre-test scores; a statistically significant coefficient (flagged with a “*”) indicates a change in the size of the achievement gap from pre-test to post-test. Note that the female and/or non-Asian minority variable should not be interpreted if the corresponding interaction term is included in the model.


Table E-7
HLM Regression Coefficients (and standard errors) for Cross-Module Models

                                                  Professional     Attended All     Person-Days
                                                  Development      SIE Events       Committed to
                                                                                    SIE Events
Intercept                                         -0.02 (0.03)     -0.04 (0.04)     -0.04 (0.04)
Attended All Three Days of PD (vs. Two Days)      0.20* (0.08)     --               --
Attended All Three SIE Events                     --               0.02 (0.08)      --
Person-Days Committed to SIE Events               --               --               -0.00 (0.00)
Student Pre-test Score                            0.44* (0.01)     0.44* (0.01)     0.43* (0.01)
Attended All Three Days of PD x Pre-Test          0.09* (0.04)     --               --
Attended All Three SIE Events x Pre-Test          --               0.04 (0.03)      --
Person-Days Committed to SIE Events x Pre-Test    --               --               0.01* (0.00)
Female Student                                    -0.04* (0.02)    -0.05* (0.02)    -0.05* (0.02)
Attended All Three Days of PD x Female            0.07 (0.07)      --               --
Non-Asian Minority Student                        -0.30* (0.03)    -0.29* (0.03)    -0.30* (0.03)

* p < 0.05.
