Summary of Findings: Reliability
Student-Level Reliability (α)
                                   Grades 3-5   Grades 6-12
Overall Reliability (all items)       0.94         0.96
Student Learning                      0.90         0.94
Student-Centered Environment          0.86         0.90
Classroom Community                   0.80         0.86
Classroom Management                  0.75         0.80

Teacher-Level Reliability (α)
                                   Grades 3-5   Grades 6-12
Overall Reliability (all items)       0.97         0.98
Student Learning                      0.95         0.97
Student-Centered Environment          0.94         0.96
Classroom Community                   0.90         0.94
Classroom Management                  0.90         0.91

Reliability analyses consider the internal structure and consistency of the items in the Student Perception Survey and its four component elements. The numbers above are Cronbach's Alpha (α), a measure of internal consistency designed to estimate the extent to which the items in an instrument measure a similar construct. Generally, researchers recommend α > 0.9 for high-stakes assessments; for other purposes, α > 0.7 is considered defensible.
As you can see, all of the survey elements meet at least a 0.7, with the overall reliability for both instruments meeting the high bar of 0.9.
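Cronbach's alpha can be computed directly from a respondents-by-items score matrix. The sketch below is illustrative only (the `cronbach_alpha` helper and NumPy usage are assumptions, not the analysis code used in the pilot):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    Illustrative helper, not the SPS pilot's actual analysis code."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly consistent items yield alpha = 1.0; for example,
# cronbach_alpha([[1, 1], [2, 2], [3, 3], [4, 4]]) -> 1.0
```

Note that alpha rises both with the average inter-item correlation and with the number of items, which is one reason the overall (all-items) reliabilities exceed those of the individual four-item elements.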
Relationship between SPS Overall Percent Favorable and Overall Evaluation Rating
This slide demonstrates the relationship between the overall SPS percent favorable score and the overall evaluation rating of teachers. As you can see, teachers with higher SPS scores also receive higher evaluation ratings.
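A teacher's overall percent favorable is the share of student responses landing in the survey's favorable categories. As a rough sketch (the SPS's actual response scale and scoring rules are not given here; the four-point agreement scale and the `percent_favorable` helper are assumptions for illustration):

```python
# Assumption: a four-point agreement scale where the top two
# categories count as "favorable"; the real SPS scoring may differ.
FAVORABLE = frozenset({"agree", "strongly agree"})

def percent_favorable(responses, favorable=FAVORABLE):
    """Percent of responses that fall in the favorable categories."""
    hits = sum(1 for r in responses if r in favorable)
    return 100.0 * hits / len(responses)

# e.g. three favorable responses out of four -> 75.0
```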
What about the survey elements?
- Classroom Community
- Classroom Management
- Student-Centered Environment
- Student Learning

This also holds true for the four survey elements.

Summary of Findings: Validity
The pilot version of the survey also included an open-ended question that asked: "Do you have any other thoughts or feedback for your teacher?" To evaluate the open question about the appropriateness of asking students to assess their teachers' instructional behaviors, we conducted in-depth qualitative analyses of the open-ended responses. In particular, we were interested in whether students took the survey seriously and in what ways responses were substantive in nature.

We analyzed 14,539 open-ended responses (grades 3-5 and 6-12) and found that not only did the majority of students take the survey seriously, but many of the responses were specific and actionable in nature:
- 98.6% (N=14,341) were considered substantive
- 66.3% (N=9,646) were coded as actionable

Moreover, although some subjects and grades were slightly more likely to garner actionable feedback, in general actionable responses came from students in all grades and subjects.
Amongst the almost 30,000 student responses collected via the Student Perception Survey in the spring pilot, there were 14,539 responses to the question "Do you have any other thoughts or feedback for your teacher?" These responses were then coded as substantive (meaning that the student feedback was on-topic and provided at least a general statement about the teacher and/or classroom environment) or off-topic (meaning that the student provided written feedback that did not address the question). The substantive responses were then coded as actionable or not, with an actionable response considered to be feedback specific enough for teachers to take action to alter or maintain their current classroom practices. For instance, the response "My teacher is great!" would be deemed substantive but not actionable, whereas "My teacher is great because he provides us with multiple ways of understanding the material" would be considered actionable. Not included in this number are the approximately 1,500 students who wrote "No" or the equivalent thereof to the open-ended question; these responses were placed in the non-response group.
What Students are Saying About Teachers
Overview of Student Feedback for Teachers
This image represents the words and phrases used most often by students in response to that question when describing teachers in the top 5% (for overall survey results). For more information about student responses to the open-ended question, including some of the key themes that emerged from our analysis, see our website for an overview of student feedback for teachers.

Guiding principles for administration
We know that teachers care about their practice and especially about their students. They also may experience nervousness and anxiety about the surveys, and district and school leaders must address those fears. By itself, a reliable and valid instrument does not ensure that teachers will receive good feedback. Messaging matters!
- Engage stakeholders early and often
- Make the process as transparent as possible
- Give stakeholders real decision-making power
Throughout the pilot we learned several lessons about survey administration.

Engaging Stakeholders
- Forming a planning committee
- Building educator investment
- Building student understanding and comfort
- Informing other stakeholders
Building trust and investment in the survey process is a key first step in planning. Districts should assign a survey coordinator who can manage the survey process across all schools. The survey coordinator should work with the superintendent or other senior leadership to form a planning committee that includes representatives from key stakeholder groups: district and building administrators, teachers (including representatives from the teachers' association), and district data staff, such as members of the IT and/or assessment/data office. Having a variety of people on the planning committee will build transparency and create strong communication. The planning committee will make several key decisions about the administration process.
Key Decisions
- Use a vendor or manage administration internally?
- Online or paper/pencil survey administration?
- When will the survey be administered?
- How will students and teachers be sampled?
- Include an open-ended question on the survey?
- How will survey results be used?

There are many key decisions to think through when considering survey administration.

Key Decision: How will survey results be used?
- Consider a hold-harmless pilot year
- As a formative tool
  - Reflect on individual practice
  - Identify over-arching trends and create strategies to address them
- As a part of an evaluation
  - Used as an artifact for determining ratings for professional practices
  - Included as one of several multiple measures
- Sharing teacher-level results
It is critical to the success of your survey administration that there is thoughtful consideration of how results will be used, and that teachers and school leaders are engaged in district decision-making. It is also important to communicate these decisions to all teachers early and often, so that they understand the ways the data will (and will not) be used to inform practice, make decisions, and evaluate educators.

If this is the first year that your district is using a Student Perception Survey, you can make these decisions after you have piloted the survey in your district. Many districts use the first year of survey administration to introduce educators to the concept of student surveys, build buy-in, and work collectively to decide how results should be used moving forward. If you choose not to formally use survey results during your first year, make sure that all staff are aware of that decision and of the process and timeline for deciding how results will be used in the future.
If your district is ready to make a decision about how to use results, there are many ways that results can be used to inform teacher, school and district goals.
We urge you to work with your teachers and association representatives to determine how teacher-level data will be shared. Some districts may choose to share teacher-level data with school and district administrators, while others may choose to have teachers see their individual results but only provide aggregate data to school and district administrators.
This decision should also align with how you choose to use results. For example, if you want principals to pair teachers with complementary strengths and areas for growth, they will need to see teacher-level results.
Contact Information
Colorado's Student Perception Survey Website: www.colegacy.org/studentsurvey
Amy Farley, Director, Research and Impact, 720-502-4723, firstname.lastname@example.org
Elaine Allensworth, Consortium on Chicago School Research, elainea@uchicago.edu
Amy Farley, Colorado Legacy Foundation, afarley@colegacy.org
Kendra Wilhelm, Denver Public Schools, Kendra_wilhelm@dpsk12.org