
MTSS in Early Education: Developing Research-Based

Solutions to Persistent Challenges

Judy Carta, University of Kansas

Alana Schnitz, University of Kansas

Barbara Wasik, Temple University

Lillian Durán, University of Oregon

What is MTSS?

A whole-school data-driven framework for improving learning outcomes for ALL students delivered through a continuum of evidence-based practices and systems.

What is MTSS?

• Goal: to identify children who may be struggling to learn and intervene early so they can catch up to their peers.

• It can be designed to identify children who are struggling in academic or behavioral areas.

How is MTSS different from typical practice in early education?

• MTSS identifies who needs additional support and provides a continuum of evidence-based practices within general education settings.

• The focus of MTSS is prevention—providing additional support as soon as it’s needed for success.

• The aim of MTSS is to use proven instructional strategies.

Why is MTSS needed?

► Children enter kindergarten with vastly different backgrounds in their readiness for school.

► Programs need to identify children who need additional instructional support and provide it in a timely and efficient way.

► We have evidence-based practices (assessments and interventions) that can prevent later academic and behavior problems.

Seven Core Components / Defining Features

Layered Continuum of Supports

Evidence-Based Practices

Emphasis on Fidelity

Universal Screening and Progress Monitoring

Data-Based Decision Making

Family, School, and Community Partnerships

Shared Leadership

Layered Continuum of Supports

• Highly individualized outcomes and teaching/caregiving strategies

• Targeted outcomes and teaching/caregiving strategies

• Core or universal outcomes and teaching/caregiving strategies

Moving up the continuum: increased individualization, intensity, and frequency of instruction. Moving down: decreased individualization, intensity, and frequency of instruction.

STRONG CORE / TIER 1!

• Multiple tiers

• Tiers are additive

• Seamless boundaries

• Movement through tiers

• Tiers for different skills and/or domains

Layered Continuum of Supports


Evidence-Based Practices: What are they and why are they the best starting point?

• "Proven techniques" based on multiple studies.

• Few practices in EC meet the strict definition of EBP.

• Even practices with strong evidence may not work for all children.

• Practitioners can provide their own evidence through progress monitoring data: practice-based evidence.


Fidelity of Interventions

• When interventions are implemented with low or inconsistent fidelity, they are less likely to work, and children won't show change.

• Without measurement of fidelity, you don't know if the child needs a different intervention, a more intensive intervention, or the same intervention with better implementation.

LESSON: Make sure an intervention is being implemented correctly before recommending changes to it.

Universal Screening, Progress Monitoring, and Data-Based Decision Making

• Universal Screening: Are students achieving targeted benchmarks? Is the core curriculum effective? If not, what changes should be made? (A minimal sketch of this kind of benchmark-based flagging follows below.)

• Which students/families need more support and/or services? What changes should be made to ensure success?

• Progress Monitoring: how, when, and by whom?
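As a rough illustration of the screening-to-decision step, the sketch below flags children scoring below a benchmark and checks whether the core curriculum itself may need attention. The benchmark value, the 20% decision rule, and the child scores are hypothetical, not values from the presentation.

```python
# Hypothetical illustration of benchmark-based flagging for data-based decision making.
# The benchmark, the 20% decision rule, and the scores are invented for illustration.

FALL_BENCHMARK = 12  # hypothetical screening benchmark

screening_scores = {
    "Child A": 18,
    "Child B": 9,
    "Child C": 11,
    "Child D": 15,
}

# Children scoring below the benchmark are flagged for additional (tiered) support.
needs_support = [child for child, score in screening_scores.items()
                 if score < FALL_BENCHMARK]

# If a large share of the class is below benchmark, the issue may be the core
# (Tier 1) curriculum rather than individual children.
share_below = len(needs_support) / len(screening_scores)

print("Flagged for additional support:", needs_support)
print(f"Share of class below benchmark: {share_below:.0%}")
if share_below > 0.20:  # hypothetical cutoff for reviewing core instruction
    print("Consider reviewing core (Tier 1) instruction.")
```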

3 Big Challenges for MTSS in Early Childhood

1.How can we enhance the quality of Tier 1 in ways that are feasible, sustainable?

2. How can we get teachers to implement evidence-based practices with fidelity?

3.How can we accurately and reliably identify children who need additional levels of instructional support when they speak languages other than English?

LITERACY 3D

Alana Schnitz

Juniper Gardens Children’s Project

University of Kansas

https://earlyliteracy.ku.edu/

LITERACY-3D STAFF

• Charles Greenwood
• Dwight Irvin
• Judy Carta
• Numerous Juniper Gardens Children's Project staff
• Consultants: Mary Abbott and Lillian Durán

EARLY LITERACY/READING

• Reading is a keystone skill for future success

• 1 out of 3 children are struggling readers (Save the Children, 2018)

• Some estimates report that 85% of children who live in poverty are poor readers

• Children enter preschool with a wide range of language and emergent literacy skills (Shanahan & Lonigan, 2008)

CHALLENGES THAT PROGRAMS AND TEACHERS FACE

• Children in a classroom present with a wide range of literacy skills, which is a challenge for even well-trained teachers to meet

• Programs and teachers are often not prepared to formally teach early literacy

Teacher Literacy Focus

Premise: When teacher literacy focus goes up, so does student academic engagement.

Our goal: Decrease the amount of time when teachers do not have a specific literacy focus.

(Figure: breakdown of student behavior, including "No Engagement")

THE REALITY OF PRESCHOOL CURRICULA IS…
• "limited evidence for the efficacy of most preschool curricula for promoting early literacy" (Lonigan & Cunningham, 2013)
• 5 of 60 commercially available curricula met WWC standards
• "limited evidence that most preschool programs are implementing Tier 1 curricula that are efficacious" (Lonigan & Phillips, 2016)

USE YOUR BEST POKER STRATEGIES

LITERACY 3D CAN HELP!

• Literacy 3D: Data-Based Decision Making
• Builds upon and strengthens universal instruction (Tier 1)
• Increases intentional teaching of early language and literacy skills
• Embedded with teaching of social-emotional skills and other outcomes

HOW DO WE IMPROVE THE LANGUAGE AND EARLY LITERACY OUTCOMES FOR ALL CHILDREN?

LITERACY 3D FOCUSES ON HOW TO INCREASE…

� Teacher literacy focus

� Child literacy engagement

• Child outcomes in:
◦ Alphabet & print knowledge
◦ Vocabulary
◦ Phonological awareness
◦ Comprehension

THEORY OF CHANGE

5 professional development sessions + adapted practice-based coaching → evidence-based literacy strategies and more opportunities to respond → children are more engaged and have greater pre-literacy skills

LITERACY 3D EVALUATION

• Randomized controlled wait-list trial

• Teachers and children

• Conditions: Literacy 3D, Business as Usual (BAU), Maintenance

• Based on pilot intervention research (Greenwood, Abbott et al., 2018)

USE ADAPTED PRACTICE-BASED COACHING APPROACH

• Strengths & needs assessments of children and teachers
• Professional development sessions
• Goal setting and action planning
• Focused observation
• Reflection and feedback

STRENGTHS AND NEEDS ASSESSMENT

• We use the data to tell us 3 things:
◦ What skills to teach
◦ When to teach them
◦ Is the plan working?

• 2 kinds of data:
◦ Child data
◦ Classroom data

OTHER CHILD MEASURES (PRE/POST)

Child data: TOPEL, GRTR, Pre-IPT

TEACHER LITERACY FOCUS

• We use CIRCLE to look at teacher literacy focus (where it happens and how much takes place):
◦ Phonological awareness
◦ Alphabet/print concepts
◦ Comprehension – story
◦ Etc.

CHILD ACADEMIC ENGAGEMENT

• We use CIRCLE to look at students' engagement (where it happens [e.g., centers, meals & snacks] and how much takes place):
◦ Academic engagement: reading, writing, and verbal literacy (primary focus); math, science, social studies; attending to academic activities
◦ Other engagement: pretend play and play with non-literacy manipulatives; music and motor activities; etc.
◦ Other behaviors: aggression, noncompliance, etc.; any other child behaviors not otherwise defined in CIRCLE
(A simplified tally of this kind of observation data is sketched below.)
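Conceptually, observation systems like this yield a stream of coded intervals that can be summarized as the share of observed time in each category. The sketch below is a simplified, hypothetical tally; the codes, category groupings, and data are invented and do not represent CIRCLE's actual coding scheme or data format.

```python
# Hypothetical tally of coded observation intervals into engagement categories.
# Codes, category groupings, and the sample record are invented for illustration;
# this is not the actual CIRCLE coding scheme or data format.
from collections import Counter

CATEGORY_OF_CODE = {
    "literacy": "academic engagement",
    "math_science_ss": "academic engagement",
    "attending": "academic engagement",
    "pretend_play": "other engagement",
    "music_motor": "other engagement",
    "aggression": "other behaviors",
    "no_engagement": "no engagement",
}

# One child's coded intervals across a morning observation (invented data).
observed_intervals = ["literacy", "literacy", "attending", "pretend_play",
                      "no_engagement", "literacy", "music_motor", "attending"]

counts = Counter(CATEGORY_OF_CODE[code] for code in observed_intervals)
total = len(observed_intervals)
for category, n in counts.most_common():
    print(f"{category}: {n / total:.0%} of observed intervals")
```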

PROFESSIONAL DEVELOPMENT

• Five professional development sessions

• Topics based on data from teachers and children

• Support teachers to interpret data

• Train on the Top 10 Strategies

• Teachers reflect on their use of literacy strategies and data

TOP 10 STRATEGIES

• Focus on increasing teacher-student interactions that increase child language- and literacy-engaged behaviors

• Strategies range from simple to complex and are embedded in daily activities

Top 10 Strategies

I do it, We do it, You do it

Name Games (Child choice/ sign-in/ vote)

Peer talk

Choral reading

Word Bank Games

Transition Password Game

Learning Quests

Interactive Writing

IDEAS

ASK (Ask, Stretch, Kid Repeat)

PIC - Pocket Intervention Card (always in conjunction with another strategy)

GOAL SETTING AND ACTION PLANNING

• Teachers pick a goal at the end of each PD
◦ Based on needs assessments
◦ Choose a strategy to implement
◦ Teachers use formative assessment to determine child skill

TEACHERS IMPLEMENT STRATEGIES

FOCUSED OBSERVATION

• Coaches observe the targeted strategy implementation
• Coaches provide in-class support:
◦ Prompts
◦ Modeling
◦ Visual cues
• Help monitor child progress

COACHING

• Feedback every other week

• Face-to-face, phone, or email

• About 12 sessions over the school year (range: 9-12)

REFLECTION AND FEEDBACK

• Both teacher and coach reflect on the focused observation

• Coach gives supportive and constructive feedback

• Coach provides resources

Danielle,

Thanks for letting me come out this week and watch your literacy strategies and the great things you do with your children. I can tell that you have been working super hard at transforming your classroom for the ECERS and adding some new things into the environment. I love the sign-in sheets for the kids to work on writing their names each day. Great idea! Below, I will just list out suggestions for each strategy to make it stronger for next time.

I/W/Y TUC 1

-I normally observe this one during your small groups, and you have done great things with it this year and have individualized for your groups according to skill level.

-I think it is fine to do this one as they are going outside, but you will need to tweak a couple of things if this is where you plan on keeping it.

1) Make sure when you give the word to the children, the entire group can see and hear you model the I do it part as you are using your arm tapping to show this. If you are going to do this on the way outside, I would stop the kids in the hall and explain what is about to happen and do the modeling at that time.

2) Continue to go through the entire I/W/Y process for extra practice, and what you are really modeling with this, or need to focus on, is the First part. E.g., "My word is Can-dy. And the first part is Can!" "Let's do that together."

3) As the children model the “you do it” part on the way out the door, (“Can you show me the first part of Can-dy?”) that will be fine for each child to do, as long as you provide some extension in there for a few children. I would work on first sound as extensions, this late in the year. “Can is first part of Can-dy. What does the Cccc-an start with?” You can also extend by having them come up with additional words that start with C, just as you did in your large group.

DEMOGRAPHICS

• 73 classrooms
• 6 programs (2 Head Starts, 2 school district Pre-K, 1 bilingual child care center, and 1 Educare)

Classroom characteristics                                            Control   Lit 3D
Avg. # of children in classroom                                          16       17
Avg. # of children w/ IEP                                               2.2      2.1
Avg. # of teachers & assistants                                         2.2      2.3
Classrooms w/ children from homes where English is not primary (%)       89       90
Currently using a curriculum (%)                                         92       89
Use of strategies to identify children who need support (% yes)          71       75
Teachers with a graduate degree (%)                                      37       24
Avg. # of years working in EC programs                                 10.8      9.9

CHILD DEMOGRAPHICS                                                   Control   Lit 3D
n                                                                        216      222
Black (%)                                                                 36       39
Asian (%)                                                                  4       10
White (%)                                                                 30       18
Latinx (%)                                                                33       31
Native American (%)                                                        1        2
Other (%)                                                                  2        5
Gender (% male)                                                           55       50
Child has an IEP (%)                                                      13       13
Have books at home (%)                                                    84       89
In past week, told child a story 3x a week or more (% agreement)          38       32
In past week, taught letters, words, or numbers 3x or more (%)            45       40
In past week, taught songs or music or sang w/ child 3x or more (%)       60       55
Marital status: single (%)                                                49       43
Marital status: engaged or with partner (%)                               9        11
Marital status: married (%)                                               42       46

• RQ 1. Does Literacy 3D improve instruction, as measured by teachers' instructional quality, teacher literacy-focused interactions, and child academic engagement?

QUALITY OF LITERACY IMPLEMENTATION

(Figure: quality of literacy implementation, BAU teachers vs. Lit 3D teachers; Cohen's d = 1.36)

RQ 2. DOES LITERACY 3D INCREASE CHILDREN'S EARLY LITERACY SKILLS (PELI) AND SUMMATIVE OUTCOMES (TOPEL)?

(Figures: child outcomes, BAU vs. Lit 3D; Cohen's d = 0.32 and Cohen's d = 0.71)

LITERACY 3D IS EFFECTIVE FOR STRENGTHENING TIER 1
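For reference, these effect sizes are standardized mean differences. The pooled-standard-deviation form of Cohen's d below is the conventional formula, stated here as an assumption since the slides report only the resulting values:

```latex
d = \frac{\bar{X}_{\text{Lit 3D}} - \bar{X}_{\text{BAU}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```

By the usual rough benchmarks (0.2 small, 0.5 medium, 0.8 large), d = 0.32 and d = 0.71 are small-to-moderate and moderate-to-large effects, and d = 1.36 is a very large one.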

NEXT STEPS

• Continue with current cohort

• Consent 50 new teachers

• Further analysis:
◦ Child characteristics
◦ Teacher fidelity of implementation
◦ Teacher strategy and skill targets

This is a program of the Juniper Gardens Children's Project at the University of Kansas
Children's Campus of Kansas City (CCKC)
444 Minnesota Avenue, Suite 300
Kansas City, KS 66101
Phone: 913.321.3143
Fax: 913.371.8522

This grant is funded by the Institute of Education Sciences, U.S. Department of Education, National Center for Education Research, Award # R305A170241

QUESTIONS?

REFERENCES

Greenwood, C. R., Abbott, M., Beecher, C., Atwater, J., & Petersen, S. (2018). Development, validation, and evaluation of literacy 3D: A package supporting tier 1 preschool literacy instruction implementation and intervention. Topics in Early Childhood Special Education, 37(1), 29-41. doi:10.1177/0271121416652103

Greenwood, C. R., Beecher, C., Atwater, J., Petersen, S., Schiefelbusch, J., & Irvin, D. (2018). An eco-behavioral analysis of child academic engagement: Implications for preschool children not responding to instructional intervention. Topics in Early Childhood Special Education, 1-15. doi:10.1177/0271121417741968

Greenwood, C. R., Carta, J. J., Atwater, J., Goldstein, H., Kaminski, R., & McConnell, S. R. (2013). An eco-behavioral analysis of child academic engagement: Implications for preschool children not responding to instructional intervention. Topics in Early Childhood Special Education, 33(1), 48-64. doi:10.1177/0271121417741968

Greenwood, C. R., Carta, J. J., Schnitz, A. G., Irvin, D. W., Jia, F., & Atwater, J. (2019). Filling an information gap in preschool MTSS and RTI decision making. Exceptional Children. 85(3), 271–290. doi:10.1177/0014402918812473

Hart, B., & Risley, T. R. (1995). Meaningful differences in the everyday experience of young American children. Baltimore: Brookes.

Hart, B., & Risley, T. R. (2003). The early catastrophe: The 30-million-word gap by age 3. Retrieved from http://isites.harvard.edu/fs/docs/icb.topic1317532.files/09-10/Hart-Risley-2003.pdf

Kong, N. Y., Greenwood, C. R., & Carta, J. J. (in press). Studies in MTSS problem solving: Improving response to a pre-kindergarten supplemental vocabulary intervention. Topics in Early Childhood Special Education.

Teachers’ Use of Progress Monitoring Data in a Tier 1 Early Language and Literacy Intervention

BARBARA A. WASIK & ANNEMARIE H. HINDMAN

CRIEI FEBRUARY 2020

Story Talk: AN EVIDENCE-BASED VOCABULARY AND LANGUAGE INTERVENTION


Story Talk Components

• Year-long teacher professional development: coaching twice a month (1 hour), plus 4 trainings (3 hours per training)

• Theme-based trade books

• Story Maps: target vocabulary words, open-ended questions, picture cards with word definitions, and before, during, and after questions

• Book-related center activities

Progress Monitoring

Three times per year (Nov., Feb. & April)

Assessment of 20 target words randomly selected from Story Maps

Expressive word knowledge followed by receptive word knowledge
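As an illustration of the sampling step, the sketch below draws 20 words at random from a Story Map word pool and scores one response, expressive first and then receptive. The word pool, responses, and scoring labels are hypothetical; the slides specify only the random selection of 20 target words and the expressive-then-receptive order.

```python
# Hypothetical sketch of the progress-monitoring word sampling and scoring order.
# The word pool, responses, and scoring labels are invented for illustration.
import random

story_map_words = [
    "enormous", "gather", "curious", "soggy", "fetch", "gleam", "stumble",
    "drowsy", "timid", "scatter", "sprout", "clutch", "murmur", "vast",
    "brisk", "fragile", "glide", "huddle", "nibble", "perch", "quiver", "rustle",
]

# Randomly select 20 target words from the Story Map word pool.
assessment_words = random.sample(story_map_words, 20)

def score_word(expressive_correct: bool, receptive_correct: bool) -> str:
    """Expressive knowledge is probed first, then receptive knowledge."""
    if expressive_correct:
        return "knows expressively"
    if receptive_correct:
        return "knows receptively"
    return "not yet known"

# Example: score one (invented) response pattern for the first sampled word.
print(assessment_words[0], "->", score_word(expressive_correct=False, receptive_correct=True))
```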

Progress Monitoring Training & Coaching

• One training devoted to progress monitoring data:
◦ What do the data mean?
◦ What to do for children who are not learning the words?

• Explicit instructional strategies

• Increase engagement with words (conversations, activities, etc.)

• Work in small groups

Randomized Controlled Trial

• 35 public preschool teachers in a major city in the Northeast

• Random assignment: 20 to intervention, 15 to control

Teachers

40% of teachers were African American; 60% were white

100% with BA, 65% with MA

Average experience = 10 years

All used district-created balanced literacy curriculum

Children

550 children

85% were African American, 10% Hispanic/Latino, 5% white

Average age = 4.5 years

10% dual language learners

15% with identified special needs

Child Measures

All children were assessed on three vocabulary measures in the fall (pre) and spring (post)

• Receptive vocabulary: Peabody Picture Vocabulary Test, 4th ed.

• Expressive vocabulary: Expressive One-Word Picture Vocabulary Test, 4th ed.

• Taught words: project-aligned measure (Nov., Feb., April)

Child Outcomes

• Story Talk children made significantly greater gains on the Peabody Picture Vocabulary Test-4 (p = .007, d = .19)

• Story Talk children made significantly greater gains on the Expressive One-Word Picture Vocabulary Test (p = .010, d = .14)

• Story Talk children outscored peers on progress monitoring assessments (β = .77 in Nov., β = .71 in Feb., β = .73 in April; p < .001 for all)

Treatment Effects for Children below the Mean on Expressive One-Word

• Divided the children with scores below the mean into high and low subgroups

• Children scoring high (below the mean) = low impact of treatment

• Children scoring low (below the mean) = high impact of treatment

Unpacking the results

EXAMINING IMPLEMENTATION

Research Questions

RQ 1- Was Progress Monitoring discussed as a part of coaching sessions?

RQ 2- Did coaches provide teachers with explicit strategies to work with children who performed low on PM?

RQ 3- Was there evidence that teachers implemented RTI in classrooms?

Qualitative data

• Coaching videos (2/month)

• Coaching interviews

• Bi-monthly classroom observation videos

• Teacher interviews

RQ 1- Was Progress Monitoring discussed as a part of coaching sessions?

91% of the coaching videos showed coaches talking to teachers about the progress monitoring data:
◦ "What are you doing with children who are struggling?"
◦ "What are you doing with (specific children)?"
◦ "Can we review the Progress Monitoring data?"

RQ 2- Did coaches provide teachers with explicit strategies to work with children who performed low on PM?

89% of the coaching sessions provided at least one explicit strategy to address low Progress Monitoring scores:
◦ Work with children in small groups
◦ Increase the conversations with specific children
◦ Increase the opportunities for children to use those words

RQ 3: Evidence of RTI

Videos of book reading and small groups:
◦ Limited data
◦ Limited information from the available videos
◦ Limited evidence

Teacher Interviews

Not sure what to do

Not enough time to do what was suggested

Didn’t think what was suggested would work

Had to emphasize other content such as letter learning and letter sounds

Summary

Without systematic use of PM data, children still learned words and improved their vocabulary

Potentially, with more focused and higher-dosage RTI, there would be a greater impact (?)

Need well-designed Tier 2 & 3 instructional strategies to support teachers

Thanks to my Collaborators
Dr. Annemarie Hindman - Temple University

Dr. Emily Snell - Temple University

Mary Alice Bond - Johns Hopkins University

Kate Anderson - Johns Hopkins University

Teachers and children of the Baltimore City Public Schools

Teachers and children of the School District of Philadelphia

Thank you! QUESTIONS?


MTSS Dual Language Universal Screening

Lillian K. Durán, University of Oregon
[email protected]
CRIEI February 2020

Our team
• This work has been completed through the hard work of Drs. Wackerle-Hollman, Durán, & Rodriguez, with the support of a highly skilled coordinator, Erin Lease, and a committed team of GRAs and Research Associates: Jose Palma, Dr. Ruby Batz, Elizabeth Stein, Cheryl Perez, and Alejandra Miranda. This project is funded by the Institute of Education Sciences.

Financial disclosure: Dr. Durán serves as a consultant to, and holds equity in, Renaissance, a company which has a license from the University of Minnesota to commercialize Individual Growth & Development Indicators (IGDIs). These relationships have been reviewed and managed by the University of Oregon in accordance with its conflict of interest policies.

Universal Screening
• A basic component of MTSS is the implementation of universal screening procedures that accurately identify children in need of additional instructional support.

• These assessment procedures form the skeletal structure of MTSS because, without technically adequate assessment, there is no way to accurately identify which children will need additional and targeted supports.

Critiques of the implementation of MTSS with DLLs

• Assessments are primarily administered in English.

• There is a lack of technically adequate universal screening measures available in languages other than English.

• There is little information available to guide teachers in interpreting scores in children's home language and English.

• Even if children are assessed in their home language, there are few evidence-based interventions available in languages other than English and few trained staff to implement interventions in children's home language.

(Brown & Sanford, 2011; Linan-Thompson, Cirino, & Vaughn, 2007)

The Evaluation of Response to Intervention Practices for Elementary School Reading (Balu et al., 2015)

• Reading outcomes were only measured in English

• There was no discussion of the language(s) used for screening and progress monitoring with DLLs

• Interestingly, there were significant positive effects found for DLLs, but it was not made clear whether their gains brought them to the level of benchmark performance

Student Sample
Student sample sizes are as follows:
• 6,236 for Grade 1 ECLS-K reading assessment
• 5,398 for Grade 1 TOWRE2
• 4,301 for Grade 2
• 6,549 for Grade 3
ELL population:
• 13% of first-graders
• 9% of second-graders
• 6% of third-graders

This was one of the few statements regarding screening in the report

• Screen all students, and target intervention support. Early identification of students at risk for long-term reading difficulties begins with systematic screening near the beginning of the school year and at least once again in the middle of the year. Elements of a screening battery include standardization of screening procedures; grade-level benchmarks or expectations; designated risk levels; ease and efficiency of administration; and documented reliability, validity, and diagnostic accuracy of the screening measures. Such measures can be used to identify individual students or can be aggregated to examine the adequacy of the core curriculum as well as the effectiveness of different instructional strategies used in a school

Limitations in English-only measurement

• When children have limited English proficiency, it is impossible to accurately estimate their ability level in language and literacy skills such as rhyming, blending, alliteration, vocabulary, word reading, etc., when they are assessed only in their emerging second language

Why Screen in Spanish?
Quite simply… to improve accuracy. Children will be able to demonstrate their highest level of ability in discrete early literacy and language skills in a language they know best (Peña, Bedore, & Kester, 2012; Peña & Halle, 2011).

In 2017 the Educational Testing Service issued a report titled A Framework for the Dual Language Assessment of Young Dual Language Learners in the United States, which clearly describes an assessment approach that includes testing children in their home languages as well as English (Guzman-Orth, Lopez, & Tolentino, 2017).

Language Exposure

• It is also well documented that the amount of home exposure to English and Spanish influences language proficiency in each language

• Language proficiency, in turn, has been found to be related to reading performance

(Cardenas-Hagan et al., 2007; Hammer et al., 2008; Mancilla-Martinez et al., 2019)

Simultaneous and sequential bilinguals

Exploring the effects of language exposure provides the rationale for testing differences between:
• Simultaneous bilinguals, who grow up speaking more than one language from birth or before the age of three, and
• Sequential bilinguals, who generally learn a second language sometime after the age of 3.

Gap in research to practice

• However, despite increasing evidence and key practice recommendations, the authors of the RTI evaluation report made no mention of Spanish language assessment

• The programs involved in the RTI report were selected to be representative of common practice, but it appears they were all using English-only approaches to screening in their RTI models

Purpose of the study
• To contribute to what is known about differences in the identification of risk when DLLs are measured in Spanish and English in phonological awareness, alphabet knowledge, and oral language

• To explore differences between simultaneous and sequential bilinguals

Study Research Questions

1. Is there a difference in rates of tier designation when universal screening measures are administered in English or Spanish?

2. Do these rates differ between children who are simultaneous or sequential bilinguals?
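As a rough computational sketch of RQ1 (the cut scores, tier labels, and child records below are hypothetical and are not actual IGDI benchmarks), the same children are screened in each language and the share falling below the cut score is compared:

```python
# Hypothetical comparison of tier-designation rates by screening language.
# Cut scores and child records are invented; they are not actual IGDI benchmarks.

CUT_SCORES = {"english": 10, "spanish": 8}  # hypothetical Tier 2/3 cut points

# Each child is screened with parallel measures in both languages (invented scores).
children = [
    {"id": 1, "english": 4, "spanish": 12},
    {"id": 2, "english": 7, "spanish": 9},
    {"id": 3, "english": 12, "spanish": 14},
    {"id": 4, "english": 5, "spanish": 6},
]

def tier23_rate(records, language):
    """Share of children scoring below the cut score for the given language."""
    flagged = [c for c in records if c[language] < CUT_SCORES[language]]
    return len(flagged) / len(records)

for lang in ("english", "spanish"):
    print(f"Tier 2/3 designation rate ({lang}): {tier23_rate(children, lang):.0%}")
```

The same comparison could be repeated separately for simultaneous and sequential bilinguals to address RQ2.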

Measures
IGDIs-Español

• Brief, easy to implement general outcome measures (GOMs) for 4-5- year olds

• Designed for Spanish-English bilingual preschoolers who are in the United States and may move into English-only or bilingual K-12 experiences.

• Includes measures of oral language, phonological awareness, & alphabet knowledge

• Designed by specifically attending to how Spanish develops, rather than translating English tasks, leading to an emphasis on tasks that are more culturally and functionally salient for Spanish–English bilinguals.

S-IGDIs: Picture Naming, Expressive Verbs, Letter Identification

S-IGDIs: Letter Sounds, First Sounds, Storybook

English IGDIs 2.0

• Brief, easy to implement general outcome measures (GOMs) for use with 3-5- year-olds

• Includes measures of oral language, phonological awareness, comprehension, & alphabet knowledge

• Available in a computer adapted testing format

• Developed with about 1,000 children across the US

Sample measures: Picture Naming, Letter Sounds, First Sounds

Sample

• 214 of over 800 4-5-year-old Spanish-speaking dual language learners in the total sample

• Only children who had complete data sets were included in this sample

• All were enrolled in Head Start programs
• Participants were from CA, OR, UT, and MN

Procedures

• Children were measured once a month for two academic years in English and Spanish by trained data collectors in their preschool classrooms (2017-2019)

• The first year began in January and the second year began in September with the last data point in May.

• We combined both samples of children in this report

Results

Discussion

• There is a significant and meaningful difference in the identification of risk when DLLs are measured in Spanish and English across phonological awareness, alphabet knowledge and oral language

• The largest difference was in oral language, with Tier 2/3 designations for the total sample at 86% on English Picture Naming versus 29% on Identificación de los Dibujos

• 63% of children were identified as at-risk in Phonological Awareness in English versus 21% in Spanish

• In Alphabet Knowledge in English 47% were identified as at-risk versus 22% in Spanish

Discussion

• It is also important to investigate differences within the DLL group to analyze differences between simultaneous and sequential bilinguals

• 93% of Spanish-dominant (sequential) children were identified as at-risk on English Picture Naming, whereas 18% were identified as at-risk on the Spanish measure,
vs. 78% of bilingual (simultaneous) children identified as at-risk on English Picture Naming, whereas 40% were identified as at-risk on the Spanish measure.
Notice the differences in both Spanish and English based on exposure.

• 45% of Spanish-dominant (sequential) children were identified as at-risk in English alphabet knowledge, whereas 19% were identified as at-risk on the Spanish measure,
vs. 48% of bilingual (simultaneous) children identified as at-risk in English alphabet knowledge, whereas 27% were identified as at-risk on the Spanish measure.
Interestingly, the bilingual group has higher rates of risk in both languages.

• 66% of Spanish-dominant (sequential) children were identified as at-risk in English phonological awareness, whereas 21% were identified as at-risk on the Spanish measure,
vs. 60% of bilingual (simultaneous) children identified as at-risk in English phonological awareness, whereas 21% were identified as at-risk on the Spanish measure.
In phonological awareness, the groups are nearly equivalent.

Summary
• It is important to screen Spanish speakers in Spanish to reduce the risk of over-identification of risk

• We need to increase awareness of the need to assess students in the language or languages that will most accurately reflect their ability

• We also need to provide more information and guidelines about the limitations of assessing DLLs only in English

• More information is needed about how to use scores in both English and Spanish to inform instruction

• Analyses disaggregating simultaneous from sequential bilinguals could lead to more effective approaches to differentiating instruction based on language proficiency and to more accurate screening and progress monitoring

Discussion questions
• How can we respond more rapidly in the field to this growing imperative for dual language measurement approaches?

• How do we reach program directors and others who make the on-the-ground decisions about selecting assessments for their programs?

• How do we continue to offer professional development on the technical adequacy of instruments and assessment approaches, as outlined in the ETS report, for the practitioner audience?