
JULY 2021

The EdTech Genome Project Report


Acknowledgments

The EdTech Genome Project stands on the shoulders of thousands of researchers and educators. In the year before this initiative convened its Steering Committee, the University of Virginia-based research team driving the work examined more than 1,300 peer-reviewed articles and conducted original research involving the participation of more than 1,500 teachers.

Once the technical working network for the EdTech Genome Project kicked into gear, more than 140 researchers, practitioners, experts, system leaders, and industry representatives joined committees and working groups. Over one thousand educators and stakeholders across the country engaged at various points in the initiative to provide feedback on which context variables to select, how to define them, and how to refine draft measurement instruments.

The findings and path laid forward in this report would not be possible without all of their contributions, for which we are extremely grateful. Additionally, many individuals across the EdTech Evidence Exchange community poured their efforts into creating this report.

Bart Epstein provided vision and leadership for every stage of the initiative. Drs. Emily Barton and Kate Tindle led the University of Virginia-based research team, whose exhaustive literature review provided ballast for the EdTech Genome Project. Dan Brown managed recruitment and engagement with the more than 140 educators, researchers, advocates, experts, and industry leaders who participated in the technical working network that produced this report. Dr. Barton then took the lead, supported by Dr. Tindle and Ayana D’Aguilar, throughout the initiative, working with the incredible collection of research and content experts, to ensure that project outputs meet high standards for education research. Christine Tomasik provided organizational leadership from the project’s inception through this report’s publication. Ivy Missen, Ayana D’Aguilar, and Blake Bussie provided communications and logistics support. Dr. Lauren Molloy Elreda, Madelyn Roeder, Claire Fisher, Laura Pottmeyer, and Katie Heinekamp provided research support. Allie Strandmark did the graphic design for this report.

We also wish to thank Dr. Robert Pianta, our board chair and dean of the University of Virginia School of Education and Human Development, for his candid and thoughtful feedback throughout the process. His vision, with CEO Bart Epstein, to convene the EdTech Efficacy Research Academic Symposium in 2017 led directly to the EdTech Genome Project effort and to the groundbreaking EdTech Evidence Exchange Platform being built on its findings.

Special thanks to the Chan Zuckerberg Initiative, Carnegie Corporation of New York, Strada Education Network, and University of Virginia School of Education and Human Development, whose support for our organization makes this work possible.

Finally, we wish to honor the memory of Candice Dodson, a true education innovator, thoughtful colleague, and ebullient EdTech Genome Steering Committee member.

The EdTech Evidence Exchange is structured as a public charity that is supported exclusively by grants, philanthropy, and support from a growing list of funders including the University of Virginia School of Education and Human Development, Strada Education Network, Carnegie Corporation of New York, and the Chan Zuckerberg Initiative. Its mission is to help educators make better-informed decisions about education technology.

© 2021, EdTech Evidence Exchange.


TABLE OF CONTENTS

Acknowledgments
Table of Contents
Key Takeaways
Executive Summary
Participants
  Participant Logos
  Steering Committee
  Advisory Board
  Variable Working Groups
  Measurement Council
  Industry Council
  Research Council
Showing Our Work
  Timeline
  Literature Review & Data Collection
  Convening Experts
  Gaining Variable Consensus
  Diving Deep on Variables
  Creating the Context Inventory
  Validating the Context Inventory
EdTech Context Framework
  Vision for Teaching & Learning
  Selection Processes


  Teacher Agency
  Infrastructure & Operations
  Implementation Systems & Processes
  Staff Culture
  Teacher Beliefs & Knowledge
  Strategic Leadership Support
  Professional Learning
  Competing Priorities
EdTech Context Inventory
  Introduction
  Vision for Teaching & Learning
  Selection Processes
  Teacher Agency
  Infrastructure & Operations
  Implementation Systems & Processes
  Staff Culture
  Teacher Beliefs & Knowledge
  Strategic Leadership Support
  Professional Learning
  Competing Priorities
Frequently Asked Questions
Glossary
Supplemental Resources
  Researcher Action Steps
  Educator Action Steps
  Industry Action Steps


THE EDTECH GENOME PROJECT
Key Takeaways

• Edtech decision makers currently select and implement technologies with almost no information about what is likely to work in their schools. They spend tens of billions of dollars each year on edtech that is underused, inequitably used, or ineffectively used.

• This cycle persists because educators lack the shared language, incentives, and mechanisms to document their experiences with edtech and to share lessons learned with other educators working in similar schools and districts.

• The EdTech Evidence Exchange took a crucial step toward solving this problem via the EdTech Genome Project. Over the last several years, this project built sector-wide consensus on 10 consequential variables that describe how school contexts vary in ways that influence the success or failure of edtech.

• Harnessing the expertise of a broad and diverse group of practicing educators, researchers, industry representatives, policy makers, and nonprofit leaders, the EdTech Genome Project developed two tools: (1) the EdTech Context Framework, a set of comprehensive definitions for each variable, and (2) the EdTech Context Inventory, a set of 10 cohesive instruments for detecting and measuring these variables.

• With the release of these tools, educators and education stakeholders now have the shared language and measurement instruments needed to detect and document how and why the impact of edtech tools varies so widely from school to school.

• Next, the EdTech Evidence Exchange is launching a massive effort to incentivize and collect implementation data from hundreds of thousands of educators who will use the new tools to carefully describe their contexts and document their experiences with specific technologies.

• As these data reach critical mass, educators nationwide will — for the first time ever — be able to learn, at scale, from the experiences of those working in similar schools and districts.


Definitions of Context Variables

1. The vision for teaching and learning unifies stakeholders with clear direction, purpose, and rationale for technology-supported learning. A high-quality vision is forward-thinking and actionable, and, to have effect, must be consistently communicated and referenced as a guide for action. Visioning helps schools and districts recognize opportunities for technology to address problems of practice, prioritize equity, and plan for technology integration that promotes student learning opportunities. Visions describe the ideal state of teaching and learning for all students in which digital technologies transform daily life.

2. Selection processes occur prior to procurement; they are the presence and quality of consistent methods through which classrooms, schools, districts, and states identify technologies, evaluate those technologies, and choose technologies for procurement to meet established student and teacher needs for learning and instruction.

3. Teacher agency is the extent to which teachers consistently have a voice in shaping their work and the conditions and tools for that work. Regarding education technology implementation, this is the extent to which the conditions for agency are in place and a variety of teachers are consistently involved in decision-making related to shared visioning, selection processes, implementation processes, infrastructure, and professional learning.

4. Infrastructure and operations are the enabling conditions that lower barriers for implementation, facilitate uptake, and support scaling and sustaining new education technology. These conditions include physical resources, broadband Internet connectivity, students' remote devices and connectivity, human resources, system specifications, operational policies, and funding.

5. Implementation systems and processes occur after procurement; they are the presence and quality of methods through which school communities put education technology into effect over time to achieve intended outcomes. This includes mechanisms for monitoring ongoing fit with current initiatives, conducting resource inventories, monitoring the ongoing use of the technology as it was designed, making systemic adjustments as needed, and documenting evidence of impact on target outcomes.

6. Staff culture refers to the set of beliefs, values, norms, and assumptions that are shared collectively by the school and/or district staff and that influence the way in which staff members work individually and collaboratively to fulfill the school's shared vision for teaching and learning. Important facets of staff culture include trust, social capital, communication, and equity.

7. Teacher beliefs and knowledge are an individual teacher's perceived ability to use education technologies and integrate them into their practice. This variable combines (1) teachers' beliefs about, knowledge about, and experiences using education technology and (2) teachers' understanding of curriculum, instruction, and assessment. Together, these elements interact to enable the comfort and flexibility necessary to use education technologies effectively and appropriately in different learning settings.

8. Strategic leadership support is the extent to which district and school leaders provide explicit encouragement and guidance to staff who are selecting and implementing education technology tools. This support sets and communicates a vision, develops staff, and aligns technology implementation with the district instructional plan.

9. Professional learning is the presence, duration, and quality of a range of intentional, adult learning activities that support the effective integration of education technology to advance student learning and outcomes. This includes both formal and informal opportunities that lead to shifts in beliefs, knowledge, skills, and practices related to technology integration.

10. Competing priorities are the extent to which a school or district has other prioritized initiatives that impact the available time and attention for new technology implementations. The presence of competing priorities is influenced by limited instructional time, limited preparation time, overlapping initiatives, and communication of priorities.

THE EDTECH GENOME PROJECT
Executive Summary

With the publication of this report, we celebrate a meaningful milestone in an ongoing effort to improve our nation’s education system.

We know that education technology (edtech) can have a transformative impact on student learning. When used properly, it can help teachers enhance and differentiate instruction and can help students master new concepts and skills in engaging and empowering ways.

Unfortunately, our collective investment of more than $100 billion in edtech over the last decade1 has fallen far short of its potential impact. A disturbing amount of edtech is used ineffectively or not at all. Worse still, students in schools with predominantly economically disadvantaged learner populations likely experience inequitable, lower-quality implementation.

Our work in recent years has made clear that a source of this problem is a lack of information that is rooted in a collective action problem. In our highly fragmented education system, teachers and administrators crave information. They want to learn from the experiences of other educators whose school contexts are most similar to their own. Currently, the information available to educators making edtech selection and implementation decisions does not help them understand how a technology will work in their context, with their students. Educators have no effective way to learn from (and build on) each other's experiences using technology in their classrooms and schools. More information about this collective action problem is contained in the Frequently Asked Questions section of this report.

We can and must do better.

To address the problem, we needed to develop common language and measurement instruments to collect contextually relevant data from large numbers of educators across the nation. This was the motivation for the EdTech Genome Project, in which we aimed to map the “genome” of edtech implementation contexts. Next, we need to build a place where educators can easily learn from one another.

Over the last two years, thanks to the hard work of the broad and diverse group of education stakeholders who participated in the EdTech Genome Project, we reached sector-wide consensus on how to define and measure 10 variables that describe how school contexts vary from each other in the ways that likely matter most to the successful selection and implementation of education technology. You will find the list of the stakeholders whose work powered the EdTech Genome Project in the Participants section of this report.

We have collectively taken a substantial step toward fixing the problem.

We now have the shared language and measurement instruments needed to form the backbone of a software platform that will enable hundreds of thousands of teachers and administrators to share detailed, standardized data not only on their experiences with various education technology products but also on the characteristics of their local contexts. Paired together, these datasets will finally make it possible for educators to learn from the experiences of other educators at scale.

As you will read in the Showing Our Work section of this report, EdTech Genome Project participants, including 10 variable-specific working groups, engaged in an exhaustive process to:

• find and review previous academic research;

• collect perspectives from teachers and administrators;

• convene stakeholders to discuss, debate, and make decisions;

• take public comment; and

• pilot and revise instruments

all in service of reaching consensus on how to define and measure each context variable.

You will find the long and short definitions for each of the 10 context variables in the EdTech Context Framework section of this report. You will find samples of the new measurement instruments in the EdTech Context Inventory section of this report. As these new measurement instruments undergo field validation, we expect them to be sharpened and shortened to make it increasingly easy for all of us to understand and discuss diverse implementation contexts.

We now have standardized ways to measure conditions that influence edtech implementation, such as teacher agency, infrastructure & operations, professional learning, competing priorities, and more.

In addition to this report, we prepared three supplemental two-pagers: Researcher Action Steps, Educator Action Steps, and Industry Action Steps. These two-pagers offer concrete ways that researchers, administrators, teachers, and industry can begin to bring the work of the EdTech Genome Project to life. They are also a helpful tool for sharing the variables with your colleagues.


The accomplishments detailed in this report are worth celebrating but are only a means to an end. To make it feasible for millions of educators to learn from each other’s experiences, we must now embed these new definitions and measurement instruments into software and systems that we can use to learn from each other at scale.

To that end, we have begun building the EdTech Evidence Exchange Platform, which will become a national repository for detailed and context-sensitive evidence, matching educators to information from contexts like their own across the nation. Hundreds of thousands of educators will use the platform to systematically document and share evidence on their implementation contexts and experiences.

We must work together to fill the EdTech Evidence Exchange Platform with a critical mass of data from a sufficient number of educators. These data are the key ingredients and secret sauce that will empower educators nationwide to make dramatically improved decisions about which edtech to buy and how to most effectively implement it.

Collecting these data will be challenging but doable. In our fragmented system (of more than 13,000 school districts and more than 100,000 schools) nearly everyone wants to learn and benefit from their peers’ experiences. But nobody has an incentive to carefully document their own experiences. As you will read in the Frequently Asked Questions section of this report, we are working to solve that problem next, and we are counting on your help.

Indeed, the real work lies ahead. Thank you for being a part of our collective journey.

Onward!

1 Based on data prior to 2015, estimates placed edtech spending at $13.1B per year. In 2021, we collected data sources that support a pre-COVID-19 pandemic estimate of between $26B and $41B per year spent on edtech. These data were collected before pandemic-related spending drove substantial additional edtech investment. As such, $100B is actually a conservative estimate of the spending on edtech over the last decade.
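A quick check of that claim, using only the older, lowest per-year figure from the footnote:

    10 years x $13.1B/year = $131B > $100B

which already exceeds the $100B decade total before counting the higher $26B-$41B recent estimates.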


THE EDTECH GENOME PROJECT
Participants

Participant Logos
Steering Committee
Advisory Board
Variable Working Groups
Measurement Council
Industry Council
Research Council


Participant Logos

Thank you to the organizations that allowed their team members to participate on EdTech Genome Project committees, boards, and working groups.


Steering Committee

This diverse group of education leaders, selected both by application and by appointment, made key decisions to form national consensus on top variables for edtech implementation. The EdTech Genome Steering Committee had authority to guide and to approve the final deliverables of the 10 working groups that developed definitions and measurement instruments for each of the context variables selected to be studied first.

Co-chair: Melissa Collins
Global Teacher Prize Finalist; Teacher
Shelby County Schools (TN)

Co-chair: Verna Lalbeharie
Managing Director of Digital Age Personalized Learning
American Institutes for Research (AIR)

Co-chair: Joseph South
Chief Learning Officer
International Society for Technology in Education (ISTE)

Doretha Allen
Middle School Innovation Coordinator
Dallas Independent School District (TX)

Lennon Audrain
Teacher
Brookline Public Schools
Former Student President
Educators Rising

Jason Bailey
Director of Innovation and Design
State Educational Technology Directors Association (SETDA)

Danny Carlson
Associate Executive Director for Policy & Advocacy
National Association of Elementary School Principals (NAESP)

Kimberly Dadisman
Senior Policy and Research Manager
Abdul Latif Jameel Poverty Action Lab (J-PAL North America)

Stacey Dallas Johnston
Teacher
Clark County School District (NV)
Former Teaching Ambassador Fellow
US Department of Education

Aman Dhanda
Director of Member Engagement and Partnerships
National Association of Secondary School Principals (NASSP)

Candice Dodson*
Executive Director
State Educational Technology Directors Association (SETDA)

Jason Edwards
Senior Associate
American Federation of Teachers (AFT)

Rose Else-Mitchell
President of Education Solutions
Scholastic

Brent Engelman
Director of Education Data and Information Systems
Council of Chief State School Officers (CCSSO)


Meg Hamel
Director of Learning Initiatives and EdSurge Research
International Society for Technology in Education (ISTE)

Kristin Hamilton
Vice President of Standards & Assessment
National Board for Professional Teaching Standards (NBPTS)

Barbara Hickman
Assistant Professor, College of Education
University of Wyoming

Maria Hyler
Director of Washington D.C. Office and Senior Researcher
Learning Policy Institute (LPI)

Alexander Kmicikewycz
Teach Plus Fellow; Teacher
Chicago Public Schools (IL)

Keith Krueger
Chief Executive Officer
Consortium for School Networking (CoSN)

Chris Liang-Vergara
Founder
World Class Edu

Stephanie Marken
Executive Director of Education Research
Gallup

Saro Mohammed
Founder and Principal
Ed Research Works

Andrea Prejean
Director of Teacher Quality
National Education Association (NEA)

Alexandra Resch
Director of Learning & Strategy, Human Services
Mathematica

Brian Seymour
Education Technology Director
Pickerington Local School District (OH)

David Slykhuis
Assistant Dean, College of Natural and Health Science; Director, Mathematics and Science Teaching (MAST) Institute
University of Northern Colorado
Chair
National Technology Leadership Summit (NTLS)

Daniel Stanhope
Vice President of Research & Analytics
LearnPlatform

Katrina Stevens
President and CEO
The Tech Interactive

Lauren Stuart
Teacher
Beverly Hills School District (CA)
Board Member
EdReports.org

Michelle Tiu
Director of Educational Technology
WestEd

Bi Vuong
Managing Director of Education Practice
Project Evident

*Deceased


Advisory Board

These senior leaders from across the education sector provided guidance on project strategy, participant recruitment, and the content of the EdTech Context Framework and the EdTech Context Inventory.

Dana Ansel
Chief Academic Officer
LearnLaunch

Thomas Arnett
Senior Research Fellow
Christensen Institute

Ryan Baker
Associate Professor
University of Pennsylvania Graduate School of Education

Peggy Brookins
Chief Executive Officer
National Board for Professional Teaching Standards (NBPTS)

Eric Brown
Executive Committee
National Education Association (NEA)

Tonika Clayton
Managing Partner
New Schools Venture Fund (NSVF)

Patrice Dawkins-Jackson
Director of Post-Baccalaureate Fellowship Programs
Carnegie Foundation for the Advancement of Teaching

Elizabeth Foster
Vice President of Research & Standards
Learning Forward

David Irwin
Co-Founder
Thru

Eric Isselhardt
Former President
National Network of State Teachers of the Year (NNSTOY)

Jacqueline Jodl
Associate Professor
University of Virginia School of Education and Human Development

George Kane
General Manager, Education Ventures
Emerson Collective

Tom Kane
Economist and Walter H. Gale Professor of Education
Harvard Graduate School of Education

Sonny Magana
Chief Executive Officer
Magana Education

Christopher Mazzeo
Director, Center for Research, Evaluation and Analysis
Education Northwest

Michele McLaughlin
President
Knowledge Alliance


Erin Mote
Executive Director and Co-Founder
InnovateEDU

Dean Nafziger
Vice President
Westat

Jennifer Norford
Chief Program Officer
Marzano Research

Ronn Nozoe
Chief Executive Officer
National Association of Secondary School Principals (NASSP)

Leila Nuland
Managing Content Director
Hanover Research

Lynn Olson
Consultant; Former Deputy Director of K12 Education
Bill and Melinda Gates Foundation

John Pane
Senior Scientist
RAND

Beth Rabbitt
Chief Executive Officer
The Learning Accelerator

Ron Reed
Founder and Executive Producer
SXSW EDU

Kimberly Smith
Executive Director, League of Innovative Schools
Digital Promise

LaVerne Srinivasan
Vice President of National Programs, Program Director for Education
Carnegie Corporation of New York (CCNY)

Bill Tally
Senior Research Scientist, Center for Children and Technology
Education Development Center (EDC)

Valerie Truesdale
Assistant Executive Director
American Association of School Administrators (AASA)

Richard Varn
President
Educational Testing Service (ETS)

Rob Weil
Director of Field Programs
American Federation of Teachers (AFT)


Variable Working Groups

These 10 working groups each took one context variable selected by the Steering Committee for further study. Each working group spent half a year developing short and long definitions, as well as a draft instrument, for its variable. The 10 harmonized instruments comprise the EdTech Context Inventory.

SELECTION PROCESSES

Chair: Kimberly Dadisman
Senior Policy and Research Manager
Abdul Latif Jameel Poverty Action Lab (J-PAL North America)

Jena Draper
Founder & Evangelist
CatchOn

Julia Febiger
Director of Assessment and Research Markets
Curriculum Associates

Jin-Soo Huh
Partner
The Learning Accelerator

Vytas Laitusis
Education Research Director
Houghton Mifflin Harcourt

Bryan Matlen
Senior Research Associate
WestEd

Caitlin McLemore
Educational Consultant
Blank Crayon

Elizabeth Myers
Director of Education Research & Evaluation
WGBH

COMPETING PRIORITIES

Chair: Thomas Arnett
Senior Research Fellow
Christensen Institute

Nathan Craver
Educational Consultant
North Carolina Department of Public Instruction

Ryan Imbriale
Executive Director of Educational Operations
Baltimore County Public Schools

Nancy Kolas
Director of Success Management
Lexia Learning

Emily Lepkowski
Manager of Teaching and Learning
Newsela

Ruby West
Director of Assessment Platform
Curriculum Associates


INFRASTRUCTURE & OPERATIONS

Chair: Mark Samberg
Director of Technology Programs
Friday Institute for Educational Innovation

Daniel Brenner
Senior Research Associate
WestEd

Doug Casey
Executive Director
Connecticut Commission for Educational Technology

Monica Cougan
Manager of Strategic Relationships and Initiatives
Education Networks of America (ENA)

Nithi Thomas
Partner
The Learning Accelerator

IMPLEMENTATION SYSTEMS & PROCESSES

Chair: Barbara Hickman
Assistant Professor, College of Education
University of Wyoming

Malvika Bhagwat
Director of Outcomes and Efficacy
Owl Ventures

Ryan Burke
Research Associate
WestEd

Jennifer Davis
Senior Psychometrician
Pearson VUE

Marci Houseman
State Success Manager
Lexia Learning

Fan Jiang
Director of Data Analytics for K12 Education
Hanover Research

Sierra Noakes
Research Project Director
Digital Promise

Jeremy Simon
Principal Manager for Big Districts
Clever

Eric Stickney
Senior Director of Educational Research
Renaissance Learning


PROFESSIONAL LEARNING

Chair: Doretha Allen
Middle School Innovation Coordinator
Dallas Independent School District (TX)

Kat Brown
Senior Director of Professional Development
DreamBox Learning

Leslie Fagin
Instructional Technology Coach
Griffin Spalding County School System (GA)

Brittany Guy
Manager of Personalized Learning
Chicago Public Schools (IL)

Maria Hyler
Director of Washington D.C. Office and Senior Researcher
Learning Policy Institute (LPI)

Nancy Mangum
Co-Founder
Leading EDge Learning

Amy O’Connell
Project Director for MAPLE Innovative School Leaders Network
LearnLaunch

Stephen Pham
Director of Organizational Learning
The Learning Accelerator

STAFF CULTURE

Chair: Daniel Stanhope
Vice President of Research & Analytics
LearnPlatform

Debby Almonte-Bertling
Director and Product Owner
Educational Testing Service (ETS)

Elizabeth Birie
Program Manager
MIND Research Institute

Danielle Brown
Professional Learning Director
Arizona K12 Center

Shelby Hubach
Managing Senior Researcher
Marzano Research

Emily Nester
Educational Technology Specialist
Talladega County Schools (AL)

Devin Vodicka
Chief Impact Officer and Chief Academic Officer
Altitude Learning


STRATEGIC LEADERSHIP SUPPORT

Chair: Danny Carlson
Associate Executive Director for Policy & Advocacy
National Association of Elementary School Principals (NAESP)

Jason Bailey
Director of Innovation and Design
State Educational Technology Directors Association (SETDA)

David Chan
Director of Instructional Technology
Evanston Township High School D202 (IL)

David Irwin
Co-Founder
Thru

Ann Koufman-Frederick
Chief Academic Officer
LearnLaunch

Blair Rush
Partner
The New Teacher Project (TNTP)

Ron Wahlen
Director of Digital Learning
Durham Public Schools (NC)

TEACHER AGENCY

Co-chair: Melissa Collins
Global Teacher Prize Finalist; Teacher
Shelby County Schools (TN)

Co-chair: Michael Dunlea
Global Teacher Prize Finalist; Teacher
Tabernacle School District (NJ)

Jeanette Joyce
Senior Researcher
Marzano Research

Jarrett Reid Whitaker
Executive Director of Digital Teaching & Learning
Rice University

Talya Schwartz
Director of Education Research and Insights
Teachers Pay Teachers

Tiffany Wycoff
Co-Founder and Chief Operating Officer
Learning Innovation Catalyst


TEACHER BELIEFS & KNOWLEDGE

Chair: David Slykhuis
Assistant Dean, College of Natural and Health Science; Director, Mathematics and Science Teaching (MAST) Institute
University of Northern Colorado
Chair
National Technology Leadership Summit (NTLS)

Rachel Burstein
Research Associate
EdSurge

Madhu Govind
Graduate Researcher, DevTech Research Group
Tufts University

Chase Nordengren
Senior Research Scientist
Northwest Evaluation Association (NWEA)

Leila Nuland
Senior Managing Director
Hanover Research

Rachel Schechter
Vice President of Learning Sciences
Houghton Mifflin Harcourt

Molly Zielezinski
Founder and CEO
MBZ Labs

VISION FOR TEACHING & LEARNING

Chair: Yvonne Kao
Senior Research Associate
WestEd

Emma Braaten
Executive Director of Digital Teaching and Learning
Chatham County Schools (NC)

Cassondra Corbin-Thaddies
Director of Transformational Coaching
Learning Innovation Catalyst

Paul Franz
Director of Research
MBZ Labs

James Frye
Principal
Catawba County Schools (NC)


Measurement Council

These senior measurement experts reviewed and revised the 10 draft instruments developed by the 10 variable working groups. Their contributions shaped the final EdTech Context Inventory.

Wing Yi (Winnie) Chan
Director of P12 Research
Education Trust
(Winnie Chan completed an independent equity review of the EdTech Context Inventory.)

Jonas Bertling
Director of Large-scale Assessment Questionnaires
Educational Testing Service (ETS)

Stephanie Marken
Executive Director of Education Research
Gallup

Saro Mohammed
Founder and Principal
Ed Research Works

Chase Nordengren
Senior Research Scientist
Northwest Evaluation Association (NWEA)

Daniel Stanhope
Vice President of Research & Analytics
LearnPlatform

Bi Vuong
Managing Director
Project Evident


Industry Council

This group met quarterly to provide feedback and industry perspectives on each stage of the initiative.

Chair: Rose Else-Mitchell
President of Education Solutions
Scholastic

Jill Abbott
CEO
Abbott Consulting Group

Kristal Ayres
Chief Business Development Officer
BrightBytes

Malvika Bhagwat
Director of Outcomes & Efficacy
Owl Ventures

Todd Brekhus
Chief Product Officer
Renaissance Learning

Alex Brown
Senior Director of Customer and Technical Support & Services
Teaching Strategies

Dan Carroll
Co-Founder and Chief Product Officer
Clever

Dan Cogan-Drew
Chief Academic Officer
Newsela

Matt Doherty
Chief Operating Officer
LearnPlatform

Patricio Dujan
Regional Vice President of Sales
Scholastic

Nicole Foster
Head of Global Partnerships and Marketing
Amazon

Sunil Gunderia
Chief Innovation Officer and Head of Mastery & Adaptive Product
Age of Learning, Inc.

Greg Gunn
Founder
Lingo Ventures

Amy Jackson
Vice President of Applied Research & Strategy
Illuminate Education

Kelli Hill
Director of Efficacy & Research
Khan Academy

Timna Molberger
Vice President of Partner Success
Ellevation Education

LaShon Ormond
Vice President of Strategic Initiatives
Amplify

Roya Salehi
Vice President of Customer Success
Lexia Learning and Voyager Sopris Learning, Cambium Learning Group®


Amy Scholz
Chief Marketing Officer
Imagine Learning

Maia Sharpley
Partner
Learn Capital

Marty Thomas
Vice President of Professional Services
Edmentum

Lanette Trowery
Senior Director of Learning Research and Strategy
McGraw-Hill


Research Council

This group of advisors provided strategic advice on the content and process of the EdTech Genome Project's deliverables, as well as guidance on positioning the initiative's work for long-term adoption by the research sector.

Chair: Saro Mohammed
Founder and Principal
Ed Research Works

Ryan Baker
Associate Professor
University of Pennsylvania Graduate School of Education

Maria Hyler
Director of Washington D.C. Office and Senior Researcher
Learning Policy Institute (LPI)

Verna Lalbeharie
Managing Director of Digital Age Personalized Learning
American Institutes for Research (AIR)

Chris Liang-Vergara
Founder
World Class Edu

Christopher Mazzeo
Director, Center for Research, Evaluation and Analysis
Education Northwest

Denis Newman
Co-Founder and Board Chair
Evidentially

Leila Nuland
Senior Managing Director
Hanover Research

Michelle Tiu
Director of Educational Technology
WestEd

Chelsea Waite
Research Fellow
Christensen Institute

Molly Zielezinski
Founder and CEO
MBZ Labs


THE EDTECH GENOME PROJECT
Showing Our Work

Timeline
Literature Review & Data Collection
Convening Experts
Gaining Variable Consensus
Diving Deep on Variables
Creating the Context Inventory
Validating the Context Inventory


EdTech Genome Project Timeline

SUMMER 2018 - SUMMER 2019: Literature Review & Data Collection
FALL 2019: Convening Experts
WINTER 2020: Gaining Variable Consensus
SPRING 2020: Diving Deep on Variables
SUMMER 2020: Creating the Context Inventory
FALL 2020 - ONGOING: Validating the Context Inventory


SUMMER 2018 - SUMMER 2019

Literature Review & Data Collection

Who: UVA research team & the EdTech Evidence Exchange

What: Identify individual and context variables likely to impact edtech implementation.


Introduction

Between summer 2018 and summer 2019, the EdTech Evidence Exchange (i.e., the Exchange) and the UVA research team sought to identify two sets of variables: variables educators perceive to impact edtech implementation and variables previously quantitatively associated with edtech implementation in the literature. We gathered these data to provide comprehensive information to the EdTech Genome Project Advisory Board and Steering Committee on the variables most likely to matter for defining edtech implementation contexts, as well as to inform future research. Based on theories of adoption and diffusion (Straub, 2009), we specifically sought to understand both individual (e.g., teacher beliefs) and setting (e.g., leadership support) variables. To gather this information, we relied on four data sources: stakeholder expertise, event-based teacher surveys and focus groups, local education agency (LEA) research, and systematic literature review. From these data, we presented a list of 23 "contender" variables for the Advisory Board and Steering Committee to review (see Fall 2019: Convening Experts).

Tables 1 and 2 below provide more detail on each type of variable outcome and data source.

Table 1. Variable Outcomes

Variables Educators Perceive to Impact EdTech Implementation
• What is it? Indication that educators believe a variable is responsible for the success or failure of edtech implementation.
• How is edtech implementation defined? Educators identified variables they believed impacted (1) use at all and (2) success in terms of impacting student outcomes.
• What isn't it? It is not statistical evidence of an association between a variable and edtech implementation.
• Why is it important? Informs information to be included in edtech implementation reports; suggests new variables for statistical association investigation.

Variables Associated with EdTech Implementation
• What is it? Statistical evidence of an association between a variable and a quantitative measure of edtech implementation.
• How is edtech implementation defined? Implementation is operationalized differently across articles in the literature review (e.g., frequency by purpose, frequency by technology type). To be included, the measure of implementation must be continuous or ordinal. In the LEA research, implementation is quantified as the total number of programs and mean days of use across programs.
• What isn't it? It is not statistical evidence of a causal association between a variable and edtech implementation.
• Why is it important? Drives measurement focus and development; informs variables to be included in the EdTech Context Framework & Inventory, with which we will match classrooms, schools, and districts.


Table 2. Data Sources

Stakeholder Expertise
• Sample: Educators and education stakeholders who shared informal thinking with the Exchange based on their expertise.
• Type of Data: Qualitative.
• Which outcome? Variables Educators Perceive to Impact EdTech Implementation.

Event-based Teacher Surveys and Focus Groups
• Sample: The Exchange held several events where we specifically asked 1,367 educators which variables impact edtech implementation in terms of use at all and impact on student learning: the Summer 2018 EdTech Implementation Summits and the Fall 2018 Educator Convenings on Research Use (held in partnership with the Institute of Education Sciences).
• Type of Data: Qualitative.
• Which outcome? Variables Educators Perceive to Impact EdTech Implementation.

Local Education Agency (LEA) Research
• Sample: The Exchange conducted research with 506 teachers in 8 schools from a charter management organization and 3 schools from a public school district. Data included a context survey of teachers to measure hypothesized implementation variables, edtech implementation questionnaires, administrator interviews, teacher focus groups and interviews, edtech use data from the LearnPlatform Chrome extension, and associations between implementation variables and edtech use.
• Type of Data: Qualitative & Quantitative.
• Which outcome? Variables Educators Perceive to Impact EdTech Implementation and Variables Associated with EdTech Implementation.

Systematic Literature Review
• Sample: Through a systematic literature review, the Exchange preliminarily identified 43 articles that include quantitative associations between at least one variable, often more, and a continuous measure of edtech implementation.
• Type of Data: Quantitative.
• Which outcome? Variables Associated with EdTech Implementation.


Data Collection and Analysis Timeline

This timeline illustrates the sequential nature of data collections and the iterative nature of analyses to develop cohesive findings from multiple data sources.

[Timeline figure, July 2018 through September 2019, showing overlapping data collection and analysis activities.]

Data collection: EdTech Implementation Summits; Educator Convening on Research Use; LEA Research Context Survey; LEA Research Administrator Interviews; LEA Research Teacher Focus Groups; LEA Research EdTech Implementation Questionnaires & Teacher Interviews; EdTech Use Data & Student Achievement; Systematic Literature Review.

Analysis: Emergent coding on Summits variables; emergent coding on LEA Context Survey variables; reconciling of 2 emergent coding schemes for 60 variables/sub-variables; coding Teacher & Administrator Interviews; coding Educator Convening on Research Use & EdTech Implementation Questionnaires; analyzing associations between variables & edtech implementation.


Event-based Teacher Surveys and Focus Groups Methods

EdTech Implementation Summits

Building on previous work at the 2017 EdTech Efficacy Research Academic Symposium calling for crowdsourced edtech implementation research (Epstein et al., 2017), the Exchange hosted three EdTech Implementation Research Summits in summer 2018 to ask educators about specific variables that they believe impact edtech implementation success or failure. Working with PreK12 and higher education (HE) regional partners, the Exchange invited educators and administrators from both PreK12 and HE regional institutions to each summit (Barton et al., 2019).

We divided attending educators into focus groups based on sector (PreK12 versus HE) as well as position (teachers versus administrators). Within HE, we did not distinguish between teachers and administrators, given their overlapping roles and the smaller number of HE attendees overall. We formed groups of four to six educators in each of these sections. Each group discussed and collectively recorded their responses to the prompt: Beyond the essential conditions (defined as structural conditions, e.g., WiFi), what else might impact why edtech is successful? This question and discussion intentionally focused on human-centered variables (e.g., teacher beliefs), as opposed to structural, concrete variables (e.g., WiFi, devices).

We developed an emergent coding scheme2 (Crabtree & Miller, 1999) for the individual, contextual, and technology-specific factors identified by educators as variables that influence why edtech implementation is or is not successful. (This was the first of two emergent coding schemes.) These codes illustrate the variables educators identified, such as the quality of professional learning/support that teachers receive or the extent to which the particular innovation aligns with curriculum/content priorities in a given setting. Three experienced coders came to consensus on the application of each code.
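A minimal sketch of how consensus application of emergent codes can be tallied; the coder labels and code names here are hypothetical, and the actual reconciliation was a human discussion process rather than a script.

    from collections import Counter

    # Hypothetical emergent codes applied by three coders to one excerpt.
    coder_labels = {
        "coder_a": {"professional_learning_quality", "curriculum_alignment"},
        "coder_b": {"professional_learning_quality", "teacher_beliefs"},
        "coder_c": {"professional_learning_quality", "curriculum_alignment"},
    }

    # A code reaches consensus when all coders applied it; partially
    # applied codes are flagged for discussion until the coders agree.
    counts = Counter(code for labels in coder_labels.values() for code in labels)
    consensus = {code for code, n in counts.items() if n == len(coder_labels)}
    to_reconcile = {code for code, n in counts.items() if n < len(coder_labels)}

    print("consensus:", consensus)        # {'professional_learning_quality'}
    print("to reconcile:", to_reconcile)  # {'curriculum_alignment', 'teacher_beliefs'}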

Educator Convenings on Research Use

In fall 2018, the EdTech Evidence Exchange partnered with the Institute of Education Sciences (IES) to host a series of convenings, titled "Elevating Educator Voice in the National Education Research Agenda." We aimed to (1) capture educators' current perspectives on education research, (2) identify the categories and types of research educators believe would help them drive more meaningful increases in student learning, and (3) gauge educators' interest in further engaging with research efforts. (See this report for descriptive findings related to educators' research use.) We hosted two K12 educator convenings, in Omaha, Nebraska and Raleigh, North Carolina, followed by a third convening of educator association leaders in Washington, D.C. During the recruiting process, the EdTech Evidence Exchange invited educators to share their voices to "better understand educators' needs" but did not specify research as a topic of discussion in that invitation, to reduce the likelihood that attendees would be educators who specifically appreciate and use research. As such, our pre-event survey also asked educators about the variables they believe impact edtech implementation, in terms of a technology being used at all and a technology being successful for improving student outcomes. Two coders applied the reconciled coding scheme to these data, coming to consensus. (See the LEA Research Methods section below for the process of developing and reconciling a second emergent coding scheme.)

2 Emergent coding is when coders allow themes to organically grow (emerge) from the data, while a priori coding is when coders bring a predetermined coding scheme to the data.

LEA Research Methods

During the 2018-2019 school year, the EdTech Evidence Exchange partnered with a public school district (PSD; educator n = 138) and a charter management organization (CMO; educator n = 368) to gather data about edtech implementations. (See this demographic summary of participating schools.)

LEA Research Plan by Phase

• In Phase 1, we asked all teachers and school-based administrators to respond to the context survey (506 teachers/administrators, 81% response rate). This survey collected quantitative data about variables hypothesized to influence edtech implementation, such as student access to technology, professional development availability, teachers' technology self-efficacy/beliefs, and teachers' openness to change. Educators also qualitatively described variables they believed impacted technology implementation.

• To prepare for Phases 2 and 3, we used survey responses to randomly select teachers with positive and negative beliefs about technology and various levels of teaching experience (see the sketch after this list).

• In Phase 2, we selected 202 teachers (65 PSD, 137 CMO) to participate in focus groups to discuss edtech implementation and the context variables influencing implementation. Administrators at each school participated in an interview to discuss how their context influences edtech selection and implementation.

• In Phase 3, we selected 49 teachers (18 PSD, 31 CMO) to respond to 6 bi-weekly edtech implementation questionnaires and participate in two interviews.

• Throughout the year, we collected edtech use data (via the LearnPlatform Google Chrome Extension) and student achievement data. (Note: We only reported on the edtech use data for the public school district.)
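As a rough illustration of that selection step, here is a minimal pandas sketch of stratified random sampling; the column names, belief/experience bands, and per-cell sample size are hypothetical stand-ins, not the project's actual survey fields or procedure.

    import pandas as pd

    # Hypothetical Phase 1 survey results: one row per teacher.
    survey = pd.DataFrame({
        "teacher_id": range(1, 507),
        "tech_beliefs": ["positive", "negative"] * 253,
        "experience_band": (["0-3 yrs", "4-10 yrs", "11+ yrs"] * 169)[:506],
    })

    # Sample within each beliefs-by-experience cell so Phase 2 focus groups
    # include teachers with varied beliefs and experience levels.
    phase2 = (
        survey.groupby(["tech_beliefs", "experience_band"])
              .sample(n=30, random_state=42)
    )
    print(len(phase2), "teachers selected across belief/experience strata")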


LEA Analyses

To consider the variables educators identify as influencing edtech implementation success or failure, a group of coders developed a second emergent coding scheme for the variables identified by educators on the LEA research context survey. In spring 2019, the coding teams reconciled this coding scheme with the scheme developed from the EdTech Implementation Summits to produce the codebook of variables educators identify as impacting edtech implementation. We then coded data from the edtech implementation questionnaires using this reconciled coding scheme. Additionally, we coded teacher focus groups and interviews with a combination of a priori and emergent coding, searching for confirming and disconfirming evidence of the variables.

To investigate the quantitative associations between implementation variables and technology use, we used multiple linear regression to predict technology use from measured individual and context variables from the context survey. We used LearnPlatform’s Chrome extension edtech usage data to quantify edtech use, calculating educators’ total number of technologies used across five months and educators’ mean number of days used across technologies.
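A minimal sketch of this kind of analysis, predicting a quantified edtech-use outcome from measured context variables with multiple linear regression. The variable names and simulated data are hypothetical stand-ins for the survey measures, and statsmodels is used here for illustration, not necessarily the package the team used.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 506

    # Hypothetical teacher-level data standing in for the context survey
    # measures and the LearnPlatform usage outcome (mean days of use).
    df = pd.DataFrame({
        "self_efficacy": rng.normal(size=n),
        "pd_availability": rng.normal(size=n),
        "openness_to_change": rng.normal(size=n),
        "student_device_access": rng.normal(size=n),
    })
    df["mean_days_of_use"] = (
        10
        + 2.0 * df["self_efficacy"]
        + 1.2 * df["pd_availability"]
        + rng.normal(scale=3, size=n)
    )

    # Which measured individual and context variables are associated with
    # the edtech-use outcome?
    model = smf.ols(
        "mean_days_of_use ~ self_efficacy + pd_availability"
        " + openness_to_change + student_device_access",
        data=df,
    ).fit()
    print(model.summary())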

[Figure: LEA research design by phase. Phase 1: context survey, 506 teachers. Phase 2: focus groups, 202 teachers; interviews, 11 administrators. Phase 3: interviews and documentation, 49 teachers. EdTech use data collected throughout.]


Literature Review Methods

Between January and October 2019, the EdTech Evidence Exchange and UVA research team conducted a systematic literature review to identify individual and context variables previously associated with edtech implementation in the literature. We took the following steps to prepare a draft review for the Steering Committee in October 2019. The research team continued to build upon this draft review and is currently finalizing a manuscript of the literature review for publication.

Step 1: Database Search and Title Review, resulting in 1,380 Articles

We searched 5 databases (ERIC, Academic Search Complete, Education Research Complete, PsycINFO, Google Scholar) for articles using keywords, including "education technology," "implementation," "variables," "preK-12," "united states," and NOT "preservice teachers." For each keyword, we included a series of synonyms. Given our current focus on the United States context for edtech implementation, we limited the search to articles describing research either fully or partially conducted in the U.S. We also limited our search to research conducted with practicing teachers. Finally, given the rapidly changing nature of technology, we only included articles published in 2000 or later. Additionally, we conducted a hand search of the top 6 journals appearing in our database search: Journal of Research on Technology in Education, Educational Technology Research & Development, Computers in the Schools, Tech Trends: Linking Research & Practice to Improve Learning, Computers & Education, and Journal of Educational Technology & Society.
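A minimal sketch of how a keyword-and-synonym boolean query like the one described might be assembled; the synonym lists below are illustrative stand-ins, not the review's actual search strings.

    # Illustrative synonym sets for two of the review's keyword groups.
    keyword_groups = {
        "education technology": ["education technology", "edtech",
                                 "instructional technology"],
        "implementation": ["implementation", "integration", "adoption"],
    }
    exclusions = ["preservice teachers"]

    # Each group becomes an OR clause; groups are ANDed; exclusions are NOTed.
    or_clauses = [
        "(" + " OR ".join(f'"{term}"' for term in terms) + ")"
        for terms in keyword_groups.values()
    ]
    query = " AND ".join(or_clauses) + "".join(
        f' NOT "{term}"' for term in exclusions
    )
    print(query)
    # ("education technology" OR "edtech" OR "instructional technology")
    #   AND ("implementation" OR "integration" OR "adoption")
    #   NOT "preservice teachers"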

In a spreadsheet, we tracked articles whose titles indicated any discussion of edtech implementation, recording associated information (title, authors, abstract, journal, search database). For example, we would include "Explaining technology integration in K-12 classrooms: A multilevel path analysis model," and we would exclude "What do we value most in schools? A study of preference rankings of school attributes."

Step 2: Abstract Review, resulting in 672 Articles

Using the title review spreadsheet, we read each abstract, looking for an indication that the article would provide evidence of an individual or context variable associated with edtech implementation. We tracked each article's methods as quantitative, qualitative, mixed methods, theoretical, review, practice description, or unclear. One coder did a first pass on abstracts, marking articles as included, excluded, or needing a second opinion. To ensure no article was mistakenly eliminated, the research team lead did a second review of all article abstracts that were either excluded or flagged as needing a second opinion.
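A minimal sketch of the two-pass screening bookkeeping described above; the article IDs and statuses are hypothetical.

    # Hypothetical first-pass decisions keyed by article ID.
    first_pass = {
        "article_001": "included",
        "article_002": "excluded",
        "article_003": "second_opinion",
    }

    def needs_lead_review(status):
        # Every excluded or flagged abstract gets a second look from the
        # research team lead so no article is mistakenly eliminated.
        return status in {"excluded", "second_opinion"}

    for article_id, status in first_pass.items():
        action = ("second review by team lead" if needs_lead_review(status)
                  else "advance to article review")
        print(article_id, "->", action)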


Step 3: Article Review, resulting in 48 Articles

Working from the reduced list of articles, the research team began an article review of all quantitative and mixed methods articles. We set a priori inclusion criteria:

• Study population included PreK12 grade students and/or educators

• Setting was a brick & mortar (as opposed to a virtual or fully online) PreK12 school in the USA

• Published in a refereed journal (or a completed dissertation) between 2000 and 2019

• Measured a quantitative association with:

• Independent variables as context or individual characteristics (variables)

• Dependent or moderating variables as continuous (e.g., number of hours teachers used technology in a year) or ordinal (e.g., high vs. low implementers) measures of edtech use representing implementation variation

• If moderating variables are measures of edtech use, dependent variables may be measures of a student outcomes

Once we identified the included quantitative articles and the quantitative sections of mixed methods articles, we read through them to identify any and all associations between individual and context variables and edtech implementation. We also shared findings from 3 descriptive articles with the Steering Committee to show known evidence of variables with little to no available inferential information.
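As an illustration only, the a priori screen can be expressed as a predicate over coded study attributes; the field names below are hypothetical stand-ins for the coding spreadsheet:

```python
# A minimal sketch of the a priori inclusion screen as a predicate.
def meets_inclusion_criteria(study: dict) -> bool:
    return (
        study["population_prek12"]                      # PreK-12 students and/or educators
        and study["setting"] == "brick_and_mortar_usa"  # not virtual/fully online; U.S. only
        and study["refereed_or_dissertation"]
        and 2000 <= study["year"] <= 2019
        and study["quantitative_association"]           # context/individual IVs vs. edtech-use DVs
    )

example = {
    "population_prek12": True,
    "setting": "brick_and_mortar_usa",
    "refereed_or_dissertation": True,
    "year": 2011,
    "quantitative_association": True,
}
assert meets_inclusion_criteria(example)
```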


FALL 2019

Convening Experts

What: Narrow the full list of contender variables to select the 13 individual and context variables most likely to impact edtech implementation.

Who: EdTech Genome Advisory Board & Steering Committee



Introduction

In fall 2019, the EdTech Genome Advisory Board and Steering Committee reviewed the data collected in the previous year on 23 "contender variables," selected for the strength of their supporting data. These contenders were narrowed from a list of 60 individual or context variables and sub-variables identified in previous coding schemes (see Table 3 below); technology-specific variables are not included. To qualify as a contender, each variable needed to either (1) have evidence of an association with education technology implementation in previous literature or in original research conducted by the UVA research team or (2) be identified by at least 2% of educators as being associated with implementation. The Advisory Board and Steering Committee each expressed initial opinions on the variables, and the Steering Committee met for two days to review all evidence, debate, and select 13 variables to move forward for public comment.

Table 3. Individual and Context Variables Shared with the Advisory Board and Steering Committee (the 23 contenders are marked YES)


Variable                                                   Top 23
1  Adoption Plan                                           YES
2  Research Use in Adoption
3  Student/Family Agency in Technology Selection
4  Family Buy-in/Beliefs about Technology
5  Home Access to Devices, Products, & Reliable Internet   YES
6  School-Home Connection/Communication
7  Technology Resources                                    YES
8  Internet Specified
9  Devices Specified
10 Product Specified
11 Classroom Structure
12 Digital Safety Processes/Protocols
13 Implementation Plan                                     YES
14 Usage Goals/Measurement of Implementation               YES
15 Time Allocated for Technology Implementation
16 Professional Acknowledgement for Technology Use
17 Leaders' Selection of Technology
18 Administrative Support                                  YES
19 Technological Leadership Content Knowledge (TLACK)      YES
20 Modeling
21 Leadership Knowledge
22 School Vision for Technology                            YES
23 Competing Priorities/Initiatives                        YES
24 Technology Consistency                                  YES
25 Staff Retention/Tenure
26 Trust
27 Willingness to Take Risks
28 Support for Risk-Taking
29 Collaborative Environment
30 Communication Processes
31 Technology Champions
32 Professional Learning/Development Support               YES
33 Contextualized
34 Differentiated
35 Skill-focused
36 Sustained/On-demand
37 Instructional Technology Support
38 Professional Learning Tree                              YES
39 Planning and Instructional Preparation Support
40 Operational Technology Support                          YES
41 Scheduling & Time                                       YES
42 Between School Interoperability
43 Within School Interoperability
44 General Student Abilities
45 Student Behavior
46 Student Technology Abilities                            YES
47 Financial Resources                                     YES
48 Neighborhood Demographics
49 School Demographics
50 Teacher Demographics
51 Contextual Awareness
52 Teacher Agency/Autonomy                                 YES
53 Teacher Beliefs about Technology/Self-Efficacy          YES
54 Teacher Technology Readiness                            YES
55 Teacher Openness to Change                              YES
56 Educator Motivation
57 Technological Pedagogical Content Knowledge (TPACK)     YES
58 Pedagogical Knowledge
59 Teacher Pedagogical Beliefs                             YES
60 Time Commitment to Teaching                             YES

Initial Review

Each Advisory Board and Steering Committee member sorted the 23 contender variables into 5 categories, ranging from 5 = "definitely should be included" to 1 = "definitely should not be included," placing no more than 5 variables in each category. Each variable thus received an average score between 1 and 5 across members, and this information was the final data source for the Steering Committee's narrowing to 13 variables.
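A toy sketch of this rating exercise, with invented ratings: it checks the no-more-than-5-per-category constraint and averages each variable's scores across members.

```python
from statistics import mean

# Invented ratings for illustration: each member assigns every contender
# variable a category score from 5 ("definitely include") to 1
# ("definitely exclude"), with at most 5 variables per category.
member_ratings = [
    {"Adoption Plan": 5, "Administrative Support": 4, "Trust": 2},
    {"Adoption Plan": 4, "Administrative Support": 5, "Trust": 1},
]

def respects_category_cap(ratings, cap=5):
    """Check that a member placed no more than `cap` variables in any one category."""
    counts = {}
    for score in ratings.values():
        counts[score] = counts.get(score, 0) + 1
    return all(n <= cap for n in counts.values())

assert all(respects_category_cap(r) for r in member_ratings)

# Average each variable's score across members to rank the contenders.
averages = {v: mean(r[v] for r in member_ratings) for v in member_ratings[0]}
print(sorted(averages.items(), key=lambda kv: kv[1], reverse=True))
```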

Steering Committee Meeting and Variable Selection

The Steering Committee met for two days in person to select the top 13 variables to move forward for public comment. They reviewed the provided evidence and contributed their professional expertise through a series of exercises, small group discussions, and whole group debates. Additionally, throughout the two days, individuals added to a running record of "wildcard" variables: variables not previously identified on the list but arising in conversation and from professional expertise. By the end of the meeting, the Steering Committee members came to consensus on 13 variables, combining several variables in some cases (e.g., teacher beliefs & knowledge). These included:

• Adoption Plan

• Alignment of Technology to Instructional Purpose

• Coaching

• Competing Priorities

• Foundational Resources (Technology Resources, Operational Tech Support, Financial Resources)

• Implementation Plan

• Professional Learning (Development)/Support

• School (Staff) Culture

• Support from School/District Administration

• Teacher Agency/Autonomy

• Teacher Beliefs about Tech/Self-Efficacy & Technological Pedagogical Content Knowledge

• Time & Scheduling

• Vision for Teaching & Learning with Technology

Public Comment

Following the Steering Committee's selection of 13 variables, committee members and the EdTech Evidence Exchange distributed a public comment survey to gather feedback from across the field.

[Chart: Public comment survey feedback, organized by respondent position. Prompt: "Rate the extent to which you agree that each of the following variables impact the success or failure of edtech implementation." Measure scale: 1.00-5.00; chart axis shown: 3.00-4.80. Total N = 111.]


WINTER 2020

Gaining Variable Consensus

What: Confirm and build buy-in for the 10 selected individual and context variables.

Who: Education Sector, EdTech Genome Advisory Board, & Steering Committee


Introduction

Based on the public comment survey feedback, the EdTech Evidence Exchange interviewed each Advisory Board member to collect their feedback on which 10 variables should move forward and prepared a suggested list on which the Steering Committee voted. Of the 27 members who voted (90% of the committee), 22 (81%) fully supported the final 10 variables as a set, and 5 (19%) felt they could live with the set and did not suggest an alternative. Rather than simply removing 3 of the 13 variables, the final set of 10 blended them:

• Adoption Plan (incorporate Alignment of Technology to Instructional Purpose)

• Competing Priorities

• Foundational Resources (Technology Resources, Operational Tech Support, Financial Resources)

• Implementation Plan

• Professional Learning (Development)/Support (incorporate Coaching)

• School (Staff) Culture

• Support from School/District Administration

• Teacher Agency/Autonomy

• Teacher Beliefs about Tech/Self-Efficacy & Technological Pedagogical Content Knowledge

• Vision for Teaching & Learning with Technology

• Time & Scheduling: embedded across multiple variables rather than kept as a standalone variable


SPRING 2020

Diving Deep on Variables

What: Define each of the 10 individual and context variables based on literature and professional expertise.

Who: EdTech Genome Working Groups


Working Group Recruitment

During January and February 2020, we recruited 10 expert working groups through a public application process and by soliciting and vetting referrals from members of the EdTech Genome Steering Committee, Advisory Board, and Industry Council. Each potential working group member submitted an application, and each application was reviewed by at least 2 members of the Exchange and UVA team. Applicants identified specific variables of interest, shared artifacts that reflected their work, and described:

• Their expertise as it related to the variables they selected

• Why they were interested in this project

• Their experiences on working groups, task forces, or team-based projects

Ultimately, the Exchange selected 68 applicants across the 10 variables. Each working group ranged from 5 to 9 members; represented a range of education professionals (i.e., educator practitioners, researchers, association representatives, edtech industry representatives); and had a team leader (all but 2 of whom were also Steering Committee members).

Working Group Tasks and Deliverables

Each working group received:

• A draft name and definition for their assigned variable and

• Materials related to that variable including:

• EdTech Evidence Exchange & UVA research

• Related literature & existing instruments to measure related variables

• Notes from the related Steering Committee conversations at the two-day, in-person meeting

• Suggestions from feedback loops

The working group deliverables integrate both provided evidence and the professional expertise of working group members. Each working group:

• Iterated (if needed) to improve the clarity and brevity of the provided variable name

• Wrote long and short variable definitions, including citations, describing specific indicators of the variable (these become the EdTech Context Framework)

• Drafted an instrument to measure the variable indicators as identified in the definition (this becomes the EdTech Context Inventory)


Names and Definitions

First, each working group revised their variable's provided name and short definition, as needed. Table 4 below shows the original and final names. Each name is intended to be:

• Clear — Educator and researcher users should understand the variable without reading the definition. To the extent possible, names use terms with one common definition to prevent confusion about the variable. The name and definition should reflect a unitary construct.

• Concise — Names use no more than four primary words.

• Exclusive — Names clearly delineate each variable from the 9 other variables. Users should not be confused about similar names/variables when reading the full list of 10 variables.

The Exchange and UVA research team conducted a feedback loop with 516 California educators, recruited through their membership in an educator association. The purpose of this feedback loop was to ensure the variable names and short definitions made sense to an educator practitioner audience. For each variable, educators identified the extent to which (1) the definition matched how they would have originally identified the term; (2) they felt comfortable using the term to discuss the use of edtech in their classroom, school, or district; and (3) they understood the definition such that they would feel comfortable explaining it to a colleague. See Table 5 below for a summary of educator feedback. Educators also provided free-text feedback, which the working groups used for revisions. Each working group iteratively revised their definitions with Exchange and UVA team input.
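For illustration, summary statistics of the kind reported in Table 5 can be produced from the raw feedback as sketched below; the responses shown are invented, not the actual data.

```python
import pandas as pd

# Invented 1-5 ratings of the three feedback statements for two variables.
responses = pd.DataFrame({
    "variable": ["Staff Culture", "Staff Culture", "Teacher Agency", "Teacher Agency"],
    "matches_definition": [5, 4, 4, 5],
    "comfortable_using": [5, 4, 4, 4],
    "could_explain": [4, 4, 5, 4],
})

# Mean and standard deviation per variable, as in the Mean (SD) cells of Table 5.
summary = responses.groupby("variable").agg(["mean", "std"]).round(2)
print(summary)
```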

Table 4. Original and Final Variable Names

Original Name → Final Name

Adoption Plan → Selection Processes
Competing Priorities → Competing Priorities (no change)
Foundational Resources → Infrastructure & Operations
Implementation Plan → Implementation Systems & Processes
Professional Learning (Development)/Support → Professional Learning
School (Staff) Culture → Staff Culture
Support from School/District Administration → Strategic Leadership Support
Teacher Agency/Autonomy → Teacher Agency
Teacher Beliefs about Tech/Self-Efficacy & Technological Pedagogical Content Knowledge → Teacher Beliefs & Knowledge
Vision for Teaching & Learning with Technology → Vision for Teaching & Learning


Table 5. Educator Feedback on Variable Names and Short Definitions

Educators rated three statements about each variable's name and short definition:
(1) "This definition matches how I would have originally defined the term myself."
(2) "I feel comfortable using this term to discuss the use of edtech in my classroom, school, or district."
(3) "I understand this definition such that I am comfortable explaining it to a colleague."

Variable                              (1)          (2)          (3)
Average of All Variables              4.15 (0.53)  4.17 (0.57)  4.17 (0.58)
Selection Processes                   4.08 (0.86)  4.09 (0.89)  4.11 (0.87)
Competing Priorities                  4.17 (0.77)  4.17 (0.79)  4.23 (0.77)
Infrastructure & Operations           4.00 (0.88)  4.06 (0.85)  4.05 (0.85)
Implementation Systems & Processes    4.00 (0.76)  3.97 (0.84)  3.94 (0.86)
Professional Learning                 4.40 (0.80)  4.45 (0.77)  4.45 (0.74)
Staff Culture                         4.48 (0.70)  4.48 (0.68)  4.45 (0.69)
Strategic Leadership Support          4.07 (0.81)  4.05 (0.85)  4.06 (0.85)
Vision for Teaching & Learning        4.12 (0.86)  4.12 (0.85)  4.09 (0.85)
Teacher Agency                        4.08 (0.94)  4.10 (0.91)  4.11 (0.90)
Teacher Beliefs & Knowledge           4.13 (0.85)  4.19 (0.78)  4.19 (0.78)

Note. Each response is on a scale of 1 (Strongly Disagree) to 5 (Strongly Agree). Each cell reports Mean (Standard Deviation). Total N = 516.

After receiving educator feedback, each working group developed a long definition (approximately 1-1.5 pages). These definitions elaborate on the key elements in the short definitions, specifically aiming to set up the components of the measurement instrument. Again, the working groups iteratively revised with Exchange and UVA team input. Each pair of short and long definitions is intended to be:

• Distinct — Both definitions clearly define this variable as unique from the other variables that describe an edtech implementation environment.

• Approachable — Both definitions avoid using any specialized terms, jargon, etc. that may be unfamiliar to some users.

• Exhaustive — The longer definition clearly demarcates what the variable is and is not.

Once the definitions were in a final draft state, the full Genome Project network offered line-by-line feedback and edits, including a close review by the Industry Council. Working groups made an additional round of revisions based on this feedback.



SUMMER 2020

Creating the Context Inventory

What: Develop instruments to measure each of the 10 individual and context variables in alignment with their definitions.

Who: EdTech Genome Working Groups, Measurement Council, & UVA research team


Instrument Drafts

Each working group produced a draft measurement instrument for their variable, aligned with the long definition. Working groups received specific guidance on the format of the draft instruments, using 5-point response scales and aligning each question with a single variable indicator. Additionally, each working group developed a teacher form and a leader form for their instrument.

Measurement Council

Once the working groups drafted their instruments, the 6-member EdTech Genome Project Measurement Council, which included four Steering Committee members, worked with the UVA research team to review the full set of instruments and offer item-level revisions. The Measurement Council offered suggestions in areas such as alignment with the definitions, consistency and redundancy across instruments, item clarity, and item variance. They also aligned the teacher and leader items such that they could be aggregated into one score for a school or district.
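A minimal sketch of that aggregation, with hypothetical item and column names: because the teacher and leader forms ask aligned questions on the same 5-point scale, responses can be pooled into a single school-level mean.

```python
import pandas as pd

# Invented aligned responses from teacher and leader forms.
items = pd.DataFrame({
    "school": ["A", "A", "A", "B", "B"],
    "role": ["teacher", "teacher", "leader", "teacher", "leader"],
    "item_1": [4, 5, 4, 3, 2],
    "item_2": [3, 4, 5, 3, 3],
})

# Pool teacher and leader responses into one score per school.
school_scores = items.drop(columns="role").groupby("school").mean()
print(school_scores)
```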

Equity Review

The Education Trust completed an external review of the instruments, specifically focused on diversity, equity, and inclusion. This review provided suggestions for specific items to better capture diverse instructional environments and indicators of equity in context. These suggestions informed item revisions.


FALL 2020 - ONGOING

Validating the Context Inventory

What: Iteratively pilot, revise, and validate the full EdTech Context Inventory with diverse educator samples.

Who: EdTech Evidence Exchange & UVA research team


Pilot #1 and Initial Analyses

In October 2020, the Exchange and UVA research team piloted the draft instruments with 507 educators in Washington State, recruited via their teachers' union or educator association membership. Each educator completed 2 of the 10 instruments and provided item-level feedback, so approximately 80-100 participants completed each instrument.

First, to create an analytic dataset, we stacked all items across roles (teacher, school leader, and district leader) and cleaned the dataset (e.g., checking coding for consistency, re-coding as needed, reverse coding as needed, re-coding "not sure" as missing, creating planned aggregate variables). We based initial aggregate variables on theorized constructs/subscales in the original instruments.
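A minimal sketch of those cleaning steps; the item names, the "not sure" code, and which item is reverse-scored are hypothetical:

```python
import numpy as np
import pandas as pd

# Invented responses standing in for the stacked, multi-role dataset.
raw = pd.DataFrame({
    "role": ["teacher", "school leader", "district leader"],
    "q1": [5, 4, 3],
    "q2_reversed": [1, 2, 5],     # negatively worded item
    "q3": [4, 99, 2],             # 99 = "not sure"
})

df = raw.copy()
df["q3"] = df["q3"].replace(99, np.nan)        # re-code "not sure" as missing
df["q2_reversed"] = 6 - df["q2_reversed"]      # reverse-code on a 1-5 scale
df["subscale_mean"] = df[["q1", "q2_reversed", "q3"]].mean(axis=1)  # planned aggregate
print(df)
```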

Second, we conducted preliminary descriptive analyses. This included:

• Frequencies and descriptive statistics (mean, SD, min, max) on all items

• Flagged items with limited variability (i.e., those with only a handful of 1’s and 2’s on the 5-point scale or those on which nobody responded with a 1)

• Exploratory correlations among all items from a given instrument

• Exploratory correlations and reliability alphas on the items making up each theorized subscale

• Flagged items with noticeably lower correlations with the other items on the subscale, and/or with all other items from the instrument

• Flagged items for which reliability alpha output showed noticeably lower correlations with the “full” subscale; noticeably lower average correlation with other items on the subscale; and/or alpha would substantially improve without that item

• Flagged subscales with alphas below the recommended cutoff
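The alpha checks in the last three bullets rest on Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch with invented responses:

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented responses for a three-item theorized subscale.
subscale = pd.DataFrame({
    "item_a": [4, 5, 3, 4, 2],
    "item_b": [4, 4, 3, 5, 2],
    "item_c": [5, 4, 2, 4, 3],
})
alpha = cronbach_alpha(subscale)
print(f"alpha = {alpha:.2f}")  # flag the subscale if alpha falls below the chosen cutoff (e.g., .70)
```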

Third, we conducted exploratory factor analysis (EFA) for each instrument. We ran the initial EFA, without specifying the number of factors to extract, on all items making up that instrument (excluding items answered only by teachers or only by administrators, as well as items intended to stand apart from the subscales, such as binary yes/no items). We flagged any items below the recommended threshold (.5) on the Kaiser-Meyer-Olkin (KMO) test, which indicates a low proportion of shared variance of the item with the rest of the scale and suggests the item may be inappropriate for factor analysis.

Fourth, we re-ran each EFA, extracting the anticipated number of factors. We flagged:

• Factors with only a single item loading on them

• Items that did not load above .4 on any factors

• Items that cross-loaded on 2 factors


As appropriate, we re-ran each EFA, excluding one by one the items that the initial EFA suggested were inappropriate for factor analysis (i.e., those with KMO below .5). We continued re-running the EFAs as needed until the KMO statistics for all remaining items fell above .5.
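A sketch of this screening loop and the loading flags, using the factor_analyzer package; `items` (rows = educators, columns = instrument items) is assumed to exist, and the oblimin rotation is an illustrative choice, not one named in the report.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

def drop_low_kmo_items(items: pd.DataFrame, threshold: float = 0.5) -> pd.DataFrame:
    """Iteratively drop the worst item until every per-item KMO >= threshold."""
    while True:
        kmo_per_item, _ = calculate_kmo(items)
        kmo = pd.Series(kmo_per_item, index=items.columns)
        if kmo.min() >= threshold:
            return items
        items = items.drop(columns=kmo.idxmin())  # one item at a time, then re-check

def flag_loadings(items: pd.DataFrame, n_factors: int):
    """Re-run EFA with the anticipated factor count; flag weak and cross-loadings."""
    fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns).abs()
    no_load = loadings[(loadings < 0.4).all(axis=1)]            # loads on no factor
    cross_load = loadings[(loadings >= 0.4).sum(axis=1) >= 2]   # loads on 2+ factors
    return no_load, cross_load
```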

Instrument Revisions

Based on these analyses, we conducted another round of systematic instrument revisions, specifically focusing on reducing the number of items in each instrument and increasing the variance in educators' responses to each item. We also improved the clarity of each question that at least 10% of educators identified as challenging to understand. The Measurement Council reviewed suggested changes from the UVA research team.

Pilot #2 and Planned Analyses

In January 2021, the Exchange and UVA research team piloted the draft instruments with 234 educators in Nebraska, recruited via their teachers' union membership. Each educator completed the full set of 10 instruments. We will use these data, combined with additional responses to be collected in spring 2021, (1) to re-run exploratory and confirmatory factor analyses to determine the underlying factor structure and (2) to identify any remaining item redundancy.

Influence of COVID-19 on Instrument Validation

We planned and began the EdTech Genome Project prior to the COVID-19 pandemic, and the intent of the project was to define and measure non-pandemic edtech implementation environments. During the 2020-2021 school year, we could not fully validate the instruments: educators would have needed either to reflect on a pre-pandemic world that is inherently different from a post-pandemic one or to imagine a hypothetical post-pandemic world. Instead, we collected pilot data and focused on item clarity, variance, and correlations. During the October 2020 pilot, educators reflected on the extent to which their responses were influenced by COVID-19. Responses varied among instruments, but on average, educators indicated that their responses were somewhat impacted by COVID-19. To accommodate this, we chose to extend the validation process significantly into the next several years as schools settle into a post-COVID-19 context.



THE EDTECH GENOME PROJECT

EdTech Context Framework: Definitions for Context Variables

Vision for Teaching & Learning
Selection Processes
Teacher Agency
Infrastructure & Operations
Implementation Systems & Processes
Staff Culture
Teacher Beliefs & Knowledge
Strategic Leadership Support
Professional Learning
Competing Priorities


01 Vision for Teaching & Learning

Short Definition

The vision for teaching and learning unifies stakeholders with clear direction, purpose, and rationale for technology-supported learning. A high-quality vision is forward-thinking and actionable, and, to have effect, must be consistently communicated and referenced as a guide for action. Visioning helps schools and districts recognize opportunities for technology to address problems of practice, prioritize equity, and plan for technology integration that promotes student learning opportunities. Visions describe the ideal state of teaching and learning for all students in a world in which digital technologies transform daily life.


Long Definition

The vision for teaching and learning unifies stakeholders with clear direction, purpose, and rationale for technology-supported learning. A high-quality vision is forward-thinking and actionable, and, to have effect, must be consistently communicated and referenced as a guide for action. Visioning helps schools and districts recognize opportunities for technology to address problems of practice, prioritize equity, and plan for technology integration that promotes student learning opportunities. Visions describe the ideal state of teaching and learning for all students in a world in which digital technologies transform daily life.

A high-quality vision:

• Provides clear direction, purpose, and rationale
  • Establishes what high-quality technology-supported learning should look like and how that aligns with existing or new pedagogical practices
  • Identifies problems of practice in teaching and learning that can be solved with existing or emerging technology
  • Influences strategic decision-making across all aspects of teaching, learning, and leadership

• Unifies stakeholders
  • Is developed with input from all stakeholder groups to build buy-in (school leaders, teachers, students, parents, and community members)
  • Considers existing school climate, culture, and attitudes towards learning
  • Shapes school culture around technology (See Staff Culture)
  • Creates a shared understanding about how all students will learn and demonstrate their learning with technology

• Prioritizes equity
  • Advocates for equitable access to learning opportunities so that the ability to learn with technology is not predicated on race, ethnicity, socioeconomic status, gender, ability, or other characteristics of individuals that might influence the equity of learning opportunities
  • Supports accommodations for those with learning differences or differences in physical ability

• Is rooted in research and technology-integration frameworks

• Is forward-thinking and actionable

A forward-thinking vision:

• Accommodates the impact of technology on student learning
  • Recognizes that the accessibility of personal computers, the Internet, digital media, and other technologies has transformed how students engage in learning
  • Considers how technology creates new core competencies for citizenship, including digital media literacy, computing, and data science
  • Promotes the mindset of perseverance and lifelong learning
  • Anticipates the integration of technology in all subject areas

• Positions students at the center of technology-supported learning
  • Fosters student ownership, agency, voice, and choice
  • Adapts to students based on their learning needs and readiness
  • Is relevant and authentic for students
  • Allows students to demonstrate what they know and are able to do through a variety of assessment types (formative and summative)

• Encourages innovative approaches to teaching and learning across the school (e.g., competency-based learning, learning beyond face-to-face)

An actionable vision:

• Adapts to a wide range of contexts, environments, and needs

• Initiates a clear plan for achieving the ideal future state of technology-supported learning

• Is communicated clearly to internal (e.g., teachers, other staff) and external (e.g., families, governing boards) stakeholders

• Influences the selection of, implementation of, and ongoing support for technologies (See Selection Processes and Implementation Systems & Processes)

• Requires ongoing assessment and evaluation to direct next steps

• Signals a school's commitment of resources to increase capacity
  • Provides guidance to leaders in allocating available financial, technological, physical, and human resources (See Infrastructure & Operations), including professional learning (See Professional Learning)
  • Provides guidance to leaders in prioritization (See Competing Priorities)

A vision for teaching and learning may or may not be formally documented.

The short and long definitions are informed by professional experiences and the following readings:

Dexter, S., & Richardson, J. W. (2019). What does technology integration research tell us about the leadership of technology? Journal of Research on Technology in Education, 52(1), 17-36. https://doi.org/10.1080/15391523.2019.1668316

Dexter, S., Richardson, J. W., & Nash, J. B. (2016). Leadership for technology use, integration, and innovation. In M. D. Young & G. M. Crow (Eds.), Handbook of Research on the Education of School Leaders (pp. 202-228). Routledge.


Friday Institute for Educational Innovation. (2018). North Carolina digital learning progress rubric for charters version 2.0. North Carolina State University. https://www.fi.ncsu.edu/resources/digital-learning-progress-rubric-for-charters/

Friday Institute for Educational Innovation. (2018). North Carolina digital learning progress rubric for districts version 2.0. North Carolina State University. https://www.fi.ncsu.edu/resources/digital-learning-rubric-for-districts/

Friday Institute for Educational Innovation. (2018). North Carolina digital learning progress rubric for schools version 2.0. North Carolina State University. https://www.fi.ncsu.edu/resources/digital-learning-progress-rubric-for-schools/

IGI Global. (n.d.). What is digital age | IGI Global. https://www.igi-global.com/dictionary/resource-sharing/7562

Richardson, J. W., Flora, K., & Bathon, J. (2013). Fostering a school technology vision in school leaders. NCPEA International Journal of Educational Leadership Preparation, 8(1), 144-160. https://files.eric.ed.gov/fulltext/EJ1012953.pdf

Shapley, K. S., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2010). Evaluating the implementation fidelity of technology immersion and its relationship with student achievement. The Journal of Technology, Learning, and Assessment, 9(4), 1-68. https://www.mackenty.org/images/uploads/Evaluating_the_Implementation_Fidelity_of_Technology_Immersion_an.pdf


02 Selection Processes

Short Definition

Selection processes occur prior to procurement and are the presence and quality of consistent methods through which classrooms/schools/districts/states identify technologies, evaluate those technologies, and choose technologies for procurement to meet established student and teacher needs for learning and instruction.


Long Definition

Selection processes are the presence and quality of consistent methods and resources, occurring before procurement, through which classrooms/schools/districts/states identify, evaluate, and choose education technology to meet student and teacher needs for learning and instruction. These processes adapt to the scale and scope of the need and technology, but the presence of consistent methods over time allows schools and districts to efficiently select new technologies, maintain consistent roles and responsibilities for those routinely involved with selection, and improve upon their methods.

High-quality technology selection processes should engage a range of stakeholders to leverage diverse perspectives throughout, such as teachers (See Teacher Agency), administrators, curriculum and instruction staff, technology support staff, and students/families.

Selection processes should be initiated by an established student or teacher need (i.e., learning and instruction) for which a technology-based solution is (1) appropriate and (2) not already in place. High-quality technology selection processes include:

• Identifying technologies: Selection begins with a systematic process for identifying possible technologies that appear to or claim to meet an articulated need for students or teachers.

• Evaluating technologies: Once there is a list of identified technologies, selection continues with evaluating those technologies against key indicators. Evaluation can be qualitative, quantitative, or a mixture of both; it might use structured methods such as rubrics or checklists.

• Available evidence includes anecdotal, descriptive, correlational, and causal evidence. Educators should consider the established effectiveness of the technology (e.g., Does the technology effect change in the identified outcomes?) and the experiences of other users, particularly those in schools with similar contexts to their own.

• Fit with organizational requirements requires educators to consider how the new technology will align with established policies (e.g., privacy statements, accessibility), resources (e.g., required support, need for professional learning), curriculum/standards, and instructional routines (e.g., feedback loops, customized instruction).

• Fit with technical requirements requires educators to consider how the new technology will align with the technical infrastructure and existing technologies (See Infrastructure & Operations).

• Effectiveness in meeting the identified need requires educators to identify high-quality evidence of success in a context similar to their own, to conduct a pilot to test the technology, or both. Pilots identify and collect data on target outcomes but can vary in scale and duration, depending on the intended reach of the program (e.g., one class vs. whole district), expertise and capacity of the school staff, and offerings from the vendor. Pilots should include all students.

• Pilots could be:
  » Completed at the state, district, school, or classroom level
  » Performed internally by the school or district, or externally by the technology vendor or a third-party research organization; all stakeholders need to be engaged
  » Small scale (e.g., a teacher implements a technology in their classroom for some students and not others, comparing student outcomes) to large scale (e.g., a third-party evaluator randomly assigns classrooms to treatment or control conditions and measures gains with a pre-post standardized assessment)
  » Short (e.g., weeks) to long duration (e.g., a school year), depending on the nature of the program, level of implementation, and timetable for decision-making

• Choosing technologies for procurement: Once educators collect evidence about technology-based options for addressing the established need, they complete selection by choosing a technology to purchase (if fee-based) and use. The technology choice should leverage the collected evidence and engage key stakeholders.

The short and long definitions are informed by professional experiences and the following readings:

ISTE. (2019). Better edtech buying for educators. A practical guide. International Society for Technology in Education.

Morrison, J. R., Ross, S. M., & Cheung, A. C. K. (2019). From the market to the classroom: How ed-tech products are procured by school districts interacting with vendors. Educational Technology Research and Development, 67, 389-421. https://doi.org/10.1007/s11423-019-09649-4

Nebraska Public School System. (2017). Rubric of essential technology conditions for Nebraska schools. https://www.education.ne.gov/wp-content/uploads/2017/07/NERETC.pdf

Noakes, S., Richendollar, T., Xiao, W., & Luke, C. (2020). Designing edtech that matters for learning: Research-based design product certifications report. Digital Promise. https://digitalpromise.org/wp-content/uploads/2020/02/Product-Certifications-Report.pdf


Ramirez Jr., A. (2011). Technology planning, purchasing, and training: How school leaders can help support the successful implementation and integration of technology in the learning environment. Journal of Technology Integration in the Classroom, 3(1), 67-73.

edreports.org. (2020, June). Selecting for quality: Six key adoption steps. https://www.edreports.org/resources/adoption-steps

Zhao, Y., & Frank, K. A. (2003). Factors affecting technology uses in schools: An ecological perspective. American Educational Research Journal, 40(4), 807-840. http://dx.doi.org/10.3102/00028312040004807


03 Teacher Agency

Short Definition

Teacher agency is the extent to which teachers consistently have a voice in shaping their work and the conditions and tools for that work. Regarding education technology implementation, this is the extent to which the conditions for agency are in place and a variety of teachers are consistently involved in decision-making related to shared visioning, selection processes, implementation processes, infrastructure, and professional learning.


Long Definition

Teacher agency is the extent to which teachers consistently have a voice in shaping their work and the conditions and tools for that work. Regarding education technology implementation, this is the extent to which the conditions for agency are in place and a variety of teachers are consistently involved in decision-making related to shared visioning, selection processes, implementation processes, infrastructure, and professional learning.

Conditions for Agency

• Impact of teacher contributions: Teacher agency requires that teachers not only have opportunities to engage in decision-making processes but also that their contributions are valued as part of the decision-making process. Meaningful impact of teacher contributions is the extent to which diverse educator voice, choice, and contributions actually inform ultimate decisions.

• Time: Teacher agency requires that educators be given adequate time within contract hours to engage in professional learning and exploration of resources to gain knowledge needed to effectively contribute. Educators also need time to actively participate in the decision-making processes and other opportunities for stakeholder engagement.

Areas of Agency

• Shared visioning: Teacher agency in shared visioning is the extent to which teachers’ perspectives are meaningfully incorporated in the formation of structures, strategies, learning outcomes, and overall impact that comprise the vision for teaching and learning with edtech. This includes creating opportunities for educator advocacy regarding equitable consideration of all students (See Vision for Teaching & Learning).

• Selection processes: Teacher agency in selection processes is the extent to which teachers are consistently involved in identifying, evaluating, and choosing technologies. Teacher agency involves opportunities for teachers to explore and propose new hardware and software tools for classroom use (See Selection Processes).

• Implementation processes: Teacher agency in implementation processes is the extent to which teacher perspectives shape the methods by which a school or district puts new technology into effect and carries out/scales the use of technology (See Implementation Systems & Processes).

• Professional learning: Teacher agency in professional learning is the extent to which teachers make decisions in their own learning and have voice and choice in the decisions made around professional learning to support edtech implementation at the school or district. Teachers engage in collaborative learning activities relevant to their challenges and provide ongoing feedback. This includes teachers recommending professional learning resources, forming and supporting professional learning communities, creating opportunities for teacher-led professional learning, and guiding the frequency of professional learning such that it meets teacher needs (See Professional Learning).

The short and long definitions are informed by professional experiences and the following readings:

Hadar, L. L., & Benish-Weisman, M. (2018). Teachers' agency: Do their values make a difference? British Educational Research Journal, 45(1), 137-160. https://doi.org/10.1002/berj.3489

Ingersoll, R. M., Sirinides, P., & Dougherty, P. (2018). Leadership matters: Teachers' roles in school decision making and school performance. American Educator, 42(1), 13-17.

Kayi-Aydar, H., Gao, X., Miller, E. R., Varghese, M., & Vitanova, G. (2019). Theorizing and analyzing language teacher agency. Multilingual Matters. https://doi.org/10.21832/9781788923927


04 Infrastructure & Operations

Short Definition

Infrastructure and operations are the enabling conditions that lower barriers for implementation, facilitate uptake, and support scaling and sustaining new education technology. These conditions include physical resources, broadband Internet connectivity, students' remote devices and connectivity, human resources, system specifications, operational policies, and funding.


Long Definition

Infrastructure and operations are the enabling conditions that lower barriers for edtech implementation, facilitate uptake, and support scaling and sustaining new edtech tools. This includes the following:

Infrastructure: The technical infrastructure of a school or district consists of physical resources, Internet connectivity, students’ remote devices and connectivity, human resources, and system specifications. This infrastructure requires a secure, scalable system to organize resources.

• District Physical Resources: A wide array of digital devices (e.g., laptops, document cameras) and associated peripherals (e.g., projectors, microphones, headphones) are available for education environments. Primary devices include those that students and educators use to access software on a daily basis (e.g., laptops, tablets). To support successful technology implementation, the following factors should be considered in relation to physical resources:

• Consistency/availability of devices: The extent to which educators/classrooms across a school or district have access to consistent and/or compatible primary devices

• Distribution of devices: How devices are distributed to students (e.g., classroom computer carts; one device per student; Bring Your Own Device programs)

• Equitable access to devices: The extent to which students have access to the devices they need to effectively participate in learning

• Functionality of devices: The extent to which devices are functional and updated such that they meet educators’ and students’ needs

• Device interoperability with software: The extent to which the available devices are able to run selected software applications

• District broadband Internet connectivity: Broadband Internet connectivity is high-speed Internet access that is always on and accessible to allow for technology-enhanced teaching, learning, and day-to-day operations. To support successful technology integration, the following factors must be considered in relation to connectivity:

• Range of connectivity: The extent to which broadband connectivity is available across a school district’s geography/network of schools

• Format of connectivity: How educators and students connect to the Internet (e.g., Ethernet, WiFi)

• Reliability of connectivity: The extent to which educators and students have consistent, reliable access to high-speed Internet such that it does not hinder the use of software tools


• Student remote device and connectivity access:
  • Students' remote device access: The extent to which students have access to primary devices beyond school or at home, and the plans in place to address gaps in students' home device access

• Students’ remote and/or home connectivity access: The extent to which students have sufficient Internet access at home and the plans in place to address gaps in students’ remote connectivity (FCC guidelines)

• Family access and connectivity: The extent to which students’ families have access to remote devices and connectivity, as well as the skills to operate those devices

• Technical support: The extent to which students and families have access to district technical personnel and resources to support technology-enhanced learning

• Human resources: In addition to devices and connectivity, implementation requires that sufficient human resources (district or managed service provider) are available to support the technical infrastructure. To support successful technology integration, the following factors must be considered in relation to human resources:

• Availability of technical staff: The extent to which there are clear communication channels in place and the technical support staff are consistently available when there are technical challenges — in the classroom or outside of school — such that teaching and learning are not materially impacted; these staff are focused on the functionality of technology rather than instructional design with technology.

• Skill of technical staff: The extent to which the available technical support staff are able to address a wide variety of technical challenges as they occur and integrate new technologies as introduced

• System specifications: Successful technology integration depends on a system that is secure and scalable. Systems can range from fully operated by school district-based staff to fully managed and serviced by an outside agency.

• Security architecture: The extent to which the permissions model used by an application ensures that users are only authorized to access the data and functions needed in their role (the principle of least privilege; a toy sketch follows this list) and prevents unauthorized access or intrusion

• Provisioning: The extent to which processes are in place to provision applications for access, including account creation, rostering, and student grouping

• Data integration: The extent to which processes and standards used by an application are able to pass data back-and-forth to other applications


• Hosting: The extent to which the location where applications are hosted addresses services including load balancing, elasticity, backups, fault tolerance, and remote access

• Scalability: The extent to which the technology system can maintain optimal functionality with the introduction of new tools over time
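As a toy illustration of the least-privilege idea named above (not drawn from the report), a permissions model grants each role only the actions it needs and denies everything else by default:

```python
# Hypothetical role-to-permission grants; deny by default.
ROLE_PERMISSIONS = {
    "student": {"view_own_work", "submit_assignment"},
    "teacher": {"view_own_work", "submit_assignment", "view_class_data", "grade"},
    "district_admin": {"view_class_data", "manage_accounts", "export_reports"},
}

def is_authorized(role: str, action: str) -> bool:
    """Allow only actions explicitly granted to the role; everything else is denied."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_authorized("teacher", "grade")
assert not is_authorized("student", "view_class_data")
```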

Operations: The technical operations of a school or district consist of operational policies and funding.

• Operational policies: Schools and districts follow policies established by boards that address hundreds of different aspects of teaching, learning, and operations. Operational policies that support technology integration:

• Align with possibilities afforded by technology articulated in the vision (See Vision for Teaching & Learning)

• Provide for sufficient data security standards throughout the school or district

• Support distance/remote learning initiatives
• Allocate financial, technological, physical, and human resources to support technology selection, implementation systems and processes, and professional learning (see the funding section below)

• Funding: Appropriate startup and recurring funding allocation is vital for any initiative’s success. Schools and districts need sufficient funding to cover the “real cost” of technology, accounting for the cost of the initial purchase, associated digital content/resources, peripherals, adjustments to existing infrastructure, professional learning, and associated human resources.

The short and long definitions are informed by professional experiences and the following readings:

Consortium for School Networking (CoSN). (n.d.). Smart education networks by design (SEND). https://www.cosn.org/focus-areas/it-management/send-smart-education-networks-design

Federal Communications Commission (FCC). (n.d.). Household broadband guide. https://www.fcc.gov/consumers/guides/household-broadband-guide

Fox, C., & Jones, R. (2019). The broadband imperative III: Driving connectivity, access and student success. State Educational Technology Directors Association (SETDA). https://www.setda.org/wp-content/uploads/2019/11/SETDA_Broadband-Imperative-III_110519.pdf

Future Ready Schools. (n.d.). Infrastructure framework. https://futureready.org/ourwork/future-ready-frameworks/robust-infrastructure/


Project Unicorn. (n.d.). Data interoperability rubric. https://www.projectunicorn.org/project-unicorn-rubric

Cybersecurity & Infrastructure Security Agency (CISA). (n.d.). Cyber resource hub. U.S. Department of Homeland Security. https://www.cisa.gov/cyber-resource-hub


05 Implementation Systems & Processes

Short Definition

Implementation systems and processes occur after procurement and are the presence and quality of methods through which school communities put education technology into effect over time to achieve intended outcomes.

This includes mechanisms for monitoring ongoing fit with current initiatives, conducting resource inventories, monitoring the ongoing use of the technology as it was designed, making systemic adjustments as needed, and documenting evidence of impact on target outcomes.


Long Definition

Implementation systems and processes are the presence and quality of methods by which school communities put technology into effect, beginning after procurement and continuing over the life of the initiative. These systems and processes adapt to fit the needs of a given initiative, but the presence of consistent methods allows school communities to respond efficiently to new initiatives, maintain consistent roles and responsibilities, and improve upon methods.

High-quality implementation systems and processes provide a roadmap with clear, actionable steps for implementation. Although these systems and processes are detailed and comprehensive, they are also realistic. They explicitly offer flexibility for educators to adapt implementations as needed to make the vision a reality. Implementation systems and processes require disseminating information and interactive communication to share key information about processes and receive feedback on the success of those processes.

• Identifying diverse stakeholders: Collaborate with diverse stakeholders to ensure their feedback is considered in the planning process. This includes a range of staff (e.g., teacher, principals, superintendent), as well as students and their families.

• Establishing technology-specific roles and responsibilities: Specific technologies may require new roles and responsibilities for associated stakeholders, depending on the technology’s focus (e.g., reading specialist for a language/reading program). Those roles may be revisited during feedback loops throughout the implementation process.

• Establishing timelines: Timelines for technology implementation account for when and at what pace implementation occurs. Feedback loops could identify the need for adjustments to established timelines.

• Planning for professional learning: Plans for professional learning explicitly address how all teachers will acquire and maintain the beliefs, knowledge, skills, and practices needed to meet the expectations of successful usage throughout implementation (See Professional Learning).

• Planning for aligned integration: Plans for aligned integration explicitly account for how technology will be initially and continuously implemented in alignment with:

• School community vision for teaching and learning with technology (i.e., the “why,” See Vision for Teaching & Learning and Selection Processes)

• Existing and new initiatives (See Competing Priorities)
• Existing and new pedagogical practices
• Organizational structures (e.g., staffing) and policies (e.g., state and federal mandates)


• Defining and documenting impact: Implementation systems and processes include protocols for defining target outcomes (i.e., evidence of impact), when to measure them, how to measure them, and how to consider variation in outcomes by all student subgroups. These may include formative, interim, and summative outcomes as measured by district or school leaders and teachers, by the technology itself, or by district- or state-mandated assessments.

• Monitoring usage and engagement: Monitoring usage and engagement includes setting goals and tracking the use of technologies against those goals. Goals should be set based on short- and long-term outcomes and may take the format of frequency (x times per week or y minutes per day) or purpose (to accomplish x in the classroom); a minimal goal-check sketch follows this list. Systems for monitoring usage and engagement address:

• The extent to which the district’s implementation plan is consistent with developer recommendations

• Who will monitor (e.g., district or school leaders, teachers)

• When/how frequently monitoring will occur (e.g., routinely, randomly)

• How monitoring will occur (e.g., observation, data review)

• What monitoring will examine (e.g., supplement to instruction, redefinition of instruction)

• Communication of how usage goals will be monitored

• Evaluating and planning for resource sustainability: Implementation systems and processes include protocols for considering the available resources against the resources needed to sustain an implementation over time. Resource inventories document the available financial, material, and human resources for an edtech implementation and either ensure those resources will be available as needed or adjust implementations to address constraints.

• Conducting feedback loops: Implementation systems and processes include protocols to support individual educators and staff in reaching goals through feedback loops. Feedback loops can focus on a variety of quantitative (e.g., usage data) and qualitative (e.g., technology fit) indicators and can be used to refine or adapt implementation. These processes may include follow-up such as targeted professional learning or identification of exemplar implementations.
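The report describes these monitoring protocols conceptually and does not prescribe any tooling. Purely as an illustration of the frequency-format goals described under Monitoring usage and engagement, the Python sketch below checks one week of logged usage against a goal; the structure and field names (`sessions`, `total_minutes`, `days_used`) are hypothetical, not drawn from the report or any particular platform.

```python
from dataclasses import dataclass

@dataclass
class UsageGoal:
    """A frequency-format usage goal: x sessions per week, y minutes per day."""
    min_sessions_per_week: int
    min_minutes_per_day: float

@dataclass
class WeeklyUsage:
    """One classroom's logged technology usage for one week (hypothetical fields)."""
    sessions: int
    total_minutes: float
    days_used: int

def meets_goal(usage: WeeklyUsage, goal: UsageGoal) -> bool:
    """Check logged usage against both parts of a frequency-format goal."""
    avg_minutes_per_day = usage.total_minutes / max(usage.days_used, 1)
    return (usage.sessions >= goal.min_sessions_per_week
            and avg_minutes_per_day >= goal.min_minutes_per_day)

# Example: a goal of 3 sessions per week at 20 minutes per day of use.
goal = UsageGoal(min_sessions_per_week=3, min_minutes_per_day=20)
week = WeeklyUsage(sessions=4, total_minutes=110, days_used=4)
print(meets_goal(week, goal))  # True: 4 sessions, 27.5 minutes per day of use
```

In practice, a check like this would feed the feedback loops described above, flagging classrooms for follow-up support rather than serving as an evaluation by itself.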

The short and long definitions are informed by professional experiences and the following readings:

An, Y.-J., & Reigeluth, C. (2011). Creating technology-enhanced, learner-centered classrooms: K–12 teachers’ beliefs, perceptions, barriers, and support needs. Journal of Digital Learning in Teacher Education, 28(2), 54–62. https://doi.org/10.1080/21532974.2011.10784681


Bauer, J., & Kenton, J. (2005). Toward technology integration in the schools: Why it isn’t happening. Journal of Technology and Teacher Education, 13(4), 519-546.

Corn, J., Tagsold, J., & Patel, R. (2011). The tech-savvy teacher: Instruction in a 1:1 learning environment. Journal of Educational Research and Practice, 1(1), 1-22. https://doi.org/10.5590/JERAP.2011.01.1.01

Hong, J. E. (2016). Social studies teachers’ views of ICT integration. Review of International Geographical Education Online, 6(1), 32-48.

Olmstead, C. (2013). Using technology to increase parent involvement in schools. TechTrends, 57(6), 28-37. https://doi.org/10.1007/s11528-013-0699-0

Pittman, T., & Gaines, G. (2015). Technology integration in third, fourth and fifth grade classrooms in a Florida school district. Educational Technology Research and Development, 63(4), 539-554. https://doi.org/10.1007/s11423-015-9391-8

Smerdon, B., Cronen, S., Lanahan, L., Anderson, J., Iannotti, N., Angeles, J., & Green, B. (2000). Teachers’ tools for the 21st century: A report on teachers’ use of technology. National Center for Education Statistics. https://nces.ed.gov/pubs2000/2000102.pdf

Vannatta, R., & Fordham, N. (2004). Teacher dispositions as predictors of classroom technology use. Journal of Research on Technology in Education, 36(3), 253-271. https://doi.org/10.1080/15391523.2004.10782415

Yu, C. (2013). The integration of technology in the 21st century classroom: Teachers’ attitudes and pedagogical beliefs toward emerging technologies. Journal of Technology Integration in the Classroom, 5(1), 5-11.


06 Staff Culture

Short Definition
Staff culture refers to the set of beliefs, values, norms, and assumptions that are shared collectively by the school and/or district staff and that influence the way in which staff members work individually and collaboratively to fulfill the school’s shared vision for teaching and learning. Important facets of staff culture include trust, social capital, communication, and equity.


Long Definition
Staff culture refers to the set of beliefs, values, norms, and assumptions that are shared collectively by school and/or district staff and that influence the ways in which staff members work individually and collaboratively to fulfill the school and/or district’s shared vision for teaching and learning. As a shared belief system, staff culture is reflected in staff perspectives, practices, and interactions. Students, parents, teachers, administrators, and other staff members contribute to school or district culture, and all individuals working at a school or district are considered staff. However, for the purposes of defining and measuring staff culture as an important construct in implementation contexts, we focus only on instructional staff (i.e., those who interact on an instructional basis with students). Important dimensions of staff culture include:

• Trust: Relational trust and interpersonal belief in others (e.g., leadership and colleagues) are built through empathy, commitment, reliability, and accountability and play an important role in developing culture.

• Social capital: Social networks and connections with others play important roles in developing culture. Social capital in school and/or district settings refers to those networks and relationships among staff, which facilitate cooperation and enable the school and/or district to function efficiently and effectively.

• Communication: Communication among school and/or district staff members can take many forms (e.g., verbal, electronic, school policy, training), and there are a variety of communication norms and expectations. It is key that all staff have an opportunity to interactively participate and voice their thoughts and concerns; it is equally important that those thoughts and concerns are actively heard and recognized.

• Equity: Perceptions of equity, justice, and fairness are important determinants of staff culture. This includes perceptions of fairness regarding (a) information and outcomes, (b) operational practices or processes, and (c) interpersonal treatment and interactions. These perceptions can be formed at the school and/or district level and are influenced by all members of the organization.

Staff culture is an important determinant of attitudes, behaviors, and outcomes at the student, teacher, school, and district levels. Not only does it influence teaching practices and student learning, but staff culture also facilitates the fulfillment of a school and/or district’s shared vision for teaching and learning. By shared vision, we refer to:

• The overall purpose of the school and its goals for serving various stakeholders (e.g., students, teachers) (for vision specific to technology, See Vision for Teaching & Learning)


• Key performance indicators and objectives at multiple levels (e.g., student, classroom, and school)

• Defining characteristics and core values of school and/or district members

Because staff culture is a collective construct, it must be characterized through phenomena such as interpersonal interactions, normative perceptions, and shared beliefs. Further, it is important to distinguish between climate and culture. Whereas climate is more transient, based on flexible perceptions and attitudes, culture is more ingrained in the fabric of the school based on an evolution of interpersonal experiences and collective learning. These nuances must be captured when examining the various impacts of staff culture.

The short and long definitions are informed by professional experiences and the following readings:

ASCD. (n.d.). School culture and climate. http://www.ascd.org/research-a-topic/school-culture-and-climate-resources.aspx

Clark, K. (2006). Practices for the use of technology in high schools: A Delphi study. Journal of Technology and Teacher Education, 14(3), 481-499.

Dexter, S., & Richardson, J. W. (2019). What does technology integration research tell us about the leadership of technology? Journal of Research on Technology in Education, 52(1), 17-36. https://doi.org/10.1080/15391523.2019.1668316

Ertmer, P., & Ottenbreit-Leftwich, A. (2010). Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education, 42(3), 255-284. https://doi.org/10.1080/15391523.2010.10782551

Lee, M., & Louis, K. S. (2019). Mapping a strong school culture and linking it to sustainable school improvement. Teaching and Teacher Education, 81, 84-96. https://doi.org/10.1016/j.tate.2019.02.001

Levin, B., & Schrum, L. (2013). Using systems thinking to leverage technology for school improvement. Journal of Research on Technology in Education, 46(1), 29-51. https://doi.org/10.1080/15391523.2013.10782612

National Center for Education Statistics. (2015). National teacher and principal survey: Teacher questionnaire 2015-2016. United States Department of Education. https://nces.ed.gov/surveys/ntps/question1516.asp


Spiteri, M., & Rundgren, S. C. (2018). Literature review on the factors affecting primary teachers’ use of digital technology. Technology, Knowledge and Learning, 25, 115-128. https://doi.org/10.1007/s10758-018-9376-x

Tschannen-Moran, M., & Hoy, A. W. (2003). Comprehensive teacher trust scale (principal, colleagues, students, parents). https://mxtsch.people.wm.edu/ResearchTools/Faculty%20Trust%20Survey.pdf

Tschannen-Moran, M. (2009). Fostering teacher professionalism in schools: The role of leadership orientation and trust. Educational Administration Quarterly, 45(2), 217–247. https://doi.org/10.1177/0013161x08330501

Zhao, Y., & Frank, K. A. (2003). Factors affecting technology uses in schools: An ecological perspective. American Educational Research Journal, 40(4), 807–840. http://dx.doi.org/10.3102/00028312040004807


07 Teacher Beliefs & Knowledge

Short Definition
Teacher beliefs and knowledge refers to individual teachers’ perceived ability to use education technologies and integrate them into their practice. This variable combines (1) teachers’ beliefs about, knowledge of, and experiences using education technology and (2) teachers’ understanding of curriculum, instruction, and assessment.

Together, these elements interact to enable the comfort and flexibility necessary to use education technologies effectively and appropriately in different learning settings.


Long Definition
Teacher beliefs and knowledge refers to teachers’ beliefs about technology and the knowledge they need to use education technology in the classroom to support student learning. Components within this variable include:

• Teacher beliefs about technology: One of the most important indicators of education technology use is how teachers feel, in general, about the role and value of technology in society and our lives. When teachers believe education technology is valuable, they may be more likely to learn about it and try to use it.

• Teacher knowledge about technology: Teachers bring specific knowledge about how to use technologies for teaching and learning. This knowledge may be gained during teacher preparation, in the classroom, via formal and informal professional learning, and through their everyday lives.

• Teacher experiences using technology: Teachers bring a variety of experiences that influence their likelihood of adopting a new technology and implementing it to meet student and teacher needs.

• Teacher understanding of curriculum, instruction, and assessment: Teachers bring specific understandings of how students learn, the content and standards students are expected to master, and the processes and practices of effective teaching and assessment. Teachers also bring an understanding of how technology can support overall educational purposes, values, and aims.

Together, these elements interact to enable the comfort and flexibility necessary to use education technology effectively and appropriately in different learning settings. In the most effective technology-integrated classrooms, teachers understand the connection among pedagogy, technology, and content in the context of individual learners’ strengths and needs.

Teacher beliefs and knowledge does not include factors such as access to resources, teachers’ freedom to choose technology, or the availability of professional learning and technical support.

The short and long definitions are informed by professional experiences and the following readings:

Brush, T., Glazewski, K. D., & Hew, K. F. (2008). Development of an instrument to measure preservice teachers’ technology skills, technology beliefs, and technology barriers. Computers in the Schools, 25(1-2), 112-125. https://doi.org/10.1080/07380560802157972


Canbazoğlu Bilici, S., Yamak, H., Kavak, N., & Selcen Guzey, S. (2013). Technological pedagogical content knowledge self-efficacy scale (TPACK-SeS) for pre-service science teachers: Construction, validation, and reliability. Eurasian Journal of Educational Research, 13(52), 37-60.

Christensen, R. (2002). Effects of technology integration education on the attitudes of teachers and students. Journal of Research on Technology in Education, 34(4), 411-433. https://doi.org/10.1080/15391523.2002.10782359

Judson, E. (2006). How teachers integrate technology and their beliefs about learning: Is there a connection? Journal of Technology and Teacher Education, 14(3), 581-597.

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x

O’Neal, L. J., Gibson, P., & Cotten, S. R. (2017). Elementary school teachers’ beliefs about the role of technology in 21st-century teaching and learning. Computers in the Schools, 34, 192-206. https://doi.org/10.1080/07380569.2017.1347443

Rosenberg, J. M., & Koehler, M. J. (2015). Context and technological pedagogical content knowledge (TPACK): A systematic review. Journal of Research on Technology in Education, 47(3), 186-210. https://doi.org/10.1080/15391523.2015.1052663

Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009). Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42(2), 123-149. https://doi.org/10.1080/15391523.2009.10782544


08 Strategic Leadership Support

Short Definition
Strategic leadership support is the extent to which district and school leaders provide explicit encouragement and guidance to staff who are selecting and implementing education technology tools. This support sets and communicates a vision, develops staff, and aligns technology implementation with the district instructional plan.


Long Definition
Strategic leadership support is the extent to which district and school leaders provide explicit encouragement and guidance to staff who are selecting and implementing education technology tools (See Selection Processes and Implementation Systems & Processes). This support sets and communicates a vision, develops staff, and aligns technology implementation with the district instructional plan. It is different from tactical support, which provides the means for technology implementation, such as allocating resources (See Infrastructure & Operations).

High-quality strategic leadership support includes:

• Setting and communicating vision (See Vision for Teaching & Learning)

  • Strategically elevating technology as a priority: Leaders are positioned to see the role of technology in the vision for teaching and learning across all school or district initiatives. Leaders identify technology as a priority by explicitly allocating time and resources to technology selection and implementation across district plans and policies, reducing competing priorities.

  • Translating the vision to school context: Leaders translate the vision to their school contexts through explicit goals for technology use. Leaders clearly communicate how specific instructional practices are enhanced through technology integration and how technology addresses the needs of all students in their school.

  • Planning for equitable technology use: Leaders implement plans and policies that promote equitable technology selection and implementation. Leaders appropriately allocate resources as needed to ensure all teachers and students are able to achieve target outcomes.

  • Creating explicit expectations: Leaders establish goal-oriented expectations for the use of technology (frequency, purpose, and outcomes) with school staff. These expectations shift over time as students and teachers become more comfortable with enacting the vision.

  • Continuous engagement and communication: Leaders clearly, consistently, and authentically engage key stakeholders in technology selection and implementation through multiple communication channels; this communication consistently focuses on the purpose for integration (i.e., what technology enables) and how it will be integrated in alignment with the vision for teaching and learning.

• Developing staff

  • Supporting educators’ professional learning: Leaders create plans for and allocate resources to formal professional learning to support educators’ technology integration through their beliefs and knowledge. Leaders also support educators’ participation in informal professional learning (See Professional Learning and Teacher Beliefs & Knowledge).

  • Developing leaders’ knowledge of technology integration: Leaders participate in professional learning to develop their own knowledge about technology integration.

• Developing the organization

  • Establishing selection and implementation processes: Leaders make explicit plans for selection and implementation of technology. As appropriate, leaders identify roles and responsibilities for key stakeholders in these processes (See Selection Processes and Implementation Systems & Processes).

  • Maintaining feedback mechanisms: Leaders collect, analyze, and take action based on meaningful feedback from stakeholders throughout edtech implementation; to better understand how edtech implementation is occurring in schools and classrooms, leaders are present and communicative with diverse stakeholders through activities such as surveys, site/classroom visits, and teacher cabinets.

  • Building a culture of innovative risk-taking: Leaders actively encourage staff to take risks, try new approaches and tools, learn from setbacks, and iterate on those learnings (See Staff Culture).

  • Engaging teachers in decision-making: Leaders support teacher agency by soliciting teacher participation in decision-making to adopt or adapt technology solutions to address emerging challenges (See Teacher Agency).

The short and long definitions are informed by professional experiences and the following readings:

Dexter, S. (2018). The role of leadership for information technology in education: Systems of practices. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), Second Handbook of Information Technology in Primary and Secondary Education (pp. 1–16). Springer International Publishing. https://doi.org/10.1007/978-3-319-71054-9_32

Dexter, S., Richardson, J. W., & Nash, J. B. (2016). Leadership for technology use, integration, and innovation. In M. D. Young & G. M. Crow (Eds.), Handbook of Research on the Education of School Leaders (pp. 202–228). Routledge. https://www.academia.edu/30447889/Leadership_for_Technology_Use_Integration_and_Innovation

Leithwood, K. (2012). The Ontario leadership framework 2012 with a discussion of the research foundations. The Institute for Education Leadership. https://www.education-leadership-ontario.ca/application/files/8814/9452/4183/Ontario_Leadership_Framework_OLF.pdf


09 Professional Learning

Short Definition
Professional learning is the presence, duration, and quality of a range of intentional, adult learning activities that support the effective integration of education technology to advance student learning and outcomes. This includes both formal and informal opportunities that lead to shifts in beliefs, knowledge, skills, and practices related to technology integration.


Long Definition
Professional learning is the presence, duration, and quality of a range of intentional, adult learning activities that support the effective integration of education technology to advance student learning and outcomes. Integral to schools and districts, professional learning supports all types of educators, including teachers, school leaders, and instructional staff members. This includes both formal and informal opportunities that lead to shifts in beliefs, knowledge, skills, and practices related to technology integration.

Presence
Professional learning occurs in both formal and informal ways, and optimal professional learning for technology integration includes both, with each supporting the other. Formal learning activities are initiated by leaders or outside agencies and traditionally have a start and end date. This learning is often tied to credits, certification, or other recognition. Formal professional learning includes, but is not limited to, the following:

• District/school organized learning opportunities

• Conference sessions

• Peer coaching (technical, expert, or content)

• Online courses

• Grade level or department meetings (Professional Learning Communities)

• Postsecondary courses

Informal learning activities are initiated by learners and occur outside of the context of organized learning structures. These activities involve self-directed and collaborative or collegial activities. Although this type of learning can be encouraged (for example, by offering opportunities or suggestions that help teachers plug in to topics they are curious about), informal learning is organic. Self-direction and choice are not limited to informal learning, but they are its primary drivers. Examples of informal learning include, but are not limited to, the following:

• Discussing and reflecting with colleagues on implementation

• Researching new initiatives, strategies, and resources online

• Establishing a professional/personal learning network (PLN)

• Using social media and sharing platforms as a way to learn from colleagues around the world


Duration
The duration of professional learning is both the frequency and the length of activities over time, often based on the type and goals of the activity. Teachers’ practices are more likely to change with sustained professional learning over time. When professional learning takes place as longitudinal and continuous events, it offers teachers space to reflect on their use of technology in teaching, which can lead to changes in how teachers use technology in instruction.

Quality
High-quality professional learning:

• Engages teachers in professional learning decisions: Teachers can recommend professional learning resources, form and support professional learning communities, create opportunities for teacher-led professional learning, and guide the frequency of professional learning such that it meets teacher needs (See Teacher Agency).

• Is sustained over time, as described above.

• Is situated and content-focused: Professional learning focuses on how technologies are used based on teachers’ specific context, subject matter, and grade level; professional learning also focuses on technology in relation to teachers’ needs and the goal for the tool/service and uses teachers’ lessons, student work, and data as part of the learning process.

• Is strategically timed to provide authentic context and be based on the needs of educators.

• Uses models and modeling of the effective use of technology: Teachers use the focal technology, have the opportunity to observe other teachers using it effectively, and view examples of effective integration.

• Is collaborative: Teachers have dedicated time to work with peers, coaches, and/or experts in edtech use; this can include one-on-one coaching, small groups organized as professional learning communities, departments, or other formal or informal configurations of collective learning.

• Aligns with an overall vision for student learning and achievement (See Vision for Teaching & Learning).

• Supports best practices in curriculum and pedagogy related to technology: Best-practice frameworks include SAMR, TPACK, and TIM.

• Incorporates active, hands-on learning: Participants have opportunities to explore tools that are presented and ask questions.

• Provides teachers the opportunity and support for risk-taking and experimentation with edtech.


• Provides opportunities to receive ongoing feedback and reflect: Feedback and reflection give teachers the time and space to improve their practice as they learn how to integrate technology into current practices.

The short and long definitions are informed by professional experiences and the following readings:

Blanchard, M. R., LeProvost, C. E., Tolin, A. D., & Gutierrez, K. S. (2016). Investigating technology-enhanced teacher professional development in rural, high-poverty middle schools. Educational Researcher, 45(3), 207–220. https://doi.org/10.3102/0013189X16644602

Darling-Hammond, L., Hyler, M. E., & Gardner, M. (2017). Effective teacher professional development. Learning Policy Institute. https://learningpolicyinstitute.org/sites/default/files/product-files/Effective_Teacher_Professional_Development_REPORT.pdf

Jones, M., & Dexter, S. (2018). Teacher perspectives on technology integration professional development: Formal, informal, and independent learning activities. Journal of Educational Multimedia and Hypermedia, 27(1), 83-102.

Lee, H., Longhurst, M. L., & Campbell, T. (2017). Teacher learning in technology professional development and its impact on student achievement in science. International Journal of Science Education, 39(10), 1282-1303. https://doi.org/10.1080/09500693.2017.1327733

Mackey, J., & Evans, T. (2011). Interconnecting networks of practice for professional learning. International Review of Research in Open and Distributed Learning, 12(3), 1–18. https://doi.org/10.19173/irrodl.v12i3.873

Standards for Professional Learning. (2013). School-based professional learning for implementing the common core. Learning Forward. https://learningforward.org/wp-content/uploads/2017/09/school-based-professional-learning-unit-4-packet.pdf

Shulman, L. (2016). What teachers should know and be able to do. National Board for Professional Teaching Standards. http://accomplishedteacher.org/wp-content/uploads/2016/12/NBPTS-What-Teachers-Should-Know-and-Be-Able-to-Do-.pdf


10 Competing Priorities

Short Definition
Competing priorities are the extent to which a school or district has other prioritized initiatives that impact the available time and attention for new technology implementations. The presence of competing priorities is influenced by limited instructional time, limited preparation time, overlapping initiatives, and communication of priorities.


Long Definition
Competing priorities are the extent to which a school or district has other prioritized initiatives that impact the available time and attention for edtech implementation. An initiative is a focused effort to adopt new resources and/or change existing processes and practices in a school or district, such as the implementation of a new technology tool, professional learning for a new instructional philosophy, or adoption of a new program. Districts or schools frequently overestimate how many initiatives educators and administrators can manage at once and over time, and they often do not identify what is not prioritized. Educators’ “capacity to change” or “bandwidth” is impacted by the presence of competing priorities (i.e., initiatives competing for attention).

The following factors influence the presence of competing priorities in a school or district:

• Limited instructional time

• Limited preparation time

• Overlapping initiatives

• Communication of priorities

Limited instructional time
Instructional time refers to the number of available instructional minutes at a school, and how those minutes are organized by leaders and teachers. Leaders and teachers allocate instructional minutes for the implementation of prioritized initiatives, leaving fewer minutes for other initiatives.

Limited preparation time
Preparation time refers to the number of available planning and professional learning minutes and how those minutes are organized by leaders and teachers. Sufficient time needs to be given to each initiative such that teachers and leaders can reach a level of understanding and proficiency with a tool, program, or philosophy to sustain effective use over time.

Overlapping initiatives
Overlapping initiatives refers to the presence of multiple initiatives from the school, district, or state that align with the same instructional or managerial need. Because initiatives demand time and focus from educators, overlapping initiatives compete for attention and force educators to prioritize one over another.

Communication of priorities
Communication of priorities is the extent to which school and district stakeholders clearly identify which initiatives are prioritized. This communication establishes consistent views on the value, scope, and importance of initiatives, preventing misaligned prioritization.

The short and long definitions are informed by professional experiences and the following readings:

Gherardi, S. (2017). Digitized and decoupled? Teacher sensemaking around educational technology in a model 1:1 program. Mid-Western Educational Researcher, 29(2), 166-194.

Lee, V., Leary, H., Sellers, L., & Recker, M. (2014). The role of school district science coordinators in the district-wide appropriation of an online resource discovery and sharing tool for teachers. Journal of Science Education and Technology, 23, 309-323. https://doi.org/10.1007/s10956-013-9465-5

Muse, M. D., & Abrams, L. M. (2011). An investigation of school leadership priorities. Delta Kappa Gamma Bulletin, 77(4), 49-58.

Pollock, K., & Winton, S. (2012). School improvement: A case of competing priorities! Journal of Cases in Educational Leadership, 15(3), 11-21. https://doi.org/10.1177/1555458912447840

Rohanna, K. (2017). Breaking the “adopt, attack, abandon” cycle: A case for improvement science in K–12 education. New Directions for Evaluation, 2017(153), 65-77. https://doi.org/10.1002/ev.20233

Waite, C., & Arnett, T. (2020). Will schools change forever? Predicting how two pandemics could catalyze lasting innovation in public schools. Clayton Christensen Institute. https://www.christenseninstitute.org/publications/school-change/


THE EDTECH GENOME PROJECT
EdTech Context Inventory: Sample Items from Variable Instruments

Introduction

Vision for Teaching & Learning

Selection Processes

Teacher Agency

Infrastructure & Operations

Implementation Systems & Processes

Staff Culture

Teacher Beliefs & Knowledge

Strategic Leadership Support

Professional Learning

Competing Priorities


Introduction
The EdTech Context Inventory consists of 10 quantitative, self-report instruments. Each instrument has three forms: teacher, school leader, and district leader. For the purposes of this instrument, teachers primarily work with students and use technology in instruction firsthand. Leaders primarily work with teachers to support students and support teachers’ and students’ use of technology. This includes all levels of leadership from teacher leader to superintendent. The sample items shown here are from the teacher version of each instrument.

Educators receive the following definition of edtech at the beginning of the survey: Edtech is any form of technology that is designed to facilitate, supplement or complement instruction; enhance teaching practices; and/or improve learning outcomes. This is technology used in instruction, as opposed to technologies that purely support educator workflow.

We piloted the current instrument version in 2020. Please see Showing Our Work for more information on the instrument validation process. Throughout 2021 and 2022, the UVA research team and EdTech Evidence Exchange will continue to refine the instruments.

Please contact the EdTech Evidence Exchange if you are interested in seeing or using the full instruments.

[email protected]
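The report does not specify how item responses are scored. As a minimal sketch only, assuming the conventional 1-5 coding of a five-point agreement scale and a simple item average per variable, scoring one instrument could look like the Python below; `AGREEMENT_SCALE` and `score_variable` are illustrative names, not part of the published Inventory.

```python
# Conventional 1-5 coding of the five-point agreement scale used by most
# instruments in the Inventory (an assumption, not the published scoring).
AGREEMENT_SCALE = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neither Agree nor Disagree": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

def score_variable(responses: list[str]) -> float:
    """Average the coded item responses for one context variable,
    skipping unanswered or "I'm not sure" responses."""
    coded = [AGREEMENT_SCALE[r] for r in responses if r in AGREEMENT_SCALE]
    if not coded:
        raise ValueError("No scorable responses for this variable")
    return sum(coded) / len(coded)

# Example: four agreement-scale items from one teacher's form.
print(score_variable(["Agree", "Strongly Agree", "Agree",
                      "Neither Agree nor Disagree"]))  # 4.0
```

Repeating this for each of the 10 instruments would yield a 10-score profile of a school’s or district’s implementation context.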


01 Vision for Teaching & Learning

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• Leaders in my school clearly describe what high-quality technology-supported learning should look like.

• My school's plan for technology-supported learning supports accommodations for students with differences in abilities.

• My school's plan for technology-supported learning encourages innovative approaches to teaching and learning.

• My school's plan for technology-supported learning guides technology decisions.

These are 4 of the 16 items for this variable’s instrument. We are continuing with validation and further item reduction.


02 Selection Processes

Sample Items

Item response scale: Never | Almost Never | Occasionally/Sometimes | Almost Always | Always (some items also offer “I’m not sure”*)

• My school allows enough time to carefully select new technologies.*

• My school only selects new technologies when there is an identified need.

• My school evaluates possible technologies for their fit with our instructional need(s).

• My school pilots new technologies before selecting them.

*Selection Processes includes an “I’m not sure” option on some questions to accommodate educators who are not involved in selection processes. This option will be further evaluated in ongoing validation analyses.

These are 4 of the 23 items for this variable’s instrument. We are continuing with validation and further item reduction.


03 Teacher Agency

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• Leaders at my school consistently seek out my opinion about decisions.

• I have enough time to participate in decision-making at my school.

• Teachers and leaders share responsibility for instructional decisions.

• I am involved in selecting new technologies for my classroom.

These are 4 of the 12 items for this variable’s instrument. We are continuing with validation and further item reduction.


04 Infrastructure & Operations

Sample Items

Item response scale (first two items): Never | Almost Never | Occasionally/Sometimes | Almost Always | Always

• Devices provided to students consistently function properly.

• Internet connectivity is adequate to allow me and my students to do our work.

Item response scale (last two items): Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• Remote learning plans ensure all students have access to a device outside of school.

• I know how to contact technical support staff.

These are 4 of the 31 items for this variable’s instrument. We are continuing with validation and further item reduction.


05 Implementation Systems & Processes

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• My school timelines for technology implementations are achievable.

• My school effectively aligns technology implementations with existing or new pedagogical practices.

• My school measures target usage for technology implementations.

• My school communicates progress towards target outcomes for technology implementations.

These are 4 of the 23 items for this variable’s instrument. We are continuing with validation and further item reduction.


06 Staff Culture

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• Teachers and other instructional staff at my school feel empowered to take risks.

• Teachers and other instructional staff at my school can count on each other to deliver on expectations.

• Teachers and other instructional staff at my school work together cooperatively.

• Teachers and other instructional staff at my school have equal access to share their concerns and suggestions with leadership.

These are 4 of the 24 items for this variable’s instrument. We are continuing with validation and further item reduction. Several items on the staff culture instrument are informed by the National Teacher and Principal Survey (NCES, 2015) and the Comprehensive Teacher Trust Scale (Tschannen-Moran & Hoy, 2003).


07 Teacher Beliefs & Knowledge

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• Overall, the positives of using technology outweigh the negatives.

• I can design effective instruction using technology.

• I know about many different technologies.

• I can use technology to make challenging content accessible to my students.

These are 4 of the 15 items for this variable’s instrument. We are continuing with validation and further item reduction.


08 Strategic Leadership Support

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• Leaders at my school allocate resources to technology selection and implementation.

• Leaders at my school consistently communicate the importance of technology.

• Leaders at my school consistently communicate how technology supports our students’ needs.

• Leaders at my school encourage new approaches with technology.

These are 4 of the 22 items for this variable’s instrument. We are continuing with validation and further item reduction.


09 Professional Learning

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• During the last school year, I had enough opportunities to participate in workshops or sessions on technology integration.

• During the last school year, I had enough opportunities to engage with a professional/personal learning network about technology integration.

• My school explicitly supports participation in informal learning about technology throughout the year.

• Professional learning about technology in which I participate includes helpful examples of exemplary practice with technology.

These are 4 of the 22 items for this variable’s instrument. We are continuing with validation and further item reduction.


10 Competing Priorities

Sample Items

Item response scale: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

• I have adequate instructional time during my classes to use new technology tools with my students.

• Leaders in my school clearly communicate which initiatives are the highest priority.

• The technology tools I am expected to use each have a unique purpose.

• I receive feedback from leaders in my school about how I’m prioritizing initiatives.

These are 4 of the 8 items for this variable’s instrument. We are continuing with validation and further item reduction.


THE EDTECH GENOME PROJECT
Frequently Asked Questions

1. Why did the EdTech Evidence Exchange launch the EdTech Genome Project? What problem was the Exchange aiming to solve?

2. What process did the EdTech Genome Project follow to produce its deliverables?

3. How is the EdTech Evidence Exchange using the EdTech Genome Project’s deliverables in the real world?

4. How can other key stakeholders use the EdTech Genome Project’s deliverables?

5. Why did the EdTech Genome Project focus on implementation contexts?

6. Why is edtech selection and implementation an equity issue?

7. How sure are you that these 10 variables are the most important ones? How do they rank in relative importance?

8. How did each of the individual councils, committees, and working groups contribute to the EdTech Genome Project?


1. Why did the EdTech Evidence Exchange launch the EdTech Genome Project? What problem was the Exchange aiming to solve?

Virtually every sector of our nation’s economy has learned to use technology in ways that dramatically improve productivity and outcomes. Unfortunately, the education sector has not kept pace.

The effectiveness of education technology (edtech) varies enormously from school to school and district to district across the nation. As such, educators’ use of technology has yet to significantly impact students’ educational outcomes overall or help to close persistent achievement gaps (NCES, 2019; Wade et al., 2013), despite documented promise (e.g., Chauhan, 2017; Escueta et al., 2017). We have not seen the returns that we hoped for from our massive edtech investments.

Why? Lack of overall spending does not appear to be a key problem.

Immediately before the COVID-19 pandemic, schools in the U.S. were collectively spending between $26B and $41B per year on edtech (CGCS, 2020; EdTech Evidence Exchange, 2020; Simba Information, 2019). Then, during the pandemic, our national response included sharp increases to school spending on edtech for new devices, user licenses, and professional development (Bushweller, 2020; Tamez-Robledo, 2020). Currently, schools are planning for how they will spend the billions of dollars made available through the Elementary and Secondary School Emergency Relief Fund in 2020 and 2021, and we are likely to continue to see schools pour resources into edtech.

Unfortunately, this spending is not likely to have the desired impact. History suggests that a great deal of these funds will go to well-intentioned, but ultimately unsuccessful, edtech initiatives that fail because they are never properly implemented.

A great deal of edtech is materially underused or unused entirely (Baker & Gowda, 2018; LearnPlatform 2019), with evidence suggesting that approximately 60% of pre-pandemic purchases failed to meet usage goals set by schools.

Edtech also seems to be inequitably implemented. Evidence suggests students in schools with predominantly economically disadvantaged learner populations often have fewer opportunities to use technology in transformative ways or in ways that enhance higher-order thinking skills (Andrade Johnson, 2020; Warschauer & Matuchniak, 2010). (See the question below, Why is edtech selection and implementation an equity issue, to dig further into this critical problem.)

Several years ago, as the scale of this problem started to become apparent, the Jefferson Education Accelerator, our predecessor organization, joined with the University of Virginia School of Education and Human Development and the nonprofit Digital Promise to co-host the EdTech Efficacy Academic Research Symposium. There, after a year of collaboration and research, more than 275 higher education and K12 district leaders, researchers, entrepreneurs, philanthropists, investors, policymakers, and educators worked to address an essential question: How might the education sector collaborate to ensure that evidence of impact, not marketing or popularity, drives edtech selection and implementation?

At the symposium, this group of expert stakeholders examined the question of why our educators lack efficient ways to learn about each other’s experiences with the thousands of education technologies on the market.

Part of the answer is grounded in the fact that our education system is composed of more than 13,000 separate school districts. Two major barriers stand out as likely to prevent the surfacing and sharing of what works where and why between these districts:

• Educators, who are perpetually stretched thin, do not have compelling incentives to carefully document their experiences.

• Context matters, and the education sector lacks the shared language we need to describe the most important ways that our schools differ from each other.

With little to rely on beyond vendor marketing and word-of-mouth within their limited networks, overwhelmed educators understandably continue to make well-intentioned, but often unsuccessful, decisions about which tools to purchase and how to implement them in their schools. They are effectively flying blind due to a lack of contextually relevant information. In other words, it’s not that they don’t care. It’s that they don’t know.

Solving this problem requires leadership, movement-building, collaboration, and innovation. In response, a group of education leaders launched the nonprofit EdTech Evidence Exchange, which soon led the EdTech Genome Project. The EdTech Genome Project brought together a broad and diverse set of more than 140 education stakeholders to establish consensus on the shared language and instruments we all need to describe and measure school and district edtech implementation contexts.

2. What process did the EdTech Genome Project follow to produce its deliverables?

Until now, many of the variables that likely make or break an edtech implementation have felt ineffable or anecdotal for educators. The EdTech Genome Project set out to change that by identifying, defining, and creating new ways to measure each of the 10 key context variables likely to be most associated with the success or failure of edtech implementation.

To do this, the EdTech Evidence Exchange convened a diverse technical working network of more than 140 researchers, practitioners, experts, system leaders, and industry representatives and sought feedback from thousands of educators at multiple stages in the three-year research initiative. See How did each of the individual councils, committees, and working groups contribute to the EdTech Genome Project? to learn more about each group’s role.



[Diagram: EdTech Genome Project structure. A Steering Committee, Research Council, Measurement Council, Stakeholders Advisory Board, and Industry Council, supported by a dedicated project manager and the University of Virginia research team, oversee working groups for each of the 10 selected variables.]


The EdTech Genome Project’s process of reviewing literature, conducting data collection and analysis, recruiting participants, achieving consensus, soliciting feedback, and engaging in validation studies with representative samples of educators is detailed in the Showing Our Work section of the report. The committees, councils, and working groups comprising the EdTech Genome Project reached sector-wide agreement on two key contributions to the field:

1. The EdTech Context Framework - a common language for naming and defining 10 of the most important “context variables” that are likely to explain how school and district environments vary from one another in selecting and implementing learning technologies.

2. The EdTech Context Inventory - new measurement instruments for each of the 10 selected context variables. These tools will allow researchers and educators to describe school and district environments in data-anchored ways and will match educators who work in similar schools and districts. Together, the 10 instruments form a single comprehensive survey that provides a nuanced, quantitative portrait of an implementation context.

The EdTech Context Framework and the EdTech Context Inventory form the backbone of the information sharing process for the EdTech Evidence Exchange Platform, where educators across the country will be able to access relevant research evidence from other educators working in schools and districts like their own. These tools will help educators identify, define, and improve critical levers for increasing the likelihood of successful edtech selection and implementation, and they are intended to directly support educators’ practice.

The EdTech Context Framework and the EdTech Context Inventory are also resources for researchers to consistently and systematically measure and share knowledge about critical context variables that are likely to moderate edtech implementation success. The EdTech Context Inventory is a concrete resource to support future studies of edtech.

The graphic below shows the 10 context variables selected during the EdTech Genome Project’s process, for which we created shared language, definitions, and measurement instruments. There is also a space for additional variables to be discovered in the future.


With the consensus-driven EdTech Context Framework and the EdTech Context Inventory, we are now on the road to learning from each other at scale. Getting this right, given the increased prevalence of technology in K12 education, could improve learning opportunities for tens of millions of students, while saving billions of taxpayer dollars that would have been wasted on failed edtech implementations.

For a full, detailed description of the EdTech Genome Project processes, review the Showing Our Work section of the report.


3. How is the EdTech Evidence Exchange using the EdTech Genome Project’s deliverables in the real world?

The EdTech Evidence Exchange is building a software platform through which we intend to systematically collect context-rich data from hundreds of thousands of educators. The EdTech Genome Project’s new definitions and instruments—the EdTech Context Framework and Inventory—are the backbone of the EdTech Evidence Exchange Platform.

The Exchange Platform will also collect information about educators’ experiences selecting and implementing education technologies. These data will be automatically converted into an infographic-style implementation report.

The Exchange Platform will match educators to evidence from others who are working in similar school districts, as defined by the 10 key context variables. This data collection and automated analysis will make it possible for educators to learn from the experiences of educators like them nationwide, without the time delay of traditional analysis and publications cycles.
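To illustrate how such matching could work in principle, here is a minimal sketch that ranks contexts by the similarity of their score profiles. It assumes the hypothetical one-score-per-variable portraits sketched earlier; the profile format and the Euclidean distance metric are assumptions for illustration, not the Exchange Platform's actual design.

```python
# Hypothetical sketch of matching by context similarity; the profile format
# and the distance metric are illustrative assumptions, not the platform's API.
from math import sqrt

def distance(a: dict[str, float], b: dict[str, float]) -> float:
    """Euclidean distance across shared context-variable scores (lower = more similar)."""
    return sqrt(sum((a[v] - b[v]) ** 2 for v in a))

def most_similar(target: dict[str, float],
                 candidates: list[tuple[str, dict[str, float]]],
                 k: int = 5) -> list[tuple[str, dict[str, float]]]:
    """Return the k candidate contexts whose score profiles are closest to the target."""
    return sorted(candidates, key=lambda named: distance(target, named[1]))[:k]

# Example with two of the 10 variables, scored on an assumed 1-5 scale.
my_context = {"teacher_agency": 3.2, "professional_learning": 4.1}
others = [
    ("district_a", {"teacher_agency": 3.0, "professional_learning": 4.3}),
    ("district_b", {"teacher_agency": 1.5, "professional_learning": 2.0}),
]
print(most_similar(my_context, others, k=1))  # district_a is the closer match
```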

To populate the Exchange Platform with detailed context and implementation reports, the Exchange will collaborate with funding partners to pay professional-grade stipends to hundreds of thousands of educators willing to contribute their insights, a 45-60 minute time commitment per educator.

Unless somebody provides these educators with the incentive and support necessary for them to document their work, it simply will not happen.

4. How can other key stakeholders use the EdTech Genome Project’s deliverables?

Multiple education stakeholders can and should begin using the new definitions and instruments created by the EdTech Genome Project:

Researchers
• Review the 10 selected variables and definitions in this report. Use the 2-page Researcher Action Steps to facilitate discussions with colleagues.

• Incorporate the 10 new quantitative, self-report instruments in the EdTech Context Inventory into your existing and upcoming research studies to capture highly relevant information about school and district contexts.

• Reach out to the EdTech Evidence Exchange and UVA research team at [email protected] to collaborate on future edtech implementation research.


School and District Administrators
• Review the 10 selected variables and definitions in this report. Use the 2-page Educator Action Steps to initiate important conversations with colleagues at upcoming staff meetings.

• Visit edtechevidence.org to express interest in gaining early access to implementation reports from schools and districts like yours, as well as the opportunity to be among the first to use the new tools to better understand your own school and district.

Classroom Educators and Instructional Support Staff
• Review the 10 selected variables and definitions in this report. Use the 2-page Educator Action Steps to facilitate discussions with colleagues.

• Sign up at edtechevidence.org to find out about opportunities to earn stipends for documenting your context and edtech implementation experiences.

Funders
• Contact the Exchange to discuss how we can collaborate to collect research evidence from tens of thousands of educators who are actively working in the content areas and/or geographic regions that are your areas of focus.

• Set future research agendas based on aggregate data about educators’ edtech contexts and implementation experiences collected via the EdTech Evidence Exchange Platform.

Policy-makers
• Inform policies with aggregate data about educators’ edtech contexts and implementation experiences collected via the EdTech Evidence Exchange Platform.

• Require future research engagements to collect data using the 10 new instruments.

• Investigate whether research engagements already underway can be augmented to also collect data using the 10 new instruments.

• Develop future policies that focus on documenting context and edtech use.

Industry
• Review the selected variables and definitions in this report. Use the 2-page Industry Action Steps to facilitate discussions with colleagues and school and district partners.

• Engage a subset of your clients in conversations about their perceptions of their school and district contexts: How are they doing on each of the 10 variables, and how do you and they see context impacting the implementation of your products?

• Begin to examine how your products’ performance varies in different contexts. Which of the variables appears most important to successful implementation?


Press
• Expand coverage of edtech selection and implementation processes.

• As you interview education leaders, ask them questions about their local school and district contexts and their perceptions of how their schools are doing on each of the 10 variables. Are they aware of their strengths and weaknesses? What are they doing to improve?

5. Why did the EdTech Genome Project focus on implementation contexts?

When an edtech product thrives in one school but flops in another, the reason can’t only be about the product. It has to be something about the difference between the two schools. But what?

We know from theories of technology adoption and diffusion that variables describing the settings where technology implementation takes place (e.g., available professional learning & devices), as well as variables describing the educators who carry out an implementation (e.g., beliefs about technology), matter for implementation success (Straub, 2009). These variables describe the implementation context before a technology is even selected, and many are likely to play a role in the success or failure of any given edtech implementation.

Some of these variables are concrete or structural—like the functionality of available devices or the availability of on-demand technology support. Other critically important variables are less tangible and more human-centered, such as teachers’ beliefs about the value of technology, the strategic support offered by leaders, and the sense of agency teachers feel in decision-making processes (Ertmer, 1999).

By defining and capturing important structural and human-centered variables, the EdTech Genome Project is contributing a common language that can undergird and strengthen much-needed research on which edtech tools work where and why.

6. Why is edtech selection and implementation an equity issue?

Technology has the potential to support historically underserved students (Andrade Johnson, 2020; Blanchard et al., 2016; Hull & Duch, 2017; Zielezinski & Darling-Hammond, 2016), and we must do better to meet their needs. However, evidence suggests this isn’t happening; students in schools with predominantly economically disadvantaged learner populations, which are also predominantly Black and Latinx (Koball & Jiang, 2018), tend to experience lower quality technology implementation than their peers (Andrade Johnson, 2020; Dolan, 2016; Warschauer & Matuchniak, 2010). Teachers of low-income students less frequently use technology to support higher-order thinking skills (Andrade Johnson, 2020) and more often use technology for a drill-oriented style of instruction that is less associated with achievement gains (Kulik & Fletcher, 2016; Reinhart et al., 2011). As a result, students are not only falling behind in core content areas but are also unprepared to excel in a digitally driven world (IEA, 2019). This problem is referred to as the second-level digital divide (Hohlfeld et al., 2017).

Much has been written about the digital divide, which, in part, describes the inequitable distribution of Internet access and computing devices (Dolan, 2016). However, the EdTech Genome Project is addressing the second-level digital divide, which captures disparities in how technology is used. Inequities are likely perpetuated by a "wild west" status quo of edtech selection and implementation. When tens of billions of public dollars allocated for edtech are spent on tools that collect digital dust, barely used or not used at all, that squandering of scarce resources disproportionately harms vulnerable students, who cannot afford widening opportunity gaps.

The COVID-19 pandemic exacerbated these gaps (Kuhfeld et al., 2020). At the start of the pandemic in spring 2020, nearly all in-person K12 education ceased, and instruction became almost completely dependent on edtech platforms and tools. This literally brought the problem of poor edtech implementation home to tens of millions of families.

As we emerge from the pandemic’s historic disruption, teachers say they need edtech like never before. In a summer 2020 national survey by the EdTech Evidence Exchange and the University of Virginia School of Education and Human Development, we learned that educators overwhelmingly see individualization and technology in education on the rise (EdTech Evidence Exchange, 2020). It is absolutely critical that all students experience high-quality implementations of technologies that are an appropriate fit for their needs and support targeted outcomes.

[Infographic from the survey: the shares of educators who believe students will need more individualized instruction to meet their needs, and who believe technology needs will increase or significantly increase over the next 3 years.]


The increasing centrality of technology in education makes it even more urgent that decision makers responsible for selecting and deploying technologies understand how those tools will work in their specific contexts. In developing the EdTech Context Inventory, the EdTech Genome Project aimed to produce instruments that capture diverse implementation contexts across the country and document inequities in those contexts. The Education Trust completed an external equity review of the EdTech Context Framework and the EdTech Context Inventory to support this effort.

7. How sure are you that these 10 variables are the most important ones? How do they rank in relative importance?

We are confident that these 10 context variables are important. We do not yet know how important they are relative to each other or how important other variables may be.

As detailed in the Showing Our Work section of the report, the EdTech Genome Project Steering Committee analyzed extensive research and feedback before unanimously selecting these 10 variables from a field of approximately 60 individual- and setting-level context variables. These 10 are the first to have new definitions and measurement instruments created for them; they won’t be the last. We hope the EdTech Context Framework, which defines these variables, and the EdTech Context Inventory, which measures them, will catalyze more focused data collection and analysis on edtech implementation. Through this work, we expect to identify additional variables that influence edtech implementation.

Time will tell if these 10 variables are indeed the most strongly or consistently associated with edtech implementation success or failure, as well as how they rank among each other. For now, they are a major step toward using common language to describe and define the essential nuances of implementation contexts.

8. How did each of the individual councils, committees, and working groups contribute to the EdTech Genome Project?

Steering Committee
This diverse group of education leaders, selected both by application and by appointment, made key decisions to form national consensus on top context variables for edtech implementation. The EdTech Genome Steering Committee had authority to guide and approve the final deliverables of the 10 working groups that developed definitions and measurement instruments for each of the context variables selected to be studied first.


Advisory Board
These senior leaders from across the education sector provided guidance on project strategy, participant recruitment, and the content of the EdTech Context Framework and the EdTech Context Inventory.

Variable Working Groups
These 10 working groups each took one context variable selected by the Steering Committee for further study. Each working group spent half a year developing short and long definitions, as well as a draft instrument, for their variable. The 10 harmonized instruments comprise the EdTech Context Inventory.

Measurement Council
These senior measurement experts reviewed and revised the 10 draft instruments developed by the 10 variable working groups. Their contributions shaped the final EdTech Context Inventory.

Industry Council
This group met quarterly to provide feedback and industry perspectives on each stage of the initiative.

Research Council
This group of advisors provided strategic advice on the content and process of the EdTech Genome Project’s deliverables, as well as guidance on positioning the initiative’s work for long-term adoption by the research sector.


References

Andrade Johnson, M. D. S. (2020). Digital equity: 1:1 technology and associated pedagogy. In Handbook on Promoting Social Justice in Education (pp. 1609-1639). Springer International Publishing. https://doi.org/10.1007/978-3-030-14625-2_142

Baker, R. S., & Gowda, S. M. (2018). The 2018 technology & learning insights report: Towards understanding app effectiveness and cost. BrightBytes. https://www.brightbytes.net/resources-archive/insightsreport2018

Barton, E. A., & Brown, D. (2021). Evidence-informed decision-making about edtech is within reach. Educational Leadership, 78(8).

Blanchard, M. R., LePrevost, C. E., Tolin, A. D., & Gutierrez, K. S. (2016). Investigating technology-enhanced teacher professional development in rural, high-poverty middle schools. Educational Researcher, 45(3), 207–220. https://doi.org/10.3102/0013189X16644602

Bushweller, K. (2020). How COVID-19 is shaping tech use. What that means when schools reopen. Education Week. https://www.edweek.org/technology/how-covid-19-is-shaping-tech-use-what-that-means-when-schools-reopen/2020/06

Chauhan, S. (2017). A meta-analysis of the impact of technology on learning effectiveness of elementary students. Computers & Education, 105, 14-30. https://doi.org/10.1016/j.compedu.2016.11.005

Council of the Great City Schools (CGCS). (2020). Managing for results in America’s great city schools. https://www.cgcs.org/cms/lib/DC00001581/Centricity/Domain/4/Managing%20for%20Results%20in%20Americas%20Great%20City%20Schools%202020.pdf

Dolan, J. E. (2016). Splicing the divide: A review of research on the evolving digital divide among K-12 students. Journal of Research on Technology in Education, 48(1), 16-37. https://doi.org/10.1080/15391523.2015.1103147

EdTech Evidence Exchange. (2020). Technology as a pandemic recovery resource for educators. https://edtechevidence.org/wp-content/uploads/2021/07/Technology_Pandemic_Recovery_Resource_EdTechEvidenceExchangeUVA_8_2020v3.pdf

Epstein, B., Rush, C., & Slyhuis, D. (2017). Crowdsourcing efficacy research and product reviews. https://symposium.curry.virginia.edu/wp-content/uploads/2017/07/Crowdsourcing-Efficacy-Research-and-Product-Reviews.pdf

Ertmer, P. (1999). Addressing first- and second-order barriers to change: Strategies for technology integration. Educational Technology Research and Development, 47(4), 47–61. https://doi.org/10.1007/BF02299597

Escueta, M., Quan, V., Nickow, A. J., & Oreopoulos, P. (2017). Education technology: An evidence-based review (NBER Working Paper Series No. 23744). National Bureau of Economic Research. https://www.nber.org/papers/w23744


Gallup. (2019). Education technology use in schools. https://www.newschools.org/wp-content/uploads/2020/03/Gallup-Ed-Tech-Use-in-Schools-2.pdf

Hohlfeld, T. N., Ritzhaupt, A. D., Dawson, K., & Wilson, M. L. (2017). An examination of seven years of technology integration in Florida schools: Through the lens of the Levels of Digital Divide in Schools. Computers & Education, 113, 135–161. https://doi.org/10.1016/j.compedu.2017.05.017

Hull, M., & Duch, K. (2017). One-to-one technology and student outcomes (IZA Discussion Papers No. 10886). IZA Institute of Labor Economics. http://ftp.iza.org/dp10886.pdf

Hulleman, C. S., Burke, R. A., May, M., Daniel, D. B., & Charania, M. (2017). Merit or marketing?: Evidence and quality of efficacy research in educational technology companies. [White paper produced by Working Group D for the EdTech Academic Efficacy Symposium]. University of Virginia.

IEA. (2019). Results of the international computer and information literacy study [ICILS 2018 Infographics]. https://www.iea.nl/studies/iea/icils

Koball, H., & Jiang, Y. (2018). Basic facts about low-income children: Children under 18 years, 2016. National Center for Children in Poverty, Columbia University Mailman School of Public Health. http://www.nccp.org/publications/pub_1194.html

Kuhfeld, M., Soland, J., Tarasawa, B., Johnson, A., Ruzek, E., & Liu, J. (2020). Projecting the potential impacts of COVID-19 school closures on academic achievement. Educational Researcher, 49(8), 549-565. https://doi.org/10.3102/0013189X20965918

Kulik, J. A., & Fletcher, J. D. (2016). Effectiveness of intelligent tutoring systems: A meta-analytic review. Review of Educational Research, 86(1), 42-78. https://doi.org/10.3102/0034654315581420

LearnPlatform. (2019). Edtech insights: 2019 usage trends report. https://static1.squarespace.com/static/56339016e4b095e84e825b9c/t/5dd351f0c2af6160a08722d5/1574130160421/2019+Usage+Trends+Report+_+LearnPlatform.pdf

National Center for Education Statistics (NCES). (2019). The nation’s report card: Achievement gaps dashboard, 2019. https://www.nationsreportcard.gov/dashboards/achievement_gaps.aspx

Reinhart, J. M., Thomas, E., & Toriskie, J. M. (2011). K-12 teachers: Technology use and the second level digital divide. Journal of Instructional Psychology, 38(3/4), 181-193.

Simba Information. (2019). Publishing for the preK-12 market 2019-2020. https://www.simbainformation.com/Publishing-PreK-12552111/


Tamez-Robledo, N. (2020). COVID-19 is pushing school tech departments to their limits - and then some. EdSurge. https://www.edsurge.com/news/2020-11-10-covid-19-is-pushing-school-tech-departments-to-their-limits-and-then-some

Wade, W. Y., Rasmussen, K. L., & Fox-Turnbull, W. (2013). Can technology be a transformative force in education? Preventing School Failure, 57(3), 162-170. https://doi.org/10.1080/1045988X.2013.795790

Warschauer, M., & Matuchniak, T. (2010). New technology and digital worlds: Analyzing evidence of equity in access, use, and outcomes. Review of Research in Education, 34(1), 179-225. https://doi.org/10.3102/0091732X09349791

Zielezinski, M. B., & Darling-Hammond, L. (2016). Promising practices: A literature review of technology use by underserved students. Stanford Center for Opportunity Policy in Education. https://www.researchgate.net/publication/304040744_Promising_Practices_A_Literature_Review_of_Technology_Use_by_Underserved_Students


THE EDTECH GENOME PROJECT
Glossary


“All” Students
Students from all races, ethnicities, creeds, geographic locales (rural, urban, suburban), socioeconomic statuses, grades, genders, sexual orientations, or other distinguishing characteristics; students who are identified as English language learners, special education, gifted and talented, or other distinguishing categories of services

“All” Teachers
Teachers from all races, ethnicities, creeds, geographic locales (rural, urban, suburban), socioeconomic statuses, grades, genders, sexual orientations, or other distinguishing characteristics; teachers who instruct all content areas, grade bands, and students identified as English language learners, special education, gifted and talented, or other distinguishing categories of services

District Leaders
The superintendent, assistant superintendent, principal supervisors, and those central office staff who support edtech at the district level

EdTech Context Framework (i.e., Context Framework)

Names and definitions for 10 of the most important “context variables” that are likely to explain how school environments vary from one another when it comes to selecting and implementing education technology

EdTech Context Inventory (i.e., Context Inventory)

Quantitative, self-report instruments for each of the 10 variables defined in the EdTech Context Framework

EdTech Evidence Exchange Platform (i.e., Exchange Platform)

An online platform where educators across the country will be matched to relevant research evidence from educators like them, based on the 10 variables defined in the Context Framework and measured in the Context Inventory


Education Technology (i.e., edtech)

Education technology (edtech) is broadly defined. This includes: content-less tools, content-only tools, and content + platform tools; free tools and paid tools; and software tools and hardware tools. However, in the EdTech Genome Project work, education technology only includes technologies used in instruction, as opposed to technologies that purely support educator workflow (e.g., an online teacher evaluation instrument).

Educators
Trained professionals who play a role in the development and/or carrying out of instructional programs/activities and curriculum for students in a given school, school district, or school system; specifically, teachers and leaders/administrators

School Leaders
Administrators such as principals, assistant principals, and instructional coaches, as well as teacher leaders such as department chairs and heads of project-specific task forces or working groups, who support edtech at the school level

Staff
Those who work in a school or district and either:

• are tasked with overseeing, developing, and/or carrying out curriculum and instruction or professional development programs (i.e., teachers and leaders); or

• work on the internal operations and organizational needs of a school or district

Stakeholders
The collection of local individuals and entities that are directly invested in and affected by policies of a particular school district such as families, students, faculty, staff, contracted businesses and individuals, community members, or governing bodies

Teachers
Trained professionals who interact on an instructional basis with students

Technology Implementation

The broad and long-term systematic process of carrying out the adoption of a given technology by incorporating it into the instructional routines of faculty and staff

Technology Integration
The installation of a specific technology into a particular system or workflow


THE EDTECH GENOME PROJECT
Supplemental Resources

Researcher Action Steps

Educator Action Steps

Industry Action Steps


EdTech Genome Project: Researcher Action Steps

Integrate new instruments into your research to consider the influence of context on edtech implementation.

Why does this matter?

Context matters deeply when it comes to implementing edtech products and services. Researchers have been investigating the influence of context on edtech implementation for a number of years. However, variations in language and instruments present a challenge for aggregating findings and translating those findings to practice.

The EdTech Evidence Exchange, a nonprofit working closely with University of Virginia researchers, organized a coalition of educators and education stakeholders to reach consensus on which 10 context variables appear to matter most for edtech implementation success or failure. They also developed new measurement instruments, one for each variable. These instruments can help the field measure and build knowledge about consistent constructs.

Read the EdTech Genome Project Report for comprehensive definitions of each variable.

What should researchers do?

1. Consider incorporating the EdTech Context Inventory - the 10 new quantitative, self-report instruments - into your existing and upcoming research studies to capture highly relevant information about school and district contexts.

2. Contact the EdTech Evidence Exchange at [email protected] for access to the new instruments. Your work could contribute to validating these instruments with diverse educator populations.

3. Reach out to the EdTech Evidence Exchange and UVA research team at [email protected] to collaborate on future edtech implementation research.


Definitions of Context Variables

1. The vision for teaching and learning unifies stakeholders with clear direction, purpose, and rationale for technology-supported learning. A high-quality vision is forward-thinking and actionable, and to have effect, must be consistently communicated and referenced as a guide for action. Visioning helps schools and districts recognize opportunities for technology to address problems of practice, prioritize equity, and plan for technology integration that promotes student learning opportunities. Visions describe the ideal state of teaching and learning for all students in which digital technologies transform daily life.

2. Selection processes occur prior to procurement and are the presence and quality of consistent methods through which classrooms/schools/districts/states identify technologies, evaluate those technologies, and choose technologies for procurement to meet established student and teacher needs for learning and instruction.

3. Teacher agency is the extent to which teachers consistently have a voice in shaping their work and the conditions and tools for that work. Regarding education technology implementation, this is the extent to which the conditions for agency are in place and a variety of teachers are consistently involved in decision-making related to shared visioning, selection processes, implementation processes, infrastructure, and professional learning.

4. Infrastructure and operations are the enabling conditions that lower barriers for implementation, facilitate uptake, and support scaling and sustaining new education technology. These conditions include physical resources, broadband Internet connectivity, students’ remote devices and connectivity, human resources, system specifications, operational policies, and funding.

5. Implementation systems and processes occur after procurement and are the presence and quality of methods through which school communities put education technology into effect over time to achieve intended outcomes. This includes mechanisms for monitoring ongoing fit with current initiatives, conducting resource inventories, monitoring the ongoing use of the technology as it was designed, making systemic adjustments as needed, and documenting evidence of impact on target outcomes.

6. Staff culture refers to the set of beliefs, values, norms, and assumptions that are shared collectively by the school and/or district staff and that influence the way in which staff members work individually and collaboratively to fulfill the school’s shared vision for teaching and learning. Important facets of staff culture include trust, social capital, communication, and equity.

7. Teacher beliefs and knowledge are an individual teacher’s perceived ability to use education technologies and integrate them into their practice. This variable combines (1) teachers’ beliefs about, knowledge about, and experiences using education technology and (2) teachers’ understanding of curriculum, instruction, and assessment. Together, these elements interact to enable the comfort and flexibility necessary to use education technologies effectively and appropriately in different learning settings.

8. Strategic leadership support is the extent to which district and school leaders provide explicit encouragement and guidance to staff who are selecting and implementing education technology tools. This support sets and communicates a vision, develops staff, and aligns technology implementation with the district instructional plan.

9. Professional learning is the presence, duration, and quality of a range of intentional, adult learning activities that support the effective integration of education technology to advance student learning and outcomes. This includes both formal and informal opportunities that lead to shifts in beliefs, knowledge, skills, and practices related to technology integration.

10. Competing priorities are the extent to which a school or district has other prioritized initiatives that impact the available time and attention for new technology implementations. The presence of competing priorities is influenced by limited instructional time, limited preparation time, overlapping initiatives, and communication of priorities.


EdTech Genome Project: Educator Action Steps

Learn about your context to improve edtech success in your school or district.

Why does this matter?

Context matters deeply when it comes to implementing edtech products and services. Things that work well in some schools often fail to gain traction in other schools. Why? Because our schools and districts (i.e., our contexts) vary from each other.

The EdTech Evidence Exchange, a nonprofit working closely with University of Virginia researchers, organized a coalition of educators and education stakeholders to reach consensus on which 10 context variables appear to matter most for edtech implementation success or failure. They also developed new measurement instruments that we can all use to better understand our contexts.

What should educators do?

1. Read the EdTech Genome Project Report for comprehensive definitions of each variable. Then, initiate important conversations with your district- or school-based team: Which of these variables are our strongest/weakest? Why? How has our strength/weakness related to those variables affected our past attempts to implement edtech tools? Which variable would be the easiest to improve? Who can we work with to help us improve on these variables?

2. Visit edtechevidence.org to get involved and access edtech implementation wisdom from schools and districts across the country that are similar to yours in the ways that matter most.



EdTech Genome Project: Industry Action Steps

Understand school and district contexts to increase your partners’ success with your edtech products and services.

Why does this matter?

Context matters deeply when it comes to implementing edtech products and services. Most industry teams have firsthand experience and data illustrating that variations in school and district contexts are associated with edtech success as much as attributes of the product(s) or service(s). Differences in context matter substantially for successful implementation and positive impact on student outcomes.

The EdTech Evidence Exchange, a nonprofit working closely with University of Virginia researchers, organized a coalition of educators and education stakeholders to reach consensus on which 10 context variables appear to matter most for edtech implementation success or failure. They also developed new measurement instruments that we can all use to better understand school and district contexts. Industry teams can use these insights to inform practices and enable success for edtech products and services.

What should industry do?

1. Read the EdTech Genome Project Report for comprehensive definitions of each variable. Then, initiate important conversations with your team: Which of these variables do partners struggle with most? Why? Which of these variables do you/your team perceive the impact of most? Why? How and why might you interest partners in taking the EdTech Context Inventory? Who could your team work with to help a partner improve one or more variables?

2. Incorporate the 10 context variables into your processes to enhance shared understanding with common language about context among educators, researchers, policy makers, and industry.

3. Initiate conversations about the variables with current or planned school or district partners to better understand the likelihood of implementation success in a specific context.

4. Use the EdTech Context Inventory in existing and upcoming formal research studies. Contact the EdTech Evidence Exchange at [email protected] for access to the new instruments and guidance on how to use them.
