Improving Quality Education and Children’s Learning Outcomes and Effective Practices in the Eastern and Southern Africa Region Report for UNICEF ESARO



Tim Friedman, Ursula Schwantner, Jeaniene Spink, Naoko Tabata and Charlotte Waters
Australian Council for Educational Research (ACER), 2016
Commissioned by UNICEF Eastern and Southern Africa Regional Office (ESARO), Basic Education and Gender Equality (BEGE) Section.

©2016 United Nations Children’s Fund (UNICEF)

Cover photo: Françoise d’Elbee ©UNICEF Kenya/2009/d’Elbee

Permission is required to reproduce any part of this publication. Permission will be freely granted to educational or non-profit organizations. Others may be requested to pay a small fee. Requests should be addressed to: Basic Education and Gender Equality Section, Eastern and Southern Africa Regional Office, UNICEF; Tel: +254 207-622-307 email: [email protected].


Contents

Lists of tables and figures
Acronyms and abbreviations
Acknowledgements
Foreword from the Eastern and Southern Africa Regional Director
Executive summary
Introduction
  Context for primary education in the ESA region
  Conceptual framework
1. Stock-taking and comparative analysis of existing assessments in the ESA region
  1.1 Overview of assessments in the stock-taking
  1.2 Comparative analysis of assessments
2. Literacy and numeracy in primary education in the ESA region: Students experiencing limited learning outcomes and trends over time
  2.1 Characteristics of students experiencing limited learning outcomes in literacy and numeracy in primary education in the ESA region
  2.2 Trends in learning outcomes of children in primary education in literacy and numeracy in the ESA region
3. Improving learning outcomes in the ESA region: Effective country-level practices
  3.1 Country-level programmes analysed
  3.2 Key strategies for success
4. A macro theory of change
References
Appendix I: Methodology
Appendix II: Detailed tables for Chapter 1
Appendix III: Country case studies
Appendix IV: Detailed tables and figures for Chapter 2
Appendix V: Detailed table for Chapter 3
Appendix VI: Main stock-taking table


Lists of tables and figures

Tables
Table 1: Participation in international assessments in ESAR countries
Table 2: Participation in regional assessments in ESAR countries
Table 3: Implementation of national assessments by government/parastatal bodies in ESAR countries
Table 4: Implementation of EGRA/EGMA in ESAR countries
Table 5: Purpose of the assessments from the stock-taking
Table 6: Target population in the assessments from the stock-taking
Table 7: Cognitive domains in assessments from the stock-taking
Table 8: Test administration methods of assessments from the stock-taking
Table 9: National assessments and use of IRT in data analysis
Table 10: Descriptions and attainment of competency levels in literacy: NASMLA in Kenya 2010
Table 11: Proportions of students defined as experiencing limited learning outcomes by assessment, country, grade and domain
Table 12: Direction of trends in SACMEQ reading and mathematics scale scores from 2000 to 2007, based on Hungi et al. (2011)
Table 13: Stock-taking framework
Table 14: Assessment programmes for which data were available for analysis
Table 15: Uwezo final sample for analysis in this report
Table 16: Assessment implementation by type of assessment
Table 17: ESAR countries with limited assessment activity in recent years
Table 18: Analytical techniques used in the assessments from the stock-taking of assessments in ESAR
Table 19: Proportion of students experiencing limited learning outcomes for each contextual variable of interest for Uwezo countries
Table 20: Proportion of students experiencing limited learning outcomes for each contextual variable of interest for PASEC countries
Table 21: Proportion of students experiencing limited learning outcomes for each contextual variable of interest for TIMSS
Table 22: Proportion of students experiencing limited learning outcomes for each contextual variable of interest for prePIRLS
Table 23: Example programmes in ESAR with a focus on improving learning outcomes in literacy and numeracy of disadvantaged children in primary education
Table 24: Main stock-taking table


Figures
Figure 1: A macro theory of change. An evidence-based monitoring and intervention cycle as premise for change: assessment, analysis, action
Figure 2: Assessments from the stock-taking by type of assessment
Figure 3: Mean percentage marks in Grade 3 mathematics by gender and province – Annual National Assessment in Kenya
Figure 4: Trends in English performance across time for students in Kenya (Uwezo)
Figure 5: Trends in mathematics performance across time for students in Tanzania (Uwezo)
Figure 6: Trends in proportions of students experiencing limited learning outcomes across Uwezo countries
Figure 7: A macro theory of change. An evidence-based monitoring and intervention cycle as premise for change: assessment, analysis, action
Figure 8: Trends in Swahili performance across time for students in Kenya (Uwezo)
Figure 9: Trends in mathematics performance across time for students in Kenya (Uwezo)
Figure 10: Trends in English performance across time for students in Tanzania (Uwezo)
Figure 11: Trends in Swahili performance across time for students in Tanzania (Uwezo)
Figure 12: Trends in English performance across time for students in Uganda (Uwezo)
Figure 13: Trends in mathematics performance across time for students in Uganda (Uwezo)


Acronyms and abbreviations

3ie International Initiative for Impact Evaluation

ACER Australian Council for Educational Research

ANA Annual National Assessment

ASALs Arid and semi-arid lands

ASLI Africa Student Learning Index

BEGE Basic Education and Gender Equality

CBA Competency-based approach

CO Country Office

CONFEMEN Conference of Ministers of Education of French-speaking Countries

DAC Development Assistance Committee

DfID Department for International Development (UK)

ECD Early childhood development

EDF Education Development Fund

EDI Education for All Development Index

EFA Education for All

EFA GMR Education for All Global Monitoring Report

EGMA Early Grade Mathematics Assessment

EGRA Early Grade Reading Assessment

ELMI Early Literacy and Maths Initiative

EMIS Education Management Information System

ESA Eastern and Southern Africa

ESAR Eastern and Southern Africa region

ESSP Education Sector Strategic Plan

ETF Education Transition Fund

GPE Global Partnership for Education

GPI Gender Parity Index

IEA International Association for the Evaluation of Educational Achievement

IEP Integrated Education Programme

IfE Innovation for Education

IIEP International Institute for Educational Planning

IRT Item response theory

LARS Learning Achievement in Rwandan Schools

LLO Limited learning outcomes

LNAEP Lesotho National Assessment of Educational Progress

MICS Multiple indicator cluster surveys

MLA Monitoring learning achievement


MoESAC Ministry of Education, Sport, Arts and Culture of Zimbabwe

MTPDS Malawi Teacher Professional Development Support

NALA National Assessment of Learner Achievement

NAPE National Assessment of Progress in Education

NASMLA National Assessment System for Monitoring Learning Achievement

NER Net enrolment rate

NLA National Learning Assessment

NSAT National Standardized Achievement Test

OECD Organisation for Economic Co-operation and Development

OVC Orphans and vulnerable children

PASEC Programme for the Analysis of the Education Systems of CONFEMEN Countries

PIRLS Progress in International Reading Literacy Study

PISA Programme for International Student Assessment

PRIMR Primary Math and Reading Initiative

RCT Randomised control trial

READ Russia Education Aid for Development

REB Rwanda Education Board

RTI Research Triangle Institute

SACMEQ Southern and Eastern Africa Consortium for Monitoring Educational Quality

SBA School-based assessment

SSME Snapshot of School Management Effectiveness

TAC Teacher Advisory Centre

TIMSS Trends in International Mathematics and Science Study

UNESCO United Nations Educational, Scientific and Cultural Organization

UNICEF United Nations Children’s Fund

UNICEF ESARO UNICEF Eastern and Southern Africa Regional Office

USAID United States Agency for International Development

ZELA Zimbabwe Early Learning Assessment

ZIMSEC Zimbabwe School Examinations Council


Acknowledgements

The Australian Council for Educational Research (ACER) was contracted by the United Nations Children’s Fund (UNICEF) to deliver a consultancy service for improving the quality of education and children’s learning outcomes and effective practices in the Eastern and Southern Africa region. This study was co-financed by UNICEF and ACER’s Centre for Global Education Monitoring.[1] The authors of this report would like to thank Manuel Cardoso, Education Specialist, Programme Division, UNICEF; Camille R. Baudot, Regional Adviser, Basic Education and Gender Equality (BEGE), UNICEF Eastern and Southern Africa Regional Office (ESARO); Mitsue Uemura, Shiraz Chakera, Inge Vervloesem, Benoit d’Ansembourg and Pablo Stansbery, Education Specialists from BEGE, UNICEF ESARO, as well as the focal persons at the 21 UNICEF Country Offices in the Eastern and Southern Africa Region (ESAR) for their support during the conduct of the study and preparation of the report.[2]

We would also like to thank the key informants who took part in the interviews conducted as part of this study: John Mugo (Regional Director of Uwezo at Twaweza); Ben Piper (Lead in the Primary Maths and Reading Initiative, PRIMR, in Kenya) and the Kenya PRIMR team; Kenneth Russell (Education Specialist, UNICEF Zimbabwe Country Office); Professor Robert McCormick (Innovation for Education Evaluation and Monitoring advisor); and Joyce Kinyanjui (Programme Manager, Women Educational Researchers of Kenya, Opportunity Schools).[3]

Thanks also to the many other individuals who provided information for this study, including Aaron Benavot (Director, GMR team, UNESCO) and Nihan Blanchy-Koseleci (UNESCO GMR); Scott Murray (DataAngel); Marc van der Stouwe (Mott MacDonald/Cambridge Education); Lucy Maina (Regional Director of Africa Educational Trust, Somalia); and Vyjayanthi Sankar (International Consultant, Quality Education, UNICEF ESARO). We would also like to acknowledge the contributions of several education authorities in the ESA region, including the Botswana Ministry of Education and Skills Development; the Botswana Examinations Council; the Ethiopia National Educational Assessment and Examinations Agency; the Examinations Council of Lesotho; the Madagascar Ministry of Education; the Malawi Ministry of Education, Science and Technology; the Mozambique Ministry of Education; the Directorate of National Examinations and Assessment Namibia; the Director-General Department of Basic Education of South Africa; the Uganda National Examinations Board; the Examination Council of Zambia; and the Zimbabwe School Examinations Council.

The authors of this report are Tim Friedman, Ursula Schwantner, Jeaniene Spink, Naoko Tabata and Charlotte Waters, with valuable contributions from Elizabeth Cassity, Mary Kimani, Alejandra Osses, and Adeola Capel.

[1] For further information about ACER’s Centre for Global Education Monitoring (GEM) visit https://www.acer.edu.au/gem.

[2] The Eastern and Southern Africa (ESA) region, as under UNICEF programming, encompasses 21 countries: Angola, Botswana, Burundi, the Comoros, Eritrea, Ethiopia, Kenya, Lesotho, Madagascar, Malawi, Mozambique, Namibia, Rwanda, Somalia, South Africa, South Sudan, Swaziland, the United Republic of Tanzania (later in this report referred to as ‘Tanzania’), Uganda, Zambia and Zimbabwe.

[3] Not all the interviews conducted for this study were included in the report. However, the contributions were highly valuable and the authors thank all interview partners for their time and information.


Foreword from the Eastern and Southern Africa Regional Director

Over the past few decades the world’s attention has been focussed on attaining Millennium Development Goal 2 – universal access to primary education. During this period, governments and the international community have been investing in school infrastructure, teacher training and learning materials. For UNICEF globally and across Eastern and Southern Africa, the challenge of our time is how to sustain the momentum in access while reinforcing quality learning outcomes. There is, today, an urgent global realisation that beyond getting children into the classroom, it is imperative that they learn.

The new Sustainable Development Goal (SDG) 4 presents huge opportunities to meet this challenge through a strategic shift towards equitable quality education for all. This shift is essential. Emerging evidence shows that large numbers of children are in school but are not learning. In 2012, the Africa Learning Barometer report by the Centre for Universal Education at Brookings estimated that of the 97 million children who enter school on time in Sub-Saharan Africa, 37 million will not learn basic skills. Thirty-seven million: one-third of all children who go to school will reach their adolescent years unable to read, write or perform basic numeracy tasks.

The analysis from Improving quality education and children’s learning outcomes and effective practices in the Eastern and Southern Africa region reaches similar conclusions. It reveals that as many as 40 per cent of children in school do not reach the expected basic learning benchmarks in numeracy and literacy. The new report also confirms that children from families with lower socio-economic status, and children whose home language differs from the language of instruction, are less likely to learn.

The findings – that many children are in school, but not learning – represent a huge waste of human and financial resources. Fundamentally, the promise of education and the transformative opportunity of schooling for children, families and communities are not being fulfilled.

UNICEF believes that confronting this learning crisis through high-impact solutions is the priority for the education community. Encouragingly, the report highlights that many countries in Eastern and Southern Africa are promoting quality by improving the monitoring of learning through national, regional and international assessments, and by developing targeted programmes that improve teaching and learning.


Given this learning crisis, acceleration of these trends is essential. UNICEF will push for an increased system-wide emphasis on outcomes as opposed to inputs; improved assessments to gauge children’s learning progress; and stronger knowledge of the pedagogic practices that can improve learning.

There is so much potential. While highlighting challenges, this report also shows elements of national progress. UNICEF encourages countries to accelerate these developments. We are cognisant of the many challenges facing education in the region – 1 in 5 children not attending school; a demographic boom in the region that will see 70 million additional children by 2030; and continued overstretched public finances. In such critical circumstances, the winning combination of access, quality learning, and affordability is ever more crucial.

It is in this context that the report provides us with a critical baseline on quality education in every country in the region. The report assesses the learning outcomes being reached in the region, the learning assessment tools countries are deploying to generate evidence on learning, and the interventions countries are implementing to improve teaching and learning.

With this report, UNICEF and our many partners will be better equipped to support improvements in quality education for children.

Leila Gharagozloo-Pakkala
Regional Director, Eastern and Southern Africa
United Nations Children’s Fund


Executive summary

Introduction

The Eastern and Southern Africa (ESA) region is progressing well towards achieving important Education for All (EFA) goals, particularly with regard to increasing student enrolment in the primary years (UNESCO, 2014). Despite this achievement, there is still considerable work to be done to improve the quality of education. Primary school students in low-income sub-Saharan African countries have, on average, learned less than half of what is expected of them (Majgaard and Mingat, 2012, p.6). The gap between the learning achievements in developed economies and the learning achievements in Eastern and Southern Africa is estimated to be at least four grades (GPE, 2012, p.116).

In order to understand the major impediments to student learning in the region, the United Nations Children’s Fund (UNICEF) contracted the Australian Council for Educational Research (ACER) to take stock of and compare existing student assessments in the region, focusing on students in primary education. The terms of the contract called for ACER to study the existing assessment systems and methodologies in the region, and document how the assessment data are derived and used to inform education policy in the region. We were also asked to identify factors and practices that could help improve learning outcomes in literacy and numeracy in primary education, specifically for disadvantaged children with limited learning outcomes (LLOs).

Our study consists of three research components. The first provides an overview and comparative analysis of the existing assessments of student learning outcomes in literacy and numeracy in primary education in the region. The second considers the characteristics of children experiencing LLOs in the domains of literacy and numeracy, including trends in achievement over time. The third looks at effective country-level practices in the ESA region that could improve learning outcomes in the literacy and numeracy of disadvantaged children in primary education. Our report concludes with a macro theory of change drawing on the evidence we gathered for this report.


Comparative analysis of existing assessments in the ESA region

Our study covered 23 countries and identified 58 existing assessment systems that evaluate student learning outcomes in literacy and numeracy in primary education. Of these, EGRA and EGMA are the most prevalent programmes (36 per cent), followed by regional (29 per cent), national (28 per cent) and international (7 per cent) assessments. Most of these assessments target lower-primary students (Grade 2 or 3) and most commonly focus on literacy and numeracy. While these assessments commonly report mean scores for the cognitive results or frequency analyses for the contextual data, item response theory (IRT) methods, which can scale data and meaningfully compare results across grades, contexts and time, are less prevalent. Contextual data linked to the cognitive results are available for many, but not all, of the assessments; where such data are missing, it is difficult to draw policy-related findings from the results. Not captured in our stock-taking was whether, and to what extent, student assessment data are linked to Education Management Information System (EMIS) data at the system level. Access to data is a challenge: while the results of 71 per cent of the assessments were published, we were unable to obtain the original datasets for the remainder.
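The scaling advantage of IRT mentioned above can be illustrated with the simplest case, the one-parameter (Rasch) model, in which the probability of a correct response depends only on the difference between a student’s ability and an item’s difficulty, both on a common logit scale. The Python sketch below is illustrative only; the function name and example values are ours and are not taken from any of the assessments discussed.

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """One-parameter (Rasch) IRT model: probability that a student with
    the given ability answers an item of the given difficulty correctly.
    Because ability and difficulty share one logit scale, results can be
    compared across grades, test forms and assessment cycles."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A student whose ability equals the item difficulty has a 50 per cent
# chance of success; easier items yield higher probabilities.
p_matched = rasch_probability(0.0, 0.0)   # 0.5
p_easy = rasch_probability(0.0, -1.0)     # about 0.73
p_hard = rasch_probability(0.0, 2.0)      # about 0.12
print(round(p_matched, 2), round(p_easy, 2), round(p_hard, 2))
```

Mean-score reporting, by contrast, is tied to the particular test administered, which is why assessments relying on it cannot meaningfully compare results across grades or over time.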

Students experiencing LLOs and trends over time

The objective of this component of the study was to investigate the characteristics of children experiencing LLOs and trends in their performance over time. In 32 of the 58 assessments, competency-level benchmarks are defined. However, each of the assessments analysed (PASEC, Uwezo, TIMSS and prePIRLS) used different metrics for literacy and numeracy, so there is no shared benchmark that could be used to construct a common definition of ‘limited learning outcomes’. Instead, we employed the benchmarks each assessment used to gauge literacy and numeracy. Given the differences in these metrics across datasets, countries and year levels, the percentage of students identified with LLOs ranges widely: from 18 to 40 per cent in numeracy, and from 18 to 50 per cent in literacy.
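Once an assessment’s own benchmark is chosen, classifying students as experiencing LLOs reduces to computing the share of scores below that cut-point. A minimal sketch with hypothetical scores and a hypothetical cut score of 400; none of the values come from PASEC, Uwezo, TIMSS or prePIRLS:

```python
def proportion_limited(scores, benchmark):
    """Share of students scoring below an assessment-specific competency
    benchmark, i.e. classified as experiencing limited learning outcomes
    (LLOs) under that assessment's own metric."""
    if not scores:
        raise ValueError("no scores provided")
    below = sum(1 for s in scores if s < benchmark)
    return below / len(scores)

# Hypothetical scale scores and a hypothetical cut score of 400.
scores = [312, 405, 388, 450, 290, 510, 399, 421]
print(round(proportion_limited(scores, 400), 2))  # 0.5
```

Because each assessment supplies its own `benchmark`, the resulting proportions are comparable within an assessment but not across assessments, which is exactly the limitation noted above.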

In international and regional assessments for the ESA region, average test scores for literacy and numeracy are generally low, with a considerable percentage of students not having acquired basic skills in reading and mathematics. In Lesotho, for example, only 48 per cent of students have achieved basic reading skills by Grade 6. In Zambia and Malawi, only 27 per cent of students achieved this level. In mathematics, the proportion of primary students with basic skills is considerably lower: in two-thirds of the countries, fewer than 50 per cent of Grade 6 students achieved the minimum level (UNESCO, 2014, p.35).


Consistent with other studies conducted in the region, individual and family characteristics of students, such as gender, age, language spoken at home, socio-economic factors, preschool attendance, activities prior to attending school, engagement and out-of-school tuition, were all found to be associated with the likelihood that a student would experience LLOs in literacy or numeracy. In addition, the type, location and resourcing of the school that a student attends also contribute to this likelihood.

Our study showed that, in general, males are more likely than females to experience LLOs in literacy. In Botswana, for example, males are almost three times more likely than females to experience LLOs. While girls outperformed boys in reading literacy overall, rural boys outperformed rural girls on almost all tasks; in urban schools the opposite was the case (RTI, 2010, p.37). In mathematics, however, boys on the whole outperformed girls.

The age of the student relative to the school entry grade is an equally important factor. While the relationship between age and performance is complex and may be shaped by different socio-economic and demographic factors, Hungi et al. (2014) found that in developed countries older students generally outperform their younger peers, while in developing countries, especially in Africa, younger students perform better than older students. Our study supported this insight. Across the region, we found that students who were younger than the median class age tended to be less likely to experience LLOs. For instance, Grade 6 students in Botswana aged 12 or younger were almost three times less likely to experience LLOs in mathematics than older students.
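Statements such as ‘almost three times less likely’ are ratios of group proportions. A small sketch of the computation, using made-up proportions rather than the Botswana figures:

```python
def relative_risk(p_group_a: float, p_group_b: float) -> float:
    """Ratio of the proportion experiencing LLOs in group A to that in
    group B. A value near 1/3 means group A is about three times less
    likely than group B to experience LLOs."""
    if p_group_b == 0:
        raise ValueError("reference group proportion must be non-zero")
    return p_group_a / p_group_b

# Hypothetical proportions: 12 per cent of younger students vs 36 per
# cent of older students experiencing LLOs.
print(round(relative_risk(0.12, 0.36), 2))  # 0.33
```

Such ratios summarise relative disadvantage in a single number, but they depend on how each assessment defines its LLO benchmark, so they should be compared only within a single assessment.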

The language spoken at home also has a strong impact on learning outcomes. In countries where the official language is not the most common language spoken at home, there are strong links between language and marginalisation in education. Evidence from PASEC and SACMEQ shows a strong link between the home language and the language of instruction in determining test scores (Fehrler and Michaelowa, 2009; UNESCO, 2010, p.154; Garrouste, 2011). While low language skills are commonly viewed as a critical factor in literacy assessments, evidence from Namibia using SACMEQ results suggests that they also contribute significantly to low performance levels in mathematics (Garrouste, 2011, p.231).

Furthermore, the socio-economic status of students is a strong predictor of achievement. Students from lower socio-economic backgrounds were more likely to experience LLOs in both literacy and numeracy across all countries examined, despite the different measures used to assess socio-economic status. Household possessions, including the availability of reading materials and books in the home, and levels of parental education were also found to be associated with LLOs. In addition, the amount of time that students spent working was negatively associated with achievement (ACER and ZIMSEC, 2015).


Students who had limited exposure to a learning environment in the home were disadvantaged in performance at school. Learning outcomes were markedly better for students who were involved in reading and storytelling at home, were not required to work outside the home, started school early, were given adequate support by their teachers to build foundational literacy skills, and attended schools that had relevant and engaging reading and learning materials in buildings with clean water and sanitation.

Our study also considered student performance trends over time. While much of Eastern and Southern Africa has experienced a marked improvement in student enrolment, student performance has changed little over time. Indeed, on the whole, student performance has stagnated or worsened. It must be noted, however, that given the limited comparable data available for our study, drawing any general conclusions for the whole ESA region is problematic. Instead, conclusions based on improvement or decline in student abilities should only be considered at the national level.

Effective country-level practices

As part of our study, we identified a number of strategies that contributed to the success of country-level practices in the ESA region. While there is a considerable body of literature for the ESA region on practices that improve quantitative aspects of education, such as access, enrolment and retention rates, we found few reports on programmes to improve student learning outcomes.

Altogether, we identified 10 programmes, in 7 of the region's 21 countries, that had an impact on student learning in early grade literacy and numeracy. These comprised teacher training on reading/mathematics instruction; provision of teaching-learning materials; production of reading materials in the local language; and community- and home-based reading activities linked to effective ECD programmes. Additionally, programmes that pursued a whole-school improvement strategy were shown to have a significant impact on learning outcomes.

Broadly, these successful programmes use a three-pronged approach comprising assessments, teacher training and community support for children's reading. They combine well-targeted instructional interventions; regular professional development of teachers through school-level training and coaching, with regular system-level follow-up and support; sufficient relevant and quality classroom materials; and more literacy and numeracy instructional time. Having a reading buddy to support learning to read had a positive effect on learning outcomes in a number of locations. Overall, our study found that key strategies for improving the learning outcomes of disadvantaged children share two common features: a holistic and coherent approach, and consistent and continuous support over time.


A macro theory of change

Based on the evidence collected for our study, we developed a macro theory of change aimed at monitoring and improving the literacy and numeracy performance of children in primary education in the region. The theory combines the main findings of each stage of the study and highlights the ‘3As’ approach for long-term and sustainable change in student performance: assessment, analysis and action. Critical to this framework is the dissemination of the assessment results in order to initiate action by governments, communities, parents and development partners (see Figure 1).

Figure 1. A macro theory of change. An evidence-based monitoring and intervention cycle as premise for change: assessment, analysis, action

[Figure 1 depicts an evidence-based monitoring and intervention cycle operating at the school, classroom and student levels, across input, process and outcome dimensions. Its three components are:

Assessment
• Purpose: system-level monitoring
• Target population: early, multiple grades; inclusion of out-of-school children
• Domains: literacy and numeracy; contexts
• Current state and progress: performance and contexts
• Dissemination strategy: findings and products, including datasets

Analysis
Policy analysis and interpretation for strategic decision-making and policy development towards improving student performance:
• student performance levels
• association with context factors at the different levels
• trends over time

Action
• Targeted interventions and strategies
• Integrated into a holistic programme design, involving a wide range of stakeholders
• Impact evaluation, including measurement of performance]


Conclusions

Our study analysed existing student assessments, data resulting from some of these assessments, and effective country-level programmes in the ESA region.

We found that programmes targeted toward early learning in disadvantaged communities made the biggest impact. Two factors made a significant contribution to improved student performance: the level of exposure students have to a learning environment, at home or at school, in their early years; and the presence of holistic, system-level educational programmes that support quality early learning in disadvantaged communities.

While many researchers have studied the factors that contribute to student school attendance, fewer have explored what helps improve student learning. In financially constrained environments, resources should be targeted at understanding the gaps in the system with regard to student performance and supporting effective interventions.

In order to do this, policymakers must consider how learning assessment programmes that provide quality comparable data across population subsets, between grades and over time, can be integrated from the outset into education reform agendas.


Introduction

Context for primary education in the ESA region

The Eastern and Southern Africa (ESA) region presents many challenges for primary education. The main issues are poverty, health and social issues, and fragile political and economic circumstances, particularly in Burundi, the Comoros, Eritrea, Madagascar, Somalia and South Sudan (UNESCO, 2014, p. 12).4,5

Overall, the region is making progress toward achieving Education for All (EFA) goals (UNESCO, 2014). The EFA Development Index shows a considerable improvement between 2000 and 2012, with increases in the primary education completion rate, the literacy rate for those 15 years and older, and the primary enrolment of girls and boys (UNESCO, 2014, p. 16; 2015a).

However, very little attention has been paid to the quality of education, or to progress in student performance. While improvements in education can be assessed by quantitative aspects, such as access to education, enrolment and completion rates, or gender parity, such metrics do not capture education quality or, more importantly, improvement in student performance.

In most sub-Saharan African countries, average test scores in international/regional assessments of student learning are low. Primary school students in low-income, sub-Saharan African countries have, on average, learned less than half of what is expected of them (Majgaard and Mingat, 2012, p. 6). A comparison of high- and low-income countries using data from the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) and the Progress in International Reading Literacy Study (PIRLS) reveals large differences between the poorer economies in SACMEQ (for example, Lesotho, Malawi and Zambia) and the mainly high-income economies in PIRLS.6,7

The gap between the learning achievements in developed economies and the learning achievements in East and Southern Africa is estimated to be at least four grades (GPE, 2012, p. 116).8

4 Most countries in the region (16 out of 21) are classified as least developed countries under the United Nations (UN) definition <http://unohrlls.org/about-ldcs/>. Kenya is among low-income countries; Swaziland forms part of the lower-middle-income country group; and Botswana, Namibia and South Africa are among upper-middle-income countries, according to DAC ODA-recipient status (OECD Development Assistance Committee, Official Development Assistance). Source: DAC List of ODA Recipients effective as at 1 January 2015 for reporting on 2014, 2015 and 2016 flows, available at <http://www.oecd.org/dac/stats/daclist.htm>. However, there are also major disparities within these middle-income countries; see <http://www.unicef.org/esaro/theregion_old.html>.

5 <http://www.worldbank.org/en/topic/fragilityconflictviolence> viewed 5 May 2015; in 2014, Malawi was also categorised as a fragile state.

6 ESAR countries participating in SACMEQ are Botswana, Kenya, Lesotho, Malawi, Mozambique, Namibia, South Africa, Swaziland, Tanzania, Uganda, Zambia and Zimbabwe.

7 Ross (2009) scaled PIRLS and SACMEQ data using common items and a Rasch model to put the test results on the same scale, based on anchor items and test equating, thus making the data comparable across economies (GPE, 2012, p. 116).

8 Considering that PIRLS measures learning outcomes of primary school students in Grade 4, and SACMEQ in Grade 6, this potentially corresponds to two more grades of schooling (GPE, 2012, p. 116).
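The equating approach mentioned in footnote 7 rests on the Rasch model. As a minimal illustrative sketch (our illustration of standard Rasch equating, not a reproduction of Ross's (2009) exact procedure), the model gives the probability that student p answers item i correctly as a function of the student's ability \(\theta_p\) and the item's difficulty \(\delta_i\):

```latex
% Rasch model: probability of a correct response by student p on item i,
% where \theta_p is student ability and \delta_i is item difficulty
P(X_{pi} = 1 \mid \theta_p, \delta_i)
  = \frac{\exp(\theta_p - \delta_i)}{1 + \exp(\theta_p - \delta_i)}
```

Because the anchor items common to PIRLS and SACMEQ are assigned shared difficulty parameters \(\delta_i\), ability estimates \(\theta_p\) from both assessments are placed on one scale, which is what makes the cross-economy comparison possible.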


For this study we reviewed the assessment systems in the region that measure learning outcomes in literacy and numeracy of children in primary education, and undertook a comparative analysis of these assessments. We analysed existing data from the region to develop a portrait of students who are experiencing LLOs in literacy and numeracy, and examined trends in performance over time. We investigated education interventions that helped to improve students’ learning outcomes in literacy and numeracy, especially for disadvantaged children, to identify strategies to further improve student performance in the region. Using these elements, we developed a theory of change that sets out processes for improved learning outcomes.

Conceptual framework

Policies aiming at monitoring and improving educational progress must be based on data on learning outcomes and the factors related to those outcomes. ‘Learning outcomes’, as defined in this study, refer to student performance in a cognitive domain, in particular literacy and numeracy, as measured in the different assessments discussed in Chapter 1. The conceptual framework we used sets out the contextual factors associated with student performance. This provides a basis for defining the factors to be considered in an assessment, and supports decisions about the level of the education system at which innovations are most needed.

The conceptual framework combines the main characteristics of two models that have both been highly influential in the field: the ‘Input-Process-Outcome’ model (Purves, 1987) and the ‘dynamic’ model of educational effectiveness (Creemers and Kyriakides, 2008; Kyriakides and Creemers, 2006).9,10 In these models, input, process and outcome factors operate at the different levels of an education system: the system level (national, regional and community), as well as the school, classroom and student levels. Input factors mainly refer to structural conditions, for example: economic wealth or community infrastructure (system level); school type (public, private), school location (rural, urban) and school resources (school level); class size and teaching resources (classroom level); as well as individual factors (e.g. gender, age, grade) and family factors, such as socio-economic status and parental education (student level). Process factors mainly concern policies and strategies, and range from national curriculum and teacher education (system level), through management and leadership (school level) and quality of instruction (classroom level), to the actual learning process (student level). Outcome factors, i.e., performance in literacy and numeracy, are measured at the student level and can be aggregated at the system level (national, regional and community), school level, or classroom level.11 Relations between the factors at the different levels are complex and have not yet been fully investigated. In the ‘dynamic’ model, educational outcomes can become inputs for further development. For example, domain-related attitudes and beliefs can be considered as outcomes of schooling or as inputs affecting student behaviour (Creemers and Kyriakides, 2008; Kyriakides and Creemers, 2006; OECD, 2013).

9 The basic structure of the Input-Process-Outcome model was developed in the 1960s for the International Association for the Evaluation of Educational Achievement (IEA) (Purves, 1987).

10 Other recent examples where the main characteristics of these two models are combined are the contextual framework for OECD Programme for International Student Assessment (PISA) (OECD, 2013), or the input-process-output model for data in the student learning environment (Biggs, 1999; Biggs and Moore, 1993).

11 Other outcome factors at the system, school, and classroom level relate to aggregated pass rates and graduation rates, enrolment and retention rates; at the student level, domain-related and general school-related attitudes, beliefs and motivation are considered important outcomes of schooling.

Page 21: Improving Quality Education and Children's Learning ... · PDF fileImproving Quality Education and Children’s ... Test administration methods of assessments ... PIRLS Progress in

3

From an assessment-design perspective, it may not always be possible or feasible to collect data on input, process and outcome factors at all four levels, i.e., to represent the full framework. The main sources of information on input and process variables at the student and school levels are questionnaires, sometimes combined with qualitative methods such as classroom or school observation. Data on input and process factors at the system level are collected using tools such as the Education Management Information System (EMIS). Outcome factors, i.e., performance in literacy and numeracy, are measured at the student level using cognitive tests.

The comprehensive conceptual framework allows the main effectiveness factors to be considered in an assessment. The framework also has the potential to inform the development of theories of change: input, process and outcome factors can be defined and analysed for their stability and suitability for change. It can likewise inform decisions about the level at which innovations need to be implemented to ensure their maximum effectiveness, for example, determining which input, process and outcome factors need to be addressed by policy decisions at the system and school levels, and which are best shaped by school development activities at the school and classroom levels.


1. Stock-taking and comparative analysis of existing assessments in the ESA region

A wide range of assessment systems is in use in the ESA region.12 For a better understanding of these assessments, the available data and how they support policies aiming at monitoring and improving learning outcomes, we started with an overview of the systems. The focus was on assessments that measure student performance in literacy and numeracy in primary education. Following the overview, we analysed and compared the assessments, highlighting the strengths and limitations of the different methodological approaches. Case studies for Zimbabwe and Rwanda offer an in-depth understanding of specific measurement practices implemented by the governments of these countries (see Appendix II). The detailed results of the stock-taking are presented in the main stock-taking table (see Appendix VI).

1.1 Overview of assessments in the stock-taking

The stock-taking considered assessments that were implemented from 2007 up to 2014/2015 in the ESA region.13 Overall, we identified 58 assessments that measure student performance in literacy and numeracy in primary education in the ESA region.

The assessments can be grouped into four types:
• international assessments
• regional assessments
• national assessments
• Early Grade Reading Assessment (EGRA)/Early Grade Mathematics Assessment (EGMA).

Of the 58 assessments, 4 (7 per cent) are international and 17 (29 per cent) are regional. Implementations of EGRA and EGMA account for 21 (36 per cent), the largest group among the four types, and 16 (28 per cent) are national (see Figure 2).14

12 National examinations are not included in this study. Examinations do not share the same purpose as learning assessments, which leads to different choices about sampling, data analysis, reporting (e.g. on examination pass rates and/or achieved grades, i.e. pass levels), etc.

13 The stock-taking of assessments took place between October 2014 and March 2015 and considers assessments that have been implemented since 2007. Assessments conducted outside this period were not considered. In the case of recurring assessments, the inception date of an assessment may be before 2007.

14 All percentages are rounded.


Figure 2. Assessments from the stock-taking by type of assessment [pie chart: international 4 (7%); regional 17 (29%); national 16 (28%); EGRA/EGMA 21 (36%)]

Overall, 20 countries have implemented one or more assessments (see Table 16 in Appendix II). Six countries have used several types of assessments (i.e., Kenya, Malawi, Mozambique, South Africa, Uganda and Zambia), while another six countries show limited assessment activities with one or no assessment implementation in recent years (i.e., Angola, Comoros, Eritrea, Madagascar, South Sudan and Swaziland) (see Table 17 in Appendix II).15

Participation in international assessments

Two countries in the region have participated in international assessments of the IEA (International Association for the Evaluation of Educational Achievement): TIMSS (Trends in International Mathematics and Science Study), PIRLS (Progress in International Reading Literacy Study) and prePIRLS (see Table 1).16

Table 1. Participation in international assessments in ESAR countries

Country Assessment

Botswana TIMSS in 2007, 2011; PIRLS in 2011; prePIRLS in 2011

South Africa TIMSS in 2011; PIRLS in 2006, 2011; prePIRLS in 2011

Since 1995, TIMSS has been measuring trends in mathematics and science achievement at Grade 4 and Grade 8. Where it was expected that students in Grades 4 and 8 would find TIMSS assessments too difficult, IEA encouraged countries to test children in higher grades.

15 Four of these countries are characterised as fragile states (Comoros, Eritrea, Madagascar and South Sudan), which might be one reason for the limited assessment activities.

16 Zambia will participate in the pilot for PISA for Development in the coming years. This OECD/World Bank-led assessment aims to enhance PISA’s survey instruments to make them more relevant for contexts found in developing countries, but still permit the reporting of results on the standard PISA scales. For more information about PISA for Development, see <http://www.oecd.org/pisa/aboutpisa/pisafordevelopment.htm>.


Page 25: Improving Quality Education and Children's Learning ... · PDF fileImproving Quality Education and Children’s ... Test administration methods of assessments ... PIRLS Progress in

7

Thus, in TIMSS 2011 Botswana tested children in Grade 6 with the Grade 4 assessment, and children in Grade 9 with the Grade 8 assessment.17 South Africa took part in TIMSS in 2011 for the first time, testing Grade 9 children with the TIMSS Grade 8 assessment.18

First introduced in 2001, PIRLS measures trends in reading comprehension at Grade 4. In 2011, PIRLS was expanded to include prePIRLS, which is a less difficult and shorter version of PIRLS. PrePIRLS assesses the basic reading skills at the end of the primary school cycle that are a prerequisite for success in PIRLS (Mullis and Martin, 2013, p. 4). Thus prePIRLS permits learners from lower achieving countries to be measured more precisely than is the case using more difficult and longer assessments, such as PIRLS (Howie, Staden, Tshele, Dowse, and Zimmerman, 2012, p. 22).19

Participation in regional assessments

Fourteen countries in the region have participated in one or more of the regional assessments of SACMEQ (Southern Africa Consortium for Monitoring Educational Quality), PASEC (Programme for the Analysis of the Educational Systems of CONFEMEN Countries) and Uwezo (see Table 2).

Table 2. Participation in regional assessments in ESAR countries

Country Assessment

Botswana SACMEQ II and III

Burundi PASEC in 2008–2009

Comoros PASEC in 2008–2009

Kenya SACMEQ I, II and III; Uwezo

Lesotho SACMEQ II and III

Malawi SACMEQ I, II and III

Mozambique SACMEQ II and III

Namibia SACMEQ I, II and III

South Africa SACMEQ II and III

Swaziland SACMEQ II and III

Tanzania SACMEQ II and III20; Uwezo

Uganda SACMEQ II and III; Uwezo

Zambia SACMEQ I, II and III

Zimbabwe SACMEQ I and III

17 In TIMSS 2007, Botswana participated with Grade 8.

18 TIMSS 2015 introduces a new, less difficult mathematics assessment called TIMSS Numeracy for countries where most children are still developing fundamental mathematics skills. TIMSS Numeracy assesses fundamental mathematical knowledge, procedures and problem-solving strategies at the end of the primary school cycle that are prerequisites for success on TIMSS (Mullis and Martin, 2013, pp. 7–8).

19 PrePIRLS and TIMSS Numeracy are intended to be responsive to the needs of the global education community and efforts to work towards universal learning for all children. Depending on a country’s educational development, prePIRLS and TIMSS Numeracy can be given at Grade 4, 5 or 6 (Mullis and Martin, 2013, pp. 7–8).

20 Only Zanzibar of Tanzania participated in SACMEQ I.


SACMEQ carries out large-scale, cross-national research studies in the Southern and Eastern Africa region. It assesses the performance levels of Grade 6 students and teachers in literacy and numeracy (ACER, 2015).

PASEC is an assessment programme for countries that have a link to the French-speaking community. It was established in 1991 by the Conference of Ministers of Education of French-speaking Countries (CONFEMEN), and it assesses Grade 2 and Grade 5 students in reading and mathematics (CONFEMEN, n.d.).

Uwezo measures the literacy and numeracy competencies of school-aged children in Kenya, Tanzania and Uganda. Its goal is to obtain data to inform improvements in educational policy and practice (Twaweza, n.d.).

Implementation of national assessments

In 13 countries in the region, national assessments have been implemented by government or parastatal bodies (see Table 3). In six countries (Eritrea, Ethiopia, Malawi, Rwanda, Somalia and Zimbabwe), an international development partner was directly involved in supporting the implementation.

Table 3. Implementation of national assessments by government/parastatal bodies in ESAR countries

Country | Assessment | Notes

Eritrea | Monitoring Learning Achievement (MLA) | Conducted irregularly by ministry and UNICEF. Most recent reported implementation in 2008.
Ethiopia | National Learning Assessment (NLA) | Conducted on a 3–4-year cycle by national assessment/examination body and USAID. Most recent reported implementation in 2010–2011.
Kenya | National Assessment System for Monitoring Learning Outcomes (NASMLA) | Conducted at an unknown frequency by national assessment/examination body. Most recent reported implementation in 2010.
Lesotho | Lesotho National Assessment of Educational Progress (LNAEP) | Conducted on a 1–2-year cycle by national assessment/examination body. Most recent reported implementation in 2010.
Malawi* | Assessing Learner Achievement | Conducted by ministry or national assessment/examination body.
Malawi | Monitoring Learning Achievement (MLA) | Conducted on an intended 3-year cycle by ministry and UNICEF. First implementation in 2012.
Mozambique* | National Assessment | Conducted by ministry or national assessment/examination body.
Namibia* | National Standardised Achievement Test (NSAT) | Conducted biannually by national assessment/examination body.
Rwanda | Learning Achievement in Rwandan Schools (LARS) | Conducted on a 3-year cycle by national assessment/examination body, UNICEF and UNESCO. First implementation in 2011, second in 2014.
Somalia* | Monitoring Learning Achievement (MLA) | Conducted irregularly by ministry and UNICEF.
South Africa | Annual National Assessment | Conducted annually by the ministry. Most recent reported implementation in 2014.
South Africa* | National Assessment of Learner Achievement (NALA) | Conducted by ministry or national assessment/examination body.
Uganda | National Assessment of Progress in Education (NAPE) | Conducted on a 1–3-year cycle by national assessment/examination body. Most recent reported implementation in 2010.
Zambia | National Assessment of Learning Achievement (NALA) | Conducted on a 2-year cycle by the national assessment/examination body. Most recent reported implementation in 2014.
Zimbabwe | Zimbabwe Early Learning Assessment (ZELA) | Conducted in 2012–2015 by the national assessment/examination body and UNICEF.

*Note: These assessments are mentioned in the EFA Global Monitoring Report (UNESCO 2008, p. 2; 2015b), but only limited information was available for the main stock-taking table (see Table 24).

Implementation of EGRA/EGMA

The Early Grade Reading Assessment (EGRA) and Early Grade Mathematics Assessment (EGMA) measure the most basic foundation skills for literacy and numeracy acquisition in the early grades. These assessments were developed by the Research Triangle Institute (RTI), with funding provided by the United States Agency for International Development (USAID) and the World Bank (Gove and Wetterberg, 2011). EGRA/EGMA was designed to serve as a sample-based national or system-level diagnostic measure that would reveal gaps in reading competencies among students and inform education ministries and development partners about system needs for improving the professional development of teachers and pre-service programmes (Gove and Wetterberg, 2011). However, EGRA/EGMA has been used to address a wider range of assessment needs, including impact (programme) evaluations.

Although EGRA and EGMA have been used in many developing countries, in the ESA and other regions, they are not grouped as international assessments in this report. This is because EGRA and EGMA have a common approach grounded in core foundation skills, while being adaptable for use in individual countries and languages. This approach differs from international assessments, where implementing countries are required to use an internationally agreed model (ACER, 2014a). The adaptability of EGRA and EGMA also means that direct comparison of results is difficult, due to differences in language structure and complexity. For this reason, the developers of these assessment tools generally advise against comparing subtask results across countries and languages (Gove and Wetterberg, 2011).


Twelve countries in the region have completed at least one implementation of EGRA/EGMA for system-level diagnostic, system-level monitoring or programme evaluation purposes (see Table 4).21

The EGRA/EGMA tools are used almost equally for diagnostic and evaluation purposes: in nine cases, EGRA and/or EGMA were implemented for system-level diagnostics, and in 10 cases for programme evaluation.22 In two cases, EGRA and/or EGMA were implemented for the purpose of system-level monitoring.

Table 4. Implementations of EGRA/EGMA in ESAR countries23

Country Assessment Year Purpose

Angola EGRA 2010 System-level diagnostic

Burundi EGRA 2011 System-level diagnostic

Ethiopia EGRA 2010 System-level diagnostic

2010–2012 Programme evaluation (Literacy Boost initiative)

2011 System-level diagnostic

Kenya EGRA 2007–2008 Programme evaluation (EMACK initiative)

EGRA, EGMA 2012–2013 Programme evaluation (PRIMR initiative)

Madagascar EGRA 2009 System-level diagnostic

Malawi EGMA 2010 Programme evaluation (baseline for Malawi Teacher Professional Development Support initiative)

EGRA 2009–2010 Programme evaluation (Literacy Boost initiative)

2010–2012 System-level monitoring

Mozambique EGRA 2010–2011 Programme evaluation (Literacy Boost initiative)

2013 Programme evaluation (USAID/Aprender a Ler (APAL) initiative)

Rwanda EGRA, EGMA 2011 System-level diagnostic

Somalia EGRA 2013–2014 Programme evaluation

Tanzania EGRA, EGMA 2013 System-level monitoring

Uganda EGRA 2009 System-level diagnostic

2010, 2012 Programme evaluation (Literacy Boost initiative)

Zambia EGRA, EGMA 2011 Pilot for system-level diagnostic

EGRA 2012 Programme evaluation

2014 System-level diagnostic (as part of the National Assessment Survey)

21 Further details of each EGRA/EGMA implementation can be found in the main stock-taking table in Appendix IV.

22 Assessments with a system-level diagnostic purpose are implemented to get a snapshot of learning levels at the system level (usually national, and in a one-off administration), as compared to assessments with a system-level monitoring purpose which have recurrent administrations to monitor learning levels at the system level.

23 We acknowledge that there are recent EGRA/EGMA implementations that are not considered in our stock-taking, which took place between October 2014 and March 2015; assessments conducted outside this period were not considered. We wish to thank UNICEF CO Rwanda and Tanzania for their input about a UNICEF-supported School Quality Assessment using EGRA/EGMA methodology, completed in 2015, which focused on providing baselines in the three UNICEF-targeted regions.


1.2 Comparative analysis of assessments

Our comparative analysis of the assessments in the region highlights the strengths and limitations of the different methodological approaches, and discusses when the different approaches might be more or less appropriate. In some instances, best practices in terms of methodological approaches are discussed, even if they do not feature in the majority of the assessments.

The assessments are analysed and compared based on the main eight elements of the stock-taking framework: purpose of the assessment; target population (grade-based, e.g., Grade 4, or age-based, e.g., 10-year-old students); sampling design and methodology; cognitive domains (assessment framework and major domains, i.e., literacy and numeracy); contextual instruments (types, e.g., student questionnaire, and key factors, e.g., gender, grade level, parental education); test administration (approaches for data collection); data analysis (key analytical approaches), and reporting and dissemination products.24

1.2.1 Purpose

There are three types of assessment purposes:
• System-level monitoring: recurrent administrations to monitor learning levels at the system level (usually national).
• System-level diagnostic: one-off administration for a snapshot of learning levels at the system level (usually national).
• Programme evaluation: smaller-scale administration to evaluate the impact of a programme to improve learning outcomes, with treatment and control groups, and usually involving baseline, mid-line and end-line measurements.

The majority of the assessments in the region (38, or 66 per cent) have system-level monitoring as their purpose. Ten assessments (17 per cent) aim at evaluating programmes. Nine assessments (16 per cent) measure learning outcomes for a system-level diagnostic purpose. One assessment, NASMLA in Kenya, has a dual purpose of system-level diagnostic and monitoring (see Table 5).

Table 5. Purpose of the assessments from stock-taking

Assessment purpose                          Number of assessments
System-level monitoring                     38
Programme evaluation                        10
System-level diagnostic                     9
System-level diagnostic and monitoring      1
Total                                       58

24 Details about the stock-taking framework and other methodological aspects of the stock-taking and comparative analysis are provided in Appendix I.


1.2.2 Target population

The majority of the assessments in the region (52, or 90 per cent) have a grade-based target population. An age-based population is targeted in three Uwezo assessments. There was no information available about the target populations for another three assessments.

For the assessments with grade-based populations, the target populations range from early primary grades to end-of-primary (see Table 6). Of the assessments, 44 target students at lower-primary level (i.e., Grades 1, 2 or 3), 38 target Grades 4, 5 or 6, and 10 target Grades 7, 8 or 9. Nearly half (47 per cent) target multiple grades.

Table 6. Target population in the assessments from stock-taking

Target grade    Number of assessments [25]
Grade 1         4
Grade 2         20
Grade 3         20
Grade 4         12
Grade 5         8
Grade 6         18 [26]
Grade 7         4
Grade 8         2
Grade 9         4
Age-based       3 [27]
Unknown         3

The choice of the grade level for the target population depends on a number of factors. Usually a particular grade is chosen because of recent or planned policy reform, or because it is considered a pivotal point in children’s learning trajectories. For example, EGRA and EGMA typically target early primary grades because, as the assessment names indicate (‘EG’ for Early Grade), they measure the most basic foundation skills for literacy and numeracy acquisition in the early grades (Gove and Wetterberg, 2011).

If the assessment’s purpose is programme evaluation rather than system-level diagnostic/monitoring, the grade-based target population is defined as only those children in the treatment and control groups.

For assessments that monitor system-level learning outcomes and report on trends over time, a complete discussion of population-related matters is particularly important, as stakeholders must be informed of any differences that may affect the comparability of results from one year to the next.

25 The total number of assessments in this table is more than 58, as 27 of the 58 assessments target multiple grades.

26 12 assessments are SACMEQ studies

27 Uwezo in Kenya (6–16 years), Tanzania (6–16 years) and Uganda (7–16 years)


Of the assessments in our stock-taking, the documentation from SACMEQ (Paul M. Wasanga, A. Ogle, and Wambua, 2012), and the IEA studies PIRLS and TIMSS (M.O. Martin and Mullis, 2012) are good examples of a complete discussion of population-related issues, as are the National Learning Assessment in Ethiopia (Ministry of Education of Ethiopia [FDRE], 2008), and NASMLA in Kenya (P.M. Wasanga, Ogle, and Wambua, 2010).

1.2.3 Sampling

In general, the assessments we reviewed employ scientific sampling procedures. The sampling process for assessments with a system-level monitoring or diagnostic purpose is usually multi-step, involving the sampling of schools, children and, sometimes, geographical units.

One exception is South Africa’s Annual National Assessment, in which all children in public and state-subsidised independent schools are assessed. This approach may be more accurate, but it is also more time-consuming and costly than testing a sample.

Uwezo, another exception to the general pattern, samples households instead of schools. This may be necessary where a complete and up-to-date list of schools does not exist, but it could make it more difficult to explore the relationships between learning outcomes and school-level factors.28

In four assessments, children were sampled through a central body (e.g., national centre) prior to testing. These assessments included TIMSS, PIRLS (M.O. Martin and Mullis, 2012), LARS in Rwanda (Rwanda Education Board, 2012) and MLA in Malawi (Ministry of Education, Science and Technology of Malawi, 2014). The success of this approach depends upon the availability of complete and up-to-date lists of students before the test is administered.
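The basic two-stage structure described in this section can be sketched in a few lines of Python; the sampling frame, school names and sample sizes below are invented for illustration, and operational assessments add refinements such as stratification, probability-proportional-to-size selection of schools and sampling weights.

```python
import random

random.seed(7)  # reproducible illustration

# Hypothetical sampling frame (not a real EMIS): school id -> list of
# enrolled students in the target grade.
frame = {
    f"school_{i:03d}": [f"school_{i:03d}_student_{j:02d}"
                        for j in range(random.randint(20, 60))]
    for i in range(200)
}

def two_stage_sample(frame, n_schools, n_students_per_school):
    """Stage 1: simple random sample of schools from the frame.
    Stage 2: simple random sample of students within each sampled school."""
    schools = random.sample(sorted(frame), n_schools)
    return {
        school: random.sample(frame[school],
                              min(n_students_per_school, len(frame[school])))
        for school in schools
    }

sample = two_stage_sample(frame, n_schools=25, n_students_per_school=15)
print(len(sample), sum(len(v) for v in sample.values()))  # 25 schools, 375 students
```

The sketch shows only the basic two-stage structure; the household-based designs (e.g., Uwezo) substitute enumeration areas and households for schools at the first stage.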

1.2.4 Cognitive domains

Assessment framework

An assessment framework is intended to guide test development and help interested stakeholders understand the content and scope of the assessment. In general, a framework should support and ensure consistency of test development and provide a common language for discussing the assessment. An assessment framework should include:
• a definition of the constructs that are being measured;
• a discussion of the skills/knowledge that are tested to measure the constructs, as well as a rationale for any omissions of skills/knowledge that one might expect to be tested when the stated constructs are being measured;
• a discussion of any alignment of tests (e.g., to a particular grade level);
• specifications of task content (i.e., number or proportion of tasks per content area);
• specifications of task format (e.g., multiple choice, free-response);
• a discussion of scoring; and
• an outline of how the results are reported.

28 Even though Uwezo’s purpose is to monitor children’s learning at the system level – like ASER in India, on which its sampling methodology is based – it eschews a more traditional school-based approach to sampling. This is partly because of the difficulty of obtaining a complete and up-to-date list of schools, but also because this approach cannot yield data representative of the entire population of school-aged children in a context where the percentages of school-aged children either not enrolled in school or not attending school regularly are high enough to affect the representativeness of an in-school sample.


If the assessment’s purpose is system-level monitoring, and it reports on trends over time, then having an assessment framework is particularly important because it ensures consistency in test development from one assessment cycle to the next.

Comprehensive assessment frameworks were publicly available for three assessments. These include the IEA studies PIRLS (I.V.S. Mullis, Martin, Kennedy, Trong, and Sainsbury, 2009) and TIMSS (I.V.S. Mullis, Martin, Ruddock, O’Sullivan, and Preuschoff, 2009), and the Annual National Assessment in South Africa (Department of Basic Education, Republic of South Africa, 2014).29 PASEC is planning a new methodological framework for the next assessment cycle.

Literacy and numeracy

One-third of the assessments in the region (19, or 33 per cent) assessed both literacy and numeracy.30,31 Almost a third (17, or 29 per cent) focused on literacy as the only domain. Twelve assessments (21 per cent, all of which were SACMEQ studies) measured student performance in literacy, numeracy and health knowledge. These three combinations of domains (i.e., literacy and numeracy; literacy only; and literacy, numeracy and health knowledge) constitute the majority of the assessments in our stock-taking, totalling 48 (83 per cent). The rest of the assessments had various combinations of domains (see Table 7).

29 Assessment frameworks for PIRLS, TIMSS and ANA in South Africa are publicly available. Information about the development of an assessment framework for the National Assessment of Learning Achievement in Zambia (UNICEF Zambia Country Office, 2015) was obtained from UNICEF CO in Zambia, but the assessment framework was not publicly available. This may also apply to other assessments in the stock-taking.

30 Domains related to literacy are referred to differently in different assessments. The variations include ‘Reading’, ‘Mother tongue’, ‘Language’, ‘English’, or the names of particular local languages, each with a slightly different scope of assessment. The domain is referred to as ‘Literacy’ in this report to avoid confusion. Please see Table 24 for individual cases.

31 ‘Numeracy’ is sometimes referred to as ‘mathematics’, a domain with a slightly different scope of assessment. The domain is referred to as ‘numeracy’ in this report to avoid confusion. Please see Table 24 for individual cases.


Table 7. Cognitive domains in assessments from the stock-taking

Domains (number of assessments; notes):
• Literacy and numeracy: 19
• Literacy: 17 (including PIRLS studies)
• Literacy, numeracy and health knowledge: 12 (all are SACMEQ studies)
• Numeracy and science: 2 (both are TIMSS studies)
• Literacy, numeracy and environmental science (Grade 4); literacy, numeracy, biology, chemistry and physics (Grade 8): 1 (NLA in Ethiopia)
• Literacy, numeracy and life skills: 1 (Assessing Learner Achievement in Malawi)
• Literacy, numeracy and life skills (Grade 5); literacy, numeracy and environmental sciences (Grade 9): 1 (National Assessment of Learning Achievement in Zambia)
• Numeracy: 1 (EGMA in Malawi)
• Literacy and numeracy (Grade 5); literacy, numeracy and natural science (Grade 7): 1 (NSAT in Namibia)
• Literacy and numeracy (Grade 4); literacy, numeracy and science (Grade 7): 1 (MLA in Somalia)
• Literacy, numeracy and natural science: 1 (NALA in South Africa)
• Literacy and one numeracy sub-task: 1 (EGRA in Uganda)
Total: 58

Out of the 58 assessments in the stock-taking, 23 (40 per cent) tested in multiple languages. In this respect, South Africa is the most notable example, as its Annual National Assessment covers English, Afrikaans and nine local languages (Department of Basic Education, Republic of South Africa, 2014).

1.2.5 Contextual instruments

Type of contextual instruments

Almost all of the assessments collect contextual data of some kind through student, teacher and school questionnaires. EGRA and EGMA are often administered using a contextual instrument called the Snapshot of School Management Effectiveness (SSME). The SSME collects information through student and teacher questionnaires, as well as through classroom observation and classroom and school inventories. Uwezo also collects information through a similar technique of observation and inventory. In addition, TIMSS and PIRLS include a curriculum questionnaire completed by the national research centre in each participating country. Furthermore, PIRLS (M.O. Martin and Mullis, 2012), EGRA in Angola (Ministry of Education of Angola, World Bank, and Russia Education Aid for Development Programme [READ], 2011), MLA in Eritrea (UNICEF Eritrea, n.d.) and LARS in Rwanda (Rwanda Education Board, 2012) include a parent questionnaire. Data at the system level are not directly collected in the assessments, with the exception of the curriculum questionnaire used in PIRLS and TIMSS. However, countries with an Education Management Information System (EMIS) can derive system-level data and link it to the schools that were observed in the assessment.32

In assessments undertaken with limited funds, the benefit of obtaining information from a parent questionnaire might be weighed against the risk that response rates will be low due to low literacy skills. An option would be interviews, although these are likely to be more cost-intensive. Experience shows that primary students are a reliable source of information about their parents and households. However, this depends on the type of questions, the kind of data collection instrument (questionnaire or interview), as well as the grade level and associated literacy skills of the children.

Contextual instrumentation must be based on a sound and fully articulated theoretical framework. Good examples of highly elaborated context frameworks are PIRLS and TIMSS. SACMEQ and PASEC use analytical models to describe the context factors collected and the expected relationships with achievement.

Key factors of contextual instruments

Complete contextual instruments (or documentation about their content) were obtained for the following assessments:
• EGRA/EGMA (i.e., SSME instruments) (RTI, 2004)
• PIRLS, TIMSS (IEA, 2013b, 2013d)
• SACMEQ (Hungi, 2011; Hungi et al., 2011)
• Uwezo (Uwezo-Kenya, 2013)
• PASEC (CONFEMEN, 2010a, 2010b)

A review of these instruments and associated documentation suggests that the key factors in contextual data collection at the student, classroom and school levels are:
• at the student level:
  • individual factors, such as gender, age, grade level, grade repetition, health and well-being;
  • family factors, such as socio-economic measures (possessions at home; books in the home; parental literacy level, education and occupation); ethnic background and cultural practices; language spoken at home; home resources; early learning opportunities (preschool attendance); and family support;
  • learning experiences at school (e.g., activities during instruction, teacher feedback provided to students);
  • learning experiences out of school (homework and out-of-school lessons; reading independently in and out of school; working outside school/domestic work);
  • learning time, attendance/absence;
  • access to resources at school; being allowed to take books home;
  • community support.

32 In order to link aggregated school data to system level data in EMIS, participating schools need to be able to be identified within the EMIS.


• at the classroom level:
  • teacher background variables, including gender, teacher training, practice and experience;
  • class size;
  • classroom equipment, teaching resources (e.g., availability of pedagogic materials for students and teachers, learning material, textbooks, furniture);
  • language of instruction, language of teacher;
  • frequency of homework;
  • quality of instruction, teaching methods (e.g., teacher reads to learner, explaining things if not understood, providing extra time to complete tasks; domain-related activities; instructional time), classroom management, frequency and use of assessment for teaching;
  • classroom climate, discipline;
• at the school level:
  • input factors such as school type (public/private); school location (rural/urban); school size; school funding; teacher-student ratio; socio-economic background; and ethnic/language composition;
  • teacher body, teacher absenteeism, teacher professional development;
  • principal/head teacher background variables;
  • school management, school curriculum, assessment and evaluation;
  • language of instruction, provisions for students who do not speak the language of instruction at home;
  • school resources (e.g., library, computer rooms);
  • school facilities (e.g., condition of school building, electricity and water supply, toilets and canteens);
  • quality of instruction;
  • school climate.

The collection of contextual information is central to an assessment. Context helps define the relationships between learning outcomes and background factors of research and policy interest. The relationships between contextual factors and achievement – and not the learning outcomes data alone – are essential to decision-making. The time and cost of collecting contextual information can be minimised by ensuring that the instruments are well targeted to specific areas of research and policy interest, and that particular questions yield response data that do not require excessive processing.

1.2.6 Test administration

In just over half of the assessments (30, or 52 per cent), the cognitive assessment is administered as a paper-based test in schools to groups of students, where each student completes the assessment independently (i.e., by reading questions and recording responses on paper). Uwezo, EGRA and EGMA are exceptions. Their cognitive assessments are one-on-one, with the test administrator delivering the items orally, and students providing most of their answers orally. These oral one-on-one assessments generally make use of paper-based instruments, though EGRA and EGMA are starting to employ a method where the student refers to a paper-based test, but the administrator records the data on a tablet-based application called Tangerine™ (see Table 8).33

33 See <http://www.tangerinecentral.org/home> for information about Tangerine™.


Table 8. Test administration methods of assessments from the stock-taking

Test administration method (number of assessments; assessments):
• School-based, group administration, paper-based: 30 (PIRLS, prePIRLS, TIMSS, PASEC, SACMEQ, national assessments)
• School-based, one-on-one administration, oral: 20 (EGRA, EGMA)
• Household-based, one-on-one administration, oral: 3 (Uwezo)
• School-based, one-on-one or small group administration, oral or tablet-based: 1 (pilot assessment of Grades 1, 2 and 3 in Lesotho)
• Unknown: 4
Total: 58

A pilot assessment of Grades 1–3 in Lesotho provides an example of how tablets can be used in oral assessments as more than mere data collection tools for the test administrators. In this pilot, students formed small groups with one tablet per student and interacted directly with the tablets (i.e., the tablets ‘administered the test’ to the students), while the test administrator monitored the groups.

These different methods for cognitive assessments suit different aims and purposes. Group administration is most convenient if members of the target population have the skills to complete an assessment independently, and are easily located in naturally occurring groups (e.g., in schools). One-on-one oral administration is necessary if some students are expected to be unable to complete an assessment independently. If one-on-one oral administration is used, a tablet-based data collection application such as Tangerine™ is an effective way of reducing human error. It can control the way the assessment is administered and restrict data input to valid response values only. A small group tablet-based oral administration, such as the one used in the Lesotho pilot, offers further efficiency gains because children who would otherwise require one-on-one administration can be tested simultaneously.

1.2.7 Data analysis

The assessments in our stock-taking use a range of different analytical techniques for data analysis. Seven major techniques were identified:
• using item response theory (IRT) to scale cognitive data;
• establishing competency levels or benchmarks;
• conducting frequency analyses or calculating mean scores for cognitive results, disaggregated by contextual variables of interest;
• conducting frequency analyses on contextual data;
• exploring relationships between cognitive performance and contextual factors via analytical techniques;
• computing trends in cognitive performance;
• reporting international comparisons of cognitive data.

Grouping assessments by type reveals patterns of use of particular techniques (see Table 18 in Appendix II). The last row of the table tallies the frequency of use for each technique. Those used most frequently are: establishing competency levels or benchmarks; conducting frequency analyses or calculating mean scores for cognitive results, disaggregated by contextual variables of interest; and exploring relationships between cognitive performance and contextual factors via analytical techniques. These techniques are described in subsequent sections, as is the use of item response theory (IRT).

Use of item response theory

IRT is widely used in the design and analysis of educational assessments. Unlike classical test theory, which assumes that all items in a test contribute equally to a student’s performance, IRT models the characteristics of individual test items (i.e., varying difficulty levels and discrimination), and hence the probability that a student of a given ability will answer a particular item correctly (Kaplan and Saccuzo, 1997). Different IRT models account for different item characteristics.34,35,36
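The logistic item response function at the core of common IRT models can be illustrated in a few lines. This is a generic sketch, not the scaling model of any assessment named here, and the ability and difficulty values are hypothetical; operational scaling estimates abilities and item parameters jointly from response data using specialised software.

```python
import math

def p_correct(theta, difficulty, discrimination=1.0):
    """Two-parameter logistic (2PL) item response function: probability
    that a student with ability `theta` answers an item correctly.
    With discrimination == 1 this reduces to the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-discrimination * (theta - difficulty)))

# A student whose ability equals the item's difficulty has a 50% chance.
print(round(p_correct(theta=0.0, difficulty=0.0), 2))  # 0.5
# Ability well above the item's difficulty raises the probability sharply.
print(round(p_correct(theta=2.0, difficulty=0.0), 2))  # 0.88
```

Because ability and difficulty sit on the same scale, the same function underpins both scoring students and diagnosing items (e.g., an item whose empirical response curve is flat has poor discrimination).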

A primary advantage of IRT is that scores obtained from a linked test design – where there are multiple test forms with a certain number of common items shared across the forms – can be placed on a common scale. In IRT, raw scores are converted to scale scores. Placing scores on a common scale permits valid comparisons of results across different test forms, across different grades, and over time.37 These functions make IRT particularly important for assessments with a system-monitoring purpose.

Another important advantage of using IRT is that it offers greater depth in reporting. Because item difficulty and ability are on the same scale, it is possible to develop substantive descriptions of the skills and knowledge required to correctly answer items of varying difficulty, and to make statements about the skills and knowledge possessed by children with different levels of ability. That way, consistent competency levels can be defined and used as a basis for setting benchmarks.38

34 In classical test theory, an observed test score is composed of a ‘true score’ (the score an individual would get if there were no measurement error) plus an error component, assumed to be a random variable with a normal distribution. The larger the standard error of measurement, the less certain is the accuracy of the measurement. Conversely, a small standard error of measurement indicates that an individual score is probably close to the true score (Kaplan and Saccuzo, 1997).

35 Discrimination refers to the capacity of an item to distinguish between different levels of ability (i.e., good quality test questions distinguish between students with the ability to answer the question correctly and those without).

36 For example, one-parameter models consider item difficulty, two-parameter models consider item difficulty and item discrimination, and three-parameter models also account for ‘guessing’ (i.e., test takers with very low levels of ability getting a correct response).

37 If an assessment does not use a linked design and analyse data using IRT, comparisons of results are only true comparisons if the same assessment items are administered in the same order to all children whose results are being compared. In many cases this is not feasible, as it is often difficult to keep tests secure from one administration to the next. It is also impractical where the construct being assessed is broad, because all items that cover the construct cannot possibly be administered to each child.

Moreover, the IRT analysis of the different components provides information on the psychometric quality of the items, e.g., whether items are well targeted in terms of difficulty and can distinguish between students with different levels of ability. With the use of IRT analyses, items of poor psychometric quality can be identified and discarded or adjusted as required.

Of the assessments we examined, IRT is used in the international/regional assessments PIRLS, TIMSS, PASEC and SACMEQ, as well as in five of the national assessments. These assessments have a common purpose: system-level monitoring. However, in a large number of national assessments (seven; the information was not available in four cases) that also aim at system-level monitoring, IRT is not used (see Table 9).39

Table 9. National assessments and use of IRT in data analysis

National assessment                            Country         IRT used
MLA                                            Eritrea
National Learning Assessment (NLA)             Ethiopia
NASMLA                                         Kenya
LNAEP                                          Lesotho
Assessment of Grades 1, 2 and 3 in Lesotho     Lesotho
Assessing Learner Achievement                  Malawi          Unknown
MLA                                            Malawi
National Assessment                            Mozambique      Unknown
NSAT                                           Namibia         Unknown
LARS                                           Rwanda
MLA                                            Somalia
Annual National Assessment                     South Africa
NALA                                           South Africa    Unknown
NAPE                                           Uganda
NALA                                           Zambia
ZELA                                           Zimbabwe

In Uwezo and in EGRA/EGMA implementations that aim at system-level monitoring, IRT is not used (see Table 4). This finding shows that, despite its advantages for effectively monitoring system performance across contexts and over time, IRT is not extensively applied. One reason for this may be the limited capacity for psychometric analysis at the country level.40

38 For some discussion of how results of IRT analysis can be reported and an example of a metric that uses both quantitative and substantive information about children’s performance, see <http://www.acer.edu.au/files/Described_Proficiency_Scales_and_Learning_Metrics.pdf>.

39 A more detailed table showing the analytical techniques used in all assessments from the stock-taking is presented in Appendix II.

40 This was one of the findings of the country case studies conducted in this study (see Appendix III).


Competency levels/benchmarks

In more than half of the assessments (32, or 55 per cent), results are presented with reference to competency levels or benchmarks. In general, competency levels or benchmarks are established by describing the specific skills required to provide correct responses to each test item. Test items are then placed into groups so that the items in each group have similar difficulties and share a common ‘theme’ relating to the underpinning competencies required to provide the correct response. Naming and defining the ‘themes’ identifies competency levels or benchmarks (Hungi et al., 2010).
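The final banding step – mapping scale scores to named levels via cut scores – can be sketched as follows; the cut scores and level names are hypothetical, not those of any assessment in the stock-taking, and in practice cut scores are set through standard-setting procedures informed by the item analysis described above.

```python
import bisect

# Hypothetical cut scores on an IRT-style ability scale separating four
# competency levels (illustrative values only).
CUTS = [-1.0, 0.0, 1.0]   # boundaries between Levels 1-2, 2-3 and 3-4
LEVELS = ["Level 1", "Level 2", "Level 3", "Level 4"]

def competency_level(scale_score):
    """Map a student's scale score to a named competency level.
    A score at or above a cut falls into the higher level."""
    return LEVELS[bisect.bisect_right(CUTS, scale_score)]

scores = [-1.5, -0.2, 0.4, 1.7]
print([competency_level(s) for s in scores])
# ['Level 1', 'Level 2', 'Level 3', 'Level 4']
```

Reporting the percentage of students in each band then yields tables of the kind shown for NASMLA below.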

For example, the National Assessments System for Monitoring Learning Outcomes (NASMLA) undertaken in 2010 in Kenya used this analytical technique to assess Grade 3 students in literacy and numeracy. Four competency levels were identified in literacy (The National Assessment Centre, 2010) (see Table 10).

Table 10. Descriptions and attainment of competency levels in literacy: NASMLA in Kenya 2010

Level 1 (Pre-reading): Matches words and pictures involving concrete concepts and everyday objects. Attained by 6.2 per cent of students.

Level 2 (Emergent reading): Spells correctly simple everyday words and recognises missing letters in such words. Uses familiar words to complete simple everyday sentences. Attained by 46.1 per cent of students.

Level 3 (Basic reading): Uses correct punctuation in simple sentences. Infers meaning from short passages, and interprets meaning by matching words and phrases. Identifies the main themes of a picture. Attained by 36.7 per cent of students.

Level 4 (Reading for meaning): Links and interprets information located in various parts of a short passage. Understands and interprets meaning of a picture and writes short sentences to describe the theme. Attained by 11.0 per cent of students.

Source: Adapted from Monitoring of Learner Achievement for Class 3 in Literacy and Numeracy in Kenya: Summary of results and recommendations (The National Assessment Centre, 2010, p. 23).

Students’ attainment was analysed as follows: Slightly less than half of the pupils (47.7 per cent) attained the desirable Levels 3 and 4 of competency in literacy. However, most Grade 3 pupils (46.1 per cent) demonstrated emergent reading ability, which is congruent with the Grade 2 level (The National Assessment Centre, 2010, p. 23).

As this example from Kenya demonstrates, competency levels can provide a more concrete understanding of what students are actually able to do than can the insights obtained from merely presenting test scores. Competency levels can also suggest instructional strategies relevant to students who are learning at each level of competence. Such descriptions would be of great assistance for the preparation of textbooks, the design of teacher in-service training programmes, and the development of general classroom teaching strategies. All of these activities require a sound knowledge of the skills already acquired and the higher-order skills that must be mastered in order to move to the next stage of learning (Hungi et al., 2010).

Frequency analysis and mean scores

All of the assessments we studied – other than those without documentation on data analysis – use frequency analysis. These total 50 assessments (86 per cent). Frequency analyses are conducted for performance data and mean scores are calculated. These analyses are usually undertaken on data disaggregated by key contextual variables, such as gender, grade and administrative location.

For example, the Annual National Assessment in South Africa is a system-level monitoring assessment targeting students in Grades 1–6 and 9. It undertakes mean score analysis to investigate the difference in learning achievement between boys and girls (Department of Basic Education, Republic of South Africa, 2014). The mean percentage marks of Grade 3 students in mathematics, calculated by gender and province, show that girls performed better than boys in all provinces (see Figure 3).

Conducting frequency analyses and calculating mean scores are simple procedures, yet they can yield information that is highly relevant to policy development in the implementing countries.
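Such disaggregated reporting reduces to a simple group-by computation; the student records, provinces and marks below are fabricated for illustration and are not ANA data.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical student records: (province, gender, mathematics mark in per cent).
records = [
    ("Gauteng", "girl", 62), ("Gauteng", "boy", 55),
    ("Limpopo", "girl", 48), ("Limpopo", "boy", 44),
    ("Gauteng", "girl", 70), ("Limpopo", "boy", 40),
]

def mean_by(records):
    """Mean mark disaggregated by (province, gender)."""
    groups = defaultdict(list)
    for province, gender, mark in records:
        groups[(province, gender)].append(mark)
    return {key: mean(marks) for key, marks in groups.items()}

for (province, gender), avg in sorted(mean_by(records).items()):
    print(f"{province} {gender}: {avg:.1f}")
```

In operational analyses the same computation is carried out with sampling weights so that group means represent the population rather than the sample.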

Figure 3. Mean percentage marks in Grade 3 Mathematics by gender and province – Annual National Assessment in South Africa

[Bar chart not reproduced: girls’ and boys’ mean percentage marks (0–100 per cent scale) for each province – Eastern Cape, Free State, Gauteng, KwaZulu-Natal, Limpopo, Mpumalanga, Northern Cape, North West, Western Cape – and the national total.]

Source: Department of Basic Education, Republic of South Africa, 2014, p. 85


Relationship between cognitive performance and contextual factors

Relationships between cognitive performance and contextual factors are explored in 34 assessments (59 per cent). The analytical techniques used are correlational analysis, regression analysis and multi-level modelling.

EGRA and EGMA in Zambia, implemented in 2011 with support from USAID, provide an example of this approach to data analysis. These reading and mathematics assessments were administered to Grade 2 and Grade 3 students in the Bemba-speaking regions as a pilot study for system-level diagnostic assessment. The final report of the assessments discusses which factors best predict student performance in reading and mathematics. Using multiple regression models, the report shows that five main factors contributed to students’ performance in school: socio-economic status; having attended preschool; starting school at the expected age; reading independently in and out of school; and receiving corrective feedback from teachers (Collins et al., 2012).
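For illustration, the core of such a regression analysis with a single contextual predictor can be sketched as follows; the data are fabricated and not from the Zambia study, and operational analyses include many predictors at once and apply survey weights.

```python
from statistics import mean

# Fabricated data: preschool attendance (1 = attended) and reading score.
attended = [1, 0, 1, 1, 0, 0, 1, 0]
scores   = [55, 40, 60, 52, 38, 45, 58, 41]

def ols_slope_intercept(x, y):
    """Ordinary least squares for a single predictor:
    slope = cov(x, y) / var(x); intercept = mean(y) - slope * mean(x)."""
    mx, my = mean(x), mean(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    slope = cov / var
    return slope, my - slope * mx

slope, intercept = ols_slope_intercept(attended, scores)
print(round(slope, 2), round(intercept, 2))  # 15.25 41.0
```

With a binary predictor, the slope equals the difference between the two group means (here, attendees score 15.25 points higher on average), which is why such coefficients are readily interpretable for policy audiences.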

Conducting analyses such as these helps to ensure that cognitive results are not misinterpreted, as they can be when they are presented without any context. However, the results should not be taken to mean that the relationships are necessarily causal.
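As a minimal illustration of this kind of regression-based analysis (not the model used in the Zambia study), the following Python sketch fits a least-squares line relating a single invented binary contextual factor to invented test scores; with a binary predictor, the slope equals the difference between the two group mean scores:

```python
# Illustrative sketch: estimating how strongly a contextual factor
# (here, a hypothetical binary preschool-attendance indicator) predicts
# a test score, using simple least squares. All data are invented.
def linear_fit(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    slope = cov / var
    return slope, mean_y - slope * mean_x

preschool = [0, 0, 0, 1, 1, 1]               # 1 = attended preschool
score = [40.0, 45.0, 50.0, 55.0, 60.0, 65.0]  # invented test scores
slope, intercept = linear_fit(preschool, score)
print(slope, intercept)
```

The studies cited above use multiple regression with several predictors entered simultaneously; this single-predictor sketch only conveys the basic mechanics.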

1.2.8 Reporting and dissemination

In any assessment programme, the availability of quality information and data that address a diverse audience is the key to successful dissemination. Public availability of assessment results and data is important: it allows a wide range of stakeholders to instigate change within the system to improve student performance. The availability of a fully documented database can also, in turn, inform the work of independent researchers.

In the assessments we reviewed, results are publicly available for 41 (71 per cent). Approximately half of these (22, or 54 per cent) provide additional dissemination products, including results summaries, press releases and policy briefs. Large-scale international and regional assessments, such as TIMSS, PIRLS, SACMEQ and Uwezo, provide ample dissemination products to reach a diverse audience.

Results are not publicly available for 15 (26 per cent) of the assessments we reviewed. For two national assessments whose results reports are not publicly available, other means of dissemination were identified. MLA in Eritrea held workshops at national and sub-national levels (UNICEF Eritrea, n.d.). In Zambia, results of the Grade 5 national assessment in 2008 were disseminated at provincial level; in addition, based on the test item analysis, remedial materials were developed for areas found to be challenging for teachers and learners (UNICEF Zambia Country Office, 2015).


In addition to reports of the results, considerable effort was made during our review to obtain full datasets from the assessments. However, data from only four regional and international assessments, implemented in seven ESA countries, were available for this study within our time constraints: Uwezo (Kenya, Tanzania, Uganda), PASEC (Burundi, Comoros), TIMSS (Botswana) and prePIRLS (Botswana, South Africa).41

Data management and data cleaning procedures must be undertaken before assessment data can be analysed, reported and eventually released to the public. The more countries that are involved in an assessment – for example, in international and regional assessments such as PIRLS, TIMSS, PASEC and SACMEQ – the longer it takes for the data to be cleaned at the national and regional/international levels, scaled and analysed. The PIRLS 2011 and TIMSS 2011 studies each took approximately two years and three months between main data collection and the release of international datasets (IEA, 2013a, 2013c). Hence, the most recent datasets available for our analysis are Uwezo 2012, PASEC 2008–2009 and PIRLS/TIMSS 2011 (see Chapter 2).

41 The authors thank UNICEF Headquarters, UNICEF ESARO and COs for their support in requesting access to data from SACMEQ and national assessments. It is acknowledged that in instances where national data were requested from national education ministries or national examination bodies, country-level processes for approving the use of national assessment data for secondary analysis may have required more time than we had available for data analysis. This was the case for national data for South Africa, where access was granted after the time available for analysis; these data were therefore not included.


2. Literacy and numeracy in primary education in the ESA region: Students experiencing LLOs and trends over time

Average test scores for literacy and numeracy in international and regional assessments undertaken in the ESA region were generally low, with a considerable proportion of students not achieving basic skills in reading and mathematics. Results from SACMEQ III (2007) show wide disparities in basic reading and mathematics skills by the end of primary education (Grade 6). In 3 of the 12 participating countries in the ESA region (Kenya, Tanzania and Swaziland) between 80 per cent and 93 per cent of students achieved the minimum reading level in SACMEQ. In six countries (Botswana, Zimbabwe, Namibia, Mozambique, Uganda and South Africa), between 50 per cent and 80 per cent of students achieved the minimum level. In Lesotho, 48 per cent of students in Grade 6 achieved basic reading skills; in Zambia and Malawi, only 27 per cent of students reached this level. In mathematics, the proportion of primary students reaching basic skills was considerably lower, with less than 50 per cent of students in Grade 6 reaching the minimum level in three-quarters of the countries. In the remaining quarter of participating countries (again Kenya, Tanzania and Swaziland) between 56 per cent and 62 per cent of students learned basic mathematics skills (UNESCO, 2014, p. 35).42

Characteristics of low-performing students and trends in student performance over time in literacy and numeracy in primary education are the focus of this chapter. Specific analyses draw upon data from four different assessments in the region (as discussed in Chapter 1): Uwezo (Kenya, Tanzania and Uganda); PASEC (Burundi and Comoros); prePIRLS (South Africa and Botswana); and TIMSS (Botswana).43 The four assessments cover 7 of the 21 ESA countries. Trends in literacy and numeracy performance were analysed for three countries – Kenya, Tanzania and Uganda – where the same assessment, Uwezo, with the same key features (assessment framework, design, target population and conditions of administration) was implemented more than once.44 Data from the regional assessment SACMEQ, in which 12 ESA countries participated at least twice, and data from national assessments, were not available for this study.45 The limited available data make it difficult to draw conclusions about the characteristics of low-performing students and trends in performance over time for the region. Given these constraints, we drew upon findings from a broad variety of reports from the ESA region, including SACMEQ and national assessments, to supplement our analyses. The findings that were reported provide valuable insights into the main characteristics of students experiencing LLOs in literacy and numeracy, the nature of associations between context factors and student performance, and changes in student performance over time.

42 Source: IIEP/Pôle de Dakar Indicator Database (UNESCO, 2014, p. 35); Hungi et al., 2010.

43 For details about the datasets available, see Appendix I.

44 Any comparison of results between different assessments – or between different cycles of the same assessment that do not share the same key features – requires sophisticated linking procedures and analyses that are beyond the scope of this report.

45 The authors thank UNICEF Headquarters, UNICEF ESARO and UNICEF COs for their support in requesting access to data from SACMEQ and national assessments during the time of this study. Findings from SACMEQ and national assessments are referred to in the discussion where reports including analysis of contextual data were obtained.

2.1 Characteristics of students experiencing LLOs in literacy and numeracy in primary education in the ESA region

Equipping all students with basic literacy and numeracy skills, and minimizing the number of low-performing students in these domains, are fundamental goals of education systems. In order to quantify the number of students at different levels of performance and to monitor progress over time, competency levels must be defined and benchmarks set. Understanding the factors associated with low-performing students is critical for the development of targeted education policies.

As outlined in the previous chapter, 32 out of the 58 assessments that we reviewed define competency levels or benchmarks. However, there are no common metrics in literacy and numeracy across the different assessments, and different benchmarks are used to define ‘limited learning outcomes’. Hence, for our study, limited learning outcomes are defined using the benchmarks for literacy and numeracy in the different assessments analysed in this report: PASEC, Uwezo, TIMSS and prePIRLS. The criteria used to set these benchmarks are conceptually quite different.46

For PASEC (Burundi and Comoros), student achievement scores in reading (French, Kirundi) and mathematics are categorised in three levels.47 Students at Level 3, the highest level, have acquired a basic level of knowledge (CONFEMEN, 2010b, p. 93). At the other end, at Level 1, students are considered to be close to failing. This group is categorised as experiencing LLOs.

Following Uwezo’s approach, in which tests are aligned with the national Grade 2 curriculum in the three participating countries (Kenya, Tanzania and Uganda), all children attending Grade 3 are expected to have achieved the highest level in each domain (‘story’ for literacy and ‘multiplication/division’ for numeracy). Hence, children enrolled in Grade 3 and above who have not achieved the highest level of performance in English, Swahili48 and numeracy are considered to be experiencing LLOs.

46 Details about the definition and identification of students with limited learning outcomes in the different assessments are presented in Appendix I.

47 Grade 2 students from Burundi were also assessed in Kirundi.

48 Swahili was only assessed in Kenya and Tanzania.


For TIMSS (Botswana) and prePIRLS (Botswana and South Africa), four international benchmarks are defined, ranging from ‘advanced’ to ‘low’ (Mullis, Martin, Foy and Arora, 2012; Mullis, Martin, Foy and Drucker, 2012).49,50 Students reaching the ‘Low International Benchmark’ for Grade 4 show some basic mathematical knowledge and reading skills. For the purposes of our analysis, students who have not achieved the Low International Benchmark are considered to be experiencing LLOs.
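The benchmark logic itself is simple to sketch. Operational analyses of TIMSS and PIRLS data work with plausible values and sampling weights; the simplified Python sketch below, using invented scores, just flags students below the 400-point Low International Benchmark and computes their share:

```python
# Sketch of benchmark classification: flag students below the
# TIMSS/prePIRLS Low International Benchmark of 400 scale points.
# Scores are invented for illustration; real analyses would use
# plausible values and sampling weights.
LOW_BENCHMARK = 400

scores = [312.5, 405.0, 398.7, 520.1, 455.3, 389.9, 610.4, 375.0]
llo = [s for s in scores if s < LOW_BENCHMARK]
share_llo = len(llo) / len(scores)
print(f"{share_llo:.0%} of students below the Low International Benchmark")
```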

Table 11 lists the percentage of students experiencing LLOs, organised by assessment, country, grade and domain, together with the criteria used to define LLOs in each assessment.

Due to the conceptually different criteria or benchmarks used, the percentage of students experiencing LLOs varies considerably across the assessments. In Burundi and Comoros (PASEC), approximately one in five students experienced LLOs, with similar percentages for the literacy and numeracy domains across Grade 2 and Grade 5. In Kenya, Tanzania and Uganda (Uwezo), the percentage of primary school age students experiencing LLOs differed between the domains. For mathematics, approximately one-third of students experienced LLOs. For literacy, the percentages varied across the Uwezo countries. In Kenya, around one in five students performed poorly in English or Swahili. In Tanzania, every second student of primary school age experienced LLOs in English, and approximately every third student in Swahili. In Uganda, 39 per cent of primary school age students experienced LLOs in English. In Botswana (TIMSS), 40 per cent of Grade 6 students experienced LLOs in mathematics. Around one in four students in Botswana and South Africa showed LLOs in reading (prePIRLS).51

49 Botswana participated in TIMSS 2011 with Grade 6 students, using the Grade 4 TIMSS assessment. Where a country’s Grade 4 students were expected to find the TIMSS assessment too difficult, the IEA encouraged the country to test children in a higher grade.

50 PrePIRLS was chosen over the traditional PIRLS dataset as it is better targeted towards the achievement of students from participating countries for the region. Botswana and South Africa participated in prePIRLS with Grade 4 students in 2011.

51 It is worth noting that Grade 6 students in Botswana were assessed using the TIMSS test material targeted to Grade 4; the proportion of low-performing students in mathematics would presumably be higher if Grade 4 students had been tested with the Grade 4 assessment material. For reading, Grade 4 students in Botswana and South Africa were assessed with the ‘easier’, better-targeted prePIRLS test materials; presumably the proportion of low-performing students would be higher if measured with the standard PIRLS tests.


Table 11. Proportions of students defined as experiencing limited learning outcomes by assessment, country, grade and domain

| Assessment (year) | Criteria for limited learning outcomes | Country | Grade | Mathematics literacy | Reading literacy |
| PASEC (2008/2009) | Students with a test score of less than 25 out of 100 | Burundi | Grade 2 | 18% | 19% (French), 18% (Kirundi) |
| | | Burundi | Grade 5 | 23% | 22% |
| | | Comoros | Grade 2 | 21% | 21% |
| | | Comoros | Grade 5 | 24% | 20% |
| Uwezo (2012) | Students enrolled in Grade 3 and above who could not achieve the highest level of performance | Kenya | Primary school age | 39% | 23% (English), 21% (Swahili) |
| | | Tanzania | Primary school age | 31% | 50% (English), 32% (Swahili) |
| | | Uganda | Primary school age | 32% | 39% (English) |
| TIMSS (2011) | Students scoring below the Low International Benchmark (score of less than 400) | Botswana | Grade 6 | 40% | – |
| prePIRLS (2011) | Students scoring below the Low International Benchmark (score of less than 400) | Botswana | Grade 4 | – | 23% |
| | | South Africa | Grade 4 | – | 29% |


2.1.1 Individual and family characteristics of students with LLOs in literacy and numeracy

Individual and family characteristics of students found to be important in association with LLOs are: gender; age; language spoken at home; a range of socio-economic factors; learning activities prior to attending school; engagement in reading lessons; and whether the student attends lessons out of school. Overview tables with detailed results for each of the factors observed in the analysis per country, grade and domain are presented in Appendix IV.

Gender

In general, the assessments showed that males were more likely than females to experience LLOs in literacy. This pattern was found in Burundi (PASEC, both French and Kirundi), Kenya (Uwezo, both English and Swahili), and South Africa and Botswana (prePIRLS, English). The most extreme example was in Botswana, where males were almost three times as likely as females to be experiencing LLOs. The exception was Comoros, where Grade 5 females were slightly more likely than males to be experiencing LLOs in French (1.1 times as likely). It is not clear why the direction of the gender difference was reversed in Comoros, but we assume other factors contributed to it. For example, females in Comoros with illiterate parents were noticeably more likely to experience LLOs than males.
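Statements of the form ‘almost three times as likely’ are risk ratios of group-level LLO rates. A minimal Python sketch with invented counts (not the actual Botswana figures):

```python
# Sketch of how a relative-likelihood statement can be quantified as a
# risk ratio of two group LLO rates. Counts are invented for illustration.
def llo_rate(n_llo, n_total):
    """Proportion of a group classified as experiencing LLOs."""
    return n_llo / n_total

male_rate = llo_rate(300, 1000)    # 30% of boys below the benchmark
female_rate = llo_rate(100, 1000)  # 10% of girls below the benchmark
risk_ratio = male_rate / female_rate
print(f"Boys are {risk_ratio:.1f} times as likely as girls to experience LLOs")
```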

The findings for gender and its relationship with literacy mirror previous findings from the region. Girls were reported to outperform boys in reading literacy in Botswana (Monyaku, 2012), South Africa (Moloi and Chetty, 2010), Zimbabwe (ACER and ZIMSEC, 2015) and Eritrea (UNICEF Eritrea, n.d.). In contrast, an EGRA study in Ethiopia showed that Ethiopian boys had higher early reading scores than girls. However, there was an interaction effect with region: rural boys outperformed rural girls on almost all tasks, but in urban schools the opposite was the case, with girls outperforming boys (RTI, 2010, p. 37).

For mathematics, girls were over-represented in the LLO group in Burundi and Comoros, but under-represented in Kenya and Botswana. Again, the differences across countries are likely attributable to other factors. For example, in Botswana, a greater percentage of students considered relatively old for their grade (aged above 14 years) were male, compared with other age groups.

The different directions of the gender differences found in the assessments are consistent with the findings in prior studies from the region. Boys were reported to have outperformed girls in mathematics in Kenya, Uganda and in provinces from Somalia (Report on Monitoring Learning Achievements (MLA) in Grade 4 in Puntland and Somaliland, 2012; Uganda National Examinations Board, 2010; Paul M. Wasanga et al., 2012). However, the opposite pattern was found in Zimbabwe and South Africa (ACER and ZIMSEC, 2015; Moloi and Chetty, 2010).

Age

Age – relative to school entry and grade – is an important factor associated with student performance in the ESA region. However, the relationship between age and performance is complex, and may be influenced by other factors, such as whether school entry occurs at the official starting age, the student’s cognitive development at the time of school entry, prior learning opportunities that affect progression through the grades, and instructional practices in response to student diversity.52 A literature review conducted for a study by Hungi et al. (2014) found different effects of age – relative to the grade the students are in – for developed countries, where ‘older’ students in a class generally outperformed their ‘younger’ classmates, and for developing countries, where most studies, especially from Africa, show that ‘younger’ students perform better than ‘older’ students (Hungi et al., 2014, p. 249).53

The assessments we reviewed showed that across the region younger students consistently performed better.54 For example, Grade 6 students in Botswana who were 12 years or younger (approximately one-fifth of the population) were only about one-third as likely to be experiencing LLOs in mathematics as students aged over 12. This is well supported by the literature in the region. A study by Kunje (cited in Hungi et al., 2014) showed that younger students in Grade 7 in Malawi outperformed their older peers in English literacy, Chichewa (a local language) and mathematics. SACMEQ III in 2007 showed that in 12 out of 15 countries, younger students performed better than older students; in nine of those countries, younger students performed better in both reading and mathematics (Hungi et al., 2014). Additionally, data from 740,000 students sitting the 2010 Kenya Certificate of Primary Education examination showed that younger students performed better than older students, according to Keith et al. (cited in Hungi et al., 2014).

Students relatively older than the average class age were over-represented in the LLO groups for Botswana (prePIRLS, TIMSS), South Africa (prePIRLS) and Grade 5 students in Comoros and Burundi (PASEC). In contrast, relatively older Grade 2 students in Burundi, and older children in Kenya, were less likely to be experiencing limited learning. These mixed findings may be attributable to the different year levels examined. Relatively older students in the later years of primary school are more likely to have repeated a grade than students in the earlier years; in earlier years, age differences are more likely due to different ages of school commencement. For example, of the Grade 5 students in Comoros who were considered relatively older, two-thirds had repeated a grade at some stage. These students were considerably more likely to be experiencing LLOs than those who had not repeated (and had therefore started school at a later age).

The 2012 Monitoring Learning Achievement (MLA) project for Malawi found that learners who had repeated at least one grade scored significantly lower than those who had never repeated a grade. The achievement levels of those who repeated were generally low, and this was more evident in higher than in lower grades (Ministry of Education, Science and Technology of Malawi, 2014). These studies suggest that, while grade repetition is not necessarily the cause of poor performance, repeating a grade may not help low-performing students, regardless of age (Hungi et al., 2014).

52 For example, as reported in Hungi, Ngware and Abuya (2014), a reason given for early school entry by some parents in Kenya is ‘... a hope, that, if the children do not do well, they can always repeat because they have a year or two to spare compared to their classmates’ (Hungi et al., 2014, p. 256).

53 Hungi et al. (2014) investigated the optimal age with the greatest positive impact on literacy achievement for Grade 6 students from low-income families across six major slums in Kenya. For this study, a sample of 7041 Grade 6 students from 226 schools across six major urban slums in Kenya was drawn (Hungi et al., 2014, p. 247).

54 The criteria and relative proportions in each age group varied across countries and datasets. In PASEC for Burundi and Comoros this was defined as 5 years old or below for Grade 2 students and 8 years or below for Grade 5 students. For prePIRLS for South Africa and Botswana (also for TIMSS) this was defined as 12 years or less. For Uwezo for Kenya, Tanzania and Uganda this was defined as children who were aged between 6 and 9 years. From the data we are unable to determine whether younger students began school at an earlier age or whether a high proportion of other students were late entrants to schooling or had repeated a grade.


Language spoken at home

The language spoken at home – or, rather, the degree of alignment between the language spoken at home and the language of instruction – has a strong impact on learning outcomes. In multilingual environments such as the ESA region, instruction may be offered in multiple languages, which can pose complex challenges for education policy and management (Heugh, Bogale, Benson and Yohannes, 2006). Research on first language instruction shows that children benefit from mother-tongue instruction for their cognitive development in general, and early literacy acquisition in particular (Ball, 2010; Bialystok, 2001; Cummins, 2000; Heugh et al., 2006; RTI, 2010; UNESCO, 2010).55 However, the alignment between language spoken at home and language of instruction is just one factor affecting student achievement. Other interacting factors are socio-economic status; the overall quality of language/reading instruction and instruction in general; provision and use of language instruction materials/books; linguistic complexity; the teacher’s proficiency in the language of instruction; teacher training; and the school environment. Given these complex interactions and particular country contexts, further research is needed on language-of-instruction policies and practices (RTI, 2010).

The language spoken at home was captured for Burundi and Comoros (both PASEC), South Africa (prePIRLS) and Botswana (TIMSS and prePIRLS). In most instances across the assessments, speaking the test language at home was an advantage for children. In Comoros, where students were assessed in French, 97 per cent of all Grade 2 students spoke Shikomori at home (96 per cent of Grade 5 students), while only 3 per cent spoke French at home (4 per cent of Grade 5 students). Students who spoke French at home were less likely to experience LLOs in both French and mathematics. In Burundi, 95 per cent of Grade 2 students spoke Kirundi at home (the teaching language until Grade 4); they were no more likely to experience LLOs in that language, but were over-represented in the LLO group for the French language assessment. In South Africa (prePIRLS), the vast majority (91 per cent) of students spoke the language of the assessment at home at least sometimes.56 In Botswana, by comparison, approximately three in four students spoke the language of assessment (English) at home (74 per cent in prePIRLS, 78 per cent in TIMSS). Those who spoke the test language at home were less likely to be experiencing LLOs in South Africa, with mixed findings for Botswana (a difference was found in TIMSS but not in prePIRLS). It is not apparent why the effect of test language was not consistent across studies. However, the data do suggest that any relationship between speaking the test language at home and performance is likely moderated by socio-economic factors: students from both South Africa and Botswana who always (or almost always) spoke the test language at home were also more likely to have greater home resources.

55 A child’s first language is also often referred to as mother tongue and as the language spoken at home in assessments.

56 In South Africa, the prePIRLS assessment was administered in the 11 official languages, and more than 90 per cent of the population were administered the test in a language they speak at least sometimes at home. Information from the South African PIRLS 2011 national report indicates that less than 10 per cent of the population mainly speak English at home, despite English being the language of instruction for almost 80 per cent of the population at Grade 4 (Howie et al., 2012).


Other studies within the ESA region show similar results. Evidence from PASEC and SACMEQ shows a strong link between the match of home language and language of instruction and test scores (Fehrler and Michaelowa, 2009, cited in UNESCO, 2010, p. 154; Garrouste, 2011). While this is more commonly seen as a factor in literacy assessments, evidence from Namibia using SACMEQ results suggests that low language skills are also a large contributor to low performance in mathematics (Garrouste, 2011, p. 231). Students in Zimbabwe’s ZELA project who spoke English at home performed better on the English and mathematics tests than students who spoke other languages at home (the mathematics test was also administered in English).57 Very few students (3 per cent) spoke English at home; the main language spoken at home was Shona (70 per cent) (ACER and ZIMSEC, 2015, p. 29).

Socio-economic factors

The socio-economic status of students is a strong predictor of achievement, and the relationship between socio-economic factors and achievement holds consistently across the different assessments. Our analyses support this finding: students from lower socio-economic backgrounds were more likely to experience LLOs in both literacy and numeracy across all countries examined. This relationship was found across all assessments, even though the assessments (and the countries within them) used different measures of socio-economic status.58

Parental education and literacy were also consistently found to be associated with LLOs. Mothers of students experiencing LLOs in Kenya, Tanzania, Uganda, South Africa and Botswana were more likely to have no formal education or lower levels of education (the same pattern was found for fathers in the latter two countries).59 Similarly, in Burundi and Comoros, the most common pattern was that students identified with LLOs were more likely to have illiterate mothers and fathers.

Other socio-economic measures we examined showed that students’ home backgrounds affected their school performance. A number of indicators of home possessions were identified, such as housing materials. In Kenya and Tanzania (Uwezo), students were more likely to experience LLOs in literacy and mathematics if the walls of their houses were made of the least expensive of the possible materials (mud), and less likely if the walls were made of the most expensive (stone/brick). This is not to suggest a direct link between housing material and student learning; rather, using housing material as a proxy for socio-economic status supports the finding that student performance is affected by home background.

57 The ZELA English test was administered in the context of the national policy that children in Zimbabwe learn English as a subject from Grade 1 (ACER and ZIMSEC, 2015, p. 36).

58 The assessments analysed as part of this study did not have data related to nutrition, child health or other related child development factors, which may constitute an area of further investigation with regard to student performance.

59 Uwezo only collected data on maternal education for Kenya, Tanzania and Uganda, whereas TIMSS and prePIRLS collected data for both parents.


The assessments we analysed all included measures of household possessions. Home possession indices for Comoros and Burundi (PASEC), Botswana (TIMSS and prePIRLS) and South Africa (prePIRLS) generally showed that students experiencing LLOs, regardless of the domain, were more likely to have fewer household resources.60,61 In general, across all assessments, students experiencing LLOs were less likely than their peers to have a range of household possessions (e.g., fridge, TV, phone, books), basic facilities (access to electricity or clean water) and individual possessions such as children’s books.
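The simplest form of a home-possessions index is a count of binary ownership indicators. The Python sketch below uses the item list from the footnoted PASEC-style index; the unweighted count is a simplification of the scaling actually used in the assessments:

```python
# Sketch of a simple home-possessions index: a count of binary
# possession/facility indicators per household. The unweighted count is
# a simplification of the index construction used in the assessments.
ITEMS = ["electricity", "television", "telephone", "fridge",
         "gas_heating", "video_recorder", "computer", "car"]

def possessions_index(household):
    """Count how many of the listed items a household reports owning."""
    return sum(1 for item in ITEMS if household.get(item, False))

# Invented example household owning three of the listed items.
home = {"electricity": True, "television": True, "fridge": True}
print(possessions_index(home))
```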

Students experiencing LLOs in Comoros were also found to undertake more work activities outside school than other students, including farm, household and retail work. These students were also more likely to indicate that their work hindered their ability to study at home, attend school and concentrate at school. The same association between non-school-related work and LLOs was not found for students from Burundi. A similar association was found in ZELA, where the amount of time students spent working was negatively associated with achievement (ACER and ZIMSEC, 2015).

Learning activities prior to attending school

Analysing data from Botswana and South Africa (prePIRLS and TIMSS), we found that students who attended preschool (46 per cent of Grade 4 students in Botswana and 83 per cent in South Africa) were less likely to experience LLOs than students who did not.62 This is a well-supported finding: other regional studies, from Zimbabwe, Zambia and Malawi, found a relationship between preschool attendance and achievement levels (ACER and ZIMSEC, 2015; Collins et al., 2012; Ministry of Education, Science and Technology of Malawi, 2014). However, it is important to note that the relationship is not necessarily causal. For example, students who attended preschool in Botswana and South Africa were also more likely to have greater home resources for learning and higher levels of parental education.

Parents of students in Botswana and South Africa were also asked about their children’s exposure to activities prior to attending school related to reading (prePIRLS) and mathematics (TIMSS), as well as their level of competency in these domains once they started school. Early activities included reading books, telling stories, singing songs, playing word games, writing letters or words, and reading aloud signs and labels. Examples of early numeracy activities are counting different objects, playing games involving shapes, playing with building blocks or construction toys, or playing board games or card games. Students experiencing LLOs were less likely to have had exposure to such activities before attending school and were rated as having lower levels of competency at commencement. Again, the socio-economic status of the family was found to be associated with these two measures. This suggests that families with greater home resources are more likely to engage in learning activities with their children, which, in turn, is likely to increase the capabilities of students when commencing school.

60 This index was based on the following home possessions and basic facilities: electricity, a television, a telephone, a fridge, gas heating, a video recorder, a computer and a car.

61 This index includes number of books in the home, number of children’s books in the home, number of home study supports, highest parental education level, highest parental occupation level.

62 45 per cent of Grade 6 students in Botswana assessed in TIMSS attended preschool.


Engagement in reading lessons and out-of-school lessons

Students from South Africa and Botswana (prePIRLS) were presented with a series of statements that probed their engagement with reading lessons.63 Their responses were categorised as ‘Engaged’, ‘Somewhat engaged’ or ‘Not engaged’. Students who were ‘Engaged’ were far less likely to be experiencing LLOs; those who were ‘Somewhat engaged’ or ‘Not engaged’ were much more likely to be. In South Africa, students who fell into the latter category were almost three times more likely to be experiencing LLOs. It is important to note that the relationship between engagement and performance is likely to be reciprocal: more engaged students are more likely to perform better, and higher performing students are more likely to be more engaged.

The indicator of home resources for learning was lower for those students who experienced LLOs and were categorised as ‘Not Engaged’, suggesting that socio-economic status may influence engagement. Howie and colleagues emphasize the need for schools to provide students with a ‘variety of stimulating, developmentally appropriate reading materials aligned to teaching practices that encourage active learning on the part of learners’, in order to help them engage with their reading (Howie et al., 2012, p. 109).

Kenyan students who were found to be experiencing LLOs (English, Swahili or mathematics) were less likely to have received extra lessons or tuition. Much of this may be explained by other demographic factors. For instance, students who received extra lessons or tuition were more likely to have attended private schools, to have a higher number of household possessions and to have greater access to basic facilities.64 Parents in Kenya, as well as Mauritius, have been reported to be among the highest users of extra tuition outside school (Paviot et al., 2008). However, it may be that the relative costs of such services mean they are not often used by families with fewer resources, regardless of academic need (Buchman, 2000).

2.1.2 School-level characteristics of students with LLOs in literacy and numeracy in ESAR

School-level characteristics that were observed to be factors related to LLOs are school resources, school type and school location. Overview tables with the detailed results for each of the factors observed in the analysis per country, grade and domain are presented in Appendix IV.

School resources

The school resources captured in PASEC, prePIRLS and TIMSS and included in this analysis cover electricity, drinking water facilities, toilets, a school library, and a computer room or computers for instruction. The proportion of students attending schools where such facilities are available varies widely. For example, a maximum of 27 per cent of Grade 5 students in Comoros attended a school with electricity, as opposed to 7 per cent in Burundi. Students in Comoros also had more access to drinking water in school (71 per cent versus 40 per cent of Grade 5 students in

63 Items that form the scale include ‘I like what I read about in school’; ‘My teacher gives me interesting things to read’; ‘I know what my teacher expects me to do’; ‘I think of things not related to the lesson’; ‘My teacher is easy to understand’; ‘I am interested in what my teacher says’; ‘My teacher gives me interesting things to do’.

64 The available data for this study do not provide information about the quality of the extra lessons and their direct impact on learning outcomes.


Burundi). Toilets were available for at least three-quarters of students in Comoros, and for nearly all students (92 per cent) in Burundi. A school library was available for a maximum of 12 per cent of students in Grade 5 in Comoros and 3 per cent of Grade 2 students in Burundi. Approximately half of the students participating in prePIRLS and TIMSS in South Africa and Botswana were in schools with a library. The biggest difference was found for computer resources: while 1 per cent of students in Comoros and Burundi were in schools with a computer room, between 48 per cent of Grade 4 students in South Africa and 70 per cent of Grade 6 students in Botswana attended a school with computers for instruction.

Students who attended schools with access to electricity and drinking water facilities in Comoros and Burundi (PASEC) were less likely to experience LLOs than students attending schools without such facilities. The ZELA study in Zimbabwe also found adequate water and electricity resources to be strongly associated with student achievement even once other factors such as home resources of the student are taken into account (ACER and ZIMSEC, 2015).65

In Botswana (TIMSS, prePIRLS) and South Africa (prePIRLS), a smaller percentage of students in schools with library facilities experienced LLOs than in schools without them. This pattern was not observed in Comoros and Burundi (PASEC), where far fewer schools had access to this resource. Similar results were observed for access to school computers: schools in Botswana and South Africa with computers for instruction tended to have fewer students experiencing LLOs, whereas the proportion of schools in Comoros and Burundi with computer rooms was negligible.

Majgaard and Mingat (2012) provide a comprehensive overview of school inputs that contribute to learning achievement in primary schools in low-income sub-Saharan African countries. The authors combined test scores from three international learning assessment programmes – SACMEQ, PASEC and the MLA surveys – to create a comparable Africa Student Learning Index (ASLI) for the region. At the school level, their major findings on how learning outcomes can be improved point to observable characteristics such as the quality of school buildings and the availability of libraries, although the authors note that resources by themselves do not necessarily improve learning; pedagogy plays a large role. The studies we analysed show that school resources such as clean water, adequate sanitation and access to suitable learning and reading materials, such as the provision of libraries, are associated with positive student learning outcomes.

School type and school location

Data from Kenya, Tanzania and Uganda (Uwezo) indicate an association between school type (public versus private) and student outcomes. Students attending public schools were more likely to be experiencing LLOs than those attending private schools.

In Botswana (TIMSS, prePIRLS) and South Africa (prePIRLS), students attending rural schools were more likely to be experiencing LLOs than their peers attending urban schools.

School type has consistently been shown as a good predictor of achievement (ACER and ZIMSEC, 2015; Ministry of Education, Science, and Technology of Malawi, 2011; Wasanga et al., 2012). Students attending non-government schools traditionally outperform those attending

65 The study reported the presence of electricity and water facilities at the school to be significantly associated with mathematics achievement for both rural and urban areas, but only rural (and not urban) areas for English achievement.


government schools (Uganda National Examinations Board, 2010). In Eritrea, non-government schools also performed better than government schools in the three learning areas assessed (UNICEF Eritrea, n.d.). The same pattern has been reported in numerous studies regarding school location: students attending schools in urban environments tend to have greater achievement levels than those attending schools in rural environments, a finding almost universally consistent across studies (ACER and ZIMSEC, 2015; Makuwa, 2005; Ministry of Education et al., 2011; Moloi and Chetty, 2010; Wasanga et al., 2012).

This is not to say that schools perform better because they are located in urban areas. The amount of resourcing that a school has available likely explains the relationship. As an example, data from Botswana (TIMSS) show that urban schools were more likely to be private and better resourced than schools in rural areas. Indeed, it is likely that school resourcing accounts for much of the variation found between different school types, and between different school locations.

2.2 Trends in learning outcomes of children in primary education in literacy and numeracy in the ESA region

Tracking progress in student performance over time is an important element of system-level monitoring in education. As discussed in Chapter 1, approximately two-thirds of the assessments we reviewed have a system-level monitoring purpose, and conduct recurring assessments to monitor changes in students’ literacy and numeracy performance levels. Apart from SACMEQ, PASEC, Uwezo and PIRLS/TIMSS, these assessments are conducted at a national level. As a regional assessment in Southern and Eastern Africa, SACMEQ is best placed to provide comparable data for a large number of countries (12 ESA countries participated at least twice). Since SACMEQ data were unavailable for this study, trends in performance were analysed for the three countries participating in Uwezo: Kenya, Tanzania and Uganda.66

2.2.1 Uwezo literacy trends

Performance data for Uwezo relate to the percentage of primary-school-aged students able to successfully complete each of the tasks within a learning domain.67 Performance is therefore presented at task level for each participating country and year, to explore changes over time.

Students undertaking the literacy component of Uwezo are rated on the highest task they can complete, on an ascending scale of difficulty: ‘nothing’, ‘syllables’, ‘words’, ‘paragraph’ and ‘story’. The bar graph below shows an example of performance levels for Kenyan students completing the English literacy component over the three years for which data are available (see Figure 4). The data are presented in a hierarchical format that displays the highest task that each student could complete. For 2009–10, it shows that 49 per cent of students were able to complete the story task. As this is the most difficult task, this group of students would therefore also have successfully completed the paragraph, words and syllables tasks. Only 6 per cent of students were not able to complete the most basic task (syllables).
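The hierarchical coding described above can be sketched as follows. The student records and helper names are hypothetical, not actual Uwezo data or code:

```python
from collections import Counter

# Sketch of deriving a "highest task completed" distribution from
# per-student task results. Each student is counted once, at the most
# difficult task they managed.

LEVELS = ["nothing", "syllables", "words", "paragraph", "story"]

def highest_level(completed):
    """Most difficult level reached; 'nothing' if no task was completed."""
    indices = [LEVELS.index(task) for task in completed if task in LEVELS]
    return LEVELS[max(indices)] if indices else "nothing"

students = [
    ["syllables", "words", "paragraph", "story"],  # completed everything
    ["syllables", "words"],                        # stopped at words
    [],                                            # completed no task
]

distribution = Counter(highest_level(tasks) for tasks in students)
print(distribution["story"], distribution["words"], distribution["nothing"])
```

Dividing each count by the number of students then yields the percentages shown in the figures.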

66 This study in particular looks at trends in performance since 2007. At the time of data analysis, PASEC was last implemented in Burundi and Comoros in 2008/2009 (only once). The first administration of prePIRLS was in 2011. Botswana had participated in TIMSS before, but with varying target populations (TIMSS 2007: Grade 8; TIMSS 2011: Grade 6 and Grade 9).

67 See Appendix I for further discussion of Uwezo data-related issues for trend analyses.


Figure 4 shows a similar proportion of students being able to complete each task across the three years. The differences in proportions across time for each task – or changes in student performance – are all relatively minor. Similar figures for trends in English are provided in Appendix IV for Tanzania and Uganda, and for Swahili in Kenya and Tanzania.

In Tanzania, there was little improvement over time for English in terms of the proportion of students able to complete the most complicated task (story). However, there was a significant reduction in those not able to complete any task, dropping from 34 per cent in 2009–10 to 29 per cent in 2012. This difference was still significant – although reduced in magnitude – after a model that incorporated age and gender was introduced. In Uganda, the opposite pattern was found. There was little change in the proportions of students who could not complete any task, but a significant increase in the proportion of students able to complete the most complicated task, rising from 17 per cent in 2009–10 to 27 per cent in 2012. Because the populations of students that completed the assessments were different, a separate analysis was conducted to determine whether age and gender might account for the difference. Even after these two demographic variables were controlled for, the increase remained significant.
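A change in a task-level proportion, such as Tanzania's drop from 34 to 29 per cent, can be checked with a two-proportion z-test, sketched below. The cohort sizes are hypothetical, and the sketch omits the age and gender controls mentioned above, which would require a regression model:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions,
    using the pooled proportion under the null hypothesis of no change."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 34% of a hypothetical 2,000 students in 2009-10 vs 29% of 2,000 in 2012.
z = two_proportion_z(0.34, 2000, 0.29, 2000)
print(abs(z) > 1.96)  # True: significant at the 5 per cent level
```

With these illustrative sample sizes the statistic is around 3.4, well beyond the 1.96 threshold for significance at the 5 per cent level.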

For Swahili, there were few observable trends found across Kenya, but in Tanzania there was a noticeable drop in the proportion of children able to complete the most difficult task, from 45 per cent in 2009–10 to 36 per cent in 2012. The drop remained significant even after the age and gender of the populations were taken into account.

Figure 4. Trends in English performance across time for students in Kenya (Uwezo)

[Stacked bar chart showing, for each assessment year, the percentage of students whose highest completed task was Nothing, Syllables, Words, Paragraph or Story.]


2.2.2 Uwezo numeracy trends

Similar to reading, seven common groups of tasks of increasing difficulty were defined for mathematics: ‘counting’, ‘numbers’, ‘values’, ‘addition’, ‘subtraction’, ‘multiplication’ and ‘division’. An example of mathematics performance in Tanzania over time is shown below (see Figure 5). Little variation is evident across the three years in the percentage of students not able to complete anything (11 per cent). At the other end of the performance spectrum, there is an increase from approximately one-third of students able to complete multiplication in 2009–10 to approximately one-half of all students in 2012, a significant improvement that remained after the age and gender of the populations were taken into account.

Figure 5. Trends in Mathematics performance across time for students in Tanzania (Uwezo)

For Kenya, we compared mathematics trend results between Uwezo 2011 and 2012 and found little difference in performance across the two years.68 In Uganda, there was a drop from 32 per cent to 26 per cent between 2011 and 2013 in the percentage of students able to successfully complete the most difficult task, which appears to be related to an increased percentage of students reaching their highest task performance at lower levels.

Tables with detailed results for the mathematics performance trends over time in Kenya and Uganda are presented in Appendix IV.

68 The Uwezo division task was not administered in Kenya in 2009–10. The nature of the mathematics performance data from 2009–2010 is therefore not consistent with data from the other two assessments, and was not included in the trends analysis.

[Stacked bar chart showing, for 2009–10, 2011 and 2012, the percentage of Tanzanian students whose highest completed mathematics task was Nothing, Counting, Numbers, Values, Addition, Subtraction or Multiplication.]


2.2.3 Uwezo: trends for children experiencing LLOs in literacy and numeracy

In addition to monitoring trends in performance at each task level, we analysed trends in the percentage of children experiencing LLOs over time, using the same criteria reported in Chapter 2.1 and defined in Appendix II. This analysis allows us to assess whether any potential interventions or policy changes at the national level had the desired effects on children performing at the lowest levels. The trend data for the proportions of students experiencing LLOs in each domain, for each country participating in Uwezo, appear below (see Figure 6).69

Figure 6. Trends in proportions of students experiencing LLOs across Uwezo countries

69 Mathematics data for Kenya in 2010/2011 was not comparable with data for 2012 and 2013, and is thus not reported in this figure.

We found little change over time in the relative percentages of students experiencing LLOs in Kenya for each of the three domains. A slight but significant increase can be seen from 2009 to 2010 in Swahili.

In Tanzania, however, a significant reduction in the percentage of students experiencing LLOs in English and mathematics is observed from 2009/2010 to 2011 and also from 2011 to 2012. Conversely, a greater percentage of children experienced LLOs in Swahili from 2009/2010 to 2011.

[Bar chart showing the proportion of students experiencing LLOs (per cent, 0–60) in English, Swahili and mathematics for Kenya, Tanzania and Uganda, across the 2010/2011, 2012 and 2013 rounds.]


In Uganda, we observed a reduction in the proportions of children experiencing LLOs in English across the three time periods. For mathematics, however, we noted a significant increase from 2012 to 2013 in the proportions of children experiencing these problems.

Overall, when looking at the trend patterns from Uwezo across domains, there are instances where a higher percentage of students were able to complete the more difficult tasks, and instances where the proportion of students unable to complete even the most basic task changed. The reasons for these changes are not clear, nor is it clear whether they are positive or negative. To comprehend them, a better understanding is needed of the policy, financing and socio-political environments for education in each country – and how these developed over the years during which Uwezo was conducted.

2.2.4 Other literacy and numeracy trends for the ESA region

Additional information on trends in performance for the region is available from SACMEQ and PIRLS. Table 12 summarises the direction of major trend increases or decreases for the 15 entities that participated in the SACMEQ reading and mathematics assessments in 2000 and 2007. A major increase or decrease is defined as greater than 10 scale points (Hungi et al., 2011). Lesotho, Mauritius, Namibia, Swaziland and Tanzania (Mainland) all had increases in both reading and mathematics of more than 10 points. In Botswana and Tanzania (Zanzibar), only reading scores improved by this margin over the time period, while in Malawi only mathematics scores improved to this extent. A decrease of more than 10 points was found in Mozambique for both reading and mathematics, and only for mathematics in Uganda (Hungi et al., 2011).
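The classification used here (a 'major' change being more than 10 scale points) can be expressed as a small helper. The function name and the scale scores below are illustrative, not taken from the SACMEQ data:

```python
def trend_direction(score_2000, score_2007, threshold=10):
    """Classify the change in a scale score between two cycles,
    where 'major' means more than `threshold` points."""
    change = score_2007 - score_2000
    if change > threshold:
        return "major increase"
    if change < -threshold:
        return "major decrease"
    return "no major change"

# Hypothetical scale scores for illustration only.
print(trend_direction(500, 515))  # major increase
print(trend_direction(520, 505))  # major decrease
print(trend_direction(510, 512))  # no major change
```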

Grade 5 students from South Africa participated in the PIRLS study for two cycles (2006 and 2011). Although reading performance for students in 2011 was higher than for students in 2006, this difference was not significant for either English or Afrikaans (Howie et al., 2012). Girls outperformed boys in both cycles, but the gender difference fell from 37 scale points in 2006 to 26 scale points in 2011.

The limited data available for this aspect of learning outcomes make any generalisations difficult for the ESA region. Conclusions based on improvement or decline in student abilities should only be considered at the national level, with careful consideration given to national contextual factors. The contextual background differences of students across years should also be taken into account when interpreting the results.


Table 12. Direction of trends in SACMEQ reading and mathematics scale scores from 2000 to 2007

Country               Reading            Mathematics
Botswana              Increase           No major change
Kenya                 No major change    No major change
Lesotho               Increase           Increase
Malawi                No major change    Increase
Mauritius             Increase           Increase
Mozambique            Decrease           Decrease
Namibia               Increase           Increase
Seychelles            No major change    No major change
South Africa          No major change    No major change
Swaziland             Increase           Increase
Tanzania (Mainland)   Increase           Increase
Tanzania (Zanzibar)   Increase           No major change
Uganda                No major change    Decrease
Zambia                No major change    No major change
Zimbabwe              Not administered   Not administered

Increase/Decrease: change of more than 10 points from 2000 to 2007. No major change: less than a 10-point increase or decrease from 2000 to 2007. Not administered: assessment not administered in 2007.

Source: Hungi et al., 2011



3. Improving learning outcomes in the ESA region: Effective country-level practices

Low learning outcomes in literacy and numeracy – along with a considerable proportion of disadvantaged students not reaching basic skills in these domains – are among the many challenges for primary education in the ESA region. The focus of this chapter is on country-level practices that have proved to be effective in improving the literacy and numeracy learning outcomes of disadvantaged primary students in the ESA region.

When speaking about the ‘effectiveness’ of a programme to improve student performance, two main aspects need to be considered:
• The characteristics and strategies of the programme that were developed and implemented to improve student performance, such as teacher training in core reading skills, producing reading materials in the local language, or introducing a reading buddy;
• The characteristics of the programme evaluation design that allow for the measurement of the impact of the programme on student performance, such as using randomised control trials, and establishing a baseline, mid-line and end-line of student performance using adequate performance measures.

Both aspects are important. While the former make a programme effective, the latter provides evidence of a programme’s effectiveness. The more closely that the evaluation design is aligned with the programme’s intended goals – and the higher the quality of the tools to measure performance – the more valid and meaningful the results, which can then feed into strengthening educational interventions to improve student learning outcomes.

There is a substantial body of literature for the ESA region on the impact of practices to increase the quantitative aspects of education quality – such as access, enrolment and retention rates. In contrast, there are few reports on programmes to improve student learning outcomes and on their impact. This observation is not particular to our study. A review of 115 impact evaluation studies in 33 low- and middle-income countries (Murnane and Ganimian, 2014) found a variety of policies that were effective in increasing enrolment of students from low-income families (Murnane and Ganimian, 2014, p. 43). Strategies to improve the quality of education – and hence student performance – were found to be more complex and thus more challenging to undertake. The review revealed that findings about the impact of strategies on student performance are often inconsistent, and that ‘blanket statements about the effectiveness of particular reform strategies are neither accurate nor helpful’ (Murnane and Ganimian, 2014, p. 44). Also, interventions may have different effects on the different groups targeted. A study in Kenya, for example, found that low- and high-achieving students derived very different benefits from free English textbooks (Glewwe, Kremer and Moulin, 2009). It is therefore important to consider the effects of an intervention for specific groups or sub-groups, in order to understand whether the same intervention would have a similar outcome with a different population (Murnane and Ganimian, 2014, p. 44).

We take these considerations into account in the following analysis of effective country-level practices that focus on the improvement of learning outcomes in the literacy and numeracy of


disadvantaged children in primary education in ESAR. We selected the programmes based on three main principles:
• The programme aims at improving learning outcomes in literacy and numeracy.
• The programme targets disadvantaged children in primary education.
• The impact of the programme on children’s literacy and numeracy learning outcomes has been evaluated.

3.1 Country-level programmes analysed

Altogether, 10 programmes were identified in seven of the 21 ESA countries that effectively improved literacy and numeracy learning outcomes for disadvantaged children in primary education. An overview of these programmes is presented in Appendix V (see Table 23, Appendix V). It is important to note that the programmes presented are examples and do not claim to be exhaustive.70

We categorised the programmes into four groups according to their main objectives. We did this to assist the analysis and help readers identify programmes that match their interests:
• Early grade literacy or numeracy programmes (7):
  - Literacy Boost in Ethiopia, Malawi and Mozambique
  - Reading to Learn in Kenya and Uganda
  - Primary Maths and Reading Initiative (PRIMR) in Kenya
  - Malawi Teacher Professional Development Support (MTPDS)
• School improvement programme (1):
  - JET’s School Improvement Programme in South Africa
• Early childhood development programmes (2):
  - ECD component of the Early Literacy Project in Mozambique
  - Early Literacy and Maths Initiative (ELMI) as part of the Innovation for Education Programme in Rwanda.

The majority of the identified programmes aim at improving early grade literacy or numeracy (or both). Disadvantaged children targeted in these programmes are low performers from a low socio-economic background or from economically disadvantaged or remote areas. Literacy Boost in Mozambique addresses children affected by HIV/Aids. Young children in communities affected by HIV/Aids are also the target group of the ECD programme in Mozambique. ELMI targeted preschool children, including children from remote areas without access to ECD programmes. JET’s school improvement programme was designed for schools in economically disadvantaged rural areas.

All programmes were evaluated using either randomised control trial (RCT) or quasi-experimental designs. In these settings, the students being studied are allocated (randomly, in the case of an RCT) to the intervention under study and compared with control groups receiving no intervention. In all programme evaluations, a baseline and an end-line were conducted to measure progress in performance. Such an experimental evaluation design, with pre- and post-measurement of performance, needs to be built in from the very beginning of the programme design, as was commonly the case in the example programmes.
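Evaluations with baselines, end-lines and control groups commonly summarise impact as a difference-in-differences. This is a minimal sketch of that general idea, with hypothetical mean scores, not a description of any particular programme's analysis:

```python
def difference_in_differences(treat_base, treat_end, ctrl_base, ctrl_end):
    """Impact estimate: the treatment group's baseline-to-end-line change
    minus the control group's change over the same period."""
    return (treat_end - treat_base) - (ctrl_end - ctrl_base)

# Hypothetical mean literacy scores (baseline -> end-line).
impact = difference_in_differences(40.0, 55.0, 41.0, 48.0)
print(impact)  # 8.0
```

Subtracting the control group's change guards against attributing ordinary year-on-year growth to the intervention.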

70 A variety of programmes are implemented in the ESA region, as well as in countries where no examples have been identified for this analysis. In general, we observed a lack of evaluation reports or other documentation about the impact of a particular programme on learning outcomes; another 10 programmes were identified during this study where evaluation, including measurement of learning outcomes, was still in progress. They were not included.


3.2 Key strategies for success

One common feature among the programmes we investigated is the holistic approach they take, in which a number of interventions to improve the learning outcomes of disadvantaged children are implemented at different levels. Viewing a programme holistically means seamlessly weaving together activities for teachers’ professional development, provision of teaching/learning materials, community mobilisation, and capacity-building at the system level. The interventions flowing from such an approach work in harmony at the system, community and school levels. One shortcoming is that holistic approaches make it difficult to assess which individual strategies are successful, which, in turn, poses challenges for scaling up an intervention. Another shortcoming is the difficulty of ensuring that all stakeholders involved are equally committed to the intervention.

Common intervention strategies identified for the four programme groups are discussed in more detail in the following sections.

3.2.1 Early grade literacy and numeracy programmes

The main interventions we observed in the early grade literacy and numeracy programmes are teacher training on reading/mathematics instruction; provision of teaching/learning materials; production of reading materials in the local language; and community- and home-based reading activities that increase access to reading materials for children in and out of school.

All seven programmes reported significant improvement in the reading and mathematics skills of students who were targeted for these interventions.

A number of interventions contributed to the success of Save the Children’s Literacy Boost in Ethiopia, Malawi and Mozambique. In Ethiopia, having a reading buddy turned Literacy Boost non-reading students into readers. In Malawi, Literacy Boost employed a three-pronged approach that included: ‘(1) use of assessments to identify gaps and measure improvements in the five core reading skills; (2) training of teachers in the national curriculum with an emphasis on core reading skills; and (3) community action through community mobilisation to support children’s reading’ (Save the Children, n.d., p. 2). The success of the programme is credited to the use of these three measures.

Reading to Learn was implemented in the official languages of reading instruction in the early primary grades, i.e., Lango in Uganda and Swahili in Kenya. At the heart of the Reading to Learn programme is a ‘five-step scaffolding approach to literacy instruction, building from a conceptual understanding of stories, to decoding letter-sound relationships and eventually writing new sentences and stories’ (Lucas, McEwan, Ngware and Oketch, 2014, p. 951). The coherence of this instructional model, which was aligned with materials, teacher training and well-targeted instructional interventions, helped ensure its success.

The Primary Maths and Reading Initiative (PRIMR) in Kenya contributed to the development and use of pedagogical materials and practices to improve children’s foundational reading and mathematics skills. Materials developed were congruent with Kenyan curriculum documents and included detailed lesson plans based on best teaching practices identified through research,


student books, training manuals for Teacher Advisory Centre (TAC) tutors and teacher training videos, teacher-support mechanisms, student assessment tools and teacher observation tools.71 The end-line impact evaluation showed remarkable improvements in pupils’ literacy and numeracy abilities, especially for pupils starting at the lowest levels. Learning outcomes in English and Kiswahili improved significantly; smaller improvements were seen in mathematics. Overall, girls performed at the same level as boys – if not better – especially in literacy.

Strategies found to be highly beneficial for the implementation of PRIMR were:
• consistent TAC tutors' visits to schools to support teachers (including its facilitation through reimbursement);
• regular professional development through brief trainings, with follow-up and refresher meetings;
• changing mindsets from traditional teaching to more active, student-focused approaches;
• accurate distribution of classroom materials to schools based on school enrolment data;
• planning and a sophisticated distribution network;
• limiting the number of schools that a TAC tutor is responsible for;
• provision of books at a 1:1 ratio;
• accommodating more literacy and numeracy instructional time during the week;
• keeping the transfer of teachers trained in PRIMR to a minimum (RTI, 2014).

The main purpose of the Malawi Teacher Professional Development Support (MTPDS) project was to enhance the instructional practices of teachers, especially for early-grade reading. The project focused mainly on supporting the lower primary sub-sector, with an emphasis on teacher skill development, classroom support, and materials development. The evaluation report showed that students who participated in the intervention achieved noticeable gains in performance, surpassing students in the control schools, with significant differences seen on all sub-tests. Two key strategies were identified as having the most significant impact:
• intensive coaching for teachers provided by the project's primary education advisors;
• continuity in support over time (Randolph, Nkhoma and Backman, 2013).

3.2.2 School improvement programmes

JET's School Improvement Programme aims at improving the efficiency of the educational system through a systemic approach to enhancing the functioning and educational performance of schools. 'The key assumption underlying the model is that educational outcomes will improve if teachers are effective and the teaching and learning environments are supported by effective school organisation, community involvement, and district support and monitoring' (JET Education Services, n.d., p. 9).

The school improvement model entails seven elements:
1) Stakeholder mobilisation in the community to support the improvement programme;
2) Planning and organisation to improve school management and the functioning of schools as organisations, including curriculum management, strategic planning and financial management;
3) Teacher performance, including awareness of teaching goals, focus on learning outcomes, access to efficient curriculum delivery systems and resources and provision of curriculum planning and delivery materials, school support visits and cluster-level activities;

71 Ben Piper and the PRIMR team, personal communication, 31 March 2015.


4) Parent involvement through a parent mobilisation programme, which includes setting up home study groups monitored by parents, and developing a practical guide on how parents should support their children’s learning;

5) District support provided at two levels, the district office and the circuit involved in the project, to provide additional capacity for planning and programming of school support and monitoring activities, and for coordination with district-level activities;

6) Teacher competence (subject knowledge and teaching skills): monitoring, planning and facilitating of teachers’ professional development;

7) Research, monitoring and evaluation, including ongoing monitoring conducted by the project schools and district officials.

The impact evaluation shows significant improvement in learning outcomes. The mathematics and literacy performance of learners at project schools improved by five percentage points compared with those at non-project schools (JET Education Services, n.d.).

3.2.3 Early Childhood Development programmes

The main interventions observed in early childhood development programmes are teacher (and parent) training on the Early Childhood Development (ECD) approach and provision of teaching and learning materials. A distinctive feature of the Early Literacy and Maths Initiative (ELMI) is that it equips parents and caregivers with the tools to support their children in developing ELM skills through playful activities. This 'ELM at Home' approach aims at extending opportunities to develop ELM skills at home, especially for those children with no access to ECD centres (Save the Children, 2014).

Children who benefited from both of these ECD programmes show significant improvement in their cognitive development. The ECD programme in Mozambique reports that preschool interventions in rural communities improved a number of important dimensions of child development. These included cognitive, fine motor and socio-emotional development, which contribute to higher levels of school readiness and significantly increased primary school enrolment at the appropriate age. The report concludes that low-cost, community-based preschool interventions, such as those studied in the Early Literacy Project, show potential for positively affecting early childhood development in rural African contexts (Martinez, Naudeau and Pereira, 2012).

For ELMI, the preliminary findings from the mid-line evaluation demonstrate the effectiveness of the ELMI programme on children’s learning gains, regardless of whether the programme is implemented at home by parents or by teachers at an ECD centre. In investigations of the drivers of children’s learning gains, typical background characteristics, such as maternal education level and socio-economic status, were found to play a role. More importantly, the time spent in play activities at home at mid-line showed the most consistent relationship with skill growth. The mid-line report concludes that the strong relationship between playful activities between parents and children and learning gains highlights the benefits of children engaging in developmentally appropriate play at early ages (Save the Children, 2014).


4. A macro theory of change

Based on the evidence collected for this study, we developed a macro theory of change, which is aimed at monitoring and improving the literacy and numeracy performance of primary students in the region. The theory combines the main findings of the stock-taking and comparative analysis of the assessments we reviewed, the main messages derived from the data analysis and related literature on the characteristics of children associated with learning outcomes and trends in performance over time, and the experiences of effective country-level practices in the ESA region.

The theory identifies an evidence-based monitoring and intervention cycle, in which the interdependent sequence of assessment, analysis and action – known as the 'three A's' – forms the basis of long-term and sustainable change in student performance (see Figure 7):
• Assessments, i.e., the systematic, strategic and regular collection of data on educational outcomes and factors related to these outcomes. Assessments form the basis of the evidence-based monitoring and intervention cycle.

• Analysis and interpretation of the findings on student performance, contexts, their relations and trends over time at the education policy level. This is to identify the factors to be addressed at the different levels of the education system in order to improve student performance.

• Action, the development of targeted educational interventions to improve progress for all learners, based on the factors and levels identified in the education policy analysis.

The three As are embedded in the conceptual framework of input, process and outcome factors at the different levels of an education system, which forms an integral component of the evidence-based monitoring and intervention cycle (see Figure 7).72

For assessments, the conceptual framework includes the classification and definition of the main factors to be considered, and offers guidance on the levels at which data needs to be collected.

For analysis, the framework allows for the identification of factors to be addressed at the different levels of the education system in order to propel change, as well as their stability and suitability for change.

For action, the framework helps identify the programmes required and the level at which they should be implemented. For example, it helps policymakers decide which input, process and outcome factors need to be addressed in policy decisions at the system and school level, and which ones can best be shaped by school development activities at the school and classroom level.

Through the systematic, strategic and regular collection of assessment data, progress in student performance is monitored, providing findings to be analysed and interpreted at the policy level, which can then be transformed into action to further improve student performance.

72 The two-dimensional taxonomy is outlined in the conceptual framework in the introduction to this report.


Figure 7. A macro theory of change. An evidence-based monitoring and intervention cycle as premise for change: assessment, analysis, action

[Figure 7 depicts the evidence-based monitoring and intervention cycle as three linked components – Assessment (purpose: system-level monitoring; target population: early and multiple grades, inclusion of out-of-school children; domains: literacy and numeracy, and contexts; current state and progress: performance and contexts; dissemination strategy: findings and products, including datasets), Analysis (policy analysis and interpretation for strategic decision-making and policy development towards improving student performance: student performance levels, associations with context factors at the different levels, and trends over time) and Action (targeted interventions and strategies, integrated into a holistic programme design involving a wide range of stakeholders, with impact evaluation including measurement of performance) – set against the input, process and outcome factors at the school, classroom and student levels.]

Key outcomes of the study informing the three A's
In our study, we found the following elements of the three A's important for the effectiveness of the monitoring and intervention cycle.

Assessment

Purpose
With the majority of the assessments conducted in the ESA region aimed at system-level monitoring (66 per cent), it is important that regular assessments be undertaken to monitor student performance levels and related context factors over time.


Target population
One of the initial policy decisions must be to define the target population(s) to be assessed. Grades 2 and 3 are the most frequently targeted in the ESA region. Nearly half of the assessments test multiple grades. Apart from Uwezo, which is a household-based assessment, all other assessments are school-based. The school setting excludes a significant proportion of children of primary school age who are out of school. Considerations on how to include out-of-school children in the assessment are necessary in order to gain a full picture of children's performance levels in a country or region.

Domains and contexts
Among basic competencies, literacy and numeracy are the most widely assessed. Two-thirds of the assessments in the ESA region assess both. Frameworks that clearly define and operationalize what educational themes are to be assessed are invaluable. They can help guide test development, data analysis and interpretation, and support a common understanding of the content and scope of the assessment among all stakeholders. In addition to performance data, it is essential to collect context data in order to explore the relationships between learning outcomes and background factors, and to identify factors relevant to change. The conceptual framework, which sets out relevant input, process and outcome factors at the different levels of the education system, can help ensure that the instruments are well targeted in terms of scope and information source, e.g., students, teachers, principals, parents, EMIS.

Current state and progress: Performance and contexts
In order to draw conclusions about performance in literacy and numeracy and to identify which factors to address at which levels to initiate change, it is essential that competency levels and benchmarks be defined. In addition, to monitor progress over time, the relationships between context factors and performance must be understood.

In more than half of the assessments (55 per cent), results are presented with reference to competency levels or benchmarks. The description of the skills and knowledge required for the different competency levels provides a concrete understanding of what students can do. Based on this information, measures can be established to address specific learner needs at each level, for example, developing instructional strategies or identifying areas for professional training.

In order to accurately monitor progress over time, learning outcomes can only be compared between different implementations of the same assessment that have the same key features, such as assessment framework, design, target population and sampling design, and conditions of administration across multiple cycles. The contextual background differences of students across years in a study should also be taken into account when interpreting the results.

The same applies for comparisons across countries or regions. At present, the majority of the assessments in the ESA region are conducted at the national level, 29 per cent at the regional and 4 per cent at the international level.73 An innovative approach to compare student performance across different assessments and contexts requires the development of common learning metrics for literacy and numeracy based on international and regional benchmarks for national achievement results and curricular expectations.

73 We view the purpose of the EGRA/EGMA implementations from the stock-taking as either system diagnostics or programme evaluation.


Item response theory (IRT) is an essential technique for scaling cognitive data in order to establish competency levels and effectively monitor performance at the system level across contexts and over time. At present, IRT is not often used in the region. However, two country case studies included in our study where IRT was used (ZELA in Zimbabwe and LARS in Rwanda) are good examples of innovative capacity-building programmes with a focus on data analysis techniques.
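As an illustration of the kind of model that IRT scaling rests on (a sketch for orientation only, not code from any of the assessment programmes discussed), the simplest IRT model, the one-parameter (Rasch) model, expresses the probability of a correct response as a function of the difference between a student's ability and an item's difficulty:

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Rasch (one-parameter IRT) model: the probability that a student
    with ability `theta` answers an item of difficulty `b` correctly.
    Both parameters sit on the same logit scale, which is what allows
    students and items to be placed on a common metric."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student whose ability equals the item's difficulty has a 50% chance:
print(rasch_probability(0.0, 0.0))  # 0.5
```

Placing students and items on a common scale in this way is what makes it possible to define competency levels once and then compare performance across different test forms, contexts and assessment cycles.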

Dissemination
To engage stakeholders and instigate change, a clearly articulated dissemination strategy for the assessment findings is essential. Dissemination strategies were applied in the assessments that we reviewed: the majority (71 per cent) made their results reports publicly available. Approximately half also released results summaries, press communiqués and policy briefs to the public. Some also made fully documented datasets public, which enables further independent analyses.

Analysis
The analysis and interpretation of assessment findings on student performance, context factors and trends over time can inform strategic decision-making and policy development. For example, the analysis we undertook of learning outcomes in the ESA region showed that the proportion of students who experienced LLOs ranged from 18 per cent to 40 per cent for numeracy and 18 per cent to 50 per cent for reading literacy across datasets, countries and year levels. The message we take from these results is that individual student characteristics, home background and resourcing (at both the student and school level) are important factors that affect the ability of students to achieve basic levels of competency in literacy and numeracy.

The results suggest that students would benefit from increased resourcing at the school level. At home, activities aimed at increasing student engagement with their studies would also have a profound impact on learning outcomes. The findings also suggest that encouraging parents to involve their children in literacy and numeracy activities before they start school, giving them more school-related responsibilities, and providing them with external tuition (if this were made more readily available for those in need) would all positively impact their children's learning development.

Data that show improved learning outcomes in literacy and numeracy over time are indicative of systems that have helped improve overall student capabilities. Examining the education policies, financing and socio-political environments that may have contributed to these improvements would be helpful, as would studying the learning environments in countries where negative trends in outcomes have been observed.

Action
For our study, we examined 10 country-level programmes that have proven effective at improving literacy and numeracy learning outcomes of disadvantaged primary students.

Overall, common features included:
• Interventions and strategies targeted to a particular group of disadvantaged children, including children with low literacy achievement, children from disadvantaged socio-economic backgrounds, marginalised children in slums and non-formal settlements, and children in economically disadvantaged rural areas.


• A holistic programme design that addresses as many levels of the education system and stakeholders as possible. Such an approach typically involves teachers' professional development, provision of teaching/learning materials, community mobilisation and capacity-building at the system level. This is a multi-level approach to programme implementation, which provides interventions at the system, school and community levels.

• Impact evaluation, including the measurement of progress in learning outcomes to provide evidence of success. Programmes with randomised control trials (RCT) or quasi-experimental designs require baseline, mid-line and end-line studies to ensure a built-in monitoring process from the beginning of the study.

Within programmes that share a particular purpose, we identified a number of key strategies that contributed to the success of country-level practices in the ESA region.

Effective early-grade literacy/numeracy programmes included teacher training on reading/mathematics instruction, aligned with the provision of teaching and learning materials and the production of reading materials in the local language, more active, student-focused approaches, the use of assessments, well targeted instructional interventions (e.g., students having a reading buddy to support their learning to read), increased instructional time, and community support for children’s reading.

Additionally, programmes that aimed at whole-school improvement were shown to have a significant impact on learning outcomes. Effective strategies included strong school organisation, community involvement, district support and monitoring, and supportive learning environments.

Effective Early Childhood Development programmes comprised teacher (and parent) training based on a particular model (e.g., encouraging playful activities involving early literacy and numeracy skills) and the provision of teaching materials and tools, which also helped support early literacy and maths skill development.

These findings underline the importance of exposing children to a rich learning environment in their early years. Combined with an effective organisation that supports teaching and learning at all levels and equips teachers (and parents) with instructional approaches and educational material, these factors establish a sound basis for literacy and numeracy skills development.

Conclusions
Synthesising the main findings from this study, we developed a macro theory of change, anchored in the 'three A's' approach (assessment, analysis and action) and aimed at initiating long-term and sustainable change in student performance. While substantial research has been undertaken to identify the factors that contribute to student school attendance, more research is required to understand how student learning can be improved, particularly in developing countries. We need to deepen our knowledge of where students are in their learning and how performance progresses over time so that effective targeted interventions can be developed. In order to do this, assessment programmes must be undertaken to provide quality comparable data across populations, between grades and over time. Finally, the results of the assessments must be integrated into education reform agendas, so that human and financial resources to address children's needs are efficiently deployed.


References

ACER. (2014a). The Early Grade Reading Assessment: Assessing children's acquisition of basic literacy skills in developing countries. Assessment GEMS Series No. 2. ACER Press, Melbourne. <http://www.acer.edu.au/files/AssessGEMs_EGRA.pdf>

ACER. (2014b). Inception report on specific methodology: Consultancy service for improving quality education and children’s learning outcomes and effective practices in the Eastern and Southern Africa region. ACER Press, Melbourne.

ACER. (2015). The Southern and Eastern Africa Consortium for Monitoring Educational Quality. Assessment GEMS Series No. 8. ACER Press, Melbourne.

ACER. (n.d.). Learning assessments at a glance. ACER Press, Melbourne. Viewed 30 March 2015, <http://www.acer.edu.au/gem-la>.

ACER, and ZIMSEC. (2013). Zimbabwe Early Learning Assessment (ETF programme) Base-line Study.

ACER, and ZIMSEC. (2015). Evaluation of the Education Development Fund Programme: Zimbabwe Early Learning Assessment (ZELA). 2014 Monitoring Report.

American Institutes for Research (AIR). (2012). Ethiopia Early Grade Reading Assessment – Data Analytic Report: Language and Early Learning. AIR.

American Institutes for Research (AIR). (n.d.). Namibian National Standardized Achievement Test (NSAT). Retrieved 17 March, 2015, from http://www.air.org/project/namibian-national-standardized-achievement-test-nsat

Beattie, K. (2014). Early Grade Reading Assessment results Somalia Programme: Concern Worldwide.

Beattie, K., and Grogan, R. (2013). Results of the Early Grade Reading Assessment conducted in five schools in Mogadishu. Concern Worldwide.

Biggs, J. (1999). Teaching for Quality Learning at University. Philadelphia, PA: Society for Research into Higher Education and Open University Press.

Biggs, J., and Moore, P. (1993). The Process of Learning (3rd edn). Australia: Prentice Hall.

Brombacher, A., Nordstrum, L., Davidson, M., Batchelder, K., Cummiskey, C., & King, S. (2014). National Baseline Assessment for the 3Rs (Reading, Writing, and Arithmetic) Using EGRA, EGMA, and SSME in Tanzania: Study Report.

Cao, Y., Dowd, A. J., Mohammed, O., Hassen, S., Hordofa, T., Diyana, F., & Ochoa, C. (2011). Literacy Boost, Dendi Ethiopia: Three month report: Save the Children.

Collins, P., Galbert, P. D., Hartwell, A., Kochetkova, E., Mulcahy-Dunn, A., Nimbalkar, A., and Ralaingita, W. (2012). Pupil Performance, Pedagogic Practice, and School Management: An SSME Pilot in Zambia: RTI, North Carolina.

CONFEMEN. (2010a). Rapport PASEC–Burundi 2010: Enseignement primaire: Quels défis pour une éducation de qualité en 2015? CONFEMEN, Dakar, Senegal.


CONFEMEN. (2010b). Rapport PASEC–Union des Comores: Diagnostic et préconisations pour une scolarisation universelle de qualité. CONFEMEN, Dakar, Senegal.

CONFEMEN. (2010c). RAPPORT PASEC Union des Comores 2010. Diagnostic et préconisations pour une scolarisation universelle de qualité. PASEC/CONFEMEN, Comoros.

CONFEMEN. (n.d.). Programme d'Analyse des Systèmes Éducatifs de la CONFEMEN (PASEC). Viewed 16 March 2015, <http://www.confemen.org/le-pasec/>

Constitution of Zimbabwe Amendment (No. 20), 2013. (2013).

Creemers, B., and Kyriakides, L. (2008). The Dynamics of Educational Effectiveness: A Contribution to Policy, Practice, and Theory in Contemporary Schools. Routledge, London.

Crouch, L., Korda, M., & Mumo, D. (2009). Improvements in Reading Skills in Kenya: An Experiment in the Malindi District: RTI.

Department of Basic Education Republic of South Africa. (2011). Report on the Annual National Assessments of 2011.

Department of Basic Education Republic of South Africa. (2012). Report on the Annual National Assessment of 2012: Grades 1-6 & 9.

Department of Basic Education Republic of South Africa. (2013). Report on the Annual National Assessment of 2013: Grades 1-6 & 9.

Department of Basic Education Republic of South Africa. (2014). Report on the Annual National Assessment of 2014: Grades 1–6 and 9. Department of Basic Education, Pretoria.

DeStefano, J., Ralaingita, W., Costello, M., Sax, A., & Frank, A. (2012). Task Order 7: Early Grade Reading and Mathematics in Rwanda: RTI.

Dowd, A. J., & Fonseca, J. (2012). Literacy Boost Results Mozambique: Save the Children.

Dowd, A. J., and Mabeti, F. (2011). Literacy Boost Results Malawi: Year 2 report. Save the Children, Washington DC.

Dowd, A. J., Wiener, K., & Mabeti, F. (2010). Literacy Boost Results Malawi: Year 1 report: Save the Children.

Examinations Council of Lesotho (ECoL). (n.d.). Examinations Council of Lesotho: Research and Publications. Retrieved 17 March, 2015, from http://www.examscouncil.org.ls/research.aspx

Examinations Council of Zambia. (2015). Zambia - National Assessment Survey.

fhi 360 Education Policy and Data Centre. (2015). EPDC Policy Brief: Mapping national assessments. Education Policy and Data Centre, Washington DC.

fhi 360 Education Policy and Data Centre. (n.d.-a). Data. Retrieved 2 April, 2015, from http://www.epdc.org/data

fhi 360 Education Policy and Data Centre. (n.d.-b). National Education Profiles. Viewed 2 April 2015, <http://www.epdc.org/tags/national-education-profiles>.

Friedlander, E., Candiru, S., & Dowd, A. J. (2010). Literacy Boost Uganda: Baseline Report 2010: Save the Children.


Friedlander, E., Hordofa, T., Diyana, F., Hassen, S., Mohammed, O., and Dowd, A. J. (2012). Literacy Boost, Dendi Ethiopia: Endline II. Save the Children, Washington DC.

Fullan, M. (2006). Change theory: A force for school improvement. Seminar Series Paper no. 157, November. CSE, Melbourne.

Garrouste, C. (2011). ‘Explaining learning gaps in Namibia: The role of language proficiency.’ International Journal of Educational Development, 31(3), 223–233. doi: 10.1016/j.ijedudev.2010.06.016

Glewwe, P., Kremer, M., and Moulin, S. (2009). ‘Many Children Left Behind? Textbooks and Test Scores in Kenya.’ American Economic Journal: Applied Economics, 1(1), 112–135.

Global Partnership for Education. (2012). Results for Learning Report 2012: Fostering Evidence-Based Dialogue to Monitor Access and Quality in Education. GPE, Washington DC.

Gove, A., and Wetterberg, A. (2011). The Early Grade Reading Assessment: Applications and Interventions to Improve Basic Literacy. RTI International, North Carolina.

Government of Zimbabwe. (2009). Short Term Emergency Recovery Programme (STERP): Getting Zimbabwe moving again. Government of Zimbabwe, Harare.

Guajardo, J., Night, C. S., Heijnen, E., Komakech, C., P’kech, E. A., Odwong, O. D., . . . Dowd, A. J. (2010). Literacy Boost Uganda: Midline Report 2012: Save the Children.

Hassen, S., & Friedlander, E. (2012). Literacy Boost Results Ethiopia: Save the Children.

Hivos/Twaweza. (2014). Are Our Children Learning? Literacy and Numeracy across East Africa 2013. Uwezo East Africa at Twaweza, Kenya.

Howie, S., Staden, S. v., Tshele, M., Dowse, C., and Zimmerman, L. (2012). PIRLS 2011. South African Children’s Reading Literacy Achievement Report. Centre for Evaluation and Assessment, University of Pretoria, Pretoria.

Human Sciences Research Council South Africa. (n.d.). Testing, testing First national assessment of Grade 9 pupils shows much work lies ahead. Retrieved 20 March, 2015, from http://www.hsrc.ac.za/en/review/september-2011/testing-testing#sthash.q42e9GXL.dpuf

Hungi, N. (2011). Characteristics of school heads and their schools. UNESCO, Paris.

Hungi, N. (2011a). Accounting for variations in the quality of primary school education. Retrieved from: http://www.sacmeq.org/sites/default/files/sacmeq/reports/sacmeq-iii/working-papers/07_multivariate_final.pdf

Hungi, N. (2011m). Characteristics of grade 6 pupils, their homes and learning environments. SACMEQ Working Paper. SACMEQ, Paris.

Hungi, N., Makuwa, D., Ross, K., Saito, M., Dolata, S., van Cappelle, F., . . . Vellien, J. (2010). SACMEQ III project results: Pupil achievement levels in reading and mathematics. SACMEQ, Paris.

Hungi, N., Makuwa, D., Ross, K., Saito, M., Dolata, S., van Cappelle, F., . . . Vellien, J. (2011). SACMEQ III project results: Levels and trends in school resources among SACMEQ school systems. SACMEQ Working Document. Paris: SACMEQ.


Hungi, N., Ngware, M., and Abuya, B. (2014). ‘Examining the impact of age on literacy achievement among grade 6 primary school pupils in Kenya.’ International Journal of Educational Development, 39, 247–259.

IEA. (2013a). PIRLS 2011. IEA, Amsterdam.

IEA. (2013b). PIRLS 2011 Contextual Questionnaires. Viewed 17 April 2015, <http://timssandpirls.bc.edu/pirls2011/international-contextual-q.html>.

IEA. (2013c). TIMSS 2011. Viewed 7 August 2015, <http://timssandpirls.bc.edu/timss2011/schedule.html>

IEA. (2013d). TIMSS 2011 Contextual Questionnaires. Viewed 17 April 2015, <http://timssandpirls.bc.edu/timss2011/international-contextual-q.html>

IEA. (n.d.). IEA Study Data Repository. Viewed 17 March 2015, <http://rms.iea-dpc.org/>.

JET Education Services. (n.d.). Sustainable school improvement. JET, Zambia.

Joncas, M. (2011). Methods and procedures: PIRLS 2011 target populations. TIMSS and PIRLS International Study Center, Lynch School of Education, Boston College, MA.

Kaplan, R. M., and Saccuzzo, D. P. (1997). Psychological testing: Principles, applications and issues. Brooks Cole Publishing Company, Pacific Grove, California.

Kyriakides, L., and Creemers, B. (2006). ‘Using the dynamic model of educational effectiveness to introduce a policy promoting the provision of equal opportunities to students of different social groups.’ In D. M. McInerney, M. Dowson, and S. V. Etten (eds), Effective Schools, vol. 6, pp. 17–41. Information Age Publishing, North Carolina.

Lesotho National Assessment of Educational Progress (LNAEP) Survey Report, 2010. (n.d.).

Lucas, A. M., McEwan, P. J., Ngware, M., and Oketch, M. (2014). Improving early-grade literacy in East Africa: Experimental evidence from Kenya and Uganda. University of Delaware, Delaware.

Majgaard, K., and Mingat, A. (2012). Education in sub-Saharan Africa: A comparative analysis. World Bank, Washington DC.

Makuwa, D. (2005). The SACMEQ II Project in Namibia: A Study of the Conditions of Schooling and the Quality of Education. Ministry of Basic Education, Sport and Culture, Namibia.

Makuwa, D. (2011). Characteristics of grade 6 teachers. SACMEQ Working Paper. SACMEQ, Paris.

Martin, M. O., and Mullis, I. V. S., (eds). (2012). Methods and procedures in TIMSS and PIRLS 2011. TIMSS and PIRLS International Study Center, Boston College, MA.

Martin, M. O., Mullis, I. V. S., Foy, P., and Arora, A. (2012). TIMSS 2011 International Results in Mathematics. TIMSS and PIRLS International Study Center, Boston College, MA.

Martin, M. O., Mullis, I. V. S., Foy, P., and Stanco, G. M. (2012). TIMSS 2011 International Results in Science. TIMSS and PIRLS International Study Center, Boston College, MA.

Martinez, S., Naudeau, S., and Pereira, V. (2012). The Promise of Preschool in Africa: A Randomized Impact Evaluation of Early Childhood Development in Rural Mozambique. 3ie, Washington DC.


Mason, M. (2014). Complexity Theory in Education Governance: Initiating and sustaining systemic change. UNESCO Bureau of Education, Paris.

Miksic, E., and Harvey, S. (2012). Malawi National Early Grade Reading Assessment Survey: Midterm Assessment. RTI, North Carolina.

Ministère de l’Éducation Nationale et de la Recherche, and PASEC-CONFEMEN. (2010). Rapport PASEC - Union des Comores: Diagnostique et préconisations pour une scolarisation universelle de qualité [PASEC report - Union of the Comoros: Diagnosis and recommendations for quality universal schooling]. CONFEMEN, Dakar.

Ministère de l’Enseignement de Base et Secondaire, de l’Enseignement des Métiers, de la Formation Professionnelle et de l’Alphabétisation, and PASEC-CONFEMEN. (2010). Rapport PASEC - Burundi 2010: Enseignement primaire: Quels défis pour une éducation de qualité en 2015? [PASEC report - Burundi 2010: Primary education: What challenges for quality education in 2015?]. CONFEMEN, Dakar.

Ministry of Education, Science, and Technology of Malawi. (2011). The SACMEQ III project in Malawi. Malawi Ministry of Education, Science and Technology, Malawi.

Ministry of Education, Science and Technology of Malawi. (2014). Monitoring Learning Achievement in Primary Education Malawi Report. Malawi Ministry of Education, Science and Technology, Malawi.

Ministry of Education of Angola, World Bank Africa Region Human Development Department, and Russia Education Aid for Development Programme. (2011). Early Grade Literacy in Angola.

Ministry of Education of Ethiopia (FDRE). (2008). Ethiopian Third National Learning Assessment of Eighth Grade Students. Ministry of Education of Ethiopia, Addis Ababa.

Ministry of Education of Ethiopia (FDRE). (2013). Ethiopian 4th National Learning Assessment of Grades 4 and 8 Pupils: Data Analytic Report. Addis Ababa, Ethiopia.

MOESAC. (2006). Education Amendment Act, 2005 (Chapter 25:04), H.B.6A, 2005. Government of Zimbabwe, Harare.

MOESAC. (2009). Education at a Glance 2009. MOESAC, Harare.

MOESAC. (2011). Education Medium Term Plan 2011–2015. MOESAC, Harare.

Moloi, M. (n.d.). Mathematics achievement in South Africa: A comparison of the official curriculum with pupil performance in the SACMEQ II Project.

Moloi, M. Q., and Chetty, M. (2010). The SACMEQ III project in South Africa: A study of the conditions of schooling and the quality of education: South Africa Country Report. Department of Basic Education, Pretoria.

Monyaku, B., and Mmereki, O. A. (2011). The SACMEQ III project in Botswana: A study of the conditions of schooling and the quality of education. Botswana Ministry of Education and Skills Development, Division of Planning, Statistics and Research.

Monyaku, B. (2012). The SACMEQ III project in Botswana: A study of the conditions of schooling and the quality of education. Botswana Ministry of Education and Skills Development, Botswana.


Mullis, I., Martin, M., Foy, P., and Arora, A. (Eds.). (2012). TIMSS 2011 International Results in Mathematics. TIMSS and PIRLS International Study Center, Boston College, and International Association for the Evaluation of Educational Achievement (IEA). Chestnut Hill, MA.

Mullis, I., Martin, M., Foy, P., and Drucker, K. (2012). PIRLS 2011 International Results in Reading. TIMSS and PIRLS International Study Center, Boston College, and International Association for the Evaluation of Educational Achievement (IEA). Chestnut Hill, MA.

Mullis, I. V. S., and Martin, M. O. (Eds.). (2013). PIRLS 2016 Assessment Framework. TIMSS and PIRLS International Study Center, Boston College, and International Association for the Evaluation of Educational Achievement (IEA). Chestnut Hill, MA.

Mullis, I. V. S., Martin, M. O., and Foy, P. (with Olson, J. F., Preuschoff, C., Erberber, E., and Galia, J.). (2008a). TIMSS 2007 International Science Report: Findings from IEA’s Trends in International Mathematics and Science Study at the Fourth and Eighth Grades. TIMSS and PIRLS International Study Center, Boston College, MA.

Mullis, I. V. S., Martin, M. O., and Foy, P. (with Olson, J. F., Preuschoff, C., Erberber, E., Arora, A., and Galia, J.). (2008b). TIMSS 2007 International Mathematics Report: Findings from IEA’s Trends in International Mathematics and Science Study at the Fourth and Eighth Grades. TIMSS and PIRLS International Study Center, Boston College, MA.

Mullis, I. V. S., Martin, M. O., Kennedy, A. M., and Foy, P. (2007). IEA’s Progress in International Reading Literacy Study in Primary Schools in 40 Countries. TIMSS and PIRLS International Study Center, Boston College, MA.

Mullis, I. V. S., Martin, M. O., Kennedy, A., Trong, K., and Sainsbury, M. (2009). PIRLS 2011 Assessment Framework. TIMSS and PIRLS International Study Center and International Association for the Evaluation of Educational Achievement. IEA, Amsterdam.

Mullis, I. V. S., Martin, M. O., Ruddock, G. J., O’Sullivan, C. Y., and Preuschoff, C. (2009). TIMSS 2011 Assessment Frameworks. TIMSS and PIRLS International Study Center and International Association for the Evaluation of Educational Achievement. IEA, Amsterdam.

Mungoi, D., Mandlante, N., Nhatuve, I., Mahangue, D., Fonseca, J., and Dowd, A. J. (2010). Endline Report of Early Literacy among Pre-school and Primary School Children in Gaza Province. Save the Children, Mozambique.

Murnane, R. J., and Ganimian, A. (2014). Improving educational outcomes in developing countries: Lessons from rigorous evaluations. NBER Working Paper 20284. National Bureau of Economic Research, Cambridge, MA.

National Educational Assessments and Examinations Association of Ethiopia. (n.d.). National Educational Assessments and Examinations Association. Retrieved 17 March, 2015, from http://www.nae.gov.et/HOMEE.aspx

Nhongo, K. (2014, 19 March). Grade 5 NSAT results depressing. Windhoek Observer. Retrieved from http://observer24.com.na/national/3131-grade-5-nsat-results-depressing

Nyanguru, A., and Peil, M. (1991). ‘Zimbabwe since Independence: A people’s assessment.’ African Affairs, 90(361), 607–620.


OECD. (2013). PISA 2012 Assessment and Analytical Framework: Mathematics, Reading, Science, Problem Solving and Financial Literacy. OECD Publishing.

Piper, B. (2010). Uganda Early Grade Reading Assessment Findings Report: Literacy Acquisition and Mother Tongue. RTI, North Carolina.

Piper, B., and Mugenda, A. (2013). The Primary Math and Reading (PRIMR) Initiative: Midterm Impact Evaluation. RTI, North Carolina.

Pouezevara, S., Costello, M., and Banda, O. (2013). Malawi National Early Grade Reading Assessment Survey: Final Assessment, November 2012. RTI, North Carolina.

Purves, A. C. (1987). ‘The Evolution of the IEA: A Memoir’. Comparative Education Review, 31(1), 10–28.

Randolph, E., Nkhoma, M., and Backman, S. (2013). ABE/LINK Malawi Teacher Professional Development Support: Project M&E report. USAID.

Raupp, M., Newman, B., and Revés, L. (2013). Impact evaluation for the USAID/Aprender a Ler project in Mozambique: Baseline report. International Business & Technical Consultants, Inc. (IBTCI).

Report on Monitoring Learning Achievements (MLA) in Grade 4 in Puntland and Somaliland. (2012).

Results for Development. (n.d.). Center for Education Innovations. Viewed 2 April 2015, <http://www.educationinnovations.org/>.

Ross, K. (2009). The Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ). Paper presented at the Global Partnership for Education’s ‘Biennial Partnership Meeting’ Copenhagen, April 20–21.

RTI. (2004). EdData II: Education data for decision making: School management. Viewed 17 April 2015, <https://www.eddataglobal.org/management/index.cfm>.

RTI. (2010). Ethiopia Early Grade Reading Assessment–Data Analytic Report: Language and Early Learning. RTI International, North Carolina.

RTI. (2011). Malawi National Early Grade Reading Assessment Survey: Baseline Assessment. RTI, North Carolina.

RTI. (2012). The Primary Math and Reading (PRIMR) Initiative: Baseline Report. RTI, North Carolina.

RTI. (2014a). EGRA Tracker.

RTI. (2014). USAID/Kenya Primary Math and Reading (PRIMR) Initiative: Final Report. RTI, North Carolina.

RTI. (2015a). EGRA Tracker [EGRA_tracker_for_web_16June2015.xlsx].

RTI. (2015c, January 9). [Personal communication: request for data access].

RTI. (n.d.). EdData: Education data for decision making. Retrieved 17 March, 2015, from https://www.eddataglobal.org/index.cfm

Rwanda Education Board. (2012). Learning Achievement in Rwandan Schools (LARS). Rwanda Education Board (REB), Kigali, Rwanda.


Rwanda Education Ministry. (2014). 2013 Education Statistical Yearbook. Ministry of Education, Kigali.

SACMEQ. (2013). SACMEQ. Viewed 31 March 2015, <http://www.sacmeq.org/>.

Sakala, C. T., and Chilala, M. M. (2007). The Role of the Zambia National Assessment Programme in Evaluating the Attainment of Educational Goals. Paper presented at the 33rd International Association for Educational Assessment Annual Conference, Baku, Azerbaijan. http://www.iaea.info/documents/paper_1162d22700.pdf

Sasman, C. (2011, 19 July). Grade 7 tests show disappointing results. The Namibian. Retrieved from http://www.namibian.com.na/indexx.php?archive_id=82769&page_type=archive_story_detail&page=1580

Saito, M. (2011). Trends in the magnitude and direction of gender differences in learning outcomes. SACMEQ Working Paper No. 4. SACMEQ.

Save the Children. (2014). Early Literacy and Maths Initiative (ELMI): Rwanda Midline Report. Save the Children, Washington DC.

Save the Children. (n.d.). Literacy Boost. Viewed 17 March 2015, <http://resourcecentre.savethechildren.se/>.

The Australian Council for Educational Research and Zimbabwe School Examination Council. (2013a). Evaluation of the Education Development Fund Program - Zimbabwe Early Learning Assessment (ZELA): 2013 Monitoring Report.

The Australian Council for Educational Research and Zimbabwe School Examination Council. (2013c). Zimbabwe Early Learning Assessment (ETF program) Baseline Study.

The Australian Council for Educational Research and Zimbabwe School Examination Council. (2015). Evaluation of the Education Development Fund Program - Zimbabwe Early Learning Assessment (ZELA): 2014 Monitoring Report.

The Kenya National Examinations Council. (n.d.). The Kenya National Examinations Council: NASMLA. Retrieved 17 March, 2015, from http://www.knec.ac.ke/main/index.php?option=com_phocadownload&view=section&id=43:nasmla

The National Assessment Centre. (2010). Monitoring of learner achievement for class 3 in literacy and numeracy in Kenya: Summary of results and recommendations. The Kenya National Examinations Council, Nairobi, Kenya.

The National Assessment Centre. (2010a). Monitoring of learner achievement for class 3 in literacy and numeracy in Kenya. Nairobi, Kenya: The Kenya National Examinations Council.

The National Assessment Centre. (2010c). Monitoring of learner achievement for class 3 in literacy and numeracy in Kenya: Summary of results and recommendations. Nairobi, Kenya: The Kenya National Examinations Council.

TIMSS and PIRLS International Study Center. (n.d.). TIMSS and PIRLS. Viewed 16 March 2015, <http://timssandpirls.bc.edu/>.

Twaweza. (n.d.). Uwezo. Viewed 17 March 2015, <http://www.uwezo.net/>.

Uganda National Examinations Board. (2010). The Achievement of Primary School Pupils in Uganda in Numeracy, Literacy in English and Local Languages. Uganda National Examinations Board, Kampala.


Uganda National Examinations Board. (n.d.). Uganda National Examinations Board - Home. Retrieved 18 March, 2015, from http://www.uneb.ac.ug/index.php?link=Home

UNESCO. (2008). EFA global monitoring report 2008—Education for all by 2015: Will we make it? UNESCO, Paris.

UNESCO. (2010). EFA Global Monitoring Report: Reaching the marginalized: Regional overview: sub-Saharan Africa. UNESCO, Paris.

UNESCO. (2014). Education for All in sub-Saharan Africa Assessment Report 2014. UNESCO Dakar Regional Office, Dakar.

UNESCO. (2015a). EFA Global Monitoring Report 2015. EFA 2000–2015: Achievements and Challenges. UNESCO, Paris.

UNESCO. (2015b). National Assessments [Excel file for Annex of 2015 EFA GMR report—in publication].

UNESCO. (n.d.-a). Education For All Global Monitoring Report. Viewed 2 April 2015, <http://www.unesco.org/new/en/education/themes/leading-the-international-agenda/efareport/>.

UNESCO. (n.d.-b). International Bureau of Education. Viewed 2 April 2015, <http://www.ibe.unesco.org/en.html>.

UNICEF Eritrea. (n.d.). Monitoring Learning Achievement in Eritrea.

UNICEF Mozambique Country Office (2015). [Comments on the Improving Quality Education and Children’s Learning Outcomes and Effective Practices in the Eastern and Southern Africa region: Final Report].

UNICEF. (2008a). News note: Zimbabwe education system in a state of emergency. Viewed 1 April 2015, <http://www.unicef.org/media/media_45950.html>.

UNICEF. (2008b). Progress report for UNICEF’s education in emergencies and post-crisis transition programme. UNICEF.

UNICEF Rwanda Country Office (2015). [Comments on the Improving Quality Education and Children’s Learning Outcomes and Effective Practices in the Eastern and Southern Africa region: Final Report].

UNICEF Somalia. (n.d.). Report on Monitoring Learning Achievements (MLA) in Grade 4 in Puntland and Somaliland.

UNICEF. (2011). The Education Transition Fund II, 2012–2015: Programme Document. UNICEF, Harare.

UNICEF Zambia Country Office (2015). [Comments on the Improving Quality Education and Children’s Learning Outcomes and Effective Practices in the Eastern and Southern Africa region: Final Report].

UNICEF. (2012). Zimbabwe 2012: Millennium Development Goals Progress Report. UNICEF Zimbabwe, Harare.

UNICEF. (2013a). Country Office Portal: Annual report 2013 for Rwanda. ESARO.

UNICEF. (2013b). Country Office Portal: Annual report 2013 for Zimbabwe.

UNICEF. (2014). The Education Development Fund: Stronger systems, better outcomes: Sixth Progress Report. Harare: UNICEF Zimbabwe.


USAID. (2011). USAID Funded Malawi Teacher Professional Development Support (MTPDS) Program - 2010 Early Grade Assessment (EGMA): National Baseline Report 2010.

Uwezo-Kenya. (2013). Are Our Children Learning? Annual Learning Assessment Report 2012. Nairobi, Kenya: Uwezo, WERK.

Uwezo-Kenya. (2013). Survey Booklet 2013. Uwezo.

Uwezo-Tanzania. (2013). Are Our Children Learning? Annual Learning Assessment Report 2012. Dar es Salaam, Tanzania: Uwezo, TEN/MET.

Uwezo-Uganda. (2013). Are Our Children Learning? Annual Learning Assessment Report 2012. Kampala, Uganda: UWEZO Uganda.

Uwezo. (2014). Are Our Children Learning? Literacy and Numeracy across East Africa 2013. Nairobi, Kenya: Uwezo, HIVOS/Twaweza.

Uwezo. (2014). Datasets viewed November 19 2014, <http://www.uwezo.net/>.

Vogel, I. (2012). Review of the use of ‘Theory of Change’ in international development: Review report. DFID, London.

Wasanga, P. M., Ogle, M. A., and Wambua, R. M. (2012). The SACMEQ III project in Kenya: A study of the conditions of schooling and the quality of education. The Kenya National Examinations Council, Nairobi.

Wasanga, P. M., Ogle, M. A., and Wambua, R. M. (2010). The Report on Monitoring Learner Achievement Study for Class 3 in Literacy and Numeracy. The Kenya National Examination Council, Nairobi.

World Bank. (2015). Zimbabwe. <http://data.worldbank.org/country/zimbabwe>.


Appendix I: Methodology

Methodology for Chapter 1: Stock-taking and comparative analysis of existing assessments in the ESA region

For our stock-taking of existing assessments on primary students’ literacy and numeracy learning outcomes in the ESA region, we focused on assessments that provide data for one of the following three purposes:
• system-level diagnostic;
• system-level monitoring;
• programme evaluation.

National examinations were not included. They have a wholly different purpose, and use their own methods of sampling, data analysis and reporting. Furthermore, examinations tend not to be accompanied by the publicly available documentation that learning assessments generate.

Stock-taking framework

The framework used for presenting and analysing the results of our stock-taking was developed in relation to previous work ACER undertook to characterise assessments, as well as to the 2008 EFA Global Monitoring Report and other recent attempts to map the national assessment landscape (see Table 13).74

The detailed results of the stock-taking, with information for all framework categories for each assessment, are presented in the main stock-taking table in Appendix VI.

74 For an example of previous work ACER has done on characterising assessments, see Learning assessments at a glance (ACER, n.d.). For the assessment mapping framework used in the 2008 EFA Global Monitoring Report, see the annex ‘National learning assessments by region and country’ in UNESCO (2008, pp. 208–220). For another example of a recent attempt to map the national assessment landscape, see fhi 360 Education Policy and Data Centre (2015).


Table 13. Stock-taking framework

Each framework element is listed below with how assessments were classified within it.

Country: Name of country of implementation.

Assessment name: Name of assessment.

Organisations/institutions responsible: Name of implementing body.

Purpose: Assessment purpose, distinguishing:
• System-level diagnostic: administered once to get a snapshot of student performance levels at the system level (usually national)
• System-level monitoring: administered repeatedly to monitor student performance levels at the system level (usually national)
• Programme evaluation: administered on a smaller scale to evaluate the impact of a programme that aims to improve student performance, with treatment and control groups, and usually involving baseline, (mid-line) and end-line

Inception: Assessment start date.

Frequency: Frequency of assessment administration:
• cycle length if the assessment is conducted regularly
• years of implementation if the assessment is repeated, but not regularly
• ‘N/A’ if an assessment is administered only once (one-off)

Target population: Grade-based (e.g., Grade 4) or age-based (e.g., 10-year-old students).

Sample: Brief details of achieved sample, including whether it is nationally representative.

Cognitive domains: Cognitive domains covered in the assessment to measure student performance (e.g., literacy and numeracy).

Contextual instruments: Contextual data collection instruments (e.g., student questionnaire, teacher questionnaire, school head questionnaire).

Test administration: Test administration methods, distinguishing between the following:
• school-based or household-based for administration location
• group administration, small group administration or one-on-one administration for administration method
• paper-based, tablet-based or oral for administration mode

Data analysis: Data analysis approaches, particularly:
• if IRT analysis is used to scale data on student performance;
• if competency levels/benchmarks are established;
• how student performance is analysed (e.g. frequency analyses, mean scores);
• if relationships between student performance and contextual factors are explored via analytical methods such as correlation, regression, multilevel modelling;
• if trend analysis is conducted;
• if international comparisons are used (in multi-country assessments).

Reporting and dissemination:
• if results reports are publicly available
• other reporting and dissemination methods

Stock-taking approach

The approach for our stock-taking of the assessments included:
• collating current knowledge within ACER about assessments in ESAR;
• identifying gaps in information about particular assessments, and ESAR countries where little or nothing is known about assessments;
• attempting to fill information gaps by consulting the following data/documentation:
- data/documentation from sources including UNESCO’s International Bureau of Education (UNESCO, n.d.-b); UNESCO’s EFA Global Monitoring Reports (UNESCO, n.d.-a); the Education and Policy Data Centre maintained by fhi 360, particularly their education profiles, databases and the findings of their national assessment mapping activity (fhi 360 Education Policy and Data Centre, 2015, n.d.-a, n.d.-b); the Centre for Education Innovations maintained by Results for Development (Results for Development, n.d.); and the EdData website75 (RTI International, 2004);
- the annual country office reports and education statistics reports provided by UNICEF ESARO and UNICEF COs;
- data/documentation from activities undertaken as part of UNESCO’s Observatory of Learning Outcomes (OLO);
- data/documentation from ministry websites of governments in ESAR;
• attempting to fill information gaps by consulting the following contacts as required:
- existing contacts established by ACER in the course of earlier work (e.g. contacts at Results for Development, RTI, Uwezo, and national assessment bodies in specific countries in ESAR);
- new contacts established through this consultancy (e.g. contacts with people within ministries or donor agencies through UNICEF ESARO and COs).

Consultation with existing and new contacts was conducted via email, using standardised questionnaires that, first, sought an overview of the assessment activities in the country of interest and, second, sought more detailed information about particular assessments relevant to the consultancy.

75 https://www.eddataglobal.org/


Methodology for Chapter 2: Literacy and numeracy in primary education in the ESA region—Students experiencing LLOs and trends over time

Available datasets

Several criteria were used to select the datasets for our analysis. First, we used datasets relating to literacy and numeracy that were as current as possible. Second, we selected datasets that were representative of the population they were assessing, and were based on an appropriate census or household survey, or a sample design with sampling weights used to represent the target population. Third, for the data to be useful for profiling students, the dataset had to include links between achievement data and contextual information about the student.

For the analysis, data from four different assessments implemented in seven ESA countries were available: Uwezo, PASEC, TIMSS and prePIRLS. PrePIRLS was chosen over the traditional PIRLS dataset because it is better targeted to the achievement levels of students in the region’s participating countries. SACMEQ data was not available within our research timeline.76 The datasets used, the countries involved, the year(s) of implementation, the target population and the domains assessed appear below (see Table 14).

Specifics about the different datasets used for this analysis are described in the following sections.

Uwezo

Uwezo is a household and age-based survey that assesses literacy and numeracy outcomes for children in Kenya, Tanzania and Uganda. Achievement data is collected from the children, and contextual information is obtained via an interview with the head of the household, as well as from observations made in the school or home environment (Hivos/Twaweza, 2014). Uwezo data from 2009/2010, 2011 and 2012 were available for the purposes of our research.

The main focus of our data analysis was to examine literacy and numeracy learning outcomes of children in primary education. A procedure for selecting the relevant children was established using criteria for students who attend school, students who have dropped out of school, and children who never attended school.
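The selection procedure described above can be sketched in Python; the dataset, column names, codes and grade range below are invented placeholders, not Uwezo's actual variable names.

```python
import pandas as pd

# Hypothetical household-survey records; real Uwezo variable names and
# codes differ, so these columns are placeholders.
children = pd.DataFrame({
    "age":           [7, 9, 14, 6, 11],
    "school_status": ["enrolled", "enrolled", "dropped_out", "never", "enrolled"],
    "current_grade": [1, 3, None, None, 5],
})

# Keep children currently attending primary school (here, Grades 1-7);
# dropouts and never-enrolled children would be handled as separate groups.
in_primary = children[
    (children["school_status"] == "enrolled")
    & children["current_grade"].between(1, 7)
]
print(len(in_primary))
```

Because `current_grade` is missing for out-of-school children, the `between` filter drops them automatically, which keeps the three group definitions mutually exclusive.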

PASEC

PASEC is a large-scale survey of students’ abilities in mathematics and reading in French that is administered in 13 countries across multiple grades (generally Grades 2 and 5). Students are typically assessed at the beginning and end of each grade in order to measure growth over the course of the year (CONFEMEN, 2010a). PASEC databases for Burundi and Comoros for Grades 2 and 5 were used for our analyses.

76 Efforts were made to gain access to SACMEQ data as well as data from national assessments. In total, 12 ESA countries participated in SACMEQ. The SACMEQ data archive remained offline when we conducted this research and the process required to obtain national data to be included in our analysis exceeded the timeline for this study. We thank UNICEF ESARO and COs for their support in requesting access to data during our research. Findings from SACMEQ and national assessments are referred to in the discussion where reports were available and contained analysis of contextual data.


TIMSS and prePIRLS

TIMSS and PIRLS are IEA studies run on a regular cycle (TIMSS every four years; PIRLS every five years) to monitor mathematics, science and reading literacy skills among children in participating countries from different regions of the world. PrePIRLS was introduced for the PIRLS 2011 cycle as an assessment of reading literacy that is easier for students than the traditional PIRLS format (M. O. Martin, Mullis, Foy, and Arora, 2012; I. Mullis, Martin, Foy, and Drucker, 2012). In this report, data was analysed from TIMSS 2011 Grade 6 (mathematics only) for Botswana. Data was also analysed from prePIRLS 2011 Grade 4 for both Botswana and South Africa.

Table 14. Assessment programmes for which data were available for analysis

Methodology for the characterisation of students experiencing LLOs in literacy and numeracy in the datasetsIn each of the Uwezo, PASEC, TIMSS and prePIRLS databases, the first step in characterising students experiencing LLOs was to define who each of these children were. For each of the defined assessment variables within each database, a dichotomous variable was created that signified whether the student was defined as experiencing LLOs (value of 1) or whether they are not considered to be experiencing LLOs (value of 0) for the assessment variable in question. In each case, the decision of whether or not someone was experiencing LLOs was made based on definitions in each study. Students without a score for the assessment in question were treated as missing data.

The second step was to define a list of contextual variables in each dataset that would be considered of potential interest in characterising the students experiencing LLOs. This list was devised after considering the types of variables previously identified as being associated with student achievement. The variables cover a range of information about the student, including home background, household possessions, parental education and literacy, and the student's academic background and experiences at school.

The third step was to explore descriptive statistics of two overlapping groups for each of the relevant contextual variables. The first group was students in the target population – for example, the percentage of the Botswana prePIRLS population that is female. The second group was the students identified in the first methodological step as belonging to a LLOs group – for example, the proportion of females among the Botswana prePIRLS students identified in the first step as experiencing LLOs in reading.

Assessment | Countries | Year of implementation | Target population | Domains assessed
Uwezo | Kenya, Tanzania, Uganda | 2009/2010, 2011, 2012 | Age-based, household (6–16 years) | Literacy (English, Swahili), Numeracy
PASEC | Burundi, Comoros | 2008/2009 (Burundi), 2008/2009 (Comoros) | Grade 2 students, Grade 5 students | Literacy (French, Kirundi), Numeracy
TIMSS | Botswana | 2011 | Grade 6 students | Mathematics
prePIRLS | Botswana, South Africa | 2011 | Grade 4 students | Literacy (English)


The next step was to use significance testing to determine whether students categorised as experiencing LLOs for the assessment in question (value of 1) differed from the remaining students, who were not considered to be experiencing LLOs (value of 0). For this we used logistic regression, a statistical technique that allows for a dichotomous dependent variable. A significant logistic regression test would imply that students identified as experiencing LLOs have significantly different characteristics from students who were not so identified. All our analyses use weighted data, which enables us to relate our findings back to the relevant target populations of each of the studies.
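As a minimal sketch of this step, the following fits a weighted logistic regression by gradient ascent on invented toy data. It illustrates the technique only; it is not the estimation code used in the study, and the data, group coding and weights are assumptions.

```python
import math

def fit_weighted_logistic(x, y, w, lr=0.1, steps=2000):
    """Fit P(y = 1) = sigmoid(b0 + b1 * x) by maximising the
    weighted log-likelihood with plain gradient ascent."""
    b0 = b1 = 0.0
    total_w = sum(w)
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi, wi in zip(x, y, w):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += wi * (yi - p)       # gradient w.r.t. intercept
            g1 += wi * (yi - p) * xi  # gradient w.r.t. slope
        b0 += lr * g0 / total_w
        b1 += lr * g1 / total_w
    return b0, b1

# Toy data: x = 1 for one student group, 0 for the other;
# y = 1 if the student was flagged as experiencing LLOs.
x = [0, 0, 0, 0, 1, 1, 1, 1]
y = [1, 1, 1, 0, 1, 0, 0, 0]
w = [1.0] * 8  # sampling weights (all equal in this toy example)
b0, b1 = fit_weighted_logistic(x, y, w)
# b1 is negative here: the x = 1 group has lower odds of being flagged.
```

A significant, non-zero slope in such a model is what signals that the flagged group differs on the contextual variable in question.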

Specific methodologies relating to each assessment are listed below.

Profiling methodology: Uwezo
The main focus of our data analysis was to examine literacy and numeracy learning outcomes of children in primary education. Therefore, the total sample of children participating in Uwezo was modified. This section provides details on the procedure we used to select the final sample of children included in our analysis. It also lists the performance and contextual data available for use in the analyses.

Uwezo is a household- and age-based assessment. Because of these characteristics, the assessment reaches children with different school enrolment statuses. Of the children within the formal education system, Uwezo assesses those attending preschool, primary or secondary education.

We selected children for our study based on three criteria. First, of all children enrolled in school, we selected only those in primary grades. This included children attending up to Grade 8 in Kenya, and children attending up to Grade 7 in Tanzania and Uganda.

Second, among children who had dropped out of school, only those who left during primary education were considered. Children whose last year at school was Grade 9 and above in Kenya, and Grade 8 and above in Tanzania and Uganda, were excluded.

Third, among children who had never been enrolled in school, only those within the typical ages for primary education were considered. One year was added to the age at which children are expected to finish primary education in each country. This cut-off meant we only included out-of-school children aged 14 years and below in Kenya and Tanzania, and 13 years and below in Uganda (see Table 15).
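Applied to Kenya (primary education to Grade 8; out-of-school age cut-off of 14), the three selection criteria can be sketched as follows. The function and field names are our own illustration, not Uwezo variable names.

```python
# Sketch of the three Uwezo sample-selection criteria for Kenya:
# primary education runs to Grade 8, and the out-of-school age
# cut-off is 14 years.
def in_analysis_sample(status, grade=None, last_grade=None, age=None):
    if status == "in_school":        # criterion 1: primary grades only
        return grade is not None and grade <= 8
    if status == "dropout":          # criterion 2: left during primary
        return last_grade is not None and last_grade <= 8
    if status == "never_enrolled":   # criterion 3: primary-age children
        return age is not None and age <= 14
    return False

in_analysis_sample("in_school", grade=9)       # False: secondary student
in_analysis_sample("dropout", last_grade=7)    # True
in_analysis_sample("never_enrolled", age=15)   # False: above the cut-off
```

For Tanzania and Uganda the same logic applies with their own grade and age cut-offs.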

The first step in examining the characteristics of children experiencing LLOs is to define the concept of LLOs. The nature of the Uwezo assessment and the variety of enrolment status among children taking the tests complicated the definition.

Uwezo tests are administered to a wide population of students aged 6–16, regardless of enrolment status and, if enrolled in school, the grade level they are attending. Different forms of the same test are given to all children. The test is aligned with the national Grade 2 curriculum in the three countries. Because of this alignment, it is assumed that all children attending Grade 3 and above should be able to reach the highest level of achievement: ‘story’ for the literacy tests and ‘multiplication/division’ for the numeracy test.

Page 91: Improving Quality Education and Children's Learning ... · PDF fileImproving Quality Education and Children’s ... Test administration methods of assessments ... PIRLS Progress in

73

Typically, the national reports present the results of the assessment in terms of the percentage of in-school children reaching each of the performance levels by grade. In particular, countries focus on the outcomes of students above Grade 3. The focus of the regional reports is a comparison across the three countries of the percentage of children above Grade 3 who achieve the highest level of performance in each test. For the purpose of this study, and consistent with Uwezo's analytic approach, children experiencing LLOs are characterised as those enrolled in Grade 3 and above who are unable to achieve the highest level of performance.

Uwezo does not report the learning outcomes of out-of-school children. Two types of children are within this category: those who dropped out of school and those who never received formal education. A similar approach to that used for identifying in-school children with LLOs was used for the former group. Thus, children who dropped out of school in Grade 3 and above and who could not perform at the top level in each domain were identified as experiencing LLOs. For the group of children who never received formal education, the criterion to identify those with LLOs was age-based. Those 10 years old and above who could not perform at the top level in each domain were identified as experiencing LLOs.
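Combining the in-school, dropout and never-enrolled rules above, the identification can be sketched as one function; the names are illustrative, not Uwezo variable names.

```python
# Sketch of the LLO identification rules for Uwezo: children enrolled in,
# or who dropped out at, Grade 3 or above, or never-enrolled children aged
# 10 or older, who could not perform at the top level of a domain.
def experiencing_llo(status, reached_top_level,
                     grade=None, last_grade=None, age=None):
    if reached_top_level:              # top performers are never flagged
        return False
    if status == "in_school":
        return grade is not None and grade >= 3
    if status == "dropout":
        return last_grade is not None and last_grade >= 3
    if status == "never_enrolled":
        return age is not None and age >= 10
    return False

experiencing_llo("in_school", False, grade=4)      # True
experiencing_llo("in_school", False, grade=2)      # False: below Grade 3
experiencing_llo("never_enrolled", False, age=9)   # False: under 10
```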

Table 15 shows the final sample used for our analysis. The numbers are disaggregated by enrolment status. The application of the above-mentioned exclusion criteria resulted in the exclusion of 7 to 14 per cent of the assessed children from our analysis.

Only data for 2012 were used for the purpose of profiling students experiencing LLOs.

Performance data we used in the analyses were for language literacy in English (all countries) and Swahili (Kenya and Tanzania), as well as numeracy (all countries). Children were scored on their ability to perform different tasks of increasing difficulty in each domain assessed. For language literacy, students were graded according to their ability to successfully complete tasks that relate to letters, words, paragraphs and a story (in increasing order of difficulty). It is assumed that the ability to complete a task at a higher level means the child can complete the task at a lower level. For numeracy, students were graded on whether they were able to successfully complete tasks that relate to counting, numbers, values, addition, subtraction and multiplication. Children in Kenya were also given a division task.

Contextual information available from Uwezo used in this report includes:
• gender;
• age of student (6–9 years; 10–13 years; 14–16 years);
• type of wall at home (Kenya: mud, polythene, iron sheet, timber, stones/bricks; Tanzania: mud, burnt bricks, cement bricks, other);
• home resources (access to electricity, TV, radio, phone, clean water, car, fridge, motorbike);
• mother's level of education (none, some primary, some secondary, post-secondary) (options vary across countries);
• whether the child receives extra lessons or tuition (Kenya only);
• school type (public, private, other; completed);
• school location (Tanzania only).


Table 15. Uwezo final sample for analysis

Country | Year | Total | Children not retained | Children retained | In school | Dropout | Never enrolled
Kenya | 2009-10 | 74,781 | 6,089 (8.1%) | 68,692 (91.9%) | 94,489 (93.9%) | 622 (0.9%) | 3,582 (5.2%)
Kenya | 2011 | 125,661 | 13,377 (10.6%) | 112,284 (89.4%) | 105,286 (93.8%) | 1,147 (1.0%) | 5,851 (5.2%)
Kenya | 2012 | 145,564 | 15,180 (10.4%) | 130,384 (89.6%) | 121,617 (93.3%) | 1,355 (1.0%) | 7,412 (5.7%)
Tanzania | 2009-10 | 35,540 | 3,747 (10.5%) | 31,793 (89.5%) | 29,584 (93.1%) | 1,307 (4.1%) | 902 (2.8%)
Tanzania | 2011 | 110,435 | 10,743 (9.7%) | 99,692 (90.3%) | 89,931 (90.2%) | 5,404 (5.4%) | 4,357 (4.4%)
Tanzania | 2012 | 105,352 | 14,563 (13.8%) | 90,789 (86.2%) | 85,129 (93.8%) | 2,112 (2.3%) | 3,548 (5.4%)
Uganda | 2009-10 | 32,768 | 2,322 (7.1%) | 30,446 (92.9%) | 27,878 (91.6%) | 912 (3.0%) | 1,656 (5.4%)
Uganda | 2011 | 100,550 | 7,224 (7.2%) | 93,326 (92.8%) | 87,370 (93.6%) | 2,227 (2.4%) | 3,758 (4.0%)
Uganda | 2012 | 92,188 | 6,137 (6.7%) | 86,051 (93.3%) | 81,503 (94.7%) | 1,845 (2.1%) | 2,703 (3.1%)

(Percentages of children not retained and retained are of the total; percentages of the in-school, dropout and never-enrolled sub-groups are of children retained.)

Profiling methodology: PASEC
All students included in the databases for PASEC, which is a school-based assessment, are in the target population. Some difficulties were encountered in reconciling contextual information across the databases of the two countries that we focus on in this chapter, as the variables and the names used for them were not comparable. Information about the variables used in PASEC in these two countries was mostly taken from the French-language national reports (CONFEMEN, 2010a, 2010c).


For the Burundi Grade 2 database, a weighted sample of 2,694 students represents 405,429 students. For the Burundi Grade 5 database, a sample of 2,625 students represents 253,524 students.

For the Comoros Grade 2 database, a sample of 2,120 students represents 22,490 students. For the Comoros Grade 5 database, a sample of 1,945 students represents 9,765 students.

Achievement data for PASEC students are collected near the beginning and towards the end of each assessed grade. This enables measurement of the student’s improvement over the course of the year. We used performance data obtained towards the end of the year, consistent with PASEC reporting.

PASEC assesses students in French and mathematics for Burundi and Comoros. In addition, Grade 2 students from Burundi were also assessed in Kirundi.

In PASEC, there are three levels for international comparison (CONFEMEN, 2010b, p. 93):
• Level 1: Students who have a score of less than 25 (out of 100). These students are either responding randomly and would be considered to be failing at school, or close to it.
• Level 2: Students who have a score between 25 and 40 (out of 100).
• Level 3: Students who have a score between 40 and 100. Students at this level are considered to have acquired a basic level of knowledge.

For the PASEC datasets, students who have a score within the Level 1 range are considered to be experiencing LLOs.
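The three levels map onto scores out of 100 as sketched below. The handling of the exact boundary scores (25 and 40) is our assumption, as the report does not state which level a boundary score falls into.

```python
# PASEC international comparison levels (scores out of 100).
# Boundary handling (25 -> Level 2, 40 -> Level 3) is an assumption.
def pasec_level(score):
    if score < 25:
        return 1  # treated as experiencing LLOs in this study
    if score < 40:
        return 2
    return 3

def pasec_llo(score):
    """Level 1 students are the ones flagged as experiencing LLOs."""
    return pasec_level(score) == 1

pasec_level(10)   # 1
pasec_level(30)   # 2
pasec_level(75)   # 3
```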

Contextual information used in the analyses included the following:
• Gender (proportion of females);
• Whether the student is below the normal age for the grade (below 5 years old for Grade 2, or below 9 for Grade 5) and whether the student is above the normal age for the grade (above 8 years old for Grade 2, or above 11 for Grade 5);
• An indicator of household possessions – a tally of eight home possessions is used: electricity, a television, a telephone, a fridge, gas heating, a video recorder, a computer and a car. The indicator is grouped into three categories:
- less than three possessions;
- between three and five possessions;
- more than five possessions.
• If the student participates in farm work;
• If the student participates in housework;
• If the student participates in retail work;
• Whether work hinders the student's ability to study at home;
• Whether work hinders the student's ability to attend school;
• Whether work hinders the student's ability to concentrate at school;
• Literacy of parents (both father and mother);
• Language spoken at home (Shikomori, Arabic, French, English, Kirundi, Swahili, other);
• Presence of basic facilities in the school (library, computer room, toilets, electricity, drinking water).


Profiling methodology: TIMSS and prePIRLS
For the purpose of our study, data for TIMSS 2011 and prePIRLS 2011 were sourced and analysed. The TIMSS data relate to Grade 6 mathematics performance for Botswana. Grade 6 is outside the typical primary-level grade assessed by TIMSS (Grade 4).

The prePIRLS 2011 data are for Botswana and South Africa (Grade 4). PIRLS and TIMSS are run at different cycles: TIMSS is run every four years, whereas PIRLS is run every five years; in 2011 they were conducted at the same time.

Performance in prePIRLS can be linked to the PIRLS reading achievement scale: the PIRLS 2011 item parameters were used to anchor the prePIRLS scale. The prePIRLS results are reported on their own scale, but using the same 0–1000 metric used in TIMSS and PIRLS, given the widespread familiarity with that metric. The prePIRLS scale was centred at 500, set to the mean achievement of the three participating countries combined, and 100 points on the scale was set to the standard deviation of the combined achievement distribution.

As with PASEC, the relevant databases for TIMSS and prePIRLS only include sampled students from the target grade. Thus all cases in the databases were included in our analysis. The survey included contextual information collected from students (student questionnaire), from parents (parent questionnaire), teachers (teacher questionnaire), school principals (school questionnaire) and from national centres (national context survey). For our research, we used only data that can be linked back to the students, so the National Context Survey and the teacher questionnaire (which is sampled at the school level and is not identifiable to a particular classroom) were not considered.

Students' achievement on the items in TIMSS and prePIRLS is used to identify the knowledge and skills associated with achievement at particular points on the achievement scale. Each of the scales used in the studies has several benchmarks for reading, mathematics and science literacy. The TIMSS and PIRLS International Study Center worked with various expert groups and subject advisory committees to set benchmarks for reading, mathematics and science in terms of what children should be achieving. On the 0–1000 scale with a mean of 500, four benchmarks were set:
• Advanced International Benchmark (625);
• High International Benchmark (550);
• Intermediate International Benchmark (475);
• Low International Benchmark (400).

For our research, we consider students who do not achieve the Low International Benchmark (i.e., students who score less than 400) to be experiencing LLOs in that domain.

For mathematics, the Low International Benchmark of 400 was defined for Grade 4 as follows:
Students have some basic mathematical knowledge. They can add and subtract whole numbers. They have some recognition of parallel and perpendicular lines, familiar geometric shapes, and map coordinates. They can read and complete simple bar graphs and tables.

For reading literacy, the Low International Benchmark of 400 was defined for Grade 4 as follows: When reading literary texts, students can locate and retrieve an explicitly stated detail. When reading informational texts, students can locate and reproduce explicitly stated information that is at the beginning of the text.


The scales in each of the databases include five plausible values, which are used to estimate the standard error associated with the achievement level. For our study, only performance on the first plausible value was used to determine whether a student achieved the Low International Benchmark. There is a small margin of statistical error in using this methodology, but it was necessary in order to conduct the analyses. The relative size of the error is considered to be low.
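This classification step can be sketched as below. The only assumption is the list ordering (first element = first plausible value); the scores shown are invented toy values.

```python
# Flag a student as experiencing LLOs when the first of the five
# plausible values falls below the Low International Benchmark (400).
LOW_BENCHMARK = 400

def below_low_benchmark(plausible_values):
    """Only the first plausible value is used, as in this study."""
    return plausible_values[0] < LOW_BENCHMARK

# Toy values: only the first entry in each list matters here.
below_low_benchmark([387.2, 401.5, 395.0, 410.3, 399.1])  # True
below_low_benchmark([412.0, 390.0, 405.0, 388.0, 401.0])  # False
```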

TIMSS and prePIRLS collect a wide variety of contextual data. The variables chosen for our analysis are those that are comparable with the information collected in other studies (such as home background, household possessions, parental education), and those believed to be related to students experiencing LLOs, such as relevant school-related factors. Contextual information from the student and parent questionnaires used in the analyses included the following:
• Gender;
• Age (12 years or less; between 12 and 14 years; 14 years or older);
• Test language spoken at home;
• Home resources for learning index (number of books in the home, number of children's books in the home, number of home study supports, highest parental education level, highest parental occupation level), categorised into:
- many resources;
- some resources;
- few resources;
• Highest parental education levels;
• Student engagement with reading at school (prePIRLS scale);
• Preschool attendance;
• Engagement in numeracy activities before beginning primary school (TIMSS and PIRLS International Study Center);
• Engagement in literacy activities before beginning primary school (prePIRLS);
• Competency in early numeracy tasks when beginning primary school (TIMSS and PIRLS International Study Center);
• Competency in early literacy tasks before beginning primary school (prePIRLS).

Contextual information collected from the school questionnaire used in the analyses included:
• presence of a school library;
• presence of computers used for instruction;
• school location.

Methodology for Uwezo trend analysis
There is a distinct lack of assessment data available for trend analysis in ESAR. To analyse within-country trends over time, assessments must be administered at least twice. It is also important to consider that learning outcomes can only be compared between different implementations of the same assessment that share the same key features – such as assessment framework, design, target population, and conditions of administration – across multiple cycles. Any comparison of results between different assessments, or between different cycles of the same assessment that lack the same key features, requires sophisticated linking procedures and analyses that are beyond the scope of our report.


Uwezo data we obtained across three cycles (2009/2010, 2011, 2012) allowed us to examine trends in student learning over time. We tracked performance data for each country across the three cycles. English and mathematics performance was tracked in Kenya, Tanzania and Uganda, whereas Swahili performance was only tracked in Kenya and Tanzania.

To conduct the analyses, several steps were taken to prepare the databases. First, the steps detailed in Table 15 were carried out to ensure that the appropriate target population was being tested. The datasets were then aggregated so that a single dataset could be used to examine trends. Dummy variables were set up to indicate whether the student was able to complete each stage of the assessment.

These dummy variables were then used to determine the percentage of students who could complete each task correctly in each year for each domain. Simple logistic regressions were then computed to determine whether differences in percentages across years were significant and, if so, odds ratios were used to express the magnitude of these differences. More emphasis was placed on the percentages of students at the extremes – students able to complete the most difficult task, or not able to complete any tasks. Changes in percentages at these extremes are largely dependent on the other categories: an increase in the percentage of children completing a task at one level corresponds to an equivalent decrease in percentage across other tasks.
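The odds-ratio step can be illustrated with toy proportions (these are not Uwezo results):

```python
# Odds ratio expressing a year-on-year change in the proportion of
# children completing a task. The proportions below are invented.
def odds_ratio(p_year1, p_year2):
    """Odds of completing the task in year 2 relative to year 1."""
    odds1 = p_year1 / (1.0 - p_year1)
    odds2 = p_year2 / (1.0 - p_year2)
    return odds2 / odds1

# 40% completed the hardest task in one year, 50% the next:
change = odds_ratio(0.40, 0.50)
# change is about 1.5: the odds of completing the task rose by half.
```

An odds ratio above 1 indicates improvement between the two cycles; below 1, a decline.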

Where sizable differences existed, contextual variables were included in the logistic regression model to determine whether factors such as gender and age differences of the groups in each year might account for these differences.


Methodology for Chapter 3: Improving learning outcomes in the ESA region: Effective country-level practices

Literature review
The literature review we undertook is the main source of information for effective country-level practices in improving literacy and numeracy learning outcomes of primary-school students in the ESA region. The literature reviewed was complemented with information gathered during our stock-taking of assessments (Methodology for Chapter 2), most of which was provided by UNICEF ESARO and UNICEF country offices. In this section, we describe the literature review process, which was based on the details outlined in the Inception Report on specific methodology (ACER, 2014b).

Our search strategy sought to identify published as well as 'grey' literature about programmes that were shown to have had an effect on student learning, as evidenced by learning outcomes data. It also looked at monographic and demographic studies that used qualitative rather than quantitative approaches. Literature and relevant databases held by different organisations working in the region, particularly UNICEF, UNICEF ESARO, DfID and the World Bank, were included in the review. Additionally, peer-reviewed journals and Internet references were used to search for relevant literature. Strategies used in the literature searches included the following:
• Electronic searches of bibliographic databases. These initial searches were conducted by ACER's experienced information librarians, using our Cunningham Library catalogue and online search engines such as Google Scholar to search for journal articles and reports relevant to the scope of the review.
• Targeted searches of online holdings of international/regional agencies, research firms and national ministries in the region. This included targeting known international, regional and national agencies that have implemented programmes to improve learning for learners in the region, particularly the disadvantaged. These included DfID, UNESCO, UNICEF, UNICEF ESARO, UNICEF Country Offices in the region, and the World Bank. Additionally, the publications of relevant research bodies, such as the Research Triangle Institute (RTI) and the International Initiative for Impact Evaluation (3ie), were included. Websites of national ministries in the region were also searched for any relevant publications.
• Regional databases such as African Journals Online (AJOL), which offers peer-reviewed articles from southern scholars, and the Association for the Development of Education in Africa's (ADEA) online database.
• Citation chasing. This involved checking the references of relevant publications to identify possibly relevant literature, as well as forward-citation tracking using Scopus – searching through the list of papers/studies that cited relevant literature.
• Contacting relevant groups. This entailed collaboration between ACER, UNICEF ESARO and UNICEF Country Offices to access additional literature and information.

Identification of effective country-level programmes with a focus on improving learning outcomes of disadvantaged children in the region


The main sources for identifying programmes with a focus on improving learning outcomes in literacy and numeracy of disadvantaged children were the literature review, as well as information obtained during stock-taking – mainly from UNICEF ESARO and country offices. Any literature that seemed relevant to the scope of this study was initially included in a large pool. Programmes were then identified, closely examined and selected for further analysis of effective practices.

To select the programmes, we applied three main principles: the programme (1) aims to improve learning outcomes in literacy and numeracy of (2) disadvantaged children, and (3) evaluation mechanisms are in place to measure children's learning outcomes.

In Chapter 3, the definition of ‘disadvantaged children’ is based on the definition used in the relevant programmes. A comparison of the type of disadvantage addressed in each programme is provided in the main text of Chapter 3.


Appendix II

Detailed tables for Chapter 1: Stock-taking and comparative analysis of assessments

Table 16. Assessment implementation by type of assessment

Country | International | Regional | National | EGRA/EGMA
Angola | – | – | – | √
Botswana | √ | √ | – | –
Burundi | – | √ | – | √
Comoros | – | √ | – | –
Eritrea | – | – | √ | –
Ethiopia | – | – | √ | √
Kenya | – | √ | √ | √
Lesotho | – | √ | √ | –
Madagascar | – | – | – | √
Malawi | – | √ | √ | √
Mozambique | – | √ | √ | √
Namibia | – | √ | √ | –
Rwanda | – | – | √ | √
Somalia | – | – | √ | √
South Africa | √ | √ | √ | –
South Sudan | – | – | – | –
Swaziland | – | √ | – | –
Tanzania | – | √ | – | √
Uganda | – | √ | √ | √
Zambia | – | √ | √ | √
Zimbabwe | – | √ | √ | –
Total | 2 | 14 | 13 | 12


Table 17. ESAR countries with limited assessment activity in recent years

Country Notes

Angola Only known assessment activity is one EGRA implementation

Burundi Only known assessment activity is one EGRA implementation and one PASEC administration

Comoros Only known assessment activity is one PASEC administration

Eritrea Only known assessment activity is one MLA implementation

South Sudan No known assessment activity


Table 18. Analytical techniques used in the assessments from the stock-taking of assessments in ESAR

For each assessment, the table records whether the following applied: IRT used; competency levels/benchmarks established; frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest; frequency analyses conducted on contextual data; relationship between cognitive performance and contextual factors explored via analytical techniques; trends in cognitive performance computed; and international comparisons of cognitive data reported.

National assessments covered: MLA (Eritrea); National Learning Assessment (NLA) (Ethiopia); NASMLA (Kenya); LNAEP (Lesotho); Assessment of Grades 1, 2 and 3 in Lesotho (Lesotho); Assessing Learner Achievement (Malawi); MLA (Malawi); National Assessment (Mozambique); NSAT (Namibia); LARS (Rwanda); MLA (Somalia); Annual National Assessment (South Africa); NALA (South Africa); NAPE (Uganda); NALA (Zambia); ZELA (Zimbabwe). All entries were recorded as unknown for Assessing Learner Achievement (Malawi), the National Assessment (Mozambique), NSAT (Namibia) and NALA (South Africa).

EGRA/EGMA assessments covered: EGRA (Angola); EGRA (Burundi); EGRA system-level diagnostics in 2010 and 2011 and an EGRA programme evaluation (Ethiopia); EGRA and EGRA/EGMA (Kenya); EGRA (Madagascar); EGMA, an EGRA programme evaluation and EGRA system-level monitoring (Malawi); EGRA programme evaluations (Literacy Boost and APAL) (Mozambique); EGRA/EGMA (Rwanda); EGRA (Somalia); EGRA/EGMA (Tanzania); an EGRA system-level diagnostic and an EGRA programme evaluation (Uganda); EGRA/EGMA, an EGRA programme evaluation and an EGRA system-level diagnostic (Zambia). All entries were recorded as unknown for the Burundi and Madagascar EGRAs and for the Zambia EGRA programme evaluation and system-level diagnostic.

Regional assessments covered: Uwezo (Kenya, Tanzania, Uganda); PASEC (Burundi, Comoros); SACMEQ (Botswana, Kenya, Lesotho, Malawi, …).

[The per-assessment tick marks in the original table could not be reliably recovered from this copy.]

orm

ance

an

d co

ntex

tual

fa

ctor

s ex

plor

ed v

ia

anal

ytic

al

tech

niqu

es

Tre

nds

in

cogn

itive

pe

rfor

man

ce

com

pute

d

Inte

rnat

iona

l co

mpa

rison

s of

cog

nitiv

e da

ta

repo

rted

Page 107: Improving Quality Education and Children's Learning ... · PDF fileImproving Quality Education and Children’s ... Test administration methods of assessments ... PIRLS Progress in

89

Typ

e of

as

sess

men

tA

sses

smen

tC

ount

ryIR

T u

sed

Com

pete

ncy

leve

ls/

benc

hmar

ks

esta

blis

hed

Freq

uenc

y an

alys

es

cond

ucte

d/

mea

n sc

ores

ca

lcul

ated

fo

r co

gniti

ve

resu

lts,

disa

ggre

gate

d by

con

text

ual

varia

bles

of

inte

rest

Freq

uenc

y an

alys

es

cond

ucte

d on

co

ntex

tual

da

ta

Rel

atio

nshi

p be

twee

n co

gniti

ve

perf

orm

ance

an

d co

ntex

tual

fa

ctor

s ex

plor

ed v

ia

anal

ytic

al

tech

niqu

es

Tre

nds

in

cogn

itive

pe

rfor

man

ce

com

pute

d

Inte

rnat

iona

l co

mpa

rison

s of

cog

nitiv

e da

ta

repo

rted

Reg

iona

lSA

CM

EQM

ozam

biqu

e√

√√

SAC

MEQ

Nam

ibia

√√

√√

SAC

MEQ

Sou

th A

fric

a√

√√

SAC

MEQ

Sw

azila

nd√

√√

SAC

MEQ

Tanz

ania

√√

√√

SAC

MEQ

Uga

nda

√√

√√

SAC

MEQ

Zam

bia

√√

√√

SAC

MEQ

Zim

babw

e√

√√

Inte

rnat

iona

lPI

RLS

, pr

ePIR

LSBot

swan

a√

√√

PIRLS

, pr

ePIR

LSSou

th A

fric

a√

√√

√√

TIM

SS

Bot

swan

a√

√√

√√

TIM

SS

Sou

th A

fric

a √

√√

√√

Tot

al58

21

32

50

19

33

27

6


Appendix III: Country case studies

The aim of the two case studies we undertook, for Zimbabwe and Rwanda, was to obtain a deeper understanding of specific practices implemented to measure and improve the literacy and numeracy learning outcomes of primary school children in the long term. Both countries have developed a national assessment system that provides data on student learning outcomes in literacy and numeracy, as well as capturing important contextual background information that allows the exploration of relationships between achievement and context factors. Furthermore, the implementation of assessment systems in both countries helped local staff acquire the knowledge and skills for future innovations and developments.


91

Zimbabwe

Country context

Zimbabwe is classified by the World Bank as a low-income country, with a population of just over 14 million in 2013 (World Bank, 2015). Thirty per cent of the population live in urban areas, the majority in Harare and Bulawayo (UNICEF, 2013b).

The Constitution of Zimbabwe, Amendment No. 20, recognises 16 official languages (Government of Zimbabwe, 2013). These languages include Chewa, Chibarwe, English, Kalanga, Koisan, Nambya, Ndau, Ndebele, Shangani, Shona, sign language, Sotho, Tonga, Tswana, Venda and Xhosa. Amendment No. 20 states that each language must be treated equitably and that government must create conditions for the development of the official languages.

Education

The Education Act 1987 makes provisions for three languages to be taught in all primary schools from Grade 1: English, Shona and Ndebele. Primary education is designed to equip learners with language skills in Shona and English or Shona and Ndebele (UNESCO, 2010). The Education Amendment Bill 2005 was passed in February 2006 and proposes the teaching of ‘three main languages of Zimbabwe mainly English, Shona and Ndebele and such other local language in all schools up to form two on an equal time basis’ (MOESAC, 2006). The bill also states that prior to Form 1 any language that is best understood by the pupils may be used in instruction (MOESAC, 2006). Zimbabwe’s formal education structure includes seven years of primary education (beginning at the age of six and ending at Grade 7), four years of lower secondary education (Forms 1–4), and two years of upper secondary education (Forms 5–6). As of early 2015, external assessments are conducted in the form of the Grade Seven Certificate at the end of the primary cycle, the O-Level examination at the end of the lower secondary cycle, and the A-Level examination at the end of the upper secondary cycle.


After gaining independence in 1980, the Government of Zimbabwe expanded access to primary school education, which resulted in the number of primary school enrolments more than doubling in seven years. By 1982, primary enrolment rates were reported at almost 100 per cent (Nyanguru and Peil, 1991). However, between 1982 and 2004 enrolment rates decreased and in 2008 the provision of education services deteriorated dramatically because of the election period and hyperinflation. During this time, student attendance fell to around 20 per cent, and teacher attendance to about 40 per cent (UNICEF, 2008b).

Zimbabwe’s education system was ‘once arguably the best on the continent’, but since 2000 the education sector has experienced significant deterioration due to declining financial assistance (UNICEF, 2011, p. 1). To replace the drop in government funding, a system of fees, levies and incentives was imposed that has affected access to and quality of education, particularly for the most disadvantaged children. In addition, the lack of funding has affected school and learning supervision, planning and policy development related to school and system governance, teacher in-service training and school environments in general (UNICEF, 2011).

In 2009, the sector slowly began to recover, with education made a priority in the new government’s Short Term Emergency Recovery Programme (Government of Zimbabwe, 2009). After a dramatic decrease in primary school completion rates between 1996 (82.6 per cent) and 2006 (68.2 per cent), completion rates rose to 82.4 per cent in 2009 (UNICEF, 2012).

However, there are still significant concerns about the provision of quality education for primary school children in Zimbabwe. Demographic and Health Survey (DHS) statistics indicate that the nation’s rural and poor citizens are substantially overrepresented in drop-out and repetition rates (UNICEF, 2008a). O-level pass rates are still extremely low, and there remains limited access to important material and non-material resources that support teaching and learning (MOESAC, 2009).

To address these shortcomings, the Ministry of Education, Sport, Arts and Culture (MOESAC) launched the Education Transition Fund (ETF) in 2009, managed by UNICEF (UNICEF, 2011).77 The purpose of the ETF was to improve the quality of education through the provision and delivery of essential teaching and learning materials for primary schools, and through high-level technical assistance to MOESAC. ETF entered its second phase in 2011 with the overall goal of continued support and revitalisation of the education sector. ETF was renamed the Education Development Fund (EDF) in 2014.

EDF support in Phase II focuses on activities in the following areas, linked with the Ministry of Education’s Strategic Investment Plan (MOESAC, 2011): School and System Governance; Teaching and Learning; and Second Chance Education. Key activities within these themes include strengthening education delivery mechanisms; improving the quality of education services; improving access, retention, completion and achievement of learners; and a continued focus on the most vulnerable and out-of-school children (UNICEF, 2014). Improvements in access to education and in student learning outcomes have been confirmed by the 2013 Education Management Information System (EMIS), the 2014 Zimbabwe Early Learning Assessment (ZELA) and the 2014 Multiple Indicator Cluster Surveys (MICS) (UNICEF, 2014).

77 In 2014, the Ministry of Education, Sport, Arts and Culture (MoESAC) was renamed the Ministry of Primary and Secondary Education (MOPSE).


This case study explores emerging trends from ZELA in student learning outcomes and in the provision of textbooks and teaching materials procured through EDF. It also reviews the multi-year programme of an intensive capacity-building partnership between the Zimbabwe School Examinations Council (ZIMSEC) and ACER. The capacity-building programme supports the long-term sustainability of ZELA through system strengthening in assessment, data management and analysis. Kenneth Russell, EDF Manager at UNICEF Zimbabwe, shared his experience with the ZELA Capacity-building Programme for this case study.

The latest available UNICEF annual country report for Zimbabwe is from 2013. Multiple sources of data in that report suggest that children in Zimbabwe were better off in 2013 than in the previous five years (UNICEF, 2013b). The report notes a primary net enrolment rate of 95.6 per cent and a secondary net enrolment rate of 52 per cent. The gender parity index was 1.01, and the primary completion rate at the time of the report was 86.7 per cent. Access to and quality of education were reportedly enhanced through the provision of textbooks; training and supervision of teachers in 35 per cent of primary and secondary schools; and improved water, sanitation and hygiene (UNICEF, 2013b, p. 1).

In addition to ZELA (2012–2015), Zimbabwe participated in SACMEQ I (1995–1999) and SACMEQ III (2005–2010). Zimbabwe played a significant role in the development of SACMEQ: research generated from a 1989 collaboration between Zimbabwe’s Minister for Education and Culture and the Director of IIEP UNESCO led to dialogue that eventually resulted in the creation of the SACMEQ consortium (SACMEQ, 2013).

Zimbabwe Early Learning Assessment

The Zimbabwe Early Learning Assessment (ZELA) is a four-year programme commissioned by UNICEF to support and enhance the national capacity to review, reform and re-orient the current system of student assessment in Zimbabwe. It establishes a baseline to help determine whether the EDF programme (2010–2015) has had the desired effects on children, their caregivers, schools and the education sector in general, and it examines the extent to which any changes identified are attributable to the EDF programme interventions. ZELA’s defined target population is students beginning Grade 3 of primary school (ACER and ZIMSEC, 2013).

Main purpose and components

The goal of the ZELA project is to monitor and evaluate the effects of the EDF programme through the introduction of an early-grade learning assessment in language and mathematics.

ZELA measures student performance in language and mathematics. Information is also collected at the school and student level. School head and pupil questionnaires collect information about student background, teaching resources, funding and infrastructure.

The test domains are mathematics and language, including English as well as Ndebele and Shona. Tests were developed in Zimbabwe in February 2012, January 2013 and January 2014, by panels of ZIMSEC subject specialists and curriculum managers (ACER and ZIMSEC, 2015).


Main findings regarding effective strategies and factors

The ACER data, collected over three cycles of ZELA, indicate that socio-economic status remains a strong predictor of performance and is associated with large differences in assessment results across Zimbabwe. In English and mathematics, socio-economically advantaged pupils and schools tend to outscore their disadvantaged peers by larger margins than those observed between any other groups of pupils. There are also large differences in pupils’ performance between provinces and between urban and rural areas (ACER and ZIMSEC, 2015).

Key findings include the following:
• The percentage of students performing at or above the grade-appropriate level in English after completing Grade 2 in Zimbabwe was 49 per cent in 2012, 54 per cent in 2013 and 51 per cent in 2014. The 2014 results were not statistically significantly different from those of the previous years. The 2012 baseline study reported that the percentage of students performing at or above the grade-appropriate level in mathematics after completing Grade 2 was 46 per cent. This increased substantially to 63 per cent in 2013 and again significantly to 67 per cent in 2014.

• Girls have continued to outperform boys in English and mathematics from 2012 to 2014. In 2014, more girls than boys reached the benchmark for English (by 9 percentage points) and mathematics (by 6 percentage points). The performance of girls in English was significantly higher in 2014 than in 2012 (by 3.8 percentage points), and there was a moderate positive trend in mathematics performance since 2012 for both boys (11.9 percentage points) and girls (11.6 percentage points). These trends are similar to those of other southern African nations: findings indicate that gender differences change little within southern African countries. Where girls perform better they tend to continue performing better, and where boys perform better they tend to continue performing better (Saito, 2011).

• Students in urban schools significantly outperformed students in rural schools in both English (by 42 percentage points) and mathematics (by 25 percentage points). More than eight of 10 urban students reached the benchmark in both English and mathematics, while only four of 10 rural students reached the English benchmark and six of 10 the mathematics benchmark.

• Students in registered schools outperform students in satellite schools in both English and mathematics. Students in registered schools performed better on ZELA 2014 by 18 percentage points in English and 11 percentage points in mathematics.

• Socio-economically advantaged pupils and schools tend to outscore their disadvantaged peers by larger margins than those observed between any other groups of pupils. The percentage of students performing at or above grade level in English was 34 per cent for the lowest socio-economic status (SES) quartile and 77 per cent for the highest SES quartile, a difference of 43 percentage points. In mathematics the difference was also clear, but smaller in magnitude: 53 per cent of low-SES pupils performed at or above grade level, compared with 84 per cent of high-SES pupils, a difference of 31 percentage points (ACER and ZIMSEC, 2015).
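Disaggregated results such as those above are produced by grouping pupil records and computing the share reaching the benchmark within each group. The following is a minimal sketch with invented data; the field names are hypothetical, not ZELA’s actual variables:

```python
from collections import defaultdict

def benchmark_rates(pupils, group_key):
    """Percentage of pupils reaching the benchmark, per subgroup.
    `pupils` is a list of records; `at_benchmark` is 1 or 0."""
    totals = defaultdict(int)
    reached = defaultdict(int)
    for p in pupils:
        group = p[group_key]
        totals[group] += 1
        reached[group] += p["at_benchmark"]
    return {g: 100.0 * reached[g] / totals[g] for g in totals}

# Four invented pupil records, disaggregated by school location:
pupils = [
    {"location": "urban", "at_benchmark": 1},
    {"location": "urban", "at_benchmark": 1},
    {"location": "rural", "at_benchmark": 1},
    {"location": "rural", "at_benchmark": 0},
]
print(benchmark_rates(pupils, "location"))  # {'urban': 100.0, 'rural': 50.0}
```

The same grouping logic applies to any contextual variable collected in the questionnaires, such as sex, province or SES quartile.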

Several relationships have been observed between student performance and student background, teaching or infrastructure variables. These relationships are all correlational, and not necessarily causal.


The school-level variance in performance was found to be relatively high, indicating that schools vary substantially in average student performance. In line with the aims of the EDF programme, one would expect to see a reduction in the proportion of school-level variance over the EDF programme cycle (ACER and ZIMSEC, 2015).
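The school-level share of variance can be illustrated with a simple analysis-of-variance decomposition: the squared deviations of school means from the grand mean, relative to total variation. This is a sketch with made-up scores, not the model used in the ZELA analysis:

```python
from statistics import mean

def school_variance_share(scores_by_school):
    """Share of total score variation that lies between schools
    (between-school sum of squares / total sum of squares)."""
    all_scores = [s for school in scores_by_school for s in school]
    grand = mean(all_scores)
    # Between-school component: squared deviation of each school mean
    # from the grand mean, weighted by school size.
    between = sum(len(sch) * (mean(sch) - grand) ** 2 for sch in scores_by_school)
    total = sum((s - grand) ** 2 for s in all_scores)
    return between / total

# Two schools with very different average scores: most of the
# variation lies between schools rather than within them.
share = school_variance_share([[40, 45, 50], [70, 75, 80]])
print(round(share, 2))  # 0.93
```

A falling value of this share over successive ZELA cycles would indicate that schools are becoming more alike in average performance, the pattern the EDF programme aims for.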

ZELA is in its evaluation phase in 2015, and it is too early to draw conclusions from the study beyond some of the indicative trends noted earlier. The EDF programme distributed textbooks and teaching materials to all schools in Zimbabwe. Given the relatively low base from which some pupils may be starting, combined with increasing exposure to reading materials, one would expect to see long-term gains in pupil performance over the EDF programme cycle.

ZELA Capacity-building Programme

ZELA also targets system-level capacity: one of its key components has been to support and enhance national capacity in student assessment. In 2012, ACER worked with ZIMSEC to construct four tests and two surveys. This activity was followed by the administration of these tools in 500 schools in Zimbabwe and, through the ACER and ZIMSEC partnership, the analysis, standardisation and reporting of pupil achievement levels. Training in assessment and data analysis was conducted in 2012, and ZIMSEC took increasing responsibility for these activities in each subsequent cycle of ZELA.

In 2013, ZIMSEC indicated that the training needs of its staff included the following topics:
• Analysis of the relationships of student background characteristics, teaching and learning, and funding and facilities with pupil performance (using SPSS statistical analysis software);
• Intensive and practical training on IRT (including use of ACER ConQuest);
• Knowledge and skills in school-based assessment (in theory and practice).
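Item response theory (IRT), one of the training needs listed, models the probability that a pupil answers an item correctly as a function of the pupil’s ability and the item’s difficulty. The following one-parameter (Rasch) model is a minimal illustrative sketch, not ZIMSEC’s or ACER ConQuest’s actual implementation:

```python
import math

def rasch_p(ability, difficulty):
    """Probability of a correct response under the Rasch model:
    P(correct) = 1 / (1 + exp(-(ability - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A pupil whose ability equals the item's difficulty has a 50 per cent
# chance of answering correctly:
print(round(rasch_p(0.0, 0.0), 2))   # 0.5
# An easier item (difficulty -1 on the same scale) raises that chance:
print(round(rasch_p(0.0, -1.0), 2))  # 0.73
```

Software such as ACER ConQuest estimates the ability and difficulty parameters from observed response data; the function above shows only the model’s core relationship.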

In 2014, an SPSS Roundtable was organised to reinforce the 2013 and 2014 capacity-building activities and to ensure that ZIMSEC colleagues were fundamentally involved in the analysis of ZELA data and the drafting of reports. Intensive and practical training on IRT (including the use of ACER ConQuest software) was provided during a three-week training programme, along with technical assistance on IRT. ACER also received a ZIMSEC delegation in Australia and introduced key ZIMSEC staff to school-based assessment (SBA) ahead of the 2015 capacity-building activities in this focus area.

In 2015, the focus of capacity-building activities with ZIMSEC is SBA. These activities include facilitated workshops with ZIMSEC and key government stakeholders, and a pilot research project with schoolteachers in Zimbabwe. As in 2013 and 2014, an SPSS Roundtable will be conducted with ZIMSEC during the Impact Evaluation report-writing stage. Similarly, an IRT Roundtable will be conducted with ZIMSEC in order to build on the technical assistance and workshops provided in previous ZELA cycles.

The expanded activities also include the placement of a technical assistance officer within ZIMSEC for up to two months per year (Kenneth Russell, UNICEF Zimbabwe, personal communication, 16 April 2015).


ZELA Capacity-building Programme: Experience of the EDF manager

Kenneth Russell, EDF Manager at UNICEF Zimbabwe, shared his experience with the ZELA Capacity-building Programme. The following is a summary of his responses to questions about the implementation of the capacity-building support, along with some success stories and an outline of the challenges encountered.

How is the ZELA Capacity-building Programme implemented?

Most of the capacity-building activities were (or are planned to be) delivered through facilitated workshops. The placement of the technical assistance officer is different, however, and is one of the distinctive strategies used by ZELA to support capacity-building. It not only allows for ready access and sustained support but has also helped to deepen relationships between the technical officer and ZIMSEC, and to strengthen the partnership between the entities (which might survive beyond the project).

What are some success stories from ZELA’s Capacity-building Programme?

It is difficult to provide success stories from the Capacity-building Programme without an assessment of the effect of the support that has been provided. What we know from our discussions with ZIMSEC, and from its work, is the following:
• ZIMSEC played a greater role in the analysis of 2014 data, and in the preparation of the report, than it had done previously. This is due in part to the support it has received in data analysis.

• ZIMSEC has spoken publicly about its increased capacity in IRT. This is a new area of work and a new approach to analysis for ZIMSEC, but one it is interested in continuing to use for ZELA and its other assessments.

• ZIMSEC is at the forefront of national discussions on continuous assessment, and SBA specifically, because of the support provided to them through ZELA. They were exposed to good practices in Australia and had opportunities to reflect on how to apply some of these lessons to Zimbabwe.

• ZELA has provided opportunities for ZIMSEC, as well as provincial and district staff and teachers, to engage in developing items for the assessment. This helped to deepen participants’ understanding, and helped the organisation to grow in how it designs items for other assessments. A critical aspect of this area of capacity-building is the diversity of those participating and hence the potential for a domino effect in the system.

• Institutional capacity has also been enhanced through the provision of software such as SPSS and ACER ConQuest, computers and motor vehicles. These enhance the organisation’s access to technology to support its work, as well as its ability to monitor field activities and supervise staff.


What are the main barriers you have encountered that limit sustainable capacity-building, and what ways were considered to overcome these barriers?

The major barrier to sustainable capacity-building is the ‘projectised’ approach taken with ZELA. While it is necessary to test and experiment before institutionalising, such a critical project creates expectations and practices that might not be sustainable when mainstreamed. The project approach also resulted in the capacity-building activities being viewed as parallel to, or outside, the normal functioning of the organisation. As a result, capacity is built primarily in those who are involved in the project, despite its applicability and relevance to other aspects of the organisation. ZELA invested heavily in a small number of core staff who have done great work, but the effect of their leaving ZIMSEC would be potentially catastrophic for ZELA.

Another challenge to the sustainability of built capacity is the concentration of investment in capacity-building within ZIMSEC, to the exclusion of other organisations that will be critical to sustainability in the years ahead. While ZIMSEC has done a great job, the implementation arrangements could change during institutionalisation. In that case, new players could be playing critical roles for which they have not had the required capacity-building.

These barriers are the focus of the final year of ZELA as a project. Much will depend on the institutional arrangements agreed for ZELA beyond the current phase (up to the end of 2015).

We thank Kenneth Russell for sharing his experiences for this research.


78 http://www.unicef.org/rwanda/overview.html

Rwanda

Country context

Rwanda is the most densely populated country in Africa, with a population of over 11 million; half of its citizens are under the age of 18. Despite impressive economic growth in the two decades since the 1994 genocide, Rwanda remains one of the poorest countries in the world, with 44 per cent of the population living below the poverty line. Approximately 80 per cent of the population live in rural areas, although with increasing urbanisation since 1994, the urban share is expected to grow to 30 per cent of the population by 2020. One of the main development goals set out in Rwanda’s Vision 2020 and Economic Development and Poverty Reduction Strategy is to move from an agriculture-based economy to ‘a knowledge-based hub for business and information technology’ by 2020.78

Education

Primary education in Rwanda starts at the age of seven and comprises six years. Together with three years of lower secondary education, this gives Rwanda nine years of compulsory basic education (Rwanda Education Ministry, 2014, p. 1). In 2011, a strategy was launched to expand access to education from nine to 12 years of basic education (UNICEF, 2013a, p. 2). The transition from primary to lower secondary education is based on a national examination at the end of primary education (Rwanda Education Ministry, 2014, p. 1).


Equitable access to education and high-quality education are priorities for the government of Rwanda, which aims to provide its citizens with the skills and knowledge required for the socio-economic development of the country (Rwanda Education Ministry, 2014, p. 1). Since the 2003/04 school year, fees from primary to secondary education have been gradually abolished in an effort to increase enrolment, retention and completion rates in basic education, especially for vulnerable children (Rwanda Education Board, 2012, p. 11). Rwanda is one of the few African countries on track to achieve seven of the eight Millennium Development Goals, one of which is universal access to primary education by 2015.79 In 2013, primary school enrolment in Rwanda reached 97 per cent (98 per cent for girls). However, the primary education completion rate was still low in 2013, at 69 per cent (64 per cent for boys and 74 per cent for girls) (Rwanda Education Ministry, 2014, pp. 12, 14).

The large increase in enrolment numbers poses enormous challenges for the education system, especially for the provision of adequate learning spaces in primary education (Rwanda Education Board, 2012, p. 11).

Another key challenge for Rwanda’s education system is improving the quality of education. The government addresses the remaining disparities in access to education and the improvement of education quality in the Education Sector Strategic Plan (ESSP) for 2013/14–2017/18, developed in consultation with UNICEF and other development partners. The plan focuses on reducing the dropout rate, and on improving access and retention for the most vulnerable children, including children with special needs (UNICEF, 2013a, p. 21). To improve the quality and relevance of education, the strategic priorities are curriculum development; quality standards, assurance and assessments; textbook distribution; improving teaching and learning; and implementation of a system for monitoring learning achievement at the school and national levels (UNICEF, 2013a, p. 22). Key elements of UNICEF’s programme to support the government of Rwanda in its strategy to increase quality education are curriculum review, teacher development and the use of learning achievement assessments (UNICEF, 2013a, p. 2).

Learning Achievement in Rwandan Schools (LARS)

An important development in Rwanda’s quality standards and assurance programme for education is the 2011 introduction of Learning Achievement in Rwandan Schools (LARS).

Main purposes and components

The main purposes of LARS are to measure the level of achievement in literacy and numeracy at the national level, to determine factors associated with student achievement – especially low achievement – and to monitor achievement over time. As a monitoring tool, LARS provides the Ministry of Education with a reliable database on learning outcomes as a basis for recommendations to policymakers and other stakeholders for future improvement (Rwanda Education Board, 2012, p. 12).

79 <http://www.unicef.org/rwanda/overview.html>.


To achieve these goals, LARS measures student achievement in literacy and numeracy at the Grade 3 level in public schools, government-aided schools and private schools. To capture completion of the lower primary level, the target population consisted of students who had completed Grade 3 and were in the second term of Grade 4 (Rwanda Education Board, 2012, p. 13).

The literacy component of LARS focuses on writing and reading skills in the Kinyarwanda language. The numeracy component captures skills in numeration and operations, the metric system, and geometric figures (shapes), in conformity with guidelines from the national mathematics curriculum (UNICEF, 2013a, p. 19). To identify the relevant indicators and factors related to low learning achievement in Rwandan schools, background data were collected through questionnaires for students, parents, teachers and school administrators.

Main findings regarding effective strategies and factors

The LARS baseline report is based on a nationally representative sample of approximately 2,500 students in 60 schools across Rwanda (Rwanda Education Board, 2012, p. 15).

Key findings include the following:
• A significant percentage of students fail to meet curricular expectations: 37 per cent in literacy and 46 per cent in numeracy, compared with 55 per cent of students meeting the expectations in literacy and 27 per cent in numeracy (Rwanda Education Board, 2012, p. 41). The percentage of students failing to meet curricular expectations is thus higher, and results are more variable, for numeracy than for literacy.

• Numeracy results vary significantly between provinces and between districts (Rwanda Education Board, 2012, p. 55). Significant differences between some of the districts are also reported for literacy (Rwanda Education Board, 2012, p. 43).

• Students in rural areas are disadvantaged in meeting curricular standards compared with their peers in urban areas (Rwanda Education Board, 2012, p. 46). Achievement distribution in both literacy and numeracy is relatively equal for girls and boys (Rwanda Education Board, 2012, pp. 47, 55).

• A major impact at the school level is also made by the higher-performing children of high-income parents.

• Another factor influencing achievement is teachers’ or head teachers’ average years of experience (Rwanda Education Board, 2012, pp. 51, 61). Interestingly, students whose teachers have the least experience appear to perform better. One likely explanation mentioned in the report is that new teachers, recruited during the rapid expansion of the Rwandan education system in response to rising enrolment, are significantly more skilled than previous cohorts (Rwanda Education Board, 2012, p. 60).

Two main shortcomings affect the analysis of parent and classroom characteristics. First, the response rate among parents is very low. Second, student data were not linked to teacher and parent data at an individual level. Thus, students can only be linked through the average of teacher and parent characteristics for their school. Results concerning the relationships of parent and classroom background characteristics and student achievement therefore need to be interpreted with caution (Rwanda Education Board, 2012, p. 49).


The LARS report mentions several reasons for the poor performance of students measured in LARS 2011 (Rwanda Education Board, 2012, p. 41). One important factor is that ‘the children being tested were born either during or immediately after the civil war, a period when parents’ attention was focused largely on matters of survival’ (Rwanda Education Board, 2012, p. 41). Another challenge for the education system and classroom management in particular is the rapidly growing enrolment rate, with high growth in overage children and children from low socio-economic backgrounds as well as rural areas.80 Language of instruction (transition to English) versus language spoken at home by the student and the teacher, and the language of test (Kinyarwanda) are also seen as important, but have not been captured or analysed in LARS (Rwanda Education Board, 2012, p. 41).

To improve student learning outcomes in primary education in Rwanda, one major area that needs to be addressed is the number of students failing curricular expectations in literacy and numeracy. Another is performance differences between districts. Both can be addressed by providing resources to low-performing students, and to the schools and districts with the highest proportions of low-performing students (Rwanda Education Board, 2012, p. 64).

The LARS report underlines the importance of further research to investigate and explain the determinants of achievement, especially of low achievement. One example is identifying the cognitive strategies that need to be strengthened, through in-depth analysis of items that students consistently fail to solve. This includes letter and number recognition, receptive vocabulary, and phonetic accuracy and fluency in reading. Such analysis would help describe more precisely the prerequisite literacy skills that students are missing (Rwanda Education Board, 2012, p. 65).

In order to allow for the measurement of achievement, and of its relationships with important background characteristics, over time, LARS is implemented periodically on a three-year cycle. An innovative way of linking LARS with international benchmarks would be to include test items from other regional or international assessments, as suggested in the LARS report (Rwanda Education Board, 2012, p. 66).

LARS capacity-building component

One important accomplishment during the development of LARS was capacity-building. A team at the Rwanda Education Board (drawn from various departments of REB, districts and school teachers) was trained to design and conduct learning assessments. Important skills they acquired include item/tool development, test administration, coding of questionnaires, data entry and data analysis.

Both Rwanda and Zimbabwe focus on assessment design, monitoring and evaluation, gathering background data at individual, school and system levels, and conducting innovative capacity-building programmes at the system level. Both ZELA and LARS include a development programme to improve the capacity of education staff to analyse student learning outcomes. In addition, both programmes appear to support innovation in curriculum reform.

80 In Rwanda, the Primary Net Enrolment Rate (NER) rose from 92.9 per cent in 2009 to 96.6 per cent in 2013, with the largest increase between 2009 (92.9 per cent) and 2010 (95.4 per cent) (see Rwanda Education Ministry, 2014, p. 12).


Appendix IV: Detailed tables and figures for Chapter 2: Literacy and numeracy in primary education in the ESA region—students experiencing LLOs and trends over time

Characteristics of students with LLOs in literacy and numeracy in primary education in the ESA region

The results of the comparison between the contextual profile of all Uwezo participants and the contextual profile of students experiencing LLOs are presented below for each country, as well as for each domain assessed in each country (see Table 19). For example, in Kenya 50 per cent of all students in the population are female. However, among the sub-population of students identified as experiencing LLOs in English, 48 per cent are female. The proportion of females experiencing LLOs is thus smaller than the proportion of females in the population, suggesting that males are more likely to be experiencing LLOs. A bolded percentage indicates that the difference is significant (in this instance, that the logistic regression with gender as the independent variable and LLO status for English as the dependent variable was significant).
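The comparison described above can be sketched in code. This is a minimal illustrative reconstruction, not the study’s actual analysis: the counts echo the Kenya example (50 per cent female overall, boys slightly more likely to experience LLOs in English), and the group sizes, LLO rates and 5 per cent significance threshold are assumptions. The logistic regression is fitted with a hand-rolled Newton-Raphson routine so the sketch needs only NumPy.

```python
import math
import numpy as np

def logit_fit(X, y, iters=25):
    """Fit a logistic regression by Newton-Raphson; return (coefficients, standard errors)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))     # predicted LLO probabilities
        W = p * (1.0 - p)                       # working weights
        H = X.T @ (X * W[:, None])              # observed information matrix
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    se = np.sqrt(np.diag(np.linalg.inv(H)))     # Wald standard errors
    return beta, se

# Illustrative (assumed) counts: 2,500 girls of whom 750 experience LLOs in
# English (30%), and 2,500 boys of whom 850 do (34%).
female = np.repeat([1.0, 0.0], 2500)
llo = np.concatenate([np.repeat([1.0, 0.0], [750, 1750]),
                      np.repeat([1.0, 0.0], [850, 1650])])

# Gender as the independent variable, LLO status as the dependent variable.
X = np.column_stack([np.ones_like(female), female])
beta, se = logit_fit(X, llo)
z = beta[1] / se[1]
p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

share_all = female.mean()            # share of females in the population
share_llo = female[llo == 1].mean()  # share of females among LLO students
significant = p_value < 0.05         # would be reported as a bolded percentage
print(f"females: {share_all:.0%} of all students, {share_llo:.1%} of LLO students; "
      f"p = {p_value:.4f}")
```

With these assumed counts the female share among LLO students falls below the population share and the gender coefficient is significant, which is exactly the pattern a bolded cell in Table 19 signals.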

The results of the comparison between the contextual profile of all PASEC participants and the contextual profile of students experiencing LLOs are presented below for each country, as well as for each year level and each domain (see Table 20).

The results of the comparison between the contextual profile of all TIMSS students (Botswana) and prePIRLS participants (Botswana and South Africa) and the contextual profile of students experiencing LLOs are presented below (see Table 21 and Table 22).


Table 19a. Proportions of students experiencing LLOs for each contextual variable of interest for Uwezo countries (Kenya)

Uwezo (2012) – Kenya

Contextual variable | All | English LLO | Swahili LLO | Maths LLO

Gender
Females | 50% | 48% | 47% | 49%

Age
6 to 9 | 39% | 34% | 35% | 32%
10 to 13 | 43% | 54% | 54% | 54%
14 to 16 | 18% | 12% | 11% | 14%

Socio-economic factors
Type of wall at home – Polythene | <1% | <1% | <1% | <1%
Type of wall at home – Iron sheet | 9% | 8% | 8% | 8%
Type of wall at home – Timber | 11% | 11% | 11% | 12%
Type of wall at home – Stone/Bricks | 28% | 23% | 23% | 24%
Type of wall at home – Mud | 50% | 58% | 58% | 55%
Type of wall at home – Burnt Bricks | – | – | – | –
Type of wall at home – Cement Bricks | – | – | – | –
Type of wall at home – Other | – | – | – | –
Household has access to electricity | 23% | 16% | 16% | 19%
Household owns a TV | 28% | 21% | 21% | 24%
Household owns a radio | 73% | 70% | 70% | 71%
Household owns a phone | 70% | 65% | 65% | 66%
Household has direct access to clean water | 24% | 18% | 19% | 20%
Household owns a car | – | – | – | –
Household owns a fridge | – | – | – | –
Household owns a motorbike | – | – | – | –
Mother’s level of education – None | 18% | 21% | 21% | 21%
Mother’s level of education – Some primary | 58% | 62% | 62% | 59%
Mother’s level of education – Some secondary | 22% | 16% | 16% | 19%
Mother’s level of education – Post secondary | 2% | 1% | 1% | 1%

Out of school lessons
Child receives extra lessons/tuition | 52% | 43% | 43% | 46%

School type and school location
School type – Public | 83% | 89% | 89% | 89%
School type – Private | 16% | 10% | 10% | 10%
School type – Other | 1% | 1% | 1% | 1%
School Location – Urban | – | – | – | –

(– indicates data not reported.)


Table 19b. Proportions of students experiencing LLOs for each contextual variable of interest for Uwezo countries (Tanzania)

Uwezo (2012) – Tanzania

Contextual variable | All | English LLO | Swahili LLO | Maths LLO

Gender
Females | 50% | 50% | 49% | 50%

Age
6 to 9 | 35% | 14% | 16% | 18%
10 to 13 | 50% | 67% | 68% | 68%
14 to 16 | 16% | 19% | 16% | 14%

Socio-economic factors
Type of wall at home – Polythene | – | – | – | –
Type of wall at home – Iron sheet | – | – | – | –
Type of wall at home – Timber | – | – | – | –
Type of wall at home – Stone/Bricks | – | – | – | –
Type of wall at home – Mud | 47% | 49% | 51% | 53%
Type of wall at home – Burnt Bricks | 38% | 39% | 39% | 38%
Type of wall at home – Cement Bricks | 12% | 10% | 8% | 7%
Type of wall at home – Other | 3% | 3% | 2% | 2%
Household has access to electricity | 17% | 15% | 13% | 12%
Household owns a TV | 19% | 16% | 15% | 13%
Household owns a radio | 66% | 65% | 64% | 62%
Household owns a phone | 49% | 47% | 44% | 42%
Household has direct access to clean water | 29% | 27% | 26% | 24%
Household owns a car | 3% | 2% | 2% | 2%
Household owns a fridge | 7% | 6% | 5% | 4%
Household owns a motorbike | 9% | 8% | 8% | 7%
Mother’s level of education – None | 20% | 21% | 23% | 26%
Mother’s level of education – Some primary | 74% | 74% | 73% | 71%
Mother’s level of education – Some secondary | 5% | 4% | 4% | 3%
Mother’s level of education – Post secondary | <1% | <1% | <1% | <1%

Out of school lessons
Child receives extra lessons/tuition | – | – | – | –

School type and school location
School type – Public | 97% | 99% | 98% | 99%
School type – Private | 3% | 1% | 2% | 1%
School type – Other | – | – | – | –
School Location – Urban | 17% | 15% | 13% | 12%

(– indicates data not reported.)


Table 19c. Proportions of students experiencing LLOs for each contextual variable of interest for Uwezo countries (Uganda)

Uwezo (2012) – Uganda

Contextual variable | All | English LLO | Maths LLO

Gender
Females | 49% | 50% | 50%

Age
6 to 9 | 40% | 20% | 19%
10 to 13 | 41% | 61% | 60%
14 to 16 | 18% | 20% | 21%

Socio-economic factors
Type of wall at home (all categories) | – | – | –
Household has access to electricity | 12% | 10% | 11%
Household owns a TV | 11% | 9% | 10%
Household owns a radio | 72% | 70% | 71%
Household owns a phone | 65% | 62% | 63%
Household has direct access to clean water | 12% | 11% | 11%
Household owns a car | – | – | –
Household owns a fridge | – | – | –
Household owns a motorbike | – | – | –
Mother’s level of education – None | 16% | 17% | 17%
Mother’s level of education – Some primary | 69% | 70% | 70%
Mother’s level of education – Some secondary | 12% | 10% | 11%
Mother’s level of education – Post secondary | 3% | 2% | 2%

Out of school lessons
Child receives extra lessons/tuition | – | – | –

School type and school location
School type – Public | 74% | 80% | 78%
School type – Private | 26% | 20% | 22%
School type – Other | – | – | –
School Location – Urban | – | – | –

(– indicates data not reported.)


Table 20a. Proportions of students experiencing LLOs for each contextual variable of interest for PASEC countries (Comoros)

PASEC (2008/2009) – Comoros

Contextual variable | Grade 2 All | Grade 2 French LLO | Grade 2 Maths LLO | Grade 5 All | Grade 5 French LLO | Grade 5 Maths LLO

Gender
Females | 52% | 52% | 55% | 59% | 70% | 63%

Age
Below normal age (5 for Grade 2, 9 for Grade 5) | 0% | 0% | 0% | 2% | 1% | 1%
Above normal age (8 for Grade 2, 11 for Grade 5) | 36% | 37% | 36% | 52% | 55% | 55%

Language spoken at home
Student speaks Shikomori at home | 97% | 97% | 91% | 96% | 97% | 97%
Student speaks Arabic at home | 4% | 5% | 5% | 6% | 6% | 6%
Student speaks French at home | 4% | 3% | 3% | 6% | 4% | 3%
Student speaks English at home | 1% | 2% | 3% | 1% | 2% | 2%
Student speaks Kirundi at home | – | – | – | – | – | –
Student speaks Swahili at home | – | – | – | – | – | –
Student speaks another language at home | 2% | 1% | 7% | 1% | 2% | 2%

Socio-economic factors
Home possession scale – Less than 3 | 52% | 60% | 52% | 50% | 58% | 57%
Home possession scale – Between 3 and 5 | 34% | 32% | 35% | 34% | 31% | 34%
Home possession scale – 6 or more | 14% | 8% | 14% | 16% | 11% | 10%
Father’s literacy | 68% | 65% | 66% | 64% | 60% | 64%
Mother’s literacy | 61% | 52% | 58% | 57% | 56% | 60%
Participates in farm work | 59% | 61% | 60% | 59% | 65% | 65%
Participates in house work | 54% | 60% | 56% | 70% | 72% | 76%
Participates in retail work | 16% | 19% | 26% | 14% | 20% | 19%
Work hinders student’s study at home | 24% | 23% | 17% | 20% | 21% | 23%
Work hinders student’s ability to go to school | 22% | 28% | 22% | 13% | 13% | 12%
Work hinders student’s concentration at school | 15% | 21% | 20% | 14% | 20% | 16%

School resources
Presence of school library | 9% | 10% | 9% | 12% | 14% | 13%
Presence of computer room | 1% | <1% | <1% | 2% | <1% | 2%
Presence of school toilets | 77% | 75% | 74% | 80% | 73% | 72%
School has electricity | 21% | 16% | 16% | 27% | 14% | 16%
School has drinking water facilities | 66% | 63% | 62% | 71% | 68% | 62%

(– indicates data not reported.)


Table 20b. Proportions of students experiencing LLOs for each contextual variable of interest for PASEC countries (Burundi)

PASEC (2008/2009) – Burundi

Contextual variable | Grade 2 All | Grade 2 French LLO | Grade 2 Kirundi LLO | Grade 2 Maths LLO | Grade 5 All | Grade 5 French LLO | Grade 5 Maths LLO

Gender
Females | 49% | 47% | 46% | 50% | 48% | 47% | 56%

Age
Below normal age (5 for Grade 2, 9 for Grade 5) | 0% | 0% | 0% | 0% | 0% | 0% | 0%
Above normal age (8 for Grade 2, 11 for Grade 5) | 76% | 71% | 71% | 64% | 90% | 92% | 93%

Language spoken at home
Student speaks Shikomori at home | – | – | – | – | – | – | –
Student speaks Arabic at home | – | – | – | – | – | – | –
Student speaks French at home | 2% | 1% | 1% | 2% | 2% | 2% | 1%
Student speaks English at home | – | – | – | – | – | – | –
Student speaks Kirundi at home | 95% | 97% | 95% | 96% | 95% | 88% | 91%
Student speaks Swahili at home | 4% | 3% | 5% | 3% | 4% | 3% | 3%
Student speaks another language at home | 1% | 1% | <1% | 1% | <1% | <1% | 1%

Socio-economic factors
Home possession scale – Less than 3 | 89% | 88% | 87% | 88% | 91% | 91% | 91%
Home possession scale – Between 3 and 5 | 11% | 12% | 13% | 12% | 9% | 8% | 8%
Home possession scale – 6 or more | <1% | 0% | 0% | 0% | <1% | 1% | 1%
Father’s literacy | 60% | 60% | 61% | 62% | 54% | 52% | 52%
Mother’s literacy | 51% | 49% | 49% | 51% | 43% | 40% | 42%
Participates in farm work | 54% | 53% | 53% | 55% | 62% | 58% | 59%
Participates in house work | 78% | 77% | 75% | 81% | 80% | 73% | 75%
Participates in retail work | 10% | 12% | 13% | 12% | 9% | 7% | 6%
Work hinders student’s study at home | 23% | 19% | 20% | 21% | 24% | 22% | 24%
Work hinders student’s ability to go to school | 16% | 16% | 16% | 15% | 19% | 14% | 14%
Work hinders student’s concentration at school | 18% | 16% | 19% | 22% | 15% | 13% | 15%

School resources
Presence of school library | 3% | 2% | 2% | 2% | 2% | 2% | 3%
Presence of computer room | <1% | <1% | <1% | <1% | <1% | <1% | 1%
Presence of school toilets | 92% | 93% | 93% | 95% | 92% | 89% | 87%
School has electricity | 5% | 2% | 2% | 3% | 7% | 7% | 6%
School has drinking water facilities | 34% | 34% | 35% | 36% | 40% | 35% | 33%

(– indicates data not reported.)


Table 21. Proportions of students experiencing LLOs for each contextual variable of interest for TIMSS

TIMSS (2011) – Botswana, Grade 6

Contextual variable | All | Maths LLO

Gender
Females | 51% | 46%

Age
12 years or less | 21% | 11%
12 years but < 14 years | 66% | 68%
14 years or older | 12% | 22%

Language spoken at home
Speaks test language at home | 78% | 69%

Socio-economic factors
Home resources for learning – Many resources | 1% | 0%
Home resources for learning – Some resources | 57% | 44%
Home resources for learning – Few resources | 42% | 56%
Parent highest education level – University or higher | 10% | 3%
Parent highest education level – Post-secondary non-university | 16% | 8%
Parent highest education level – Upper secondary | 13% | 11%
Parent highest education level – Lower secondary | 17% | 19%
Parent highest education level – Some primary, lower secondary or no school | 41% | 58%

Learning activities prior to attending school
Attended a preschool | 45% | 32%
Numeracy activities prior to primary school – Often | 18% | 11%
Numeracy activities prior to primary school – Sometimes | 53% | 54%
Numeracy activities prior to primary school – Never or almost never | 28% | 35%
Numeracy competency when beginning primary school – Very well | 14% | 6%
Numeracy competency when beginning primary school – Moderately well | 75% | 68%
Numeracy competency when beginning primary school – Not well | 11% | 16%

School resources
Presence of school library | 50% | 45%
Presence of computers for instruction | 70% | 68%

School type and school location
Urban setting (Urban, suburban, medium-size city, small town) | 74% | 65%


Table 22. Proportions of students experiencing LLOs for each contextual variable of interest for prePIRLS (2011)

prePIRLS (2011)

Contextual variable | Botswana (Grade 6) All | Botswana Reading LLO | South Africa (Grade 6) All | South Africa Reading LLO

Gender
Females | 50% | 31% | 48% | 36%

Age
12 years or less | 30% | 19% | 36% | 31%
12 years but < 14 years | 63% | 68% | 54% | 53%
14 years or older | 7% | 12% | 10% | 16%

Language spoken at home
Speaks test language at home | 74% | 73% | 91% | 88%

Socio-economic factors
Home resources for learning – Many resources | 1% | 0% | 2% | <1%
Home resources for learning – Some resources | 62% | 53% | 65% | 61%
Home resources for learning – Few resources | 38% | 47% | 33% | 39%
Parent highest education level – University or higher | 9% | 3% | 10% | 3%
Parent highest education level – Post-secondary non-university | 16% | 7% | 17% | 11%
Parent highest education level – Upper secondary | 14% | 10% | 38% | 41%
Parent highest education level – Lower secondary | 20% | 24% | 14% | 16%
Parent highest education level – Some primary, lower secondary or no school | 38% | 53% | 19% | 28%

Learning activities prior to attending school
Attended a preschool | 46% | 33% | 83% | 79%
Literacy activities prior to primary school – Often | 14% | 8% | 34% | 30%
Literacy activities prior to primary school – Sometimes | 76% | 78% | 62% | 65%
Literacy activities prior to primary school – Never or almost never | 10% | 14% | 4% | 5%
Literacy competency when beginning primary school – Very well | 25% | 14% | 31% | 27%
Literacy competency when beginning primary school – Moderately well | 43% | 40% | 44% | 43%
Literacy competency when beginning primary school – Not well | 32% | 46% | 25% | 31%
Student engagement in reading lessons – Engaged | 25% | 9% | 47% | 29%
Student engagement in reading lessons – Somewhat engaged | 58% | 63% | 45% | 55%
Student engagement in reading lessons – Not engaged | 18% | 28% | 8% | 15%

School resources
Presence of school library | 49% | 40% | 41% | 28%
Presence of computers for instruction | 61% | 56% | 48% | 40%

School type and school location
Urban setting (Urban, suburban, medium-size city, small town) | 75% | 67% | 58% | 49%

Trends in literacy and numeracy learning outcomes of children in primary education in the ESA region

The following figures show trends in performance for Swahili in Kenya (see Figure 8), mathematics in Kenya (see Figure 9), English in Tanzania (see Figure 10), Swahili in Tanzania (see Figure 11), English in Uganda (see Figure 12) and mathematics in Uganda (see Figure 13).

Figure 8. Trends in Swahili performance across time for students in Kenya (Uwezo)

[Stacked bar chart showing, for 2009-10, 2011 and 2012, the percentage of Kenyan students at each Swahili reading level: Nothing, Syllables, Words, Paragraph, Story.]


Figure 9. Trends in Mathematics performance across time for students in Kenya (Uwezo)

[Stacked bar chart showing, for 2011 and 2012, the percentage of Kenyan students at each numeracy level: Nothing, Counting, Addition, Subtraction, Multiplication, Division.]

Figure 10. Trends in English performance across time for students in Tanzania (Uwezo)

[Stacked bar chart showing, for 2009-10, 2011 and 2012, the percentage of Tanzanian students at each English reading level: Nothing, Letters, Words, Paragraph, Story.]


Figure 11. Trends in Swahili performance across time for students in Tanzania (Uwezo)

[Stacked bar chart showing, for 2009-10, 2011 and 2012, the percentage of Tanzanian students at each Swahili reading level: Nothing, Letters, Words, Paragraph, Story.]

Figure 12. Trends in English performance across time for students in Uganda (Uwezo)

[Stacked bar chart showing, for 2009-10, 2011 and 2012, the percentage of Ugandan students at each English reading level: Nothing, Letters, Words, Paragraph, Story.]


Figure 13. Trends in mathematics performance across time for students in Uganda (Uwezo)

[Stacked bar chart showing, for 2009-10, 2011 and 2012, the percentage of Ugandan students at each numeracy level: Nothing, Counting, Numbers, Values, Addition, Subtraction, Multiplication.]


Appendix V: Detailed table for Chapter 3 country-level practices

Table 23. Example programmes in ESAR with focus on improving learning outcomes in literacy and numeracy of disadvantaged children in primary education

Country | Programme | Implemented by | Funded by | Disadvantaged children targeted | Evaluation

Early grade literacy/numeracy programmes
1. Ethiopia | Literacy Boost | Save the Children | Save the Children | Children of low SES | End-line II report (Friedlander et al., 2012)
2. Malawi | Literacy Boost (as part of the Sponsorship Basic Education Programme) | Save the Children | Information not available | Not specified | Year 2 report (Dowd and Mabeti, 2011)
3. Mozambique | Literacy Boost (as part of the Early Literacy project in Mozambique) | Save the Children | Information not available (private donor) | Including children affected by HIV/AIDS and other vulnerable children in economically disadvantaged areas | End-line report (Mungoi et al., 2010)
4. Kenya | Reading to Learn | Aga Khan Foundation | Aga Khan Foundation | Children with low learning achievements in economically disadvantaged districts | Randomized field experiments report (Lucas et al., 2014)
5. Uganda | Reading to Learn | Aga Khan Foundation | Aga Khan Foundation | Economically disadvantaged districts | Randomized field experiments report (Lucas et al., 2014)
6. Kenya | PRIMR (Primary Math and Reading Initiative) | RTI | USAID | Marginalised children in slums and non-formal settlements | Final report (RTI, 2014)
7. Malawi | Malawi Teacher Professional Development Support (MTPDS) | Creative Associates, RTI, Seward Inc. | USAID | Not specified | Project Monitoring and Evaluation report (Randolph et al., 2013)

School improvement programmes
8. South Africa | JET’s School Improvement Programme (Khanyisa Education Support Project) | JET Education Services | DfID | Economically disadvantaged rural areas | Sustainable School Improvement report (JET Education Services, n.d.)

Early Childhood Development programmes
9. Mozambique | Early Childhood Development (ECD) programme | Save the Children | ECD programme implementation was supported by America Gives Back and The ELMA Foundation | Young children in communities affected by HIV/AIDS | Randomized Impact Evaluation (Martinez et al., 2012)
10. Rwanda | Early Literacy and Maths Initiative (ELMI) | Save the Children | DfID, Innovation for Education Fund | Preschool children, including children in remote areas without access to an ECD programme | Rwanda mid-line report (Save the Children, 2014)


Appendix VI: Main stock-taking table

The main stock-taking table lists all assessments identified during this study that assess student learning outcomes in literacy and numeracy in primary education in the ESA region. The assessments are presented in the table according to the framework categories described in Chapter 1.

Table 24: Main stock-taking table

NOTES:
Initiatives that have an asterisk (*) in the Name column are those whose data are used in the analysis discussed in Chapter 2.

The SACMEQ website (SACMEQ, 2013) was offline for the duration of this consultancy. The information about SACMEQ is derived from materials downloaded from the site before it went offline.

In a number of countries in ESAR, multiple implementations of EGRA/EGMA have been conducted. In such cases, the table only includes several indicative implementations for which complete or near complete documentation is available on the EdData website (see RTI (n.d.)), the main repository for EGRA/EGMA information.

N/A indicates ‘Not applicable’.

1. Angola

2. Botswana

3. Burundi

4. Comoros

5. Eritrea

6. Ethiopia

7. Kenya

8. Lesotho

9. Madagascar

10. Malawi

11. Mozambique

12. Namibia

13. Rwanda

14. Somalia

15. South Africa

16. Swaziland

17. Tanzania

18. Uganda

19. Zambia

20. Zimbabwe


Name EGRA

Organisation / institution responsible

World Bank

Purpose System-level diagnostic

Inception 2010

Frequency N/A (one-off)

Target Population Grade 3

Sample 139 schools, aiming for 36 students per school

Nationally representative

Cognitive domains Reading (Portuguese)

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Parent questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (e.g. correlation, regression, multilevel modelling)

Reporting and dissemination

Reports not publicly available

Angola

Documents: (Ministry of Education of Angola, World Bank, & Russia Education Aid for Development Program, 2011)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Botswana participated in SACMEQ II & III

Target Population Grade 681

Sample SACMEQ III: 160 schools, giving approx. 3,975 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (e.g. correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

Botswana

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011; Monyaku & Mmereki, 2011)
Websites: (SACMEQ, 2013)

81 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name TIMSS*

Organisation / institution responsible

IEA

Purpose System-level monitoring

Inception 1995

Frequency 4-year cycle; Botswana participated in 2007 & 2011

Target Population Grade 8 (2007); Grade 6 and Grade 9 (2011)82

Sample TIMSS 2011: 149 schools giving approx. 4,200 students (Grade 6); 150 schools giving approx. 5,400 students (Grade 9)

Nationally representative

Cognitive domains Mathematics and science

Contextual instruments

Student questionnaire

Teacher questionnaire (mathematics teacher questionnaire, science teacher questionnaire)

School head questionnaire

Curriculum questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive and contextual data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

International comparisons of cognitive data

Reporting and dissemination

International results reports, encyclopaedia and databases available for download from the TIMSS and PIRLS/IEA websites

Botswana

Documents: (M. O. Martin, Mullis, Foy, & Arora, 2012; M. O. Martin, Mullis, Foy, & Stanco, 2012; I. V. S. Mullis, M. O. Martin, & P. Foy (with J. F. Olson, C. Preuschoff, E. Erberber, & J. Galia), 2008a; I. V. S. Mullis, M. O. Martin, & P. Foy (with J. F. Olson, C. Preuschoff, E. Erberber, A. Arora, & J. Galia), 2008b)
Websites: (IEA, n.d.; TIMSS & PIRLS International Study Center, n.d.)

82 If it was expected that a country’s Grade 4/Grade 8 students would find TIMSS assessments too difficult, IEA encouraged the country to test higher-grade children. Thus Botswana tested Grade 6 children with the TIMSS Grade 4 assessment and Grade 9 children with the TIMSS Grade 8 assessment.


Name PIRLS, prePIRLS*

Organisation / institution responsible

IEA

Purpose System-level monitoring

Inception 2001

Frequency 5-year cycle; Botswana participated in PIRLS and prePIRLS in 2011

Target Population Grade 6 in PIRLS, and Grade 4 in prePIRLS

Sample prePIRLS 2011: 149 schools, giving approx. 4,400 students

PIRLS 2011: 149 schools, giving approx. 4,200 students

Cognitive domains Reading (English) (prePIRLS 2011)

Reading (English) (PIRLS 2011)

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Parent questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive and contextual data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

International comparisons of cognitive data reported

Reporting and dissemination

International results reports, encyclopaedia and databases available from the TIMSS and PIRLS/IEA websites

Botswana

Documents: (Joncas, 2011; M. O. Martin & Mullis, 2012)
Websites: http://timssandpirls.bc.edu/methods/index.html


Name EGRA

Organisation / institution responsible

World Bank

Purpose System-level diagnostic

Inception 2011

Frequency N/A (one-off)

Target Population Unknown

Sample 120 schools, giving approx. 1,800 pupils

Representativeness unknown

Cognitive domains Reading (Kirundi)

Contextual instruments

Unknown

Test Administration School-based

One-on-one administration

Oral administration

Analysis Unknown

Reporting and dissemination

Unknown

Burundi

Documents: (RTI, 2014a)


Name PASEC

Organisation / institution responsible

CONFEMEN

Purpose System-level monitoring

Inception 1993

Frequency Irregular (5 cycles since inception); Burundi participated in 2008–2009

Target Population Grade 2, Grade 5

Sample 2008–2009 post-test: 180 schools, giving approx. 2,400 students (Grade 2); 175 schools giving approx. 2,350 students (Grade 5)

Nationally representative

Cognitive domains Reading (French and Kirundi) and mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

International comparisons of cognitive data reported

Report prepared by PASEC-CONFEMEN

Reporting and dissemination

Reports available from PASEC website

Workshop on results held for government representatives

Burundi

Documents: (Ministère de l’Enseignement de Base et Secondaire et al., 2010)
Websites: (CONFEMEN, n.d.)


Name PASEC

Organisation / institution responsible

CONFEMEN

Purpose System-level monitoring

Inception 1993

Frequency Irregular (5 cycles since inception); Comoros participated in 2009

Target Population Grade 2, Grade 5

Sample 2009 post-test: 144 schools, giving approx. 1,900 students (Grade 2); 144 schools, giving approx. 195 students (Grade 5)

Nationally representative

Cognitive domains Reading (French) and mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

International comparisons of cognitive data reported

Report prepared by PASEC-CONFEMEN

Reporting and dissemination

Reports available from PASEC website

Workshop on results held for government representatives

Comoros

Documents: (Ministère de l’Éducation Nationale et de la Recherche & PASEC-CONFEMEN, 2010)
Websites: (CONFEMEN, n.d.)


Name Monitoring Learning Achievement (MLA)

Organisation / institution responsible

Ministry of Education, UNICEF

Purpose System-level monitoring

Inception 2001

Frequency Irregular – second MLA in 2008

Target Population Grade 3, Grade 5

Sample In 2008: 60 schools, giving approx. 2,300 students (Grade 3) and 2,000 students (Grade 5)

Nationally representative

Cognitive domains English, mother tongue, mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Parent questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Reporting and dissemination

Reports not publicly available

Workshops at national and sub-national levels

Eritrea

Documents: (UNICEF Eritrea, n.d.)


Name EGRA

Organisation / institution responsible

USAID, RTI, Ethiopia MoE

Purpose System-level diagnostic

Inception 2010

Frequency N/A (one-off)

Target Population Grade 2, Grade 3

Sample 338 schools, giving approx. 13,000 students

Cognitive domains Reading (Tigrinya, Afan Oromo, Amharic, Somali, Sidaamu Afoo, and Hararigna)

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Report available from EdData and Ethiopia MoE websites

Policy workshop for MoE representatives and other stakeholders

Ethiopia

Documents: (RTI, 2010)


Name EGRA

Organisation / institution responsible

Save the Children

Purpose Program evaluation (Literacy Boost initiative)

Inception 2010–2012

Frequency N/A (one-off)

Target Population Grade 3 in treatment schools and control schools in Dendi district of the Oromia region

Sample Approx. 400 students

Cognitive domains Reading (Afan Oromo)

Contextual instruments

Student questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Trends in cognitive performance computed

Reporting and dissemination

Reports available from the EdData and Save the Children websites

Ethiopia

Documents: (Cao et al., 2011; Friedlander et al., 2012; Hassen & Friedlander, 2012)


Name EGRA

Organisation / institution responsible

USAID-Ethiopia, USAID-Washington, American Institutes for Research (AIR)

Purpose System-level diagnostic

Inception 2011

Frequency N/A (one-off)

Target Population Grade 2–Grade 4

Sample 330 schools, giving approx. 19,600 students

Cognitive domains English

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Reports available from AIR website

Ethiopia

Documents: (American Institutes for Research (AIR), 2012)


Name National Learning Assessment

Organisation / institution responsible

USAID, National Educational Assessment and Examination Agency (NEAEA) [83]

Purpose System-level monitoring

Inception 2000

Frequency 3–4-year cycle, NLA III in 2007 & NLA IV in 2010/11

Target Population Grade 4, Grade 8

Sample NLA IV: 299 schools giving approx. 10,800 students (Grade 4); 291 schools, giving approx. 11,200 students (Grade 8)

Nationally representative

Cognitive domains Mathematics, English, mother tongue, environmental science (Grade 4)

Mathematics, English, biology, chemistry, physics (Grade 8)

Contextual instruments

Student questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Reports available from NEAEA website

Ethiopia

Documents: (Ministry of Education of Ethiopia (FDRE), 2008, 2013)
Websites: (National Educational Assessments and Examinations Association of Ethiopia, n.d.)

83 The next round of the NLA in Ethiopia will be supported by UNICEF, not USAID.


Name EGRA

Organisation / institution responsible

USAID-Kenya, USAID-Washington, RTI, Aga Khan Foundation

Purpose Program evaluation (EMACK initiative)

Inception 2007–2008

Frequency N/A (one-off)

Target Population Grade 2 in treatment and control schools in Malindi district

Sample Approx. 400 Students

Cognitive domains Reading (English, Kiswahili)

Contextual instruments

Student questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed (between baseline and end-line)

Reporting and dissemination

Report available from EdData website

Kenya

Documents: (Crouch, Korda, & Mumo, 2009)


Name EGRA, EGMA [84]

Organisation / institution responsible

USAID-Kenya, USAID-Washington, DFID, RTI

Purpose Program evaluation (PRIMR initiative)

Inception 2012–2013

Frequency N/A (one-off)

Target Population Grade 1, Grade 2 in treatment and control schools

Sample Approx. 220 schools, giving approx. 4,400 students

Cognitive domains Reading (English, Kiswahili), mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

School inventory

Classroom inventory

Classroom observation

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed (between baseline and end-line)

Reporting and dissemination

Reports available from EdData website

Kenya

Documents: (Piper & Mugenda, 2013; RTI, 2012, 2014d)

84 A number of other implementations of EGRA/EGMA have been conducted in Kenya.


Name National Assessment System for Monitoring Learning Outcomes (NASMLA)

Organisation / institution responsible

The National Assessment Centre at the Kenya National Examinations Council

Purpose System-level diagnostic/monitoring

Inception 2010

Frequency Uncertain

Target Population Grade 3

Sample 328 schools, giving approx. 8,000 students

Nationally representative

Cognitive domains Literacy (English) and numeracy

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

School and classroom observation schedule

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Reports (including recommendations) available from the Kenya National Examinations Council website

Kenya

Documents: (The National Assessment Centre, 2010a, 2010c)
Website: (The Kenya National Examinations Council, n.d.)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Kenya participated in SACMEQ I–III

Target Population Grade 6 [85]

Sample SACMEQ III: 193 schools, giving approx. 4,500 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

Kenya

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011; Wasanga, Ogle, & Wambua, 2012)
Website: (SACMEQ, 2013)

85 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name Uwezo*

Organisation / institution responsible

Twaweza

Purpose System-level monitoring

Inception 2009/2010

Frequency Annual

Target Population 6–16 years old

Sample Uwezo 2012: Approx. 145,000 children

Cognitive domains Reading (English and Kiswahili), numeracy

Contextual instruments

Household observation

Village observation

School observation

Test Administration Household-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Trends in cognitive performance computed

Reporting and dissemination

Results presented in national and regional reports

Reports, datasets and other documentation available from the Uwezo website

Results disseminated via radio and print media

Kenya

Documents: (Uwezo-Kenya, 2013; Uwezo, 2014)
Website: (Twaweza, n.d.)


Name Lesotho National Assessment of Educational Progress (LNAEP)

Organisation / institution responsible

Examinations Council of Lesotho (ECoL), Ministry of Education and Training

Purpose System-level monitoring

Inception 2003

Frequency 1–2-year cycle

Target Population Grade 3, Grade 6

Sample Cycle 4: 184 schools, giving approx. 3,680 students at each grade

Nationally representative

Cognitive domains Sesotho, English, mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Reporting and dissemination

Report (including policy implications) available on the ECoL website

Lesotho

Documents: (“Lesotho National Assessment of Educational Progress (LNAEP) Survey Report, 2010,” n.d.)
Websites: (Examinations Council of Lesotho (ECoL), n.d.)


Name Assessment of Grades 1, 2 and 3 in Lesotho

Organisation / institution responsible

The Australian Council for Educational Research

Purpose Pilot for system-level monitoring

Inception 2014

Frequency N/A (one-off)

Target Population Grades 1–3

Sample 16 schools, giving approx. 950 students

Cognitive domains Sesotho, mathematics

Contextual instruments

N/A

Test Administration School-based

One-on-one or small group administration

Oral or tablet-based administration

Analysis IRT analysis used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results

Reporting and dissemination

Reports not publicly available

Lesotho

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011)
Website: (SACMEQ, 2013)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Lesotho participated in SACMEQ II & III

Target Population Grade 6 [86]

Sample SACMEQ III: 182 schools, giving approx. 4,250 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

Lesotho

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011)
Website: (SACMEQ, 2013)

86 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name EGRA

Organisation / institution responsible

World Bank

Purpose System-level diagnostic

Inception 2009

Frequency N/A (one-off)

Target Population Unknown

Sample Unknown

Cognitive domains Reading (Malagasy)

Contextual instruments

Unknown

Test Administration Unknown

Analysis Unknown

Reporting and dissemination

Unknown

Madagascar

Documents: (RTI, 2014a)


Name Assessing Learner Achievement

Organisation / institution responsible

Malawi Institute of Education

Purpose System-level monitoring

Inception 2005

Frequency 3–4-year cycle

Target Population Grade 2, Grade 3, Grade 5, Grade 7

Sample Unknown

Cognitive domains Chichewa, English, mathematics, life skills

Contextual instruments

Unknown

Test Administration Unknown

Analysis Unknown

Reporting and dissemination

Unknown

Malawi

Documents: (UNESCO, 2008, 2015)


Name EGMA

Organisation / institution responsible

USAID-Malawi, RTI

Purpose Program evaluation (baseline for Malawi Teacher Professional Development Support initiative)

Inception 2010

Frequency N/A (one-off)

Target Population Grade 2, Grade 4

Sample 50 schools, giving approx. 1,000 students

Cognitive domains Mathematics

Contextual instruments

N/A

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Reporting and dissemination

Reports available from EdData website

Malawi

Documents: (USAID, 2011)


Name EGRA

Organisation / institution responsible

Save the Children

Purpose Program evaluation (Literacy Boost initiative)

Inception 2009–2010

Frequency N/A (one-off)

Target Population Grade 2 and Grade 4 at 24 treatment and control schools in Zomba

Sample Approx. 850 students

Cognitive domains Reading (Chichewa)

Contextual instruments

Student questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Trends in cognitive performance computed

Reporting and dissemination

Reports available from EdData and Save the Children websites

Malawi

Documents: (Dowd & Mabeti, 2011; Dowd, Wiener, & Mabeti, 2010)


Name EGRA [87]

Organisation / institution responsible

USAID-Malawi, RTI

Purpose System-level monitoring

Inception 2010–2012 (baseline, midline, end-line)

Frequency N/A (one-off)

Target Population Grade 2, Grade 4

Sample 2012 end-line: 202 schools, giving approx. 5,200 students

Cognitive domains Reading (Chichewa)

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Reports available from EdData website

Malawi

Documents: (Miksic & Harvey, 2012; Pouezevara, Costello, & Banda, 2013; RTI, 2011)

87 Several other implementations of EGRA/EGMA have been conducted in Malawi. The table includes only a couple of indicative implementations for which complete or near-complete documentation is available on the EdData website (the main repository for EGRA/EGMA information).


Name Monitoring Learning Achievement (MLA)

Organisation / institution responsible

Ministry of Education, Science and Technology, UNICEF

Purpose System-level monitoring

Inception 2012

Frequency 3-year cycle (intended)

Target Population Grade 2, Grade 4, Grade 7

Sample 2012: 225 schools, giving approx. 3,400 students (Grade 2), 2,750 students (Grade 4), and 3,200 students (Grade 7)

Nationally representative

Cognitive domains Chichewa, English, mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Reports not publicly available

Malawi

Documents: (Ministry of Education, Science, & Technology of Malawi, 2014)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring (national)

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Malawi participated in SACMEQ I–III

Target Population Grade 6 [88]

Sample SACMEQ III: 139 schools, giving approx. 2,800 students

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website [89]

Malawi

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011; Ministry of Education, Science, & Technology of Malawi, 2011)
Website: (SACMEQ, 2013)

88 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.

89 Note that the SACMEQ website was offline for the duration of this consultancy. The information in this cell is based on material downloaded from the site before it went offline.


Name EGRA

Organisation / institution responsible

Save the Children

Purpose Program evaluation (Literacy Boost initiative)

Inception 2010–2011

Frequency N/A (one-off)

Target Population Grades 1–3 in treatment and control schools in Gaza

Sample 2011 end-line: approx. 550 children (preschool); approx. 430 students (Grades 1–3)

Cognitive domains Reading (Portuguese and Shangana)

Contextual instruments

Student questionnaire

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Trends in cognitive performance computed

Reporting and dissemination

Reports available from EdData website and Save the Children website

Mozambique

Documents: (Dowd & Fonseca, 2012)


Name EGRA

Organisation / institution responsible

USAID, International Business & Technical Consultants, Inc. (IBTCI), Global Surveys Corporation (GSC Research)

Purpose Program evaluation (USAID/Aprender a Ler (APAL) initiative)

Inception 2013

Frequency N/A (one-off)

Target Population Grade 2, Grade 3 in treatment and control schools in Nampula and Zambézia provinces

Sample 2013 baseline: approx. 3,500 students

Cognitive domains Reading (Portuguese)

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

School inventory

Classroom inventory

Classroom observation

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Reporting and dissemination

Reports available from EdData website

Mozambique

Documents: (Raupp, Newman, & Revés, 2013)


Name National Assessment

Organisation / institution responsible

Instituto Nacional de Desenvolvimento de Educação

Purpose System-level monitoring

Inception 2000

Frequency Irregular – 2nd and 3rd implementations in 2006 and 2009

Target Population Grade 3

Sample Unknown

Cognitive domains Mother tongue, Portuguese, mathematics

Contextual instruments

Unknown

Test Administration Unknown

Analysis Unknown

Reporting and dissemination

Unknown

Mozambique

Documents: (UNESCO, 2008, 2015; UNICEF Mozambique Country Office, 2015)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Mozambique participated in SACMEQ II & III

Target Population Grade 6⁹⁰

Sample SACMEQ III: 183 schools, giving approx. 3,400 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

Mozambique

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011)
Website: (SACMEQ, 2013)

90 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name National Standardized Achievement Test (NSAT)

Organisation / institution responsible

Directorate of National Examinations and Assessment (DNEA)

Purpose System-level monitoring

Inception 2009

Frequency Biannual

Target Population Grade 5, Grade 7

Sample Nationally representative

Cognitive domains English, mathematics (Grade 5)

English, mathematics, natural science (Grade 7)

Contextual instruments

Unknown

Test Administration Unknown

Analysis Unknown

Reporting and dissemination

Unknown

Namibia

Documents: (UNESCO, 2015)
Websites: (American Institutes for Research (AIR), n.d.; Nhongo, 2014; Sasman, 2011)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Namibia participated in SACMEQ I–III

Target Population Grade 6⁹¹

Sample SACMEQ III: 275 schools, giving approx. 5,000 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from the SACMEQ website

Namibia

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2005, 2011)
Website: (SACMEQ, 2013)

91 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name EGRA, EGMA

Organisation / institution responsible

USAID-Washington, USAID-Rwanda, Rwandan Ministry of Education

Purpose System-level diagnostic

Inception 2011

Frequency N/A (one-off)

Target Population Grade 4, Grade 6

Sample 42 schools, giving approx. 840 students

Cognitive domains Reading (English and Kinyarwanda), mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

School inventory

Classroom inventory

Classroom observation

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Reports available from EdData website

Rwanda

Documents: (DeStefano, Ralaingita, Costello, Sax, & Frank, 2012)


Name Learning Achievement in Rwandan Schools (LARS)

Organisation / institution responsible

Rwanda Education Board (REB), Education Quality and Standards Department; UNICEF; UNESCO

Purpose System-level monitoring

Inception 2011

Frequency 3-year cycle

Target Population Grade 3

Sample 60 schools, giving approx. 2,500 students

Cognitive domains Literacy (Kinyarwanda) and numeracy

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Parent questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Reports not publicly available

Rwanda

Documents: (Rwanda Education Board, 2012; UNICEF Rwanda Country Office, 2015)


Name EGRA

Organisation / institution responsible

Concern Worldwide

Purpose Program evaluation

Inception 2013–2014

Frequency N/A (one-off)

Target Population Grade 2, Grade 3, Grade 4 in Concern-supported schools in Mogadishu

Sample 2014 end-line: five schools, 321 students

Cognitive domains Reading (Somali language)

Contextual instruments

N/A

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Reporting and dissemination

Reports not publicly available

Somalia

Documents: (Beattie, 2014; Beattie & Grogan, 2013)


Name Monitoring Learning Achievement (MLA)

Organisation / institution responsible

Ministry of Education, UNICEF

Purpose System-level monitoring

Inception N/A

Frequency 3-year cycle (intended)

Target Population Grade 4, Grade 7 in Puntland and Somaliland

Sample 15 schools, giving approx. 1,000 students (Grade 4)

Cognitive domains Somali and mathematics (Grade 4)

Somali, mathematics and science (Grade 7)

Contextual instruments

N/A

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Reporting and dissemination

Reports not publicly available

Somalia

Documents: (UNICEF Somalia, n.d.)


Name Annual National Assessment

Organisation / institution responsible

Department of Basic Education (DBE), Ministry of Education of South Africa

Purpose System-level monitoring

Inception 2011

Frequency Annual

Target Population Grades 1–6, Grade 9⁹²

Sample Census (i.e. all children in target population)

Cognitive domains Language (English, Afrikaans and nine local languages), mathematics (Grades 1–3)

Language (English and Afrikaans) and mathematics (Grades 4–6, Grade 9)

Contextual instruments

N/A

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Reporting and dissemination

Reports available from DBE website

South Africa

Documents: (Department of Basic Education Republic of South Africa, 2011, 2012, 2013, 2014)

92 In South Africa’s Annual National Assessment, testing of Grade 7 and Grade 8 was piloted in 2014.


Name National Assessment of Learner Achievement (NALA)

Organisation / institution responsible

Human Sciences Research Council

Purpose System-level monitoring (national)

Inception 2008

Frequency N/A (one-off)

Target Population Grade 9

Sample Unknown

Cognitive domains Language, mathematics, natural sciences

Contextual instruments

Unknown

Test Administration School-based

Group administration

Paper-based administration

Analysis Unknown

Reporting and dissemination

Reports not publicly available

South Africa

Documents: (UNESCO, 2008, 2015)
Website: (Human Sciences Research Council South Africa, n.d.)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; South Africa participated in SACMEQ II & III

Target Population Grade 6⁹³

Sample SACMEQ III: 392 schools, giving approx. 9,100 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

South Africa

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011; Moloi, n.d.)
Website: (SACMEQ, 2013)

93 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name TIMSS*

Organisation / institution responsible

IEA

Purpose System-level monitoring

Inception 1995

Frequency 4-year cycle; South Africa participated in 2011

Target Population Grade 9 (2011)⁹⁴

Sample TIMSS 2011: 285 schools, giving approx. 12,000 students

Nationally representative

Cognitive domains Mathematics and science

Contextual instruments

Student questionnaire

Teacher questionnaires (mathematics, science)

School head questionnaire

Curriculum questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive and contextual data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

International comparisons of cognitive data reported

Reporting and dissemination

International results reports, encyclopaedia and databases available from the TIMSS and PIRLS/IEA websites

South Africa

Documents: (M.O. Martin & Mullis, 2012; M.O. Martin, Mullis, Foy, & Arora, 2012; M.O. Martin, Mullis, Foy, & Stanco, 2012; I.V.S. Mullis et al., 2008a, 2008b)
Websites: (IEA, n.d.; TIMSS & PIRLS International Study Center, n.d.)

94 Where a country’s Grade 4/Grade 8 students were expected to find the TIMSS assessments too difficult, the IEA encouraged testing children in a higher grade. South Africa therefore administered the TIMSS Grade 8 assessment to Grade 9 students.


Name PIRLS, prePIRLS*

Organisation / institution responsible

IEA

Purpose System-level monitoring

Inception PIRLS: 2001; prePIRLS: 2011

Frequency 5-year cycle; South Africa participated in 2006 and 2011

Target Population Grade 5 (2006); Grade 4 and Grade 5 (2011)⁹⁵

Sample prePIRLS 2011: 341 schools, giving approx. 15,750 students

PIRLS 2011: 95 schools, giving approx. 3,500 students

Nationally representative

Cognitive domains Reading (English, Afrikaans, 11 local languages) (prePIRLS 2011)

Reading (English, Afrikaans) (PIRLS 2011)

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Parent questionnaire

Curriculum questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive and contextual data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

International comparisons of cognitive data reported

Reporting and dissemination

International results reports, encyclopaedia and databases available from the TIMSS and PIRLS/IEA websites

South Africa

Documents: (Howie, Staden, Tshele, Dowse, & Zimmerman, 2012; M.O. Martin & Mullis, 2012; I.V.S. Mullis, Martin, Foy, & Drucker, 2012; I.V.S. Mullis, Martin, Kennedy, & Foy, 2007)
Websites: (IEA, n.d.; TIMSS & PIRLS International Study Center, n.d.)

95 To address the challenges presented by multiple home languages and languages of instruction, South Africa tested Grade 5 students in PIRLS in 2006, rather than the standard Grade 4. In 2011, Grade 4 students were tested with the easier prePIRLS assessment, and Grade 5 students were again tested with PIRLS.


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Swaziland participated in SACMEQ II & III

Target Population Grade 6⁹⁶

Sample SACMEQ III: 172 schools, giving approx. 4,000 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

Swaziland

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011)
Website: (SACMEQ, 2013)

96 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name EGRA, EGMA

Organisation / institution responsible

USAID-Tanzania, RTI

Purpose System-level monitoring

Inception 2013 (baseline)

Frequency N/A (one-off)

Target Population Grade 2

Sample 200 schools, giving approx. 2,300 students

Cognitive domains Reading (English, Kiswahili), mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

School inventory

Classroom inventory

Classroom observation

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Reports available from EdData website

Tanzania

Documents: (Brombacher et al., 2014)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Tanzania participated in SACMEQ II & III⁹⁷

Target Population Grade 6⁹⁸

Sample SACMEQ III: 196 schools, giving approx. 4,200 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

Tanzania

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011)
Website: (SACMEQ, 2013)

97 Within Tanzania, only Zanzibar participated in SACMEQ I.

98 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name Uwezo*

Organisation / institution responsible

Twaweza

Purpose System-level monitoring

Inception 2009/2010

Frequency Annual

Target Population 6–16 years old

Sample Approx. 105,000 children

Nationally representative

Cognitive domains Reading (English and Kiswahili), numeracy

Contextual instruments

Household observation

Village observation

School observation

Test Administration Household-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Trends in cognitive performance computed

Reporting and dissemination

Results presented in national and regional reports

Reports, datasets and other documentation available from the Uwezo website

Results disseminated via radio and print media

Tanzania

Documents: (Uwezo-Tanzania, 2013; Uwezo, 2014)
Website: (Twaweza, n.d.)


Name EGRA

Organisation / institution responsible

William and Flora Hewlett Foundation, RTI

Purpose System-level diagnostic

Inception 2009

Frequency N/A (one-off)

Target Population Grade 2, Grade 3 in Central and Northern provinces

Sample 50 schools, giving approx. 1,950 students

Cognitive domains Reading (English, Luganda/Lango)

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

School inventory

Classroom inventory

Classroom observation

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Report available from EdData website

Uganda

Documents: (Piper, 2010)


Name EGRA

Organisation / institution responsible

Save the Children

Purpose Program evaluation (Literacy Boost initiative)

Inception 2010 (baseline), 2012 (midline)

Frequency N/A (one-off)

Target Population Grade 3 in treatment and comparison schools in Amuru and Nwoya districts

Sample 2012 midline: approx. 530 students

Cognitive domains Reading (Luo), one mathematics subtask

Contextual instruments

N/A

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Trends in cognitive performance computed

Reporting and dissemination

Reports available from Save the Children website

Uganda

Documents: (Friedlander, Candiru, & Dowd, 2010; Guajardo et al., 2010)


Name National Assessment of Progress in Education (NAPE)

Organisation / institution responsible

Uganda National Examinations Board (UNEB)

Purpose System-level monitoring

Inception 1996

Frequency 1–3-year cycle, most recently in 2010

Target Population Grade 3, Grade 6

Sample 2010: 1,098 schools, giving approx. 21,900 students (in each of Grade 3 and Grade 6)

Nationally representative

Cognitive domains Literacy (English and local languages), numeracy

Contextual instruments

Student questionnaire

Head teacher questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis not used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Reporting and dissemination

Reports available from UNEB website

Uganda

Documents: (Uganda National Examinations Board, 2010)
Website: (Uganda National Examinations Board, n.d.)


Name SACMEQ

Organisation / institution responsible

SACMEQ

Purpose System-level monitoring

Inception SACMEQ I: 1995–1999

Frequency 5–6-year cycle; Uganda participated in SACMEQ II & III

Target Population Grade 6⁹⁹

Sample SACMEQ III: 264 schools, giving approx. 5,300 students

Nationally representative

Cognitive domains Reading (English), mathematics, health knowledge

Contextual instruments

Student questionnaire

Teacher questionnaire

School head questionnaire

Test Administration School-based

Group administration

Paper-based administration

Analysis IRT analysis used to scale cognitive data

Competency levels/benchmarks established

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Trends in cognitive performance computed

Reporting and dissemination

Working papers present international-level results

Reports present national-level results

Working papers, reports, databases available from SACMEQ website

Uganda

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011)
Website: (SACMEQ, 2013)

99 Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.


Name Uwezo*

Organisation / institution responsible

Twaweza

Purpose System-level monitoring

Inception 2009/2010

Frequency Annual

Target Population 7–16 years old

Sample Approx. 92,000 children

Nationally representative

Cognitive domains Reading (English and local languages), numeracy

Contextual instruments

Household observation

Village observation

School observation

Test Administration Household-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Trends in cognitive performance computed

Reporting and dissemination

Results presented in national and regional reports

Reports, datasets and other documentation available from the Uwezo website

Results disseminated via radio and print media

Uganda

Documents: (Uwezo-Uganda, 2013; Uwezo, 2014)
Website: (Twaweza, n.d.)


Name EGRA, EGMA

Organisation / institution responsible

USAID, RTI

Purpose Pilot for system-level diagnostic

Inception 2011

Frequency N/A (one-off)

Target Population Grade 2, Grade 3 in the Central, Copperbelt, Luapula, and Northern provinces

Sample 33 schools, giving approx. 800 students

Cognitive domains Reading (Bemba), mathematics

Contextual instruments

Student questionnaire

Teacher questionnaire

Head teacher questionnaire

School inventory

Classroom inventory

Classroom observation

Test Administration School-based

One-on-one administration

Oral administration

Analysis IRT analysis not used to scale cognitive data

Frequency analyses conducted/mean scores calculated for cognitive results, disaggregated by contextual variables of interest

Frequency analyses conducted on contextual data

Relationship between cognitive performance and contextual factors explored via analytical techniques (eg correlation, regression, multilevel modelling)

Reporting and dissemination

Report available from EdData website

Zambia

Documents: (Collins et al., 2012; RTI, 2015a)
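Several of the assessments profiled here report mean cognitive scores disaggregated by contextual variables of interest. As an illustrative sketch only — not code from any of these assessments; the records and variable names below are hypothetical — this kind of disaggregation amounts to grouping scores by a contextual variable and averaging within each group:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical student records: a cognitive score plus contextual variables.
records = [
    {"score": 480, "sex": "girl", "location": "rural"},
    {"score": 520, "sex": "boy", "location": "urban"},
    {"score": 455, "sex": "girl", "location": "rural"},
    {"score": 510, "sex": "boy", "location": "rural"},
]

def mean_by(records, variable):
    """Mean cognitive score disaggregated by one contextual variable."""
    groups = defaultdict(list)
    for record in records:
        groups[record[variable]].append(record["score"])
    return {value: mean(scores) for value, scores in groups.items()}

print(mean_by(records, "sex"))       # mean score for girls vs. boys
print(mean_by(records, "location"))  # mean score for rural vs. urban
```

The same pattern extends to any contextual variable captured by the questionnaires (school location, home language, and so on).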

Name: EGRA

Organisation / institution responsible: USAID, EDC

Purpose: Program evaluation

Inception: 2012

Frequency: N/A (one-off)

Target Population: Unknown

Sample: 1,400 students in 6 provinces

Cognitive domains: Reading (English, ChiNyanja, ChiTonga, IciBemba)

Contextual instruments: Unknown

Test Administration:
- School-based
- One-on-one administration
- Oral administration

Analysis: Unknown

Reporting and dissemination: Reports not publicly available

Country: Zambia

Documents: (Collins et al., 2012; RTI, 2015a)

Name: EGRA

Organisation / institution responsible: USAID, RTI

Purpose: System-level diagnostic (as part of the National Assessment Survey)

Inception: 2014

Frequency: 2-year cycle

Target Population: Grade 2

Sample:
- 850 schools, 8,500 students
- Nationally representative

Cognitive domains: Reading (Bemba, Nyanja, Luvale, Lunda, Silozi, Kikoande, Tonga)

Contextual instruments: Unknown

Test Administration:
- School-based
- One-on-one administration
- Oral administration

Analysis: Unknown

Reporting and dissemination: Reports not publicly available

Country: Zambia

Documents: (Collins et al., 2012; RTI, 2015a)

Name: National Assessment of Learning Achievement

Organisation / institution responsible: Examinations Council of Zambia

Purpose: System-level monitoring

Inception: 1999

Frequency: 2-year cycle

Target Population: Grade 5, Grade 9

Sample:
- 2008: Approx. 400 schools, approx. 8,000 students
- Nationally representative

Cognitive domains:
- Grade 5: English, mathematics, life skills
- Grade 9: English, mathematics, environmental sciences

Contextual instruments:
- Student questionnaire
- Teacher questionnaire
- Head teacher questionnaire

Test Administration:
- School-based
- Group administration
- Paper-based administration

Analysis:
- IRT analysis not used to scale cognitive data
- Competency levels/benchmarks established
- Frequency analyses conducted / mean scores calculated for cognitive results, disaggregated by contextual variables of interest
- Frequency analyses conducted on contextual data

Reporting and dissemination:
- Reports not publicly available
- Results of the 2008 Grade 5 assessment were disseminated at the provincial level. Based on test item analysis, remedial materials were also developed for the areas found to be challenging for teachers and learners.

Country: Zambia

Documents: (Examinations Council of Zambia, 2015; RTI, 2015a, 2015c; Sakala & Chilala, 2007; UNICEF Zambia Country Office, 2015)

Name: SACMEQ

Organisation / institution responsible: SACMEQ

Purpose: System-level monitoring

Inception: SACMEQ I: 1995–1999

Frequency: 5–6-year cycle; Zambia participated in SACMEQ I–III

Target Population: Grade 6¹⁰⁰

Sample:
- SACMEQ III: 157 schools, giving approx. 2,900 students
- Nationally representative

Cognitive domains: Reading (English), mathematics, health knowledge

Contextual instruments:
- Student questionnaire
- Teacher questionnaire
- School head questionnaire

Test Administration:
- School-based
- Group administration
- Paper-based administration

Analysis:
- IRT analysis used to scale cognitive data
- Competency levels/benchmarks established
- Frequency analyses conducted / mean scores calculated for cognitive results, disaggregated by contextual variables of interest
- Relationship between cognitive performance and contextual factors explored via analytical techniques (e.g. correlation, regression, multilevel modelling)
- Trends in cognitive performance computed

Reporting and dissemination:
- Working papers present international-level results
- Reports present national-level results
- Working papers, reports, databases available from SACMEQ website

Country: Zambia

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011). Websites: (SACMEQ, 2013)

¹⁰⁰ Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.
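SACMEQ scales its cognitive data using item response theory (IRT). As a minimal sketch of the underlying idea only — not SACMEQ's actual scaling software or procedures — the simplest IRT model, the Rasch (one-parameter logistic) model, expresses the probability of a correct response as a function of the gap between student ability and item difficulty on a common logit scale:

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """P(correct response) under the Rasch (1PL) IRT model.

    Ability and difficulty are expressed on the same logit scale,
    which is what allows students and items to be placed on one
    common proficiency scale.
    """
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A student whose ability equals the item's difficulty answers
# correctly half the time; higher ability (or an easier item)
# raises the probability.
print(rasch_probability(0.0, 0.0))   # 0.5
print(rasch_probability(1.0, 0.0))   # > 0.5
print(rasch_probability(-1.0, 0.0))  # < 0.5
```

Operational scalings estimate these ability and difficulty parameters jointly from the full response data; the function above only illustrates the model's shape.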

Name: SACMEQ

Organisation / institution responsible: SACMEQ

Purpose: System-level monitoring

Inception: SACMEQ I: 1995–1999

Frequency: 5–6-year cycle; Zimbabwe participated in SACMEQ I and III

Target Population: Grade 6¹⁰¹

Sample:
- SACMEQ III: 155 schools, giving approx. 3,000 students
- Nationally representative

Cognitive domains: Reading (English), mathematics, health knowledge

Contextual instruments:
- Student questionnaire
- Teacher questionnaire
- School head questionnaire

Test Administration:
- School-based
- Group administration
- Paper-based administration

Analysis:
- IRT analysis used to scale cognitive data
- Competency levels/benchmarks established
- Frequency analyses conducted / mean scores calculated for cognitive results, disaggregated by contextual variables of interest
- Relationship between cognitive performance and contextual factors explored via analytical techniques (e.g. correlation, regression, multilevel modelling)
- Trends in cognitive performance computed

Reporting and dissemination:
- Working papers present international-level results
- Reports present national-level results
- Working papers, reports, databases available from SACMEQ website

Country: Zimbabwe

Documents: (Hungi, 2011a, 2011m; Hungi et al., 2010; Makuwa, 2011). Websites: (SACMEQ, 2013)

¹⁰¹ Teachers of Grade 6 reading, mathematics and health knowledge are also tested in SACMEQ.

Name: Zimbabwe Early Learning Assessment (ZELA)

Organisation / institution responsible: ZIMSEC

Purpose: System-level monitoring

Inception: 2012–2015 (baseline, two cycles)

Frequency: N/A (one-off)

Target Population: Grade 3

Sample: 2014 cycle: 500 schools, giving approx. 16,000 students

Cognitive domains: English, mathematics, Ndebele and Shona

Contextual instruments:
- Student questionnaire
- School head questionnaire

Test Administration:
- School-based
- Group administration
- Paper-based administration

Analysis:
- IRT analysis used to scale cognitive data
- Competency levels/benchmarks established
- Frequency analyses conducted / mean scores calculated for cognitive results, disaggregated by contextual variables of interest
- Relationship between cognitive performance and contextual factors explored via analytical techniques (e.g. correlation, regression, multilevel modelling)
- Trends in cognitive performance computed

Reporting and dissemination: Reports not publicly available

Country: Zimbabwe

Documents: (The Australian Council for Educational Research & Zimbabwe School Examination Council, 2013a, 2013c, 2015)
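Exploring the relationship between cognitive performance and contextual factors, as listed in several of the profiles above, typically begins with simple correlations before moving to regression or multilevel models. A minimal sketch of the first step, using entirely hypothetical data (not from ZELA or any other assessment profiled here):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical: test scores vs. number of books in the home.
scores = [420, 455, 480, 510, 530]
books = [0, 5, 10, 20, 25]
print(round(pearson_r(scores, books), 2))
```

A correlation near 1 or -1 suggests a strong linear association worth probing further; the multilevel modelling mentioned in the profiles additionally accounts for students being nested within schools.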
