
EdData II

Data for Education Research and Programming (DERP) in Africa Annual Report October 1, 2015-September 30, 2016

EdData II Technical and Managerial Assistance, Task Number 19 Contract Number BPA No. EHC-E-00-04-0004 Task Order Number AID-OAA-12-BC-00004 Date: October 2016

This publication was produced for review by the United States Agency for International Development. It was prepared by RTI International.


Data for Education Research and Programming (DERP) in Africa Annual Report October 1, 2015-September 30, 2016

Prepared for Bureau for Africa United States Agency for International Development 1300 Pennsylvania Avenue, N.W. Washington, DC 20523

Prepared by RTI International 3040 E. Cornwallis Road P.O. Box 12194 Research Triangle Park, NC 27709-2194

RTI International is a registered trademark and a trade name of Research Triangle Institute.

The views expressed by the authors at RTI International do not necessarily reflect the views of the United States Agency for International Development or the United States Government.


Table of Contents

List of Tables
Acronyms and Abbreviations
Executive Summary
    Programming and Project Support
    Operational Activities
    Technical Activities
Progress Toward Objectives (by Result)
    Result 1: Africa Mission Strategy-Related Data Needs Met
        Activities and Achievements
        Result 1 Next Steps
    Result 2: Availability of Africa Education Data and Trends Expanded
        DERP in Africa Dissemination Plans
        Impact Evaluation for Education Programs
        Using Data for Accountability and Transparency in Schools Activity in Kenya
        State of Literacy in Selected Sub-Saharan African Countries
        Teacher Effectiveness in Selected Sub-Saharan African Countries
        Reading Materials Survey (Survey of Children's Reading Materials in African Languages in 11 Countries)
        Africa Regional Education Workshop
        Result 2 Next Steps
    Result 3: Measurement Tool with Applicability Across Countries Developed
        Early Grade Reading Scale-Up and Sustainability Planning Tool
        Guide for Gender Equality and Inclusiveness
        School-Related Gender-Based Violence Activity
        EGR Barometer
        Result 3 Next Steps
Annex A. Financial Summary
Annex B. Time Line of Events in FY 2016

List of Tables

Table 1. Status for Each Result Under the Contract (Active Components in the Fourth Quarter of FY 2016)
Table 2. Pre- and Post-Test Scores
Table 3. Findings of Errors or Discrepancies from Randomly Sampled Schools
Table 4. Data Regarding the Titles Surveyed
Table 5. Final Agenda for the 2016 AREW
Table 6. List of Participants Attending the AREW
Table 7. Items to Change and Keep for AREW
Table 8. Workshop Agenda on the Guide for Promoting Gender Equality and Inclusiveness


Acronyms and Abbreviations

Allegro    Allegro Global Procurement Solutions
AME        Asia and the Middle East
AREW       Africa Regional Education Workshop
CIES       Comparative and International Education Society
DDD        Digital Divide Data
DEC        Development Experience Clearinghouse
DEP-AME    Data for Education Programming in Asia and the Middle East
DERP       Data for Education Research and Programming in Africa
E3         Economic Growth, Education, and Environment
EdData II  Education Data for Decision Making
EGR        early grade reading
EGRA       Early Grade Reading Assessment
EiCC       Education in Crisis and Conflict
EMIS       Education Management Information System
FY         Fiscal Year
IBM        IBM Corp.
LQAS       lot quality assurance sampling
MOE        Ministry of Education
MOEST      Ministry of Education, Science, and Technology
OASIS      Opportunities for Achievement and Safety in Schools
PRIMR      Kenya Primary Math and Reading Initiative
SMS        short message service
SOW        scope of work
SRGBV      school-related gender-based violence
TO         Task Order
UNICEF     United Nations Children's Fund
USAID      U.S. Agency for International Development
USB        Universal Serial Bus
WFD        workforce development
YALI       President's Young African Leaders Initiative


Executive Summary

Task Order (TO) 19, Data for Education Research and Programming (DERP) in Africa, is a centrally funded Time and Materials TO under the Education Data for Decision Making (EdData II) Indefinite Quantity Contract EHC-E-00-04-0004. This contract, between the U.S. Agency for International Development (USAID) and RTI International, was awarded with a TO ceiling price. The project began on October 1, 2012; the contractual end date is November 30, 2016. The financial statement of the TO 19 contract through the fiscal year ending September 30, 2016, is presented in Annex A of this annual report, and the time line of events this year is presented in Annex B.

The core objectives of this TO are to assist the Africa missions in addressing their Education Strategy-related data needs. These needs are to use existing tools to gather data, develop tools to address data gaps, use data to develop key trends, and assess the data capacity of a host country. TO 19 broadly covers any data-related activity that will assist the missions in designing and implementing programs that contribute to the achievement of the following three goals set forth in the Education Strategy:

• Goal 1: Improve reading skills for 100 million children in primary grades by 2015

• Goal 2: Improve the ability of tertiary and workforce development programs to generate workforce skills relevant to a country's development goals

• Goal 3: Increase equitable access to education in crisis and conflict environments for 15 million learners by 2015.

Support activities under this TO are intended to provide technical assistance and training services to 27 missions implementing or planning basic education programs in South, East, Central, and West Africa.

Programming and Project Support

The DERP in Africa project is intended to generate regional- and country-specific education data and an analysis of those data for use by USAID's Bureau for Africa, missions, and partner countries to prioritize education needs and the corresponding investment. This project works to strengthen local skills in the design, evaluation, and management of education programs and in quality data capture and analysis to support them across the region. Activities are designed to achieve the following three key results:

Result 1: Africa mission strategy-related data needs met

Result 2: Availability of Africa education data and trends expanded

Result 3: Measurement tool with applicability across countries developed.

This annual report covers the period from October 1, 2015, through September 30, 2016. During that period, the activities supported by the DERP in Africa project focused on Results 2 and 3.


Operational Activities

Ms. Michelle Ward-Brent continued as Lead Manager of TO 19, working alongside Ms. Anna Dick as Activity Manager. Dr. Kakali "Koli" Banik became the Contracting Officer's Technical Representative as of October 1, 2015, and RTI has worked well with Dr. Banik throughout the transition. As the DERP in Africa contract draws to a close, closeout activities have begun. Some of these activities include confirming that all of the reports are posted to appropriate sites, including the USAID Development Experience Clearinghouse (DEC); ensuring that property disposition is prepared and completed in a timely manner; and ensuring that other appropriate project closeout activities are completed.

Brief summaries of the major activities conducted during Fiscal Year (FY) 2016 are presented in the remainder of this executive summary.

Result 1

• No major activities occurred under Result 1 during FY 2016, but some management costs were incurred. As such, costs were split across the result areas.

Result 2

• Dissemination Plan: RTI submitted a draft Dissemination Plan to USAID at the end of FY 2015. Discussions were held about the proposed plan in the first quarter of FY 2016, after which RTI proceeded with a number of different dissemination activities. In particular, blog posts and briefers were prioritized for most of the DERP in Africa activities, and many of these have been posted to various blog sites. In addition, many deliverables were translated to French. These deliverables include the Guide for Strengthening Gender Equality and Inclusiveness in Teaching and Learning Materials (henceforth referred to as the Guide for Gender Equality and Inclusiveness), the Teacher Effectiveness in Selected Sub-Saharan African Countries desk study, the Guidance on Planning for Language Use in Education: Factors to Consider and Recommendations for Optimizing Learning Outcomes desk study, and the State of Literacy in Selected Sub-Saharan African Countries report. A workshop about the Guide for Gender Equality and Inclusiveness was also held in Washington, DC, in September 2016, with future plans for videos and a Webinar before the DERP in Africa project ends. In addition, presentations on the Big Data Activity and the Reading Materials Survey were made at the Comparative and International Education Society (CIES) Conference in March 2016.

• State of Literacy in Selected Sub-Saharan African Countries report: RTI staff submitted this report to USAID in late FY 2015, and USAID provided feedback in FY 2016. The English version of the report was later finalized. RTI staff also prepared a briefer to support dissemination activities for the report. The report is currently being translated to French.

• Teacher Effectiveness in Selected Sub-Saharan African Countries desk study: USAID approved the main report before FY 2016; however, during the first quarter of FY 2016, RTI staff prepared and submitted a briefer from the report. During the second and third quarters of the fiscal year, the report was translated to French; USAID approved the translated report on May 16, 2016. A blog post about the report was posted in August 2016 to the Global Partnership for Education's site.


• Using Data for Accountability and Transparency in Schools (also termed the Kenya Big Data Activity/Data Revolution for Development Activity): This activity, to improve data accuracy and collection in primary schools in two pilot test counties in Kenya, was completed in FY 2016 with activity subcontractors IBM Corp. (IBM) and Digital Divide Data (DDD).

• Reading Materials Survey: During FY 2015, RTI submitted the draft Reading Materials Survey Report, including the country reports, to USAID for review and comment. During FY 2016, RTI staff revised the main report and country reports to incorporate USAID's feedback. In addition, RTI staff presented the survey and report at the Global Book Fund meeting and at CIES. Later during FY 2016, RTI staff finalized the report. RTI staff are currently developing a briefer about the overall report.

• Africa Regional Education Workshop (AREW): The 2016 AREW was successfully held in Addis Ababa, Ethiopia, in May 2016. The workshop was an opportunity for USAID education sector staff in Africa to meet and share experiences about Africa-specific education issues. RTI's subcontractor, Allegro Global Procurement Solutions (Allegro), managed the logistics for the workshop.

• Impact Evaluation Course: Early during FY 2016, at USAID's request, DERP offered a session of the Impact Evaluation Course (originally developed under the DEP-AME TO). This course was held at the Washington Learning Center in Crystal City, VA, from October 26-30, 2015, and reached USAID Education Officers who were visiting the area for another meeting.

Result 3

• A Guide for Strengthening Gender Equality and Inclusiveness in Teaching and Learning Materials: During the first quarter of FY 2016, RTI staff finalized the Guide for Gender Equality and Inclusiveness and then translated the document to French. In addition, RTI staff prepared a blog post; however, it has not yet been posted because the team is working to identify an appropriate outlet for the post. During September 2016, a workshop was held in Washington, DC, about how to use the Guide for Gender Equality and Inclusiveness. RTI staff are planning to host a Webinar and to post informational videos on YouTube during the remainder of the contract.

• Early Grade Reading (EGR) Program Scale-Up and Sustainability Tool: In September 2015, the EGR Program Scale-Up and Sustainability Tool was piloted in Malawi; since then, work has been ongoing on the tool. After USAID provided feedback to RTI staff regarding the tool, it was determined that the tool would be pared down to a lighter version (less data intensive) than originally planned so that it can be downloaded.

• Conceptual Framework for Measuring School-Related Gender-Based Violence (henceforth referred to as the Conceptual Framework) and the Literature Review on School-Related Gender-Based Violence: How It Is Defined and Studied (henceforth referred to as the SRGBV Literature Review): Per USAID personnel's request, the name of the Conceptual Framework activity report was changed to remove the reference to a toolkit, as the document reflects work toward a conceptual framework. During FY 2016, RTI staff submitted several rounds of revisions of the Conceptual Framework draft to USAID personnel, who forwarded the revised version to other stakeholders for their review. RTI staff incorporated the significant revisions into the Conceptual Framework and then submitted the report for additional review. RTI staff are currently making additional revisions to the report. Additionally, USAID personnel were nearing completion of their review of the SRGBV Literature Review, with a final version expected as the DERP contract closes out.

• Early Grade Reading (EGR) Barometer: The EGR Barometer was developed under the Data for Education Programming in Asia and the Middle East (DEP-AME) TO. The EGR Barometer is available at earlygradereadingbarometer.org. Throughout FY 2016, data sets from sub-Saharan African countries have been added to the EGR Barometer, and DERP in Africa has supported expanded functionality of the EGR Barometer.

Technical Activities

The technical activities undertaken during FY 2016 are described in detail under each result presented in Table 1. A summary status is also presented in Table 1 for activities at the end of the fourth quarter of FY 2016.

Table 1. Status for Each Result under the Contract (Active Components in the Fourth Quarter of FY 2016)

Result 1: Africa Mission Strategy-Related Data Needs Met | Status
No ongoing activities | Not applicable

Result 2: Availability of Africa Education Data and Trends Expanded | Status
Guidance on Planning for Language Use in Education: Factors to Consider and Recommendations for Optimizing Learning Outcomes desk study | Completed
Senegal Language of Instruction Report | Completed
State of Literacy in Selected Sub-Saharan African Countries Report and Matrix | Completed
Using Data for Accountability and Transparency in Schools Report | Pending final approval with Ministry and USAID
DERP in Africa briefers | In process
Teacher Effectiveness in Selected Sub-Saharan African Countries desk study | Completed
Reading Materials Survey General Report | Completed
Reading Materials Survey Country Report revisions | Completed
Planning and management of the AREW in 2016 | Completed
Planning and management of Impact Evaluation Course | Completed
DERP Dissemination Plan | In process

Result 3: Measurement Tool with Applicability Across Countries Developed | Status
Guide for Gender Equality and Inclusiveness | Completed
Guide for Gender Equality and Inclusiveness (French translation) | Completed
Revisions to EGR Program Scale-Up and Sustainability Tool | In process
Conceptual Framework revisions | In process
SRGBV Literature Review | Submitted, pending USAID approval
EGR Barometer for sub-Saharan Africa | In process


Progress Toward Objectives (by Result)

Result 1: Africa Mission Strategy-Related Data Needs Met

Activities and Achievements

The objectives of Result 1 are to

• Establish the mechanisms through which the TO will be able to identify and respond to missions' requests for data support. The support may take various forms, such as the following:

- Capacity, institutional, and/or systems assessments

- Reviews of existing country data and programs

- Evaluations of how countries' existing learning assessments can be used or adapted to track improvements in reading

- Adaptation or development of tools for collecting data

• Support missions in developing specific requests

• Define the criteria and weights needed for a transparent rubric for "scoring" missions' requests

• In collaboration with Africa and the relevant mission, develop explicit plans for each accepted request

• Support the implementation of the approved plan for data support to a mission

• Produce and disseminate reports that document the work accomplished.

Result 1 Next Steps

• There were no major activities under Result 1 during FY 2016, though if USAID personnel request activities, then RTI staff are prepared to provide support. No additional activities are expected through the completion of the DERP in Africa contract.

Result 2: Availability of Africa Education Data and Trends Expanded

DERP in Africa Dissemination Plans

As the DERP in Africa project wraps up, one goal is to disseminate the work completed under the project. In September 2015, RTI submitted a Dissemination Plan to USAID for discussion; that discussion was held in December 2015.

The dissemination efforts aim to share the research conducted under the DERP in Africa project broadly and in a manner that is user friendly for those accessing the information. The dissemination efforts focus on leveraging social media and blogs to share information.

As in previous years, USAID and RTI personnel shared some research from the DERP in Africa project at the CIES Conference. During 2016, the conference was held in Vancouver, Canada, from March 6 through 10. Dr. Banik chaired a panel about research activities of USAID's Bureau for Africa Education, and two RTI presenters each led a presentation to discuss DERP activities. Specifically, Mr. Mitchell Rakusin discussed the Using Data for Accountability and Transparency in Schools/Kenya Big Data Activity, and Ms. Dick shared key findings from the Reading Materials Survey. Ms. Karon Harden, Activity Leader of the Reading Materials Survey, was originally scheduled to discuss the survey, but she was unable to attend the CIES Conference. The panel was originally planned to include a presentation about the EGR Program Scale-Up and Sustainability Tool, but that presentation was canceled before the CIES Conference. The panel was well received by the attendees, who participated in a discussion after the presentations.

Also at the CIES Conference, a reception was held to highlight the work across DERP and other EdData II TOs, specifically TO 20, which focuses on Goal 1 of the USAID Education Strategy, and the Data for Education Programming in Asia and the Middle East (DEP-AME) TO.

The reception was an opportunity for conference attendees to engage with USAID and RTI staff about the work completed through the project. In support of this event, RTI staff developed the invitation (see Figure 1) for the reception, which was sponsored by the DERP project, the Goal 1 TO 20, and the DEP-AME TO 15. In addition, during the reception, printed copies of the approved deliverables were available for attendees to review. Attendees were given Universal Serial Bus (USB) flash drives with the documents loaded onto them so that the attendees could easily keep files of final deliverables without having to hand-carry large amounts of printed materials from the CIES Conference. The reception was well received, with approximately 100 attendees throughout the evening.

Figure 1. The invitation to the EdData II reception on March 8, 2016, during the CIES Conference.

To broaden the availability of the reports under the DERP in Africa project, some of the documents were translated to French so that they could be used in francophone countries in sub-Saharan Africa. Specifically, the Teacher Effectiveness in Selected Sub-Saharan African Countries desk study, the Guide for Gender Equality and Inclusiveness, the Language of Instruction Guidance Document, and the State of Literacy report have been or are in the process of being translated.

Briefers either have been or are in the process of being prepared for almost every activity under the DERP in Africa project. Generally, RTI staff prepare draft versions of the briefers as activities come to a close and share these with USAID personnel as soon as they are ready. Finalized briefers are currently being translated to French. RTI has completed the following briefers during this fiscal year:

• Teacher Effectiveness Briefer (submitted October 9, 2015)

• SRGBV Literature Review Briefer (submitted October 22, 2015)

• State of Literacy Briefer (submitted January 4, 2016)

• Language of Instruction Guidance Briefer (submitted August 14, 2015)

• Teacher Effectiveness Briefer (French translation, submitted September 9, 2016)

• State of Literacy Briefer (French translation, submitted September 9, 2016)

• Gender and Inclusiveness Guide Briefer (French translation, submitted September 9, 2016)

• Language of Instruction Guidance Briefer (French translation, submitted September 9, 2016).

Dissemination efforts by USAID personnel this year also included sharing work through Education Sector Council meetings held at the Agency. The activities presented to the Education Sector Council during the fourth quarter of FY 2016 were the Kenya Big Data Activity and the Guide for Gender Equality and Inclusiveness. One additional presentation to the Council is planned about the EGR Barometer in October 2016. Another activity used to disseminate efforts to USAID Education Officers was the AREW held in Ethiopia in May 2016.

The dissemination efforts have also included outreach through media and social media outlets, including blogs and Twitter. In particular, tweets were drafted and shared with USAID to send out via Twitter to disseminate information about the EGR Barometer's Web site (http://www.earlygradereadingbarometer.org). One tweet in June 2016 that was prepared for the USAID Education Twitter handle resulted in 84 new visitors to the EGR Barometer's Web site.

In addition, several blog posts were written to promote and disseminate activities under the DERP in Africa project. The Language of Instruction Guidance document was posted on the World Education blog in July 2016 at https://gemreportunesco.wordpress.com/2016/07/27/planning-for-language-use-in-education-best-practices-and-practical-steps-for-improving-learning. The post was shared more than 80 times on Facebook and 76 times on LinkedIn. A blog post about the Teacher Effectiveness in Selected Sub-Saharan African Countries desk study, as well as a link to the full report at the eddataglobal.org Web site, was posted to the Global Partnership for Education's blog in August, and is available online at http://www.globalpartnership.org/blog/improve-quality-education-reconsider-true-definition-good-teacher. RTI staff have developed a draft version of the Guide for Gender Equality and Inclusiveness blog post, which will be finalized and posted before the end of the project.

Throughout this year and the life of the DERP in Africa project, dissemination efforts and posting of the reports to the DEC and to the eddataglobal.org Web site have reached USAID's Education Officers, early grade reading stakeholders, and researchers, helping to ensure that the project's research will benefit stakeholders for years to come.

Impact Evaluation for Education Programs

As part of the EdData II DEP-AME TO, RTI staff developed the Impact Evaluation for Education Programs course. During FY 2016, USAID's Bureau for Africa personnel expressed interest in replicating the course for the Agency's own staff in Washington, DC, and for its Education Mission staff in Africa.


The training consists of two components: an online portion through USAID University and an in-person supplement. By the end of the course, participants are expected to be able to

• Describe the importance of evaluations for education programs, projects, and activities

• Differentiate between qualitative and quantitative data collection techniques and describe the value of each

• Differentiate between performance and impact evaluations and understand when each is appropriate

• Differentiate between experimental, quasi-experimental, and non-experimental evaluation designs and know the strengths and limitations of each

• Manage an evaluation from design and procurement to the dissemination of results

• Write an effective scope of work (SOW) for an evaluation and critically review deliverables.

RTI's subcontractor, Allegro, managed the logistics for the training sessions, particularly the in-person component. Allegro personnel worked with staff from USAID's Bureau for Africa to update the online module originally designed for the Asia and the Middle East (AME) Bureau to reflect the training needs for USAID's Bureau for Africa. This updated online module was supplemented by an in-person training session in Cape Town, South Africa, from February 23-27, 2015. Following the successful event, USAID personnel expressed interest in holding another in-person training, just outside Washington, DC, in October 2015.

A total of 24 individuals (i.e., 14 women and 10 men from across USAID field missions) attended the second in-person training session at the Washington Learning Center in Crystal City, VA, from October 26-30, 2015. Out of the 24 attendees, 23 people completed the course. Dr. Melissa Chiapetta of Social Impact served as the instructor for both training sessions (i.e., Cape Town, South Africa, and Crystal City, VA).

The agenda for the Crystal City, VA, in-person training session is presented as follows:


USAID Education Evaluations: Assessing and Learning from Education Interventions

Washington Learning Center, Crystal City, VA, October 26-30, 2015

Daily Course Agenda

Monday, October 26

Objective: Participants will gain a clear understanding of the importance of evaluations of education projects at USAID, how evaluations are designed and how data are collected for education evaluations, and the different types of sampling methods that exist.

8:30 a.m.    Pre-tests, check-in, and introductions
9:30 a.m.    Review of online evaluation course concepts
10:30 a.m.   Break
10:45 a.m.   USAID Evaluation Policy and the Education Policy
11:15 a.m.   Evaluation purpose and questions
12:30 p.m.   Lunch
1:45 p.m.    Education data collection and analysis methods
3:30 p.m.    Break
3:45 p.m.    Sampling
5:30 p.m.    Close

Tuesday, October 27

Objective: Participants will gain a clear understanding of threats to reliability and validity with data collection methods, especially as they relate to education evaluations. Participants will understand the spectrum of evaluation designs and will learn how to decide which to use for specific education projects or activities. Participants will also become familiar with experimental designs and learn how to identify one, how to interpret an analysis, and how to identify the limitations of such designs.

8:30 a.m.    Warm-up and review of Day 1 topics
9:30 a.m.    Threats to reliability and validity
10:30 a.m.   Break
10:45 a.m.   The evaluation design spectrum
11:15 a.m.   Introduction to impact evaluation and the counterfactual
12:30 p.m.   Lunch
1:45 p.m.    Experimental designs
3:30 p.m.    Break
3:45 p.m.    Experimental designs (continued)
4:15 p.m.    Case study of an experimental design for an education activity
5:30 p.m.    Close


Wednesday, October 28

Objective: Participants will become familiar with quasi-experimental designs and non-experimental designs and will learn how to identify one and select the appropriate one to answer specific education evaluation questions, how to interpret outcomes, and how to identify the limitations of such designs.

8:30 a.m.    Warm-up and review of Day 2 topics
9:30 a.m.    Introduction to quasi-experimental designs for impact evaluation
10:30 a.m.   Break
10:45 a.m.   Quasi-experimental designs
12:30 p.m.   Lunch
1:45 p.m.    Case study of a quasi-experimental design for an education activity
2:30 p.m.    Introduction to non-experimental designs
3:30 p.m.    Break
3:45 p.m.    Non-experimental designs
4:45 p.m.    Case study using a non-experimental design for an education activity
5:30 p.m.    Close

Thursday, October 29

Objective: Participants will gain an understanding of how to design effective statements of work for education evaluations, how to review proposals, how to budget for evaluations, and how to review proposal budgets.

8:30 a.m.    Warm-up and review of Day 3 topics
9:30 a.m.    Preparing for an education evaluation
10:30 a.m.   Break
10:45 a.m.   SOW design
11:45 a.m.   Reviewing a SOW for an education evaluation
12:30 p.m.   Lunch
1:45 p.m.    Group work on proposal expectations
2:45 p.m.    Budgeting for an education evaluation
3:30 p.m.    Break
3:45 p.m.    Budgeting exercise
4:45 p.m.    Managing education evaluations
5:30 p.m.    Close


Friday, October 30

Objective: Participants will gain an understanding of how to manage typical problems that arise in education evaluations, how to review an evaluation report, and how best to set up a dissemination and use plan for the evaluation.

8:30 a.m.    Review of Day 4 topics
9:00 a.m.    Reviewing an evaluation report (findings, conclusions, and recommendations)
10:15 a.m.   Break
10:30 a.m.   Review of an evaluation report
11:30 a.m.   Disseminating results from evaluations
12:00 p.m.   Wrap-up, course evaluations, and post-tests
12:30 p.m.   Close
1:45 p.m.    Optional "office hours" (consultative meetings by appointment)


To gauge learning in the Impact Evaluation for Education Programs course, participants were asked to complete both pre- and post-tests. The questions were the same on both tests, but the order of the questions was changed. The pre- and post-test questions and possible answers are presented in the remainder of this section of the annual report.

The average change for all participants from pre- to post-test was 30%, and none of the 23 participants who completed the in-person course performed worse on the post-test than on the pre-test. The results from both tests are presented in Table 2.

Table 2. Pre- and Post-Test Scores

Participant | Pre-Test | Post-Test | Difference*
1 | 58% | 83% | 25%
2 | 50% | 50% | 0%
3 | 58% | 58% | 0%
4 | 42% | 100% | 58%
5 | 17% | 75% | 58%
6 | 67% | 83% | 17%
7 | 58% | 75% | 17%
8 | 50% | 67% | 17%
9 | 42% | 75% | 33%
10 | 25% | 75% | 50%
11 | 58% | 75% | 17%
12 | 42% | 83% | 42%
13 | 42% | 100% | 58%
14 | 50% | 50% | 0%
15 | 17% | 67% | 50%
16 | 25% | 50% | 25%
17 | 42% | 58% | 17%
18 | 33% | 58% | 50%
19 | 50% | 58% | 8%
20 | 17% | 83% | 67%
21 | 33% | Not applicable | Not applicable
22 | Not applicable | 100% | Not applicable
23 | Not applicable | 67% | Not applicable

* Note: Pre-test and post-test differences may be off slightly due to rounding.
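
A minimal sketch of the pre-/post-test comparison is shown below, using the rounded percentages from Table 2 (the three participants missing a score are excluded). The published 30% average was presumably computed from the underlying raw scores, so the rounded inputs here yield a slightly different value.

    # Minimal sketch: average pre-to-post gain, computed from the rounded Table 2 scores.
    # Participants 21-23 are excluded because each is missing one of the two scores.
    pre = [58, 50, 58, 42, 17, 67, 58, 50, 42, 25, 58, 42, 42, 50, 17, 25, 42, 33, 50, 17]
    post = [83, 50, 58, 100, 75, 83, 75, 67, 75, 75, 75, 83, 100, 50, 67, 50, 58, 58, 58, 83]

    gains = [b - a for a, b in zip(pre, post)]
    print(f"Average gain: {sum(gains) / len(gains):.1f} percentage points")  # roughly 29-30
    print(f"Participants scoring lower on the post-test: {sum(g < 0 for g in gains)}")  # 0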

Using Data for Accountability and Transparency in Schools Activity in Kenya

The Using Data for Accountability and Transparency in Schools activity in Kenya is aligned to the theme of Big Data and the Data Revolution for Development, which are both growing areas of interest in strategic conversations between donor agencies, think tanks, developing country governments, and non-governmental organizations. The term "Data Revolution for Development" was coined in The Report of the High-Level Panel of Eminent Persons on the Post-2015 Development Agenda. In that report, global leaders, including Mr. David Cameron (Britain's Prime Minister), Ms. Ellen Johnson Sirleaf (President of Liberia), and Mr. Susilo Bambang Yudhoyono (former President of Indonesia), stated the following:

"We also call for a data revolution for sustainable development, with a new international initiative to improve the quality of statistics and information available to citizens. We should actively take advantage of new technology, crowd-sourcing, and improved connectivity to empower people with information on the progress towards the targets."

Leaders from key philanthropic organizations, including the Bill & Melinda Gates Foundation, have also called for an intensification of data use in the pursuit of development goals such as improving learning outcomes. In addition, agencies such as the World Bank, USAID, and the United Kingdom's Department for International Development have already begun to formally assess how to respond to the role of a data revolution in development.

This small-scale pilot research initiative is designed to help the Kenya Ministry of Education, Science, and Technology (MOEST) make greater use of learner performance and school quality data. Over the past year and a half, RTI staff, in collaboration with personnel from the United Nations Children's Fund (UNICEF), IBM, and DDD, have provided technical assistance to the MOEST to test information and communication technologies to inform future efforts to strengthen Education Management Information Systems (EMIS). The following three broad components compose the program:

• Strengthening EMIS data validation capacity and tools, including switching from a paper questionnaire to one on a mobile device (i.e., tablet or smartphone)

• Piloting mobile EMIS data collection applications (telephone applications in Isiolo County and tablet applications in Mombasa County)

• Testing feedback systems and reporting tools for schools, counties, and sub-counties.

At the beginning of FY 2016, the activity focused on intensive training and reflection activities, which were held with School Heads and with county and sub-county officials to prepare for the annual collection of school census data at the end of the school year (i.e., end of October). The main activities during the first quarter of FY 2016 involved training, data collection and field follow-up, and initial policy dialogue with the MOEST. These three main activities are described further in the following paragraphs.

Training

All of the tablets and mobile telephones were received before the training workshops began, thus ensuring efficient distribution during and after training. The Kenya Big Data Field Team (the Field Team) of RTI's subcontractors (i.e., DDD and IBM Corp.), as well as RTI consultant Dr. Andrew Riechi, followed up with the three schools that did not receive a device (either a mobile telephone or a tablet) while attending the training. During this period, the Field Team captured critical feedback from the participants about the EMIS forms (issued by the MOEST) and the mobile telephone and tablet applications.


The Field Team led the training workshops, which occurred from October 12 through October 16, 2015. The training workshops were held with stakeholders at the county and sub-county levels, as well as with school officials. In Isiolo County, 103 teachers, 107 Deputy Head Teachers, and eight county and sub-county officials were trained on how to properly use a mobile telephone application. In Mombasa County, 96 Head Teachers, 95 Deputy Head Teachers, and six county or sub-county officials were trained on how to properly use the tablet applications. Despite the short notice given to teachers because of the teachers' strike, nearly all (97%) Head Teachers and Deputy Head Teachers attended the training sessions in both counties. Out of the 109 schools in Isiolo County, 108 had a representative who attended one of the training sessions there. Out of the 98 schools in Mombasa County, 97 had a representative attend the training sessions there.

Data Collection and Field Follow Up

Data collection occurred from November 9 through 13, 2015, in all three sub-counties in Isiolo County; from November 2 through 6 in Likoni and Changamwe sub-counties; and from November 16 through 20 in Kisauni and Mombasa sub-counties in Mombasa County. In Isiolo County, the digital entry capture was designed to coincide with the exact dates of the MOEST's timeline. In Mombasa County, the four sub-county submission dates were staggered relative to the submission date for the MOEST's currently used paper questionnaire, further testing the accuracy of the data if they were captured without being transferred from a completed paper questionnaire. Project updates were communicated to the teachers through a bulk short message service (SMS) platform, which was installed and monitored by DDD. Furthermore, the DDD Team randomly selected 30% of the schools in each county and made follow-up calls to assess the status of the practice sessions, better identify any technical issues, and identify schools that might need more assistance. After the calls, the IBM Team made subsequent follow-up calls to the teachers to address the technical issues raised. When the Head Teachers asked questions, the county and sub-county officials were very supportive in answering the questions (if these were in the officials' capacity); otherwise, the officials referred the questions to the Field Supervisors.

Within two weeks, the data submission from 96 schools (88%) was completed or in progress in Isiolo County, and the data submission from all 98 schools (100%) was completed or in progress in Mombasa County. Overall, it was observed that submissions increased after November 12, which was the end date for the national examinations in primary schools. In Isiolo County, schools in the more urban Isiolo sub-county had higher submission rates. In mid-November, the Field Team conducted a series of structured feedback sessions with Head Teachers and county and sub-county officials to catalogue their implementation issues and challenges.

Initial Policy Dialogue with the MOEST

The MOEST held a mini-EMIS retreat on December 16 and 17. The Big Data Team was invited to share the status to date of the activities and participate in ongoing strategy discussions about the future of EMIS in Kenya. By the end of the retreat, the attendees reached an agreement regarding the four main objectives that should guide the process. The four objectives are to


1. Strengthen EMIS capacity by establishing national and county centers by the end of 2017

2. Improve the processes of the EMIS data management cycle by the end of 2017

3. Integrate digitization in the EMIS data management cycle by 2019

4. Review the EMIS policy and legislation framework of managing EMIS by 2017.

The MOEST specifically requested that Isiolo and Mombasa Counties continue to use the equipment and applications developed to collect other types of data at the schools after the pilot. The Ministry also requested that RTI and the Big Data Team remain engaged in issues specifically relating to Objectives 3 and 4.

Throughout the beginning of calendar year 2016 and into the second quarter of FY 2016, the activity continued to move quickly, providing training to the school officials about the school report cards and how to validate the data collection. The highlights of this work involve training on school report cards, validation of EMIS data collection, and feedback sessions and qualitative assessments. These highlights are presented in the following paragraphs.

Training on School Report Cards

From late January to early February 2016, the Big Data Team conducted a series of training workshops with county, sub-county, and school officials about how to use the dashboards and feedback reports. The objectives of the training workshops were to help Head Teachers interpret and use report cards and to ensure that they were properly trained on how to share the reports with the school management groups and explain them. In both counties, the Kenya Big Data Activity Team was accompanied by Ministry of Education (MOE) Officers. Representatives from approximately 95% of schools in both counties attended the training workshops, with Mombasa County having higher attendance than Isiolo County.

Validation of EMIS Data Collection

The schools submitted their EMIS data during the data collection effort in November (i.e., the first quarter of FY 2016), and the data were ready for analysis in early January 2016. At the end of January, County and District Quality Assurance Officers were trained on how to properly use the tablets to conduct data validation activities. The first objective of the validation exercise was for the County and District Quality Assurance Officers to visit a random selection of 20 schools within each county, modeling a form of lot quality assurance sampling (LQAS).

The primary objective of the validation exercise was to determine the frequency and magnitude of reporting discrepancies, particularly with regard to the numbers of students, teachers, and textbooks reported by schools. The second objective was to assess the capacity of the county and sub-county systems to implement a validation protocol. Regarding the second objective, the Kenya Big Data Activity Team experienced some challenges when working with the County and District Quality Assurance Officers, mainly because the officers had very limited bandwidth due to other competing duties. The County and District Quality Assurance Officers struggled to fully participate in the training; many of them subsequently did not conduct the field visits as promised, for a number of reasons.


Ultimately, to support reporting of data accuracy, the DDD Team conducted site validation visits directly.

Table 3 shows the findings of errors or discrepancies for each county from the 20 schools randomly sampled. It is important to note that the intent was not to compare the performance of the counties, but rather to compare their accuracy with the broader population that used paper forms.

Table 3. Findings of Errors or Discrepancies from Randomly Sampled Schools

Variables | Mombasa County | Isiolo County
Enrollment discrepancy >10% (n=schools) | 4 (Pass*) | 7 (Not Pass*)
Enrollment discrepancy range (minimum) | -85.9% | -100.0%
Enrollment discrepancy range (maximum) | 10.9% | 902.0%
Average enrollment error (over-reporting) | 2.0% | 11.1%
Textbooks discrepancy >10% (n=schools) | 0 (Pass*) | 6 (Not Pass*)
Textbook discrepancy range (minimum) | 0.0% | -100.0%
Textbook discrepancy range (maximum) | 7.5% | 158.0%
Average error (over-reporting) | 0.2% | 10.4%

* "Pass" or "not pass" was determined by the 90% decision rule set before the data analysis was conducted.

Table 3 shows the number of schools, out of the 20 sampled, reporting discrepancies in their enrollment and textbooks, which are the two greatest sources of error. Before conducting the analysis, a decision rule was set, in which 90% of the schools should report an error of less than 10% over-inflation of enrollment and textbooks. Applying the LQAS methodology to the sample, we found that Mombasa County passed for both variables, but Isiolo County did not. In Mombasa County, four schools (20%) over-reported enrollment by more than 10%, and zero schools (0%) over-reported textbooks by more than 10%. In contrast, in Isiolo County, seven schools (35%) over-reported enrollment by more than 10%, and six schools (30%) over-reported textbooks by more than 10%. The range of error and the average error for over-reporting were also examined. Again, Isiolo County had greater discrepancies on both counts for enrollment and textbooks, averaging more than 11% inflated enrollment figures and 10% inflated textbook figures. These two data points have a direct impact on the amount of capitation funds disbursed to schools.
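
The following is a minimal sketch of how an LQAS-style pass/fail decision of this kind can be applied to a county's validation sample. The 10% discrepancy cutoff and the 90% target come from the report; the allowable number of failing schools (here, 4) is an assumption chosen only to be consistent with the pass/fail calls in Table 3, and the example figures are hypothetical.

    # Illustrative LQAS-style check of school-reported figures against validated counts.
    DISCREPANCY_CUTOFF = 0.10   # a school "fails" if it over-reports by more than 10%
    DECISION_THRESHOLD = 4      # assumed maximum failures allowed in a 20-school sample

    def over_reporting(reported: int, validated: int) -> float:
        """Relative over-reporting of the school-submitted figure versus the validated count."""
        return (reported - validated) / validated

    def lqas_pass(reported_counts, validated_counts) -> bool:
        """True if the number of failing schools in the sample is within the decision threshold."""
        failures = sum(
            1
            for reported, validated in zip(reported_counts, validated_counts)
            if over_reporting(reported, validated) > DISCREPANCY_CUTOFF
        )
        return failures <= DECISION_THRESHOLD

    # Hypothetical enrollment figures for a five-school subsample (not real project data).
    reported = [410, 355, 290, 512, 230]
    validated = [400, 350, 288, 420, 229]
    print(lqas_pass(reported, validated))  # True: only one school exceeds the 10% cutoff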

Feedback Sessions and Qualitative Assessments

During March 2016, the Kenya Big Data Activity Team conducted a series of feedback sessions and qualitative assessments with the school officials and the District and County Officers. The feedback sessions were designed to elicit feedback about how well the school report cards were received and used by schools. The qualitative assessments were designed to obtain more nuanced information about the Head Teachers' understanding of the EMIS form requirements and their adoption of and receptivity to the electronic forms and devices. The assessments were also designed to obtain information about the quality and use of school record keeping. The qualitative assessments also included feedback from County and District Officers. The key takeaways of the qualitative review of the assessments are presented as follows:

1. Many Head Teachers delegate the task of filling out the form to their Deputies, with some data quality assurance provided by the Head Teachers themselves.

2. Most District and County Officers assume that schools are misreporting, even when the validation data do not support that assertion.

3. The roles and responsibilities of Head Teachers and their Deputies for reporting and submitting data are not clearly understood or well defined.

4. Many fields in the EMIS form were completed and submitted with inconsistent decision criteria: school category, number of repeaters, teacher qualifications, and the number of textbooks by subject.

During the second half of FY 2016, the work in support of this activity was concluding.

During the third quarter of FY 2016, Mr. Rakusin, the day-to-day RTI Activity Leader, traveled to Nairobi, Kenya, to meet with the activity partners to prepare the draft version of the final activity report. Mr. Rakusin and the in-country Activity Consultant, Dr. Riechi, met with RTI's subcontractors (i.e., IBM Corp. and DDD) and MOEST counterparts to share preliminary findings of the activity and the plan for a policy dialogue meeting.

Although an additional meeting with MOEST personnel was expected for this work, given the project's timeline for conclusion and the availability of the MOEST Team, this meeting does not seem to be feasible. RTI staff submitted the draft version of the final activity report to USAID personnel for review and comment. USAID provided minor feedback to RTI in June and July 2016. RTI staff are currently finalizing the report.

State of Literacy in Selected Sub-Saharan African Countries

In an effort to review USAID's progress toward Goal 1 (improving reading skills for 100 million children by 2015), this desk study was requested to review student literacy assessments in sub-Saharan Africa. The desk study draws together information on which select countries in sub-Saharan Africa have and have not conducted early grade reading assessments, while also providing information regarding where literacy issues remain critical and where improvement is underway. Twenty countries were selected because of USAID's presence and continued interest in working to improve educational opportunities within these countries. The 20 countries are as follows: Benin, Côte d'Ivoire (Ivory Coast), Democratic Republic of the Congo (DRC), Djibouti, Ethiopia, Ghana, Kenya, Liberia, Malawi, Mali, Mozambique, Nigeria, Rwanda, Senegal, Somalia, South Africa, South Sudan, Tanzania, Uganda, and Zambia.

The review was conducted through a systematic review of international literature, databases, and organizational reports. A main output of this study is a set of matrices created to provide an easy-to-navigate overview of the relevant early grade reading assessments: one matrix with an overview of every Early Grade Reading Assessment (EGRA) that has been conducted across the 20 countries, and a second matrix with similar information for alternative assessments, such as Uwezo, national assessments, the Annual Status of Education Report (ASER), and the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ). Another example of an alternative assessment is the Analysis Programme of the CONFEMEN (La Conférence des Ministres de l'Éducation des pays ayant le français en partage, the Conference of Education Ministers of Countries Using French in Common) Education Systems (PASEC). A final matrix provides data from all EGRAs in the form of zero scores for oral reading fluency (ORF). While these scores are not intended to be directly comparable across countries, they can serve an illustrative purpose in understanding how countries are performing within each given study. The report drafted as part of this desk study supplements the matrices, comprehensively describing their contents and explaining the implications of the matrix information.
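
As an illustration of what the zero-score figures represent, the sketch below shows how a zero-score percentage for ORF might be computed from pupil-level results. The variable names and sample values are hypothetical; the matrices in the desk study report published zero-score percentages rather than recomputing them from raw data.

    # Minimal sketch: share of assessed pupils with an ORF score of zero correct words per minute.
    def zero_score_share(orf_scores) -> float:
        """Percentage of pupils who could not read a single word of the passage correctly."""
        zeros = sum(1 for cwpm in orf_scores if cwpm == 0)
        return 100.0 * zeros / len(orf_scores)

    sample_orf_cwpm = [0, 12, 0, 34, 5, 0, 21, 0, 0, 48]  # hypothetical pupil-level scores
    print(f"Zero scores: {zero_score_share(sample_orf_cwpm):.0f}%")  # 50%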

The desk study includes information from 159 early grade reading assessments conducted across 20 countries over the past 10 years. These assessments consistently provide evidence of relatively low levels of literacy and reading ability across countries. Additionally, the assessments highlight the importance of recognizing that there are large variations in scores across languages and across regions and districts within countries. A final finding of the desk study is that there is still a need for continued work on improving early grade reading within countries and a need for additional assessments (both nationally representative and across local languages, with the important reminder that all assessments should have planned strategic uses).

During FY 2016, the desk study was reviewed by USAID. During the third quarter of FY 2016, USAID provided minimal feedback, and revisions were quickly incorporated. The desk study was finalized on September 19, 2016. The final report is currently being translated into French; the translated version will be available by the end of the DERP in Africa contract.

Teacher Effectiveness in Selected Sub-Saharan African Countries

USAID personnel requested that RTI staff conduct desk research about teacher effectiveness in the sub-Saharan African context. The work represents an attempt to bridge the gap between teacher characteristics and student outcomes to yield a more holistic conceptualization of effective teaching, one that also emphasizes teachers' actual instructional practices and pedagogical moves. The report presents this research and draws on data and findings that are relevant to teacher effectiveness from international databases and assessments, but it also critically supplements these data with recent findings from donor-funded projects and evaluations that specifically attempt to observe teachers' classroom instruction. Additionally, the report reviews evidence from the educational plans and policies of 13 sub-Saharan African countries and shows that these plans and policies often focus on teacher characteristics, classroom inputs, professional guidelines, and (to some extent) teaching practices. The report also describes 12 barriers that hinder the ability to focus more on effective teaching (as opposed to effective teachers and successful teaching) in sub-Saharan African countries. The report concludes with policy recommendations and considerations.

The report was finalized in early FY 2016. Also during FY 2016, the report was translated into French and then submitted for USAID review on March 18, 2016. USAID personnel approved the report in May 2016. To help disseminate the findings, RTI staff prepared a briefer about the report. Also, as previously mentioned, a blog post about the report was published on the Global Partnership for Education's Web site.


Reading Materials Survey (Survey of Children's Reading Materials in African Languages in 11 Countries)

At USAID's request, RTI conducted a survey (inventory) of local language materials in selected sub-Saharan African countries. Originally titled the Learning Materials Survey, this activity has since been renamed the Reading Materials Survey. The main goals of the Reading Materials Survey are to

• Describe the availability of titles in local languages for the early primary grades, especially in terms of language and book type (textbooks or other reading materials);

• Evaluate the suitability of available titles for early grade children, in terms of pedagogical utility, reading level, cultural familiarity, and appropriateness of content. Appropriateness was indicated by the presence or absence of objectionable content and the proportional representation of gender, ethnicity, religion, and persons with disabilities; and

• Assess the feasibility of reusing, adapting, and reproducing available titles based on their copyright status and availability in digital form.

The Reading Materials Survey is a basis for future inventories and supports the development of a Global Reading Repository and Book Fund. The general topics for the survey instrument focused on the following (a hypothetical record structure illustrating how these fields might be captured appears after the list):

• Basic identifying information. The Data Collection Team gathered a variety of information about the materials, including titles, authors' names, publishers' names, publication dates, and prices.

• Physical characteristics and format. The team collected a variety of data, including each title's availability in hard copy, soft copy (digital file), or both; the number of pages; and the presence of illustrations (if any) in black and white or in color.

• Copyright. The survey instrument included questions for determining whether the free dissemination and reproduction of materials are permissible or whether special authorizations or rights negotiations are required. Additionally, the Data Collection Team recorded the presence of a Creative Commons license, if any, for each title.

• Language. The Data Collection Team noted for each title the language of publication as coded in the Ethnologue, the script, and the conformity of the text to the standardized orthography, when known.1

• Book type. The team categorized each title as textbook-related or non-textbook (supplementary) material. The textbook-related sub-types included students' textbooks, students' workbooks, teacher's manuals, and decodable readers. Regarding non-textbook (supplementary) materials, the team members noted information about the "genre" (narrative, informational, poetry, or reference) and "format" (e.g., leveled reader, Big Book).

1 The Ethnologue is a Web-based publication by SIL International that provides statistics on more than 7,000 languages around the world. The Ethnologue, in cooperation with the International Organization for Standardization, created an international standard for language codes. A three-letter code is assigned to every language or dialect for easy identification and classification (Lewis, M.P., Simons, G.F., & Fennig, C.D. (Eds.). (2015a). Ethnologue: Languages of the world (18th ed.). Retrieved from http://www.ethnologue.com).


• Basic indicators of content. For all titles except teacher's manuals and reference materials, the survey instrument collected information about the prevalent themes or topics in the content.

• Suitability for EGR instruction. The Data Collection Team assessed the potential suitability of a particular title for EGR instruction by examining the following main factors:

- Estimated reading level: In the case of textbooks, the team registered only the recommended grade level for each title, as stated on the cover or title page. In the case of supplementary materials, an approximate estimate of the reading level was based on the maximum number of words per page.

- Pedagogical utility: The Data Collection Team examined textbook-related titles for the types of activities that were included (e.g., phonics, vocabulary, grammar, comprehension).

- Content familiarity: The team evaluated the content of each title for its level of familiarity to the target audience based on a specific set of criteria.

- Content appropriateness: The team recorded whether potentially sensitive content (e.g., gore, sex, alcohol or drug use) appeared in each title.

- Equitable representation of gender, ethnicity, religion, and persons with a disability: The Data Collection Team used questions in the survey instrument to assess whether the contents of a particular title displayed bias or inequitable representation of groups in the population, such as gender, ethnicity, religion, and persons with a disability.
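As a rough illustration of how one record from such an inventory might be structured once these fields are captured, the Python sketch below defines a hypothetical data record. The field names, categories, and example values are assumptions for illustration only; they do not represent the actual survey instrument or its coding scheme.

```python
# Hypothetical record for one surveyed title; fields loosely mirror the survey
# topics described above (identifying information, format, copyright, language,
# book type, and suitability indicators). All values are invented examples.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SurveyedTitle:
    title: str
    author: str
    publisher: str
    publication_year: Optional[int]
    price: Optional[float]
    pages: int
    has_digital_copy: bool
    illustrations: str                 # "none", "black and white", or "color"
    copyright_permits_reuse: bool
    creative_commons_license: Optional[str]
    language_code: str                 # three-letter Ethnologue/ISO 639-3 code
    book_type: str                     # "textbook-related" or "supplementary"
    genre: Optional[str] = None        # e.g., "narrative", for supplementary titles
    max_words_per_page: Optional[int] = None  # rough proxy for reading level

example = SurveyedTitle(
    title="Example Storybook", author="A. Author", publisher="Local Press",
    publication_year=2014, price=2.50, pages=16, has_digital_copy=False,
    illustrations="color", copyright_permits_reuse=False,
    creative_commons_license=None, language_code="swh",
    book_type="supplementary", genre="narrative", max_words_per_page=25)

print(example.language_code, example.book_type)
```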

RTI shared an initial draft of the general report with USAID personnel in July 2015. In September 2015, USAID personnel provided feedback about the draft version of the general report. On October 16, 2015, a revised draft of the general report was submitted to USAID personnel for review and comment. Throughout the first two quarters of FY 2016, USAID personnel shared feedback about the individual country reports, which are annexes to the overall general report. Throughout the second quarter of FY 2016, RTI staff worked to update the general report of the Reading Materials Survey and the country reports. RTI staff are also preparing data sets to be released with the reports for stakeholder use.

In total, the survey collected data on nearly 6,000 titles across the 11 sub-Saharan African countries, as summarized in Table 4.

Table 4. Data Regarding the Titles Surveyed

[Table 4 reports the number and percentage share of surveyed titles in each of the 11 countries; individual country counts ranged from roughly 300 to just over 1,000 titles, and in total 5,919 titles (100%) were surveyed.]

As previously mentioned, findings from the Reading Materials Survey were presented at the CIES Conference in March 2016 in Vancouver, Canada. The presentation was part of the panel on Research Activities of the USAID Bureau for Africa-Education.

During the third quarter of FY 2016, updates to the general report and the country report annexes were finalized, and a revised draft was submitted to USAID personnel for review in early June. The report was finalized in August, and RTI staff provided hard copies of the files to USAID personnel. In addition, the report will be posted to the DEC and other sites as appropriate.

Africa Regional Education Workshop (AREW)

AREWs are opportunities for USAID Education Officers in sub-Saharan Africa to connect about a range of issues. The 2016 AREW focused on the following topics:

• Exchange lessons learned and best practices related to Goals 1, 2, and 3 (as set forth in the Education Strategy), including measurement parameters for Goals 1 and 3

• Develop a region-wide research agenda

• Address cross-cutting issues, including inclusive development

• Review progress in Africa toward accomplishing the goals of USAID's Education Strategy

• Discuss support for the post-2015 Development Agenda for the education sector in sub-Saharan Africa

• Introduce strategies for ending extreme poverty that can be integrated into planned education projects.

At the end of November 2015, USAID personnel reached out to RTI to confirm the availability of RTI staff to support preparations and of Allegro to support the logistics of an AREW in 2016. RTI and Allegro staff confirmed their availability to support this activity. The workshop was then set for May in Ethiopia. The event was held at the Hilton Addis Ababa Hotel in Addis Ababa, Ethiopia, from May 15 through 20, 2016. A group photograph of the participants who attended the 2016 AREW is presented in Figure 2.


Figure 2. A group photo of the participants attending the 2016 AREW

In support of sharing lessons learned and best practices at the event, RTI staff prepared presentations about DERP activities, as requested by USAID in the DERP in Africa Dissemination Plan feedback provided in January 2016. RTI staff prepared and submitted a total of seven presentations to USAID on March 24, 2016. The presentation topics were the EGR Barometer, Teacher Effectiveness, the Guide for Gender Equality and Inclusiveness, the Guidance Document on Language of Instruction, the Conceptual Framework, State of Literacy in Selected Sub-Saharan African Countries, and the EGR Program Scale-up and Sustainability Tool.

As preparations for the AREW were underway, the agenda evolved, with trade-offs made between topics and timing and the schedule conflicts of some presenters. When finalized, each day focused on a theme. The final 2016 AREW agenda is presented as Table 5.

Table 5. Final Agenda for the 2016 AREW

Day 1-Sunday, May 15: "Where We Are"
Estimated Time | Session | Presenter(s)
8:30-8:45 a.m. | Welcoming Remarks | L. Garden
8:45-10:30 a.m. | Introductions, Agenda Overview, Logistics, Handouts | B. Brocker
10:30-11:00 a.m. | Break |
11:00 a.m.-12:00 p.m. | Session 1: Education Strategy and Economic Growth, Education, and Environment (E3) Update | E. Rodriguez-Perez
12:00-1:15 p.m. | Lunch |
1:15-1:45 p.m. | Session 2: Progress Update by the Numbers | B. Sylla
1:45-2:45 p.m. | Session 3: Professional Development | E. Rodriguez-Perez
2:45-3:15 p.m. | Break |
 | Session 4 | J. Hanson Swanson

Day 2-Monday, May 16
Estimated Time | Session | Presenter(s)
8:30-8:45 a.m. | Review and Day 2 Opening | B. Brocker, D. Weller
8:45-9:00 a.m. | Session 5: Education in Crisis and Conflict Introduction | N. Papadopoulos
9:00-9:45 a.m. | Session 6: Education in Crisis and Conflict Strategic Framework | N. Papadopoulos
9:45-10:15 a.m. | Break |
10:15 a.m.-12:15 p.m. | Session 7: Design Relevant, Data-Driven, and Evidence-Based Results Frameworks and Theories of Change for Education in Crisis and Conflict | N. Weisenhorn
12:15-1:15 p.m. | Lunch |
1:15-2:15 p.m. | Session 8: Accelerated Education | N. Papadopoulos
2:15-3:30 p.m. | Session 9: Rapid Education Risk Analysis | N. Weisenhorn and M. Pucilowski
3:30-4:00 p.m. | Break |
4:00-4:45 p.m. | Session 10: "Speed Dating"-Systems Strengthening, Safer Learning Environments, Teacher Incentives, and Construction | N. Papadopoulos, J. Hanson Swanson, R. Adams, and N. Weisenhorn

Day 3-Tuesday, May 17
Estimated Time | Session | Presenter(s)
8:30-8:45 a.m. | Review and Day 3 Opening | B. Brocker
8:45-10:30 a.m. | Session 11: Putting Education to Work | N. Taggart
10:30-11:00 a.m. | Break |
11:00 a.m.-12:15 p.m. | Session 12: Pipeline Management | R. Adams and B. Brocker
12:15-1:15 p.m. | Lunch |
1:15-2:15 p.m. | Session 13: Counting Methodology-Current and Future | B. Sylla
2:15-3:15 p.m. | Session 14: Association for the Development of Education in Africa (ADEA) and Stakeholder Engagement | O. Dibba-Wadda (guest presenter) and L. Garden
3:15-3:45 p.m. | Break and End of Day |

Day 4-Wednesday, May 18: "All Children Reading"
Estimated Time | Session | Presenter(s)
8:30-8:45 a.m. | Review and Day 4 Opening | B. Brocker
8:45-9:00 a.m. | Session 15: Introduction and Overview | P. Bender and M. Davidson
9:00-9:45 a.m. | Session 16: Summing Up and Looking Forward-The 2011-2015 Strategy in Numbers | B. Sylla
9:45-10:45 a.m. | Session 17: Telling the Story-2011-2015 Lessons Learned | M. Davidson; Mission Panel: Ghana and Uganda (names to be announced)
10:45-11:00 a.m. | Break |
11:00-11:30 a.m. | Session 18: Telling Your Story Exercise-Lessons Learned | M. Davidson
11:30 a.m.-12:30 p.m. | Session 19: World Cafe-Social and Behavior Change Communication, Global Book Fund, and Lot Quality Assurance Sampling | P. Bender
12:30-1:30 p.m. | Lunch |
1:30-2:15 p.m. | Session 20: Increasing Impact-Key Features of Effective Interventions | M. Davidson
2:15-3:15 p.m. | Session 21: Integrated Programming-Reading in Conflict and Crisis-Affected Environments | M. Davidson and N. Weisenhorn; Mission Panel: South Sudan and Somalia (Christine Djondo and Abdulghani Sheikh Hassan)
3:15-3:45 p.m. | Break |
3:45-4:15 p.m. | Session 22: Working with Governments and Other Key Issues in Scaling and Sustainability | P. Bender and N. Weisenhorn; Mission Panel: Ethiopia, Kenya, Tanzania (names to be announced)
4:15-4:45 p.m. | Session 23: Questions and Discussion | All Children Reading Team
4:45-5:00 p.m. | Wrap-Up Day 4 | Not applicable

Day 5-Thursday, May 19
Estimated Time | Session | Presenter(s)
9:00-9:15 a.m. | Session 24: Site Visit Stage Setting | B. Brocker
 | Break and Board Buses |
9:15 a.m.-1:15 p.m. | Session 25: Site Visits |
1:15-2:15 p.m. | Lunch |
2:15-3:30 p.m. | Session 26: Site Visit Debrief |
3:30-4:00 p.m. | Break and End of Day |

Day 6-Friday, May 20: "What's Next?"
Estimated Time | Session | Presenter(s)
8:30-9:30 a.m. | Session 27: Evaluation | M. Pucilowski
9:30-10:30 a.m. | Session 28: Africa Research Agenda | J. Hanson Swanson
10:30-11:00 a.m. | Break |
11:00 a.m.-12:00 p.m. | Session 29: Conference Evaluation and Closing Remarks | L. Garden
12:00-1:00 p.m. | Lunch |

The AREW was well attended and well received by participants, who provided feedback during each day of the event. During Day 1 of the AREW, Ms. Evelyn Rodriguez-Perez requested input into the draft Education Strategy. On Day 2 of the AREW, the USAID/Ethiopia Mission Director, Mr. Dennis Weller, and a representative from the Ethiopian MOE opened the day. The schedule for the remainder of the day shifted by approximately 45 minutes, with Session 7 shortened to accommodate the time provided to the guest speakers. Also on Day 2 of the AREW, the Education in Crisis and Conflict (EiCC) Team requested that Session 8, Accelerated Education, be videotaped because a representative from the Ethiopia MOE was added to the agenda. Allegro personnel contracted with a professional videographer, who supplied 10 DVDs as well as the raw footage of the videotape. The EiCC Team will use these materials for future training sessions. On Day 4 of the AREW, the optional evening session about flexible instruments was held directly after the conference day ended. Approximately 22 people attended the optional session. Day 5 of the AREW included a site visit. For the event, three buses were rented; between 12 and 20 attendees rode on each bus. Personnel from USAID/Ethiopia arranged for visits to two sites for each bus trip. Trip #1 was to two schools: one primary school with an inclusive education program in place and one without; Trip #2 was to two primary schools; and Trip #3 was to a primary school and a teacher training institute or university.

Table 6 lists the participants who attended the AREW. The participants represented 17 Africa missions in addition to Washington-based education staff in the Bureau for Africa and the Bureau for Economic Growth, Education, and Environment (E3).

Table 6. List of Participants Attending the AREW

Bureau for Africa: Garden, Loretta; Lezhnev, Megnote; Oleksy Ojikutu, Alexandria; Pucilowski, Mateusz; Hanson Swanson, Julie; Davidson, Marcia; Papadopoulos, Nina; Pogue, Barton
Bureau for Economic Growth, Education, and Environment (E3): Risley, Heather; Rodriguez-Perez, Evelyn; Sylla, Ben; Taggart, Nancy; Weisenhorn, Nina
Democratic Republic of the Congo: Breslar, Zoey; Tshimanga, Pascal
Ethiopia: Bonnenfant, Marc; Kelemework, Tesfaye; Manallew, Berhanu; Mclaughlin, Martin; Melesse, Yadesa; Wigzaw, Addis; Zegeye, Frehiwot; Zewdie, Tadele
Ghana: Adzei, Richard; Crites, Sarah; Jehanfo, Adama; Napari, Paul
Kenya: Gangla, Lilian; O'Toole, Denise
Liberia: Brown, Simone; Nyumah, Mardea; Phelps, Malcom
Malawi: Sosola, Ramsey
Mali: Tall, Aliou; Traore, Amadou Ousma
Mozambique: Cossa, Celia
Nigeria: Harris-Hussein, Croshelle; Reja, Ahmed; Samuel, Moses Olawale
Rwanda: Maloney, Kate; Rurangirwa, David
Senegal: Ba, Sala; Diop, Malick; Ndiaye, Amadou Lamine
Somalia: Adan, Fatuma; Hassan-Sheik, Abdulghani
South Africa: Fox, Meredith; Vorster, Carien
South Sudan: Djondo, Christine; Wani, Daniel
Tanzania: Bruns, David; Nsanzugwanko, Abbas Thomas
Uganda: Blanton, Douglas; Mayanja, Sarah Barbara
Zambia: Mweene, Beatrice; Naluvwi-Chomba, Yvonne; Young, Iris

Daily feedback opportunities were available so that attendees could provide comments about the event. Generally, the feedback was very positive. The following excerpts are some of the comments from AREW attendees:

• "Schedule was intense but not overloaded- thanks for long breaks for networking! " (Day 1)

• "Nothing to change: Very good day with a lot of interactive methods." (Day 2)

• "Loved the varied sessions designs, and that everyone is keeping it fun in addition to very relevant and useful." (Day 2)

• "A tremendously enjoyable, enriching, and educational workshop. Very well done! " (Day 6)

The following comments are also from attendees who provided some additional feedback about possible improvements to future AREWs:

• "Would be helpful to have more detailed examples or suggestions of how education teams can support design of WFD [workforce development] activities managed by other technical sectors." (Day 3)

• "f would reduce the number of sessions and give more time to the sessions presented. Today was a blur at the end of the day because there was so much to absorb. Also summaries or a recap at the end of the day to solidify what was learned." (Day 4)

Table 7 presents the key items to change and to keep regarding the AREW.


Table 7. Items to Change and Keep for AREW

Items to change:
• Some presentations were rushed
• Sunday start date (mixed reactions)
• Add YALI (President's Young African Leaders Initiative) to agenda
• Global Partnership for Education session (add to conference schedule within regular hours)
• More time on presentations
• Hold it in a country with less visa restrictions
• The time weighting for goals should be re-evaluated
• Expand sharing of lessons learned time
• Differing opinions on whether implementing partners should participate in future workshops
• Pre- and post-training opportunities

Items to keep:
• Flow and networking time
• Site visits
• Barbara Brocker
• Time (not rushed)
• Sharing experiences across countries
• Multiple speakers
• Build on prior training
• Ben (Sylla) and Mateusz (Pucilowski) - both subjects they covered were good!
• No folders (everything on Web)
• Flexible procurement session
• Room (for talking and networking)
• Participation by MOE (in local area)
• Pipeline management
• Consultation opportunities
• Balance of practical and strategic
• EiCC (pre- and post-tests)
• Cultural dinner
• Dealing with challenges together
• Gender inclusion (integrate into all our discussions)
• E3/Bureau for Africa collaboration
• Learning opportunity for E3/Office of Education

After the workshop, Allegro staff posted all session materials to the AREW Web portal and distributed the link to participants. There are no further follow-up activities for the AREW because the event has concluded.

Result 2 Next Steps

During the final stages of the DERP in Africa TO, RTI staff will

• Continue with and finalize dissemination activities under the DERP in Africa Dissemination Plan, including conducting the Webinar and preparing the "how-to" videos for the Guide for Gender Equality and Inclusiveness, and complete the remaining presentations to the Education Sector Council for the EGR Barometer

• Finalize the Kenya Big Data Activity

• Finalize additional briefers (i.e., for the Reading Materials Survey and the Kenya Big Data Activity) for dissemination.


Result 3: Measurement Tool with Applicability Across Countries Developed

Early Grade Reading Scale-Up and Sustainability Planning Tool

Under the DERP in Africa project, a tool was developed and is being refined to assist key stakeholders, including USAID's Education Officers, MOE staff, and implementing partners, in gaining a better understanding of EGR scale-up and sustainability. The tool helps stakeholders learn what it takes to successfully scale up and sustain an evidence-based EGR program, what it might cost to take such a program to scale, and what it might cost the host country government to sustain such a program once it has been taken to scale. The five steps (or modules) of the measurement tool are described as follows:

• Step 1: Introduction and Overarching Issues. Step 1 is informational and instructional. This step provides users with an overview of the entire tool, what each step does, how the steps relate to each other, how the tool works, and other relevant information.

• Step 2: The EGR Intervention. Step 2 focuses on the intervention itself (the evidence-based EGR program that is being taken to scale). This step requires users to provide detailed information about the evidence-based EGR program. Step 2 also asks for all of the costs associated with each element of the EGR program.

• Step 3: Institutional and Systemic Requirements. Step 3 helps users identify the various institutional and systemic reforms that may have to be implemented as part of the overall scale-up and sustainability effort. The purpose of Step 3 is to ensure that users are aware of all of the activities that must be conducted if sustainability is to happen. Step 3 also provides users with information that they can use during discussions with MOE staff to help develop a meaningful 5- or 10-year EGR implementation plan that addresses these reforms.

• Step 4: Enrollment, Teachers, and Coaches and/or Trainers. To estimate the cost of scaling up and sustaining the EGR program, some very basic data are needed, such as enrollment figures; the number of teachers, coaches, and/or trainers; and some indication of how enrollment might change over the course of the next 10 years.

• Step 5: Output. The measurement tool will generate a lot of output, and Step 5 generates whichever type of output users want. Specifically, this step presents users with many output options, such as the total cost of scale-up, the number of teachers who must be trained, and the number of materials that must be produced in a given year. This step will also present a graphic to illustrate how costs are incurred over time (a simplified illustration of this cost arithmetic follows the list).
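The kind of projection the tool produces can be illustrated with the minimal Python sketch below. The per-unit costs, pupil-teacher ratio, growth rate, and time horizon are invented purely for illustration; the actual tool collects far more detailed inputs (Steps 2 through 4) and produces richer output (Step 5).

```python
# Highly simplified illustration of a scale-up projection: given enrollment,
# a pupil-teacher ratio, per-unit costs, and an assumed growth rate, estimate
# yearly teachers to train, materials to provide, and cost. All parameters
# below are hypothetical and do not come from the tool itself.

def project_scale_up(enrollment, pupil_teacher_ratio, cost_per_teacher_trained,
                     cost_per_pupil_materials, annual_growth_rate, years):
    total_cost = 0.0
    for year in range(1, years + 1):
        teachers = enrollment / pupil_teacher_ratio
        year_cost = (teachers * cost_per_teacher_trained
                     + enrollment * cost_per_pupil_materials)
        total_cost += year_cost
        print(f"Year {year}: ~{teachers:,.0f} teachers to train, "
              f"{enrollment:,.0f} pupils' materials, cost ${year_cost:,.0f}")
        enrollment *= (1 + annual_growth_rate)  # Step 4-style enrollment change
    print(f"Estimated {years}-year cost: ${total_cost:,.0f}")

project_scale_up(enrollment=500_000, pupil_teacher_ratio=45,
                 cost_per_teacher_trained=60.0, cost_per_pupil_materials=2.5,
                 annual_growth_rate=0.03, years=5)
```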

Work on the tool in FY 2016 proceeded after feedback from the pilot field test in Malawi in September 2015. The majority of the feedback was positive, though users recommended simplifying the tool where possible. In addition, there were recommendations to make transitions, both forward and backward, easier, thereby allowing users to jump between steps. There were also recommendations to incorporate definitions and additional flexibility for rolling out an EGR program. The piloting effort identified the need for a strong manual to accompany the tool to facilitate a user's experience.

Early in FY 2016, a draft manual for the tool was in process. While the document was being prepared, additional programming for the EGR Program Scale-up and Sustainability Tool was underway, and a beta version was made available for internal RTI review. The review was conducted by several senior technical staff with significant experience on EGR projects, as well as project operations team members who have a keen understanding of EGR project finances. RTI staff also prepared a draft guide to assist users with how to properly use the tool. RTI staff refined the guide based on feedback.

In May 2016, staff from USAID and RTI met about the status of the tool to share progress and determine next steps. During the discussion, USAID personnel expressed a desire for the tool to be self-contained so that users could teach themselves how to use it. The tool as it was prepared would require training support, so RTI proposed a light or streamlined version of the tool, with fewer inputs required (Figure 3). RTI staff began working on the lighter version of the tool, and this work is ongoing. Throughout the final quarter of FY 2016, RTI staff continued to refine the tool based on USAID's feedback and additional internal reviews. An updated version of the tool is expected to be ready in the final months of the DERP in Africa contract. Once approved by USAID and finalized, the tool will be downloadable and posted to the DEC and other appropriate sites (e.g., eddataglobal.org, follow-on sites) to ensure that the tool is widely available.


Figure 3. A screen capture of the Scale-Up and Sustainability Tool beta version presented to USAID in May 2016.

Guide for Gender Equality and Inclusiveness

During FY 2013, at USAID's request, RTI staff began working on a guide and evaluation tool to promote gender equality and inclusiveness in teaching and learning materials. Work to prepare for the development of the guide began with a literature review. The development of the Guide for Gender Equality and Inclusiveness followed soon after the completion of the literature review. Target audiences for the Guide for Gender Equality and Inclusiveness include the following:

• USAID mission personnel tasked with project design

• Local ministry staff in countries supported by USAID's Bureau for Africa who are responsible for developing policy and teaching and learning materials

• Contractors, local publishers, and organizations responsible for developing teaching and learning materials

• Monitoring and evaluation specialists tasked with assessing ongoing and final achievement and performance of project indicators related to equality.

During the first quarter of FY 2016, after several rounds of revisions recommended by USAID and by peers and colleagues from across institutions (including the Special Olympics and UNICEF), the Agency approved a final version of the Guide for Gender Equality and Inclusiveness.

With the Guide for Gender Equality and Inclusiveness finalized, efforts shifted to dissemination of the document. To support broad dissemination of the Guide for Gender Equality and Inclusiveness, the document was translated into French. The French translation was conducted and finalized in FY 2016.

In addition, a total of 23 attendees participated in a workshop at RTI's offices in Washington, DC, on September 22. The agenda for the workshop is presented as Table 8.

Table 8. Workshop Agenda on the Guide for Promoting Gender Equality and Inclusiveness

Time | Duration | Activity
1:00-1:15 p.m. | 15 minutes | Welcome, Introduction, and Background About the Guide
1:15-2:00 p.m. | 45 minutes | Review How to Use the Guide, Including Details About Each Theme
2:00-2:20 p.m. | 20 minutes | Explanation of Worksheets
2:20-2:50 p.m. | 30 minutes | Group Review of Examples Using the Guide
2:50-3:00 p.m. | 10 minutes | Break
3:00-4:00 p.m. | 1 hour | Time for Individual and Small Groups to Practice Using the Guide
4:00-4:30 p.m. | 30 minutes | Review of Questions and Discussion
4:30-5:30 p.m. | 1 hour | Reception (with Refreshments) for Attendees

Open-source materials from the African Storybook Project and from other publicly available U.S. publications were shared and used as examples so the participants could review the materials and practice using the Guide for Gender Equality and Inclusiveness during the workshop.

The feedback from participants in the workshop was very positive. Some attendees provided feedback on the Guide for Gender Equality and Inclusiveness and the approach that was used. Some attendees commented that they wanted to review the materials more holistically, others wanted to foster discussion about what the tick mark calculations in the worksheet mean and how to address issues, and others were interested in considering the broader messaging of the story and contextual appropriateness. Some attendees also commented that it is important to have a diverse group of people review the materials because this ensures a more thorough evaluation from different viewpoints and potentially different stakeholders.

In addition, some of the participants commented that the mathematics section of the rubric was challenging. Questions also arose about how the Guide for Gender Equality and Inclusiveness fits into the broader cycle of developing and revising teaching and learning materials and how to weave the document into other reviews that include items such as font size and formatting issues.
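To make the general idea of a tick-mark tally concrete, the Python sketch below counts illustrative representation "ticks" across a title's pages and flags a large imbalance for reviewer discussion. The categories, data, and threshold are invented for illustration and are not the Guide's actual worksheet, categories, or scoring rules.

```python
# Hypothetical tick-mark tally: count how often each group appears in active
# roles across a title's pages and flag an imbalance for reviewer discussion.
# Categories, per-page counts, and the 40% threshold are all invented.

page_ticks = [
    {"female_active": 1, "male_active": 2},
    {"female_active": 0, "male_active": 3},
    {"female_active": 2, "male_active": 2},
]

female = sum(page["female_active"] for page in page_ticks)
male = sum(page["male_active"] for page in page_ticks)
total = female + male
female_share = female / total if total else 0.0

print(f"Female active roles: {female}/{total} ({female_share:.0%})")
if female_share < 0.4:
    print("Flag for discussion: representation appears imbalanced.")
```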

Another topic arose as a result of the workshop: an RTI staff member saw the poster of the Guide for Gender Equality and Inclusiveness and was concerned about the depiction of the girl's hair on the cover, interpreting the illustration as negatively stereotypical. RTI staff raised the concern with USAID and mentioned the possibility of revising the illustration, but USAID staff determined that no further changes were needed.

In the final months of the project, additional dissemination efforts are planned: a webinar will occur in late October 2016, and short how-to videos, which have been discussed during FY 2016, will be completed and posted online.

School-Related Gender-Based Violence Activity

USAID's Bureau for Africa launched the Opportunities for Achievement and Safety in Schools (OASIS) program during the May 12, 2014, event titled Seminar on the Intersection of Safe Learning Environments and Educational Achievement: A Dialogue for Action. Ms. Julie Hanson Swanson and Dr. Katharina Anton-Erxleben, both of USAID, presented OASIS, which builds on recent research that suggests a linkage between unsafe learning environments and low educational achievement. OASIS aims to conduct needed and relevant studies to investigate this linkage further and to disseminate findings to raise awareness and advocate for improved school safety and reduced incidence of SRGBV. In support of OASIS, and to fill a gap in the research knowledge regarding the linkage between SRGBV and educational outcomes (including enrollment and completion data), particularly for developing country contexts, USAID personnel requested a literature review and the development of a Conceptual Framework for Measuring School-Related Gender-Based Violence (the Conceptual Framework).

The main focus of the work is on bullying, corporal punishment, sexual harassment and sexual violence, and physical and psychological intimidation. This information will be used to guide program monitoring, impact evaluations, and research about SRGBV, including, but not limited to, studies of the linkage between SRGBV and learning outcomes (and enrollment and completion data). The information will also be used to gather data on occurrences of such violence and the underlying attitudes related to each form of violence or abuse.

The work on the SRGBV activity began with a critical review of existing methodologies that are used to measure SRGBV, based on literature from different disciplines (e.g., education, public health, sociology, psychology, gender, and development studies). The literature review is almost final and is expected to be approved in the final months of the DERP in Africa contract.

The literature review was followed by the development of the Conceptual Framework, along with implementation and analysis guidance with a set of questions and core indicators for each of the main aspects of SRGBV. USAID personnel provided feedback about the draft Conceptual Framework, and RTI staff made further revisions as requested. During the first quarter of FY 2016, USAID personnel provided additional feedback about the document, and RTI made the requested revisions, which included adding scenarios and revising the text. Additional information was also added about the Institutional Review Board, an ethics committee designated to review, approve, and monitor research involving human subjects. RTI staff submitted the revised version to USAID in November 2015.

USAID personnel sent some additional changes to RTI staff during the second quarter of FY 2016. USAID personnel asked that RTI staff not make further changes to the document until additional comments from the Global Working Group were available. RTI staff received the additional comments later in FY 2016. Additional feedback from the piloting efforts in Uganda has helped to shape some of the questions, as well as additional guidance to be included about the confidentiality and consent of participants. Staff from RTI and USAID have been making revisions based on the feedback; some additional changes are ongoing, but these will be completed during the final months of the DERP in Africa contract.

EGR Barometer

The use of EGRA and other early grade reading assessments is growing, and there is a developing need for tools to help users interpret and better leverage the data collected through the assessments. How best to draw value from the data generated by an EGRA depends on the way in which the assessment is used and on whether a country is conducting a baseline, evaluating a pilot, or preparing to launch a national reading program. Some studies are limited in scope (selected regions); others are national in scope but are designed to provide a general baseline measure of outcomes. Other studies are conducted to measure improvement by comparing pre- and post-intervention performance in treatment and control schools.
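As a generic illustration of the last design mentioned above, the Python sketch below computes a simple difference-in-differences from hypothetical mean ORF scores in treatment and control schools. The numbers are invented, and real EGRA impact analyses additionally account for sampling weights, clustering, and covariates.

```python
# Hypothetical means of oral reading fluency (correct words per minute, cwpm).
# Difference-in-differences: (treatment change) minus (control change).
means = {
    "treatment": {"baseline": 10.2, "endline": 18.7},
    "control":   {"baseline": 10.5, "endline": 13.1},
}

treatment_gain = means["treatment"]["endline"] - means["treatment"]["baseline"]
control_gain = means["control"]["endline"] - means["control"]["baseline"]
impact_estimate = treatment_gain - control_gain

print(f"Treatment gain: {treatment_gain:.1f} cwpm")
print(f"Control gain:   {control_gain:.1f} cwpm")
print(f"Simple difference-in-differences estimate: {impact_estimate:.1f} cwpm")
```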

Through the DEP-AME project, RTI staff supported the development of the EGR Barometer, which was introduced and demonstrated at the 2013 Education Summit in Washington, DC, and at the 2014 Middle East and Northern Africa All Children Reading Workshop in Morocco. The EGR Barometer (available at www.earlygradereadingbarometer.org) is an interactive tool that offers access to a wealth of EGRA data.

The data tool has six key domains that are used for viewing and analyzing EGRA data: snapshots, results, relationships, benchmarks, comparisons, and intervention. The tool helps USAID field missions and their in-country counterparts make better use of EGR data.

The DERP in Africa project was asked to support the expansion of the EGR Barometer's capabilities and the addition of sub-Saharan African data sets to the Barometer. To support additional capabilities for the EGR Barometer, DERP in Africa supported the development of offline functionality, because in many areas of the world where users would want to access the Barometer, there is a lack of reliable, consistent, and fast Internet access.


Ultimately, the EGR Barometer will be able to function as a stand-alone, offline application. Such an application will enable USAID and national staff to demonstrate the application in a wide range of settings, regardless of the availability of Internet access.

Additionally, leveraging data sets from the sub-Saharan Africa context has expanded the trend analysis capability. The incorporated Africa data sets will not be launched externally until USAID's Bureau for Asia piloting effort and official launch have concluded.

During FY 2016, data sets from the following countries were added to the EGR Barometer: Ghana, Liberia, Malawi (2010 and 2012); Nigeria (Bauchi, Jigawa, Kaduna, Kano, Katsina, and Sokoto States); and Tanzania and Zambia. Prior to the release of each data set, staff from USAID and RTI conducted an internal review. Figure 4 is a screen capture of the main page for Zambia data in the EGR Barometer.

Figure 4. Homepage for Zambia data (2015, Grade 2) on the EGR Barometer.

Usage statistics are available for the EGR Barometer, including both countries from the AME region and sub-Saharan Africa. Figure 5 provides the number of new and total accounts by month for FY 2016. The large jump in September 2016 is tied to dissemination efforts about the EGR Barometer that occurred on or around International Literacy Day.

[Figure 5 chart: total accounts grew from the mid-300s early in FY 2016 to 603 by September 2016.]

Figure 5. Number of new accounts and total accounts by month.

Figure 6 is a map of user sessions globally, by country; the accompanying table highlights the data for September 2016. The map shows the global usage of, and interest in, the EGR Barometer.


Figure 6. Number of user sessions on the EGR Barometer's Web site, by location, with table of top locations, in September 2016.

Country | Sessions
United States | 508
Nepal | 25
Kenya | 16
Mozambique | 14
Philippines | 13
United Kingdom | 8
France | 7
Argentina | 6
Nigeria | 6
Guatemala | 4

In support of dissemination of the EGR Barometer, RTI staff prepared tweets for USAID personnel to share through their Twitter handles. USAID personnel shared the tweets on or around International Literacy Day. These communications helped to boost traffic to the Barometer's Web site.

In the final months of the DERP in Africa contract, RTI staff plan to finalize two new reports, Trends and Development Goals. RTI staff are also working to release additional data sets before the end of the DERP in Africa contract, with data to be added from the Uganda School Health and Reading Program and from EdData II activities in Mali and the Democratic Republic of the Congo during 2015. Unfortunately, although the data set from the Kenya Primary Math and Reading (PRIMR) Initiative was planned for inclusion, this will not be feasible by the end of the DERP contract. While the Kenya PRIMR data have been processed and statistical output has been created, additional support is needed to reconcile apparent discrepancies between the sample of schools and students used for analysis by the PRIMR project and the sample included in the data set that the Barometer team is using. Out of concern for ensuring the quality and accuracy of what is presented in the Barometer, additional work is needed before the Kenya PRIMR data can be posted, and there is insufficient time left in the contract to finalize this work during the DERP in Africa project.

Result 3 Next Steps

During the final months of the DERP in Africa project, RTI staff will

• Continue to make revisions and finalize the lighter version of the Scale-Up and Sustainability Tool

• Finalize dissemination efforts for the Guide for Gender Equality and Inclusiveness, including preparing how-to videos and supporting a Webinar in October

• Support requested revisions of the SRGBV deliverables based on feedback from USAID personnel

• Continue to expand the countries included in, and the capabilities of, the EGR Barometer and support dissemination efforts through the end of the DERP in Africa contract.


Annex A. Financial Summary

Quarterly Financial Statement, Task Order 19 (Sixteenth Quarter)
Recipient: RTI International
Data for Education Research and Programming (DERP) in Africa

Task Order Number: Prime Contract AID-OAA-BC-12-00004

Performance Period: October 1, 2012-November 30, 2016

Quarterly Financial Report as of September 30, 2016


Annex B. Time Line of Events in FY 2016

Activity Time Line (monthly columns: October 2015 through September 2016)

Result 1: Africa Mission Strategy-Related Data Needs Met

Saharan African Countries

Data Revolution for Development Activity

Preparations for/Submission of Draft Final Report
Preparations for Policy Dialogue in Kenya


DERP in Africa briefers and Blog posts

Reading Materials Survey

AREW 2016

Guide on Gender Equality and Inclusiveness


EGR Program Scale-Up and Sustainability Tool

SRGBV Conceptual Framework

EGR Barometer

Development of functionality and equating of data sets for further data set releases


United States Agency for International Development

Bureau for Africa Africa Education Division

1300 Pennsylvania Avenue, N.W. Washington, DC 20523

www.USAID.gov