ICT4E Evidence Starter Kit

There is a growing body of evidence on ICT4E interventions from around the world. The mEducation Alliance Resource page provides several documents that catalog the ICT4E literature, but it can be daunting for researchers, practitioners, and donors looking to gain a basic understanding of the sector to know where to start. This ICT4E Evidence Starter page provides a few key studies and other helpful links for learning more about the evidence of impact of ICT4E.

Do you have a study that you think should be on the list? Make sure to send us a link through the Contact page.

Key Studies

The following studies were chosen to illustrate the breadth and depth of ICT4E research. This list is by no means representative; rather, it aims to give a sense of the rigor applied to ICT4E evaluations. The studies are ordered by accessibility and rigor to ease readers into the nature of ICT4E evaluations.

This evaluation employs a Randomized Control Trial design to assess the impact of an education program in Niger that provides mobile phones as a tool to promote adult education. The key finding is that students in Project ABC villages had higher test scores on general education and math exams. Pages 18-24 explain the mechanisms behind why Project ABC scores may have been higher. This study is a great place to start for understanding how simple information technology can be harnessed to improve educational outcomes among rural populations.

This evaluation uses a Quasi-Experimental Design, matching students in Bridge Academy’s Kenya EGRA/EGMA programme and comparing them to students not in the program. Bridge Academy combines multiple mechanisms to deliver education to students, such as wireless delivery of lessons. A summary of findings starts on page 20. The key finding is that Bridge students obtain almost 32% more schooling a year and score higher on math and reading exams than non-Bridge public school students (Bridge Academies, 2015).

This evaluation uses a Quasi-Experimental Design to evaluate a joint program between Nokia and the Pearson Foundation that aims to use mobile technology to reach previously inaccessible classrooms in the developing world and to provide teachers in these schools with contemporary and locally relevant teaching materials. The main finding is that BridgeIT was consistently correlated with student learning gains in science, math, and English, as well as with improved teaching quality, when compared with control schools.

This Randomized Control Trial evaluation highlights the importance of rigorously assessing ICT projects to see if they actually improve outcomes. The authors study the One Laptop Per Child program in rural Peru. The key finding is that the program had no impact on test scores or enrollment. Graphs showing uptake and technology competence start on page 36.

This Non-Experimental study reviews programs that provide basic computer training for people with disabilities and at-risk youth. The key finding is that the ICT program increased confidence and reduced anxiety about technology among participants, and improved their employment opportunities. Page 53 of the report graphically maps out the relationship between the program and beneficiary perceptions.

Evidence Reviews

These reports synthesize ICT4E research and provide an overview of the types of interventions being studied. They provide a good starting place for anyone looking to gain a broad view of the current state of ICT4E programs and their impact.

The Education Endowment Foundation reviewed 48 studies to assess the evidence on using technology in the classroom. Unlike other meta-analyses, the report and its companion website provide a summary of the evidence as well as recommendations for practitioners thinking about incorporating technology into their education interventions. Recommendations include identifying and clarifying the rationale for using ICT4E, understanding how it will support existing structures and learning, adequately training teachers, and understanding what teachers will have to stop doing as a result of the intervention. The EEF website provides a full and easily searchable list of references, and it is one of the few resources that presses readers to consider data security concerns.

This Excel-based database created by the DfID funded HEART consortium provides a broad overview of papers on educational technology.  The sources in this database are largely observational, but there are also links to a handful of experimental studies.  The downloadable database allows for sorting and searching based on geographical region and study methodology.  This is a great resource for narrowing your initial literature search.

Findings from this study, based on 31 of the 77 reports reviewed, show that gains from computers and other technology are smaller than those found for U.S. domestic applications, but larger than those of the other types of interventions studied. The review of technology treatments starts on page 354. Readers should note that this review's definition of technology is relatively broad, but it still serves as a good starting point for finding rigorous studies.

This paper reviews six systematic reviews and meta-analyses, including the McEwan study. This study notes that McEwan’s overly broad use of “computers and technology” is less useful for understanding what works. Instead, researchers should narrow their search for evidence within an intervention approach, such as ICT, to see what best suits a particular context and need. Key findings start on page 12.

Resources for Planning the Design and Evaluation of ICT4E Projects

These links provide information for policy makers, implementers, and researchers planning ICT4E interventions. The documents below provide insight into what to consider for running and evaluating ICT4E programs.

A follow-up to a 2011 guide targeted toward a domestic US audience, this report's guidelines for designing rigorous evaluations of ICT4E interventions, implementing those evaluations, and reporting results are useful for international practitioners as well. Section 3, on handling personally identifiable information, is a great addition to the guidance literature and provides actionable information for readers considering or currently implementing an education evaluation.

This guide was created by USAID to guide ICT implementation and design. Page 44 suggests potential impact indicators, and the document provides overall suggestions for understanding how best to implement and evaluate ICT programs.

This PowerPoint presentation examines the ICT4E evidence base and provides helpful questions for scoping evaluation questions. Slides 18 through 24 are helpful starting points for planning your ICT4E evaluation.

Developed by the Comprehensive Initiative on Technology Evaluation (CITE) at MIT, this framework was designed for governments, organizations, and schools to effectively evaluate educational technologies for specific uses in their classrooms.

Helpful Evaluation References

The following links are a good starting point for learning more about ICT4E research and the organizations at the forefront of evaluating technology in the classroom.
