Building a qualitative evidence base with education innovations

Corinne Hoogakker
 

One of the main reasons I went back to school was to better understand quantitative methods. What I did not expect, though, was that so many of my professors would insist that a laser-like focus on numbers isn’t worth much without the interviews, case studies, and stories that qualitative data provides to back it up.

It’s not terribly useful to know that a new intervention produced a 25% improvement in literacy scores if no one has any idea why or how it worked.

Still, when I began my internship with CEI in 2016, I asked my colleagues how programs are chosen for inclusion in the CEI Program Database. How do we know these programs are effective? What if a program is too new or too small to have any data on impact, or any long-term results?

As part of my work, I help add new innovators to the CEI Program Database. Based on information submitted by each organization, I carefully draft a program profile, then schedule a call between the CEI team and the innovator. During these interviews, we fill in gaps and resolve inconsistencies in the information presented in the profile. We ask programs to explain their data collection strategies, tell us about any internal or external evaluations of their work, and provide relevant documentation: the more information an organization can provide, the better. When funders use the CEI database as a resource to find new programs with potential, they are often looking for this information as well.

Lessons from the Ground

Soon after I began work, I participated in an early morning call to Tanzania. Kachocho Timanywa is the executive director of Tumaini Letu, an organization that provides early childhood development (ECD) services to vulnerable children in the Kagera region of Tanzania. Kachocho explained the outcomes that Tumaini Letu monitors and his hopes for future expansion. But he also told us why his work is unique: orphans in Kagera, especially those who lost their parents to HIV/AIDS, still face stigma and often lack safe spaces to play and learn. Tumaini Letu is the only organization in the area that offers vulnerable children these spaces.

I began to wonder: how can a program measure the benefits to a community where children feel safer and more supported, or the improvement in a child’s chances of finishing school when they are able to access quality ECD services? An expensive longitudinal study would tell us some of these things in five to ten years. But the qualitative evidence provided by Tumaini Letu is also a valuable resource. Another organization trying to launch ECD services in rural Tanzania or East Africa may learn just as much, or more, from the descriptions and case studies generated by Tumaini Letu’s work as it would from the data of an impact study.

Do we care about quantitative evidence? Absolutely. CEI carefully documents results from each program and ranks program M&E reporting into three “tiers”: data collection, process evaluation, and impact evaluation. Especially when scaling a program, this type of evidence is crucial.

Sharing Stories

As I interviewed more implementers, however, I began to value the conversations themselves, not just the numbers we received, as another source of vital data and learning that could be shared. In an interview with Parikrma Humanity Foundation, I heard stories about the importance of schools being embedded in the community. When a teacher notices a child who is tired, hungry, or distracted, they can contact parents, visit a family at home, and develop a plan that includes both parents and child to help the student succeed. There is no uniform protocol for identifying these students, and the results of such interventions are hard to quantify. But understanding how these interactions work is vital to understanding why Parikrma has developed a successful model in Bangalore.

CEI’s Innovator Interview blog series tries to distill some of these qualitative lessons, asking program implementers for advice and insight on how and why their programs work. We also try to build qualitative data into program profiles, summarizing the “how” and “why” of program implementation with insights from innovators working towards access and quality in education around the world.

What do I remember most about my work at CEI? I can’t usually recall how many soft-skills trainings an organization has conducted or the improvement, in standard deviations, from a literacy intervention. But what I learned from these interviews will stick with me for a long time.

Innovations make up an increasingly significant share of development interventions, and robust evidence will be critical in scaling successful experiments into larger-scale impact. Nevertheless, the power of a story, especially when combined with rigorous data, is essential to the dissemination and adoption of effective models.

CEI gives voice to innovators so that they can share their experiences, as well as their data, with stakeholders in a position to learn from them. It was an honor to add my voice to their chorus, and I don’t plan on going quiet anytime soon.

 

Know of any innovative education programs that should be featured on CEI? Email cei@r4d.org with your suggestion and a description of the program.

Corinne Hoogakker will receive her MA in International Affairs from the Johns Hopkins School of Advanced International Studies (SAIS) in May. This year, she worked as an intern at the Center for Education Innovations (CEI) at R4D. Before coming to R4D, she served as a primary school teacher in Palestine and worked in administration in DC public schools.

Photo Credits: GPE/Kelley Lynch; Edupeg; GPE/Chantal Rigaud

 
