This article was originally published on Global Giving
Grace is a Grade 8 student at one of Rising Academies’ schools in Sierra Leone. She wears a bow in her hair and speaks in a quiet, gravelly voice. When she gets nervous she laughs and runs her hand down her face like she’s trying to wipe away the nerves.
Grace is amazing because even though she has struggled throughout her school career, and even though her family couldn’t afford to send her to school at all for most of the last four years, she never gave up on her education.
Like many students in Sierra Leone, Grace struggles with reading. She went to primary school and took the leaving exam, but unfortunately that doesn’t mean a lot. She’s 17 years old, but reads at about the level of a 7-year-old.
Girls like Grace are why Rising Academies was set up in the first place: to provide access to a quality education where the existing school system cannot. To make sure it is doing that, Rising takes impact evaluation very seriously. It is being independently evaluated by Oxford University, with the results published on its website.
But the problem with evaluation is that it’s often too slow and too broad to really drive improvements in programme design. It’s like walking up a treacherous mountain path backwards: it allows you to say whether you were on track, but that’s not particularly useful if it turns out you weren’t.
When you’re trying to figure out how to help a girl like Grace, you want something that tells you how to improve your programme in something closer to real-time.
- The first step was to spend some time with the R4D team really defining the problem to be solved, and identifying the parameters or design constraints of a potential solution. Given the challenges facing girls like Grace, Rising decided that the initial focus would be on supporting literacy among its weakest readers, and that any potential solution needed to be administratively simple, with a low marginal cost in terms of learning resources or staff time.
- The second step was to move very quickly – within days, not weeks – into prototyping and testing some of the possible solutions on a small scale. One idea that was tested involved high-frequency, light-touch assessments so simple that students could administer them to each other. Another involved a peer-to-peer reading club each morning before school, with a series of activities in which stronger readers would support struggling readers.
- The third phase, which has just started, is to put these innovations to the test. It’s not quite a full randomised controlled trial, but it does involve generating some rigorous evidence about which interventions seem most promising, including by randomly assigning students to different treatment conditions within a number of our schools. The aim is to be in a position to take decisions on what to do next by early next year.
It’s too early to say exactly what the outcome will be, and ultimately the test is whether it improves the programme for girls like Grace. But the experience has already been powerful in creating a different way of thinking about evidence, and in helping the organisation approach the task of experimentation more systematically than it had done before.
Every Monday and Wednesday, when Grace and her classmates gather for their morning Community Meeting, they recite the Rising Academy creed. One of its lines is that “our first draft is never our final draft.” What goes for Grace goes for Rising Academies as an organisation too. By learning more systematically what works, Grace’s story may just have a happy ending.
Photo Credit: Rising Academies