by Sasha Alyson
Save the Children calls itself “the world’s leading expert on childhood” and boasts of the “proven success” of its reading program, Literacy Boost.(1) What support can it offer for such claims? After some searching, I found a booklet that Save the Children produced to solicit donations from philanthropists. It makes three impressive, precise statements:
In Bangladesh, students with a low baseline of 10 percent comprehension showed an almost five-fold increase in comprehension to 49 percent. Indonesian students recorded a three-fold increase from 19 to 67 percent comprehension and children in Ethiopia recorded a doubling of comprehension from a higher baseline of 32 to 64 percent.
No documentation was given. But with help from a source inside SAVE (as it is often called within the industry), I was able to see the actual studies from which this data was drawn.(2) Let’s compare SAVE’s statements with the full picture:
- “In Bangladesh, students with a low baseline of 10 percent comprehension showed an almost five-fold increase in comprehension to 49 percent.” The numbers in the report actually show it going from 11% to 49%, a multiple of 4.45.(3) But the full report shows that in a comparison group that did not get Literacy Boost, comprehension went from 8% to 36%, a multiple of 4.5. It was a draw. SAVE’s evaluation of another Literacy Boost trial in Bangladesh stated: “No statistically significant differences exist between Literacy Boost and comparison groups either at baseline or at endline.”
- “Indonesian students recorded a three-fold increase from 19 to 67 percent comprehension.” True. In fact, that’s more than threefold. But the booklet for donors fails to mention that students who did not get Literacy Boost increased more than fivefold, from 9 to 50 percent. One group gained more as a multiple; the other gained more percentage points. The full assessment states that the difference in gains was “not statistically significant.” (The chart at the top of this page is from Save the Children’s internal evaluation in Indonesia.) It seems to me, however, that the students who didn’t get the program were the lucky ones. They were a much weaker group to start: only half as many could read with comprehension, compared to those who got Literacy Boost. By the time the study ended, three-quarters as many could do so.
- “Ethiopia recorded a doubling of comprehension from a higher baseline of 32 to 64 percent.” True again, but the Ethiopia study didn’t have a comparison group at all. Save the Children simply reported comprehension at the beginning and end of the year for 300 third-graders. That’s an age when we would expect reading skills to increase (as they did in Bangladesh and Indonesia, at an even faster rate). There’s no reason to believe Literacy Boost had anything to do with it.
Save the Children writes that it has “invested more than $150 million in Literacy Boost programs across 34 countries, benefitting more than 3 million children” – and these are the best examples it can find of its “proven success.” I looked at SAVE’s evaluations for more than a dozen countries. Many concluded that Literacy Boost made no statistically significant difference. Some found a small short-term improvement, but even these acknowledged that it was far less than what was needed.
Meanwhile, several big questions remain unasked and unanswered.
What will happen after the funding stops? It should be possible, with enough money, to bring about some short-term improvements. The only surprise is that often, SAVE failed to achieve even that. Where small benefits appeared, is there any reason to believe they will continue after the money stops?
How much does it cost? In these evaluations, SAVE never mentions the price tag. Real people, real governments, and real businesses all try to weigh total costs against expected benefits when evaluating a new program like this. If it were really attempting to identify a viable, long-term way to improve literacy, Save the Children would look at the full picture, too.
What are the non-monetary costs? When an NGO arrives and assures everyone that it is the expert, then fails to actually get much done, that distraction comes with a price. I’ve explored these hidden impacts in The High Cost of Meddling.
Save the Children does not know how to improve literacy in developing countries, but it must claim to have the answers. That keeps donor money flowing into Save the Children, and it gives SAVE a license to meddle in countries where it hasn’t a clue what to do. The only losers are the countries – and their children – that receive this fraudulent “help.”
Notes and Sources
Top illustration: The chart at the top of the page is from Save the Children’s internal evaluation in Indonesia, cited below. To highlight the key data, we announced this story on Twitter with a graph that showed only the “Reading with Comprehension” numbers, with the green bar starting at the left axis.
1. To call yourself “the world’s leading expert on childhood” is quite a claim, but Save the Children repeats this phrase on its various websites, everywhere from the U.S.A. to New Zealand. The phrase is then parroted by others, including New York Times columnist Nicholas Kristof, as if it were his own words reporting an established fact rather than an advertising line. Encouraging people to donate to Save the Children, Kristof writes: “As the world’s leading expert on childhood, they know exactly what children in crisis need – and how to deliver it effectively and efficiently.”
2. The Philanthropy report and country reports were all published by Save the Children. They are:
►Power of Philanthropy: Investing in Literacy Boost for Vulnerable Youth. Undated, but it has citations from 2016, so it was published in that year or later. Claims quoted are on page 11.
►Literacy & Numeracy Boost, Bangladesh Endline, by Christine Jonason et al., April 2014, page 28, final line of Table 7.
►Literacy Boost Bangladesh, Endline Report, by Jarret Guajardo et al., May 2013. The statement that “No statistically significant differences exist between Literacy Boost and comparison groups either at baseline or at endline,” is on page 23, with a chart showing the same thing.
►Literacy Boost Indonesia, Endline Report, by Christina Brown, July 2013, page 20.
►Literacy Boost Tigray, Ethiopia, Endline Evaluation Report, by Zeray Gebreanenia et al., July 2014. (The Philanthropy report does not give sources for its claims, so possibly there was another Ethiopia report, but if so, I’ve been unable to find it. That Save the Children doesn’t bother to document such statements is further evidence that it’s not serious about looking for solutions. It’s doing a snow job for donors.) The lack of comparison groups is noted on page 5. This report deceptively refers to “statistically significant” improvements in reading skills, but this simply means that these skills did improve during the year; without a comparison group there’s no way to know if Literacy Boost had anything to do with it.
3. It appears the comprehension number was between 10 and 11 percent, and it was rounded down in one instance and up in the other.