Edited by: Julie Wang’ombe

At Evidence Action, we often use mobile technology to collect program data that informs decision making. Across the spectrum of our work, we need real-time data to troubleshoot quickly, make course corrections, and improve program design; this is especially true while programs are in an incubation phase, when addressing issues in real time gives evolving prototypes the best chance of success. When collecting data, we like using mobile phones for their ubiquity, and we prefer SMS-based platforms since they are more likely to reach the remotest, least-connected communities.

Recently, we experimented with Echo Mobile as a tool for monitoring Winning Start, a program in our Beta incubator that deploys post-university youth to public schools where, using the highly evidence-based “teaching at the right level” model, they help children develop basic reading, writing, and numeracy skills.

For three years, we have been testing the Winning Start model in Kenya with the Government of Kenya’s G-United program. As part of the program, volunteers regularly share data with our team. We collect information like:

  1. How many children are struggling with basic literacy skills?
  2. How many after-school remedial sessions did youth volunteers conduct in a given week?
  3. How many students attended these sessions, and how are they progressing?

This kind of data is critical for helping our team assess program delivery, track students’ progress, and address issues as they arise.

Over the years, we’ve experimented with different ways of having volunteers share this data with us. In 2014, when we launched the Kenya-based pilot, we issued volunteers special internet-enabled tablets, developed specifically for the program at the request of the Government of Kenya. Volunteers were expected to share data through a portal that required an internet connection. It wasn’t long before we scrapped this approach, foreseeing several challenges. Issuing individual tablets to volunteers would not be cost-effective in the long run, and ensuring their timely manufacture and supply would be equally difficult. Likewise, the need for an internet connection would limit the program’s capacity to operate in remote areas with limited connectivity.

The next year, in the spirit of one of Evidence Action’s guiding values, “iterate, again,” we experimented with a promising SMS-based platform. That platform solved the challenges with which we previously grappled: volunteers could use their own phones to share data with us, reducing our costs and eliminating the need for an internet connection. However, a new challenge emerged. Volunteers were required to send us data in a format that was, understandably, hard to remember:

VOLUNTEER_ID [space] CATEGORY_KEYWORD [space] field1 [space] field2 [space] comments

The result was messy: most volunteers either forgot to send in data or made mistakes formatting their SMS messages, leaving data that was difficult to interpret and, in some cases, impossible to use. So we decided we needed to “iterate (yet) again” in search of a better solution.
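
To make the fragility concrete, here is a rough sketch, in Python, of how a report in that format might be parsed. The field names and checks are our own illustrative assumptions rather than the actual platform’s logic; the point is that a single mistyped or misplaced token is enough to make a whole message unusable.

```python
# Hypothetical sketch of parsing the rigid SMS format described above.
# Field names and validation rules are illustrative assumptions, not the
# actual platform's implementation.

def parse_report(sms_text):
    """Split a raw SMS into the expected fields; return None if malformed."""
    parts = sms_text.strip().split()
    if len(parts) < 4:           # VOLUNTEER_ID, KEYWORD, field1, field2 at minimum
        return None              # too few tokens: message is unusable
    volunteer_id, keyword, field1, field2, *comments = parts
    if not volunteer_id.isdigit():
        return None              # ID mistyped, or ID and keyword swapped
    if not (field1.isdigit() and field2.isdigit()):
        return None              # a stray word or space shifts fields and breaks parsing
    return {
        "volunteer_id": volunteer_id,
        "category": keyword.upper(),
        "field1": int(field1),
        "field2": int(field2),
        "comments": " ".join(comments),
    }

# Example: one misformatted value makes the whole report unusable.
print(parse_report("1234 SESSIONS 3 25 went well"))  # parses
print(parse_report("1234 SESSIONS three 25"))        # returns None
```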

Enter Echo Mobile

Sample survey conducted on Echo Mobile

At the time, our Deworm the World team in Kenya was using Echo Mobile to send notices to teachers and other key stakeholders about the national deworming day. As we investigated the platform, we learned that it offered a range of functionality beyond this notification service. For example, it could be used to design and send out surveys, and these surveys could be structured to allow only specific kinds of response (e.g., numbers only, or numbers within a certain range) and to automatically reject out-of-bound responses. Importantly, the surveys made communication between the program and volunteers more dialogic: they could be programmed so that questions followed each other intuitively, with answers to certain questions dictating which follow-up questions would appear. The platform also allowed us to program prompts reminding volunteers to complete their surveys. As an added benefit, we could personalize mass messages sent via the platform with individual greetings.
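
As a rough illustration of that behavior (typed answers, range checks, and branching), here is a toy sketch in Python. It is not Echo Mobile’s API or configuration format, and the questions and bounds are assumptions made up for the example.

```python
# Illustrative sketch only: NOT Echo Mobile's API, just a toy model of the
# survey behaviour described above (typed answers, range checks, branching).

QUESTIONS = {
    "sessions": {
        "prompt": "How many remedial sessions did you hold this week?",
        "type": int,
        "range": (0, 10),                # assumed plausible bounds
        "next": lambda ans: "attendance" if ans > 0 else "reason",
    },
    "attendance": {
        "prompt": "How many students attended in total?",
        "type": int,
        "range": (0, 500),
        "next": lambda ans: None,
    },
    "reason": {
        "prompt": "Why were no sessions held?",
        "type": str,
        "range": None,
        "next": lambda ans: None,
    },
}

def validate(question, raw_answer):
    """Reject answers of the wrong type or outside the allowed range."""
    q = QUESTIONS[question]
    try:
        value = q["type"](raw_answer)
    except ValueError:
        return None                      # wrong type is rejected outright
    if q["range"] and not (q["range"][0] <= value <= q["range"][1]):
        return None                      # out-of-bound answers are rejected
    return value

# Branching example: a "0 sessions" answer routes to the follow-up "reason".
ans = validate("sessions", "0")
print(QUESTIONS["sessions"]["next"](ans))   # -> "reason"
```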

It looked promising, so, being evidence-first, we decided to test the tool before making a decision to scale its use throughout the program.

Piloting and adopting Echo Mobile

We ran a two-week pilot using Echo Mobile to send out surveys to a sample of volunteers. The results were impressive: volunteers found it easy to use, giving the platform an 8.5 out of 10 rating, and 95 percent of the volunteers who participated in the pilot preferred using Echo Mobile to the previous platform. More importantly, the rate of volunteers submitting their data rose, the submissions were more accurate, and we found no outliers.

G-United volunteers’ feedback on the Echo Mobile platform.

Based on the positive pilot results, we decided to make the switch to Echo Mobile. In 2017, we used Echo Mobile to monitor our Winning Start program in Kenya, G-United—and the program is better for it. Last year, volunteers in the program averaged an 88% weekly response rate, and the quality of data markedly improved.

Of course, the platform isn’t a panacea. While it has significantly reduced data collection challenges, it hasn’t eliminated them. It’s still possible for volunteers to log errors (even within the allowed response bounds), and these are particularly hard to detect. Similarly, volunteers can still procrastinate or simply forget to send in their data. This year, we are exploring how we can use Echo Mobile to give volunteers pre-submission report summaries that allow them to verify their responses before submitting them. We hope that this nudge to verify responses will help volunteers detect and correct errors in their submissions before we receive them. We are also incorporating other ways to motivate volunteers to report regularly, including exploring the use of behavioral nudges. (Stay tuned for a future blog on that!) Ultimately, we’re working on creating a holistic approach to ensuring that our monitoring data is timely, comprehensive, and accurate. Now, with the right monitoring tool in place, we’re well on our way.
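
As a sketch of what such a pre-submission summary might look like, here is a minimal example; the message wording and fields are hypothetical, not the program’s actual design.

```python
# Hedged sketch of the pre-submission summary idea described above; the
# wording and fields are assumptions, not the program's actual design.

def summary_message(answers):
    """Build a confirmation SMS so a volunteer can verify before submitting."""
    lines = ["Please confirm your report:"]
    for label, value in answers.items():
        lines.append(f"- {label}: {value}")
    lines.append("Reply 1 to submit, 2 to edit your answers.")
    return "\n".join(lines)

print(summary_message({"Sessions held": 3, "Students attended": 25}))
```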

Making data-driven decisions

So how do we use the regular inflow of higher-quality data in G-United? In two related ways: to address issues as they emerge and to improve our program design. Last year, for instance, when we saw response rates begin to drop, we immediately looked for solutions. Suspecting respondent fatigue, we shortened our surveys and sent them out only as often as necessary. The effect was almost immediate: response rates began to climb, and by the end of the year we had maintained an 88% average weekly response rate.
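
For reference, here is a minimal sketch of how a weekly response rate like the one we track might be computed, assuming it is simply submissions received divided by submissions expected; the numbers are illustrative only.

```python
# Minimal sketch of the response-rate metric mentioned above. The definition
# (received / expected) and the weekly counts are assumptions for illustration.

def weekly_response_rate(received, expected):
    return received / expected if expected else 0.0

weekly = [(44, 50), (45, 50), (43, 50), (44, 50)]   # illustrative numbers only
rates = [weekly_response_rate(r, e) for r, e in weekly]
print(f"Average weekly response rate: {sum(rates) / len(rates):.0%}")  # 88%
```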

Overall, Echo Mobile has helped us enhance data quality and response rates and is enabling us to meet reporting timelines. It is also a cost-effective monitoring option. All these advantages make it an innovative tool for electronic data collection. Ultimately, however, embracing this platform is just a small part of our larger push towards more real-time, tech-enabled monitoring of program processes and performance.

Stay tuned for a future blog on how we’re partnering with AgImpact to use another mobile platform, CommCare, throughout our No Lean Season program.
