Testing Assumptions: More Closely Approximating the Truth

A reminder from the field: Always test assumptions.

Recently, alongside a colleague from the Global Health Project at the MIT Sloan School of Management, I have been trying to determine how LifeSpring, a maternal health hospital in Hyderabad, India, might utilize mobile technology to gather data about its outreach efforts.

In the past few years, many mobile-based solutions have been developed to support health workers in low-resource contexts. (Some are better than others.) Many at LifeSpring, including outreach workers themselves, agree that the hospital could benefit from digitally collecting information about its clients. These records could help assess the effectiveness of outreach efforts by capturing indicators such as the number of women who visit for prenatal care, a key goal of communications campaigns.

It seemed like a good idea, but nobody was very excited about trying it. The reason: Most existing mobile health (mhealth) solutions are built on SMS (short message service, or simply text message) platforms, and LifeSpring was not confident about its outreach workers’ ability to use text messaging – or even about their ability to navigate and comprehend a text-based data collection system. Even with the possibility of visual and/or audio prompts within a Hindi- or Telugu-language platform, the hospital was skeptical.

But technological capacity and textual literacy are not new concerns in mhealth, and LifeSpring nevertheless wanted us to explore the feasibility of a mobile solution to support its outreach efforts. To do so, we needed to better understand the intended users of a mobile data collection program.

Outreach workers are mostly middle-aged women who visit low-income neighborhoods to share information about LifeSpring Hospital with potential customers and check in on pregnant women due to deliver soon. We needed better intelligence about these women’s needs, behaviors, and capacities related to their responsibilities and to their use of technology. We needed to better understand the skills and activities required of them in their roles as outreach workers, the data they currently collect, and their opinions about how technology may improve or hinder their work. We thus shadowed the outreach workers and conducted in-depth interviews with them. We also synthesized these interviews in aggregate and interviewed organizational stakeholders, which helped us understand LifeSpring more broadly, including its capacity and will to implement a mobile solution.


One user we spoke to was a woman we will call Pooja. Based out of the hospital’s Chilkalguda branch, Pooja has been with LifeSpring since its founding, some seven years ago. She rotates among 25 different colonies (neighborhoods) associated with five different branches of the hospital, spending most of her time out of the hospital, interacting with new and returning customers. She returns daily to the hospital to share information with nurses and assist with customers in labor.

Pooja, like other outreach workers, has not received formal training for her position; rather, she has learned on the job from other workers. Pooja is fluent in spoken Telugu, Hindi, and Urdu, and reads and writes in English. Although she has difficulty responding, she understands much of what is said to her in English. She’s not unique: many workers we spoke with could also converse in three or four languages, and almost all of them write in English. In spite of this, Pooja and her fellow outreach workers were described by their LifeSpring colleagues as low-literacy.

Pooja owns a basic Nokia mobile phone, which she uses mostly for personal communications, though she also uses it to contact her customers. She only uses the phone’s calling function and does not send text messages, although she says she would be willing to learn.

Based on what Pooja said about her existing mobile habits, and what her hospital colleagues assumed about outreach workers like her, it seemed that LifeSpring’s outreach program was ill-suited for an mhealth solution. The major barriers, according to both groups, would be outreach workers’ low literacy and their lack of experience with phone features beyond voice calls.

I am convinced the most useful thing we did in this project was to test these assumptions.

We got in touch with Dimagi, a firm that develops user-friendly mobile data collection tools. With their help, we loaded a demo application onto basic mobile phones. The application presented a series of Hindi-language forms with visual prompts designed for low-literacy users, and we proceeded to test it with outreach workers to understand their capacity and desire to use such a tool.


After a brief explanation of how to navigate the application, most of the outreach workers we tested with – even though they had previously used their mobiles only for voice calls – were able to read the prompts (‘Full name,’ ‘Husband’s name,’ ‘Age,’ ‘Date of last menstrual period,’ etc.) and type in their responses.
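For readers curious what such a form amounts to in software terms, here is a minimal, hypothetical sketch in Python of a sequential prompt-and-capture form like the one we tested. The field names and validation logic are our own illustration, not Dimagi’s actual form definitions or platform code:

```python
# Hypothetical sketch of a sequential data-collection form, modeled on the
# prompts we tested with outreach workers. Illustrative only -- Dimagi's
# platform defines forms its own way; none of these names come from it.

FIELDS = [
    ("full_name", "Full name"),
    ("husband_name", "Husband's name"),
    ("age", "Age"),
    ("lmp_date", "Date of last menstrual period"),
]


def collect_record(answers):
    """Check that every prompt received a non-empty answer and return
    a completed visit record keyed by field name."""
    record = {}
    for key, prompt in FIELDS:
        value = answers.get(key, "").strip()
        if not value:
            raise ValueError(f"Missing answer for prompt: {prompt}")
        record[key] = value
    return record


# Example: a completed record as an outreach worker might enter it
# (the answer values here are placeholders, not real customer data).
record = collect_record({
    "full_name": "Pooja",
    "husband_name": "(name)",
    "age": "32",
    "lmp_date": "2013-01-15",
})
```

The point of the sketch is simply that the tool walks the worker through one prompt at a time and refuses an incomplete record, which is what replaces the paper log and the later transcription step at the hospital.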

An older woman we will call Neha was very approving of the tool, as she believed using it would boost her status in the eyes of potential customers—she thought she would seem modern and savvy in a way that would positively distinguish LifeSpring. When we asked how it compared to entering data manually in a paper log, Neha said that while it was a bit hard the first and second times, by the third time she found it easy and recognized that it would save her time. This tool, she reasoned, would remove many of the steps she currently performs as part of a cumbersome data collection process, such as gathering customer information on-site and returning to the hospital to transcribe that data into a formal register.

After trying the application, a newly confident Neha wagged her finger in the face of nearby colleagues and—with a big smile—informed them that she would certainly be using it. The group laughed out of amusement and surprise at their newly tech-savvy colleague. Simply through prototype testing we were able to identify ready champions among the outreach workers; imagine what a full communications and training campaign could do.

The quick comprehension and ensuing confidence of Neha and outreach workers like her reaffirmed the importance of testing assumptions. Though a full and tailored solution still needs to be developed for LifeSpring – and we still need to tackle larger questions around how a multi-site hospital integrates such a technology into its programs – at this point in the process, we were happy to celebrate a small success. By overcoming initial assumptions about populations viewed as low-literacy, and recognizing that they – like the rest of us – are eager to embrace new tools, we revealed new opportunities and challenges for LifeSpring.


One such challenge: If a mobile tool can collect customers’ details, outreach workers may not need to go to LifeSpring as often. This means critical informal exchanges between outreach workers and hospital staff around customers’ medical and personal situations may be lost. How do we ensure gains in customer intelligence don’t end up eroding customer service?

We have recommended that LifeSpring pilot the tool for a few months to understand both outreach workers’ and the hospital’s overall experience with it, as well as the larger program implications of integrating such a tool, including associated risks. Once they have more data from the pilot, they can iterate the tool and refine the program accordingly.

There still exist many unanswered questions about how mobiles can best be used to support LifeSpring, but we are getting closer. And this is because we and our LifeSpring partners chose to question existing biases and assumptions, in search of a closer approximation of the truth.
