Thursday, October 30, 2014

Formhub with Democracy International's Michael Baldassaro (with hints at ona.io)

Continuing with my work on Mobiles for development as a student of the TechChange Institute, this week I had the pleasure of learning from Michael Baldassaro, Innovation Director of Democracy International.

Who? 

Michael Baldassaro (who by chance has a Masters from the same school where I did one of my Masters) is @MBaldassaro on Twitter. He has a history of working for non-profit organisations that seek to promote democracy.

Michael now works for Democracy International (@DemocracyIntl), a non-partisan USA-based NGO with offices located in 70 countries.

Background on case study presented

Democracy International (DI) was hired by the Egyptian High Election Committee to monitor the January 2014 Constitutional Referendum vote. For the Egyptian Constitutional Referendum, more than 20 million Egyptians visited polling stations across the country on the 14th and 15th of January to vote on (and overwhelmingly approve) a new Constitution. (Caveat - Estimated voter turnout was 38.6 percent.)

What?

DI uses its own open-source, cloud-based (so yes, you need a smartphone or tablet and access to wifi at some point to submit your data) survey-style platform called formhub.

A developer from DI is working on a version of this for feature phones, called ona.io. Ona.io will turn the survey into a text-based survey that can collect and collate data via SMS ("like GeoPoll", Michael explained).

When observing the January vote in Egypt, DI simply purchased Google Nexus tablets and pre-paid SIM cards for all election observers. The survey form was downloaded onto each tablet and filled in offline, and then observers needed only to find a hot spot to submit the completed forms.

How? 

In place of old-fashioned paper surveys, DI's formhub uses technology to collect, collate, and display data online. This means data is collected and analysed faster with less room for human error. The data is also stored in the cloud - that means data is not hosted locally, where local agitators might be able to access and destroy or alter it. (Michael alluded to local election observer organisations who experienced police raids that destroyed their servers - and all their data.)

The observation of elections in Egypt took place in five steps:
  1. Create the observation form (the survey) using XLSForm. There are several online videos about how to do this - I found a pretty extensive 'how to' here. The purpose of using XLSForm is that formhub can then build a user-friendly survey form that 'responds' to user answers - that is, the form ensures the user fills in 'required' sections, and certain follow-up questions appear only when the user gives a specific response, e.g. if a user notes that there are parties campaigning at a polling station, the form will present a drop-down box asking 'which parties?' (See the sketch below this list.) 
  2. Upload the form to the cloud via formhub. 
  3. Download the form to specific tablets or smart phones.* Forms can be downloaded by clicking a link or scanning a QR code. Michael led a real-time demonstration, and there were some tech issues with iPhones. 
  4. Fill out the forms.
  5. Upload the completed forms to the cloud. 
*Michael did note that DI spent two days training election observers on the tech (the tablets and the online forms) and the environment in which the observers expected to use the tech. He emphasised the need for tech to be fit for purpose, reliable, familiar, and user-friendly to be effective. 
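
To make step 1 concrete, here is a minimal sketch of how the 'which parties?' skip logic described above might look in an XLSForm spreadsheet. The question names, choice list, and party names are hypothetical, not DI's actual form:

  survey sheet:
  type                     name           label                                             required  relevant
  select_one yes_no        campaigning    Are parties campaigning at this polling station?  yes
  select_multiple parties  which_parties  Which parties?                                              ${campaigning} = 'yes'

  choices sheet:
  list_name  name     label
  yes_no     yes      Yes
  yes_no     no       No
  parties    party_a  Party A
  parties    party_b  Party B

The 'required' column is what forces an observer to answer a question before moving on, and the 'relevant' column is what keeps the follow-up hidden until the observer answers 'yes'.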

Data can be viewed immediately via a map showing where each type of information was collected (with geo-located clickable bubbles allowing you to access specific forms or view particular responses). Data can also be downloaded in Excel or CSV format, or shared with other sites via an API (application programming interface).
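
As a small illustration of that last point, here is a hedged Python sketch of pulling a form's submissions down as CSV for offline analysis. The export URL and the 'campaigning' column are hypothetical stand-ins, not DI's actual account or form:

  import csv
  import io

  import requests

  # Hypothetical CSV export address - formhub offers per-form CSV downloads,
  # but this exact URL, account, and form are made up for illustration.
  EXPORT_URL = "https://formhub.example.org/demo_user/forms/polling_form/data.csv"

  response = requests.get(EXPORT_URL, timeout=30)
  response.raise_for_status()

  rows = list(csv.DictReader(io.StringIO(response.text)))

  # 'campaigning' is a hypothetical question name from the observation form.
  flagged = [row for row in rows if row.get("campaigning") == "yes"]
  print(f"{len(flagged)} of {len(rows)} submissions reported campaigning")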

Some images are below:

  • The formhub admin interface.
  • The map visualisation of the geo-located data - click a dot to see the form from a specific polling station.
  • The form you see when you click a dot.
  • Filtering the data for specific questions.
  • Viewing responses to specific questions (with an overview in the corner).
  • A sample survey form (what the survey looks like to a user filling it out).
  • How a person filling out the survey can geolocate themselves.
  • How you build the survey in XLSForm: one sheet holds the questions (with constraints, hints, etc.) and another holds the answer choices.

Lessons learned:

Once again, it's great to have these classes with actual practitioners who bring their on-the-ground experience and lessons learned to the table. Michael had the following to say: 
  • On managing the organisational change required when it comes to tech-enabled projects:
    • Act like a doctor. Treat the patient, not the symptom. Focus on the outcome - a healthier patient, not a fancy gadget or tech-enabled test. Don't feel the need to use technology if it is not necessary. 
  • On open-source:
    • Don't confuse how the application is built with where the data is kept. If you use open-source software to build the data-collection application but store the data itself on a secure server, your security depends on where the data is stored, not on whether the application is open-source. 
  • On formhub:
    • Tell your users not to forget to submit the form. 

What is 'digitally literate' when it comes to big organisations?

I read and write a bit about digital literacy here - how investment in information and communication technology (ICT) infrastructure is useless without investment in the people who are supposed to use it.

That said, I've watched the whole ZunZuneo fiasco with a bit of indignant resignation. How did they not see that coming? Was it a dearth of digital literacy within the organisation? Let me try to explain what I mean....


What happened with ZunZuneo? 

All I know is what I've read, but it appears that the USA government (or some agency thereof) funded a fake social media network for Cuban citizens, called ZunZuneo, with the aim of encouraging free speech and potentially/particularly (depending on which media you are reading) free speech that was critical of the existing Cuban government.

There is a lot of debate over whether this was a covert project (with the ultimate aim of changing/challenging the current Cuban regime) or whether ZunZuneo was less ambitiously radical and more ... well, if not transparent, then simply opaque.

Nevertheless, the USA has gotten a lot of internal and external criticism when it comes to social networking and social media recently - from privacy violations to propaganda allegations to general naive screw-ups.

Whether my country's government is more prone to social media mess-ups or just more prone to relevant criticism is debatable (some might argue the USA is no worse, and in some ways better, than other governments when it comes to using, abusing, and/or regulating social media). Either way, the revelations about US involvement in ZunZuneo were, in a public relations sense, a bad thing for the old US of A.

What does it have to do with digital literacy? 

It's not enough that citizens be digitally literate and capable of understanding and using the ICT infrastructure available to them. Governments and their ilk need to be digitally literate as well.

No, I don't mean in a "wag the dog" sort of way (though De Niro knows, there's enough of that flying around; ZunZuneo is probably just the latest - and, at least for now, one of the least effective - examples of it.)

I mean in a sort of user-friendly, give the people what they need and want in an easy-to-access, easy-to-use way. Listen to the people you want to love you and, hopefully, your ideas. Take a page out of Apple, Google, or yes, even - at times - Microsoft's playbook. Invest in usability testing, invest in surveys, invest in your users (or the people you want to become your users). Make sure you give them what they want, digitally-speaking, rather than what you think they'd like.

Digital literacy is being useful.

Don't develop tools that sound good in theory; pay attention to what people are using in practice. If what you build is something users love (and citizens do love a good government service - when it is in fact a real service), then they won't care who is funding it, or why it's really being funded, or whether it is 'sexy' (a horrible, over-used marketing word). Your beloved users will, for the most part, appreciate that the digital product or service offered is useful and easy to use. Congratulations, you've achieved digital literacy.

Be useful. That's easy. Why don't we do it more often? 

There is in government, as in any organisation or group, a desire to control the message - to spend money constructing an idea, a message, that you, the message-maker, want to convey, and then embedding this, your Important Message, at the forefront of every product you put out, from a speech to a YouTube video to a PDF flier (printed offline or, sadly, uploaded online as well - because isn't that what all web users want? Downloadable fliers?)

And that's not a bad idea (that is, embedding your message in all you do is not a bad idea - downloadable fliers are a horrible idea); it's just not user-centric. And digital literacy is all about being user-centric. Build the product the people want and then figure out how to convey your message as a part of it, around it - or even after it. Your message is (almost) never what will sell your service.

People are much more willing to listen after they've been heard. And this counts double when you are acting in the public interest. Show the relevant public you are interested in them first. Then ask them to reciprocate.

So back to ZunZuneo...

I shouldn't be so critical. After all, while I try to be part of the solution, I've not always avoided contributing to the problem. (I have, in fact, uploaded downloadable fliers.)

ZunZuneo probably sounded like a fantastic idea in some back room policy chamber where a bright-eyed, bipartisan staffer was working hard to improve the world in the way s/he thinks it needs improvement.

But wherever it came from, I'm pretty sure ZunZuneo's producers didn't first look at the Cubans as users, analysing what they wanted and needed and what service or product would be most likely to fulfil those needs in such a way as to render the message, at best, a bonus and, at worst, a nuisance (like those sponsored ads on Facebook, yes? Updates from my friends along with annoying recommended pages or games nobody cares about - but someone must, or Facebook would make less money.)

How do the Cubans want to see world improvement in their day-to-day lives? That's the service they'll buy or the product they'll hang on to, even if it turns out to be funded by an organisation not entirely in agreement with the Cuban political agenda. (Yes, we all love our privacy and want companies to respect it, but how many of us still have Facebook, YouTube, Google, Yahoo....?) Give them this service or product, be useful, be digitally-literate.

Then when you slot in your message, they'll be listening. Because you already heard them. 


How Government uses the power of high performance analytics - or how SAS would encourage them to do so (using SAS tools maybe?)

A 2012 white paper covering a SAS-led survey of USA government agencies on their use of analytics offers some nice insight into government trends with regard to the importance, understanding, and use of web and social media analytics - even if it is two years old.

The white paper: How Governments are Using the Power of High-Performance Analytics

The authors: SAS (pronounced Sass, as in the American Southern expression 'don't sass me' - after all, global headquarters are based in North Carolina's family-friendly answer to Silicon Valley) used to stand for 'statistical analysis system'. It grew out of a system rooted in agricultural analysis to become globally renowned as the go-to company for analytics software for, among others, governments and pharmaceutical companies.

A bit about the authors

SAS is a leader in predictive analytics. Where most of us react to our analytics in real time - noting a sudden trend and formulating a quick response, in what is known as reactive analytics - predictive analytics means mining our deep pockets of historical and current data to forecast what actions (actions, not reactions) we need to take to avoid, exploit, or even cause the online and offline events we want to see with regard to our brand, policy, communication activity, event, etc.

SAS uses predictive analytics and its widely dispersed teams of expert analysts for its business clients and for its government clients. An oft-cited report I've seen in my field is the UN Global Pulse - SAS study on how English-language forums in Ireland and the USA could be used to correlate online conversation about employment with trends in actual unemployment, i.e. relevant forum discussion between specific individuals could predict that unemployment was on the horizon for a large number of individuals.
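
To make the idea concrete, here is a toy Python sketch of the kind of lagged-correlation analysis such a study implies. The data is synthetic and the three-month lead is invented for illustration; none of it comes from the actual report:

  import numpy as np

  rng = np.random.default_rng(42)
  months = 48

  # A made-up latent labour-market trend (a simple random walk).
  trend = np.cumsum(rng.normal(0, 0.3, months + 3))

  # Pretend online 'job-loss talk' picks up the trend three months before
  # the official unemployment figures do.
  talk = trend[3:] + rng.normal(0, 0.1, months)
  unemployment = trend[:months] + rng.normal(0, 0.1, months)

  # At which lead time does online talk best track later unemployment?
  for lag in range(7):
      r = np.corrcoef(talk[:months - lag], unemployment[lag:])[0, 1]
      print(f"talk leading unemployment by {lag} months: r = {r:.2f}")

In this toy version the correlation peaks at the built-in three-month lead; in the real study, of course, the 'talk' series came from coding actual forum posts, not from a synthetic trend.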

A bit about predictive analytics

Not so impressive, huh? Who couldn't see that talking about unemployment probably stems from possible upcoming unemployment?

Well, wait, it gets more interesting. Early in my career, I worked as an Analyst on predictive software for pharmaceuticals (not for SAS, though I did work with SAS a little later on). My work involved building a taxonomy - a list of terms - and tagging each of these terms as positive or negative with regard to a specific subject. For example, 'headache' = bad when found in a sentence or paragraph with a specific medication.

Why? Because a headache is an annoying side effect of that medication, so the sentence or paragraph is probably bad news for the medication and the patient on it. Too specific? Well, imagine feeding these terms and others into an algorithm that learns over time which words are 'good' words when associated with a specific medication or illness and which words are 'bad' words when associated with the same medication or illness. Now release that algorithm on the web to review and catalogue and even learn (yes, the machines - sorry, the algorithms - are smarter, learn faster, and, importantly, are more objective in their analysis than human beings these days) as much about public perception of a drug or illness as possible.
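
A minimal Python sketch of the starting point - the hand-built taxonomy, before any machine learning is layered on top - might look like this. The terms, weights, medication name, and posts are all made up:

  # A tiny hand-built taxonomy: side-effect terms score negative, good-outcome
  # terms score positive. Real taxonomies run to thousands of terms.
  TAXONOMY = {
      "headache": -1,
      "nausea": -1,
      "recall": -2,
      "relief": +1,
      "effective": +1,
  }

  def score_mentions(text, drug):
      """Sum taxonomy weights over sentences that mention the drug."""
      score = 0
      for sentence in text.lower().split("."):
          if drug in sentence:
              score += sum(w for term, w in TAXONOMY.items() if term in sentence)
      return score

  # 'Drugex' is a fictional medication; these are fictional forum posts.
  posts = [
      "Tried Drugex for a week. Drugex gave me a headache and constant nausea.",
      "Drugex was effective for me. Real relief at last.",
  ]
  for post in posts:
      print(score_mentions(post, "drugex"), "->", post)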

Suddenly, marketing, public relations, and even research divisions are better informed with regard to which part of the public will spend money on what and why. Very helpful information when formulating business goals and allocating resources, yes? What will be an easy sale to whom and how, where people will pay more attention (and may be willing to spend more money), etc. is all right there in the data.

This applies to government too - governments provide services to their citizens, after all. Wouldn't it be good to be able to analyse which citizens wanted what sorts of services and via which channels and where improvements could be made and where current successes could be promoted - and how? The Barack Obama Presidential campaign used simple (and free) Google Analytics to optimise a site that aimed to collect donations and volunteers and noted a 40% increase in subscriptions and financial donations as a result of keeping an eye on just a few metrics.

Back to the white paper...

SAS is not without an agenda.

No white paper or report is written without an agenda, and SAS is more upfront than many in pointing out that it sells what it feels US government agencies should use: more high-level analytics modelling to better plan, implement, and achieve their missions. 

That said, large-scale analytics - and the data-driven decision-making that, when used properly, such analytics are supposed to support - have obvious benefits. Machines and algorithms are faster than people, process more data than people, and can detect random trends more quickly than people. Plus, people are subject to their personal and societal biases (even top-level universities are subject to this), which means that people can miss important patterns, correlations, trends, and shifts among audiences, users, and stakeholders - thus missing warning signs and/or opportunities that are obvious to the machines. (Though, yes, software can have a cultural bias built into it, but that is a topic for another blog post.)

Given we're all drowning in data, what's not to like about machines designed to move us from information overload to the more easily rectified (we hope) filter failure? Analytics, properly used, are supposed to correct filter failure and help us turn vast swathes of relevant data into usable information. 

So how do government agencies in the USA federal system use the power of high performance analytics? 

The survey focused on analysts and managers and their perception and use of analytics in decision-making. 

Where did analysts and managers agree?
Both felt that employees tended to lack the relevant skills and training necessary to take advantage of analytics - to turn simplistic metrics and raw data collection into useful, actionable insight. Both managers and analysts also seemed to agree that existing analytics needed to be more timely and accessible (read: easy to understand and act on) as well as more automated with regard to sourcing and collating (read: less human-resource intensive). Lastly, both agreed that improved data visualisation of analytics would be a great plus (which kind of combines the two earlier points, I think. The ability to access and act on the data is, to a large degree, dependent on how quickly you as an analyst or manager can see and understand both the data and the data's implications for your business or organisation.)

Where did analysts and managers disagree? 
Managers tended to still rely more on prior experience than on data to make decisions. In my experience, this could be due to many reasons, from the data arriving too late to be factored into an urgent decision to managers who are a bit too self-confident (the 'I know best'/'Mommy manager' syndrome). Not to mention, decisions made by managers, in government and elsewhere, tend to factor in a lot more than just the obvious data trends in a market or among a target audience - office politics, aspirations for promotions, personal relationships, and departmental priorities all impact any decision a manager must make. Factoring all this into a so-called 'data-driven decision' could definitely result in data taking a back seat to prior experience. 

Analysts, in a move that would make analytics 'master' and Google evangelist Avinash Kaushik proud, preferred data-driven analytics when it came to decision-making. Analysts also thought that managers lacked commitment to making data-driven decisions (judging by the managers' own responses, the analysts aren't far off in that assumption). Finally, it looked like analysts felt that sufficient resources were not being diverted to make true, analytically-backed, data-driven decision-making possible (okay, so we are agreed that we need more skills and training - budget for it, why don't you, managers?)

Overall, SAS lauded the managers' and analysts' proactive will to use analytics but seemed critical of the overly simplistic metrics upon which most analysts and managers relied. I'm not sure I agree completely - the KISS (Keep It Simple, Stupid) rule is very important when it comes to any sort of web metrics. If people don't feel like they understand the data, they are less likely to trust it (unless it backs up what they already believe, that is, which is not exactly a good thing if you want a data-driven decision-making culture in place of a self-involved, navel-gazing, 'I know best because I know best because I know best...' culture). However, SAS did cite decision-makers' need for data that is more easily accessible, easier to understand, and easier to act on for their organisations, so maybe my caution is unnecessary. 

Maybe missing...

Something that jumped out at me, as a student of AK's Market Motive web analytics class, is that some government departments may simply lack a clear digital marketing and measurement model. Analytics, like any form of data collection, need a framework that corresponds with the overall purpose of the group or organisation running the analytics and collecting the data. Data collected with no clear purpose is, while not useless, not easy to use. If managers are struggling with shifting political agendas and unclear or simply opaque organisational missions and goals, any form of data-driven decision-making (or even just identifying simple metrics to measure) becomes difficult. 

Monday, October 27, 2014

My World 2015 and mobiles - post 6 of 6 - concluding thoughts and questions

These 6 posts aimed to give a summary of the groundbreaking My World 2015 survey and explore how mobiles have been and are being used to promote and distribute the survey.

My World 2015 aims to survey individuals across the globe with regard to their priorities in public policy. The survey allows respondents to choose 6 out of 16 pre-selected priorities or to submit their own priority in a 17th ‘fill-in-the-blank’ option. Respondents have participated in the survey via pen-and-paper ballot, via a central website, and through mobile technology (SMS, IVR, and a mobile application).


To review the posts from the beginning, here are the links: 


  1. My World 2015 and mobiles - post 1 of 6 - an introduction to the posts
  2. What is My World 2015? My World and mobiles - post 2 of 6
  3. How are mobile phones used to distribute the My World survey? My World and mobiles - post 3 of 6
  4. Interactive voice response (IVR) and My World 2015 - My World and mobiles - post 4 of 6
  5. biNu and My World 2015 - My World and mobiles - post 5 of 6
  6. And this one...

Some questions I am still looking to answer when it comes to how My World is getting done...


  1. How were the 16 priorities determined? My understanding, as described in the paper, is: 1) the United Nations Millennium Development Goals (MDGs) Campaign made 15 thematic categories based on the MDGs and existing related research; 2) a Uganda mobile survey asked users to name one policy area of priority, and these were cross-checked against the 15 thematic categories; 3) a network of MDG and development experts were consulted and the 16 topics were finalised. 
  2. Any specifics on dropout rates available (per SMS, IVR, biNU)? 
  3. Did the positioning of the priorities in the survey (e.g. healthcare offered as an early option to respondents) possibly impact decisions, or were priority positions randomised in the different survey methods? 
  4. Can anyone find the link to the Yemen video seen by 3,000+ viewers? Was this also actively distributed on Facebook, Twitter, etc., or was there no point, given it was on Yemeni TV?
  5. Where exactly was interactive voice response (IVR) offered as a survey option? I can find specific mentions of Yemen, Rwanda, and India, but a My World ‘How To’ document suggests that additional countries (Bolivia, DRC, the Republic of the Congo, Nigeria, and Bangladesh?) may also have included IVR options. 
  6. Does this link point to the SMS version of the survey distributed in Yemen only, or to all the versions of the survey done in Yemen? 
  7. Any study comparing SMS vs. IVR (dropout rates, response rate, etc.)? 
  8. How many SMS messages were in the survey? Was it one text per priority or one text per four priorities (as with IVR)? Were these four priorities grouped together randomly?  
  9. A cost analysis or proposal would be great to see to have an idea of overall costs for GeoPoll's partnership or biNu's promotional work. 
  10. biNu has a number of unregistered users – could these individuals fill out the survey? 
  11. How does biNu offer advertising, e.g. is it similar to Facebook ads or is it just Facebook ads? I Googled this but could find no clear examples of biNu ads. 

biNu and My World 2015 - My World and mobiles - post 5 of 6

biNu and My World 2015

The final approach to mobile distribution of the My World 2015 survey is a partnership with biNu.

biNu is a cloud-based application that allows “almost all types of [feature – specifically Android or Java-based] mobile phones to access Internet applications and services running in the cloud with near instant response times, even on slower or congested 2G (GPRS / EDGE) networks”. biNu claims that users consume 10 times less bandwidth for an experience that is 10 times faster than standard mobile browsers.

biNu has had an active user base of 3-5 million during the first six months of 2014. The user base fluctuates as more and more people get smart phones or change providers.

How is the survey done via biNu?

My World 2015 partners with biNu to both distribute and promote the My World 2015 survey. Users with accounts on biNu can find the survey in the survey section of biNu (which also offers free credit to users who fill out surveys). biNu also supports survey distribution with advertisements and reminders for account holders.

Most biNu account holders use the application to chat, share content, and access largely text-based news, sports updates, and information.

Approach of the biNu My World 2015 survey

My World 2015 noted the popularity of biNu amongst mobile users in developing countries. biNu acts as a cheap means by which to directly target the owners of inexpensive feature-phones for surveys (as little as 20 cents a question, according to biNu’s website). Users receive credits for answering surveys, allowing them to access more features as well as more time on biNu.

The My World 2015 survey allows each biNu account holder (registered user) to fill out the survey once. However, there are some unregistered users, and it’s unclear as to whether or not these users could successfully access or fill out the survey.

Implementation of the biNu My World 2015 survey

In addition to being offered as one of many surveys made available to biNu users, My World 2015 received promotional placement from biNu, including Facebook ads, automated reminders sent to registered users, and some video promotion. Moreover, the clickthrough rate to the My World 2015 survey via biNu increased with the added incentive of 10-20 biNu credits for completing the survey.

Results from the biNu My World 2015 survey

By early June, biNu survey distribution had resulted in over 100 thousand votes from over 180 countries. About 84% of all respondents are male, and 80% of all respondents are between the ages of 16 and 30. Over 80% are in or have finished some sort of secondary education. The largest number of votes has come from India (about 18%), perhaps due to strong promotion of the survey by high-profile Bollywood stars.

The top three priorities for biNu respondents include better education, better job opportunities and better healthcare. “An honest and responsive government” came in fourth overall.




Future plans for and lessons learned from the biNu My World 2015 survey

biNu continues to distribute the survey. The largest concerns in the biNu distribution are the disproportionate share of votes from men and a high dropout rate.

To increase female participation, My World 2015 has increased promotions targeting women, using female celebrities and offering additional biNu credit incentives to women. My World 2015 is considering targeting only women via biNu, but this has not yet been done given the survey’s main goal of remaining non-exclusive.

To decrease the dropout rate, My World 2015 has asked biNu to distribute automated reminders to those that began but did not complete the survey.


Interactive voice response (IVR) and My World 2015 - My World and mobiles - post 4 of 6

IVR and My World 2015

Interactive voice response (IVR) versions of the My World 2015 survey allowed respondents to call a toll-free number and respond to automated voiced questions, usually in a local language, by pressing buttons on their mobile phones.

How is the survey done via IVR?

As with the SMS and other options for taking the survey, local partners and campaigns promoted the survey - and the local options for taking it - to potential respondents. For countries where IVR was offered, this included promoting the toll-free number in all relevant My World 2015 promotional media, from print articles to fliers and billboards to promotional SMS messages.

To set up the IVR platform in each country where it was offered, My World 2015 worked with local NGOs to record the survey in local languages. To set up the technical platform behind the IVR survey, My World 2015 and partners worked with an international company specialising in mobile technology, as well as with national and local mobile service providers.

Approach of the IVR My World 2015 survey

To make sure the My World 2015 survey was non-exclusive, distributors wanted to ensure that there was a survey option for illiterate individuals or individuals who did not have their own mobile phones. For this reason, My World 2015 offers an IVR option, where respondents call a toll-free number, pick their preferred language, and then respond to automated verbal questions by pressing buttons on their phone.

This survey option, when available in a specific country, was promoted in all relevant media and local outreach.

Implementation of the IVR My World 2015 survey

My World 2015 worked with Kirusa, an “international mobile technology provider” connected to mobile service providers in Bolivia, the Democratic Republic of the Congo, the Republic of the Congo, Nigeria, and Bangladesh, as well as with local NGOs and telecom companies in different target countries. My World 2015 encourages partners and interested parties to record the survey in widely-spoken languages and to distribute a toll-free number allowing respondents to call in and take the survey.

In a ‘How To’ document, My World 2015 explains how the IVR option works once set up:

  • The user will receive an SMS message on their mobile phone introducing them to MY World through a simple message and giving them a toll-free number to call to take the survey. They may also be offered a non-financial incentive to participate.
  • When an individual calls the toll-free number, an automated telephone system will allow them to choose their preferred language. They will then be asked four multiple-choice questions about the issues that are most important to them (each time selecting one from a choice of four MY World options). At each stage they will dial into the telephone keypad the number corresponding to their choice. Finally, they will be asked their gender, age and education level.
  • [My World 2015] estimate[s] that a caller will need a maximum of two minutes to complete the automated survey.
  • [Callers] will not incur any charges for making the call.
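
Here is a toy Python sketch of that call flow, with keypad presses simulated by input(). The prompts, language list, and grouping of priorities are illustrative, not the real My World script:

  # Four groups of four priorities, mirroring the flow described above.
  # Only one group is spelled out here; the real survey covers all 16 priorities.
  PRIORITY_GROUPS = [
      ["Better education", "Better healthcare",
       "Better job opportunities", "An honest and responsive government"],
      # ...three more groups of four in the real survey
  ]

  def ask(prompt, options):
      """Play a prompt and read a single keypad digit (simulated with input())."""
      print(prompt)
      for digit, option in enumerate(options, start=1):
          print(f"  Press {digit} for: {option}")
      return options[int(input("> ")) - 1]

  def run_survey():
      answers = {"language": ask("Choose your language.", ["English", "Arabic"])}
      answers["priorities"] = [
          ask("Which of these matters most to you?", group)
          for group in PRIORITY_GROUPS
      ]
      # The real flow also asks age and education level before hanging up.
      answers["gender"] = ask("Select your gender.", ["Female", "Male"])
      return answers

  if __name__ == "__main__":
      print(run_survey())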


To record the survey in local languages, My World 2015 relied heavily on NGOs and other in-country partners. My World 2015 provided the prompt and script (usually in English), and then the partners translated, recorded, and edited the local transcripts.

Results of the IVR My World 2015 survey

The data portal for My World 2015 does not offer a clear look at all IVR results. Instead, those interested can access the results from the ongoing IVR survey in Yemen, where Y-Telecom, local UN Volunteers, and the NGO Y21Forum have heavily promoted the toll-free number. MenaVAS and a local UNV, Mr Ahmed al-Ashwal, have taken care of the technical side and reporting for the Yemeni IVR and SMS versions of the survey.

As of early June 2014, over 100 thousand Yemenis had responded to the survey via IVR. (Only 46 thousand had responded via SMS in Yemen.) About 31% of these respondents are female (versus 23% for the SMS version of the survey). Fifty-nine percent of all respondents are between the ages of 31 and 45, while 41% are between the ages of 16 and 30.

As with the Yemeni SMS version of the survey, respondents ranked a better education, better job opportunities, and an honest and responsive government as their top priorities. Better healthcare ranked 7th collectively (though it did rank 5th amongst the largest responding age group, ages 31-45).




Future plans for and lessons learned from the IVR My World 2015 survey in Yemen

IVR response rates, in Yemen at least, are more equitable with regard to gender and slightly more equitable when it comes to education level. More people have chosen to respond to the survey via IVR than via SMS, perhaps because it is easier to call in and respond by phone - generally costing the respondent only about two minutes - than to respond to several text messages.

IVR seems to be, based on the Yemen case study, a more effective means of engaging respondents with the My World 2015 survey. The dropout rate appears to be lower, and the overall response rate is higher and slightly more equitable with regard to gender. Offering the survey via IVR is more costly with regard to time and technical set-up on the distributor's side; however, the results suggest it is time well spent.

Further analysis was not possible because the data was not disaggregated to view IVR versus SMS survey data.