A week of tech: understanding and overcoming the challenges of using tech for monitoring and evaluation


This is a blog by Beth Turner, Program Development Coordinator at Integrity Action, who reflects on two events she attended in March 2018 that highlighted the impact that technology could have, and is already having, on the development sector.

The first event I attended was the MERL Tech (Monitoring, Evaluation, Research and Learning) conference held in London. MERL Tech conferences are held annually in Washington, and this was the second year that one was also held in London. Over two days, MERL Tech sought to ‘gather 200 thought leaders, practitioners, evaluators and decision-makers who are using technology for monitoring, evaluation, research, and learning’.

I represented Integrity Action on one day of the conference, presenting the results of a pilot we conducted in Sindhupalchok, Nepal, where we had created a mobile-based app for monitoring the reconstruction of homes destroyed by the 2015 earthquake. Not only are these types of events a good opportunity to learn about the work of others, they are also a chance to hear other people’s opinions about our own work. Discussions with the audience included how to ensure the data collected is secure, and how to protect the people who are using it. As an organisation, we considered these points thoroughly in the development of the app and took steps to address them both; however, there is always room for improvement, especially when the programme being monitored is so sensitive and important.

These issues were touched upon in a session hosted by the Digital Impact Alliance (DIAL), who spoke about their work developing standards for technology tools intended to monitor development and humanitarian projects. These standards are captured in the ‘Principles for Digital Development’, which act as reminders of what needs to be considered when developing such tools. There are nine in total:

  • Use Open Standards, Open Data, Open Source, and Open Innovation

  • Reuse and Improve

  • Address Privacy and Security

  • Be Collaborative

  • Design with the User

  • Understand the Existing Ecosystem

  • Design for Scale

  • Build for Sustainability

  • Be Data Driven

DIAL told us that these principles are constantly evolving, based on learning and on how they influence the development of different tools. One criticism was that some of the principles can contradict one another: ‘Address Privacy and Security’ sits uneasily with ‘Use Open Standards, Open Data, Open Source, and Open Innovation’, as does designing for ‘Scale’ alongside building for ‘Sustainability’. This is always a balancing act: in a sector where transparency and accountability are often equated with how visible the data is, how can we make sure we preserve the privacy of our beneficiaries? It is tempting to publish every bit of data collected as evidence of how successful a project was in reaching the most vulnerable, and to present it in case studies and attractive diagrams. By doing so, however, we are making a statement that we can use the personal information of the people we are trying to help in ways they might not know about.

SindhupalCheck, and indeed our other technology tool DevelopmentCheck, have both been designed with sustainability in mind. We want citizen monitoring to be embedded in a community, so that these tools lend themselves to long-term behaviour change rather than being quick-fix solutions. However, ensuring sustainability makes scaling up harder. Although we want our tools to be wide-reaching, will that compromise their sustainability in communities where oversight of projects and services is needed over a longer period?

Despite these tensions, the Principles are a critical guideline for organisations to understand how far their tools have or have not met certain criteria. Addressing every principle in full is nearly impossible, but understanding where your technology falls short can help you find other, non-technological ways to address the principles.

Integrity Action is proud to have developed ‘SindhupalCheck’ as a tool to see how the reconstruction of houses is progressing on the ground. It was a pilot, and we hope to embed all the feedback we received throughout its implementation in a better second version - and we will definitely be applying the Principles to the next iteration!

The second event I attended was hosted by the British Council and titled ‘Mobile Technology: The future of evidence in development?’. I was grateful to have attended the DIAL session just the day before, as it enabled me to evaluate the mobile technology being presented in the systematic way offered by the Principles for Digital Development. Like DIAL, the British Council had developed a set of considerations for their technology:

  • The Ubiquity of Markets

  • Rapid Response

  • Cost Efficiency

  • Discipline of Shorts

  • The Promise of Now

  • Access to New Data

This event approached technology from a different angle, presenting findings from several smaller studies to ascertain which types of technology are best suited to monitoring and evaluation, and how to reach the people you need information from. This was specifically in relation to the British Council programme ‘Connecting Classrooms’, which aims to improve the teaching of core skills to students across the world by providing extensive training to teachers. These core skills are ones not necessarily addressed by national curricula, for example ‘critical thinking and problem-solving’ and ‘citizenship’. Neil Williams, Senior Project Manager at the British Council, leads the monitoring and evaluation of Connecting Classrooms.

The British Council, alongside Kantar Public, an international research consultancy represented by Melissa Baker and Alexandra Cronberg, addressed some crucial but often overlooked questions. The purpose was to understand how using technology might benefit the monitoring of a programme, in this case a British Council programme based in Ethiopia. The questions included which technology would benefit the programme most, and how cost-effective different types of technology might be. It was refreshing that technology was compared with traditional non-technological methods of data collection, such as face-to-face surveys. Technology can often be used as a token in programmes to show that they are innovative, even when better, simpler, non-technological alternatives exist.

The Connecting Classrooms programme collects data from teachers who have participated in British Council training on teaching the core skills, using a questionnaire to evaluate how well they absorbed the training. The technologies compared for administering this questionnaire were telephone, SMS and interactive voice response (IVR), alongside a conventional paper-based method. Sixteen questions were asked in total, and there were many interesting findings, a few of which I will describe.

Even though SMS might seem the most practical response method, especially in a country with relatively high mobile penetration, it actually produced the lowest response rate, at 23%. Paper-based questionnaires yielded a more impressive 70%, and telephone questionnaires the highest, at 90%. The researchers explained that although SMS offered respondents flexibility in when to answer, there was a ‘time-out’ function: if left for more than two days, the questionnaire was considered void. Teachers were often unable to access their phones, or even reach a signal, until after their work day, leaving a much shorter window in which to respond.
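
The researchers did not share how the time-out worked under the hood, but as a minimal sketch of the rule described above (the function, its field names and the 48-hour window are my own assumptions, not their implementation), a response could simply be classified against its dispatch time:

    from datetime import datetime, timedelta
    from typing import Optional

    # Hypothetical sketch of the two-day 'time-out' rule described above;
    # names and logic are assumptions, not the researchers' actual code.
    TIMEOUT = timedelta(days=2)

    def questionnaire_status(sent_at: datetime,
                             completed_at: Optional[datetime]) -> str:
        """Classify an SMS questionnaire as completed, pending, or void."""
        if completed_at is not None and completed_at - sent_at <= TIMEOUT:
            return "completed"
        if completed_at is None and datetime.now() - sent_at <= TIMEOUT:
            return "pending"  # respondent may still answer within the window
        return "void"         # unanswered, or answered after the window closed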

They also measured the response rate for each question. Sixteen seems like a lot of questions to answer on a mobile phone keypad, and the response rates showed that, regardless of the total number of questions, completion dropped significantly after just three answers. The research group had chosen sixteen questions based on previous research showing that completion rates were the same for eight-question questionnaires as for sixteen-question ones, but significantly higher than for twenty-four.

The research group also found that reimbursing respondents’ airtime before they answered the SMS questionnaire, rather than afterwards, yielded a 25% higher response rate. This reflects a phenomenon much discussed in behavioural science called ‘reciprocal behaviour’: receiving a gift creates an incentive to do something in return, which in this case would be to complete the questionnaire.

This session was particularly useful because it built an understanding of the small factors that can easily be taken into account to make sure that a programme has an efficient mechanism, technological or not, for monitoring and evaluating its impact.

To end the event, Dr Baldev Singh of Imagine Education UK presented his innovative tech tool for measuring the impact of the Connecting Classrooms programme. Instead of asking the teachers whether they had absorbed the training, monitoring and evaluation could happen in the classroom, based on how well the students demonstrated the core skills they were meant to be taught. In each lesson plan, teachers set objectives for each core skill that could be demonstrated by the students.

To summarise the tool: each student, and the teacher, is given a card with a QR code. Every time the teacher thinks a student has exhibited a core skill, they scan the student’s QR code with their mobile phone, and the scan is registered on a central system. Peers and the students themselves can also scan the codes. Because the data from these different sources can be triangulated, the risk of codes being scanned when it is not deserved is limited.
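
The tool’s internal workings were not shown at the event, but as a minimal sketch of how such triangulation might work (the data model and the two-of-three corroboration rule below are my own assumptions):

    from collections import defaultdict

    # Hypothetical data model: each scan records which kind of observer
    # (teacher, peer, or the student themselves) credited the skill.
    scans = [
        {"student": "S01", "skill": "critical thinking", "scanned_by": "teacher"},
        {"student": "S01", "skill": "critical thinking", "scanned_by": "peer"},
        {"student": "S02", "skill": "citizenship", "scanned_by": "self"},
    ]

    def corroborated_skills(scans, min_sources=2):
        """Count a skill as demonstrated only when at least `min_sources`
        distinct observer types have scanned the student's code for it."""
        observers = defaultdict(set)
        for scan in scans:
            observers[(scan["student"], scan["skill"])].add(scan["scanned_by"])
        return [key for key, who in observers.items() if len(who) >= min_sources]

    print(corroborated_skills(scans))
    # [('S01', 'critical thinking')] - S02's lone self-scan is not corroborated

Under a rule like this, a student scanning their own code in isolation would not register as a demonstrated skill, which is one way triangulation could limit undeserved scans.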

This pilot was applied to one specific programme, and its ability to measure learning outcomes was regarded as successful. However, to address the Principles for Digital Development, it would have to consider more thoughtfully the resources needed to scale it, the openness of the data (I saw no evidence that the people using the technology were receiving any feedback on the data they were providing), and how it could be built on and improved for use in other programmes. That being said, as a solution offering ‘rapid response’ and ‘access to new data’, amongst the British Council’s other criteria, it serves well.

These two events gave an insight into the sector’s existing knowledge of how to use technology in development. It was reassuring to see that although the sector struggles with concerns around privacy and openness, and around scale and sustainability, there is also a mutual incentive to collaborate on finding innovative ways to use and apply technology in a way that benefits the people we work with.