1. Privacy

Responsible data management is not new to development. However, the use of technology-enabled tools for M&E has raised new challenges related to the privacy of individuals, including the growing use of biometric data for tracking and of sensors to monitor daily habits. The collection of personal financial information and affiliations has also made it vital to consider data security when setting up an M&E framework. This can be addressed through data encryption, ensuring that individual data is not easily identifiable, and developing a policy that ensures responsible data practices. Furthermore, organisations need to be aware of the ethical implications of collecting data on people and of the need to secure all required permissions and consents. It is also important to be transparent with the individuals concerned about how data is collected, why it is collected and how it will be used. Finally, ownership has to be explicit when information is shared, and a plan should be in place for what happens to the data once a project ends. In South Africa, the Protection of Personal Information Act, 4 of 2013 adds a relevant and interesting dynamic.
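To make the de-identification point concrete, here is a minimal sketch in Python of one way a team could pseudonymise direct identifiers in survey records before storing or sharing them. The field names, the example record and the keyed-hash approach are illustrative assumptions, not a prescribed standard or a substitute for a full responsible-data policy.

import hashlib
import hmac

# Assumption: the key is stored securely, separately from the dataset itself.
SECRET_KEY = b"replace-with-a-key-kept-outside-the-dataset"

def pseudonymise(record, identifying_fields=("name", "national_id", "phone")):
    """Return a copy of the record with direct identifiers replaced by keyed hashes."""
    safe = dict(record)
    for field in identifying_fields:
        if safe.get(field) is not None:
            digest = hmac.new(SECRET_KEY, str(safe[field]).encode("utf-8"), hashlib.sha256)
            safe[field] = digest.hexdigest()[:12]  # short, consistent pseudonym
    return safe

# Hypothetical survey row: the same person always maps to the same pseudonym,
# so records can still be linked across survey rounds without exposing identities.
survey_row = {"name": "T. Mokoena", "national_id": "8001015009087",
              "phone": "+27820000000", "village": "Acornhoek"}
print(pseudonymise(survey_row))

Encrypting data in transit and at rest, and keeping the key separate from the dataset, would still be needed alongside this kind of de-identification.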

2. The end-user in mind

Taking a human-centered design approach to selecting technology-enabled tools will ensure that the organisation does not end up with an irrelevant or unnecessary tool. The approach starts with identifying what is desirable (consider project managers as well as community members, i.e. the people who will actually use the tool), then viewing the solution through a feasibility and viability lens. This increases the usability of the tool and ensures that no segment of the community is “ignored” as a result of the selected tool, i.e. it forces you to think about the accessibility of the tool and the training that would be required. Once identified, the tool should be piloted on one project before being rolled out.

3. Getting the right balance

Technology facilitates, but does not replace, M&E methodologies such as a well-thought-out theory of change and a quality M&E plan. It can be tempting to fall into the habit of selecting tools, or collecting data, based on what is easiest rather than on what really matters to your program. Furthermore, technology can lead to over-dependence on digital data and to missed opportunities to observe and interact with communities in order to get a comprehensive picture of an intervention. To get the balance right, be very clear on the value the tool will add.

Although there are other factors to consider, the above three points offer a good guide to anyone contemplating the use of technology-enabled tools in their programs. With the ever-growing need to understand and measure impact, the integration of technology, from the delivery of services and the monitoring of interventions to the evaluation of programs, will continue, as it offers possibilities for increasing reach, moving to scale and improving the efficiency and effectiveness of interventions.

This article was originally posted on the Tshikululu Social Investments blog. Photo courtesy of Jan Truter, Creative Commons.

About Amira


Amira Elibiary is a Monitoring and Evaluation (M&E) specialist with 10 years of experience in research, grant-making and program management, including over two years in the corporate social investment sector working on education, health and social development projects. She has a keen interest and extensive experience in democracy, governance, advocacy and rule-of-law work. Amira holds a Master’s degree in International Affairs from American University and a BA degree in Economics.

By Kevin Flanagan and Yuting Liao

A few weeks ago, my colleague Yuting Liao and I had the opportunity to attend MERL Tech—an industry conference of sorts designed to bring together M&E practitioners, researchers, technologists, and development professionals—on behalf of the Monitoring, Evaluation, and Learning (MEL) team at the National Democratic Institute (NDI).

At NDI, the MEL team is always on the lookout for innovative M&E practices that can be effectively applied to the democracy and governance sector, and this event seemed like an excellent opportunity to explore the “big ideas” and take part in a larger discussion: what can information and communication technologies (ICT) offer monitoring, evaluation, and learning work as the pressure to integrate ICTs into many aspects of development programming continues to rise?

Offering nearly thirty presentations, the event provided us with ample opportunity to sit back and revel in the opinions of the experts, as well as to contribute meaningfully to roundtable discussions and collaborative brainstorming activities. Here are our five takeaways:

1. More data does not necessarily mean more learning

ICT can make data collection easier; however, it’s crucial to ask the question: is this the data we need? “Big data” is enticing, and a common mistake of the novice researcher is to collect as much data as possible. But will that data answer your evaluation questions, or will it simply be distracting? While collecting larger volumes of data can certainly yield unexpected observations, if data collection is not strategically tied to your evaluation questions it does not necessarily lead to better learning. Quality is more important than quantity.

2. ICT can increase the level of risk for the subjects of the evaluation

Data hacks happen, so start by being scared. Whether we want to admit it or not, ICT implementations introduce additional risks to M&E work, particularly when it comes to privacy and data security. And yet, too often M&E practitioners don’t address the risks until after a breach happens. Worry about this in advance and create a threat model to assess assets, risks, and vulnerabilities.
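As a rough illustration of what “worrying in advance” could look like, here is a minimal sketch in Python of a simple threat-model register that scores assets by likelihood and impact so mitigations can be prioritised. The assets, threat scenarios and scores are hypothetical examples, not a standard framework.

from dataclasses import dataclass

@dataclass
class Threat:
    asset: str        # what we are protecting (e.g., survey responses)
    scenario: str     # what could go wrong
    likelihood: int   # 1 (rare) to 5 (almost certain)
    impact: int       # 1 (negligible) to 5 (severe harm to respondents)
    mitigation: str   # planned response

    @property
    def risk(self) -> int:
        return self.likelihood * self.impact

register = [
    Threat("Beneficiary contact list", "Laptop with an unencrypted export is stolen", 3, 5,
           "Full-disk encryption; no local exports"),
    Threat("Mobile survey accounts", "Shared enumerator password is phished", 4, 3,
           "Individual accounts with two-factor authentication"),
    Threat("Cloud dashboard", "Public link shared beyond the evaluation team", 2, 4,
           "Role-based access; quarterly review of sharing settings"),
]

# Highest-risk items first, so the team knows where to act before a breach happens.
for t in sorted(register, key=lambda t: t.risk, reverse=True):
    print(f"risk={t.risk:2d}  {t.asset}: {t.scenario} -> {t.mitigation}")

Revisiting a register like this at regular intervals keeps the assessment current rather than waiting for a breach to force the conversation.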

3. Be a data-led organization, not just data-driven

While ICT does help improve data accuracy, organizations that embrace a “data-led” mentality will empower their users to strive to better understand data and incorporate it into their decision-making processes. Successful learning initiatives rely on better interpretation and analysis of data, and ICT for evaluation is useless without capable analytical and sector experts.

4. ICT can expand your sample size, but be mindful of the unexpected challenges in sample bias

When collecting data, ICTs can expand the reach of your evaluation efforts, creating opportunities to capture data beyond the traditional “beneficiaries” of a program. However, the “digital divide” may perpetuate the issue of sample bias, and your results may be valid only for those segments of the population with digital access.

5. There’s no ICT “quick-fix” to improve monitoring & evaluation

While it’s possible to achieve a high level of methodological rigor through carefully designed ICT studies, it’s not always easy to do so—often being technically complex, expensive, and time-consuming. Most importantly, effective ICT is built on sound monitoring & evaluation strategies, and incorporating ICTs into M&E requires long-term institutional commitment and evaluation capacity development.

Despite the wide breadth of content, there was a common theme: “It’s ok to reinvent the wheel, not the flat tire.” These words, spoken by Susan Davis during a five-minute “lightning” presentation, struck an unexpected chord with the audience, attendees and presenters alike. Whether they are words of comfort for the tech-timid or caution for the tech-tenacious, Davis pointed us all to the indisputable fact that it’s okay to look to new technologies to address old problems in development, as long as we are aware that any new process, tool, or approach has just as much potential to fall flat as its predecessors. The successful integration of M&E and ICT is fully reliant on sound monitoring and evaluation strategies and realistic expectations.

 

Kevin Flanagan
Kevin Flanagan is a TechChange alum from the Technology for Monitoring and Evaluation course. He is a learning technologist on the Monitoring, Evaluation and Learning team at the National Democratic Institute.

Yuting Liao
Yuting Liao is senior assistant for data analysis and visualization on the Monitoring, Evaluation and Learning team at the National Democratic Institute.

The National Democratic Institute is a nonprofit, nonpartisan organization working to support and strengthen democratic institutions worldwide through citizen participation, openness and accountability in government.

Featured image: Wayan Vota, Flickr

One of the things that sets the TechChange platform apart from other online courses is the network you are immersed in. Today, we are excited to highlight one of our alumnae, who carried the network she built in one of our courses into the incredible work she’s doing. Devjani joined us in our Technology for Monitoring and Evaluation course a few months ago, and her curiosity about the role of technology in M&E led her to take a deep dive into one of the tools she was introduced to in the course. We caught up with her on how the course has influenced her work.

How did you come across our Tech for M&E course?

I was on a year-long sabbatical in London from the National Hydroelectric Power Corporation (NHPC) in India. During my sabbatical, I wanted to learn more about the role of technology in monitoring and evaluation. I was looking for a course that would not only give me a sound knowledge of the subject but would also be cost-effective. So, after a few Google searches, I came across TechChange’s course on Technology for Monitoring and Evaluation and immediately signed up!

Why were you curious about the role of tech in M&E?

NHPC’s power projects are located in far-flung villages, from Jammu and Kashmir in the North to Arunachal Pradesh and Manipur in the North-East. The project locations are very remote, without infrastructure, roads, or communication, which makes survey, investigation and project completion very challenging. Because of this remoteness, we have often noticed that socio-economic surveys were not being conducted very diligently, and we have found many discrepancies and inaccuracies in the data collected on project-affected families during Social Impact Assessment studies. Manual data collection in these remote villages has been cumbersome, and staff have to carry GPS devices, cameras and paper into the field. Sometimes the affected villages are so far out that it takes more than a day to reach them on foot, and the data collected in paper surveys is then re-entered manually into Excel sheets, leading to a high margin of error.

After the manual collection of data, the monitoring is done through site visits (once or twice a year) and the data collected by external consultants doesn’t always reach the NHPC, making it difficult to have baseline information on the families affected by the project.

I knew there had to be a better way to collect data and do M&E!

How did TechChange provide you with the information you needed to make an impact?

In the TechChange course, I was introduced to so many possibilities of ICT in M&E, but I was hungry for more and wanted to take an even deeper dive. Of the various M&E tools I got to know through the course, I was most interested in learning more about a data collection tool, Akvo FLOW. I had attended a guest expert session with Marten Schoonman of Akvo, so I reached out to the course facilitator to help me get in touch with him. Marten put me in touch with the Akvo India group, and I was able to attend a five-day training session in India with the Akvo team to learn all about the tool and how I could use it in my organization.


Picture from the Akvo FLOW training in India

I attended the five-day training, organised by the India chapter of the Akvo group, along with 40 other attendees. Everyone came from diverse backgrounds; some were tech savvy and highly educated while others were handling smartphones for the first time. It was a great opportunity for me to really dive into one specific tool that I was curious about.

Tell us more about Akvo FLOW

Akvo FLOW is a data collection platform that runs on inexpensive Android mobile phones. It has helped me with all of the following:

  • Getting the GPS coordinates
  • Obtaining authentic data from inaccessible respondents
  • Surveying a greater geographical area at lower costs
  • Ease of obtaining data on the dashboard
  • Using it in areas where there is no mobile network
  • Managing a database of respondents who can provide loads of data that can be used for baseline studies and longitudinal impact assessments
  • Geo-mapping of sampled areas and use of media such as pictures, audio and video
  • Creation of identity cards
  • Data analysis and evaluation

Since NHPC’s projects are located in remote areas all over India, we can use Akvo FLOW to collect data from all projects in a standardised format and have it all in the same platform.

Any advice for someone considering a TechChange course to break into the ICT4D field?

Today, ICT has become an indispensable tool to bridge distances and spread the seeds of development in the remotest corners of the world. For funders and development workers, monitoring in remote areas is a daunting task, but the expansion of ICT has paved the path towards limitless possibilities, providing easy access to the information needed to make informed decisions. ICT4D is not just about computers, mobile phones and the internet; it is also about helping, supporting and building the capacity of the people who use them, and linking them with the right communities.

Here are five reasons I would always recommend TechChange courses:

  1. Extremely helpful and cooperative course coordinators. I was totally new to ICT when I enrolled in this programme, so I was prepared to feel lost. But with the help of the facilitators and guest experts, I got hands-on experience with the Akvo FLOW tool. This has opened a new door of opportunity for me, and I am excited to apply what I learned to my work.
  2. Course material that goes beyond PowerPoints. It’s very interactive, with plenty of networking opportunities, group discussions and presentations, live question and answer sessions, feedback from other participants, video chats, etc., making the whole learning process very engaging and enjoyable. There are plenty of opportunities to learn from the other course participants as well, who are very experienced and come from different parts of the world with different sets of skills and approaches.
  3. Courses are very well crafted: thorough yet easy to follow, with excellent, up-to-date study materials.
  4. Connecting with guest experts who are leaders in the ICT4D industry. It’s very difficult to get access to such a range of good study materials and get to hear from the experts themselves.
  5. Courses are easily accessible. All the course materials, even the live events, are recorded and archived for those who could not attend the live sessions. The materials also remain available for four months after a course ends, which gave me ample time to catch up and get a grasp of the whole syllabus.

What’s next?

It is going to be a daunting task to introduce ICTs in our public sector, but I am going to hold brainstorming sessions with senior officers from various organisations to understand how we can effectively use these tools in our crucial surveys, in the monitoring of community development and CSR programmes, and in conducting Social Impact Assessment studies.

“Nothing stops an organization faster than people who believe that the way they worked yesterday is the best way to work tomorrow. To succeed, not only do your people have to change the way they act, they have got to change the way they think about the past.” – Former Chairman of KPMG International, Jon Madonna.

So my journey begins here. I know it will be an uphill task to convince the system to accept the Akvo FLOW tool, which opens up a whole new world of data transparency and authenticity, and to successfully integrate it into our various development programmes.

I would really like to thank the whole TechChange team for giving me this opportunity to share my views and experience with you all. I will come back to them over and over for many more courses that will not only support my professional development but also give me the inspiration and satisfaction to work successfully in the field of social development.

***
Interested in learning more about data collection? You can still join our Tech for Data Collection and Survey Design course that started on Monday! Want a more immersive experience? Check out our Tech for M&E Diploma program!

About Devjani
Devjani has been working for over 18 years in India, with a successful track record in environment, social development, CSR and sustainability. Her experience extends across large hydro and mining projects within the government and NGO sectors and with international funding agencies. She has travelled extensively in India to conduct a range of investigative studies. Devjani is a Registered PRINCE2 Practitioner and has a postgraduate degree in Public Systems Management with a specialisation in Environment. She has also completed a Post Graduate Certificate course in Environmental Impact Assessment with a specialisation in Strategic Environmental Assessment.

Featured image: Isha Parihar from Akvo India trying to calibrate the GPS in the field

Last month, for the first time, we launched a whole diploma program in technology for monitoring and evaluation. We started this program to give our community a learning experience that lasts longer than a four-week course, and to offer quality learning without breaking the bank. We were excited to launch this new program and to see our community’s response.


Today, we are happy to announce that we have over 90 participants enrolled in our pilot program! They are joining us from 30 different countries, including the U.S., Bolivia, Ethiopia, India, Jordan, Kenya, the Netherlands, Canada and more. This week, we are wrapping up the first course in the diploma program, Technology for Monitoring and Evaluation, which ends with a workshop.

The next course in our diploma program is Technology for Data Collection and Survey Design, which begins on October 19. In it, we will explore new tech tools, learn how to design and deploy digital surveys, and learn how to combine active and passive data collection.

If you are interested in joining the diploma program for the second intake, you can sign up today!

In August, along with announcing our Tech for M&E Diploma program, we kicked off an M&E Professionals Series, in which we talk one-on-one with M&E professionals to give you the inside scoop on the industry.

For this second post in the series, we are featuring an interview that one of our alumni, Stephen Giddings, conducted with Janet Kerley, Senior Director of the Monitoring and Evaluation Practice at Development and Training Services, Inc. (dTS), a Virginia-based consulting organization that does considerable work with USAID.

Janet Kerley
Janet Kerley is a master evaluator and an accomplished trainer in evaluation and performance measurement. As Senior Evaluator in the Monitoring and Evaluation Practice at dTS/Palladium, she provides technical leadership for evaluations in the M&E unit, provides technical direction on design and field methods, and supervises the preparation of evaluation reports. As Chief of Evaluation, Research and Measurement for the Peace Corps, she established an impact evaluation system for the agency.

Ms. Kerley was the Team Leader for Monitoring and Evaluation in the Office of the Director of Foreign Assistance, US Department of State, leading a 200-member inter-agency team to develop standard indicators for the 2007 Foreign Assistance Reform reporting tool. She worked at USAID in the Bureau for Policy and Program Coordination (CDIE) and as the Monitoring and Evaluation Officer in the Bureau for Africa and the Bureau for Europe and Eurasia. Prior to joining USAID, Ms. Kerley was a Senior Research Associate at Aguirre International. She has lived and worked in many countries in Latin America and Africa.

S: How has technology changed the way M & E is conducted over the past decade in international development?
J: The change has been remarkable! A decade ago, most data gathering and analysis work was paper-based, making it difficult, time-consuming, and costly. Especially in overseas environments, it took considerable time and effort to gather, transcribe (and often translate) and analyze the data. Today, tech tools have made data collection and analysis more efficient, saving time and money.
However, there is still a considerable “digital divide” between the much more tech-savvy young people and the older professionals originally trained using SPSS (or even earlier) technologies.

S: Does paper-based data collection still have a place in M&E today?
J: Yes — in certain circumstances paper-based data collection may be preferred.

In very rural areas where electricity may not be available, where batteries for electronic devices cannot be charged, or where internet connections or mobile phone services are inconsistent or unavailable, paper-based data collection is still the best option.

Not everyone is comfortable with data collection using electronic devices, but they may be more open to paper-based questions.

S: What are some of the pitfalls of some of the popular tech-based data collection tools?
J: With so much tech available, it is easy to get carried away.

Some less experienced or less than fully trained data gatherers may lose sight of the fundamental questions the monitoring or evaluation is trying to get at. If evaluators lack sufficient training in sound principles of research, they may be tempted to substitute technology for sound reasoning and good judgment.

Some data collection tech tools also tend to collect too much data, some of which may be irrelevant to the task at hand. USAID, in particular, is burdened by data overload, where data management systems fail to filter out data that is of little use, which complicates monitoring and evaluation practice.

S: What challenges have USAID Missions faced when integrating new technologies into their M & E functions?
J: By and large, USAID Missions have been quite open to technological improvements to M&E functions. That said, there is still a “digital divide” where younger employees (including local staff) who have grown up in the digital age are more comfortable with and more adept at using new technologies to enhance M & E. But more senior and older USAID staff seem generally open to embracing and appreciating the advantages that new technologies can bring to M & E while leaving the technical analysis and the new data gathering tools to younger techie staff. USAID staff have generally been very receptive to training in using new M & E technologies to their advantage.

S: Have new evidence-based technologies made decision-making by senior USAID staff easier and more informed?
J: Most USAID Mission Directors recognize the value that good evidence on performance can bring to the achievement of program results, and the added clarity that good data and visually well-presented documentation can bring to decision making.

Photo source: UNDP in Kigali, Rwanda (Creative Commons image)

S: What are the advantages of mixed methods evaluations?
J: The most important starting point for an evaluation is doing the research required to understand what questions you want answered. Only then should you begin to look at evaluation methodologies to acquire necessary information.

When done at a proper scale, well-executed quantitative data collection and analytical methods can bring statistical rigor and clarity. For example, the scale of some of the evaluations done for USAID’s food security (Feed the Future) programs has generally provided reliable data. Unfortunately, USAID Missions sometimes do not make sufficient budget available to ensure that the sample size for quantitative methods is large enough to draw reliable conclusions. This is where qualitative methods can help to fill gaps.

Storytelling, as an evaluation tool, is one of the most useful qualitative data collection methods. Sometimes quantitative data collection methods do not allow beneficiaries to open up and provide adequate and reliable information, but they respond much more positively if they are allowed to tell a story. If you get enough good stories, they can provide insights and nuances that purely quantitative methods cannot. Thus mixed-method evaluations can provide more reliable evidence of performance than quantitative or qualitative methods alone.

S: Do you think there is a bias towards quantitative methods in international development because of a lack of free and easy to use qualitative tools?
J: Not at all. Many USAID evaluations make good use of qualitative methodologies. A decade ago there was an overuse of the “windshield wiper” approach to evaluations (an evaluation team that is not given time to do adequate field work and simply reports what it observes “through the windshield”), but more recently qualitative methodologies have become more sophisticated and reliable and can provide a lot of extremely useful information for decision makers.

S: What questions should we be asking to select the best technology for M & E?
J: Evaluation planning should begin with framing the research questions: what is it that we need to learn? The preferred technological solution should be the one that can best answer the research questions, and it must also take into account cultural sensibilities. It is crucial that technology be viewed as a tool, not as a substitute for knowing the basic principles of research.

Stephen Giddings, a TechChange alum, served for 25 years as a Foreign Service Officer with USAID, retiring in late 2005. For most of his USAID career, he specialized in managing housing and urban development programs, serving in USAID offices in Panama, Kenya, Cote d’Ivoire, Russia and Rwanda, as well as Washington, D.C. During his last four years with USAID he was Chief of the Policy Division for USAID’s Africa Bureau.

For the past ten years Mr. Giddings has been an independent consultant, providing assistance to the Development Assistance Committee (DAC) of the OECD and consulting with USAID and the International Real Property Foundation (IRPF), among other international development organizations. He serves on the Development Issues Committee of the USAID Alumni Association and is Co-Chair of the Africa Work Group in the Society for International Development’s Washington, D.C. Chapter (SID-Washington). Prior to his USAID career, Mr. Giddings managed low-income housing development programs at the U.S. Department of Housing and Urban Development and was Director of Planning and Development at the Boston Housing Authority. Mr. Giddings received a BA in political science from Wesleyan University and an MPA degree from the Maxwell School of Citizenship and Public Affairs at Syracuse University.

——-

Hope you enjoyed our second installment of our M&E Professionals Series! Don’t forget to follow our blog for the next post in the series!

Interested in engaging in similar conversations with M&E professionals like Stephen and Janet? Join us in our upcoming course, TC211: Technology for Data Collection and Survey Design, which starts on October 19. If you want the whole package, you can join the second session of our Tech for M&E Diploma program.

We are excited to start the fourth iteration of our most popular online course, Technology for Monitoring and Evaluation! We asked some of our alumni who have taken the course what they got out of it. Here is what they had to say:

Ladislas Hibusu
Consultant at Jhpiego (Zambia)

This is the course that landed me a Monitoring and Evaluation consultancy job with Jhpiego as I approached the interview room with much tech knowledge and courage beyond my previous experiences.

Sahibzada Arshadullah
Senior Manager M&E at Cowater International Inc (Pakistan)

This is a must-take course for M&E practitioners, who can get hands-on experience with the latest tools and software necessary for data management, real-time monitoring, and evaluation. Given the ever-increasing role of information technology in the development sector and the beginning of the big data era, it has become important for M&E professionals to exploit the latest technological advances and equip themselves with the right tools and software to compete in the global market.

ARumsey CABI cropped
Abigail Rumsey
Content Developer (Technical Solutions) at Plantwise Knowledge Bank (UK)

The community created around this course is the most valuable aspect. There are people from all around the world sharing their experiences and knowledge, and learning together.

Niamh Barry
Global Lead on Monitoring and Evaluation at Grameen Foundation (Uganda)

This course was fantastic. The engagement platform was the best I have experienced; you feel part of a community, and it is so engaging (this is coming from someone who has lost interest in a few online courses before!). The facilitators, demos and guest speakers were well chosen. Do this course if you are just starting out in tech and M&E; if you have already started, it will show you how much more there is to learn and inspire you to try new innovations in your work.

Robert Kolbila
M&E Manager, Mennonite Economic Development Associates (Ghana)

Enrolling in this course has opened a new career path for me as a development practitioner. I have been exposed to modern tools and techniques that are fast changing the face of M&E in development practice globally. I was a Nutrition Coordinator at my organization when I joined the course, and I have now transitioned to M&E Manager of a $20 million project. This course has been life-changing for me.

Want to be our next success story? Our next Tech for M&E online course begins next week! Save your spot now!

We have equipped around 6,000 alumni around the world with similar skills through many of our other courses. To help our community grow even further, we are taking a step beyond a four-week online course and offering a brand new diploma program in Tech for M&E. Check it out here!

 

At TechChange, we pride ourselves on teaching our participants the crucial skills needed for a career in social good. And how do we find out what those skills are? We go straight to the source! In our new M&E Professionals series, we’ll be talking one-on-one with the pros who are recruiting for these very positions.

2015 was marked as the International Year of Evaluation, so it’s no wonder that M&E is increasingly becoming a sought-after skill in many organizations today. We spoke with Michael Klein to learn more about the ideal skills an M&E professional should have and how to get them:


Michael Klein is the director of International Solutions Group (ISG), a company that works with governments, U.N. agencies, international organizations, NGOs and other companies to improve the implementation of humanitarian aid and development programming projects.

Mike’s work is at the self-described intersection of ‘old-school’ M&E and ‘new-school’ ICT, working with partners to build on established M&E strategies, streamline data flows and analysis systems, and facilitate access to key information, knowledge, reporting and data in a fast, reliable and secure manner.

Q: It’s an exciting time to be an M&E professional. Do you see a big need for young professionals who are trained to take on this kind of work?

M: Yes, it is definitely an exciting time to be in my field. Just looking at the types of conferences and forums being held on the subject, it’s really clear that M&E is a rapidly developing focus in our field. Specifically, I see M&E growing in two separate, but overlapping, areas.

Standard M&E careers: If you were to search the job opportunities listed on Devex or Idealist, these are the typical M&E positions you would find, ones that take a traditional approach (i.e. M&E personnel are used to provide managers with the analysis and data they request). These opportunities are certainly growing, as organizations will always need highly trained staff to help address their M&E needs.

Beyond the standard label of M&E: Just as in other sectors, skillsets such as analytics, knowledge management, data collection, and information sharing are highly valued in M&E, and the field is increasingly embracing individuals who have these skill sets. This is especially true for people who understand analytics, and how data can be collected, used, and analyzed.

As more and more players in the field are using new technologies and tools for data collection and analysis, a college graduate or young professional entering the field of M&E has a great opportunity to make his or her mark on the industry by leveraging his or her digital knowledge to provide guidance to some of the most prominent development organizations.

Q: What do you look for when hiring?

M: First off, I think it’s important to have a passion, an area of expertise that you enjoy. As a profession, M&E can take you in myriad directions, and it is important to identify what type of work you most want to engage in. A strength of my team at ISG is that everyone has differing professional interests, ranging from gender equality to how ICTs can catalyze development. Having this range of specialties strengthens what we can offer clients, and when looking at new hires, I look for individuals who are already established or are on their way to becoming an expert in a field or sector.

Aside from passion and subject-matter expertise, people need to appreciate the big picture. When we look to make hires, we want an individual who understands what organizational performance means, ranging from back-office activities such as business development and marketing all the way through to front-line programming at the field level. If you’re interested in M&E, you have to understand that organizational effectiveness is made possible by a complex interplay of many elements. Having that general appreciation for how organizations function, the types of struggles they face, and how you can improve their performance is key to success in this field.

Mike with his group at ISG

Q: What does career progression look like for someone in M&E?

M: The reason I was drawn to this field is that there is no set path. Before working in M&E, I worked in Mergers & Acquisitions (M&A), which attracted me for similar reasons. When I worked in M&A, my clients represented a wide range of sectors, and I worked with management to help these organizations restructure, fundraise, find new investors, and generally position themselves to be more successful at what they were already doing (sound familiar?). Monitoring and evaluation is no different. People come to the field from a variety of different backgrounds—such as corporate finance like me, IT, agriculture, and plenty of others, including academic programs focused on M&E—and they serve clients representing the same diversity.

Because everyone enters the field with different skills, it is hard to say exactly how one progresses. Someone entering the field after studying M&E at a university will likely have a fairly technical background and may take an M&E support role within a larger organization, assuming more responsibility over the years. If you’re transitioning into the field with an already established skill set from a different sector, you’re likely to take a different direction and provide either consultancy services or specific project-level guidance related to your expertise.

Q: What role do you think technology plays in M&E?

M: My personal take is that technology in and of itself is not necessarily transformative. Whether M&E is done on pen and paper or done using state-of-the-art IT solutions, good M&E is good M&E. However, technology has allowed organizations to quickly come up to speed with regards to implementing more robust approaches to M&E. If an organization is just shaping their M&E approach, using a technology solution can offer ease and expediency. The way most of these tools are constructed is based on industry best practices, so their structured, hierarchical approach is what we think of as good M&E. Clients who use these tools are then trained to capture their data in a systematic way that has the end in mind.

A lot of our clients initially say, “I really like that heat map,” or cite another specific visualization they have seen produced by an M&E tool, and ask how they can create it for their program. This then launches into a discussion about how to collect data in a way that can deliver these types of reporting and highlight what is most important. It is much harder to go the other way.

Q: Any other pieces of advice you have for people considering the TechChange diploma in Tech for M&E?

M: Know how your study applies to what you want to accomplish and set some goals for yourself before you get into the nuts and bolts of the diploma program. If you come to these courses with a general idea of what skill gaps you want to address, you’re going to be very well placed to make the best use of the TechChange diploma program.

When I was taking one of TechChange’s M&E courses, I knew that I wanted to leverage the new skills I was developing to enhance my company’s marketing efforts. Thus, when I studied data visualization, I created multiple infographics for ISG, using tools that I would not have come across on my own. This was one of the best parts of the class: discovering cutting-edge tools in the field that I could utilize immediately.

———————————

That’s all for this installment of our M&E Professionals Series! Be sure to check out our Technology for Monitoring and Evaluation diploma program – the deadline to enroll is September 4!

If you have taken a TechChange course, you know that the participants are all doing amazing things wherever they are in the world. Some go on to start their own organization, some collaborate with other participants for future projects, and some take what they learned in the course and apply it in their current projects. Ameneé Siahpush took our Tech for M&E online course in January and has since been leading tech integration in Trickle Up’s M&E programs.

Tell us about yourself

A: I’m a Pacific Northwesterner who moved to New York City in 2010 after spending the prior few years in Latin America. My current role at Trickle Up is Senior Monitoring & Evaluations (M&E) Officer, where I support our economic and social empowerment programs in India and Central America. My work aims to increase our understanding of sustainable livelihood development for highly vulnerable populations, including outcomes around food security, health, coping mechanisms, and social empowerment. I’m particularly interested in expanding our use of participatory methods to improve and deepen our program learnings and developing simple mechanisms for sharing knowledge across participants, partners, staff, and offices. (If you have any ideas, please let me know!)

What does Trickle Up do?

A: Trickle Up is an international NGO that works to create a world in which it is unacceptable for anyone to live in extreme poverty. In collaboration with local partner organizations, we empower and support the poorest and most vulnerable people to develop the confidence and knowledge to build sustainable livelihoods by 1) providing training, coaching, and seed capital grants to jumpstart microenterprises; 2) forming savings and credit groups to build financial capital and literacy; and 3) improving access to information and financial, health, and social services. We also provide technical assistance to other development organizations and government agencies to help them deliver social empowerment and economic programming that reaches “last mile” populations, including women, people with disabilities, and marginalized ethnic populations living on under $1.25/day in rural areas. Trickle Up currently works in India, Central America, South America, West Africa, and the Middle East.

How did you hear about TechChange?

A: My colleague at Trickle Up learned about the Technology for M&E course through a Yahoo M&E group, and quickly forwarded me the information given my interest in the topic.

Why did you decide to enroll in the Tech for M&E course?

A: I feel very fortunate to work for an organization that has invested in a robust M&E system, including the use of mobile data collection for some of our projects. However, as we scale our programs, it’s essential that we adapt our M&E systems to become more efficient and effective across an increasingly large and diverse number of partners and program participants. Integrating new technologies and tools is key in this adaptation process – yet, I knew that I needed very practical guidance in understanding which combination of technologies and tools would be best suited for Trickle Up’s current and future programs. The Tech for M&E course felt like the perfect companion for exploring these issues. It offered practical tools and resources, connection to a wide network of experts, forums to collaborate with other NGOs, and flexible access to course materials to accommodate my travel schedule. I also really appreciated that the discussions were geared towards international organizations who often work in remote, rural places where connectivity and electricity challenges must be considered in their M&E tools.

How has the course impacted your work at Trickle Up?

A: I entered the course with a deep interest in exploring technologies to increase the efficiency and quality of our M&E data. I came out of the course with the language, framework, tools, and resources to actually take the lead in designing and implementing new technologies within Trickle Up’s M&E system. Since completing the course, I have successfully added “M&E tech upgrades” into our upcoming year’s strategic plans. This includes a detailed roadmap of how we will integrate and utilize mobile data collection and a data visualization/reporting platform across all of our projects to increase access to real-time data for project management, promote cross-regional learning, and, ultimately, improve our ability to direct resources towards combating extreme poverty. Yes, it’s a very lofty goal, but one that is greatly enabled by simple technologies that help to ensure our program data is more efficiently and effectively used.

What advice would you give other participants taking a TechChange course? How can they get the most out of it?

A: If possible, approach the course with a specific, tangible challenge that you hope to confront in your daily work. Keep this challenge in mind as you choose which webinars to attend or resources to explore, and then organize your course notes in a way that will be easily accessible in the future.

Another obvious, but important, suggestion is to be an active participant! Connect with fellow students, ask questions, follow up with presenters, experiment with the recommended tools. Luckily, the course provides a wide variety of ways to engage with the materials and people, despite being in different time zones, and everyone felt very approachable and enthusiastic. We’re all current or future tech nerds, after all.

You can join participants like Ameneé in our next Tech for M&E course in September. If you are looking to dive deeper, check out our brand new Diploma Program in Tech for M&E.

About Ameneé
Ameneé is the Senior Monitoring & Evaluations Officer at Trickle Up, where she supports their economic and social empowerment programs in India and Central America. She holds a BA in sociology and psychology from the University of Oregon and an MPA, with a specialization in international policy and management, from the Wagner School of Public Service at New York University (NYU). As an NYU Gallatin Global Fellow in Human Rights, Ameneé partnered with Global Workers Justice Alliance to conduct research on gender and migration in Oaxaca, Mexico, and has spent multiple years in Latin America, more broadly, volunteering with small-scale farmers and studying Spanish. Prior to Trickle Up, Ameneé was a Program Evaluator at Morrison Child & Family Center in Portland, OR, and a Research Supervisor at the Oregon Social Learning Center. Outside of work, Ameneé loves to play soccer, dance, and spend time in the mountains.

By Norman Shamas and Samita Thapa

In a previous post, we wrote about why global development practitioners need to be data skeptics. One of the many reasons we need to be skeptical about the data we collect is the bias built into it. This bias is especially significant when it comes to gender: women and groups that don’t identify with binary genders are largely missing from, or misrepresented in, global development data.

Data is a crucial component of any program or intervention. It justifies the need for a specific program, shows its effectiveness, and allows us to improve it through evaluation. But this important information tells us little if more than half of the population is missing or misrepresented, so we need to start looking at data with a gender lens.
Image: Data in the program cycle

Data on women raises awareness of women-related issues

With 62 million girls not attending school worldwide, the U.S. government was able to justify its “Let Girls Learn” initiative. The initiative was announced in February and is aimed at making education a reality for girls around the world. USAID is one of the agencies involved in the government-wide initiative and has presented its approach with data to support it.

But there is still a problem getting good data on women. GSMA’s 2015 Bridging the Gender Gap Report highlights two systemic barriers to mobile ownership and usage for women:

  1. Lack of disaggregated data, and
  2. Lack of focus on women as a market.

However, we need better gender data for more than just economic purposes. Oxfam conducted a household survey on the gendered effects of the December 26, 2004 tsunami that hit several Asian countries, and found that women were more severely affected than men. Despite the need for better gender data in the field, it is not always collected. A lack of data on women leads to a lack of awareness of issues affecting women and, consequently, a lack of programs designed to tackle those issues.

Survey design can promote non-binary gender inclusion

The problem of gender and data bias gets even more complex when we talk about non-binary genders. Twitter, for example, determines its users’ gender based on data it analyzes from tweets. There are only two gender options, male and female, and users cannot opt out of automatic gender assignment or manually choose their gender. Because Twitter uses a male/female gender binary, individuals who do not identify with a binary (e.g., transgender individuals) or who have anatomically mixed sexual characteristics (i.e., intersex individuals) are ignored in the data.

It is important to ask questions about gender on a survey in order to improve interventions. Instead of restricting gender to a binary, surveys can offer an option to opt out or to self-describe as ‘other’. Where appropriate, additional questions can be used to determine whether practice and self-identification fit into pre-defined categories. A sketch of one way this could look in a survey definition follows below.
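As a minimal sketch of the suggestion above, the snippet below (in Python) shows a survey question definition and validation rule that accept a self-described gender or no answer at all, rather than forcing a male/female choice. The option labels and the validate() helper are illustrative assumptions, not a recommended wording for any particular context.

# Hypothetical question definition: a self-describe option plus an explicit opt-out.
GENDER_QUESTION = {
    "name": "gender",
    "label": "How do you describe your gender?",
    "choices": ["Woman", "Man", "Prefer to self-describe", "Prefer not to say"],
    "allow_other_text": True,   # free-text field shown when self-describing
    "required": False,          # respondents may skip the question entirely
}

def validate(answer, question=GENDER_QUESTION):
    """Accept a listed choice, a self-description, or no answer at all."""
    choice = answer.get("choice")
    if choice is None:
        return not question["required"]
    if choice not in question["choices"]:
        return False
    if choice == "Prefer to self-describe":
        # Keep whatever the respondent wrote; do not force it into a pre-set category.
        return question["allow_other_text"] and bool(answer.get("other_text"))
    return True

print(validate({"choice": "Prefer to self-describe", "other_text": "hijra"}))  # True
print(validate({"choice": None}))                                              # True: opt-out allowed

In practice, the exact wording and options should be localised and decided with the communities concerned, as the next section discusses.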

Data must represent local gender categories

It is also important to localize the survey where gender categories and practices may vary. India acts as a good case study for the difficulties in language for demographic purposes. India initially provided three gender options: male, female, and eunuch on its passport forms. However, these three categories marginalized other transgender populations, so in 2014 Indian courts changed the category of ‘eunuch’ to ‘other’ on the election ballots. This simple change in language not only promotes the human rights of India’s non-binary gender individuals, but also provides better data on its non-binary gender communities.

The hijra are a transgender community that has existed in South Asia for over 4,000 years. Along with a few ‘Western’ countries, at least four South Asian countries — Nepal, India, Pakistan, and Bangladesh — recognize a third gender in some legal capacity.

Global development is moving forward with programming for non-binary gender communities. The Swedish International Development Cooperation Agency put out an action plan for working with lesbian, gay, bisexual and transgender (LGBT) issues from 2007-2009. Last year USAID announced its LGBT Vision for Action, a general roadmap of how the donor would support non-binary gender communities. As programming for non-binary gender communities continues and increases, we need to think closely about the language we use and how we collect data on gender to create effective, data-driven interventions.

With development becoming more data-driven, the data we collect and the biases embedded in it are having a larger impact. Technology can make these biases more entrenched through features like data validation. Data validation is important for survey collection, since it limits responses to particular choices or data types (e.g., phone numbers), but it also restricts options to those chosen by the survey creator and can marginalize groups whose identities are not included or allowed as a valid option. Going forward, we need to be careful that we are not unintentionally marginalizing other groups or genders with the data we collect.

Interested in engaging in similar conversations around data and tech in M&E? Join us and more than 90 other international development practitioners in our upcoming course on Technology for Monitoring and Evaluation.

Image Source: Global Forest Watch

Open data has been a popular topic in international development recently, but what about the evaluation of open data programs? We are excited to welcome Dow Maneerattana from World Resources Institute (WRI) to our upcoming Tech for M&E online course to talk about evaluations of open data movements. Before her guest expert session next week in our course, we asked her for a little sneak peek:

What does World Resources Institute do?

World Resources Institute is a global research organization that turns big ideas into action! We focus on six critical issues at the nexus of environment and development: food, forest, water, climate, energy, and cities and transport.

What is your role at WRI?

I work on one of the critical issues of our time: deforestation. We have very big ideas about how to reduce deforestation, and I have a unique opportunity to build monitoring and evaluation (M&E) processes and systems from the ground up to ensure that we are capturing results and impact, and to adapt our strategy and approach over time. I am involved in WRI’s M&E advocacy, in trainings on the basic functions of M&E at all stages of a project, and in finding strategic opportunities to learn from our work through evaluations.

What will you be speaking about in the course?

I’m excited to share my passion for evaluations (and geek out) with the course participants. I will talk about emerging trends and impacts of open data movements in developing countries and discuss our initiative’s evaluation design, which includes Global Forest Watch, an interactive forest monitoring and alert system and open data portal. Global Forest Watch leverages remote sensing, big data and algorithms to produce the timeliest and most precise information about the status of forests, including near-real-time suspected locations of recent tree cover loss.

What excites you the most about Tech for M&E?

I am most excited about the opportunity to shift the dialogue on monitoring and evaluation, which often gets a bad reputation for being full of confusing terminology or for being synonymous with donor requirements. Once you move past the superficial fear, monitoring and evaluation tools and processes are common-sense solutions, and when they are coupled with appropriate technology they can create one powerful, time-saving machine. It’s been fascinating to see the many ways smart techies create and apply tech-for-M&E solutions for international development. For me, I enjoy connecting the dots and putting powerful tools in the hands of people with the least power.

We are looking forward to Dow’s session in our Tech for M&E online course that begins on Monday, April 20. Want to engage in conversations about the use of tech in M&E with guest experts like Dow? Join us in our upcoming online course on Technology for Monitoring and Evaluation.

About Dow


Dow is a monitoring and evaluation manager at the World Resources Institute, a research organization that works on issues at the intersection of environment and development. She loves to empower teams to systematically capture success stories as well as learn from evaluations and often draws from her international experience as a community organizer and human and labor rights activist. Dow has worked on impact and performance evaluations of governance projects at the National Democratic Institute and on rule of law, human and labor rights advocacy, and livelihoods projects with Pact and Human Rights and Development Foundation. In her free time, you can find her playing tennis, glamping/camping, or mentoring an 8th grader through Asian American Lead.