By Nick Martin, Founder and CEO of TechChange

From the impact of AI on the human rights of marginalized populations, to its potential to help address (and exacerbate) the climate crisis, even to the use of AI tools for public health writing, artificial intelligence was on EVERYONE’s agenda at the recent Global Digital Development Forum, a signature TechChange event. This included the launch of a brand-new tool at the conference’s closing keynote, which featured speakers from USAID, the Government of Canada, the Global Center for Responsible AI, and more.

That tool is The Global Index on Responsible AI, the first global report to establish benchmarks for responsible AI and assess those standards across 138 countries, providing an overview of how nations are (or aren’t) addressing the ethical, social, and regulatory challenges of AI.

As Sabeen Dhanani, Deputy Director for the Technology Division of USAID, said in her opening remarks on the launch: “The fact is that we are all working in real time to determine the right approach to AI, and in order to chart a path forward we need to see where we currently stand.” 

The Global Index on Responsible AI is taking on that challenge, and its findings were likely not entirely surprising to anyone who’s been paying attention. Here are a few of the top takeaways from the report that stood out to me:

Key issues of inclusion and equality in AI are not being addressed

Governments are largely not prioritizing issues of inclusion and equality, which means that existing gaps may be exacerbated by AI’s spread. But civil society (including researchers, development organizations, and universities) is helping to draw governments’ focus to these issues. I like to think that the conversations and workshops at GDDF add much-needed fuel to this fire.

There are major gaps in ensuring the safety, security, and reliability of AI 

This one’s more about the technical security of AI, something I confess I hadn’t thought much about. According to the report, only 38 countries have taken any steps to address the safety and reliability of AI systems. This is a major oversight, and one that will take the work of committed professionals in diverse contexts to remedy.

There’s still a long way to go to achieve adequate levels of responsible AI worldwide

Nearly SIX BILLION people are living in countries without adequate policies or oversight to protect their human rights from AI. AI can go instantly global, so even if it’s developed in a place with regulation, it can quickly reach more vulnerable groups in settings where there is no such oversight. And don’t get it twisted: the majority of countries have yet to take any action to protect the rights of marginalized people in the context of AI.

Politico covered the launch of the Index at GDDF.

There’s so much more to read and learn from this interdisciplinary report, which took a global team of researchers three years to develop. I encourage you to take some time and read it for yourself. As Dr. Adams said in the panel: “We know the risks are real, and that they threaten the very fabric of society, from misinformation weakening democracy to exacerbating inequalities of vulnerable groups.” 
Some of those risks were made painfully clear in another GDDF workshop, “AI through Feminist Lenses Workshop: Reimagining Tech with Popular Education,” where a team of researchers from the Data-Pop Alliance shared excerpts from books and movies about algorithmic racism and misogyny, including some deeply upsetting examples, such as facial recognition systems failing on darker skin and pornographic deepfakes. But at the end of the session, the speakers gave attendees the opportunity to express their feelings and reactions to these distressing issues using an image generation algorithm.

I love the meta-aspect of this: using AI to help us express our very human reactions to the dangers presented by AI. Plus, it was a neat trick for making an engaging hybrid workshop, which I’m always a fan of. We might even have to add that one to our hybrid playbook.

One of the AI images created in the Feminist Lenses on AI workshop, in response to the prompt “woman looking at herself in the mirror and seeing the reflection of a robot”

This is only the beginning of the rise of AI, and it’s pretty clear that we don’t know where things are going to go with it. But I for one am glad that there are new tools to track its use and oversight, and smart, socially conscious professionals working with them. 

As Gloria Guerrero, the Executive Director of the Latin American Alliance for Open Data, said in the closing GDDF workshop on Responsible AI: “the race to govern AI isn’t about getting there first, but making it work for everyone.”

If you want to watch these sessions from GDDF, it’s not too late. The power of a TechChange event isn’t just the people in the room; it’s bringing the incredible digital development community together in an inclusive environment around the world.

By Norman Shamas and Samita Thapa

In a previous post, we wrote about why global development practitioners need to be data skeptics. One of the many reasons we need to be skeptical about the data we collect is the bias embedded in it. This bias is especially significant when it comes to gender data: women and groups that don’t identify with binary genders are largely missing from, or misrepresented in, global development data.

Data is a crucial component of any program or intervention. It justifies the need for a specific program, shows its effectiveness, and allows us to improve the program through evaluation. But this important information tells us little if more than half of the population is missing or misrepresented, so we need to start looking at data with a gender lens.
Data in the program cycle

Data on women raises awareness of women-related issues

With 62 million girls not attending school worldwide, the U.S. government was able to justify its “Let Girls Learn” initiative. The initiative was announced in February and is aimed at making education a reality for girls around the world. USAID is one of the agencies involved in the government-wide initiative and has presented its approach with data to support it.

But there is still a problem getting good data on women. GSMA’s 2015 Bridging the Gender Gap Report highlights two systemic barriers to mobile ownership and usage for women:

  1. lack of disaggregated data and
  2. lack of focus on women as a market.

However, we need better gender data for more than just the economy. Oxfam conducted a household survey on the gendered effects of the December 26, 2004 tsunami that hit several Asian countries, and found that women were more severely affected than men. Despite the clear need for better gender data in the field, it is not always collected. Lack of data on women leads to lack of awareness of issues related to women and, consequently, a lack of programs designed to tackle those issues.
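
To make “disaggregated data” concrete: disaggregation simply means breaking results down by a variable like gender instead of reporting one overall number. Here is a minimal sketch in Python with pandas; the file and column names are hypothetical, not drawn from Oxfam’s actual survey.

```python
# Minimal sketch of gender-disaggregated analysis with pandas.
# "household_survey.csv", "gender", and "affected" are hypothetical names.
import pandas as pd

responses = pd.read_csv("household_survey.csv")

# A single aggregate figure can hide large differences between groups.
overall_rate = responses["affected"].mean()

# Disaggregating by gender makes those differences visible.
by_gender = responses.groupby("gender")["affected"].mean()

print(f"Overall affected rate: {overall_rate:.1%}")
print(by_gender)
```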

Survey design can promote non-binary gender inclusion

The problem of gender and data bias gets even more complex when we talk about non-binary genders. Twitter, for example, determines its users’ gender based on data it analyzes from tweets. There are only two gender options, male and female, and users cannot opt out of automatic gender assignment or choose their gender manually. By the simple fact that Twitter uses a male/female gender binary, individuals who do not identify with a binary gender (e.g., transgender individuals) or who have mixed anatomical sex characteristics (i.e., intersex individuals) are ignored in the data.

It is important to ask questions about gender on a survey in order to improve interventions. Instead of restricting gender to a binary, surveys can offer an option to opt out or to self-describe. When appropriate, additional questions can be used to determine whether practice and self-identification fit into pre-defined categories.
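
As a sketch of what this could look like in practice, here is one hypothetical way to define such a question. The field names, option labels, and “show_if” convention are illustrative assumptions, not a standard.

```python
# Hypothetical survey question definition that avoids forcing a binary choice.
GENDER_QUESTION = {
    "name": "gender",
    "label": "What is your gender?",
    "options": [
        "Woman",
        "Man",
        "Prefer to self-describe",  # triggers the follow-up question below
        "Prefer not to say",        # explicit opt-out
    ],
}

# Follow-up shown only to respondents who choose to self-describe,
# so self-identification is captured in the respondent's own words.
FOLLOW_UP_QUESTION = {
    "name": "gender_self_description",
    "label": "How do you describe your gender?",
    "show_if": {"gender": "Prefer to self-describe"},
}
```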

Data must represent local gender categories

It is also important to localize the survey where gender categories and practices vary. India is a good case study of the difficulties of language in demographic data collection. India initially provided three gender options on its passport forms: male, female, and eunuch. However, these three categories marginalized other transgender populations, so in 2014 Indian courts changed the category of ‘eunuch’ to ‘other’ on election ballots. This simple change in language not only promotes the human rights of India’s non-binary gender individuals, but also provides better data on its non-binary gender communities.

The hijra are a transgender community that has existed in South Asia for over 4,000 years. Along with a few ‘Western’ countries, at least four South Asian countries — Nepal, India, Pakistan, and Bangladesh — recognize a third gender in some legal capacity.

Global development is moving forward with programming for non-binary gender communities. The Swedish International Development Cooperation Agency put out an action plan for working with lesbian, gay, bisexual and transgender (LGBT) issues from 2007-2009. Last year USAID announced its LGBT Vision for Action, a general roadmap of how the donor would support non-binary gender communities. As programming for non-binary gender communities continues and increases, we need to think closely about the language we use and how we collect data on gender to create effective, data-driven interventions.

With development becoming more data-driven, the data we collect and the biases we build into it are having a larger impact. Technology can make these biases more entrenched through features like data validation. Even though data validation is important for survey collection (it limits responses to particular choices or data types, such as phone numbers), it also restricts options based on the choices of the survey creator and can marginalize groups whose identities are not included or allowed as a valid option. Going forward, we need to be careful that we are not unintentionally marginalizing other groups or genders with the data we collect.
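
One way to keep validation useful without letting it exclude people is to be strict about data formats but permissive about self-identification. Here is a minimal sketch of that asymmetry; the function names and rules are illustrative assumptions, not an existing library’s API.

```python
# Minimal sketch: strict validation for structured fields,
# permissive validation for self-identification.
import re

def validate_phone(value: str) -> bool:
    # Tight rules make sense for data types like phone numbers.
    return re.fullmatch(r"\+?[0-9]{7,15}", value) is not None

def validate_gender(value: str) -> bool:
    # Accept any non-empty answer rather than rejecting identities
    # the survey creator did not anticipate.
    return value.strip() != ""

assert validate_phone("+14155550123")
assert not validate_phone("not a number")
assert validate_gender("hijra")  # valid even if not a pre-listed option
```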

Interested in engaging in similar conversations around data and tech in M&E? Join us and more than 90 other international development practitioners in our upcoming course on Technology for Monitoring and Evaluation.

On this year’s International Women’s Day, we recognize the important work our alumni and partners are doing to empower women and girls across the world.

At TechChange, there are few areas where we see this empowerment happening more than in the field of family planning and reproductive health. As we’ve seen in our mHealth online course and community, many organizations are doing fantastic work in this area, including the UN Foundation and MAMA, D-Tree International, FHI 360, Jhpiego, John Snow Inc., and more.

We’re hoping to further explore the issue of gender in global development programs and technology in our upcoming online course on Gender.

Use the coupon code IWD2015 by this Friday, March 13, to get $50 off any TechChange open online course, such as mHealth, Mobiles for International Development, Gender, and more.