The Role of Ethics in Using ICTs for Peacebuilding

TechChange COO Chris Neu is fond of pointing out that in social change, technology is only 10% of the equation; the rest is about the humans using that technology. That 10% is a powerful percentage, though: when technology is used effectively, it can amplify voices of peace and empower local communities looking for alternatives to violence. Still, it’s easy to forget that technology isn’t the most important part of any information and communication technology (ICT) for peacebuilding effort; it’s the people, both the beneficiaries and the peacebuilders (who can be one and the same!). Because using ICTs in any peacebuilding context means asking people to share data and participate in interventions, we must understand the risks participants face and how to manage them. The problem is that using ICTs in any political environment exposes us to risks at multiple levels, so what are a few things to focus on while planning a project?

An Institutional Review Process as a Starting Point
There are a few simple starting points. For example, if you are an academic or affiliated with an academic institution, that institution will require you to complete an institutional review board (IRB) process before you can conduct any research involving human subjects, and that includes a crowdsourcing project using SMS text messaging or social media. Many institutions have some kind of process like this, so check before you deploy your project. While tedious, defending your risk management procedures can help you identify a lot of problems before you even start. If you don’t have access to an institutional review board, grab a copy of the ICRC’s “Professional Standards for Protection Work” and check your project design and risk management against the recommendations in Chapter 6.

Along with doing this kind of standard review, what are some other factors that are unique to ICTs that you should be aware of?

1) National Infrastructure and Regulatory Policy
The first is that ICTs are part of national infrastructure and are regulated at the national level. When you use any kind of transmission technology in a country, the rules for how that data is transmitted, stored, and shared are set nationally as part of regulatory policy. If the government in the country you’re working in is repressive, chances are it has very broad powers to access electronic information, since it wrote the regulations stipulating data privacy. In general, locals will be aware of the level of surveillance in their lives, so do your legal homework on the regulations that people have actively or passively adapted to. These laws are usually titled something like “Telecommunications Act” or “Electronic Transmission Act”, and are often publicly available on the web.

2) Legal Compliance
If you’re going to go forward with an ICT-supported peacebuilding program after doing your legal homework, ethical practice starts at home. Unless you are reasonably adept at reading and interpreting legislation, have someone with a legislative or legal background interpret the privacy laws in the country where you’re about to work. Does your team include someone with expertise in the technical and policy aspects of using ICTs in a conflict-affected or high-risk environment? Has the entire project team had basic training in how ICTs work? For example, does everyone understand the basics of how a mobile phone works, how to protect sensitive data, and the implications of asking people to share data on electronic platforms? Before showing up in a conflict zone and asking people to participate in your project, make sure your team understands the risks it is asking people to take.

3) Informed consent
What do the local project participants know about ICTs? In terms of safety, people are generally aware of what will get them in trouble. Always assume that your perception of risk in a country is under-informed, even if you’ve read the laws and done some regulatory analysis. With this in mind, if you’re going to ask people to take risks sharing electronic information (always assume that sharing electronic information is risky), do you have a process for assessing your participants’ knowledge of ICTs and then addressing any gaps through training? Do you and your team understand the technology and regulations well enough to make the risks clear to partners, and, if they still choose to participate, to provide risk management training? Informed consent means making sure you and your partners are equally clear on the risks involved in whatever project you’re doing.

Peacebuilding carries some level of inherent risk – after all, we’re dealing with conflict and violence. ICTs carry a unique set of risks, compounded by both the nature of digital information and the capacity for governments and conflict entrepreneurs to exploit this information. An effective ICT for peacebuilding program addresses these risks from both the legal and technical sides, so that implementers and local partners are equally informed and able to use the tools in the safest, most effective way.

Want to learn more about the ethical issues facing peacebuilders using technology? Enroll now in our Technology for Conflict Management and Peacebuilding online course, which runs October 6 – 31, 2014.


Image source: Tech Republic


  • @schultjen

    Thank you for an article on this important issue! I am also working to spark much-needed discussion, raise awareness, & develop capacities on responsible data practices, privacy as a human right, & digital security in humanitarian work. New technologies & behaviors bring new risks. That said, I’d like to share two comments on the article:

    1) Humanitarians need structured risk assessment methods to identify & mitigate not only digital privacy risks, but also assessment of (related & mutually reinforcing) operational/physical, & psychosocial risks. See risk assessment tool examples in the Tactical Technology Collective/Frontline Defender “Digital Security in Context” guides (for LGBTI communities in Arabic region & sub-Saharan Africa, and for environmental defenders in Sub-Saharan Africa) here:

    2) It’s important to move beyond talking about “repressive national governments” to recognize that the NSA has “broad powers to access electronic data” in all but four countries on Earth. With post-9/11 mass dragnet surveillance; attacks on freedom of expression, activists, whistleblowers, journalists & sources; extrajudicial renditions & killings; & vast “information sharing” between the Five Eyes & national intelligence agencies, basically all but four countries globally are colluding in & exploiting personal data captured through repressive surveillance. With the US government stance that the “world is a battlefield,” expansion of the “War on Terror,” & use of “military humanitarianism,” populations affected by humanitarian crises are being exploited in the name of over-reaching “counter-terrorism” intelligence, “increasing collaboration” between agencies, “efficiency,” commercial & economic exploitation, & political targeting for rights abuses.

    There is no 100% security. Still, humanitarians need to use structured methods to assess risks & plan mitigating steps to address high-priority, probable digital/physical/psychosocial security risks in a given context, to keep themselves & the people they serve safer.