7 Ethical Considerations in Humanitarian Data



As providers of humanitarian aid and development activities, NGOs often have privileged access to populations. This special access stems from a number of factors, not least the perception that we are there to help, not harm, and the common association of humanitarians with neutrality and impartiality. This applies both to how we treat people in person and to how we treat their data and information.

Fundamentally, there is a question of what rights the beneficiary has in relation to us when it comes to digital registration, and what risks, both short term and long term, we are exposing them to.  One way to illustrate this is to ask how we will respond if a beneficiary does not want to be digitally registered.  Will they no longer qualify for our programmes?  Or do we have alternative options that enable the person (or household) to be part of our programme without being digitally registered?

Back in 1986, Richard Mason published a paper titled Four Ethical Issues of the Information Age. It is still widely quoted and referred to today. In an attempt to articulate some of the key ethical considerations we face as we handle data and information about vulnerable people, I’ve built on that paper’s foundations to create the seven below.

1. Privacy and Surveillance

What information about one’s self or one’s associations must a person reveal to others, under what conditions and with what safeguards? What things can people keep to themselves and not be forced to reveal to others? Could we switch to requiring the organisation to prove beyond doubt that it has obtained consent, and to demonstrate that the consent was genuinely expressed?[1]

  • Going digital results in increased surveillance[2] and can lead to surveillance capitalism[3][4].  Are we using the data we collect to improve the aid we offer, or are we, or third parties, using it for profit?
  • Do we understand the algorithms being used by the technology we use?

2. Accessibility and Accuracy

Who (whether a person or an organisation) has the right or privilege to access the information, under what conditions, and with what safeguards?  With whom can it be shared, and what rights does the person the data is about have?  What risks are we exposing others and ourselves to?  Who is responsible for the authenticity, fidelity, and accuracy of information? Similarly, who is to be held accountable for errors in information, and how is the injured party to be made whole?

  • Oppression.  Data is a reflection of life, and life is not equal: it is full of bias, racism, and oppression, and data reflects this.  So while data may appear objective, it carries the bias and oppression of the life it records, which in turn shapes how the data is used.
    • Who does it leave out? Is the project designed for the majority or a minority?
    • Has the project considered the bias in project team, in project data (sources), and in how you collect the data?
    • Is the use of data and technology non-discriminatory?
    • Is there any prejudice or favouritism in the data, analysis, or model? Are there gaps or omissions in the data? (A minimal representation-gap check is sketched after this list.)
    • Who has written the algorithms and code, and what are their assumptions about the populations with whom we work?
    • Are the algorithms understood? (e.g. can someone not directly involved in the process explain what is happening?)
    • Is the model (or underlying data) codifying the current state of the world and thereby making it harder to change? (e.g. are we building models that perpetuate or encode past mistakes?)
    • What assumptions are being made and what is the basis for those assumptions?
  • Third Party access. By sharing data with third parties, we give them the same power in the lives of beneficiaries that we have.  Do we know how third parties will use the data we share with them?  Do the people the data is about know? Have they consented to the sharing? Is there a clear data sharing model? Does data travel one way, two ways, or in multiple directions? What is the ethos of the partners, and does it ‘fit’ ours? Are you planning to publish any of the data? Under what conditions?
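
To make the question of gaps and omissions concrete, here is a minimal sketch of a representation-gap check. Everything in it is an assumption made for illustration: the field name, the baseline shares, and the tolerance. A real check would compare against census or registration baselines appropriate to the context.

```python
# A minimal, hypothetical sketch: flag groups whose share in the collected
# records diverges from a reference baseline (e.g. census figures).
from collections import Counter

def representation_gaps(records, field, baseline, tolerance=0.05):
    """Return groups whose observed share in `records` differs from the
    expected share in `baseline` by more than `tolerance`."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, expected in baseline.items():
        observed = counts.get(group, 0) / total if total else 0.0
        if abs(observed - expected) > tolerance:
            gaps[group] = {"observed": round(observed, 2), "expected": expected}
    return gaps

# Hypothetical usage: the census says 52% of the district is female,
# but only 31% of the registered beneficiaries are.
records = [{"sex": "F"}] * 31 + [{"sex": "M"}] * 69
print(representation_gaps(records, "sex", {"F": 0.52, "M": 0.48}))
```

A check like this does not remove bias; it only makes one kind of gap visible early enough to ask why it is there.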

3. Ownership and Portability

Are data and information something to own, something separate from our personhood?  If so, who owns information? What are just and fair prices for its exchange? Who owns the channels, especially the airways, through which information is transmitted? How should access to this scarce resource be allocated? Can the person the data is about use the data we collect about them in other areas of their lives, without our involvement?  Are the rights to the data, and to the insights derived from it, clearly defined? (e.g. is it clear how decisions are made about how and by whom the data can be used, and how problems in or related to the data are rectified?)  Is it clear who ‘owns’ the data and the insights, and who has access to them, why, and for what purpose?
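
One way to picture portability in practice is sketched below: exporting what we hold about a person into an open, machine-readable format they could reuse without our involvement. The record layout, organisation identifier, and file name are all assumptions made for illustration, not a prescribed format.

```python
# A minimal, hypothetical sketch of data portability: export what we hold
# about one person as self-describing JSON they can take with them.
import json
from datetime import date

def export_beneficiary_record(record: dict, path: str) -> None:
    """Write a machine-readable, provenance-stamped copy of `record`."""
    portable = {
        "exported_on": date.today().isoformat(),
        "exported_by": "example-ngo",  # hypothetical organisation id
        "format": "json",              # open, non-proprietary format
        "data": record,
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(portable, f, ensure_ascii=False, indent=2)

# Hypothetical usage with placeholder fields.
export_beneficiary_record(
    {"household_id": "HH-0042", "district": "Example District"},
    "hh-0042-export.json",
)
```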

4. Power

Knowledge is power, and data feeds knowledge. And we demand data in exchange for aid.  We must understand and take seriously the power dynamics and differentials between us and the beneficiaries: we are not two ‘equals’.  By collecting and storing beneficiary data, we place our organisation in a position of power over them, and by using beneficiary data in any way, even for appropriate humanitarian purposes, we directly exercise that power in the lives of those we serve.  Therefore, the onus is on us to ensure awareness and understanding of why and how we will use their data.  Why do we need it? With whom will we share it? Why? What viable alternative options do they have?

  • Can a project participant or stakeholder complain about the process?
  • Has someone in your team been appointed to respond to requests or feedback?
  • If the project engages people online, are there clear protocols in place about how to do that? 
  • Is the project ensuring communities know how to live wisely in the digital world?

5. Consent

When collecting personal data about people, there are two prevailing legal bases: consent and legitimate interest.  With consent, the liability rests with the person ‘giving’ consent, while with legitimate interest, the liability remains with the collecting organisation.  However, regardless of the legal basis used, the data collector has a duty of care to ensure the person about whom the data is collected clearly understands why it is being collected and what rights they have.  Therefore, we need to improve our awareness raising and always check for understanding before we collect data. (A sketch of what a consent record might capture follows the list below.)

  • Do the people involved in the project understand what data about them is needed for the project, why, how it will be used, who will have access, with whom it will be shared (and why), and when it will be deleted?
  • Are you transparent not only about the purpose of data collection, but also about the project design, analysis tools, and algorithms you are using?
  • How are you ensuring that more vulnerable individuals or groups understand?
  • Might any part of the data collection be perceived as coercive by the participants or their community, and how will this be addressed?
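
As flagged above, here is a minimal sketch of what a consent record might capture. Every field name is an assumption made for illustration; the point is that a bare yes/no is not enough, and that what was explained, in which language, for what purpose, and until when all need to be recorded.

```python
# A minimal, hypothetical sketch of a consent record: placeholder fields
# mirroring the questions above (purpose, access, sharing, deletion, and
# whether understanding was actually checked).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    subject_id: str                  # pseudonymous id, never a name
    purpose: str                     # why the data is needed
    shared_with: list                # third parties named at collection time
    retention_until: str             # agreed deletion date (ISO 8601)
    legal_basis: str = "consent"     # or "legitimate_interest"
    explained_in: str = ""           # language the explanation was given in
    understanding_checked: bool = False  # did we check for understanding?
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Hypothetical usage.
record = ConsentRecord(
    subject_id="HH-0042",
    purpose="cash assistance targeting",
    shared_with=["payment-provider-x"],
    retention_until="2022-12-31",
    explained_in="Swahili",
    understanding_checked=True,
)
```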

6. Context

Data has context: social, structural, and societal[5].  We forget this at our peril. Understanding the context of the data, and of where and when it was collected, shapes how we analyse it.  In addition, there are contextual questions about what legislation, policies, or other regulations shape how we use data (its collection, processing, and use), and what requirements they introduce.

  • Consider: the rule of law; human rights; data protection; IP and database rights; anti-discrimination laws; data sharing; and policies, regulation, and ethics codes/frameworks specific to sectors (e.g. health, employment, taxation).

7. The Reality of Ill-Intent

We must understand that there are many entities in the world—in some cases very powerful ones—who want the data that we have or can gain access to. These entities might be governments, militaries, religious organisations, companies and corporations, and special interest groups of all kinds. Moreover, we must recognise that many of these would, if given the opportunity, use beneficiary data in ways which do not reflect humanitarian values. Ways in which third parties might use beneficiary data fall on a continuum which ranges from simple non-alignment with humanitarian principles, to things which actively seek to harm those whom we try to help.
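
There is no single technical defence against this, but one small mitigation is worth sketching: encrypting beneficiary data at rest, so that a stolen device or a leaked file does not, on its own, hand beneficiary data to such actors. Below is a minimal sketch, assuming the widely used Python cryptography package; it is an illustration, not a complete security design, and key management (shown here in a single variable) belongs in a secrets manager, never alongside the data.

```python
# A minimal sketch of one mitigation against ill-intentioned access:
# encrypt beneficiary data at rest with the third-party `cryptography`
# package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, store in a secrets manager
fernet = Fernet(key)

plaintext = b'{"household_id": "HH-0042", "district": "Example District"}'
token = fernet.encrypt(plaintext)   # ciphertext, safe to store or sync
restored = fernet.decrypt(token)    # recoverable only with the key
assert restored == plaintext
```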


[1] https://datagovernance.org/files/research/1606218143.pdf

[2] https://en.wikipedia.org/wiki/Surveillance

[3] https://en.wikipedia.org/wiki/Surveillance_capitalism

[4] https://www.thenewhumanitarian.org/opinion/2021/2/22/the-case-against-humanitarian-cash

[5] See ‘Structural context of technology access (SCOTA) framework’ in Figure 5.1 in https://opendocs.ids.ac.uk/opendocs/bitstream/handle/20.500.12413/15928/IDS_Working_Paper_545.pdf?sequence=1&isAllowed=y

