As we go digital, we collect more and more data, partly because we can, but hopefully also with a purpose.  Most often that purpose has two parts:

  1. Reporting to donors about what we did: what was distributed or what service was provided, to whom, and what impact it had.

  2. Fulfilling various internal processes, such as audit and legal requirements.

If we are lucky, we work for organisations who also use the data to:

  1. Figure out how to make our internal processes more efficient, so more of the funds can be used to provide more stuff or services to those in need of assistance.

  2. Improve the design and implementation of our services or methodology in real time, so we can have greater impact.

And while using data in these ways can have all kinds of positive implications, it also leads us to unexpected or unintended places when data becomes an end in itself.

Going digital has made data cheap and visible.  For years we collected data on paper about who is receiving aid; now that it is digital, we seem to require global visibility of what previously only needed to be seen locally, just because we can.  And if local data is missing, we become suspicious globally.  So our local staff start equating data with job security, and inadvertently begin to demand data as a condition for aid: if you want to receive aid, you must provide verifiable personal data, including your biometrics; if you don’t provide it, you don’t receive aid.  The situation in Yemen exemplifies this.

And as we use data to help us design better projects and better services, we want to know more about how those we seek to serve are using the aid we provide.  The intent is to improve the design of our services, so we track people: how they use the cash we give them, what they buy with it, where they are, how they move, and so on.  While our intent was to improve our services and projects, we have inadvertently built a surveillance system.  In fact, some of our donors now demand to know the names and details of everyone who has received aid through the projects they fund.

The pivot point on this data seesaw is trust and our ethics.  Collecting data is fine, even necessary, if we have the proper checks and balances in place to maintain trust with the recipients of aid.  Is data the end in itself, or is the delivery of aid?  Our principles say the delivery of aid is, but our actions are beginning to indicate otherwise.

As we try to encourage more principled and purposeful data use, here are three A’s to keep in mind:

– Create Awareness amongst the people the data is about: why we need it, how it is used, with whom it is shared, the choices they have, the opportunities digital presents for them, and how to live wisely in a digital world.

– Enable Access for people to all the data we have about them, and enable them to use their data however they wish without our involvement.  Allow them to access and control it, and where possible, give the data we collect about beneficiaries to them.

– Ensure Alternatives are available, so people can refuse to provide data but still receive aid, and so our frontline staff do not fear being accused of fraud for providing aid without collecting data first.

Photo by Markus Winkler
