A Year’s Reflection on Responsible Data


April 18, 2016
A year ago, I took a sabbatical from Amnesty International to manage the Responsible Data Program at The Engine Room. That year is now coming to an end, and April 22 will be my last day with the Responsible Data Program. This will have no effect on the program’s projects and partnerships, all of which will go ahead as usual. In future, please send any enquiries to Zara (zara@theengineroom.org).

When I started at The Engine Room, I thought I knew what ‘responsible data’ meant. I was wrong. Despite some communally developed definitions, I don’t think any of us really knows what it is, what it means to other people, where its boundaries lie, how it should affect our work, or the best ways to talk about it.

My take-home message is this: responsible data issues are not new. We – the social change community – have been struggling with the ethics of research, verification, representation, consent, non-maleficence, and gender and identity mainstreaming for a very long time. In reality, these issues have never been ‘solved’. Quite the reverse – there is still a very real struggle to ensure social change work does not revert to the pornography of poverty.

As a concept, privacy can help to mitigate unlawful or unethical data collection and use. But in reality, the individuals behind the data are often marginalised, excluded and underrepresented, and respect for and protection of their privacy is often the first thing to be sidelined. Privacy might not be acknowledged or valued at a community level; the individual might not know their rights; or national privacy protection frameworks might be poorly developed. Maybe privacy simply flies under the radar, or is minimised in the face of greater humanitarian priorities. The end result is the same: the individual’s right to privacy is not respected.

I am concerned that the rapid adoption of new and emerging technologies and data types will only fan the flames. Technologies and data are a classic double-edged sword: they offer us the ability to document, learn and innovate, while also vastly expanding our ability to violate privacy rights, create new risks, and ultimately cause harm.

The tension I grapple with is between innovation and experimentation. Everyone wants to ‘innovate’. Novel solutions are important because existing social change systems are often inefficient and poorly aligned. And a key part of experimenting is accepting failure. But in our work, we are experimenting with people’s lives. Failure in our experiments can and does have real-world consequences.

Regardless of sector – be it human rights, humanitarian or development – we must balance innovation against experimentation. We must recognise that new technologies and data types can improve social change work, while also protecting the sanctity of individuals’ data.

I’m impressed by the diversity of people who are ready to admit what they don’t know, and who are willing to start a conversation about responsible data. We should engage everyone involved in social change work – not just technologists and data-wranglers. We need people from diverse backgrounds to bring their knowledge and perspectives to the discussion.

Let’s take advantage of the fact that we live in a world replete with ‘data’, where every organisation uses technology to collect, transfer, store, manipulate, publish and communicate. Responsible data offers a platform for applying new tools and techniques while improving the ethics of social change work. Now it is time to hold our institutions to account – and time to hold ourselves to account.

About the contributor

Danna is a feminist human rights lawyer with over 10 years’ experience working with local and international organisations around the world on issues ranging from migrant worker rights to data ethics, including with organisations such as the Red Cross and the Organisation for Security and Co-operation in Europe. She specialises in the support and protection of human rights defenders, and has developed participatory tech approaches such as the Panic Button application for activists at risk. She was previously Responsible Data Program Manager at The Engine Room and currently works as Research and Policy Adviser on Technology & Human Rights at Amnesty International’s London office. She is particularly interested in rethinking what informed consent looks like in a digital age, and in supporting the responsible use of data in human rights.
