RD 101: Responsible Data Principles


January 24, 2018

Adopting responsible data practices can feel daunting. The issues involved range widely, from privacy to ethics to the many other grey areas that call for good judgement when working with data. Over years of thinking about and practising responsible data ourselves, and listening to conversations happening in the community, we’ve identified a few considerations that arise repeatedly in this work. We would love to hear what other considerations you’ve encountered in your own responsible data work. Leave a comment below, or shoot us a note at hello@theengineroom.org!

Responsible Data considerations

Key elements of practising responsible data include:

  • Power dynamics: The least powerful actors in any situation are often the first to experience the unintended consequences of data collected about them. Practices like co-design, or ensuring that people from diverse backgrounds are involved in data collection and analysis, can mitigate this.

    For example, in humanitarian crises, the people from whom data is being collected hold far less power than those asking for it. How might that power asymmetry affect their willingness to hand over their data?

  • Diversity and bias: Asking questions like “Who makes the decisions? What perspectives are missing? How can we include a diversity of thought and approach?” can highlight blind spots and areas where additional voices would be valuable.

    We believe diversity of all kinds strengthens our projects and our approach. We’ve seen projects, products and organisations suffer when their staff or communities are homogeneous – and often, the negative effects of data are first seen and experienced by marginalised communities. We need to include those voices and build in ways to improve as a result.

  • Unknown unknowns: We can’t see into the future, but we can build in checks and balances to alert us if something unexpected is happening.

    Often, “But we didn’t know” is the first thing heard when a data-related project has unintended negative consequences. It’s our responsibility to think about how we can build in early-warning signals for the most important or impactful unintended consequences.

  • Precautionary principle: Just because we can use data in a certain way doesn’t necessarily mean we should. If we can’t sufficiently evaluate the risks and understand the potential harms of handling data, then perhaps we should pause and re-evaluate what we’re doing and why.

    Technology offers us all sorts of possibilities. Not all of them are smart, and not all of them will have good effects on the world. If we’re working in social change, our priority is to respect and protect people’s rights – and that requires us to be thoughtful about our own actions.

  • Thoughtful innovation: For new ideas to have the best possible chance of succeeding – and for everyone to benefit from those new ideas and projects – innovation needs to be approached with care and thought, not just speed.

    Innovation is about finding better solutions and more effective products that genuinely meet needs. To do that, we must first take the time to understand what those needs are – perhaps through research, perhaps in other ways. Then we must think about which possible solutions could meet those needs and, crucially, would have positive long-term impacts (without unintended negative side-effects) on the people we’re trying to support.

  • Holding ourselves to higher standards: In many cases, legal and regulatory frameworks have not yet caught up to the real-world effects of data and technology. How can we push ourselves to have higher standards and to lead by example?

    Working in social change and advocacy means we hold ourselves to a certain set of ideals. Profit isn’t our goal – positive social change is. In many parts of the world, regulatory frameworks have loopholes that allow projects which, on reflection, might be considered exploitative. Different countries also have very different standards of legal protection for privacy – the upcoming General Data Protection Regulation in the European Union, for example, sets a strongly rights-protecting standard.

  • Building better behaviours: There is no one-size-fits-all approach to responsible data. Existing culture, context and behaviours shape both how data is used and what the implications of using it are.

    Responsible data is not a prescriptive practice – sadly, there is no checklist to work through after which a project can be considered ‘responsible’. Much of this is about building better-informed approaches to working with data – which might include regularly revisiting past decisions in the light of new information. Practising responsible data isn’t just a task for those who directly handle data – it’s an operational issue that everyone, from leadership to staff, needs to be thinking about.

About the contributor

Zara is a researcher, writer and linguist interested in the intersection of power, culture and technology. She has worked in over twenty countries on information accessibility and data use among civil society. She was the first employee at OpenOil, looking into open data in the extractive industries, and then worked at Open Knowledge with School of Data on data literacy for journalists and civil society. She now works with communities and organisations to help them understand how new uses of data can responsibly strengthen their work.
