This post is by Zara Rahman and Danna Ingleton from the engine room.
Through the various Responsible Data Forum events over the past year, we’ve heard many anecdotes of responsible data challenges faced by people and organizations. These include potentially harmful data management practices, situations where people had a gut feeling that harm was possible, and workarounds that people have created to avoid those situations.
But we feel that trading in these “war stories” isn’t the most useful way for us, as a community, to learn from these experiences. Instead, we want to build a structured, well-researched knowledge base on the unforeseen challenges and (sometimes) negative consequences of using technology and data for social change. We hope this can offer opportunities for reflection and learning, and help develop innovative strategies for engaging with technology and data in new and responsible ways.
So we’re starting a new project: Responsible Data Reflection Stories. It will collect real-life examples of the risks we all face when using data in our advocacy work, along with strategies to overcome these challenges.
What are we trying to do?
This project has two main aims:
- To better understand and document the challenges, risks and (in some cases) harms related to technology and data-driven projects.
- To explore and record strategies and tools that have been used to meet those challenges and mitigate harm.
The series of Reflection Stories will document the conditions under which responsible data challenges arise; the real consequences for the people, organizations and projects involved; and the organizational or communal responses to these situations.
We hope that collecting the stories will help transcend traditional sectoral and organizational silos and create a culture of openness and shared learning – which will ultimately benefit the (data) subjects and beneficiaries of our work. The stories will provide an evidence base from which the community can develop research agendas, project plans, risk-mitigation plans, minimum standards, and strategic planning for funders.
We will, of course, record these stories with a keen awareness of the potential negative consequences of publishing them. We will de-identify published information (as far as it is possible to do so effectively) to ensure that the stories are not perceived as ‘naming and shaming’ exercises, and that they contribute to our collective intelligence.
Our experience suggests that raising awareness of responsible data issues, and of the corresponding good practice, requires strong, clear and concrete examples of what can and has gone wrong when working with technology and data. We want to begin addressing this need with the Reflection Stories project.