After a week of reflective blog posts, 90 minutes of thoughtful discussion at a DC Tech Salon, and a Twitter chat, it’s clear that responsible data is no longer an optional element of effective and ethical digital practice – it’s critical.
USAID’s Considerations for Using Data Responsibly launched this week. The team behind the Considerations designed them to help USAID staff and partners have better conversations about responsible data. The Considerations are also a welcome endorsement of the importance of responsible data, a topic that has spent years working its way from the fringes of development and humanitarian tech conferences to the plenary stage.
With this new feeling of consensus comes the need for hard conversations: about the limited capacity to operationalise the growing number of guides put out by actors such as USAID and UNOCHA, about unclear legal requirements, and about the lack of digital literacy that would allow staff to work responsible data into existing systems and processes.
Your values should shape your responsible data approach
Siobhan Green’s blog post on What Does Responsible Data Look Like and the Considerations Guide sets out the tensions between:
- Protecting privacy and security
- Leveraging the potential of data for development aims
- Supporting transparency and openness
Resolving these tensions includes conducting a holistic assessment that balances the risks and benefits of collecting data, and planning for transparency where it is possible and safe to do so.
As a global alliance seeking to support the use of digital technologies and data for development aims, the Digital Impact Alliance (DIAL) feels these tensions keenly. We talk about responsible data use – finding ways to realise, where possible, the potential of digital to deliver services and support communities, while doing so ethically and legally. Our duty to help use digital to deliver on the Sustainable Development Goals and to tackle a growing humanitarian caseload means that we will experiment and share what we learn, as we have done in our Malawi Data for Development project and in our work with Flowminder to develop FlowKit, open-source software that analyses anonymised mobile data to improve the effectiveness of disaster and emergency response.
At the latest Tech Salon, several speakers and participants underlined the need to link your responsible data approach to your culture and values. We need to decide who we are as development actors. We need to understand our own values, as organisations and as individuals, and how they should shape the decisions we make. While national regulation sets the stage, privacy and data law in most countries is patchy, confusing and behind the times. These laws do not hold us to the higher standard implied by the codes of ethics and good practice that most of us have signed up to. As stewards of the Principles for Digital Development, DIAL wants to go beyond the requirements of national law and instead take a human-centred approach: understanding the context in which data is created and collected, while assessing the risks of harm arising from its use.
Understanding these risks accurately and holistically means understanding the harms that can arise from data. The ICRC’s excellent Humanitarian Metadata Problem report is a step in the right direction. However, to assess these risks holistically, we also need to involve community stakeholders in program design, implement good feedback mechanisms, and continue to build taxonomies of potential harms from data use.
If we are to make responsible data real, we must address clear gaps and shortfalls
Good intentions and situational awareness alone will not operationalise responsible data. As speakers at the latest Tech Salon made clear, we need systems and processes that support good digital practice and enhance compliance and reporting. That requires resources, including people. In their blog, DIAL’s Syed Reza and MERLTech’s Linda Raftree set out a clear call to action for donors: funders should increase their own capacity and awareness in this area, and invest holistically in grantees’ operational and human resources for this work. As the authors note, ‘[s]afeguarding of all kinds takes resources!’
Ultimately, responsible data is just good data practice – and good development and aid practice
For me, this issue comes back to established practice in development and aid. Understanding the theory of change behind your use of data requires good situational awareness, acknowledging assumptions and risks, and resolving them sufficiently to demonstrate that your proposed approach will help, not harm. One of the polls put out during the Twitter chat showed that – while transparency and accountability were challenges in data work – the most votes went to an unexpected concern: usefulness.
However responsibly we collect data, a significant proportion of respondents found that it wasn’t always useful. This raises larger questions about our motivations for collecting the data in the first place. Cost-benefit analysis of our actions is critical. As the post on informed consent notes, ‘Don’t use “innovation” as an excuse for putting historically marginalised individuals and groups at risk.’
Now that there is a real consensus on responsible data, what comes next? Conversations like this one, and spaces like the Tech Salons and the responsible data mailing list, are critical to allowing frank feedback and open expression of needs and challenges by all stakeholders. We must both put the actionable advice we have been given into practice and continue to challenge and hold each other accountable, so that we maximise the impact of our development efforts.