Exploring fundamental flaws behind digital consent – an interview with Georgia Bullen

Almost a year ago, Simply Secure and The New Design Congress published The Limits to Digital Consent. The report focused on understanding whether data collection for underrepresented communities can be done ethically, and whether a digital consent mechanism is the best tool for doing so. We spoke to Georgia Bullen, Executive Director of Simply Secure, about the report’s findings as well as her insights from her own journey working with tech policy, design and data.

Design practices, power imbalances and the risks associated with data 

Georgia’s work is rooted in understanding how people think about and respond to technology. Her experience in UX design, open data practices and transparency, together with her engagement in the global fight for an open internet, has informed her approach to working with tech and data: “While I still am optimistic about what could be done [with data], I am also a lot more aware that we need to think about constraints, risks and misuse”.

At Simply Secure, Georgia helps practitioners think about these issues and facilitates technology design that centres and protects vulnerable populations. The organisation offers design support to practitioners and conducts qualitative research shedding light on the relationship between people and technology.

Thinking about “informed consent” and people-centric data collection

Together with The New Design Congress, Simply Secure published “The Limits to Digital Consent”. The report explores fundamental flaws behind digital consent, asking platform designers and policy-makers to take them into account. The research outlines key shortcomings of digital consent and juxtaposes them with stories from activists working with underrepresented communities in the UK and the US.

Here’s a summary of The Limits to Digital Consent’s six main findings:

  • #1 The consent model for tech is outdated. The overarching conclusion of the report is that the lack of broader protections allows digital consent to act as a mechanism that widens societal power imbalances. Consent models are usually designed as one-size-fits-all solutions that don’t account for the self-assessment of individual safety, which is complex and regionally contextual.
  • #2 Local-first storage isn’t inherently safer for people or communities. The local-first approach to data storage recently gained popularity as an ethical way to solve some of the issues relating to the collection and storage of personal data. The report challenges this proposition as it places all the risks related to data extraction on the individual. Instead of instilling users with agency, this approach could result in harm, especially in the case of vulnerable groups of people subjected to harassment or prejudice.
  • #3 Data creation, including the potential for data creation, is silencing. The accelerating rate and scope of data collection, combined with increased public awareness of the dangers posed by such datasets, produce a chilling effect on those who wish to speak up. This silencing effect is more significant for marginalised communities, who are more likely to avoid certain tools or platforms because of the associated risks.
  • #4 Everyone, not just members of underrepresented communities, is at risk. With cybercrime on the rise, the unexplored or downplayed risks of data creation and collection generate vulnerability for almost everyone: public institutions, companies and individuals. The unknown unknowns relating to, e.g., the development of future technologies make the digital consent contract impossible to honour and trust.
  • #5 Ethical platform designers must consider themselves the potential bad actor. Within the current socio-technological framework, unintended harms are a real threat. For any designer, it is virtually impossible to ensure that the tools they build can remain ethical over time. Agile, humble, participatory design might be one answer to this issue.
  • #6 Participants are overwhelmed by both the potential for harm and the indifference of decision-makers. The research surfaced the caution, scepticism and fear underlying users’ approach to technology. These are important feelings to engage with as we rethink our current digital consent mechanisms.

For more details on the shortcomings of digital consent, you can read the full report (it is pretty compact!). You are welcome to learn more about Simply Secure’s work from their website and if you would like to follow what Georgia is currently up to, you can find her on Twitter.

Recommended links

Latest Event / View Past Events

/ ON March 29, 2016 / San Francisco

Human Rights documentation

How are new types of data changing the way that we document human rights violations?

How the use of biometrics in the humanitarian sector has the potential to put people at risk – an interview with Belkis Wille

Back in March, Human Rights Watch (HRW) published a report indicating that the Taliban control the systems holding sensitive biometric data that Western donor governments left behind in Afghanistan. We spoke to Belkis Wille, a senior researcher at HRW, about the report’s findings and how the use of biometrics in the humanitarian sector has the potential to put people at risk.

Using data can reinforce existing power asymmetries

Data is often celebrated for its potential to improve the efficiency and accountability of humanitarian agencies. But the use of data (especially biometric data) in the context of emergency response may end up reinforcing existing power asymmetries between those in need and those who distribute help. “I have seen this very transactional dynamic being created where one person is giving another person a tent to sleep in, and in return getting their iris scanned.” Belkis Wille told us how her experiences working in humanitarian response in Yemen, Syria, and Iraq informed her perspective on responsible data: from questioning the complexities of meaningful, informed consent in emergency settings to noticing the frequent lack of accountability for the agencies and institutions working with data in those settings.

Intrusive data collection and inadequate protections mean real risks to the Afghan people

The latest HRW report examined six digital identity and payroll systems built in Afghanistan for, or with assistance from, foreign governments and international institutions. The systems contain Afghans’ personal and biometric data, including iris scans, fingerprints, photographs, occupations, home addresses, and names of relatives. Much of this data ended up in the hands of the Taliban, endangering the Afghan people. HRW’s report states that researchers have not been able to confirm that international donors such as the US government, the European Union, the UN, and the World Bank have taken any steps to ensure that people whose data was collected are informed of what has happened to their data and what the consequences could be. Additionally, researchers note that the data could be used to target perceived opponents (and research suggests that this may already be happening).

Documenting data-related harm is an important step

Belkis believes that documenting cases of data misuse causing tangible harm is a fundamental step toward improving the sector’s data practices: “I don’t think anyone in the humanitarian space necessarily intends to do harm. If we can show how harm is being done, we can potentially very successfully advocate for better practices.” Together with her team at HRW, Belkis is working on cataloguing and publishing human rights violations resulting from the lack of data protection standards and thorough risk assessment in the case of Rohingya refugees, as well as in Kenya, Jordan, Iraq, and most recently, Afghanistan. All of this research and reporting forms part of HRW’s broader advocacy strategy around INGOs and state donors.

Where do we go from here?

The report calls on donors to develop a set of procedures for work in similar contexts, including improved procedures for data collection as well as for the potential destruction of sensitive data. It also urges donors to mitigate risks in countries where similar systems have been built, disclose what exactly has (or has not) been done to protect the data, and follow up with the people affected. HRW also recommends that the United Nations High Commissioner for Refugees (UNHCR) and countries considering Afghan refugee claims take into account the increased vulnerabilities and risks caused by the Taliban’s capture of asylum-seekers’ personal data when processing asylum claims.

If you’d like to learn more about responsible data collection in the context of humanitarian response, explore the links below. If you’d like to connect with Belkis or keep up with her work, you can follow her on Twitter.

Recommended Links:

This interview first appeared in the Mission Responsible newsletter #14. Subscribe to receive regular updates from us and keep an eye out for more interviews like this in the months to come!

The featured image is “Human Shapes” by Dr. Matthias Ripp CC BY 2.0 and was found via Flickr.