Exploring fundamental flaws behind digital consent – an interview with Georgia Bullen


October 11, 2022

Almost a year ago, Simply Secure and The New Design Congress published The Limits to Digital Consent. The report focused on understanding whether data collection for underrepresented communities can be done ethically and whether a digital consent mechanism is the best tool for doing so. We spoke to Georgia Bullen, Executive Director of Simply Secure, about the report’s findings as well as the insights from her own journey working with tech policy, design and data.

Design practices, power imbalances and the risks associated with data 

Georgia’s work is rooted in understanding how people think about and respond to technology. Her experience in UX design, open data practices and transparency, together with her engagement in the global fight for an open internet, has informed her approach to working with tech and data: “While I still am optimistic about what could be done [with data], I am also a lot more aware that we need to think about constraints, risks and misuse”.

At Simply Secure, Georgia helps practitioners think about these issues and facilitates technology design that centres and protects vulnerable populations. The organisation offers design support to practitioners and conducts qualitative research shedding light on the relationship between people and technology.

Thinking about “informed consent” and people-centric data collection

Together with The New Design Congress, Simply Secure published “The Limits to Digital Consent”. The report explores fundamental flaws behind digital consent, asking platform designers and policy-makers to take them into account. The research outlines key shortcomings of digital consent and juxtaposes them with stories from activists working with underrepresented communities in the UK and the US.

Here’s a summary of The Limits to Digital Consent’s six main findings:

  • #1 The consent model for tech is outdated. The overarching conclusion of the report is that the lack of broader protections allows digital consent to act as a mechanism that widens societal power imbalances. Consent models are usually designed as one-size-fits-all solutions that don’t account for people’s self-assessment of their own safety, which is complex and contextually regionalised.
  • #2 Local-first storage isn’t inherently safer for people or communities. The local-first approach to data storage has recently gained popularity as an ethical way to solve some of the issues relating to the collection and storage of personal data. The report challenges this proposition, as it places all the risks related to data extraction on the individual. Instead of giving users agency, this approach could result in harm, especially for vulnerable groups of people subjected to harassment or prejudice.
  • #3 Data creation, including the potential for data creation, is silencing. The accelerating rate and scope of data collection, combined with increased public awareness of the dangers posed by such datasets, produce a chilling effect on those who wish to speak up. This silencing effect is more significant for marginalised communities, who are more likely to avoid certain tools or platforms because of the associated risks.
  • #4 Everyone, not just members of underrepresented communities, is at risk. With cybercrime on the rise, the unexplored or downplayed risks of data creation and collection generate vulnerability for almost everyone: public institutions, companies and individuals. The unknown unknowns, relating for example to the development of future technologies, make the digital consent contract impossible to honour and trust.
  • #5 Ethical platform designers must consider themselves as the potential bad actor. Within the current socio-technological framework, unintended harms are a real threat. For any designer, it is virtually impossible to ensure that the tools they build can remain ethical over time. Agile, humble, participatory design might be one answer to this issue.
  • #6 Participants are overwhelmed by both the potential for harm and the indifference of decision-makers. The research brought out the caution, scepticism and fear underlying users’ approach to technology. These are important feelings to engage with as we rethink our current digital consent mechanisms.

For more details on the shortcomings of digital consent, you can read the full report (it is pretty compact!). You can learn more about Simply Secure’s work on their website, and if you would like to follow what Georgia is currently up to, you can find her on Twitter.


About the contributor

Alicja is a consultant, researcher and participation strategist interested in technology, culture and social change. Her experience includes setting up digital communications and participation strategies for companies, social organisations and cultural institutions such as the LEGO Foundation, Liminal, the Danish Statens Museum for Kunst, the New Economy Organisers Network, TechSoup, and the artist-run gallery Vi lever på polsk.



