Using the GDPR in civil society, data rights and social justice work

How civil society has used the GDPR to strengthen its work—from supporting gig economy organising, to improving internal data practices

Context

The EU’s General Data Protection Regulation (GDPR) has received its share of skepticism about the challenges it could bring to nonprofits and charities—but we were interested in the other side of the story. How could civil society leverage policies like the GDPR to strengthen their work?

This past spring, our team at The Engine Room shared a call-out for stories on the positive impacts of data protection legislation. We encouraged people to share stories about the ways they had used the GDPR and similar data protection legislation: (1) as a catalyst for better data governance and responsible data practices within nonprofits, (2) to advance their fight for privacy rights, or (3) to strengthen social justice work outside of privacy-focused missions.

Over the last six months, we connected with digital rights groups, researchers, lawyers and advocates to explore this question. They spoke to us about the creative ways they were making use of the GDPR — from filing complaints against major tech players, to helping individuals take control over their personal data. We learned how civil society was using the GDPR to hold big tech accountable, to encourage developers to improve data sharing practices, and to organize workers in the gig economy. Below, we share some of our learnings, reflections and open questions moving forward.

Using the GDPR to advance privacy rights

We learned that digital rights organisations were quick to use the GDPR as a legal avenue for holding big tech corporations accountable for their invasive data practices. Many groups in the EU focused their efforts on challenging the ad tech industry—specifically, the real-time bidding system, which broadcasts detailed information about website visitors to potential advertisers in real time. Complainants argue that these broadcasts amount to an enormous data breach in violation of the GDPR, and over the last year, complaints against major ad tech players such as Google and the Interactive Advertising Bureau have been filed in over a dozen jurisdictions. These parallel complaints, filed by representatives of Panoptykon Foundation, Brave, Open Rights Group, Eticas Foundation and more, make up part of a wider campaign called Fix AdTech, to which data protection authorities are slowly beginning to respond.

The traction gained from the past year’s ad tech complaints suggests that strategic litigation may continue to be a promising avenue for leveraging the GDPR. Indeed, privacy NGOs such as noyb list legal enforcement of the GDPR as one of their key approaches to strengthening privacy rights in the EU. However, the last year has also shown how data protection laws can be a powerful tool even in the absence of strategic litigation. For instance, UK-based charity Privacy International was able to convince Android app developers to improve their data-sharing practices by publishing a legal analysis outlining how their applications fell below an acceptable standard of user consent and privacy under the GDPR and the ePrivacy Directive. Their work suggests that using the GDPR to put public pressure on companies can also be an effective first step towards meaningful change in the tech industry.

The GDPR hasn’t just presented a new legal avenue for digital rights groups; it’s also raised an important conversation around how individuals can take greater control over their privacy and personal data. The GDPR harmonised and strengthened the rights that people in the EU have over their data, lowering the barriers to accessing, correcting, moving or deleting personal data held by companies like Facebook or Google, and introducing stronger penalties for companies that fail to comply. However, exercising these rights in practice is still a confusing and technically challenging process for many individuals to navigate. Projects such as My Data Done Right and Data Rights Finder are helping bridge this gap by providing easy-to-use web tools that help individuals generate and file data requests to companies, and that explain corporate privacy policies in plain terms.

Connecting data protection to broader social justice work

While we expected that the GDPR would be used to advance the work of privacy-oriented nonprofits, we were also keen to hear about the ways that data protection could be connected with social justice work outside of traditional privacy advocacy. Shifts in our digital landscape are changing the ways people and communities organise, and we had the opportunity to learn about interesting ways that data protection legislation could provide new avenues for collective action.

For example, we spoke to Arne Semsrott, one of the project leads of OpenSCHUFA—a public data donation project with the aim of understanding how SCHUFA, Germany’s biggest private credit bureau, calculates individuals’ credit scores. Over 4000 members of the public submitted subject access requests to SCHUFA and then donated their data to the OpenSCHUFA project. While the project had its limitations, it acted as an opportunity to build capacity amongst the public about the data access request process, and in our opinion, provided an exciting window into the possibilities of leveraging individual data rights en masse as a form of collective action.


We also saw how subject access requests could be used in the context of gig economy workers’ rights organising. In the UK, a group of four Uber drivers filed a lawsuit against Uber for violating the GDPR’s right of access, after repeated attempts to obtain data on the total time they had spent on the Uber platform as well as their GPS records. This data would have enabled them to calculate their hourly wage and make a case for what they are owed in order to meet living wage and holiday pay requirements. One of the drivers leading the lawsuit, James Farrar, has now launched a new nonprofit organisation called Worker Info Exchange, which will help gig economy drivers, couriers and other app-based workers file subject access requests for their personal data stored by these platforms. Worker Info Exchange aims to “tilt the balance away from big platforms” towards workers, by allowing individual workers to calculate what they are owed in back-pay and holiday pay, as well as by pooling aggregate data so workers can collectively demand fairer wages and working conditions.

Finally, we saw how data protection laws could provide a possible avenue for shutting down stalkerware technologies used in domestic and intimate partner abuse. We spoke to Cynthia Khoo, a lawyer and research fellow at Citizen Lab, about their recent policy and legal analysis of stalkerware in Canada, and how the GDPR helped inform this work. The report’s authors present the GDPR as a model for Canadian privacy law to work towards, particularly in reference to its enforcement powers. However, Khoo also stressed that reducing an abusive technology like stalkerware to an issue of data protection risks sidestepping a necessary conversation around gender and intimate partner abuse in tech.

Improving nonprofit data practices

In the lead up to the GDPR, many nonprofits expressed concerns around how they would be impacted by the regulation. Compliance seemed daunting, technical and resource-intensive—something that smaller civil society groups might not have the capacity to take on. Larger organizations faced a different set of challenges arising from operating across jurisdictions, working on distributed teams and holding a large volume of data. Curious to learn about strategies being used by civil society organizations to adapt to the GDPR, we spoke to Oxfam GB, a global humanitarian organization. Oxfam GB was using the GDPR as an opportunity to incorporate more intentional data practices and data governance into their work. James Eaton-Lee, Head of Information Security at Oxfam GB, shared the ways awareness-raising, technical capacity-building, and nurturing internal data protection advocates within departments could together lead to a meaningful organizational culture shift. Although Oxfam GB is a large, well-resourced organization, these strategies could provide a template for smaller groups to adapt on a more lightweight scale.

Limitations and future directions

Geographic and cultural context

Given that we framed our call-out for stories around the GDPR, our desk research and interviews primarily involved civil society actors in the EU. However, over the course of writing our reflection stories, we had the opportunity to connect with actors in other regions. For example, we were grateful to speak with Francis Monyango, a tech policy researcher based in Kenya, who discussed the state of data protection legislation there with us. In 2010, Kenya approved a new constitution that includes an article protecting a person’s right to privacy. A Data Protection Bill, which some have compared to the GDPR, was put forward in 2015 and passed in November 2019. While some view the bill in a mostly positive light (albeit long overdue), others, like Privacy International and its partners, cite improvements that could be made. Critics have also noted the timing of the bill’s signing, which came shortly after Kenyans expressed dissatisfaction with the implementation of the National Integrated Identity Management System (NIIMS), also known as the Huduma Namba.

Our conversations with Monyango, together with the rapidly unfolding events around both the Data Protection Bill and the Huduma Namba, illustrate the importance of understanding the complexities at play. They drive home the need to ensure that analyses of data protection regimes honour the context of each piece of legislation: data protection will look significantly different across geographic, political and cultural contexts. We recognise that we were not able to cover the rich diversity of these contexts within the limited scope of this project, and want to emphasise that conversations around the ways in which civil society can leverage data protection legislation, and the fight to advance privacy rights, do not end within the EU or North American contexts. We are excited to see how this conversation unfolds globally.

The GDPR and social justice efforts beyond privacy rights

When beginning research for this series, we were unclear about the extent to which data protection legislation would have an impact on broader social justice missions, outside the purview of privacy rights. We were excited to learn about initiatives like OpenSCHUFA and Worker Info Exchange, which illuminated the ways in which data rights under the GDPR might enable new forms of collective action.

However, we were curious about whether data protection legislation could support other social justice efforts, particularly in places other than the EU. Across contexts and history, we see how social movements involve mobilising and convening communities on the ground to push back against harmful state or corporate behaviour. For instance, in the United States, we have seen a recent emergence of community organising around oppressive corporate and state technology use (think: San Francisco’s recent facial recognition ban). Like the joint efforts behind the ban, social justice organising almost always involves grassroots, coalition-led efforts that are deeply rooted in community.

We posed the question of what role data protection legislation might play in social justice organising to some individuals working at the intersections of technology and racial/economic justice work. Ken Montenegro, a technologist and lawyer supporting social movements, reminded us that ultimately, using legislation as a vehicle for social change doesn’t serve to build the community-based people-power that is so central to movement-building, but rather centres on building state power. He added that while the GDPR has its merits, it “takes what could be a people power building possibility and turns it into a compliance fight—which puts it squarely into the hands of lawyers and providers, rather than the creators of the data points.”

Along a similar line of thought, Jevan Hutson, a researcher and policy advocate at the Technology Law and Public Policy Clinic at the University of Washington School of Law, emphasised that, at the intersection of data and social justice, the important discussion is how we build diverse coalitions that centre communities at the margins. This means involving, and centering, migrants, Black, Brown and Indigenous communities, poor and working class people, those who are unhoused, and LGBTQ communities in the fight to push for progressive data and privacy rights legislation. Hutson expressed that while data protection or consumer privacy legislation plays an important role in advancing data rights, these laws “do not do the heavy lifting to address structural and historical inequities experienced by particular communities”—particularly when they fail to involve communities in their drafting.

Seeta Peña Gangadharan, a research justice organiser in the UK, echoed these sentiments. She spoke to us in the context of her involvement with Our Data Bodies, a community-based research project looking at how digital data collection affects people in marginalised neighbourhoods. She emphasised that “focusing just on legal strategies can only get you so far—often the struggle to dismantle systems of oppression requires you to be more nimble, and think on a longer horizon of time.” Importantly, however, she also reminded us that social justice organising will look very different in the EU than in the United States, which “has its own unique history of racism, classism and other forms of divisiveness.” It is not so easy to extrapolate the potential impact of the GDPR on social justice efforts from the EU to other contexts. Once again, this reiterates the need for a continuing, global conversation about the impact data protection laws could have across varied geopolitical contexts.

Conclusion

We hope this series has sparked inspiration around the potential utility of the GDPR in advancing civil society work. While our series is not meant to be comprehensive, we hope it can contribute to a growing dialogue around the role of data protection in contributing to meaningful social change. We are looking forward to hearing more about the amazing work we did not get a chance to explore, as well as initiatives that fell outside the geographical scope of this project!

Acknowledgements

In the process of researching this series, we had the incredible opportunity to connect with many digital rights groups, privacy experts, and advocates. We are grateful for their generosity in sharing their time, knowledge and perspectives with us. A big thank you to Cynthia Khoo (Citizen Lab); Ailidh Callander and Eliot Bendinelli (Privacy International); Karolina Iwańska (Panoptykon); Paul-Olivier Dehaye (PersonalData.io); David Korteweg (Bits of Freedom); Matthew Rice and Ed Johnson-Williams (Open Rights Group); Arne Semsrott (OpenSCHUFA); James Farrar (Worker Info Exchange); Francis Monyango; Seeta Peña Gangadharan (Our Data Bodies); Ken Montenegro (Center for Constitutional Rights) and Jevan Hutson (Technology Law and Public Policy Clinic).