Building Collective Momentum to Challenge the Ad Tech Industry

June 27, 2019

We talked to the Panoptykon Foundation to learn how digital rights organizations are changing the ad tech ecosystem with the help of the GDPR.

This story is the first of a series on how civil society organisations and activists are using the GDPR (and similar data protection legislation) to advance rights and strengthen their work. It was co-written by Lorraine Chuen and Laura Guzmán. Keep an eye on this space for future stories!


The Issue
Research conducted in late 2018 by Privacy International found that a number of popular Android apps—including Spotify, Duolingo, and TripAdvisor—automatically transferred personal data to Facebook the moment users opened the app, without user consent. They found this occurred even if a user did not have a Facebook account.

The Role of the GDPR 
Privacy International conducted a legal analysis of this data-sharing in a report published in December 2018, which outlined how the applications fell below an acceptable standard with respect to user consent and privacy under the EU’s General Data Protection Regulation (GDPR) and the ePrivacy Directive. Privacy International reached out to the applications directly to share their findings. The report also drew press attention, and in the following months a number of popular apps updated their code so that they no longer automatically share the data in question with Facebook.

Key Takeaways
Filing complaints with data protection authorities can be an effective litigation tool, but it is not always necessary to use the GDPR in this way to push companies to change their data practices. For Privacy International, publishing a legal analysis and reaching out to applications directly—alongside public pressure from the media—proved to be a relatively fast and effective first step for convincing many tech companies to improve their practices around consent, data-sharing and user privacy.

The invasive world of ad tech

Our experience on the Internet today is shaped by an opaque, invasive and quietly booming advertising technology (ad tech) industry. The online advertising industry is built on a system of profiling and real-time bidding. Browser cookies collect information about a user, which, over time, makes up a rich profile about them that can be used to target ads. When a user visits a website with ad space, this detailed profile built from their browsing history, IP address, and/or location is broadcast in real-time to hundreds of potential advertisers, who then bid for ad space based on what they know about the user. 
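The broadcast-and-bid flow described above can be sketched as a toy auction. This is purely illustrative: the field names and bidder logic below are hypothetical, and real exchanges follow the far richer OpenRTB protocol. The key point the sketch captures is that the full profile is shared with every bidder, whether or not they win.

```python
# Illustrative sketch of a real-time bidding auction. Field names and
# bidding logic are hypothetical; real exchanges use the OpenRTB spec.

def run_auction(bid_request, bidders):
    """Broadcast the user's profile to all bidders; the highest bid wins.
    Note that every bidder receives the full bid request, win or lose."""
    bids = [(bidder["name"], bidder["bid"](bid_request)) for bidder in bidders]
    return max(bids, key=lambda b: b[1])

# A bid request bundles the detailed profile built from browsing
# history, IP address, and location.
bid_request = {
    "ip": "203.0.113.7",
    "location": "Warsaw, PL",
    "interests": ["travel", "language-learning"],  # inferred from cookies
}

bidders = [
    # An advertiser that values travel-interested users highly.
    {"name": "adco", "bid": lambda req: 0.40 if "travel" in req["interests"] else 0.05},
    # A bidder with a flat strategy -- it still sees the whole profile.
    {"name": "dataco", "bid": lambda req: 0.25},
]

winner, price = run_auction(bid_request, bidders)
print(winner, price)  # adco wins with the highest bid
```

Even this toy version makes the privacy problem visible: "dataco" loses the auction but has still received the user's IP address, location and inferred interests, with nothing in the protocol constraining what it does with them.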

The information contained in user profiles can be quite invasive. In some cases, users are labelled with sensitive categories based on their religious beliefs, ethnicity, sexual orientation, or private health conditions, putting their rights and freedoms—such as the right to freedom from discrimination—at risk. People are typically the targets of such bid requests without their awareness or consent. There is little oversight on how this data is used, and data brokers in this ecosystem are known to use information for other purposes besides targeting ads. 

Panoptykon: Investigating Ad Tech in Poland

The way the ad tech industry currently works raises serious privacy concerns, and several digital rights groups have begun exploring how the GDPR—the EU’s General Data Protection Regulation—could change the landscape of online behavioural advertising (OBA) in the EU. One of the teams working on this issue is the Panoptykon Foundation, a Polish privacy rights NGO.

Panoptykon focuses on protecting digital rights in our growing surveillance society. Their Executive Director, Katarzyna Szymielewicz, played an early role in European data protection reform, and the organization was curious to see how this much-awaited legislation might affect, or strengthen, their privacy mission. Following the implementation of the GDPR in May 2018, Panoptykon launched a seven-month investigation into the OBA ecosystem in Poland. 

They started their investigation by exploring what information was contained in the user profiles created by ad tech companies, with a particular interest in learning more about the marketing categories people are placed in. Leveraging their right of access (Art. 15) under the GDPR, members of the Panoptykon team sent subject access requests to over twenty online advertising brokers, requesting individuals’ marketing profiles. These brokers operate in notoriously opaque ways, collecting and selling extensive data while sharing little information about their inner workings. Of the more than twenty requests filed, the team received only two partial datasets in return; the remaining brokers either never responded or refused, citing an “alleged difficulty with their identification.”

Under the GDPR, these subject access requests should have been honored. As the responses came in, the Panoptykon team saw that there was no standard way of responding; the GDPR hadn’t changed anything about how the tech ecosystem operated. They contemplated taking the brokers to court, filing complaints for non-response. Before they had the chance to do so, they were approached by a team from the private web browser Brave.

Teaming up to file complaints against Google and IAB 

While Panoptykon was investigating the ad ecosystem in Poland, other groups in the EU were doing similar work. In September 2018, Dr. Johnny Ryan (Brave), Jim Killock (Open Rights Group), and Michael Veale (University College London) filed simultaneous complaints against several ad tech companies to data protection authorities (DPAs) in the UK and Ireland. Their complaints outlined how ad tech companies, including hugely influential players such as Google and the Interactive Advertising Bureau (IAB), violated the GDPR by broadcasting individuals’ intimate personal data to attract real-time bids from advertisers. The complaints argued that, because these broadcasts are not protected against unauthorized access, they constitute a large-scale, unlawful data breach under the GDPR.

After being approached by Brave, Panoptykon joined these complaints, using the GDPR to file complaints against Google and IAB Europe to the Polish DPA in January 2019. The complaint positioned Google and IAB as data controllers, given their role as highly influential players that set standards around how the OBA ecosystem works. The Polish DPA recently decided that Panoptykon’s complaint was a cross-border issue, and referred the complaint to DPAs in Belgium and Ireland, where IAB and Google’s European headquarters are located, respectively. The cross-border nature of the complaint points to the potential for this effort to impact the ad tech ecosystem in the EU more broadly: Ireland’s Data Protection Commission has since opened an investigation into Google, and a decision on whether major players in ad tech like Google will face fines will be made this summer. 

Building collective momentum against the ad tech industry 

These complaints now make up part of a wider campaign called Fix AdTech. In May 2019, parallel complaints were filed to DPAs in Spain, the Netherlands, Belgium, and Luxembourg. These complaints, again, focused on the data breach associated with real-time bidding and were filed by Gemma Galdon Clavell (Eticas Foundation) and Diego Fanjul (Finch), David Korteweg (Bits of Freedom) and Jef Ausloos (University of Amsterdam), Pierre Dewitte (University of Leuven), and Jose Belo (Exigo Luxembourg), respectively. Earlier this June, the Civil Liberties Union for Europe filed a series of complaints around real-time bidding to DPAs in nine additional jurisdictions. Raising similar complaints in multiple countries encourages DPAs to cooperate as they interpret the complaints and apply the GDPR.

These complaints are situated in a broader movement of civil society organizations, including Privacy International, noyb, and the Electronic Frontier Foundation, working to address various issues in the ad tech industry. Given the growing number of parallel efforts, there is the potential to create a massive shift in online advertising in the EU. Indeed, DPAs are beginning to respond: the Information Commissioner’s Office in the UK just released a report on real-time bidding last week, highlighting various concerns around the ad tech industry’s failure to comply with the GDPR.

Karolina Iwańska, a lawyer at Panoptykon, says that with complaints in so many jurisdictions, the focus should now be on the European Data Protection Board, which has more power than an individual DPA to set new standards in the EU. Meanwhile, she says it is also important to work on mapping out alternatives to the current ad tech system. The Fix AdTech campaign suggests the removal or truncation of personal data as an avenue to make online advertising safer, but Iwańska says that this is just one technical solution that does not address all of ad tech’s current problems. She emphasized that understanding how to create a safer, more secure ad tech industry will not be possible without coordination and cooperation across many stakeholders, including both publishers and advertisers.
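To make the "truncation of personal data" idea concrete: one commonly discussed technique is coarsening identifiers such as IP addresses before they enter the bid stream. The sketch below is a minimal, hypothetical illustration of that idea (the function name and defaults are my own, not drawn from the campaign's proposals).

```python
# Hypothetical sketch of truncating an IPv4 address before it is
# shared in a bid request, so bidders receive a coarser, less
# identifying value. Function name and defaults are illustrative.

def truncate_ipv4(ip: str, keep_octets: int = 3) -> str:
    """Zero out the trailing octets of an IPv4 address."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError(f"not an IPv4 address: {ip!r}")
    truncated = octets[:keep_octets] + ["0"] * (4 - keep_octets)
    return ".".join(truncated)

print(truncate_ipv4("203.0.113.7"))  # "203.0.113.0"
```

As Iwańska notes, this kind of fix is narrow: truncating one field does nothing about the rest of the profile (location, interests, browsing history) broadcast alongside it.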

Though it is too early to say to what degree changes in data protection legislation will transform the ad tech ecosystem, the Fix AdTech campaign highlights how the GDPR has created new avenues for civil society organizations to pool their efforts strategically in the fight to advance privacy rights.

Have a story to share?

This piece is part of The Engine Room’s case studies series on using the GDPR in the fight for privacy rights and social justice. If you have a case study to share on how the GDPR – or data protection legislation more broadly – was used to support your work, we’d love to hear about it. Please get in touch with The Engine Room’s Laura Guzman.

About the contributor

Lorraine is a communications professional and designer working at the intersections of technology, storytelling, and social justice. She has a background in media-based organizing, open knowledge initiatives, and data strategy and policy development. She strives to bring a feminist, anti-oppressive lens to discussions around technology, privacy, and surveillance. In Toronto, she works on facilitating spaces for communities of colour to discuss digital justice issues.
