Open Data Meets Responsible Data for a More Sustainable Future


Over the past decade, the hype around access to public information has given way to serious discussions about what it means to collect, store and disseminate data, and about the responsibilities of public and private data holders.

In 2011, discussions within the newly founded Open Government Partnership (OGP) gave rise to a novel collaboration between civil society and public agencies worldwide. The aim of these discussions was to jointly determine best practices and methodologies for making public information available through open databases; shortly afterwards, initiatives hosted by civil society emerged to strengthen the process.

In Latin America, Abrelatam and Condatos are the most well-known of these initiatives, and they have played a vital role in building connections between governments, citizens, and civil society organisations (CSOs) interested in open data for research, advocacy and policy making, engendering high levels of interaction and trust among these stakeholders.

As Paraguay’s main digital rights organisation, TEDIC has participated in discussions around open data at both the national and regional levels since the organisation’s inception. However, after ten years of openness policies, the reflections and lessons learned must inform how these policies evolve and adapt, and ensure that future open data policies adopt a more responsible data approach on the part of both CSOs and governments.

Here, we offer some reflections based on years of engaging with these issues.

Building databases is not enough

At the start of this movement, building databases, apps and websites appeared to be a goal in itself. Since then, we have observed a growing understanding that it is not enough to simply create a database and expect that people will use it. Although ICTs (Information and Communication Technologies) have enormously facilitated access to information in general, governments and CSOs must understand that there are inequalities in the region that need proper attention, and both must work to ensure equitable access to publicly available information. Structural problems such as unequal internet access and low data literacy should be taken into account by policymakers in order to find appropriate ways to disseminate information and engage with citizens.

More collaboration between governments and CSOs

We need robust partnerships between governments and CSOs. TEDIC has documented collaborations with the government that aimed to create civic technology based on public information and databases, but that ultimately failed because public bodies imposed restrictions that made effective reuse by third parties difficult, in violation of OGP open data agreements.

In order to build a society that truly values transparency, we need robust relationships between governments and civil society organisations that ensure that effective reuse of data by different stakeholders is possible. 

Diversity should be incorporated by design

When CSOs and governments design open data policies, there is a lack of representation from LGBTQI communities and grassroots women’s communities. This underrepresentation is a serious problem: it threatens the creation of databases that could systematise these groups’ particular issues and needs, and it risks continuing to make them invisible to both public and private initiatives that could improve their lives.

We need better privacy and data protection regulation

The initial trend of aiming to open up as much information as possible has evolved into a more precautionary approach that also considers the protection of privacy and personal data. The recent enactment of the GDPR in the European Union has forced civil society and even governments to recognise the importance of responsible data management, and to incorporate responsible data practices into their organisational strategies.

This is important, and we are noticing how different countries in Latin America are now moving towards enacting data protection laws and provisions as well. In Paraguay, for example, TEDIC is part of a national coalition that has recently presented a comprehensive data protection bill to Congress, with the endorsement of a number of public and private stakeholders.

The future: going beyond data

The tectonic shift brought about by transparency and open data policies in Paraguay and in the region is undeniable. However, such initiatives need to incorporate provisions to mitigate the potentially harmful effects of creating databases that hold identifiable and sensitive information. We believe that government open data action plans and CSO projects that collect, reuse or store data should, at a minimum, conduct a Data Protection Impact Assessment (DPIA) for any project that involves creating a database. Such DPIAs should be available to the public.
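As an illustration only, the sketch below (in Python) shows one way a publishable DPIA summary could be structured as machine-readable data. Every field name here is our own hypothetical assumption, not part of any formal DPIA standard.

```python
# A minimal, illustrative sketch of a machine-readable DPIA summary that
# could be published alongside an open data project. Field names are
# hypothetical assumptions, not drawn from any official DPIA standard.
from dataclasses import dataclass
from typing import List

@dataclass
class DPIASummary:
    project_name: str
    data_categories: List[str]        # e.g. ["location", "health status"]
    contains_identifiable_data: bool
    lawful_basis: str                 # e.g. "consent", "public task"
    risks_identified: List[str]
    mitigations: List[str]
    review_date: str                  # ISO date of the next scheduled review

    def is_publishable(self) -> bool:
        """Only publish once risks and mitigations are actually documented."""
        return bool(self.risks_identified) and bool(self.mitigations)

# Example usage with made-up values:
dpia = DPIASummary(
    project_name="Open procurement database",
    data_categories=["supplier names", "contract values"],
    contains_identifiable_data=True,
    lawful_basis="public task",
    risks_identified=["re-identification of individual contractors"],
    mitigations=["aggregate small contracts", "remove personal addresses"],
    review_date="2023-01-01",
)
assert dpia.is_publishable()
```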

We also need to go beyond focusing only on databases, apps and websites that store and visualise information. While important, this needs to be part of a broader debate around ensuring connectivity, accessibility and digital literacy. We consider spaces like Abrelatam/Condatos to be vital in tackling these structural problems. In the long run, inequalities in these areas could risk creating first- and second-class citizens, depending on the amount of information and data that people can access and process. We must avoid this at all costs.

Lastly, we believe that there needs to be a more concerted effort to systematise experiences and lessons learned over the last ten years. The experiences of both CSOs and governments could provide insight into both the policies and initiatives that have worked, and those that have not. Building an accessible knowledge infrastructure could help strengthen the current community of practitioners, who aim to create a more responsible data ecosystem that respects people’s and communities’ rights to consent, privacy and data ownership, while also ensuring the transparency, openness and reuse of data our societies need for a healthy democracy. Such is the challenge in the years to come.



The Seven Principles of Data Feminism


In November 2015, Catherine, who was based out of the MIT Media Lab at the time, was invited by Mushon Zer-Aviv to write a blog post for the Responsible Data Forum – an event he was co-organising for January 2016. At the same time, Lauren, who was working at Georgia Tech in Atlanta, was preparing to travel to the NULab at Northeastern University, in Boston, to give a talk on some new research. Unbeknownst to each other, both of us had decided to focus on the same unusual topic: feminist data visualisation. Struck by the coincidence, a mutual friend put us in touch, and we soon began planning a collaboration. 

Our first work together was a short paper, Feminist Data Visualisation, which was in many ways inspired by the conversations and design workshops about ethics in visualisation that took place at the Responsible Data Forum. But as we continued to develop the concept, in conversation with each other and within our respective communities of practice, we realised that a feminist data visualisation, or any data visualisation, represents the output of a much longer and more complicated set of processes. 

We also realised that a feminist approach to data visualisation would need to consider the social, political, and historical context in which these processes took place. And so the concept of feminist data visualisation evolved into data feminism: a way of thinking about data, data systems, and data science that is informed by the rich history of feminist activism and feminist critical thought.

Data feminism begins with a belief in gender equality, and a recognition that achieving equality for people of all genders requires a commitment to examining the root cause of the inequalities that certain groups face today. 

Data feminism is not only about women. It takes more than one gender to have gender inequality and more than one gender to work toward justice. Similarly, data feminism isn’t only for women. Many men, nonbinary people and genderqueer people are proud to call themselves feminists and use feminism in their work. 

Furthermore, data feminism isn’t only about gender. Intersectional feminists like Kimberlé Crenshaw, bell hooks and the Combahee River Collective have taught us how race, class, sexuality, ability, age, religion, geography, and more are factors that work together to influence each person’s experiences and opportunities in the world. Intersectional feminism also teaches us that these experiences and opportunities (or the lack of opportunities, as the case may be) are the result of larger structural forces of power, which must be challenged and changed. In our contemporary world, data is power too. And because the power of data is wielded unjustly, it too must be challenged and changed. 

Underlying this commitment to challenging power is a belief in co-liberation: the idea that oppressive systems harm all of us, that they undermine the quality and validity of all of our work, and that they hinder all of us from creating true and lasting social impact. To guide us in this project, we have developed seven core principles. Individually and together, these principles emerge from a foundation in intersectional feminist thought. 

The seven principles of data feminism are as follows: 

  • Examine power. Data feminism begins by analysing how power operates in the world. 
  • Challenge power. Data feminism commits to challenging unequal power structures and working toward justice. 
  • Elevate emotion and embodiment. Data feminism teaches us to value multiple forms of knowledge, including the knowledge that comes from people as living, feeling bodies in the world. 
  • Rethink binaries and hierarchies. Data feminism requires us to challenge the gender binary, along with other systems of counting and classification that perpetuate oppression. 
  • Embrace pluralism. Data feminism insists that the most complete knowledge comes from synthesising multiple perspectives, with priority given to local, Indigenous, and experiential ways of knowing. 
  • Consider context. Data feminism asserts that data is not neutral or objective. It is the product of unequal social relations, and this context is essential for conducting accurate, ethical analysis. 
  • Make labour visible. The work of data science, like all work in the world, is the work of many hands. Data feminism makes this labour visible so that it can be recognised and valued. 

In our book, Data Feminism (MIT Press, 2020), we explore each of these principles in more detail, drawing upon examples from the field of data science, expansively defined, to show how that principle can be put into action. 

Along the way, we introduce key feminist concepts like the matrix of domination (Patricia Hill Collins), situated knowledge (Donna Haraway), and emotional labour (Arlie Hochschild), as well as some of our own ideas about what data feminism looks like in theory and practice. To this end, we introduce readers to a range of folks at the cutting edge of data and justice. These include engineers and software developers, activists and community organisers, data journalists, artists, and scholars.

This variety of people, and the variety of projects they have created or helped to create, is our way of answering the question: What makes a data science project feminist? As we assert, a data science project may be feminist in content, in that it challenges power by choice of subject matter; in form, in that it challenges power by shifting the aesthetic and/or sensory registers of data communication; and/or in process, in that it challenges power by building participatory, inclusive processes of knowledge production. What unites this broad scope of data work is a commitment to action and a desire to remake the world to be more equitable and inclusive. 

Our overarching goal is to take a stand against the status quo: against a world that unfairly benefits rich, white, cisgender, heterosexual, non-disabled men from the global north at the expense of others.

Our principles are intended to function as concrete steps to action for data scientists seeking to learn how feminism can help them work toward justice, and for feminists seeking to learn how their own work can carry over to the growing field of data science. They are also addressed to professionals in all fields in which data-driven decisions are being made, as well as to communities that want to resist or mobilise the data that surrounds them. 

They are written for everyone who seeks to better understand the charts and statistics that they encounter in their day-to-day lives, and for everyone who seeks to communicate the significance of such charts and statistics to others. 

Borrowing from bell hooks, we say: data feminism is for everyone. Data feminism is for people of all genders. It’s by people of all genders. And most importantly: it’s about much more than gender. Data feminism is about power, about who has it and who doesn’t, and about how those differentials of power can be challenged and changed. 

More About Data Feminism

Data Feminism is an open access book published by MIT Press in 2020. You can read it for free online at https://data-feminism.mitpress.mit.edu/ or buy it from your local independent bookstore. 



Re-Imagining a Responsible Approach to Informed Consent


Earlier in 2021, Human Rights Watch published a report about UNHCR’s improper collection and sharing of data pertaining to Rohingya refugees in Bangladesh, in which they found that the agency failed to conduct a full data impact assessment, as its own policies require. In some cases, the report shows, UNHCR had failed to obtain refugees’ free and informed consent to share their data with Myanmar, the country they had fled. The report sparked a discussion on the Responsible Data listserv about informed consent. 

This was not the first time – and will probably not be the last – that the topic of informed consent has taken centre stage in discussions among the RD community. Since its inception, the RD community has tried to grapple with the ethical challenges presented by relying on informed consent as a legitimate basis for data collection, particularly (though not exclusively) in the humanitarian aid sector.

The promises and perils of informed consent in data processing

At its heart, informed consent is about upholding dignity for individuals and communities involved, regardless of who is doing the data collection – whether researchers, governments or aid organisations. 

In many cases, humanitarian organisations rely on informed consent from the communities they serve to legitimise the collection and use of their information. The emergence of new technologies, however – combined with a rapid increase in the amounts of beneficiary data collected – has heightened and widened concerns about the validity of informed consent in this context.

Back in 2014, in the early days of the budding RD community, a small group of members came together to discuss what consent policies for civil society organisations can and should look like, recognising the thorniness of this topic. Not long after, the community explored the role of informed consent in crowdsourced and user-generated data for advocacy at a Responsible Data Forum in Nairobi, organised in partnership with Amnesty International. In many ways, the questions that emerged at this RDF – about informed consent’s relation to technology, duty of care, and the education of data subjects – still resonate today.

Seven years later, these questions are still front of mind for the RD community, as we see examples of data collection processes that fail to adequately seek the informed consent of the individuals whose data is being collected. Recent research has highlighted how data collection processes fail to take into account the particular experiences of vulnerable communities, and to integrate contextual interpretations of informed consent within these communities. Dragana Kaurin conducted research into the collection and use of personal data of refugees who have arrived in Europe since 2013, and found that informed consent is rarely sought. When conducting research with migrants and refugees arriving in Italy in 2019, Data & Society noted that "there is a lack of meaningful informed consent in the collection of information from migrants and refugees," and that, consequently, migrants may not be truly giving meaningful consent due to cultural differences, knowledge gaps, or power inequalities. Recent research by The Engine Room on the lived experiences of marginalised communities with digital ID systems in five countries also showed how informed consent processes fell short of providing refugees with clear and accessible information about the processing of their personal data.

In many of these cases, informed consent fails to take into account the specific contexts and needs of the communities at hand; particularly in environments with stark power imbalances, standard approaches to informed consent have proven to be insufficient in empowering these communities to exercise their agency. Ultimately, if an individual is not made aware of the implications of their choice or cannot say no, then consent to data processing cannot be regarded as valid.
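To make the conditions just described more concrete, here is a minimal sketch (in Python) of how the preconditions for valid consent might be recorded alongside collected data. All field names are hypothetical and purely illustrative; this is not any organisation’s actual schema.

```python
# A minimal sketch of recording the preconditions for valid informed
# consent alongside collected data. Field names are hypothetical
# illustrations of the concept, not a real schema.
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    purpose_explained: bool     # was the intended use of the data explained clearly?
    language_understood: bool   # in a language and format the person understands?
    alternative_offered: bool   # could they receive assistance without consenting?
    free_to_refuse: bool        # no penalty or pressure attached to refusal?
    revocable: bool             # can consent be withdrawn later, with a known process?

    def is_valid(self) -> bool:
        """Consent fails if any single condition is unmet: if a person is
        not made aware of the implications or cannot say no, their
        consent cannot be regarded as valid."""
        return all([
            self.purpose_explained,
            self.language_understood,
            self.alternative_offered,
            self.free_to_refuse,
            self.revocable,
        ])
```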

Re-imagining our approach to informed consent

Seeking consent in a way that ensures those providing consent actually have the information, agency, control and alternatives they need and are entitled to is a key part of using data responsibly. However, if power asymmetries between those doing the collecting and those from whom data is collected prohibit the implementation of free and informed consent processes, can we consider informed consent a legitimate basis for data collection at all? 

As we continue to tackle this question within the RD community, we have witnessed how organisations are starting to recognise the limitations of informed consent. In a recent blog post that accompanied the publication of its biometrics policy, the ICRC wrote that, while "rendering its data processing as transparent as possible to its beneficiaries and affected populations", it "does not believe that consent provides a legally valid basis for data processing in many emergency situations."

As we continue to re-imagine the place of informed consent in the humanitarian space, we hope the RD community can continue to provide a space where practitioners can discuss what informed consent could, or should, look like in the future.



Responsible Data and MERL


Monitoring, evaluation, research and learning (MERL) are by nature data-heavy activities. It makes sense, then, that over the past decade, the use of digital technology and digital data has permeated the practice of MERL.

In the social change sphere (humanitarian aid, development work, human rights, and program areas such as health, education, social protection, and protection overall), much of the data that we collect when conducting MERL comes from at-risk populations or underrepresented populations. This data influences decisions to support these same populations to access rights and services. It also tells us whether or not our interventions have worked and for whom. For this reason, a responsible data approach that takes data ethics and data protection into account is an imperative for MERL. 

Around 2013, the use of digital tools and platforms to support MERL began to claim more attention among a small set of ‘early adopter’ MERL practitioners. We began seeing mobile devices used to collect data and to encourage community feedback on programs. The use of satellite data and participatory mapping projects became more prominent, and great hope was placed on crowdsourcing and citizen journalism for gathering insights. In 2014, Michael Bamberger and I took a closer look, laying out an initial landscape of digital approaches to monitoring and evaluation.

At the same time, following a discussion on the ethics of participatory mapping, a group of practitioners (including The Engine Room’s founders) assembled to look more closely at ethics in technology for development. We were concerned that development agencies were pushing innovation and technology in development while being largely unaware of data ethics, privacy and security issues that could expose individuals and communities to risk. Our first MERL Tech Conference also happened that year.

When adoption of innovation is used as the principal indicator of success or failure, the wider positive or negative ramifications – including unintended consequences, costs and risks – are likely to be overlooked, as Glover has noted. Close overlap with the Responsible Data community has helped ensure that the MERL Tech community continuously reflects on these concerns. We have created a space for dialogue and discussion between tech developers, early adopters of technology in MERL, privacy and ethics advocates, and those who are newly learning about how technology can enable MERL. This has allowed the sector to improve its use of digital data, to highlight potential negative outcomes for vulnerable groups, and to introduce approaches that mitigate the harm that can come from the collection and use of MERL data.

The Responsible Data and MERL Tech communities have together explored many such areas of concern, including consent in the digital age, developing responsible data policies, and operationalising responsible data policies. As the MERL community matured, discussions about RD moved from the margins to the centre.

Over the years, we’ve seen heated debates about the potential for harm and unintended consequences stemming from digital approaches or from partnerships whose value and ethics are questioned. Together, the MERL Tech and RD communities advocated for ethical frameworks and better research on the potential for technologies and digital approaches to do harm. For example, research by Oxfam and The Engine Room on balancing the risks and benefits of using biometrics in the humanitarian field has helped to shape how we think about the role of emerging technology. We also co-curate a Responsible Data Resource List which lives on the MERL Tech site and on the RD website. Key to these discussions is the active and lively community debate that happens on the RD listserv, at events hosted by both organisations, and at the New York City Technology Salons, where additional discussions on these themes have taken place. This constant reflection on the responsibilities of MERL professionals to use data in ethical and responsible ways has strengthened the MERL Tech space.

While collective progress is being made in documenting and assessing technology-enabled MERL initiatives and good practice guidelines are emerging, ethical questions related to these new and emerging methods and approaches remain. In 2022, we are talking about much more than mobile data gathering, mapping and crowdsourcing, as we have explored in a series of MERL Tech State of the Field reports. Alongside more traditional uses of digital tools and data, we see ‘big data’ and predictive analytics sitting squarely in the MERL space. Emerging tools and technologies such as blockchain, artificial intelligence and machine learning, new forms of data storage, text and voice analytics, biometrics, non-traditional data and metadata are being explored as part of the MERL toolbox. Collaboration with the RD community helps the MERL Tech community ensure that we are looking at the ethics and RD issues that come with new and emerging approaches to MERL. 

Going forward, the MERL Tech community is addressing the fallout of COVID-19, which has made digital MERL and remote monitoring even more relevant. Building on a Responsible Data in MERL during COVID-19 series of events (co-hosted with the CLEAR Center in Anglophone Africa), we have convened a Responsible Data in M&E (RDiME) working group and community of practice that focus on these issues in the African context. Members of the working group developed guidance on data governance and responsible data practices for MERL, with a focus on African contexts. We will also continue our focus on documenting, sharing, learning, training and guidance: how to improve rigour, validity, representativeness and inclusion; how to enhance the safeguarding of vulnerable individuals and groups; and how to assess new approaches and methods to ensure that ethics and safeguarding are included in MERL design and implementation.

Ultimately, we hope our two communities will continue to collaborate in order to strengthen the MERL sector through intentional, responsible, and ethical approaches to technology and digital data.



What Do Responsible Data and Acrobats Have in Common?


At the Responsible Data Forum in 2014, a participant described an ‘acrobat’ data controller weighing up opportunities and risks before stepping out on the ‘tightrope’ of data collection. In our case, the data controller is an international non-governmental organisation (INGO) or local partner deciding whether to collect or process data in a humanitarian or social justice situation. 

Responsible Data is a reference point for balancing decisions about the stories data can tell alongside the rights people hold within dynamics of power. After surveying the scene and making a plan, a codified set of principles can help us keep our balance, or decide whether or not to stop – if, for example, we’re unsure whether our acrobat can cross the tightrope safely, or indeed whether they are the best person for the job.

Values-based decisions

Since 2014, many organisations, including Oxfam, have codified their data-related commitments in Responsible Data policies, crucially linking approaches to safety, dignity and rights. In conjunction with laws, policies offer a framework. 

As data collection activities often involve subjectivity, judgement, and necessary decisions around proportionality, policies based on principles are important to set out the values that frame practical decisions, as well as to start conversations and stimulate thinking on the application and meaningful use of data.

In this respect, Responsible Data isn’t only about complying with a set of predefined rules; it’s about balancing the opportunities against the risks: only collecting what we need (data minimisation), or working with partners who may be better placed to collect the data that’s needed; being clear about the data we keep (and why) and the data we never collect; understanding who might be highlighted in the data we have, and who is missing from it.
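As a simple illustration of what data minimisation can look like in practice, the sketch below (in Python) strips each collected record down to an explicit allow-list of fields before anything is stored, so that anything not deliberately chosen is never kept. The field names are hypothetical.

```python
# A minimal sketch of data minimisation: keep only an explicit allow-list
# of fields, dropping everything else by default. Field names are
# illustrative assumptions, not a real data collection schema.
from typing import Dict, List

ALLOWED_FIELDS: List[str] = ["household_size", "district", "assistance_type"]

def minimise(record: Dict[str, object]) -> Dict[str, object]:
    """Retain only the fields we have decided we need; names, phone
    numbers and other extraneous details are discarded by default."""
    return {key: value for key, value in record.items() if key in ALLOWED_FIELDS}

raw = {
    "name": "A. Person",       # never stored
    "phone": "+000 000 000",   # never stored
    "household_size": 5,
    "district": "North",
    "assistance_type": "cash transfer",
}
print(minimise(raw))
# {'household_size': 5, 'district': 'North', 'assistance_type': 'cash transfer'}
```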

While a new lens is needed to see how inequalities show up differently in the use of data amid rapid technological advancements, Responsible Data can be an anchor through this change, as we apply the same principles no matter how sparkly the new tools. This is ever more important, as the COVID-19 pandemic has resulted in a sharp increase in technology hype and in concerns over privacy in applications like contact tracing systems.

It’s important to recognise that the time and resources needed to write up a policy, and the power to enforce it, are themselves a privilege. Organisations, government agencies and service providers develop and implement policies for people who may not be aware of their existence or know anything about them. Data protection frameworks, regulators and national discourse about the role of data as an expression of power reflect widespread concern and different approaches to solving the systemic problems at hand. We expect these conversations to increasingly take place through civic and public frameworks and expression – and to see the Responsible Data movement take leadership from outside Europe and North America. For example, Kenya passed a data protection law in 2019 and appointed its first Data Protection Commissioner in 2020.

Data and power – organisations and individuals

Data is inherently linked to power. Understanding who the acrobat is and what kind of power they hold in their role is crucial. Those who design participatory systems and methodologies can affect who shows up in data, whose voice is heard and, in turn, how decisions get made. Many INGOs are on a reflective journey to acknowledge the power we hold in partnerships and relationships, as we rethink both our role in shifting power and how we can show up in a way aligned to equity, inclusiveness, anti-racism and accountability.

In turn, we consider the power of individuals within organisations (or adjacent service providers) who end up on the front lines of implementation in their relationships within a community. Looking at the use of data itself with a power lens asks us to reflect on whether data subjects really know what their data is used for or what stories are told about them. 

Most of us now agree that consent is very rarely ‘informed’ or freely given – especially in contexts where there is a significant power differential (such as humanitarian response). In reality, people often share data with the understanding that being interviewed makes them more likely to be eligible for assistance; and for local partners, funding pressures demand collecting and sharing more data to back up proposals or reporting.

Sharing is caring: policy to practice

For Responsible Data to be meaningful, we must move from wordsmithed policy documents to winning over the hearts and minds of those making decisions about data collection and use in the course of their work, as we realise that we all have a role to play. We need to emphasise work on culture, which frames a collective understanding of what ‘doing the right thing’ looks like in pragmatic, solutions-orientated ways. This goes hand-in-hand with building a broader culture of accountability that is not only open to feedback, but willing to change as a result.

The Responsible Data community has been a great example of active resource-sharing, and many of the resources shared have been practical, actionable and supportive of learning. The more such resources and tools that are available within our community – focused on values-driven action, on reducing the burden on implementers, and presented in a form anyone can interpret and use reflectively for their own ends – the better. Context, discussion, and co-creation all matter when ‘landing’ principles in practice.

The future of Responsible Data must focus on shifting power in the data landscape, including around who is leading the charge and shaping the narrative about what Responsible Data looks like from policy to practice. We should perceive the acrobat stepping out not just as an INGO waving around their policy, but as individual staff members or partners making daily decisions about data. To support that judgement, we need to work collaboratively to emphasise the practical, principle-based day-to-day decisions, in order to see more clearly the tightrope ahead of us. 

