TO: David Beasley, Executive Director, WFP
CC: Enrica Porcari, Chief Information Officer, WFP; Pierre Guillaume Wielezynski, Chief of Digital Transformation Services, WFP; Patricia Ann Kikuko Harvey, Inspector General and Director of Oversight Office, WFP
Dear Mr. Beasley,
We are deeply concerned by the recent announcement from the United Nations World Food Programme (WFP) of a new data partnership with the private software company Palantir Technologies Inc. (Palantir). This partnership has the potential to seriously damage WFP's reputation and its ability to achieve its mandate. Moreover, it has the potential to seriously undermine the rights of the 90 million people WFP serves.
We write to urge you to reconsider the terms and scope of the agreement with Palantir, to be more transparent about the process and terms of the agreement, and to take concrete, immediate steps to mitigate the serious risks of harm arising from the agreement, including by establishing an external committee of experts to review the implementation of the partnership terms.
We, too, are keen for a debate centered on facts rather than speculation – but due to a significant lack of transparency, it is difficult to understand Palantir's business practices and the risks they entail, as well as what WFP has partnered with them to do. From the little that we know, we can only assume that the risks of working with them arise not from data sharing alone, but from the models extracted from WFP's data, the software that is used, and the biases within models that will be applied to the data.
Some of the more salient risks of concern to us include:
De-anonymization. A number of compounding risks arise when huge datasets are merged and analyzed, which is the premise behind Palantir's business model. Even when data points are stored separately, the "mosaic effect" presents a particular risk. Harvard researchers have noted that traditional privacy and anonymization frameworks that focus on identifying and removing personally identifiable information (PII) are "unsustainable and ineffective." Even without full data being shared, significant risks arise from sharing metadata. Because of this, WFP's assurance that no personally identifiable information will be shared offers little protection.
Bias. Another risk involves the use of algorithms to analyze data and flag "potential misuse," a stated goal of the partnership. Based on past business practices, we assume Palantir will be using data models generated from their work with other organizations, and it is impossible to know what biases may be baked into these models. Algorithmic filtering is known to carry a high risk of embedded bias and to produce results of extremely variable quality and reliability. This is especially true in the context of data concerning widely diverse populations.
Rights to data, models and derivative analyses. The latest statement from WFP says that no access to WFP data will be provided under this agreement, and that WFP retains full control over the data, analysis and derivative work. However, it fails to explain what "control" means in a legal sense, or whether it covers models, training data, and/or inferences made by such models. Intellectual property rights over any data, models or derivative analyses must be retained by WFP, or it risks significant financial and intellectual property losses.
Future costs to WFP and sustainability. While the agreement may save money in the short term, long term costs should also be evaluated. Costs of using the system may increase over time, as experienced by other Palantir customers, who have complained about Palantir’s opaque pricing model, spiraling prices and a failure to deliver products. In addition, any future decision to end the partnership could be extremely costly and difficult, as the datasets will be difficult to extract and decouple from Palantir’s software.
Undermining the humanitarian principles. The partnership risks undermining WFP's fundamental humanitarian principles and other core sector standards. For example, the Principles for Digital Development are a powerful force for the transparent, inclusive and equitable use of technology within the aid and development community. WFP is a prominent and influential signatory to the Digital Development Principles, which means it has committed to "embody the concepts of the Digital Principles, represented in our work culture and in the policies and processes guiding our international development activities." An opaque agreement of this kind is difficult to reconcile with those commitments.
Transparency and accountability. Nothing has been transparently shared about the procurement process that WFP set up to engage Palantir. Given the gravity of these concerns, building in transparent checks and balances – such as third-party audits, open procurement, and contract or agreement transparency – seems essential to ensuring that WFP can build upon the knowledge and expertise of experts in this community who want to support innovation and reduce harm. The "checks and balances" mentioned in WFP's latest statement are described without any source documents being shared, making them closed-door agreements rather than actual checks and balances to which WFP can be held accountable. This sets a dangerous precedent for WFP and other humanitarian agencies.
The undersigned organizations and individuals are dedicated to upholding human rights protection around the world. Many of us have worked firsthand with vulnerable populations, including with respect to gathering, sharing, and analyzing data about them in efforts to ensure basic rights and provide much-needed humanitarian aid. We recognize both the value that data can bring to making humanitarian programs more effective and the potential risks to those whose data is being gathered.
We recognize the complexity of the environment in which humanitarian agencies work, and the pressures and constraints that you face. In particular, we understand the constant pressure from donors to increase efficiency and make effective use of every dollar. We are committed to supporting the responsible use of data in the sector, from developing strong, trusted partnerships to developing better practices for information management.
If, as stated in the latest account, WFP is committed to engaging in responsible data practices, we urge you to do so in a transparent and accountable way, engaging meaningfully with this community.
In conclusion, we urge you to take the following actions:
- Release the terms of the agreement in a show of transparency, and commit to sharing contracts with private sector companies in the future;
- Release information about the procurement and due diligence processes that went into deciding to engage with Palantir, as well as an assessment of Palantir’s compliance with the UN Business and Human Rights Guidelines;
- Establish an independent review panel to review the project plan and safeguards; and
- Take any necessary steps to amend the agreement to ensure the privacy and security of the people WFP serves, such as by:
  - Limiting the ability of Palantir to apply models and analyses from WFP's data to other datasets;
  - Ensuring that WFP is not "locked in" to the Palantir system by retaining control and rights over metadata and models built on the WFP data;
  - Demonstrating openness and establishing a consultation process open to the community;
  - Establishing a transparent grievance mechanism for those who may wish to challenge characterizations made based on data analysis;
  - Undertaking human rights and conflict sensitivity impact assessments; and
  - Establishing a clear protocol and agreement with Palantir for terminating the agreement should it become clear that the privacy and security of the people WFP serves cannot be ensured.
These recommended actions also pertain to other humanitarian and public interest organizations that have signed agreements with Palantir, including Mercy Corps, NCMEC, C4ADS, the Carter Center, Team Rubicon, the Enough Project, and the Rockefeller Foundation.
AI Now Institute
Data Justice Lab
Derechos Digitales · América Latina
Digital Civil Society Lab
The Engine Room
Global Justice Clinic, NYU School of Law
Lauri Goldkind, Fordham University
The Good Data Project
Joseph Guay (The Do No Digital Harm Initiative)
Human Rights Data Analysis Group
International Accountability Project
International Rehabilitation Council for Torture Victims (IRCT)
Tom Kunzler, Open State Foundation
Béatrice Leydier, gui2de
The Linkage Project
David Losada Carballo
Connie Moon Sehat
Angela Oduor Lungati
Powered by Data
Public Data Lab
R3D: Red en Defensa de los Derechos Digitales
Linda Raftree (MERL Tech)
David S. H. Rosenthal
Signal Program, Harvard Humanitarian Initiative
Berhan Taye Gemeda
Nick Wehner, Open Communications for The Ocean
Mushon Zer-Aviv, The Public Knowledge Workshop