Measuring What Matters (Human Rights) and Doing It Responsibly


September 5, 2017

This post was written by Anne-Marie Brook, co-founder of the Human Rights Measurement Initiative (HRMI). Currently, HRMI is accepting nominations for countries to participate in its pilot project; nominations close Friday, September 15. If you’d like to find out how to submit a nomination, or share your ideas more generally, email Anne-Marie at anne-marie.brook@motu.org.nz. If you’d like to sign up to HRMI’s occasional newsletter, you can do that here.

Measuring human rights is hard, but that’s no excuse not to do it. The Human Rights Measurement Initiative (HRMI) aims to use responsible and inclusive approaches to produce a comprehensive set of metrics on countries’ human rights performance.

Right now, we are focusing on developing new measures of civil and political rights. We believe it’s important to do this in a way that is both inclusive and responsive to the needs of human rights advocacy communities and other users of HRMI data around the world. We are doing this by:

  1. Using a co-design approach
  2. Designing with safety first
  3. Blending in
  4. Going where the energy is

Using a co-design approach

Collaborating with users to design our methodology

A couple of weeks ago I held online video calls with a human rights lawyer in Mexico, a journalist in Mozambique and a human rights advocate in Fiji. In each case, I sent them a link to the latest version of our online expert opinion survey and asked them to fill in the survey while ‘sharing their screen’, so that I could see what they were doing. While they walked through the survey, I asked them to talk out loud about what they were seeing, and how they were experiencing the survey.

In co-design language, we call this ‘testing a prototype’. Co-design, short for ‘co-operative design’, is a participatory approach to designing a product that actively involves a range of stakeholders in the design process, helping to ensure that the end product actually meets their needs. In our case, we are designing new metrics in the area of civil and political rights, and our design process has actively incorporated dozens of people from around the world. Some of these people attended the co-design workshop we ran at the University of Georgia in March this year. Others have participated in one-on-one ‘testing’ sessions held via Skype or similar video-conferencing software.


By watching people actively engage with our prototype, we pick up on all sorts of things that respondents might not notice themselves. For example, I saw that when rating a country on a scale from 0 to 10, a respondent who wanted to choose zero (e.g. for use of the death penalty) would leave the default marker at zero without actively ‘selecting’ it. The survey therefore stored no response, and gave no warning that the answer had been missed. Once we saw the pattern, it was easy to fix.
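The underlying problem is a classic one in form design: a default value is indistinguishable from a deliberate choice of that same value. A minimal sketch of one way to fix it (hypothetical code, not HRMI's actual survey software) is to represent an untouched slider as "no answer" rather than as zero:

```python
# Hypothetical sketch: represent "no answer yet" as None rather than
# defaulting the slider value to 0, so an explicit choice of zero can
# no longer be confused with an untouched slider.

def record_response(selected_value):
    """Return the value to store, or raise if the slider was never touched."""
    if selected_value is None:
        # The respondent never actively selected a value.
        raise ValueError("Please select a rating before continuing.")
    if not 0 <= selected_value <= 10:
        raise ValueError("Rating must be between 0 and 10.")
    return selected_value
```

With this representation, an explicit zero (`record_response(0)`) is stored normally, while skipping the question prompts the respondent to answer.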

So far, we feel our prototype is pretty good. But Steve Jobs probably thought the first iPhone was pretty good too, and these days it looks pretty clunky. While we think the survey that has emerged from this co-design process is great for right now, we have no doubt that future iterations will be even better.

Designing with safety first

Collecting data safely from key actors in high-risk countries

An unfortunate reality is that human rights advocates and journalists are often at risk of persecution from their governments. Since these brave and dedicated people often have the expertise to serve as expert survey respondents, it is a priority for us to safeguard their identities and minimise the chance that participating in HRMI’s data collection process will put them in any additional danger.

HRMI collects two types of sensitive information as part of our civil and political rights metrics development, and we have put in place a clear process to collect and store this information securely. The first type is the names and contact details of potential survey respondents. For our pilot study, we are collecting these via an online nomination form on a website secured by HTTPS (with a valid SSL certificate), so the names and contact details are encrypted in transit. This information is sent to a dedicated email address hosted by an email service provider based in Switzerland, which has very strict privacy laws, and stored in files with the same provider. Access to this information is restricted to one or two HRMI staff based in New Zealand, and the information is used only to send out links to the online survey.


The second type of sensitive information we collect is the responses to our civil and political rights expert survey. Again, this information will be collected via online survey software secured by HTTPS, which encrypts all information submitted via the survey. Neither we, nor anyone else, will know whom each set of responses has come from, and we have put in place a process for stripping the most sensitive information from these responses before they are summarised into generic metrics and published on our website. Overall, the biggest risk we can see for respondents is that someone could hack into their email and see that they have received a survey link from HRMI. A hostile government would then know that the person is potentially contributing to our civil and political rights metrics, but it would not be able to access the submitted information itself. For potential survey respondents who are concerned about this risk, we will provide information on how to hide their online activity. For more detail, see our security policy.
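The "strip, then summarise" step described above can be sketched in a few lines. This is an illustrative sketch only, with hypothetical field names, not HRMI's actual pipeline: identifying fields are dropped from each response before the remaining scores are aggregated into a single generic metric.

```python
# Illustrative sketch (hypothetical field names, not HRMI's actual
# pipeline): strip identifying fields from each response, then
# summarise the remaining scores into one published metric.
from statistics import mean

# Fields treated as too sensitive to retain (hypothetical examples).
SENSITIVE_FIELDS = {"ip_address", "free_text_comments"}

def strip_sensitive(response):
    """Drop the most sensitive fields before any aggregation or storage."""
    return {k: v for k, v in response.items() if k not in SENSITIVE_FIELDS}

def summarise(responses, metric):
    """Average one metric across cleaned responses to give a generic score."""
    cleaned = [strip_sensitive(r) for r in responses]
    return round(mean(r[metric] for r in cleaned), 1)
```

The design point is the ordering: sensitive fields are removed before anything is aggregated or stored, so the published metric cannot leak them.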

Blending in

Choosing the right technology (and questions) for each context

When you don’t want to draw attention, it helps to blend in. For now our survey vehicle is the early-stage prototype that we know we’ll improve. As we advance, we hope to do this in part by developing country-specific or culture-specific variants. I can think of two ways this might happen.

First, we could vary our survey software to fit country-specific preferences. For example, the Great Firewall of China has led to the development of many substitute products for online interaction in China. We might need to vary our own survey software across countries to ensure access to human rights experts worldwide.


Second, there is some scope for varying the survey questions themselves across countries. For example, as well as collecting information on the overall prevalence of abuses of each human right, we will collect information on which population groups are particularly vulnerable. As we build up a picture of which groups are especially vulnerable in each country, our survey questions can be tailored accordingly. Survey respondents in Malawi might be asked about discrimination against people with albinism, while respondents in the Slovak Republic might be asked about discrimination against Roma.
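One simple way to structure this kind of tailoring is a shared base question set extended with country-specific additions. The sketch below is hypothetical (the country codes and question wordings are illustrative, not HRMI's actual survey content):

```python
# Hypothetical sketch of per-country tailoring: a shared base question
# set, extended with questions about groups known to be vulnerable in
# each country. Codes and wordings are illustrative only.
BASE_QUESTIONS = [
    "How prevalent was use of the death penalty?",  # asked everywhere
]

COUNTRY_QUESTIONS = {
    "MWI": ["How prevalent was discrimination against people with albinism?"],
    "SVK": ["How prevalent was discrimination against Roma?"],
}

def questions_for(country_code):
    """Return the base questions plus any country-specific additions."""
    return BASE_QUESTIONS + COUNTRY_QUESTIONS.get(country_code, [])
```

Countries with no known additions simply receive the base set, so the survey degrades gracefully as the picture of vulnerable groups is still being built up.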

Going where the energy is

Selecting pilot study countries on a volunteer basis only

We are currently looking for nominations of countries to include in our pilot study. We would like to include 12 diverse countries – covering a range of sizes, income levels, cultures, and degrees of openness. The one thing that all 12 countries will have in common is that they will have been volunteered for inclusion by human rights practitioners working in and on those countries. We don’t want to drive our early-stage prototype vehicle into a neighbourhood where it’s not wanted. We are looking for the early adopters who are keen to use this opportunity to draw attention to their country’s performance on seven civil and political rights, while helping us develop a better vehicle for roll-out to the rest of the world.


Country nominations close on Friday, September 15, and a list of the selected countries will be published in October. If you’d like to find out how to submit a nomination, or share your ideas more generally, email Anne-Marie at anne-marie.brook@motu.org.nz. If you’d like to sign up to HRMI’s occasional newsletter, you can do that here.

About the contributor

Anne-Marie is a former OECD economist with a passion for helping to bring about systemic change. She believes that having good metrics for tracking the human rights performance of countries is a prerequisite for the collective action needed to transform our world for the better. She co-founded the Human Rights Measurement Initiative (HRMI) from New Zealand, and leads it as an innovative collaboration of human rights experts from around the world. She encourages anyone interested in contributing their skill-set to this mission to get in touch.
