Error 404: Digital Humanitarianism and the Mirage of Human Rights

On June 6, 2022, government representatives, human rights activists and technologists from around the world convened at the annual RightsCon Summit to discuss the transformation of human rights in the digital age.  UN Special Rapporteurs called attention to the impact of digital technology on humanitarian response, specifically concerning the role of big data. [1]  In a period of increased state violence, geopolitical conflict and environmental catastrophe, there is newfound urgency to improve the efficiency of humanitarian aid.  As with most challenges of the twenty-first century, governments and organizations have turned to technology to address these issues.  However, there is nothing inherently progressive about big data: examined through a humanitarian lens, its collection threatens to undermine existing international human rights law (IHRL), namely the rights to privacy and equality.  In order to protect the rights of data subjects and historically marginalized communities while improving humanitarian aid responses, it is critical to develop international data protection laws. 

Big data, the amassing of colossal amounts of statistical information on social and human behavioral patterns, has become a core feature of the Information Age, and advanced data-driven solutions have moved to the forefront of international human rights efforts.  Currently, one of the greatest critiques of IHRL is its focus on ex post facto responses, such as treaty compliance and other reactive tools, rather than on the prevention of human rights violations. [2]  The urgent need to refocus humanitarian efforts on prevention was emphasized by former United Nations Secretary-General Ban Ki-moon, who pledged to “incorporat[e] human rights and democracy” into “early warning and early action” practices, an objective attainable through predictive data analysis and early warning systems. [3]  From Ushahidi, an open-source crisis-mapping platform that collected digital traces left by mobile devices to improve humanitarian situational awareness after the 2010 Haiti earthquake, to the Amnesty DataKind project, an initiative that sorted through 11,000 data files from Amnesty’s Urgent Action calls to create a preliminary predictive model for forecasting international human rights risks, big data promises to revolutionize IHRL. [4]  Yet, despite the seemingly conspicuous positive impact of big data analysis on human rights efforts, it is critical to consider how data is being collected and by whom.  The most consequential question is: where does power lie in the collection of humanitarian data?

Data-driven humanitarian aid challenges two standards of international human rights law: the right to privacy and the right to equality.  Article 12 of the Universal Declaration of Human Rights (UDHR) and Article 17 of the International Covenant on Civil and Political Rights establish the right to privacy as a fundamental human right that plays a foundational role in protecting personal identity. [5] [6]  However, the increasing reliance of humanitarian information systems on data ranging from wireless carriers and drone imagery to cell site location information (CSLI) and the Global Positioning System (GPS) presents serious challenges to the individual’s right to privacy. 

In Carpenter v. United States (2018), the courts grappled with whether the government’s acquisition of CSLI without a warrant supported by probable cause violated the Fourth Amendment. [7]  In 2011, as part of a criminal investigation into a series of armed robberies, the FBI used CSLI produced by wireless carriers “to obtain 12,898 time-stamped location points cataloging Carpenter’s movements over 127 days”. [8]  The District Court and the Sixth Circuit ruled that Carpenter lacked a reasonable expectation of privacy since he had voluntarily shared his information with third-party wireless carriers.  However, the Supreme Court held that since mobile devices had become “a pervasive and insistent part of daily life,” stored mobile data held the “privacies of life” and should not be exempt from Fourth Amendment protection. [9]  In essence, accessing data collected by third parties, such as wireless carriers and technology conglomerates, without a warrant or explicit consent violates an individual’s right to privacy, even if such data is obtained for seemingly righteous undertakings. 

Digital humanitarianism also poses inherent dangers to the equality rights enshrined in international human rights law.  Principles of equality and non-discrimination underpin the vast majority of IHRL instruments: most notably, Article 1(2) and (3) of the United Nations Charter ground international law in the “principle of equal rights and self-determination of peoples… without distinction as to race, sex, language, or religion,” and Articles 2 and 7 of the UDHR prohibit “distinctions of any kind” with regard to the distribution of rights and freedoms. [10] [11]  Yet, as highlighted by the current refugee crisis in Ukraine, humanitarian aid can contradict these principles of equality.  From groups of Nepalese, Indian and Somali men who described being beaten by Ukrainian guards in Lviv, to Middle Eastern students prevented from boarding buses and trains to Poland, to Moldovan authorities deliberately housing Romani refugees in centers separate from other migrants, historically marginalized groups have faced dehumanizing treatment and blatant discrimination, contradicting the fundamental objectives of IHRL. [12] [13] [14]  Unfortunately, the mobilization of data in humanitarian aid has introduced a new regime of social sorting at state borders.  For example, the European Asylum Dactyloscopy Database (EURODAC), a European Union (EU) fingerprint database for EU asylum seekers, requires individuals aged 14 and older to be fingerprinted and allows all 27 EU nations and 4 associated states to access and store this data. [15]  When considered alongside the existing discrimination in humanitarian aid, the incorporation of biometric and big data creates a system of automated social sorting in which historically marginalized groups can be classified as “undesirable data subjects” prior to reaching border control, paralleling the Eurocentric legacies of “colonial-era human classification,” exacerbating existing societal fractures and contradicting the universal principle of equality. [16]  

Despite the glaring dangers digital humanitarianism poses to human rights, there is a lack of international regulation governing data protection and privacy, especially with regard to humanitarian aid.  Though many jurisdictions have enacted laws governing data collection and processing, such as the General Data Protection Regulation (GDPR) in the EU, the Data Protection Act (DPA) in the United Kingdom and the General Personal Data Protection Law (LGPD) in Brazil, there is no universal framework. [17] [18] [19]  Since digital humanitarianism transcends national borders, it is imperative to establish international guidelines for data protection and the ethical application of big data in humanitarian aid.  To that end, entities such as the Global Privacy Assembly (GPA) and the Office of the United Nations High Commissioner for Human Rights (OHCHR) have created working groups to “develop guidelines and share best practices in privacy and data protection relating to international development assistance.” [20]  In particular, the OHCHR working group calls for the disaggregation of data and stresses that data collection should “not create or reinforce existing discrimination, bias or stereotypes exercised against population groups.” [21]  These values must be translated from working groups into international law. [22]  It is indisputable that big data and technology offer unprecedented tools to preserve human rights around the globe, but a paradox emerges in our international humanitarian data ecosystem: digital humanitarianism endangers the very individual liberties, namely privacy and equality, that international human rights law sets out to protect.  In the face of humanity’s increasing reliance on big data and the social danger of its misuse, it is paramount that international law uphold the core tenets of human rights and protect the most vulnerable populations. 

Edited by Genevieve Cabadas 

Sources:

[1] UN experts highlight digital rights in conflict and humanitarian crises at RightsCon, United Nations Office of the High Commissioner (2022), online at https://www.ohchr.org/en/press-releases/2022/06/un-experts-highlight-digital-rights-conflict-and-humanitarian-crises (visited August 3, 2022). 

[2] Galit A. Sarfaty, “Can Big Data Revolutionize International Human Rights Law?”, 39 University of Pennsylvania Journal of International Law 1, (2018). 

[3] Ban Ki-moon, “Securing our Future: Singapore, the Region and Beyond,” (speech, Singapore, March 23, 2012), Fullerton Lecture Series, online at https://perma.cc/95G5-2MZ3.

[4] Galit A. Sarfaty, University of Pennsylvania Journal of International Law. 

[5] Universal Declaration of Human Rights, United Nations General Assembly (December 10, 1948), online at https://www.un.org/en/about-us/universal-declaration-of-human-rights (visited August 6, 2022).

[6] International Covenant on Civil and Political Rights, United Nations General Assembly (December 16, 1966), online at https://www.ohchr.org/en/instruments-mechanisms/instruments/international-covenant-civil-and-political-rights (visited August 6, 2022).

[7] Carpenter v. United States, 138 S. Ct. 2206, 201 L. Ed. 2d 507 (2018).

[8] Ibid.

[9] Ibid.

[10] United Nations Charter, United Nations (October 24, 1945), online at https://www.un.org/en/about-us/un-charter/full-text (visited August 6, 2022).

[11] Universal Declaration of Human Rights (1948). 

[12] Amie Ferris-Rotman, They Called Ukraine Home. But They Faced Violence and Racism When They Tried to Flee, TIME (2022), online at https://time.com/6153276/ukraine-refugees-racism/ (visited August 3, 2022). 

[13] Moldova: Romani Refugees from Ukraine Face Segregation, Human Rights Watch Report (2022), online at https://www.hrw.org/news/2022/05/25/moldova-romani-refugees-ukraine-face-segregation  (visited August 3, 2022). 

[14] Monika Pronczuk and Ruth Maclean, Africans Say Ukrainian Authorities Hindered Them From Fleeing, The New York Times (2022), online at https://www.nytimes.com/2022/03/01/world/europe/ukraine-refugee-discrimination.html (visited August 3, 2022).

[15] Regulation (EU) No. 603/2013 of the European Parliament and of the Council, Council of the European Union (June 26, 2013), online at https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32013R0603&from=EN (visited August 14, 2022).

[16] Koen Leurs and Tamara Shepherd, “Datafication & Discrimination,” The Datafied Society 211-232 (Amsterdam University Press 2017). 

[17] General Data Protection Regulation, Council of the European Union (April 27, 2016), online at https://gdpr-info.eu/ (visited August 14, 2022).

[18] Data Protection Act 2018, Parliament of the United Kingdom (May 23, 2018), online at https://www.legislation.gov.uk/ukpga/2018/12/contents/enacted (visited August 14, 2022).

[19] General Personal Data Protection Law, Chamber of Deputies of Brazil (August 14, 2018), online at https://lgpd-brazil.info/ (visited August 14, 2022).

[20] Working Group on the Role of Personal Data Protection in International Development Aid, International Humanitarian Aid and Crisis Management, Global Privacy Assembly (July 2021), online at https://globalprivacyassembly.org/wp-content/uploads/2021/10/1.3k-version-4.0-Humanitarian-Aid-Working-Group-EN-adopted.pdf (visited August 6, 2022). 

[21] A Human Rights Based Approach to Data, United Nations Office of the High Commissioner (April 2016), online at https://www.ohchr.org/sites/default/files/Documents/Issues/HRIndicators/GuidanceNoteonApproachtoData.pdf (visited August 6, 2022).

[22] Ibid.