Technology, Health, and Human Rights: A Cautionary Tale for the Post-Pandemic World
Rajat Khosla
Health and Human Rights Journal, December 3, 2020

The tendency of data collected for one purpose to be repurposed for another, unforeseen purpose is a fundamental component of the human rights risk posed by state surveillance and, as such, needs to be addressed with adequate safeguards. This means, as our research has shown, that we need strong data protection rules, legal safeguards, and meaningful regulation of the surveillance industry as we enter this new world of massive data collection within the context of public health concerns. 4

Unfortunately, we are already seeing rumblings of such government abuse. For example, Israeli authorities attempted to grant the security services access to contact tracing data. 5 While this proposal appears to have been withdrawn, a look at the United States gives a dire warning of where such data sharing could lead. In the United States, the Immigration and Customs Enforcement Agency uses technologies provided by Palantir, a secretive tech giant, to conduct immigration raids that have led to hundreds of arrests, deportations, and family separations. 6 In April 2020, Palantir won a contract with the Department of Health and Human Services to build the "Protect Now" platform, aggregating over 187 different data sources from the government and private sector. Given previous examples of the Department of Health and Human Services sharing data with the Immigration and Customs Enforcement Agency, policies and practices around the use of technology within the context of public health pose serious concerns, especially for groups in particularly vulnerable situations. 7

In response, Amnesty Tech continues to make targeted and mass state surveillance an ongoing focus of our work, both in our investigations via our Digital Security Lab (which leads technical investigations into cyber attacks against civil society and provides critical support when individuals face such attacks) and in our advocacy and legal efforts, while taking account of the new ways in which states' use of technology to respond to the pandemic may exacerbate these harms globally.

The second cause for concern is the ways in which employers can potentially abuse employee health data. Can an employer be allowed to demand that employees take COVID tests or reveal their status? How can an employer use this information, and with whom can it be shared? Furthermore, existing regulations governing the collection and use of health data have not kept pace with a rapidly changing economy, especially in the United States. For instance, a gig worker may have very different legal protections, and face different vulnerabilities, than a contract employee. The intersection of the right to health and the right to privacy requires more robust data protection standards. Amnesty Tech is analyzing existing legislation and upcoming efforts that may offer protection in some circumstances across jurisdictions governed by differing data protection frameworks, in order to inform our work to ensure human rights-compliant safeguards in these new contexts.

The third cause for concern is health data taking on a key role in the expansion of the surveillance-based business model that dominates the tech sector, whereby people's digital data are bought and sold as a commodity. It is crucial to understand that these risks and harms take place against the backdrop not only of this business model but also of a generally thinly regulated marketplace in data.
As we pointed out in our report Surveillance Giants, the internet is dominated by companies whose primary means of earning profit is advertising sales premised on their ability to collect, analyze, and draw inferences from massive amounts of our personal data. 8 Consider the gathering and use of health data for advertising purposes. While this practice did not start with the pandemic, it has accelerated during this time. Numerous firms have been collecting health data from consumer products for some time now, including from wearable fitness trackers, genetic test kits, and myriad other products. These valuable data can fuel analytics aimed at predicting consumer habits or choices and can be purchased to increase data companies' resources and value. In 2019, Fitbit's CEO stated that "ultimately Fitbit is going to be about the data" rather than its hardware or devices. 9 In 2020, Google acquired the company for US$2.1 billion. 10

This massive accumulation of personal data usually occurs without individual consent (or with "consent" that is far from adequate under most data protection regimes), but the risk of harm is compounded when the data in question can be resold or shared without adequate safeguards. Moreover, the data are often useful insofar as they provide the basis for predictions about our behavior. While the underlying data themselves may be subject to protections in some jurisdictions, the inferences based upon them often are not, creating a particularly complex scenario. 11 Inferences created from personal health data, known as "emergent medical data," carry tremendous risks for human rights. 12 A health insurer may deny coverage based on a prediction made about a person to which they never consented and about which they may not even know.
Likewise, artificial intelligence can monitor people's movements to track the spread of infectious disease, or their purchases to infer a person's pregnancy status. Just as worrying is how frequently these predictions are inaccurate. 13 Moreover, without proper data subject rights or other avenues through which to claim a remedy, we are left with little recourse.

In response, Amnesty Tech continues to push for a human rights-respecting business model for the internet, as well as safeguards for AI and machine learning systems, such as the Toronto Declaration, which highlights principles for protecting the rights to equality and nondiscrimination in machine learning systems. 14 We will do this while continuing to expose and oppose the harms created by the current business model, as well as any additional harms that may emerge from extensive and invasive analysis of our health data and the uses of the inferences that flow from them.

Our rights to health and privacy are now more interlinked than ever before. Health data pose significant risks at the intersections of state surveillance, a surveillance-based internet, and data protection, all of which lack adequate safeguards to protect the rights at risk. George Orwell once wrote, "Who controls the past controls the future. Who controls the present controls the past." Without adequate safeguards and protection of rights in the digital space, we risk the health and well-being not only of people today but also of future generations.

References

- Amnesty International, Bahrain, Kuwait and Norway contact tracing apps among most dangerous for privacy.
- When best practice isn't good enough: Large campaigns of phishing attacks in Middle East and North Africa target privacy conscious users.
- Amnesty International, Ending the targeted digital surveillance of those who defend our rights: A summary of the impact of the digital surveillance industry on human rights defenders.
- seeks-to-give-police-unrestricted-access-to-covid-contact-tracing-data-1.9261494
- Amnesty International, Failing to do right: Urgent need for Palantir to respect human rights.
- Why are we trusting a company with ties to ICE and intelligence agencies to collect our health information?
- Surveillance giants: How the business model of Google and Facebook threatens human rights.
- Fitbit's healthcare unit to deliver $100 million in revenue in 2019.
- Emergent medical data: Health information inferred by artificial intelligence.
- Amnesty International, The Toronto Declaration: Protecting the rights to equality and non-discrimination in machine learning systems.

Acknowledgments

I would like to thank Joshua Franco, Tamaryn Nelson, and Rasha Abdul Karim for their inputs on the draft manuscript.