title: A Survey of Requirements for COVID-19 Mitigation Strategies. Part I: Newspaper Clips
authors: Jamroga, Wojciech; Mestel, David; Roenne, Peter B.; Ryan, Peter Y. A.; Skrobot, Marjan
date: 2020-11-16

The COVID-19 pandemic has influenced virtually all aspects of our lives. Across the world, countries have applied various mitigation strategies for the epidemic, based on social, political, and technological instruments. We postulate that one should identify the relevant requirements before committing to a particular mitigation strategy. One way to achieve this is through an overview of what is considered relevant by the general public, and referred to in the media. To this end, we have collected a number of news clips that mention the possible goals and requirements for a mitigation strategy. The snippets are sorted thematically into several categories, such as health-related goals, social and political impact, civil rights, ethical requirements, and so on. In a forthcoming companion paper, we will present a digest of the requirements, derived from the news clips, and a preliminary take on their formal specification.

The COVID-19 pandemic has influenced virtually all aspects of our lives. Across the world, countries have applied wildly varying mitigation strategies for the epidemic, ranging from minimal intrusion in the hope of obtaining "herd immunity", to imposing severe lockdowns at the other extreme. The strategies are based on various social, political, and technological instruments. According to Digital Rights Tracker, as of June 2020:
• Contact Tracing Apps were being used in 28 countries,
• Alternative digital tracking measures were active in 35 countries,
• Physical surveillance technologies were in use in 11 countries,
• COVID-19-related censorship had been imposed by 18 governments,
• Internet shutdowns continued in 3 countries despite the outbreak (Woodhams, 2020).
It seems clear at first glance what all those measures are trying to achieve, and what the criteria of success are. But is it really that clear? Quoting an oft-repeated phrase, with COVID-19 we fight "an unprecedented threat to health and economic stability" (Soltani et al., 2020). While fighting it, we must "Protect privacy, equality and fairness" (Morley et al., 2020) and do "a coordinated assessment of usefulness, effectiveness, technological readiness, cyber security risks and threats to fundamental freedoms and human rights" (Stollmeyer et al., 2020). Taken together, this is hardly a straightforward set of goals and requirements. Thus, paraphrasing (Stollmeyer et al., 2020), one may ask: What problem does a COVID mitigation strategy solve exactly? Even a quick survey of news articles, manifestos, and research papers published since the beginning of the pandemic reveals a diverse landscape of postulates and opinions. Some authors focus on medical goals, some on technological requirements; others are more concerned by the economic, social, or political impact of a containment strategy. The actual stance is often related to the background of the author (in the case of a researcher) or their information sources (in the case of a journalist). We postulate that one should identify the relevant requirements before committing to a particular mitigation strategy.
In fact, this should be done before any informal or formal analysis and assessment of the available strategies, with as little bias as possible. One way to achieve this is through an overview of what is considered relevant by the general public, and referred to in the media. To this end, we have collected a number of quotes and "news clips" touching on the topic. We looked especially at the most popular IT solution proposed to contain the epidemic, namely digital contact tracing apps. The idea has generated a fair amount of controversy, and for a while virtually dominated the discussions and press coverage of COVID-19 containment strategies. Eventually, contact tracing apps were deployed on a large scale, with several hundred million downloads across the world, leading to the discovery of various flaws and even more discussion on what characterizes acceptable and effective containment measures. In this paper, we present some snippets from the press and Internet media, published between March and June 2020, that mention the possible goals and requirements for a mitigation strategy. The snippets are sorted thematically into several categories, such as health-related goals, social and political impact, civil rights, ethical requirements, etc. While most of the presented news clips focus on digital contact tracing for COVID-19, it is easy to see that the relevance of the requirements goes beyond contact tracing apps, and applies to any large-scale epidemic, including ones that may happen in the future. We emphasize that we do not endorse the opinions presented in the quotes. We also do not comment on their content. We merely use the quotes to collect relevant keywords and conceptual categories. The assessment of how the requirements are addressed by different mitigation strategies, and which strategy to select, is a matter for systematic analysis that should follow in the next step.

Clearly, an epidemic is first and foremost a threat to people's health and lives. Accordingly, we begin with media clips related to this aspect of COVID-19 mitigation strategies. Containment measures (and contact tracing apps in particular) should slow the spread of the virus, decrease the transmission rate, and save lives: Since the outbreak of COVID-19, governments around the world have implemented a range of digital tracking, physical surveillance and censorship measures in a bid to slow the spread of the virus. (Woodhams, 2020) Lockdowns prevented around 3.1 million deaths in 11 European countries, according to a new modelling study (AFP, 2020) Researchers also calculated that the interventions had caused the reproduction number - how many people someone with the virus infects - to drop by an average of 82 percent, to below 1.0. (AFP, 2020) Contact tracing can be an important component of an epidemic response especially when the prevalence of infection is low. Such efforts are most effective where testing is rapid and widely available and when infections are relatively rare. (Soltani et al., 2020) designed and built by the NHS to help slow the spread of the coronavirus (NCS, 2020) you'll be helping to slow the transmission of the coronavirus. (NCS, 2020) To best meet public health needs, digital technology should be able to trace the spread of the virus, identify dangerous Covid-19 clusters and limit further transmission. The essential goal is to register contacts between potential carriers and those who might be infected.
(Ilves, 2020) [Contact tracing apps are] a way to enhance traditional forms of contact tracing to find potential new infections (Timberg, 2020) Ultimately, contact tracing is a public health intervention, not an individual health one. It can reduce the spread of disease through the population (Soltani et al., 2020) Contact tracing via smartphone is a powerful way to tackle the spread of coronavirus, but it mustn't be done at the expense of individual civil rights. (Bicheno, 2020) Smittestopp is an app that will help the health authorities to limit the transmission of coronavirus and alert users with text messages about close contact with infected persons. (hel, 2020) The measures should be effective in containing the epidemic: what's missing in most debates is how effective the apps actually are (Stollmeyer et al., 2020) We and many others have pointed out a host of pitfalls for voluntary, self-reported coronavirus apps of the kind Apple, Google, and others contemplate. First, app notifications of contact with COVID-19 are likely to be simultaneously both over- and under-inclusive. (...) Individuals may be flagged as having contacted one another despite very low possibility of transmission - such as when the individuals are separated by walls porous enough for a Bluetooth signal to penetrate. Nor do the systems account for when individuals take precautions, such as the use of personal protective equipment, in their interactions with others. At least as problematic is the issue of false negatives (...) Contact-tracing apps therefore cannot offer assurance that going out is safe, just because no disease has been reported in the vicinity. (Soltani et al., 2020) Inaccurate results are likely to lead to a high number of false positives and negatives that may adversely impact the relaxation of lockdown measures. (Woodhams, 2020) On the Isle of Wight, the NHS contact tracing app is making a difference. Around 25 people per day are being tested for coronavirus after reporting it through their phones (Burgess, 2020) The strategy should support rapid information flow to the most concerned: As the global fight against the COVID-19 pandemic continues, much of the world is pinning its hopes of easing lockdowns on being able to quickly identify people who might have been exposed to the virus. (Zastrow, 2020) Technologies to rapidly alert people when they have been in contact with someone carrying the coronavirus SARS-CoV-2 are part of a strategy to bring the pandemic under control. (Morley et al., 2020) some public health and consumer advocacy groups have urged the use of apps that allow people who test positive for Covid-19 to anonymously message recent contacts. (POL, 2020) a retired home-care nurse (...) said she would welcome an app to help spread word of possible exposure. (Timberg, 2020) Smittestopp is an app that will help the health authorities to limit the transmission of coronavirus and alert users with text messages about close contact with infected persons. (hel, 2020) Clearly, there are tradeoffs between epidemiological efficiency and other concerns, as we will also see in the subsequent sections: The immediate goal for governments and tech companies is to strike the right balance between privacy and the effectiveness of an application to limit the spread of Covid-19.
(Ilves, 2020) So far the clear evidence is that greater control of populations has worked better at stopping the coronavirus spread than a more relaxed attitude (McCarthy, 2020) Some states are raising concerns that the [Google and Apple] app won't allow them to collect enough information due to privacy concerns. (POL, 2020) China's Health Code system (...) records a user's spending history in order to deter them from breaking quarantine (Clarance, 2020) The containment strategy should enable monitoring the state of the pandemic, the behavior of people, and the effectiveness of the strategy itself: Most states are waiting to see whether the Bluetooth-based release from Apple and Google, which is supposed to automatically notify people when they come close to someone who's tested positive, will be an effective way to monitor outbreaks. (POL, 2020) These serology tests can provide important data on how COVID-19 is spreading through a population. There is also hope that the presence of certain antibodies may signify immunity to future infection (Frasier, 2020) "The police won't have to monitor the places of quarantine," said Mateusz Morawiecki, the Polish prime minister, when announcing the country's coronavirus app would become mandatory. "We will know if people are following the rules." (Scott and Wanat, 2020) Effective monitoring is often at odds with other requirements, especially privacy: There are of course advantages to [the centralized] approach [of contact tracing], models can be adapted quicker and additional analysis can be performed, but the question which remains is whether this outweighs the risk to security and privacy (Davies, 2020) Most measures to contain the epidemic are predominantly social (lockdown being a prime example), and have strong social and economic impact. The impact of the containment strategy on the economy is of prime importance: The unprecedented threat from the novel coronavirus has confined many Americans to their homes, distancing them from one another at great cost to local economies and personal well-being. (Soltani et al., 2020) In a separate study, also published in Nature, researchers from UC Berkeley used a different method -econometric modelling used to assess how policies affect economic growth -to evaluate containment policies in China, South Korea, Italy, Iran, France and the United States. (AFP, 2020) [Contact tracing apps] help make resumption of economic and social activities safer in the months ahead. (Timberg, 2020) [Australian Prime Minister, Scott Morrison:] "If you want to return to a more liberated economy and society, it is important that we get increased numbers of downloads when it comes to the Covidsafe app." (Taylor, 2020) Regarding the available measures in general, and contact tracing apps in particular: we know very little about them or how they could affect society. As the global fight against the COVID-19 pandemic continues, much of the world is pinning its hopes of easing lockdowns on being able to quickly identify people who might have been exposed to the virus. (Zastrow, 2020) [The Covidsafe app] was sold as the key to unlocking restrictions (. . . ) but as the country begins to open up, the role of the Covidsafe app in the recovery seems to have dropped to marginal at best. (Taylor, 2020) The health minister, Greg Hunt, tweeted that [the Covidsafe app] was the key to being allowed to go back to watching football. 
(Taylor, 2020) We worry that contact-tracing apps will serve as vehicles for abuse and disinformation, while providing a false sense of security to justify reopening local and national economies well before it is safe to do so. (Soltani et al., 2020) Since the outbreak of COVID-19 there has been a rapid acceleration in the spread of mis-and disinformation. To control this, governments and social media platforms have sought to stringently regulate online content and promote official facts and figures from international health organisations. (Woodhams, 2020) There is also a very real danger that these voluntary surveillance technologies will effectively become compulsory for any public and social engagement. (Soltani et al., 2020) there is a real risk that these mobile-based apps can turn unaffected individuals into social pariahs, restricted from accessing public and private spaces or participating in social and economic activities. (Soltani et al., 2020) protecting those communities who can be (. . . ) harmed by the collection and exploitation of personal data. (Soltani et al., 2020) Protections need to be put in place to expressly prohibit economic and social discrimination on the basis of information and technology designed to address the pandemic. (Soltani et al., 2020) The pandemic creates new space for political manipulation and changes the distribution of power: the issue of malicious use is paramount-particularly given this current climate of disinformation, astroturfing, and political manipulation. Imagine an unscrupulous political operative who wanted to dampen voting participation in a given district, or a desperate business owner who wanted to stifle competition. Either could falsely report incidences of coronavirus without much fear of repercussion. Trolls could sow chaos for the malicious pleasure of it. Protesters could trigger panic as a form of civil disobedience. A foreign intelligence operation could shut down an entire city by falsely reporting COVID-19 infections in every neighborhood. (Soltani et al., 2020) In the long run, however, this poses a far more fundamental question: how much can the decisions of sovereign democratic countries be overruled by technology companies [i.e., Google and Apple]? (Ilves, 2020) The unprecedented threat from the novel coronavirus has confined many Americans to their homes, distancing them from one another at great cost to local economies and personal well-being. (Soltani et al., 2020) we all long for freedom from lockdowns and home confinement (Stollmeyer et al., 2020) The financial costs incurred by the containment measures, as well as the engaged human resources, are another important factor. The government has been forced to abandon a centralised coronavirus contact-tracing app after spending three months and millions of pounds on technology that experts had repeatedly warned would not work. (. . . ) [T]he health secretary said the government would not "put a date" on when the new app may be launched, although officials conceded it was likely to be in the autumn or winter. (Hern, 2020) "The police won't have to monitor the places of quarantine," said Mateusz Morawiecki, the Polish prime minister, when announcing the country's coronavirus app would become mandatory. "We will know if people are following the rules." (Scott and Wanat, 2020) The lure of automating the painstaking process of [manual] contact tracing is apparent. 
(Soltani et al., 2020) The logistics of the strategy (in particular, coordination and cooperation between different institutions and authorities) also plays a role: U.S. efforts (...) are running into an all-too-familiar problem that has plagued the pandemic response: a lack of national coordination. (Tahir and Lima, 2020) the federal government has so far failed to institute concrete privacy standards. (Tahir and Lima, 2020) Similarly, the European Commission postulated a set of technical specifications to ensure a safe exchange of information between national contact tracing apps based on a decentralised architecture [to] work seamlessly when users travel to another EU country which also follows the decentralised approach. (Eur, 2020)

4 Compliance with Norms, Rights, and Obligations

In this section, we look at requirements that aim at the long-term robustness and resilience of the social structure, such as ethical and legal requirements. The mitigation strategy must be ethically justifiable (Morley et al., 2020). Importantly: ethical and social considerations [other than privacy] must not be cast aside in the rush to quell the pandemic. For instance, contact-tracing apps should be available and accessible to anyone, irrespective of the technology needed or their level of digital literacy. (...) Even in a crisis, a 'try-everything' approach is dangerous when it ignores the real costs, including serious and long-lasting harms to fundamental rights and freedoms, and the opportunity costs of not devoting resources to something else. (Morley et al., 2020) Some of these [mitigation measures] may well be proportionate, necessary and legitimate during these unprecedented times. However, others have been rushed through legislative bodies and implemented without adequate scrutiny. (Woodhams, 2020) the data agency Datatilsynet (...) said the restricted spread of coronavirus in Norway, as well as the app's limited effectiveness due to the small number of people using it, meant the invasion of privacy resulting from its use was disproportionate. (Oslo, 2020) Note that the ethical considerations do not necessarily oppose strict measures for containing the epidemic. In particular: Temporarily restricting some fundamental rights and freedoms might be ethically justifiable in the context of hastening the end of the pandemic. Quarantining individuals, for example, helps to prevent the spread of the disease. Arguably, it might be unethical not to use digital tracing apps when necessary. Nevertheless, much depends on the effectiveness of the app, the goal pursued, the type of system and the context in which it will be deployed. (...) To be ethical, a contact-tracing app must abide by four principles: it must be necessary, proportional, scientifically valid and time-bound. These principles are derived from the European Convention on Human Rights, the International Covenant on Civil and Political Rights (ICCPR) and the United Nations Siracusa Principles, which specify the provisions in the ICCPR that limit how it can be applied. (Morley et al., 2020) An app's implementation strategy and impact must also be considered. Something that looked good on paper can turn out to be ineffective in practice. (...) If an app fails, it becomes unnecessary, and thus unethical. (Morley et al., 2020) Moreover, the terms and conditions of such an app should be just (Scott and Wanat, 2020).
One must also keep in mind that ethical norms and their interpretation vary depending on the actual place, time, and social context. In practice, there will be trade-offs. These will depend on the laws, values, attitudes and norms in different regions, as well as on changes over time in the spread and scale of the virus and the available technology. (...) Similarly, what was ethically justifiable in one place yesterday might not be so tomorrow. (Morley et al., 2020) The detailed guidelines for the ethical justifiability of a contact-tracing app from (Morley et al., 2020) are presented below.

Principles: is this the right app to develop?
1. Is it necessary?
• Yes, it must be developed to save lives (+).
• No, there are better solutions (-).
2. Is it proportionate?
• Yes, the gravity of the situation justifies the potential negative impact (+).
• No, the potential negative impact is disproportionate to the situation (-).
3. Is it sufficiently effective, timely, popular and accurate?
• Yes, evidence shows that it will work, is timely, will be adopted by enough people and yields accurate data and insights (+).
• No, it does not work well, is available too late or too early, will not be used widely, and is likely to collect data that have false positives and/or false negatives (-).
4. Is it temporary?
• Yes, there is an explicit and reasonable date on which it will cease (+).
• No, it has no defined end date (-).

Requirements: is this app being developed in the right way?
5. Is it voluntary?
• Yes, it is optional to download and install (+).
• No, it is mandatory and people can be penalized for non-compliance (-).
6. Does it require consent?
• Yes, people have complete choice over what data are shared and when, and can change this at any time (+).
• No, default settings are to share everything all the time, and this cannot be altered (-).
7. Are the data kept private and users' anonymity preserved?
• Yes, data are anonymous and held only on the user's phone. Others who have been in contact are notified only that there is a risk of contagion, not from whom or where. Methods such as differential privacy are used to ensure this. Cyber-resilience is high (+).
• No, data are (re)identifiable owing to the level of data collected, and stored centrally. Locations of contacts are also available. Cyber-resilience is low (-).
8. Can users erase the data?
• Yes, they can do so at will; all data are deleted at the end point (+).
• No, there is no provision for data deletion, nor a guarantee that it can ever be deleted (-).
9. Is the purpose of data collection defined?
• Yes, explicitly; for example, to alert users that they have encountered a potentially infected person (+).
• No, the purposes of data collection are not explicitly defined (-).
10. Is the purpose limited?
• Yes, it is used for tracing and tracking of COVID-19 only (+).
• No, it can be regularly updated to add extra features that extend its functionality (-).
11. Is it used only for prevention?
• Yes, it is used only to enable people voluntarily to limit spread (+).
• No, it is also used as a passport to enable people to claim benefits or return to work (-).
12. Is it used for compliance?
• No, it is not used to enforce behaviour (+).
• Yes, non-compliance can result in punishment such as a fine or jail time (-).
13. Is it open-source?
• Yes, the code is publicly available for inspection, sharing and collaborative improvement (+).
• No, the source code is proprietary, and no information about it is provided (-).
14. Is it equally available?
• Yes, it is free and distributed to anyone (+).
• No, it is arbitrarily given only to some (-).
15. Is it equally accessible?
• Yes, it is user-friendly, even for naive users, and works on the widest possible range of mobile phones (+).
• No, it can be used only by those with specific devices and with sufficient digital education (-).
16. Is there a decommissioning process?
• Yes, there is a process for shutting it down (+).
• No, there are no policies in place (-).

An important aspect of ethical requirements is to protect rights and liberties: Contact tracing via smartphone is a powerful way to tackle the spread of coronavirus, but it mustn't be done at the expense of individual civil rights. (Bicheno, 2020) Is [the app] mandatory? How private is the app? Are citizens' rights being safeguarded? According to a legal opinion on smartphone contact tracing and other data-driven proposals: A de-centralised smartphone contact tracing system - the type contemplated by "DP-3T" and being considered by governments across Europe and also Apple and Google - would be likely to comply with both human rights and data protection laws. In contrast, a centralised smartphone system - which is the current UK Government proposal - is a greater interference with fundamental rights and would require significantly greater justification to be lawful. That justification has not yet been forthcoming. (Mat, 2020) Any attempt to introduce 'immunity passports' would be a dramatic measure, both socially and legally. It would need a clear scientific basis and would also have to address the significant impact on fundamental rights including the risk of indirect discrimination. (Mat, 2020) For instance, the designers of the NCSC app developed in the UK declared that: [installing the app is] entirely voluntary (NCS, 2020) and You can decide if you want to tell the app that you're suffering from coronavirus symptoms. (NCS, 2020) The reality, however, is often far removed from the intended properties, as the same NCSC app illustrates: as soon as someone agrees to share their information with UK government - by claiming to feel unwell and hitting a big green button - 28 days of data from the app is given to a central server from where it can never be recovered. That data, featuring all the unique IDs you've encountered in that period and when and how far apart you were, becomes the property of NCSC (...) [The NCSC chief exec Matthew Gould] also admitted that the data will not be deleted, UK citizens will not have the right to demand it is deleted, and it can or will be used for "research" in future. (McCarthy, 2020) In an attempt to slow the spread of COVID-19, governments around the world are also adopting increasingly extensive physical surveillance measures. These include the deployment of facial recognition cameras equipped with heat sensors, surveillance drones used to monitor citizens' movements, and extensive CCTV networks in a bid to help enforce curfews. (Woodhams, 2020) India has made it mandatory for government and private sector employees to download [its contact-tracing app]. (Clarance, 2020) Using a phone's Bluetooth and location data, Aarogya Setu lets users know if they have been near a person with Covid-19 by scanning a database of known cases of infection. The data is then shared with the government. (Clarance, 2020) To register, users have to give their name, gender, travel history, telephone number and location. (...) While your name and number won't be made public, the app does collect this information, as well as your gender, travel history and whether you're a smoker.
(Clarance, 2020) China's Health Code system, which records a user's spending history in order to deter them from breaking quarantine, is invasive. (Clarance, 2020) Many regions that have made contact tracing a key part of their COVID-19 playbook - including China, South Korea, Taiwan and Israel - have empowered contact tracers with sensitive details of infected people, including CCTV footage, credit-card transactions and location data from mobile-phone carriers. (Zastrow, 2020) several governments have also co-opted the rise of mis/disinformation to justify censorship practices which seek to silence critics and control the flow of information. (Woodhams, 2020) UK finds itself almost alone with centralized virus contact-tracing app that (...) may be illegal (McCarthy, 2020) The UK Government's announcements in March and April for sharing health data between the private and public sector appear to be flawed. This means such data sharing is potentially not in compliance with legal requirements. Further information needs to be provided to ensure compliance and a data impact assessment should be conducted and made public. (Mat, 2020) the UK's own contact-tracing app runs roughshod over regional [GDPR] regulations and could, hypothetically, lead to a sweeping, NHS-wide fine in its current state. (Wodinsky, 2020)

Privacy-related issues for COVID-19 mitigation strategies have triggered heated discussion, and at some point gained much of the media coverage. This is understandable, since privacy and data protection is an important aspect of medical information flow, even in ordinary times. Moreover, the IT measures against COVID-19 are usually designed by computer scientists and specialists, for whom technological requirements (such as information security) are relatively easy to identify and understand. As the authors of (Soltani et al., 2020) cautiously stated: Some of the contact-tracing frameworks have been designed with security and privacy in mind, to some degree. The designers of contact-tracing apps have sometimes made overly bold statements about their privacy level. Nevertheless, those indicate a general requirement that is clearly important for many users: Any information you choose to submit is protected at all times. (NCS, 2020) In particular: developers of contact-tracing apps (...) should include recommendations for how back-end systems should be secured and how long data should be retained, criteria for what public health entities can qualify to use these technologies, and explicit app store policies for what additional information, such as GPS or government ID numbers, can be collected. They should adopt commonly accepted practices such as security auditing, bug bounties, and abusability testing to identify vulnerabilities and unintended consequences of a potentially global new technology. Finally, app creators - as well as the platforms that enable these applications - should make explicit commitments for when these apps and their underlying APIs will be sunsetted. (Soltani et al., 2020) The design is intended to mitigate privacy concerns inherent in a technological approach to identifying possible new infections, the tech companies have said. They say the apps would not collect location data or save records to a central database and would instead rely on the records saved and distributed across people's phones, which would be privacy-protected and periodically erased.
(Timberg, 2020) The app itself doesn't collect information that would obviously reveal someone's identity (Burgess, 2020) But under data protection laws the app isn't anonymous. GDPR and the UK's data protection rules define 'personal data' as something that can identify an individual. Under GDPR, an identifier assigned to a phone can be considered personal data. (In the past a person's IP address has been ruled to be personal data). While the Bluetooth logging system in the NHS app doesn't collect location information, or other types of data, it does create an identifier (known as InstallationID) for every phone that uses the app. This counts as something that could lead to the identification of an individual. (Burgess, 2020) "The NHSX app does not preserve the anonymity of users, as it primarily processes pseudonymous, not anonymous, personal data," Michael Veale, a lecturer in digital rights and regulation at University College London, wrote in an analysis of the NHS app. "Anonymous information is only that which is not personal data". (Burgess, 2020) From the user perspective there is also the problem of informed consent. (Ilves, 2020) Alerts can be too revealing (BBC, 2020), e.g., it may be possible to work out who is associating with whom from the app's ID numbers. (McCarthy, 2020) [the data from the app] may be linked to other datasets at some point in future. (Wodinsky, 2020) A direct exploitation of personal information is even more dangerous: Kuwait and Bahrain have rolled out some of the most invasive Covid-19 contact-tracing apps in the world, putting the privacy and security of their users at risk, Amnesty International says. (Garthwaite and Anderson, 2020) The rights group found the apps were carrying out live or near-live tracking of users' locations by uploading GPS co-ordinates to a central server. (...) The researchers say Bahraini and Kuwaiti authorities would easily be able to link this sensitive personal information to an individual, as users are required to register with a national ID number. (Garthwaite and Anderson, 2020) Bahrain's app was linked to a television show called "Are You At Home?", which offered prizes to users who stayed at home during Ramadan. (Garthwaite and Anderson, 2020) Most people accept that it might be necessary to waive users' privacy in the short term in order to contain the epidemic. However, one must look for mechanisms that impact privacy to the least possible extent. The [Apple and Google] approach is not without risk, researchers acknowledge, but exploiting that risk would require significant effort by hackers for seemingly little reward. For example, they would have to turn on a different phone every time they came near a different person and wait several days to see if it reported a positive test result. (Zastrow, 2020) Moreover, privacy-related requirements depend on the geographical and political context: The issues uncovered by Amnesty's investigation are particularly alarming given that the human rights records of Gulf governments are poor. (...) "If privacy is violated in a country like Norway, I can resort to regional tools such as the European Court of Human Rights and European Committee of Social Rights. But in our region there is not any such tool. On the contrary, resorting to local authorities may present an additional risk" [Mohammed al-Maskati, a Bahraini activist said.] (Garthwaite and Anderson, 2020) Here, the key question is: What data will they collect, and who is it shared with? 
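To make the data flows discussed above concrete, the following minimal sketch (in Python, written for this survey and not taken from any deployed app) illustrates the decentralised, rotating-pseudonym idea behind DP-3T and the Apple/Google framework mentioned in the quotes. All names in the code (Phone, ephemeral_ids, report_positive) are ours and purely illustrative, and the real protocols differ in many details; the point is only the general shape of the data flow: short-lived random-looking identifiers are broadcast over Bluetooth, observations never leave the phone, and a user who tests positive uploads only their own daily keys, from which other phones recompute the identifiers and check for matches locally.

# Illustrative sketch of a decentralised, rotating-pseudonym exposure scheme
# (in the spirit of DP-3T / the Apple-Google framework; NOT the real protocol).
import hmac
import hashlib
import secrets
from dataclasses import dataclass, field


def ephemeral_ids(daily_key: bytes, epochs_per_day: int = 96) -> list[bytes]:
    """Derive short-lived broadcast identifiers from a daily key.

    Each identifier is valid for one 15-minute epoch, so a listener cannot
    link two epochs to the same phone without knowing the daily key."""
    return [
        hmac.new(daily_key, f"epoch-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(epochs_per_day)
    ]


@dataclass
class Phone:
    """What stays on the device: its own daily keys and the identifiers it heard."""
    daily_keys: list[bytes] = field(default_factory=list)
    observed: set[bytes] = field(default_factory=set)

    def new_day(self) -> None:
        self.daily_keys.append(secrets.token_bytes(16))

    def broadcast(self, epoch: int) -> bytes:
        # Only this 16-byte pseudorandom value ever leaves the phone over Bluetooth.
        return ephemeral_ids(self.daily_keys[-1])[epoch]

    def hear(self, identifier: bytes) -> None:
        # Observations are stored locally and never uploaded.
        self.observed.add(identifier)

    def report_positive(self) -> list[bytes]:
        # Upon a positive test, only the phone's own daily keys are published.
        return self.daily_keys

    def check_exposure(self, published_keys: list[bytes]) -> bool:
        # Matching happens on the device: recompute identifiers from the
        # published keys and compare with what was heard locally.
        for key in published_keys:
            if self.observed.intersection(ephemeral_ids(key)):
                return True
        return False


if __name__ == "__main__":
    alice, bob = Phone(), Phone()
    alice.new_day()
    bob.new_day()
    bob.hear(alice.broadcast(epoch=42))   # Alice and Bob are near each other.
    published = alice.report_positive()   # Alice later tests positive.
    print("Bob exposed?", bob.check_exposure(published))   # -> True

Contrast this with a centralised design, where the observation logs themselves are uploaded for matching on a government server, or with a persistent per-device identifier such as the NHS app's InstallationID, which, as the quotes above note, can count as personal data under GDPR.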
In particular, it is often postulated that the state should have less access to and control over user data (Bicheno, 2020) the limits on the type of data collection are the core concern for states. (Tahir and Lima, 2020) In Singapore, for example, the TraceTogether app can be used only by its health ministry to access data. It assures citizens that the data is to be used strictly for disease control and will not be shared with law enforcement agencies for enforcing lockdowns and quarantine. (Clarance, 2020) [In Australia,] Only health officials in the states can access the data, and you can't be forced to download it. (Taylor, 2020) The app does not collect any of your personal data. (NCS, 2020) the lack of clear privacy policies and the use of centralised data storage increases the possibility that the data may be vulnerable to abuse. (Woodhams, 2020) collection of data on centralised servers: Aside from the risk to privacy, collecting millions of datasets of personal information in a single place could be viewed as somewhat of a treasure trove. (Davies, 2020) if the [central] database is hacked, the anonymity provided by rotating pseudonyms is nullified, and individuals can be more easily tracked. Plus, says Kreps, "there's a risk of function creep and state surveillance". "I have little faith in government's ability to keep data like this secure," says Green. (Zastrow, 2020) data breaches can also come through cyberattacks or independent actors within an agency (Eisenberg, 2020) In particular, the collected information should not be exploited for commercial purposes: existing regulations don't address whether data can be shared across agencies or if it can be sold by a third party for non-Covid-19 tracking. (Eisenberg, 2020) We found code relating to Google's advertising and tracking platforms in 17 contact tracing apps. (...) Aside from the ethics of monetizing public health in this way, the presence of such tracking code in contact tracing apps raises privacy red flags due to the targeting options offered by Google's ad platforms. (...) We also found code that enabled varying levels of integration with Facebook in seven apps. This ranges from direct integration with Facebook's advertising platform to functionality allowing users of the apps to link their Facebook accounts, or to share content from the contact tracing apps to Facebook. (Woodhams, 2020) Protections need to be put in place to expressly prohibit economic and social discrimination on the basis of information and technology designed to address the pandemic. (Soltani et al., 2020) protecting those communities who can be (...) harmed by the collection and exploitation of personal data. (Soltani et al., 2020) In this respect, an important question is: How will that information be used in the future? Are there policies in place to prevent abuse? To capture this information, guided by principles put forward by the American Civil Liberties Union and others, we asked five questions: Is it voluntary? Are there limitations on how the data gets used? Will data be destroyed after a period of time? Is data collection minimized? Is the effort transparent? Among temporal requirements, sunsetting is most often mentioned. There is also a concern that the [surveillance] technology will continue to be used after the threat of the coronavirus recedes (Garthwaite and Anderson, 2020) questions about privacy - including when the tracking will be switched off.
(Scott and Wanat, 2020) "We need an independent state figure that's not the government who can guarantee this data will eventually be deleted," said Arnoldo Frigessi, a professor at the University of Oslo (...) "We need to ask the questions: When will this stop, and who will get to decide?" (Scott and Wanat, 2020) Any information you submit is deleted once it is no longer needed to help manage the spread of coronavirus. (NCS, 2020) "All the data will stay with the government for six years" (Scott and Wanat, 2020) For example, the Norwegian app Smittestopp explicitly promised the following sunsetting requirements (hel, 2020):
• data from the last 30 days are constantly recorded and older data are deleted
• one can delete one's personal information at any time
• one can delete the app
• the user can also choose whether to turn the logging features on or off
• one has the right to access the data that the Norwegian Institute of Public Health stores about them.
Provisions for monitoring of privacy are also essential: [Unlinkability] must be backed up with clear lines of accountability, processes for evaluating linkage or export requests, and strong assurance monitoring. (Wodinsky, 2020) judicial oversight and sunset provisions (...) safeguards with respect to the privacy of data (Soltani et al., 2020) As the virus spreads rapidly, it's vital that the public are given the information [people] need to protect themselves and others. (BBC, 2020) Dr Lee says the public needs to remain mature with this information - otherwise "people who fear being judged will hide and this will put everyone in further danger". (BBC, 2020) the government is letting people know if they were in the vicinity of a patient (...) and now there is as much fear of social stigma as of illness (BBC, 2020) even if patients are not outright identified, they're facing judgement - or ridicule - online. (BBC, 2020) A research team at Seoul National University's Graduate School of Public Health (...) found "criticisms and further damage" were more feared than having the virus. (BBC, 2020) Doctors warn that online pursuit of patients could have very serious consequences. Malicious comments online have long been a problem in South Korea, and in some cases have led to suicide. (BBC, 2020)

Much of the debate on centralized vs. decentralized design for contact-tracing apps follows the dilemma of whether it is justified to sacrifice the users' privacy for the sake of efficiency in containing the epidemic: [A centralized contact-tracing app] takes data from people's phones and saves it on a central system where experts are trusted to make the best possible use of the data, including providing advice to people as and when necessary. (McCarthy, 2020) Some [centralized apps] store all users' interaction data on government servers that analyse the data and perform the contact matching. Proponents say that this 'centralized' model allows health authorities to use the database to piece together a view of the network of contacts, enabling further epidemiological insights such as revealing clusters and superspreaders. (Zastrow, 2020) [A decentralized contact-tracing app] puts users in more control of their information, and alerts them automatically with no intervention from a third party. Health experts are less in favour of [the decentralized] model because it makes it harder to spot outbreaks and is harder to follow up for close contacts, but privacy advocates insist it is the most secure.
(Taylor, 2020) As we have seen in Section 5.6, there is a tradeoff between protecting privacy vs. collecting and protecting all the information that can be useful in fighting the epidemic. The relationship is not that simple, though. Privacy provisions are instrumental in building trust. Conversely, lack of privacy undermines trust, and may weaken the epidemiological, economic, and social effects of the mitigation activities. Privacy fears threaten New York City's coronavirus tracing efforts (Eisenberg, 2020) the de Blasio administration's unwillingness to specify how privacy will be protected will limit the tracing effort's reach and potentially prolong the need for strict lockdown (Eisenberg, 2020) [Manual] Contact tracing relies on a city worker building trust with a Covid-19-positive person, who then details where they've been and with whom they've been in contact prior to the onset of symptoms (...) Contact tracing requires handing over intimate personal data -including home addresses, names of friends and relations -to strangers, many of whom were only recently trained and hired to collect the information. (Eisenberg, 2020) Hundreds of thousands of New Yorkers will be asked to disclose personal information this month as part of the city's herculean Covid-19 tracing effort -but suspicions over how the government will use that information are threatening the city's best chance to crawl out of its coronavirus lockdown. (Eisenberg, 2020) "I did already hear rumors in Yiddish that they shouldn't answer the [contact tracing] calls," said Yosef Hershkop, manager of Kāmin Health Crown Heights Urgent Care. "There's no way they're going to get any more cooperation from the Jewish community than the African American community or Arab community or Muslim community. Everyone is suspicious." (Eisenberg, 2020) The measures must be adopted and followed by the people, in order to make them effective. Ultimately, contact tracing (...) can reduce the spread of disease through the population, but does not confer direct protection on any individual. This creates incentive problems that need careful thought: What is in it for the user who will sometimes be instructed to miss work and avoid socializing, but does not derive immediate benefits from the system? (Soltani et al., 2020) A poll published last week by the Kaiser Family Foundation found the nation similarly split on willingness to use an infection-alert app but found much higher rates of acceptance -with 2 out of every 3 Americans willing -when they were told the technology was seen as assisting in opening schools and businesses and helping revive the stalled national economy. But fewer than 3 in 10 were willing to use an app if there was a "chance the data could be hacked." (Timberg, 2020) The government (. . . ) didn't exclude mandatory downloading [that] stands in stark contrast to the 'smart lockdown' with strong reliance on personal responsibility (Stollmeyer et al., 2020) Won't this tool divert attention from more important measures, and make people less alert? (Szymielewicz et al., 2020) even with a very accurate test, the fewer people in a population who have a condition, the more likely it is that an individual's positive result is wrong. If it is, people might think they have the antibodies (and thus may have immunity), when in fact they do not. (Frasier, 2020) Since the app was released in mid-March people's experiences have varied, with many willing to give up their personal information to help reduce the number of overall infections. 
Most said the app is prone to failure, not a surprise as the Polish government created it in just three days based on an out-of-the-box service offered by a third-party developer. (Scott and Wanat, 2020) The aspects indicated in Sections 6.1 and 6.2 have an impact on the adoption of the strategy: How many people will download and use [the apps], and how widely used do they have to be in order to succeed? Another challenge is ensuring that enough people download the app to make it effective. (Zastrow, 2020) For example, it was reported in (Bezat, 2020) that only 2% of people in France installed the app as of 10/06. This has important consequences: Reduced adoption [of the app] could limit its effectiveness in slowing new infections and deaths. (Timberg, 2020) The Oxford models found that "the app has an effect at all levels of uptake" (...) The paper says that if 80% of all smartphone users download the app - a number that excludes groups less likely to have a smartphone and is equivalent to 56% of the overall population - this would be enough to suppress the pandemic on its own, without any other form of intervention. If fewer people download the app, say the researchers, other prevention and containment measures will be required. These include social distancing, widespread testing, manual contact tracing, medical treatment, and regional shutdowns.

Trust is an important issue (see also the connection between privacy provisions and trust in Section 5.7). For people to use the app, there needs to be trust. (Burgess, 2020) [J.T. Lane, chief population officer at the Association of State and Territorial Health Officials:] "You need to learn everything you can about [the users'] culture and maybe even their previous relationship with technology," Lane said, including "how stigma and discrimination might be playing a role in that particular group or culture." (POL, 2020) The concrete measures have to work, in the most basic sense of the term. In the case of contact tracing apps, this means that the design correctly addresses the desired functionalities, the implementation is correct with respect to the design, and the app has an acceptable level of usability. UK finds itself almost alone with centralized virus contact-tracing app that probably won't work well, asks for your location, may be illegal (McCarthy, 2020) According to local reports from folks that have downloaded the program thus far, not only is the app incompatible with a range of different devices, but it's also a battery-draining mess that can be confusing to use. (Wodinsky, 2020) Beyond concerns over privacy, one key practical challenge to phone-based contact tracing is making accurate measurements of how close two devices are. (Zastrow, 2020) Moreover, the design and the implementation should be sufficiently transparent. Covid Tracing Tracker puts forward a number of "basic questions" related to the transparency of the design and implementation. Other requirements include:
• cross-border interoperability (Cyb, 2020)
• transparency of the code (SDZ, 2020)
• possibility for the public and experts to verify the code (SDZ, 2020)

8 Evaluation and Learning for the Future

COVID-19 mitigation activities should be subject to rigorous assessment. Moreover, their outcomes should be used to extend our knowledge about this and similar pandemics, and to better defend ourselves in the future.
Systematic evaluation of the adopted solutions is itself a requirement: "After the spread of virus ends," [Mr Goh, from the Korea Centers for Disease Control and Prevention] says, "there has to be society's assessment whether or not this was effective and appropriate." (BBC, 2020) A review and exit strategy must be in place to establish when and how fast this should happen. These assessments should be conducted by an independent body, such as a regulator or an ethics advisory board, and not by the designers or the government itself. Circumstances and attitudes are changing quickly, so the questions (...) must be asked anew at regular intervals. (Morley et al., 2020) Evaluation and assessment is bound to be multi-dimensional and complex: Before rolling out any app, the EU needs to do a coordinated assessment of usefulness, effectiveness, technological readiness, cyber security risks and threats to fundamental freedoms and human rights. (Stollmeyer et al., 2020) it's difficult to determine the effect of app use in combination with early and extensive testing, the massive wearing of face masks and strict enforcement at physical distance. (Stollmeyer et al., 2020)

8.2 Learning for the Future

Last but not least, we need to build scientific knowledge, and learn how to better fight pandemics in the future. Anonymised data on movement patterns of the population collected by the app are also supposed to be used to develop efficient infection control measures. [They] will be used to gain insight into the effect of changes to the measures for fighting the virus. (hel, 2020)

In this report, we take a first step towards a systematic analysis of strategies for effective and trustworthy mitigation of the current pandemic. The strategies may incorporate medical, social, economic, as well as technological measures. Consequently, there is a large number of medical, social, economic, and technological requirements that must be taken into account when deciding which strategy to adopt. Being computer scientists who specialize in information security, we find the latter kind of requirements most obvious, especially with respect to privacy and security of information flow. This is exactly the pitfall that computer scientists must avoid. It is essential to realize that the goals (and acceptability criteria) for a mitigation strategy are much more diverse, and to consciously choose a solution that satisfies the multiple criteria to a reasonable degree. In a forthcoming companion paper, we will present a digest of the requirements, derived from the news clips, and a preliminary take on their formal specification.

References

Coronavirus privacy: Are South Korea's alerts too revealing? BBC News.
Cybernetica proposes privacy-preserving decentralised architecture for COVID-19 mobile application for Estonia. Cybernetica.
Legal advice on smartphone contact tracing published. Matrix Chambers.
Getting it right: States struggle with contact tracing push.
German coronavirus tracing app now available in Luxembourg. RTL Today.
Corona-App soll Open Source werden [Corona app to become open source].
Aarogya Setu app enters 100 million users club. Times of India.
Together we can fight coronavirus - Smittestopp. helsenorge.
Major finding: Lockdowns averted 3 million deaths in 11 European nations: study. RTL Today.
Ten reasons why immunity passports are a bad idea. Nature Comment.
L'application StopCovid, activée seulement par 2% [The StopCovid application, activated by only 2%].
Unlike France, Germany decides to do smartphone contact tracing the Apple/Google way. telecoms.com.
Just how anonymous is the NHS Covid-19 contact tracing app? Wired.
Why India's Covid-19 contact tracing app is controversial.
UK snubs Google and Apple privacy warning for contact tracing app. telecoms.com.
Effective configurations of a digital contact tracing app: A report to NHSX.
Why are Google and Apple dictating how European democracies fight coronavirus? The Guardian.
COVID-Tech: the sinister consequences of immunity passports. EDRi.
Rządowa aplikacja ma błąd pozwalający na sprawdzenie, czy nasi znajomi podlegają kwarantannie [The government app has a bug that makes it possible to check whether our acquaintances are under quarantine]. Tabletowo.
UK finds itself almost alone with centralized virus contact-tracing app that probably won't work well, asks for your location, may be illegal. The Register.
Ethical guidelines for COVID-19 tracing apps.
No, coronavirus apps don't need 60% adoption to be effective.
A flood of coronavirus apps are tracking us. Now it's time to keep track of them.
Poland's coronavirus app offers playbook for other governments.
Contact-tracing apps are not a solution to the COVID-19 crisis.
The Dutch tracing app 'soap opera' - lessons for Europe.
Jak Polska walczy z koronawirusem i dlaczego aplikacja nas przed nim nie ochroni? [How Poland is fighting the coronavirus and why the app will not protect us from it]. Panoptykon.
Google and Apple's rules for virus tracking apps sow division among states.
How did the Covidsafe app go from being vital to almost irrelevant? The Guardian.
Most Americans are not willing or able to use an app tracking coronavirus infections. That's a problem for Big Tech's plan to slow the pandemic.
The UK's contact-tracing app breaks the UK's own privacy laws (and is just plain broken).
COVID-19 digital rights tracker.
Coronavirus contact-tracing apps: can they slow the spread of COVID-19?
Something to declare? Surfacing issues with immunity certificates. Ada Lovelace Institute, 02 June 2020. URL https://www.adalovelaceinstitute.org/