key: cord-0155645-uu4k2j2a
title: Giving Up Privacy For Security: A Survey On Privacy Trade-off During Pandemic Emergency
authors: Talukder, Sajedul; Sakib, Md. Iftekharul Islam; Talukder, Zahidur
date: 2020-07-01
journal: nan
DOI: nan
sha: a3b73e4458945d35d3c8df342f2eeb00f21f6932
doc_id: 155645
cord_uid: uu4k2j2a

While the COVID-19 pandemic continues to be as complex as ever, the collection and exchange of data in the fight against coronavirus pose a major challenge for privacy regimes around the globe. The disease's scale and magnitude are not unprecedented, but the hysteria surrounding it appears to be. Consequently, in a very short time, extreme measures for dealing with the situation seem to have become the norm. All such measures affect the privacy of individuals in particular. In some cases entire populations are under intensive monitoring, while the medical data of those diagnosed with the virus is routinely circulated across institutions and nations. This may well be in the interest of saving the world from a deadly disease, but is it really appropriate and right? Although creative solutions have been implemented in many countries to address the issue, privacy advocates are concerned that these technologies will gradually erode privacy, while regulators worry about the longer-term impact they could bring. While that tension has always been present, it has been thrown into sharp relief by the sheer urgency of containing an exponentially spreading virus. The essence of this dilemma is that striking the right balance is the best solution. The jurisprudence on cases in which public officials have sought to interfere with the constitutional right to privacy in the interests of national security or public health has repeatedly shown that a reasonable balance can be reached.

Today the coronavirus epidemic is sweeping across Asia, Europe, and North America in full intensity, with the USA officially reporting more coronavirus-related deaths than any other country [1]. Everyday life has been impacted by the proliferation of the virus, from social networks [2-6] to e-governance [7, 8], from digital automation [9, 10] and cyber security [11, 12] to cellular networks [13]. During the global coronavirus epidemic, many high-tech approaches have been set in motion to help mankind combat the virus. The entire world is coming to a standstill as policymakers urge people to stay indoors, even using coercion. It would not have been possible to isolate ourselves from one another in this way if people had no access to modern technology. Digital media services are delivering content to hundreds of millions of users, vast numbers of individuals have begun attending school and work remotely, the use of safety devices has grown dramatically, and millions more patients are being diagnosed through telehealth applications; seeing citizens with symptoms of an extremely infectious illness was the last thing politicians across the globe expected. Nevertheless, the use of such technologies often raises several questions regarding privacy [14]. Israel, for example, agreed to use anonymously gathered cellphone data, typically reserved for counter-terrorism, to identify people who crossed paths with persons carrying COVID-19. A combination of geotracking and AI technology has allowed the Israeli government to identify individuals who should be quarantined because of their potential coronavirus exposure.
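To make the mechanism behind such exposure identification concrete, the following is a minimal, hypothetical Python sketch of how a person's location trace could be compared with that of a confirmed case for space-time co-location. The 50-metre radius and 15-minute window are illustrative assumptions of ours, not parameters of the Israeli system or of any other deployed tool.

from datetime import timedelta
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in metres.
    r = 6_371_000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def crossed_paths(user_trace, case_trace, radius_m=50, window=timedelta(minutes=15)):
    # Each trace is a list of (timestamp, latitude, longitude) pings.
    # Flag the user if any of their pings falls within `radius_m` metres and
    # `window` minutes of any ping recorded for the confirmed case.
    for t1, lat1, lon1 in user_trace:
        for t2, lat2, lon2 in case_trace:
            if abs(t1 - t2) <= window and haversine_m(lat1, lon1, lat2, lon2) <= radius_m:
                return True
    return False

A real system would replace this quadratic scan with spatial indexing, and it would have to decide where such traces are stored and who may run the comparison, which is precisely the privacy question at stake.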
When the coronavirus pandemic spread across the globe, countries leveraged massive monitoring networks to track the transmission of the virus, forcing policymakers around the world to weigh the privacy of millions of people against public health and safety. Government agencies in South Korea are harnessing surveillance camera footage, mobile location data, and credit card payment records to help reconstruct the recent movements of coronavirus patients and establish virus transmission chains. Iran launched an app promising to cure COVID-19; today, much of what it does is collect millions of people's location information, effectively serving as a way for the government to monitor its population in real time.

The U.S. also aims to counter coronavirus through mobile location information. The White House and the Centers for Disease Control and Prevention are calling on Facebook, Google, and other tech giants to give them greater access to American smartphone location data to help them fight the spread of coronavirus, according to four people at the companies involved in the discussions who are not authorized to speak publicly about them. Advocates for privacy are concerned, although some sources emphasized that the effort would be anonymized and the government would not have access to the locations of specific individuals. Federal health authorities say they will use confidential, aggregated consumer data obtained from the internet firms to monitor the spread of the virus, a technique known as "syndromic surveillance," and to avert further outbreaks. They might also use the data to see whether people are practicing "social distancing." Similar and more extreme monitoring techniques have already been used in China, South Korea, and Israel. The moves set off warning bells for privacy activists who fear what the government might do with consumer data. Facebook also offers anonymized data to health experts and charities in certain countries to support efforts to deter the epidemic. Yet other reports cautioned that supplying the government with greater access to anonymized location data could contribute to the erosion of individual privacy down the line, especially if the government begins asking for non-anonymized data.

The data-protection consequences of efforts to monitor coronavirus dissemination have been increasingly evident in Asia. When this epidemic is fully under control, it is possible that China will be commended for the scientific success with which it ultimately contained an outbreak that could have infected billions. Yet it certainly comes at a cost. There are examples of "epidemic maps" displaying the exact real-time position of confirmed and suspected cases so that people can avoid traveling to the same locations. There is also an app that lets people check whether they have been on a train or plane with someone who acquired the virus. Such interventions are effective, but they involve the processing and distribution of precise medical data on a vast scale. Similarly, creative approaches to addressing the problem have been introduced in Korea and Singapore, and ultimately, despite the broad and apparent intrusions on privacy, they seem to have been successful. Nowhere in the world is the right to privacy an absolute right, and laws on privacy and data security cannot and do not stand in the way of common-sense measures to save lives.
For this reason, all of these regimes allow the use and sharing of data for that purpose where appropriate. At the same time, the criteria laid down in the legislation cannot simply be ignored, even in times of crisis. Disproportionate decisions and measures are often the result of knee-jerk reactions, and everybody is at risk when that happens on a global scale, no matter how often you wash your hands.

Our Contributions: This paper presents the following contributions:

• Survey on privacy issues during the pandemic. This paper provides the first comprehensive survey of the privacy issues arising on various fronts during the pandemic. To the best of our knowledge, it is the first comprehensive survey that addresses privacy issues across sectors, from surveillance to medical data.

• Privacy principles recommendation. This paper presents several recommended privacy principles for the designers of monitoring apps and devices, including privacy-conscious mechanisms, aggregated anonymized data, and volunteered data.

• Guideline for post-pandemic privacy restoration. Finally, this paper presents a comprehensive guideline for ensuring post-pandemic privacy restoration that would benefit the public and private sectors alike.

The rest of the paper is organized as follows. Section II describes the privacy issues. Section III describes the recommended privacy principles. Section IV discusses post-pandemic privacy restoration. Finally, Section V concludes the paper with a highlight on the scope of future work.

Deutsche Telekom, Europe's biggest telecom firm, announced that it is turning over 5 GB of consumer data to Germany's Robert Koch Institute to help counter the transmission of COVID-19. The Institute will use the anonymized data to monitor the movements of the general public, to forecast how the virus spreads, and to help answer questions about the effectiveness of social distancing. Similarly, Vodafone has published a five-point plan in which it promises that consumer data will be shared with authorities in Lombardy, Italy, and Austria's largest telecom operator, A1, has contributed data as well. Critics point out that other countries are already making more authoritarian use of mobile-phone data: in China, Israel, and South Korea such data are used to trace the contacts of infected residents and to ensure that quarantine is enforced. Critics also question the legality of the donation, whether customers' data privacy has been respected, and whether the donated data would even be useful. Although GPS-derived data, such as that obtained by Google, can be very accurate, cellular providers' tracking data still rely on cell phone towers to locate customers [15]. Their accuracy ranges from 25 to 100 metres, which might not be particularly helpful in big cities. While European officials assured data-privacy advocates that individuals would not be tracked, concerns about consent and transparency were not eased. A great deal of information is associated with mobility data, a concern made worse by the lack of transparency of telecom companies that have long been collecting and selling their customers' geolocation data [16, 17].
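As an illustration of what "anonymized and aggregated" sharing can mean in practice, the following is a minimal Python sketch, under assumptions of our own choosing, of how raw location pings might be coarsened into grid-cell counts with small cells suppressed before release. Real telecom pipelines are far more elaborate and are not documented in the sources discussed above.

from collections import Counter

def aggregate_pings(pings, cell_size_deg=0.01, min_count=10):
    # `pings` is an iterable of (latitude, longitude) pairs. A cell size of
    # 0.01 degrees is roughly 1 km, far coarser than the 25-100 m accuracy of
    # tower-based positioning mentioned above.
    counts = Counter(
        (round(lat / cell_size_deg) * cell_size_deg,
         round(lon / cell_size_deg) * cell_size_deg)
        for lat, lon in pings
    )
    # Suppress sparsely populated cells so that small groups of individuals
    # cannot be singled out from the released counts.
    return {cell: n for cell, n in counts.items() if n >= min_count}

Even such aggregation is not a guarantee of anonymity; as noted elsewhere in this paper, aggregated or "anonymous" data can sometimes be re-identified when combined with other sources.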
In the US, the government is currently in talks with tech giants such as Google, Apple, and Facebook about how data from their customers could be used to prevent COVID-19 from spreading. A medical school in Hanover, Germany, is collaborating with the local mapping firm Ubilabs to develop an app that will enable individualized monitoring of infections. With this app, named GeoHealth and expected to be available in a few weeks' time, a person who has tested positive for COVID-19 voluntarily donates the GPS data [18] from their phone. Other users can then see whether they were at the same location, at the same time, as an infected person. Users who get a "red signal" are warned that they were in close proximity and told to go and get tested. Recently, the government of the autonomous Spanish region of Catalonia released its own, related app, named STOP COVID19 CAT. Critics caution, however, that anonymous data can often be quickly re-identified, which raises concerns in cases like this. This, together with the absence of an established framework, has in the past been a significant obstacle to the use of data in humanitarian situations, as happened during the Ebola outbreak. Academics at Oxford University and the UK health service's digital innovation unit have also designed a contact-tracing app (see Fig. 1) that takes into account different age groups, household structures, and movement patterns in an effort to maximise the number of people who could be allowed to move freely [19].

As South Korea fights a snowballing number of COVID-19 cases, the government is letting people know if they are in the vicinity of a patient. But the volume of information has led to some awkward moments, and there is just as much fear of social stigma as of illness. Such alerts arrive all day long, telling people where an infected person has been. People can also visit the Ministry of Health and Welfare website for information. When the case number of a virus patient is searched online, the associated queries include "contact information," "name," "photo," "father," or even "adultery." No names or addresses are given, but some people nevertheless manage to connect the dots and recognise individuals. And even though patients are not specifically identified, they risk criticism or mockery online. Doctors warn that pursuing patients online can have very serious consequences. Malicious online remarks have long been an issue in South Korea and have led to suicide in some cases; elevated levels of anxiety and sleep deprivation caused by online criticism are also well documented. Because the virus spreads quickly, the information people need to protect themselves and others is made available to the public.

The good news, however, is that data protection agencies from around the world are stepping in to offer their advice and feedback on data management practices in the fight against coronavirus. The contrast among the opinions of these agencies, which range from punitive to impartial to accepting, indicates that the best response would be to pursue a reasonable middle ground that does not neglect the enforcement of basic privacy standards. This is also in accordance with the statement released on March 16 by the European Data Protection Board (EDPB) stressing that data protection laws (such as the GDPR) do not obstruct measures taken to counter the coronavirus pandemic. The coronavirus epidemic is nevertheless forcing the US government to relax one of its few laws relating to data protection.
The Health Insurance Portability and Accountability Act (HIPAA) is one of those protections against the misuse of patient records, but the Department of Health and Human Services said it would waive fines for suspected HIPAA breaches, which are very common [20]. The Family Educational Rights and Privacy Act (FERPA) raises similar questions when schools respond to threats to the health or safety of students or other individuals. FERPA permits educational agencies and institutions to disclose information from education records without consent once all personally identifiable information (PII) has been removed, provided that the agency or institution has made a reasonable determination that the student's identity is not personally identifiable, whether through single or multiple releases, and taking into account other reasonably available information. Reporting that a student in a certain class or grade level is absent could therefore be problematic if, for example, a directory containing the names of every student in that class or grade is available. Many known attacks have shown that individual records can be de-anonymized from such supposedly anonymized data collections. FERPA's "health or safety emergency" exception to the general consent requirement is limited in time to the emergency period and generally does not allow for a blanket release of PII from student education records. The concern, however, is that while educational agencies and institutions can frequently address threats to the health or safety of students or other persons in a manner that does not identify a particular student, FERPA allows them to disclose PII from student education records to appropriate parties in connection with an emergency without prior written consent. Typically, law enforcement officers, public health officials, and trained medical personnel qualify as such parties, as long as there is a rational basis for the decision given the information available to the educational agency or institution at the time of its assessment. There is therefore a real possibility that some school agencies and institutions may report the names, addresses, and telephone numbers of absent students to the public health department without consent, so that it can contact parents to determine whether the children are ill. There is also the possibility that a school may report to the parents of other students that a particular pupil, teacher, or other school official has contracted COVID-19.

As a result of its coronavirus-fuelled market expansion, the teleconferencing firm Zoom is attracting attention. Internet-rights activists want Zoom, which has been flooded with demand since the coronavirus outbreak, to begin releasing a transparency report outlining its data protection policies and how it responds to governments trying to clamp down on freedom of speech. The growing demand for its services makes Zoom a target for third parties, ranging from law enforcement to sophisticated hackers after confidential and sensitive data. Meanwhile, as people assemble online, such gatherings will attract attention from authorities seeking to control the flow of information. And as the videoconferencing software gains traction amid the coronavirus pandemic, federal authorities are now warning of a new privacy and security risk dubbed "Zoombombing."
The phrase refers to a form of cyber abuse reported by some users of the app, who have found their calls hijacked by anonymous persons and trolls spewing hateful content. "Zoombombing" has become so widespread that the FBI has issued a press release warning people of the threat, which has included profanity, pornography, and swastikas. These "Zoombombing" incidents come as Zoom faces criticism over its privacy practices, which customers, technology experts, and US regulators have flagged. In an effort to mitigate these threats from potential hackers or trolls, federal officials urged those using video teleconferencing apps to exercise due diligence and caution. Malicious cyber actors are searching for ways to exploit flaws in telework applications in order to obtain private information, eavesdrop on calls or virtual meetings, or conduct other disruptive activities.

Coronavirus is a particularly infectious illness and will most likely entail monitoring on a large scale. Allowing governments to exchange confidential data with organizations such as Clearview AI and Palantir in times of disaster, such as a terrorist incident or a pandemic, provides short-term gains but has a disturbing effect on data protection even after the emergency passes. One major concern is that, after the crisis passes, the new surveillance technologies deployed during the coronavirus crisis will become the 'new normal' and remain permanently embedded in daily life, resulting in ongoing mass population monitoring without proper oversight, accountability, or justice. There is a history of this, and not a distant one. The terrorist attacks of September 11, 2001 led to the proliferation of surveillance cameras and networks across the U.S. and to the Patriot Act, a new statute that removed statutory guardrails on government surveillance and limited accountability, expanding the invasive and extensive surveillance powers of the National Security Agency that were later exposed by whistleblower Edward Snowden. Despite the public uproar over the NSA's activities, these powers have yet to be de-authorized by lawmakers. There is no question that government departments exist to represent the people. Nonetheless, institutions are run by people, and people make errors, errors that can expose sensitive information to hackers who should never have access to it. Governments will need to ensure that the tools they have built to combat extremism, such as the location-monitoring system used in Israel, remain in safe hands only. As for ordinary users, anyone who wants to secure their digital life should ensure that all connected devices are protected by reliable anti-virus software.

After the advent of the coronavirus pandemic, tech companies have offered their resources, funds, and even face mask stashes to help. Many organizations that work with data are now stepping up, offering their data-collection tools to help monitor or curb the spread of the virus. Unacast, a data company that collects cellphone location data and provides analysis to the retail, real estate, marketing, and tourism industries, recently revealed something called the Social Distancing Scoreboard [21]. The scoreboard is an interactive map that assigns letter grades to each state and county in America based on how well, according to Unacast's analysis of the data, their residents practice social distancing; a simplified sketch of such a grading scheme is given below.
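The following is a minimal, hypothetical Python sketch of how a mobility-based distancing grade could be computed from the reduction in average daily travel distance relative to a pre-pandemic baseline. The thresholds are illustrative assumptions of ours and do not reflect Unacast's actual methodology.

def distancing_grade(current_km_per_day, baseline_km_per_day):
    # Grade a region by the percentage reduction in average daily travel
    # distance compared with its pre-pandemic baseline.
    if baseline_km_per_day <= 0:
        return "N/A"
    reduction = 1.0 - current_km_per_day / baseline_km_per_day
    if reduction >= 0.40:
        return "A"
    if reduction >= 0.30:
        return "B"
    if reduction >= 0.20:
        return "C"
    if reduction >= 0.10:
        return "D"
    return "F"

# Example: residents of a county who now travel 6 km per day against a
# 10 km per day baseline show a 40% reduction and would receive an "A".
print(distancing_grade(6.0, 10.0))

Producing even this simple score presupposes a continuous feed of individual location traces, which is exactly the data-collection practice examined below.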
The scoreboard is the first offering from the company's latest COVID-19 Location Data Toolkit, and more location data will be introduced in the coming days and weeks to demonstrate trends and patterns that the company hopes will prove useful. Unacast is not the only tech firm putting this kind of technology to what it claims is a public benefit these days. Facebook's "Data for Good" platform uses de-identified statistical data from its users to support its Disease Prevention Maps, which offer information about where people live and where they travel and can help health agencies monitor disease transmission or anticipate where it will head next. Kinsa Health uses data from its smart thermometers to try to identify unusually high rates of influenza with its US Health Climate Monitor, which the company claims has correctly forecast the spread of flu in the past and may also map outbreaks of coronavirus.

Unacast is a bit different, though. With the Facebook system and the Kinsa app, users opt in to having their position monitored and have a direct relationship with those businesses. In comparison, Unacast receives app data from a number of third-party providers. Those sources, according to its privacy policy, include Unacast's suppliers as well as the software development kit, or SDK, that it places in applications. The sticking point is that users grant one of those applications permission to access their location data without realizing that the data will go to Unacast as well. For the average consumer, there is no simple way to see which SDKs an app uses, because the app's privacy policy typically states that data goes to third parties without disclosing who those parties are. On its website, Unacast describes its SDK as its "preferred" data source, but when asked for clarification the company would not specify the applications or partners it works with. A study by Apptopia, a smartphone app technology company, examined these apps; users can always turn location tracking off, but some of the apps obviously need location services in order to work at all. While the scoreboard itself can be a helpful tool, it also makes more visible how much data collection goes on behind the scenes, and how precise the collected data can be. Unacast is hardly the only company doing this kind of data crunching. For example, oneAudience, a marketing company, puts its SDK into apps to gather user information. Facebook alleged in a recent complaint that the firm had also illegally collected social media data; the firm maintained that the collection was accidental and that it has since changed its SDK.

One of the best tools in the hands of public health authorities during an epidemic is low-tech detective work. If a person is infected with a disease such as COVID-19, the disease caused by the novel coronavirus, public health authorities work out where they have been recently and follow up with everyone they have been in contact with. The tension between preserving people's rights and obtaining knowledge that serves the general interest shifts over the course of an epidemic. Core specifics need to be sorted out by the designers of monitoring apps and devices: how to assess phone proximity and user health status, where the information is processed, who uses it, and in what format. Digital contact-tracing services are currently operating in a variety of countries, but specifics are sparse and questions over privacy remain. In the following, we propose several recommended privacy principles.
Another option is to start fresh with a coronavirus-specific app that asks users to willingly share their location and health details. One emerging app in Germany depends in part on location data already held by Google for its account holders. A person who tests positive can use the GeoHealth app to donate their location history. The data are then anonymized and processed on a central server. A data analytics platform developed by the tech firm Ubilabs compares users' movement histories with those of infected individuals, and the system displays color-coded warnings depending on how recently they might have been exposed to the virus. Using a mixture of GPS mapping, wireless network data, and Bluetooth communication between phones, the system will be able to identify when one mobile comes within one meter of another. Coordinated data-sharing has become an essential tool in the ongoing fight against coronavirus.

In an unprecedented situation, many policymakers are willing to set aside the privacy ramifications in an attempt to save lives. The confidential data being gathered, though, does not remain confined to policy and public health organisations. In the United States, the government is partnering publicly with Verily, a sister company of Google, to provide online screening services that require users to have a Google account. Surveillance technology providers and smartphone app developers are likewise being given access to confidential data. For example, users of the Corona 100m app can see the date on which a coronavirus patient was infected, along with their race, ethnicity, and age, and where they went. In ordinary circumstances, sensitive medical records related to a patient would and should be kept confidential. Exposing them to private corporations, even in the name of public health, is a cause for anxiety, as such records have considerable economic value. For example, they could provide valuable targeting data for advertising agencies serving health-care and pharmaceutical companies. They could also inform health insurers seeking to verify medical histories when processing new policies and claims. Databases linking identities to mobile location data carry a price tag, especially in consumer markets.

Data protection laws such as the European General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) restrict businesses that attempt to retain personal data of any kind, let alone use it for commercial advantage. Businesses must therefore adopt the newest advances in privacy-enhancing technologies (PETs) to fully ensure regulatory compliance and to secure data, an enormously valuable market commodity. As demonstrated by the World Economic Forum, this emerging class of privacy technologies helps companies exploit knowledge derived from third-party private data without disclosing sensitive information that cannot and should not be shared. Fortunately, sophisticated cryptographic methods underlying PETs are already in use by businesses. The global research community has rigorously developed and tested them, and industry members are deeply engaged in efforts such as ZKProof to standardize PETs and promote wider adoption. Correctly applied, PETs will empower companies rather than constrain them.
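To make the privacy-conscious end of this design space more concrete, the following is a minimal Python sketch inspired by decentralized exposure-notification proposals, not by GeoHealth, Ubilabs, or any other specific system named above. Phones broadcast short-lived pseudonymous identifiers over Bluetooth and perform exposure matching locally, so raw location and identity never leave the device; the key derivation, identifier length, and rotation schedule are illustrative assumptions.

import hashlib

def daily_ephemeral_ids(seed, day, per_day=96):
    # Derive short-lived broadcast identifiers (one per 15-minute slot) from a
    # secret seed that never leaves the phone; only these opaque identifiers
    # are exchanged over Bluetooth with nearby devices.
    return [
        hashlib.sha256(seed + day.to_bytes(4, "big") + i.to_bytes(2, "big")).digest()[:16]
        for i in range(per_day)
    ]

# Identifiers overheard from nearby phones are stored locally on the device.
heard_ids = set()

def check_exposure(published_case_keys):
    # When a diagnosed user consents to publish their (seed, day) pairs, every
    # phone re-derives the corresponding identifiers and checks locally for an
    # overlap with what it has heard.
    for seed, day in published_case_keys:
        if heard_ids.intersection(daily_ephemeral_ids(seed, day)):
            return True
    return False

Designs of this kind shift the matching computation to the user's device, which is one way the privacy-enhancing technologies discussed above can reconcile exposure notification with data minimization.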
PETs can help companies securely leverage third-party data and remain competitive without jeopardizing user privacy or business confidentiality. Preserving privacy as we develop and implement these technical solutions will be critical. It is more important than ever to consider privacy principles as we collectively move into this next phase of tracking, tracing, and testing, and of using similar technologies developed to address the pandemic. Meaningful consent must be obtained by being transparent about the reason for collecting data, what data is collected, and how long it is kept. Data should be collected with consent only when people make the decision to participate, and used only in the manner explained. Clear and user-friendly communication encourages cooperative engagement and ensures that anyone interacting with the system makes an informed decision about taking part in data collection. It should also make the user aware of the purpose of gathering the data, the nature of the data to be collected, the length of data retention, and the benefits of data collection.

References

[1] Economic effects of coronavirus outbreak (COVID-19) on the world economy
[2] A study of friend abuse perception in Facebook
[3] Coronavirus outbreak and its impacts on global economy: the role of social network sites
[4] AbuSniff: Automatic detection and defenses against abusive Facebook friends
[5] Social media and emergency preparedness in response to novel coronavirus
[6] Detection and prevention of abuse in online social networks
[7] The world after COVID-19
[8] Model for e-government in Bangladesh: A unique ID based approach
[9] Tackling the cybersecurity impacts of the coronavirus outbreak as a challenge to internet safety
[10] Usensewer: Ultrasonic sensor and GSM-Arduino based automated sewerage management
[11] Corona virus (COVID-19) pandemic and work from home: Challenges of cybercrimes and cybersecurity
[12] A survey on malware detection and analysis tools
[13] Mobile messaging as surveillance tool during pandemic (H1N1) 2009, Mexico
[14] Tools and techniques for malware detection and analysis
[15] Attacks and defenses in mobile IP: Modeling with stochastic game Petri net
[16] When friend becomes abuser: Evidence of friend abuse in Facebook
[17] AbuSniff: An automated social network abuse detection system
[18] Digital land management system: A new initiative for Bangladesh
[19] Coronavirus: NHS contact tracing app to target 80% of smartphone users
[20] Mobile technology in healthcare environment: Security vulnerabilities and countermeasures

Authors

Sajedul Talukder is an Assistant Professor of Computer Science at Edinboro University and the founder and director of the Privacy Enhanced Security Lab (PENSLab), where he and his group develop privacy-enhanced security systems. Dr. Talukder's research interests include security and privacy with applications in online and geosocial networks, machine learning, wireless networks, distributed systems, and mobile applications.

Md. Iftekharul Islam Sakib is a Ph.D. student in Computer Science at the University of Illinois at Urbana-Champaign (UIUC). He is currently working in the Cyber Physical Computing group at UIUC, advised by Professor Tarek Abdelzaher. Before that, he served as an Assistant Professor in the Department of Computer Science and Engineering (CSE) at Bangladesh University of Engineering and Technology (BUET), and is currently on study leave for higher studies.

Zahidur Talukder is currently working in the Rigorous Design Lab (RiDL) at UTA, advised by Professor Mohammad Atiqul Islam.
His research interests are broadly in the areas of cyber-physical systems, computer architecture, and security. Currently, he is working on data center security, with a particular focus on mitigating the emerging threat of