Title: Trump, Parler, and Regulating the Infosphere as Our Commons
Author: Luciano Floridi
Journal: Philosophy & Technology
Date: 2021-03-08
DOI: 10.1007/s13347-021-00446-7

arbitrary, potentially whimsical and unaccountable power of these companies. The truth is that, for some time, people had been complaining about Trump's misuse of social media to spread populist, demagogic, misleading and incendiary messages, unacceptable both for what they stated (e.g. about the pandemic or the presidential election) and for what they omitted (e.g. in terms of rejecting or criticising white supremacists' actions or propaganda). The violence in Washington and the pandemic, which has forced people to live increasingly connected and online, have made the public more keenly aware of the importance of good digital communication and a decent ecology of social media. What has been clear to researchers for a long time has become obvious to the educated public as well: the same companies involved in the deplatforming of Trump are also criticised for abusing their oligopolistic positions and enabling the spread of so much misinformation and fake news. So the question asked above-whether the deplatforming was acceptable-is important because it is the symptom of a more general and crucial historical problem: who is in charge in the infosphere (Floridi 2014a)? Today, digital sovereignty (Floridi 2020a)-understood as the ability to control our lives online and, increasingly, our onlife experience tout court (Floridi 2014b)-is also largely in the hands of a few colossal companies. We have already seen this with Google and Apple in mobile telephony: through their APIs, the two companies have decided who can do what and how with mobile phones, even in the case of apps designed to fight COVID-19 (Morley et al. 2020; Floridi 2020b).
The problem is clearly serious, but I already mentioned that the question-did they do the right thing or not?-is both simplistic and polarising. On the one hand, those in favour of the suspension of Trump's accounts argue that the platforms in question are private companies that offer services on their own terms, set by them and freely accepted by the users, and hence that they have the right to suspend any user as and when they want, if the terms of service are not respected (Brandom 2021). They stress that the platforms allowed Trump to communicate for so long only because, as the President of the United States (POTUS), he was considered one of those exceptional cases where, for reasons of public interest, messages were tolerated that would otherwise have led to the suspension of the services if sent by any other user. But they also conclude that things changed because, in the long run, communications like Trump's, which deny the truth (think of his denial of the pandemic or of climate change) and incite violence, end up harming the public interest, and ultimately must be moderated and then blocked. On the other hand, those opposed to the suspension object that this is not only a question of consistent application of the terms of use-because, in that case, the same platforms should have blocked Trump much earlier and intervened in many other contexts (Sri Lanka, Myanmar, India, Ethiopia; Satariano 2021, updated 17 January 2021)-but also of economic interest, unaccountable arbitrariness and a risk of 'censorship' (though note that this is a loaded word, which prejudges as negative whatever content moderation it is used to describe; Graham 2021). The suspension happened so late-they continue-because Trump was finally an outgoing loser, because past clashes, even personal ones, could finally find an outlet without repercussion, and because the operation could help gain some favour with the new Biden administration.
Too little too late for society, too convenient for companies, too risky for democracy. The real problem was not Trump, soon out of the game, but that the decision to silence a voice-no matter how problematic-was left to corporate discretion. The reasoning continues by stressing that the companies in question are not neutral but promote an ideology that is neo-liberal, anti-conservative and focused exclusively on freedom of speech as more important than any other right (think of privacy or security), as long as such an ideology is coherently but also conveniently aligned with the companies' business models and strategies. In the case of Trump, such a Californian ideology may be likable, but in other cases, it could easily erode pluralism and silence dissenting voices. Because of these arguments, I observe, those who want to defend freedom of expression at all costs end up, somewhat paradoxically, on the same side as right-wing and autocratic powers that have strongly objected to the decision to block Trump's accounts. Indeed, more generally, digital sovereignty in the hands of private companies scares both those who fear it as an erosion of democracy and freedom of speech (Ragozin 2021), and those who oppose it as a threat to their own authoritarian power (Chunduru 2021). Thus, the editorial immunity sanctioned by the famous Section 230 is defended both by those who want freedom of speech protected against censorship, and by those who want it to ensure that their own violent and extremist contents are not removed; it is attacked both by those who want to make sure, like Trump (Smith 2020), that platforms cannot remove any content, and by those, like Biden (Lerman 2021), who want platforms to be held accountable for removing unacceptable content. The real difficulty is that it all depends on what may replace Section 230 if it is removed. How can this problem be solved? From a public interest and legality standpoint, companies did well to block Trump and Parler.
They should have done it before, they should have done it in many other cases too, and it certainly took no courage to do it so late. However, by blocking Trump and Parler (think also of the current debate about Facebook and Australian legislation on the linking and dissemination of news), these private companies have shown that, de facto, they have a public role of crucial public interest, since they decide what may or may not happen in the infosphere and hence in the lives of billions of people (Naughton 2021). This was never a simple matter of communication channels, whose providers bear no responsibility for the contents exchanged. In reality, the infosphere is a shared, relational space, a commons, to use a traditional English legal term (Ostrom et al. 1999; Floridi 2013). It is the space where humanity spends more and more time and where more and more activities take place directly or indirectly, from education to work, from socialisation to entertainment, from commerce to finance, from the exercise of justice to political discussion, from research to journalism. It is the space that influences every other space, even the physical one; just think of all the issues surrounding defence and security. It is a space that should be conceptualised and governed more like a condominium-like Antarctica and the Space Station, which belong to everyone-rather than like a new frontier that can be appropriated and colonised by anybody, or like a space that belongs to no one, like the Moon. So, those who are worried about the fact that some companies have silenced Trump and Parler (for example in Germany and France; Jennen and Nussbaum 2021) are right, because the sovereignty of this space should not be left to private enterprises, business strategies, self-regulation and market forces (Breton 2021).
It is time to take seriously the fact that the infosphere is humanity's commons and hence to regulate its use with open and transparent rules, legally grounded on all human rights and on human dignity, to avoid arbitrariness, unaccountability, abuse and discrimination (Stoller and Miller 2021). One must remember that the companies that suspended Trump are also part of the problem, not just the solution, because they are the ones who first empowered and then disempowered such a demagogue through their platforms. Companies did the right thing by deplatforming Trump and Parler, for reasons of self-regulation of the services provided and of public interest. Still, it is not right that they have so much power in the first place, for reasons of accountability and misplaced digital sovereignty. The conclusion is that this time we were lucky (Goldberg 2021) and the companies in question acted correctly (if late and only partially), but crossing our fingers is not a viable political strategy. We must therefore establish the right ethical and legal framework to ensure that, next time, these companies operate in the interest of all, not just out of convenience or goodwill, but for reasons of regulatory responsibility and social accountability. This may sound unrealistic, but it is enough to read the Digital Services Act to understand that the European Union is coming to the same conclusion and building the regulatory framework that will make an operation like the one against Trump not only justified but also accountable and not arbitrary (see Article 20). And if this development seems worrisome because politics should never control free speech, two things must be remembered: that even the right to freedom of speech has its limits when aligned and harmonised with other rights (Wildman 2017), such as the right to security against disinformation and incitement to violence, and that politics is not the same everywhere.
Only where those who control the controllers are themselves the controlled can one speak of real democracy, and it is only in a real democracy that a limit to freedom of speech is not censorship but tolerant respect for civil communication, one that hurts nobody and is good for everybody, as in the European Union. And, to those who object that suspensions and deplatforming, even if sometimes welcome, never work, because they do not block extreme, intolerant or radicalised views, and because the same unacceptable forms of communication will simply reappear elsewhere (Blackburn et al. 2021; Ou 2021), one may retort that separating what is edible from what is poisonous may not wipe out the poison, but it does enable a much healthier and safer diet (Bedingfield 2021). True, those who wish to do so will be able to continue to feed on falsehood, lies, demagogy, nonsense, violence and other unpalatable contents, but with greater difficulty; and those who want to avoid certain poisons will be able to do so much more easily, without running the risk of finding them mixed in everywhere, indiscriminately, on open platforms accessible to billions of people. It is time to be green on our blue technologies: an ethically preferable and legally acceptable ecology of the infosphere is overdue. Maybe someday we will thank Trump for making us reach a tipping point, and finally decide to reform the rules that determine who controls the infosphere and how.

References

Deplatforming works, but it's not enough to fix Facebook and Twitter.
Does 'deplatforming' work to curb hate speech and calls for violence? Three experts in online communications weigh in. The Conversation.
Why platforms had to cut off Trump and Parler. The Verge.
Capitol Hill: the 9/11 moment of social media.
After social media companies boot Trump, Poland doubles down on draft law to make 'censorship' illegal.
Twitter permanently bans Trump, capping online revolt. The New York Times.
The deplatforming of President Trump: a review of an unprecedented and historical week for the tech industry.
The Ethics of Information.
The Fourth Revolution: How the Infosphere Is Reshaping Human Reality.
The fight for digital sovereignty: what it is, and why it matters.
Mind the app: considerations on the ethical risks of COVID-19 apps.
The scary power of the companies that finally shut Trump up. The New York Times.
Mexico president slams social media 'censorship' after chaos in
Social media liability law is likely to be reviewed under Biden. The Washington Post.
Can Twitter legally bar Trump? The First Amendment says yes. The New York Times.
Ethical guidelines for COVID-19 tracing apps.
The silencing of Trump has highlighted the authoritarian power of tech giants. The Guardian.
Revisiting the commons: local lessons, global challenges.
You can't keep rage off the internet. The Japan Times.
Why does the Russian opposition reject the Twitter Trump ban? And what lessons can American liberals learn from Eastern Europe on dealing with the far right.
Trump's Twitter and Facebook bans are working: Trump's deplatforming has already slowed the spread of election misinformation.
After barring Trump, Facebook and Twitter face scrutiny about inaction abroad. The New York Times.
What is Section 230 and why does Trump want it revoked? Independent.
Donald Trump being banned from social media is a dangerous distraction.
The court, the constitution, and the deplatforming of Trump.