Most people, even ardent supporters of free speech, would agree that there should be some limits on it. Everyone knows the example of shouting "fire!" in a crowded theatre. And I think that much of the current acceptance of and support for the silencing of "misinformation" arises from the belief that misinformation can be dangerous and cause a great deal of harm if it is believed and acted upon.
"Spreading misleading information about the ongoing global COVID-19 pandemic and downplaying the importance of continued mitigation practices" (to quote an actual moderation policy of an online platform that will remain anonymous) is the kind of thing that will get you censored or banned all over the place right now.
But I have real concerns about this, which I shall elaborate on here.
- How do you precisely classify what counts as misinformation, and who gets to decide what it is (and whether it is dangerous enough to warrant censorship)?
- Censoring misinformation is used by authoritarian regimes as a justification, or a smokescreen, for clamping down on dissent. Can any government or corporate entity be trusted not to abuse this power?
- Will the quest to stamp out misinformation inevitably become something that protects orthodoxy and opposes heterodoxy? Aren't all new ideas heterodox to start with?
- When a subject has become politicized, is it inevitable that "misinformation" from one side of the debate will be more rigorously clamped down on than misinformation from the other?
- Is there a danger that silencing, cancelling, de-platforming, demonetizing (etc.) people who convey misinformation, rather than confronting and engaging with them, will fuel support for conspiracy theories?
- Is there a difference between government censorship and corporate or private censorship? Should private companies (including social media platforms and online payment platforms) be free to censor and ban people for whatever reasons they like?