Section 230 refers to 47 U.S.C. § 230 (US Code, Title 47, Chapter 5, Subchapter II, Part I). This is an important part of federal law that essentially says two things:
- (c)(1) No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
- (c)(2) No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.
Part (1) makes internet social media possible. Without it, companies could be held liable, criminally or civilly, for the content their users publish. No company could risk that liability, so social media as we know it would cease to exist.
Part (2) enables companies to filter or censor content without losing the liability protection granted by part (1). Without it, internet social media could become a cesspool of illegal and offensive material. You think it’s bad today? Just imagine if part (2) didn’t exist!
The $64,000 question: what exactly does part (2) protect? That is, how much filtering can companies apply before they lose their liability protection?
Surely, if they use too heavy a hand, cherry-picking only certain kinds of content, they are effectively defining the platform’s content, and that content is “user-generated” in name only. The platform no longer represents the views of its users, but those of the platform owners themselves. For example, suppose that three months before an election a platform suppresses all content promoting candidate A and allows only content promoting candidate B. For all practical purposes, it has become the content publisher and should lose its part (1) liability protection. Clearly, part (2) was never intended to allow this.
If, on the other hand, they use too light a hand, they lose control of their platform. It would be impossible to create certain kinds of socially beneficial spaces, whether family-friendly communities or forums for discussing religious, political, personal, economic, or other sensitive yet important and relevant topics. Section 230 specifically allows filtering and censorship for this purpose.
The problem with today’s discussion of Section 230 is that it is treated as all or nothing. Either Section 230 allows unlimited filtering and censorship (ignoring its own words limiting what qualifies), or it should be rescinded entirely. In my view, both positions are wrong. Section 230 is necessary and beneficial. Deciding exactly how far part (2) goes in allowing filtering and censorship is messy, hard work whose outcome will never please everyone. But real-world application of law is never clean or easy, and Section 230 is important enough that this work is worth doing.