The Communications Decency Act of 1996 has a provision known as Section 230, which says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
That means Facebook is not the publisher when you post something on your Facebook page; you are responsible for what you say. Just as your phone company isn’t responsible for anything you say on a call, Twitter isn’t responsible for anything you post on your account.
Section 230 also says that online publishers can delete user-generated content that is obscene, lewd, excessively violent, harassing, or otherwise objectionable. I routinely delete comments that include obscenities or personal attacks on the social media accounts we manage for our clients, and Section 230 allows me to do so.
This seems reasonable, but the law is controversial. Donald Trump threatened to veto the annual defense spending bill if it didn’t include a provision repealing Section 230.
What’s the controversy?
As Facebook and other social media platforms become important sources of news for Americans, there is concern that they have too much influence on the American public. Specifically, some worry that they are being used to influence voters. There are also unsubstantiated claims that Google and other Big Tech players are squelching conservative voices.
Mark Zuckerberg said, “More people across the spectrum believe that achieving the political outcomes they think matter is more important than every person having a voice. I think that’s dangerous.” Facebook is cracking down amid plenty of political pressure, and now deletes 10 times as many posts as they did a few years ago.
But others have questioned whether it’s really a question of free speech. A letter from Adam Isler to The Economist said, “The distinction between free speech and free publication is that I am free to write offensive screeds. I am free to post them off to editors and publishers. They, however, are under no obligation to print my diatribes. Neither is Facebook or Twitter.”
In the same edition, Trevor Schindeler put it like this: “Freedom of speech is protected, but the right to be heard is not. Speakers were never guaranteed a platform for their views until social media came along. Now everyone has a chance to be heard. But denying access to anti-vaxxers, say, would in no way limit their right to free speech. They would still have all the freedom of speech they enjoyed before Facebook was created.”
Legal scholar Mike Godwin said, “Section 230 was designed to free online forums to police bad content without becoming legally liable for all that they missed.” It wasn’t intended to force online publishers to present balanced coverage or to enforce free speech in private spaces. It simply acknowledged that there’s too much user-created content for anyone to police adequately.
One of the biggest problems with the controversy, though, is that it ranges from “Big Tech shouldn’t censor content” to “Big Tech should censor more content.” These two opposing positions emerged so starkly in the Congressional hearings on the subject that there was clearly no way for Congress to agree on any single position.
What does this mean for you?
First, it’s important to note that Section 230 is not about content creators. If someone produces false information about vaccinations (for example, the claim, made in a weird email I recently received, that the COVID-19 vaccine will alter your DNA in some way that benefits Bill Gates) and publishes it on their own website, they are responsible for publishing that false information, and Bill Gates could sue them for defamation.
If some random person makes that claim in the comments or in a forum on your site, the random person is responsible, not the site owner.
Section 230 is about user-generated content.
At the moment, you are not legally responsible for user-generated content on your website. If Section 230 is repealed, that could change. So if you are worried about Section 230 and your website, you could step up moderation of comments to make sure that none of the rants on your forum are breaking the law.
You could also remove user-generated content such as forums and comments. If you have a popular forum that works hard for you, it makes sense to think about potential legal issues. Equally, if you get a lot of traction from controversial social media conversations, you might want to think about alternatives.
However, the end of Section 230 would probably mean the collapse of social media all over the web, which would probably distract people — maybe even Bill Gates — enough that they wouldn’t notice you for a while.
We don’t recommend that you take any immediate steps about Section 230. It isn’t so much about free speech, neutrality, or censorship as it is about limiting people’s ability to bring lawsuits against online publishers.
We have a saying: “Dance like nobody’s watching; love like you’ve never been hurt. Sing like nobody’s listening; tweet like you have your lawyer sitting next to you.”