Section 230 — What’s next?

One of the unresolved issues from the Trump Administration, which in part reflects the lack of nuance in Trump World, is what to do about Section 230.  Section 230 refers to a statute enacted in 1996 as part of the Communications Decency Act, currently codified at Title 47, Section 230 of the United States Code.

As currently written, Section 230 is a response to some basic principles of defamation law.  Generally speaking, a publisher (whether of books or newspapers) or broadcaster is responsible for the content that it publishes or broadcasts.  Thus, if I am HarperCollins and I publish a tell-all book written by a former White House aide, I am potentially liable if that book contains false information.  Similarly, if I am your local newspaper and I publish a letter to the editor or an ad in my paper, I am potentially liable if the letter or ad contains false information.  And if I am a television station and I broadcast an ad or a program which contains false information, I am potentially liable.  All of these entities are usually very selective in what they print or broadcast.  They either decline something that is potentially defamatory or they do what they feel is necessary to verify the allegations before publishing.

Now back in the 1990s, the internet was still in its infancy.  And many websites were letting users post content.  What is currently Section 230 arose from competing desires.  On the one hand, Congress wanted to create a mechanism that would permit the entities operating these websites to remove obscene content or content that violated somebody’s intellectual property rights.  On the other hand, website providers were concerned that playing any role in editing content would make them “publishers.”  The solution was contained in Section 230(c).  That subsection contains three basic provisions.  First, the website owner would not be considered the publisher of any information posted by another on its website.  Second, the website owner would be immune from liability for any good faith efforts to restrict access to obscene or objectionable material (even if the material might be constitutionally protected).  Third, the website owner could also opt to provide users the ability to block access to certain materials.

Now back in 1996, there was no Facebook or Twitter or anything similar.  And as currently written, Section 230 has permitted the existence of such social media sites.  The issue today is how to balance the competing concerns.  We do not want a rule in which a website that accepts user-generated content has to permit all user-generated content regardless of how obscene or defamatory or misleading it is.  We also do not want a rule in which the website owner is liable for accepting defamatory content, because that will lead to websites simply declining to allow user-generated content.  But how to draw the line in the middle is the hard question.  The broad deference given to the owners of the websites may have made sense in the early days, but the large and dominant presence of certain websites raises concerns that the current rules give too much power to the website providers.

But the answer to these concerns is not clear; or at least not clear enough to be easy to reduce to statutory language.  The current rules merely require good faith on the part of the website.  And many of us doubt the ability of any big company to operate in good faith.  We want these companies to be more active in blocking any attempt by foreign governments to spread disinformation or fictional conspiracy theories.  At the same time, we fear these companies using that power to censor our attempts to spread what we consider to be accurate information.  And the companies do not want to have to fight every decision that they make in court.

I do know that the answer is not as simple as repealing the current language.  Too much of what we take for granted on the internet relies on Section 230.  We have already seen some websites turn off their comments features because they have seen too much abuse and do not want to have to review every post.  A complete repeal of Section 230 would lead to more sites making this decision and would potentially cause serious problems for social media websites.  Even among those who criticize Facebook, many find its positive aspects of making it easy to connect with friends too important to give up.  But it is unclear how Facebook (or any replacement mini-Facebook) could continue if Section 230 disappeared.  So the real issue going forward is how to fix the current problems in Section 230 — the lack of incentive for responsible conduct by website providers and the potential misuse of the power to restrict user-generated content — while still permitting website providers to define what their sites offer and allow.

Ultimately, the issue with Section 230 is the same as with many other provisions of law.  A law typically responds to a perceived need at the time.  If we are lucky, the legislature tries to consider the potential consequences of the proposal and account for them.  But, after the law goes into effect, things change, and those subject to the law adjust their conduct in unexpected ways to account for the regulations.  Especially in a fast-developing field, it is too much to ask for a law that is over two decades old to be a good fit for current needs.  Section 230 deserves a close look and an extended debate over how we should change the rules to make them work for today’s reality instead of the world of the 1990s.  But it’s also important to recognize the vital role that Section 230 plays in the functioning of the internet and to be careful in making changes.

This entry was posted in Freedom of the Press, Politics.