Section 230 of the Communications Decency Act – A Chesterton’s Fence View

December 4, 2025
The Purple People Leader

This law is a foundational “fence” of the modern internet. Before we can intelligently discuss tearing it down, we must first understand why it was built.

The “Fence”: What is Section 230?

Enacted in 1996, Section 230 is a U.S. law that provides two key protections for “interactive computer services” (like social media platforms, forums, and search engines):
1. Immunity from Liability for Third-Party Content: Platforms are generally not treated as the publisher or speaker of content provided by their users. This means if someone posts something defamatory or illegal on a platform, you can sue the user, but you generally cannot sue the platform itself.
2. “Good Samaritan” Protection for Content Moderation: Platforms are allowed to moderate and remove content they deem “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable” in good faith, without being held liable for those decisions.
As the Allen Lab for Democracy Renovation at Harvard puts it, these protections were created to solve a dilemma facing early online services: either moderate nothing to avoid liability, or start moderating and become liable for everything they miss. Section 230 was designed to encourage platforms to set and enforce community standards without the fear of constant litigation.
• Policy 101: Section 230 Reform – Harvard Ash Center


The Proposal to Tear It Down: Why Reform Section 230 Now?

In recent years, there has been a growing, bipartisan movement to reform or repeal Section 230. The arguments for tearing down this “fence” are powerful and stem from the vast changes in the internet since 1996.
Argument 1: Platforms Are Not Neutral Arbiters. Critics argue that massive platforms like Facebook, X (formerly Twitter), and YouTube are no longer passive hosts but active curators that use algorithms to amplify certain content for profit. They believe this active role should come with editorial responsibility.


Argument 2: Failure to Stop Harmful Content. Many believe platforms have failed to adequately address the spread of dangerous content, from misinformation and hate speech to material related to terrorism and child exploitation. Calls for reform often seek to hold platforms accountable for this failure. For example, Professor Mary Graw Leary argues that Section 230 has become a “failed experiment” that shields platforms from responsibility for foreseeable harm.
• Challenging Section 230: Professor Mary Graw Leary’s Call for Accountability – Catholic Law

Argument 3: Biased Content Moderation. From another perspective, some argue that platforms use their moderation power to censor certain political viewpoints, effectively acting as biased publishers who should not enjoy the protections of a neutral platform.

The sheer volume of legislative proposals reflects this widespread desire for change. Dozens of bills have been introduced in Congress aiming to modify or eliminate Section 230’s protections in various ways.
• What Has Congress Been Doing on Section 230? – Lawfare

Applying Chesterton’s Fence: Why Was the Fence Built?

Before we remove the fence, we must ask: What problem was Section 230 originally trying to solve?


In the mid-1990s, the internet was a fragile, nascent ecosystem. Lawmakers feared that if online forums or service providers could be sued for every single post made by their millions of users, the financial and legal risk would be overwhelming. That fear was grounded in case law: in Cubby v. CompuServe (1991), a court declined to hold CompuServe liable for user content because it did not moderate, while in Stratton Oakmont v. Prodigy (1995), a court held Prodigy liable as a publisher precisely because it did.
The “fence” of Section 230 was built to create a space for this new industry to grow. The core idea was that without this protection, one of two things would happen:
1. The “Anything Goes” Internet: To avoid liability, platforms would refuse to moderate any content, turning the internet into an unpoliced cesspool.
2. The “Heavily Censored” Internet: Alternatively, to avoid risk, platforms would aggressively delete any post that was even remotely controversial or legally ambiguous, stifling free expression and open dialogue.
Section 230 was the solution: a legal shield that allowed platforms to host user-generated content and moderate in good faith without being buried in lawsuits. It is widely credited with enabling the rise of the modern social web, from Wikipedia and Yelp to Facebook and YouTube. Removing it, supporters argue, could lead to the very outcomes it was designed to prevent, fundamentally breaking the user-generated internet as we know it.
• Section 230 Under Fire: Recent Cases, Legal Workarounds, and Reforms – Dynamis LLP