Tell Congress:
read this website before you make changes to Section 230.


Why is everyone talking about Section 230?

Big Tech is a big problem. The Internet has the potential to transform our society for the better, supercharge social movements for human rights, and give more people a voice than ever before. But a small handful of companies currently dominate the web with a monopolistic business model that's incompatible with democracy: one based on harvesting our data and using algorithms to amplify and micro-target harmful content in a never-ending quest for advertising dollars.

Meaningful policy solutions need to strike at the root causes of Big Tech's harms: monopoly power, data harvesting and abuse, and algorithmic manipulation. Unfortunately, many lawmakers and pundits have instead proposed reckless changes to Section 230 of the Communications Decency Act, a widely misunderstood but crucially important law that essentially allows ordinary people to have a voice on the Internet. Many proposed changes to Section 230 would do enormous harm to vulnerable communities, undermine human rights, and utterly fail to address the legitimate problems with Big Tech companies like Facebook, Amazon, and Google. Worse, they could inadvertently solidify the monopoly power of the largest companies while crushing competition from smaller, more community-minded platforms.

This website attempts to clear up some of the most common misconceptions about Section 230. Tell your lawmakers to read it before they change a law they don't understand. We don't need more partisan grandstanding; we need real policy changes that address the harm Big Tech is doing to our communities and our democracy right now. Before it's too late.

FAQ

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230 was introduced in response to lawsuits against two early online services, Prodigy and CompuServe, which allowed users to access information on the internet (much like Google). Both had been sued for defamation over third-party content; the difference was that one had actively made efforts to moderate content and the other had not. In response to this “Moderator’s Dilemma,” Section 230 enabled all private, non-government websites to apply a “shield and sword” approach: hosting the kinds of content they wished to host, and removing the kinds they wished to remove, according to their own rules.

Why is that important for the average internet user?

Section 230 is the law that essentially allows all user-generated content on the Internet: your posts, your thoughts, your ideas, your photos, jokes, opinions, memes, videos, reviews, how-to guides, recipes, etc. As Wikipedia outlined in this blog post, Section 230 is what gives website operators the freedom to review edits to a biographical article (for example) without legal risk. Without Section 230, platforms would be unwilling to host content created by ordinary people. The Internet would become more like cable TV, where everything you see and hear is backed by big money and has been cleared with risk-averse corporate lawyers. Any platforms hosting user-generated content that did survive would be unable to engage in even the most basic forms of moderation, like removing spam posts, for fear of lawsuits.

While both Trump and Biden have called for repealing Section 230 entirely, most lawmakers say they only want to change it. But even minor tweaks to the existing rule can have enormous unintended consequences for online free speech. Websites would be unwilling or unable to host user-generated content because of the constant fear of liability and lawsuits. Moreover, bigger companies like Facebook and Google would likely thrive in a world without 230, as the only ones that could afford armies of lawyers to litigate and armies of content moderators to take down huge amounts of content, including legal free speech, to avoid liability. Smaller platforms would be sued out of existence, preventing newer platforms from ever competing with Big Tech.

Content moderation should be based on the needs of online communities and grounded in human rights principles, not driven by fear of lawsuits. Without 230, platforms would have the wrong incentives in place and would be unwilling to host your opinions, because refusing to do so lets them avoid legal challenges. Research shows that Section 230 “has been beneficial for fostering a diversity of platforms and content,” exactly as its authors originally intended.

Section 230 doesn’t just apply to “Big Tech,” as some lawmakers seem to think. It protects user-generated speech and content on smaller platforms like Discord, Reddit, Patreon, and Etsy, as well as the comments sections of local newspapers, a local library’s message board, and literally anywhere else that people can express themselves online. Section 230 is especially important for protecting the speech and organizing efforts of marginalized people, who have frequently been targeted by law enforcement and have had their speech suppressed by corporations. If Section 230 were weakened, platforms would be unwilling to host, for example, the videos of police abuse that sparked long-overdue uprisings for racial justice, or the posts calling out abusive behavior by powerful men that started the #MeToo movement.

Whether a website is a “platform” or a “publisher” is a trick question: the answer is that it doesn’t matter. All that matters is the content in question. If that content is created by a third party, the website hosting it cannot be sued over it. There is also no “neutrality” requirement in Section 230, which is a good thing: otherwise the government would have to decide what is “neutral” and what is not, raising significant First Amendment concerns.

As this explainer from the Electronic Frontier Foundation details further, “rather than enshrine some significance between online ‘platforms’ and ‘publishers,’ Section 230 intentionally nullifies any distinction that might have existed. It explicitly grants immunity to all intermediaries, both the ‘neutral’ and the proudly biased. It treats them exactly the same, and does so on purpose. That’s a feature of Section 230, not a bug.” It protects ALL user-generated content on the internet.

While 230 protects platforms that host user content, it is not some kind of magic shield or “Get Out of Jail Free” card. Big Tech companies are still, and always have been, liable for content that they themselves create, and for knowingly hosting or sharing content that is illegal.

The spread and amplification of hate speech and disinformation online is a valid concern. Unchecked harassment of women and people of color, often a direct result of online mis- and disinformation, can become its own form of censorship when people face abuse and are afraid to express themselves online. But messing with Section 230 will not fix that.

Section 230 allows for a range of moderation practices, letting various online communities moderate in the best interests of their users. As private, non-government actors, platforms are free to decide what they want to leave up and what they wish to take down. 230 is what makes it possible for them to remove terrible (but not illegal) content, as well as more mundane content like spam, without getting sued for it. So if you are concerned that there is too much harmful content online, gutting 230 will only make it harder for platforms to remove it.

The other complaint about 230 is that it gives platforms too much power to filter content, and that this has led to bias against conservatives and the silencing of their viewpoints. This claim simply isn’t true, and is not backed up by data. Yes, big tech companies have too much power over what we can see and hear, and that includes suppressing controversial viewpoints on both the left and the right. But changing or killing 230 would make this problem worse. It would lead to faster, more draconian, and more haphazard moderation, and put risk-averse corporate lawyers in charge of deciding what content is allowed, centralizing decision-making in the hands of a few corporate leaders. Conservative voices would be among the first booted off, as platforms sought to avoid lawsuits from liberal-leaning groups.

Concerns about online censorship and speech are really a problem of monopoly over our attention and our speech, compounded by platforms trying to enforce community standards that are impossible to enforce equally at scale. If Facebook and YouTube didn’t have so much power, we wouldn’t care so much about their moderation decisions: if you didn’t like them, you could simply choose another platform.

If lawmakers want to tinker with 230, it is crucial to look at what happened the last time somebody poked a hole in it. The SESTA/FOSTA legislation, intended to end online sex trafficking, instead led Craigslist to shut down its “Personals” section and Tumblr to crack down on a wide range of content, including adult artwork and sex education information. This disproportionately affected sex workers and LGBTQ+ creators by shutting down spaces that kept people safer and pushing them back onto the street. Sex workers lost income, became homeless overnight, were forced to take on clients who were dangerous or unstable, and could no longer negotiate with or vet clients in advance. Two sex workers took their own lives within a month of SESTA/FOSTA’s passage. And as journalist Cathy Reisenwitz chronicles in this article, “SESTA-FOSTA hasn’t [even] reduced the rate of sex trafficking, according to sex workers, advocates, sex trafficking survivors, and even the Department of Justice. It actually [made] it more difficult for law enforcement to find and rescue victims.”

Senator Elizabeth Warren (D-MA) and Congressman Ro Khanna (D-CA) have introduced legislation to study the effects and impact of SESTA/FOSTA, and the need for that research underscores one of the biggest concerns surrounding 230: proposed amendments can have massive unintended consequences and should be approached through a “do no harm” model, with extreme care.

The good news is that there are a number of things lawmakers can and should do right now to hold Big Tech companies accountable, address their harms, and rein in their monopoly power without touching Section 230. Given the partisan gridlock around 230, and the many ways that changes to it could backfire and do more harm than good, lawmakers should focus on actual policy solutions to Big Tech’s abuses, rather than rushing through legislation just so they can say they did something. Here are a few things lawmakers should do right now:

  • Pass strong federal data privacy legislation. Social media’s biggest harm to society stems from its lack of transparency and accountability: how platforms’ algorithms operate and manipulate, how they amplify the worst speech on the internet, and how they profit by promoting that speech, via micro-targeted advertising, to the people most susceptible to it. Strong privacy rules that cut off this data harvesting are the best way to pressure the companies to do better.
  • Enforce and expand civil rights legislation.
  • Update antitrust laws. The monopoly power and unchecked acquisitions of these tech companies have stifled innovation; existing antitrust laws need a major “update” to address the realities of consumer harm in the digital age.

The suggestions above strike at the root of the harms we see from Big Tech. We need REAL policy solutions that actually address these issues, rather than turning 230 into a scapegoat: changing it could do enormous harm without fixing any of the problems we have with the internet.