Disinformation and human rights experts: gutting Section 230 will help Facebook and harm marginalized communities
FOR IMMEDIATE RELEASE: Thursday, March 25, 2021
Contact: press@fightforthefuture.org, (508) 474-5248
Today, Fight for the Future held a livestream event with Dr. Joan Donovan of the Shorenstein Center as well as experts from the ACLU, Wikimedia, Access Now, Woodhull Freedom Foundation, and Reframe Health and Justice, who explained why gutting Section 230 won’t stop the spread of harmful content and disinformation online.
The event came just ahead of a hearing in the House Energy & Commerce Committee where lawmakers questioned the CEOs of Facebook, Google, and Twitter. Too often, reporting around these hearings focuses only on the statements of Big Tech CEOs and lawmakers, ignoring voices from civil society groups and smaller web platforms who have a crucial perspective to share. Earlier this year we also issued a letter signed by 70+ racial justice, civil liberties, LGBTQ+, and human rights groups opposing repeal or gutting of Section 230 and urging lawmakers to pass the SAFE SEX Workers Study Act to examine the public health impact of SESTA/FOSTA before making further changes to Section 230.
During the hearing, Facebook CEO Mark Zuckerberg expressed support for changing Section 230. That’s because such changes will help Facebook and harm human rights, without addressing harms like disinformation. Here are some quotes from participants in our event:
Evan Greer (she/her), Director of Fight for the Future, said: “Of course Facebook wants to see changes to Section 230. Because they know it will simply serve to solidify their monopoly power and crush competition from smaller and more decentralized platforms. Facebook can afford the armies of lawyers and lobbyists that will be needed to navigate a world where Section 230 is gutted or weakened. And they’ve shown repeatedly that they don’t care about the impact that Section 230 changes could have on the human rights or freedom of expression of marginalized people – they are happy to sanitize your newsfeed and suppress content en masse in order to avoid liability or respond to public criticism. Zuckerberg’s support for changes to Section 230 is about maintaining Facebook’s dominance and monopoly control, nothing more. Instead of helping Facebook by gutting Section 230, lawmakers should take actual steps to address the harms of Big Tech, like passing strong Federal data privacy legislation, enforcing antitrust laws, and targeting harmful business practices like microtargeting and nontransparent algorithmic manipulation.”
Dr. Joan Donovan (she/they) of the Shorenstein Center: "The internet still exists: Platforms are built on top of it, Facebook is a product, Facebook is not the internet. Speech is like the cassette tape that goes in the boombox of the internet. The problem is messy and the solution is going to come in many different ways, there is no Section 230 magic bullet. One thing we can do that is not 230-related: We can pump up the volume on timely, local, relevant content. We can create within timelines and newsfeeds, room for local journalism, room for things that are not trying to trigger emotional responses, information that is not often shared because it is not sexy but people do want and don’t always get in their feeds. What this looks like is asking for public interest obligations for social media and this doesn’t require us to go in 230 necessarily and do anything significant. It’s really important that we all come together – universities, civil societies, the law community – and come at this with an orientation that we don’t want to destroy the benefits that the internet has brought to us, but at the same time we want to put community safety at the center of design.”
Kate Ruane (she/her) of the ACLU: “When it comes to disinformation specifically, amending Section 230 is unlikely to truly address the problem. One of the issues we face is that disinformation has no clear definition, and to the extent that it simply means ‘speech that is false,’ it will often be protected by the Constitution, for better or for worse … It’s unclear to me what Section 230 changes to address disinformation will actually do to address the problems other than encouraging platforms to continue to deploy ever stricter censorship regimes, which we know disproportionately silence people of color, the LGBTQ community, Muslims, other marginalized groups, and people who express dissenting views. But that doesn’t mean we should throw up our hands when it comes to disinformation. There is a lot we can do … meaningful privacy restrictions can also be tremendously helpful. If we limit the data these companies can collect and then empower users to limit the ways that companies can use that data, it will be harder and harder for disinformation campaigns to target people in the first place … I think we need to be talking about those things, rather than changing Section 230.”
Sherwin Siy (he/him) of the Wikimedia Foundation: "The Wikimedia Foundation hosts projects like Wikipedia–we provide the servers, and work on the software and interfaces for it–but Wikipedia is written by tens of thousands of users, who change what’s on the site several times each second. Section 230 means that, should one of those edits defame someone or cause trouble, neither the Foundation nor any other editor gets blamed for that one person’s action. It also means that the communities on these projects have the ability to create and enforce their own standards for how content gets moderated–and for the most part, that content moderation deals with how encyclopedic something is, not whether or not it’s illegal or abusive. Section 230 isn’t just about what is and isn’t decent–it’s about making sure a website, and the community on it, can set standards around things like not accepting original research, or self-promotion, or even creating standards around biographical information that respect article subjects’ rights that go beyond what’s required in the law. Having standards like these helps communities strive together to make Wikipedia as accurate and reliable as it can be, and Section 230 is a necessary part of making that happen.”
Lawrence (Larry) Walters (he/him), General Counsel for the Woodhull Freedom Foundation and attorney with Walters Law Group: “Requiring tech companies to moderate more user content through proposed Section 230 reform will not stop disinformation online, but will lead to greater censorship of constitutionally protected speech. Big Tech wants content regulation so they can claim they are simply following the law when shutting down disfavored speakers. This approach helps no one but a few large online platforms. The first attempt to tinker with Section 230, through FOSTA, was an unmitigated disaster resulting in censorship of protected expression and increased danger to sex workers. Congress should learn the hard lesson taught by FOSTA and foster a free Internet by rejecting any further weakening of Section 230 immunity.”
“Repealing Section 230 will not solve the disinformation crisis,” said Jennifer Brody (she/her), U.S. Advocacy Manager at Access Now. “Disinformation wouldn’t be effective without coercive micro-targeting, and micro-targeting wouldn’t exist without invasive data harvesting practices. If we are serious about stopping the dangerous fire hose of lies online, we cannot overlook the importance of passing a rights-respecting federal data protection law in the United States.”
“As a community that has experienced being the target of legislative reforms and their unintended consequences, sex workers and people associated with the sex trade have borne the brunt of what happens when reforms to 230 do not consider marginalized communities, or create quickly drafted, budget-neutral bills,” said Kate D’Adamo, Partner at Reframe Health and Justice and long-time sex workers’ rights advocate. “While this conversation is centered on disinformation, it is using the same flawed starting point – to assume that 230 is the problem and that additional liability is the solution. What we need is not simply additional avenues for civil suits. What we need is transparency with how platforms are making decisions, accountability and redress for those who are constantly kicked off for exercising basic survival, and a serious investment in anti-violence efforts.”
###