Constitutional Law

Aug. 14, 2019

When online speech platforms remove speakers, due process is needed

Saira A. Hussain

Staff Attorney, Electronic Frontier Foundation

Phone: 415-436-9333

Email: saira@eff.org

Saira is on EFF's civil liberties team, focusing on racial justice and surveillance.

In our country's fraught search to do something to stop mass violence, and hold someone or something accountable for conduct that has caused unspeakable pain and loss, many are looking to online speech forums.

Much of the focus has revolved around 8chan, an online forum on which the El Paso shooter allegedly posted a manifesto prior to carrying out his attack. After facing intense public pressure, Cloudflare, a DDoS protection service provider, dropped 8chan as a customer, as it had previously done to the white supremacist site the Daily Stormer. In the days since, lawmakers have suggested misguided fixes ranging from repealing Section 230 of the Communications Decency Act -- which protects online platforms from legal liability for user-generated content -- to developing tools for social media to predict who will become a mass shooter. Even apart from these legislative proposals, many continue to call for technology companies -- both platforms that directly host speech and those that provide the services that allow speech sites to exist online -- to deplatform hate speech.

But figuring out what constitutes hate speech and how to remove it has companies approaching censorship with a hammer rather than a scalpel.

Without a doubt, many of the views shared on 8chan and the Daily Stormer are repugnant. However, the same actions taken to silence hateful but legal speech today can be used tomorrow to silence marginalized voices. In fact, after Cloudflare deplatformed the Daily Stormer, it received 7,000 requests to deplatform other websites from across the political spectrum.

Indirect intermediaries like Cloudflare -- including web hosting services, DNS providers, internet service providers, and domain name registrars -- are generally far removed from the speech itself. Often, their deplatforming actions take down an entire website or cut off services to other "innocent bystander" websites. The indirect intermediary takedown cases we have handled at EFF demonstrate that it is often large corporations that have the time, money, and attorneys to go after smaller entities in an attempt to deplatform them. Shell Oil, for example, sent a takedown notice to an ISP that resulted in the removal of the activist group Oil Change International's website after the group launched a campaign critical of Shell's sponsorship of the New Orleans Jazz Festival.

Direct hosts like Facebook and Twitter have not fared much better in wielding a scalpel rather than a hammer. Takedown decisions are often left to underpaid and overstressed contractors, with Facebook removing nearly 300,000 hate-speech posts a month. Those takedowns have silenced marginalized voices: platforms have removed conversations among women of color about the harassment they receive online, banned lesbians for referring to themselves as "dykes," and deleted the account of an anti-torture activist critical of the Egyptian government.

We caution against deplatforming, especially by indirect intermediaries. But if an online platform at any level decides to remove a speaker, it must follow steps that are clearly laid out.

First, companies should deplatform speakers only in rare circumstances, and should do it only after careful consideration, applying predetermined and clear standards that are free from government or corporate influence.

Second, companies should strive for as much transparency as possible in their decision-making, providing notice to both the impacted users and the general public.

Third, companies should publish information about takedowns, including the number of users or accounts flagged and the number of users or accounts that the company deplatformed.

Fourth, companies must contend with the fact that any policies or systems put into place to eliminate hate speech will be exploited by other entities -- including governments -- and take active steps to prevent misuse and promptly identify and correct inevitable mistakes.

At EFF, we helped write the Santa Clara Principles to outline some basic transparency and due process standards that companies should implement when they directly host user-generated content.

Ultimately, we must grapple with whether we want private companies to have the absolute power to make these decisions. The First Amendment protects our right to free expression without government intrusion, but what does it mean when an online forum hosted and powered by private companies becomes the de facto town square? 
