By Jeremy Malcolm, Cindy Cohn, and Danny O’Brien for The Electronic Frontier Foundation

In the wake of Charlottesville, both GoDaddy and Google have refused to manage the domain registration for the Daily Stormer, a neo-Nazi website that, in the words of the Southern Poverty Law Center, is “dedicated to spreading anti-Semitism, neo-Nazism, and white nationalism.” Subsequently, Cloudflare, whose service was used to protect the site from denial-of-service attacks, has also dropped it as a customer, with a telling quote from Cloudflare’s CEO: “Literally, I woke up in a bad mood and decided someone shouldn’t be allowed on the Internet. No one should have that power.”

We agree. Even for free speech advocates, this situation is deeply fraught with emotional, logistical, and legal twists and turns. All fair-minded people must stand against the hateful violence and aggression that seems to be growing across our country. But we must also recognize that on the Internet, any tactic used now to silence neo-Nazis will soon be used against others, including people whose opinions we agree with. Those on the left face calls to characterize the Black Lives Matter movement as a hate group. In the Civil Rights Era cases that formed the basis of today’s protections of freedom of speech, the NAACP’s voice was the one attacked.

Protecting free speech is not something we do because we agree with all of the speech that gets protected. We do it because we believe that no one—not the government and not private commercial enterprises—should decide who gets to speak and who doesn’t.

What Happened?

Earlier this week, following complaints about a vitriolic and abusive Daily Stormer article on Heather Heyer—the woman killed when a white nationalist drove a car into a crowd of anti-racism demonstrators—GoDaddy told the site’s owners that they had 24 hours to take the domain off its service. Daily Stormer subsequently moved the domain to Google’s domain management service. Within hours Google announced that it too was refusing Daily Stormer as a customer. Google also placed the dailystormer.com domain on “Client Hold”, which means that Daily Stormer’s owner cannot activate, use, or move the domain to another service. It’s unclear whether this is for a limited amount of time, or whether Google has decided to effectively take ownership of the dailystormer.com domain permanently. Cloudflare, whose service was used to protect the site from denial-of-service attacks, subsequently dropped the site as a customer.
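
For readers who want to check this kind of status themselves: “Client Hold” is one of the standard registry status codes defined by the EPP protocol and exposed through RDAP, the structured successor to WHOIS. The sketch below is a minimal illustration of how such a lookup might work, using only Python’s standard library and the public rdap.org redirector; “example.com” is simply a placeholder domain, not any of the domains discussed here.

    # Minimal illustration: query a domain's registry status codes over RDAP.
    # A domain that has been suspended will typically list "client hold"
    # among its status values. "example.com" is a placeholder, and rdap.org
    # is a public bootstrap redirector assumed to be reachable.

    import json
    import urllib.request

    def registry_status(domain):
        """Return the list of RDAP status values reported for a domain."""
        url = "https://rdap.org/domain/" + domain
        with urllib.request.urlopen(url) as response:
            record = json.load(response)
        return record.get("status", [])

    if __name__ == "__main__":
        for status in registry_status("example.com"):
            # Typical values: "client transfer prohibited", "client hold", ...
            print(status)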

We at EFF defend the right of anyone to choose what speech they provide online; platforms have a First Amendment right to decide what speech does and does not appear on their platforms. That’s what laws like CDA 230 in the United States enable and protect.

But we strongly believe that what GoDaddy, Google, and Cloudflare did here was dangerous. That’s because, even when the facts are the most vile, we must remain vigilant when platforms exercise these rights. Because Internet intermediaries, especially those with few competitors, control so much online speech, the consequences of their decisions have far-reaching impacts on speech around the world. And at EFF we see the consequences firsthand: every time a company throws a vile neo-Nazi site off the Net, thousands of less visible decisions are made by companies with little oversight or transparency. The precedents being set now will shape whether those removals are just. Here’s what companies and individuals should watch for in these troubling times.

Content Removal at the Very Top of the Internet

Domain registrars are one of many types of companies in the chain of online content distribution—the Internet intermediaries positioned between the writer or poster of speech and the reader of that speech. Other intermediaries include the ISP that delivers a website’s content to end users, the certificate authority (such as Let’s Encrypt) that issues an SSL certificate to the website, the content delivery network that optimizes the availability and performance of the website, the web hosting company that provides server space for the website, and even communications platforms—such as email providers and social media companies—that allow the website’s URLs to be easily shared. EFF has a handy chart of some of those key links between speakers and their audience.

The domain name system is a key part of the Internet’s technical underpinnings, which are enabled by an often-fragile consensus among many systems and operators. Using that system to edit speech, based on potentially conflicting opinions about what can be spoken on the Internet, risks shattering that consensus. Domain suspension is a blunt instrument: suspending the domain name of a website or Internet service makes everything hosted there difficult or impossible to access. The risk of blocking speech that wasn’t targeted is very high.
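
To see concretely how blunt the instrument is, consider the small illustrative sketch below (standard-library Python; the domain and hostnames are hypothetical placeholders). When a registry places a domain on hold, the delegation disappears from the top-level zone, so every name under that domain stops resolving, not just the page that drew the complaint.

    # Illustrative sketch of why domain suspension affects everything at once.
    # If "example.org" were placed on hold, its delegation would vanish from
    # the .org zone, and every lookup below would fail the same way: the
    # website, mail, and anything else hosted under the domain.
    # The domain and hostnames here are hypothetical placeholders.

    import socket

    DOMAIN = "example.org"
    HOSTNAMES = [DOMAIN, "www." + DOMAIN, "mail." + DOMAIN, "blog." + DOMAIN]

    for name in HOSTNAMES:
        try:
            address = socket.getaddrinfo(name, 443)[0][4][0]
            print(name, "resolves to", address)
        except socket.gaierror as error:
            # For a suspended domain, the resolver cannot find the zone at
            # all, so each of these lookups fails identically.
            print(name, "does not resolve:", error)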

Domain name companies also have little claim to be publishers, or speakers in their own right, with respect to the contents of websites. Like the suppliers of ink or electrical power to a pamphleteer, the companies that sponsor domain name registrations have no direct connection to Internet content. Domain name registrars have even less connection to speech than a conduit provider such as an ISP, as the contents of a website or service never touch the registrar’s systems. Registrars’ interests as speakers under the First Amendment are minimal.

If the entities that run the domain name system started deciding who could register or keep a domain based on political considerations, we might well face a world where every government and powerful body would see itself as an equally or even more legitimate wielder of that power. That makes the domain name system both unsuitable as a mechanism for taking down specific illegal content, as the law sometimes requires, and a perennially attractive central location for nation-states and others to exercise much broader takedown powers.

Another lever that states and malicious actors often reach for when seeking to censor legitimate voices is the denial-of-service attack. States and criminals alike use these attacks to silence voices, and the Net’s defenses against them are not well-developed. Services like Cloudflare can protect against these attacks, but not if they also face direct pressure from governments and other actors to pick and choose their clients. Content delivery networks are not wired into the infrastructure of the Net in the way that the domain name system is, but at this point, they may as well be.

These are the parts of the Net that are most sensitive to pervasive censorship: they are free speech’s weakest links. That is why millions of net neutrality advocates are concerned about ISPs censoring their feeds, and why we should be concerned when the handful of global payment processors unite to block certain websites (like WikiLeaks) worldwide. These weak links are both the most tempting and the most egregiously damaging places to filter the Net.

The firmest, most consistent defense these potential weak links can take is to simply decline all attempts to use them as a control point. They can act to defend their role as a conduit, rather than a publisher. And just as law and custom developed a norm that we might sue a publisher for defamation, but not the owner of the printing press or the newspaper vendor, we are slowly developing norms about who should take responsibility for content online. Companies that manage domain names, including GoDaddy and Google, should draw a hard line: they should not suspend or impair domain names based on the expressive content of websites or services.

Have a Process, Don’t Act on the Headlines

Other elements of the Net risk less when they are selective about who they host. But even for hosts, there’s always a risk that others—including governments—will use the opaqueness of the takedown process to silence legitimate voices. For any content hosts that do reject content as part of the enforcement of their terms of service, or are pressured by states to secretly censor, we have long recommended that they implement procedural protections to mitigate mistakes—specifically, the Manila Principles on Intermediary Liability. The principles state, in part:

  • Before any content is restricted on the basis of an order or a request, the intermediary and the user content provider must be provided an effective right to be heard except in exceptional circumstances, in which case a post facto review of the order and its implementation must take place as soon as practicable.
  • Intermediaries should provide user content providers with mechanisms to review decisions to restrict content in violation of the intermediary’s content restriction policies.
  • Intermediaries should publish their content restriction policies online, in clear language and accessible formats, and keep them updated as they evolve, and notify users of changes when applicable.

These are methods that protect us all against overbroad or arbitrary takedowns. It’s notable that in GoDaddy and Google’s eagerness to swiftly distance themselves from American neo-Nazis, no process was followed; Cloudflare’s Prince also admitted that the decision was “not Cloudflare’s policy.” Policies give guidance as to what we might expect and an opportunity to see that justice is done. We should think carefully before throwing them away.

It might seem unlikely now that Internet companies would turn against sites supporting racial justice or other controversial issues. But if there is a single reason why so many individuals and companies are acting together now to unite against neo-Nazis, it is because a future that seemed unlikely a few years ago—where white nationalists and Nazis have significant power and influence in our society—now seems possible. We would be making a mistake if we assumed that these sorts of censorship decisions would never turn against causes we love.

Part of the work for all of us now is to push back against such dangerous decisions with our own voices and actions. Another part of our work must be to seek to shore up the weakest parts of the Internet’s infrastructure so it cannot be easily toppled if matters take a turn for the (even) worse. These actions are not in opposition; they work toward the same ends.

We can—and we must—do both.

source: https://www.eff.org/deeplinks/2017/08/fighting-neo-nazis-future-free-expression