Data can be a liability as well as an asset. It's great for ad targeting or fraud detection; it's problematic when the possessor is expected to police it. Duties range from reporting the blatantly illegal to blocking the undesirable to plumbing the murky depths of content moderation.
Most tech companies are ill-equipped to deal with matters beyond technology and engineering. On Monday, the security company Cloudflare terminated services for 8chan after the forum was implicated in inspiring recent mass shootings. Cloudflare CEO Matthew Prince expressed discomfort at being an arbiter of content, even as he explained that 8chan's removal "takes the heat off of us."
Tech executives generally don't want the burden of setting boundaries for internet discourse. It's a thankless job on par with being a soccer referee or a schoolmarm. A quick look at Jack Dorsey's Twitter mentions shows a constant barrage of angry tweets complaining about Twitter's content moderation policies. As one friendly user put it, "@jack does it feel weird that everyone on the site you created hates you?"
No wonder Mark Zuckerberg requested government regulatory standards for harmful content.
With all the debate over what material is fit for public consumption, some platform providers have opted for a solution that preemptively takes power out of their own hands. Messaging apps such as Telegram and Signal employ end-to-end encryption, which prevents nonparticipants from viewing a conversation: in theory not even the company can read the messages, though in practice attackers can still compromise the devices at either end. App providers cannot be expected to moderate conversations that they can't see.
Then there are decentralized social networks such as Mastodon and Diaspora. They are similar to Twitter and Facebook, but instead of residing on a centralized platform, the content is distributed across online communities, each hosted on an independent server.
Once such software is created, it's difficult to set rules about how it can be used. Mastodon was founded to enable more active moderation than Twitter could provide, by empowering individual communities to police themselves; its founders were dismayed to see their software adopted by Gab, a social network with a hands-off approach to moderation. Creators of decentralized platforms cannot delete content, because they don't control the servers.
That's how 8chan found its way back online without Cloudflare, using an application called ZeroNet. ZeroNet is a peer-to-peer hosting network: each visitor downloads a site's content and then serves it back up to other visitors, forming a decentralized content delivery network. This effectively performs the same function as Cloudflare, except that instead of a single company controlling hundreds of servers, every visitor becomes a server.
Censorship-resistant services have provided refuge for sex workers after a new law restricted online speech about their industry, and more recently for protesters in Hong Kong. Then again, Diaspora and Telegram have both been used by the Islamic State group.
Decentralized platforms deliver the resilient communications system the internet was always intended to be. One consequence is that obnoxious opinions don't simply disappear when mainstream service providers remove them. As undesirables are banished from social networks, they find like-minded individuals in darker corners of the web. Gab.com is often described as a "safe haven" for right-wing extremists, even though its founder emphasizes that the site welcomes dissidents of all stripes. The people who seek refuge in Gab tend to be those who have been banned from Twitter, and it just so happens that a lot of them represent the far right.
Similarly, 8chan gained traction as a haven for those who had been censored on 4chan, which had previously served as a refuge for those who had been banned by SomethingAwful, a rather dark site to begin with.
Facebook moderators have described the mental distress of viewing relentlessly awful content. This must be what it's like to be an 8chan user. If society wants to reform radical extremists, it's probably not a good idea to force them into a cesspool with other radical extremists. Spend enough time in a cesspool, and eventually that cesspool starts to seem normal.
Social media bubbles do this to everyone, to some extent. The White House has accused social media companies of an anti-conservative bias; from the perspective of a right-leaning president, Silicon Valley probably does seem anti-conservative. On the other hand, Facebook, Twitter and Google repeatedly deny that their platforms are biased. Engineers in Silicon Valley may honestly believe that their opinions are representative of political neutrality.
Banning extremists from social media platforms can prevent their ideas from poisoning the public well; however, the containment strategy can render it impossible for extremists to be "renormalized" by polite society. They'll find a place to exist no matter what, so heavy-handed moderation may end up doing more harm than good.
As former president Barack Obama said to the UN, "The strongest weapon against hateful speech is not repression; it is more speech." It may be better to have the nasty voices on major social networks where they can perhaps be redeemed, not just applauded by a few like-minded nasties.
Elaine Ou is a Bloomberg Opinion columnist. She is a blockchain engineer at Global Financial Access in San Francisco.