PEN AMERICA: FAQs on free speech and the Capitol Hill insurrection

This statement was originally published on pen.org on 11 January 2021.

Last week’s lawless insurrection on Capitol Hill raised a series of questions about free speech, the First Amendment, and protest rights. Should presidents be banned from Twitter and Facebook? Should tech companies refuse to host social networks like Parler? What defines “hate speech” and “incitement”?

Here, PEN America offers a guide to these thorny issues. These debates are sure to continue to simmer over the weeks to come, and we’ll be updating this post periodically as events dictate.

Was Twitter right to ban Trump?

There is no question that, under the First Amendment, Twitter has the right to police content on what is a private platform – the First Amendment protects speech from government intrusion, not from the actions of private actors. There is also little dispute that Trump’s many incendiary, dishonest, and hateful tweets violated the platform’s rules. But Twitter has asserted that speech by government leaders has unique value to the public, and has allowed such content and accounts to remain active even when they violate the rules. At PEN America, we supported that approach back in October 2019, as the U.S. entered an election year, because we saw value in the public being able to see and evaluate an unvarnished Trump on Twitter.

The election is now behind us, and there’s clear evidence that Trump’s tweets have not only denigrated truthful accounts of democratic processes but also sown insurrection; as such, we think Twitter is justified in disabling his account. Overall, though, the platform has a lot more work to do to ensure transparency and accountability in how it moderates content, to explain how special criteria for leaders on the platform are applied, and to ensure that users who believe their content has been unjustifiably suppressed have ready and adequate recourse to appeal.

Should Apple, Google, Amazon, and others disable Parler from their app stores and servers? What is Parler, anyway?

Parler is a Twitter-like social media platform that has become a favorite among conservatives who believe mainstream social media outlets like Facebook and Twitter are biased against them. But it has also played host to calls for insurrection and violence. According to the app stores and services that have suspended it, Parler has failed to enforce its own content moderation policies, in violation of their terms of service. Apple, Google, and Amazon are private companies, and they are free to apply their terms of service and choose whether or not to host Parler on their app stores or servers. Whether they are doing so out of genuine concern about the risk of violence, out of a corporate inclination toward risk avoidance, or out of some combination of the two is impossible to know.

That said, we are troubled by the prospect of providers of dominant internet services making spur-of-the-moment value judgments that affect entire platforms hosting many speakers, many of whom may not be violating any rules. While the content adjudication mechanisms of Google, Facebook, Twitter, and other social media platforms remain deeply flawed, it is even less clear that companies like Amazon and Apple have the staffing or policies to come to grips with business decisions that may have sweeping ramifications for who can and cannot be heard in our public discourse.

When Zoom barred an individual speaker from using its platform because of her political views, we protested. Amazon, Apple, and Google have said that their ban on Parler is not about specific views being advanced but rather about the platform’s persistent failure to enforce its own policies barring hateful and inciting content. It is critical that those service providers spell out their policies and verification methods with respect to what they require of the applications they host or otherwise support, and provide details on how Parler’s conduct fell short – and that they be able to do so in any such future cases. Otherwise, the risk is that all types of controversial speech could be summarily shut down by internet services that shy away from any reputational or political risk.
