What is 'free speech' on the web — in theory and in practice?
July 22, 2016
By Gene Policinski
Inside the First Amendment
Who can say what, on the Web?
Twitter has raised questions anew with reports of a lifetime ban on tweets from conservative blogger Milo Yiannopoulos — reportedly after complaints that he engineered a wave of racist and sexist comments directed against comedian and actress Leslie Jones, who is co-starring in the latest "Ghostbusters" movie.
Yiannopoulos is an editor on the conservative blog site Breitbart.com whose posts frequently create controversy on the web. He responded to the reported Twitter action by saying, "Anyone who cares about free speech has been sent a clear message: You're not welcome on Twitter." He also called the ban "cowardly."
Twitter would not confirm the action against Yiannopoulos but issued a statement saying, "People should be able to express diverse opinions and beliefs on Twitter. ... But no one deserves to be subjected to targeted abuse online, and our rules prohibit inciting or engaging in the targeted abuse or harassment of others."
Jones wrote earlier in the week about a decision to end her own Twitter account, which was targeted with racist tweets — some using pictures of apes (one from a person identified only as "KKK Cool J"), and others with racial epithets.
"I used to wonder why some celebs don't have Twitter accts.," she wrote. "Now I know. You can't be nice and communicate with fans 'cause people crazy. As much as I love live-tweeting, posting the pics of awesome things that happen in this life I've been blessed with, I don't know anymore."
For those who claimed the Twitter action — which by its terms would be a "permanent suspension" — was illegal or "the end of free speech on the web," the response is, it's neither. Twitter is a private company, and the First Amendment does not prevent private companies from determining what they will or won't permit in the spaces — broadcast, print or web — that they own.
As to the future of free speech on the web, there's plenty left — but we are just starting to work out the kind of legal and social rules about content, tone and manner that have evolved over decades for other kinds of communications.
Social media and other websites now regularly monitor postings to look for images, videos and text from groups like ISIS that once went up unfiltered. Where early web advocates once touted the ability of the internet to provide millions around the world the opportunity to converse, so-called "chat rooms" and comment areas are closed or closing because conversations and posts quickly veer into profane, defamatory or scatological exchanges bereft of any real benefits expected from freedom of speech.
Twitter acknowledged that its current policies on objectionable content and abusive behavior — particularly by those it called "repeat offenders" — are being tested, and not just by Yiannopoulos: "We know many people believe we have not done enough to curb this type of behavior on Twitter. We agree. We have been in the process of reviewing our hateful conduct policy to prohibit additional types of abusive behavior and allow more types of reporting, with the goal of reducing the burden on the person being targeted."
Newseum CEO Jeffrey Herbst has written and spoken about the challenges of digital "etiquette." In a speech at The Media Institute earlier this year, Herbst said that more speech is generally a better response to speech you don't like, and that "hate speech" is often protected by the First Amendment.
But he told the group there is room for civility online without curtailing freedom of expression: "With rights come responsibilities. We have not really thought through our responsibilities when it comes to the web." He also called for a move away from anonymity — which marked an overwhelming number of the disgusting comments about Jones that I could find in a net search.
Herbst called anonymous comments and posts a significant contributor to the "crisis of civility" online and, subsequently, in society. While noting some unnamed speech must be protected, such as whistleblowers reporting misdeeds, Herbst suggested an online campaign: "Our message should be incessantly to everyone, starting with young people, that it does not count unless you put your name on it."
As offensive to some as Twitter's ban may be, it undeniably is another example of where we collectively may be staking out the boundaries of what can and cannot be posted — sometimes in fits and starts prompted by events. Print publications and broadcast outlets — with some measure of government involvement in the latter due to public ownership of the airwaves — have gone through the cycle in earlier times.
News operations have developed their own guidelines to restrain "live" TV coverage of police chases, threatened suicides and such. Journalism groups have debated and reshaped ethics codes. Network television standards have changed to permit language and images that never would have been seen a generation ago.
The speed, volume and persistence of online posts raise new questions around rules and regulations regarding defamation and harassment developed in an earlier media era — and for relatively new spaces of social media, where private "terms of service" rather than government statutes and court decisions over time have determined a measure of what's acceptable and what's not.
If users agree with where Twitter eventually sets its rules, it will continue to prosper. If not, assuredly the next new thing in social media will pop up, get popular and likely start the process all over again.
This latest Twitter flap is not the end of free speech on the web. But it's certainly a sizeable milepost in the ongoing discussion of what we want to be said freely online.
Gene Policinski is chief operating officer of the Newseum Institute and senior vice president of the Institute’s First Amendment Center. He can be reached at email@example.com. Follow him on Twitter: @genefac.