
Facebook Caught Between Roles As Civility Cop, First-Amendment Facilitator


Facebook founder Mark Zuckerberg
The recent controversy surrounding the Facebook group "Everybody Draw Muhammad Day" raised many questions about how the Palo Alto-based company governs the online exchanges of some 400 million users from many diverse cultures and countries, particularly when members appear to violate the company's own terms of service against hate speech.

Facebook members have generated some 620 million groups, according to allfacebook.com, which cites Google's indexing.

Yet Facebook has developed a reputation for tolerating hate groups that bash Jews as well as Muslims, for seeming to remove antigay groups faster than antireligious ones, and for accepting at face value the complaints of abusive governments whose sympathizers flag legitimate opposition groups for removal as alleged violators of Facebook's terms.

These impressions of the social media giant are essentially generated by whatever mainstream media a group can get to cover its cause, as the company itself never talks about how it enforces its policies.

Facebook's Terms of Service (TOS), which have undergone many changes under constant pressure from users unhappy with various restrictions, have retained the basic bans on incitement of hate common to all Internet sites in the United States: "You will not post content that is hateful, threatening, or pornographic, or incites violence" and "you will not use Facebook to do anything unlawful, misleading, malicious, or discriminatory."

'Police Informants'

Yet like all online communities sheltering under the "Good Samaritan" provision of the 1996 Communications Decency Act and other laws on common carriers, Facebook's management maintains that it is not responsible for third-party communications disseminated on its service. Despite the enormous technical capacity it possesses to pitch personalized friending options and demographically targeted ads, Facebook says it cannot effectively police the speech of 400 million users around the world.

So like all online social media companies, Facebook falls back on the notion that users themselves have to file an abuse report in order to have a group considered for removal, creating a kind of "police informants" climate that works with wildly uneven fairness and effectiveness, depending on the issue.

Demonstrators shout slogans and wave placards as they protest against Facebook in Lahore, Pakistan, on May 26.

While Facebook's founder, Mark Zuckerberg, has often described his global sharing invention as a kind of country, even a massive community of people capable of acting in concert for good causes, unlike a real country Facebook has neither a democratically elected government nor any separation of powers or institutionalized checks and balances. Like all online services maintained by private companies, Facebook does not have to explain itself when it comes to questions about how its TOS is formulated and enforced.

Facebook never publicizes the complaints it receives about hate groups nor reports on how it acts on them. Often groups are removed by creators themselves under peer pressure and not by the company itself. There is no "police blotter" in this global village letting the public know what sort of complaints have come in -- nor whether they are legitimate.

Without that sort of transparency, it is all too easy for disgruntled governments in power to move against oppositions using Facebook, as apparently happened in Hong Kong before elections and before the Summer Olympics in Beijing.

'Difficult Decision'

Anti-Semitic and Holocaust denial groups have been common on Facebook, and the company seemed impervious to cries to have them removed. Last year, much like the current controversy over anti-Muslim groups, some Facebook users complained about two groups in particular -- "Holocaust: A Series of Lies" and "Holocaust is a Holohoax." In a widely read post, "Jew Haters Welcome At Facebook, As Long as They Aren't Lactating," popular tech blogger Michael Arrington marveled that Facebook used its own search technology to delete pictures of breast-feeding babies as indecent, but let Holocaust denial groups remain.

Despite Arrington’s highly publicized protest, Facebook did not immediately act.

"It's a difficult decision to make. We have a lot of internal debate and we bring in experts to talk about it," Facebook executive Barry Schnitt told CNN in May 2009. "Just being offensive or objectionable doesn't get it taken off Facebook. We want [the site] to be a place where people can discuss all kinds of ideas, including controversial ones."

Ultimately, it took litigation to have the groups removed.

Facebook is perceived by many to have moved faster to take down antigay hate groups, but that may be because it was lobbied hard by numerous organizations in the lesbian, gay, bisexual, and transgender (LGBT) community to do so, and Facebook relies on multiple user complaints to make its decisions.

How To Apply It Online

Groups trying to campaign for civility have a hard time in the U.S. context. The First Amendment applies only to the government, not to private companies, which are required neither to protect free speech nor to regulate it. To be sure, there are limits on the First Amendment in various Supreme Court rulings such as Brandenburg v. Ohio (1969), which held that only "incitement to imminent lawless action" can be punishable by law. This notion has increasingly been tested by hate crimes such as cross-burning and threats against abortion doctors made online, and judges have developed a concept of "true threats" in ruling that certain violent speech is not protected. Yet they continue to struggle with how to apply it online, where the imminence is not always clear.

Facebook executives seem to hide behind their users in these cases, claiming they need to receive abuse reports -- and sometimes they seem to wait for prosecutors to take action before they will. For example, in Lithuania earlier this month two antigay groups were created to protest against the Baltic Pride gay parade in Vilnius. The Lithuanian Center for Human Rights brought the groups to the attention of the authorities, as Article 170 of the Lithuanian Criminal Code provides for punishment of incitement of hatred or discrimination based on sexual orientation. The groups subsequently disappeared.

British police also recently investigated a Facebook campaign said to incite race hatred ahead of the World Cup in South Africa in June; the group, joined by some 15,000 members, was said to be opposed to whites and made up of alleged supporters of Julius Malema, head of the youth wing of President Jacob Zuma's African National Congress. The group was closed.

But while private companies in the United States, unbound by the First Amendment, are free to set any sort of speech code or membership criteria they wish, they resist becoming too specific and too active in policing speech; freedom of association essentially trumps outside efforts to impose civility, as well as freedom of speech itself. In Boy Scouts of America v. Dale, the Supreme Court established that private groups can exclude whom they want, including gays.

Human rights advocates wish that Facebook would use its freedom from First Amendment constraints to enforce tolerance rules without fear of litigation. The danger of that approach is that it strengthens the corporation rather than the Constitution as the regulator of speech, and enables companies to act as arbitrarily as they like, depending on the personal beliefs of investors or executives.

Function As Local Governments

Government regulation of social media, much as with television, radio, and video games, would help enforce the First Amendment when the companies themselves violate it by suppressing criticism of their own actions, and would compel them to be more transparent about what kinds of complaints they receive and how they handle them. Such an approach might also help enforce statutes against online bullying and stalking, making them criminal offenses as they would be offline.

A key consideration for policy-makers looking at the modern application of the First Amendment since the boom in online social media is whether or not companies providing social media services function as local governments of sorts. In the landmark case of Marsh v. Alabama (1946), a Jehovah's Witness was permitted to distribute leaflets on a town's sidewalk even though it was owned by a private corporation. Yet the Silicon Valley titans have so far eluded the "company town" analogy, thanks to the 2009 case of Estavillo v. Sony, in which a user sued over his expulsion from the PS3 gaming network on free-speech grounds. The judge said the network "does not serve a substantial portion of a municipality's functions, but rather serves solely as a forum for people to interact subject to specific contractual terms."

Increasingly, as people move online not only to surf news and entertainment sites but to interact intensively on social media pages and to engage with others through such services on their cell phones, the civic space is no longer the town square or the town hall but a virtual commons run by various private corporations with arbitrary TOS. The First Amendment no longer has a functional space in which to operate beyond the public streets and squares of the real world.

Lawmakers must realize that the public square is now largely online and mainly controlled by private companies -- companies that resist any regulation as an interference with their own right to publish. Yet ultimately, the mounting complaints of users about the inflaming of hatred that characterizes online life, as well as the platform providers' secrecy and impunity over their inaction, will compel lawmakers to regulate social media as they have long done with broadcast media, movies, and video games, as much to enforce freedom of speech as to prevent incitement of violence.

Catherine A. Fitzpatrick is a freelance writer specializing in human rights in Eurasia and the author of a blog on the OSCE. The views expressed in this commentary are the author's own and do not necessarily reflect those of RFE/RL.
