What You Can't Say on Facebook

After years of pleas from activists and users, Facebook publicly released a version of its Community Guidelines on Tuesday—thousands of words that attempt to describe what you can’t say on the service.

Or, more precisely, the document spells out what Facebook will take down if users alert it to the offending content. The text lays out Facebook's first principles of "safety," "voice," and "equity," and demonstrates how hard those are to turn into operational dictums.

The big platforms all have a document like this one, and Facebook's is an exemplar of the genre. Rochelle LaPlante, an expert content moderator through her work on Amazon Mechanical Turk, has seen many sets of guidelines like this one. "There's nothing particularly unusual or strange that stands out," LaPlante told me. "I'm impressed by the transparency and really glad they go into the level of detail that they do."

A close reading of the text shows that this is a manual of adjudication, designed to provide guidance for humans who are trying to decide what to do with individual posts, comments, pictures, and videos. At times, the guidelines are remarkably broad, at others bizarrely precise; the document smells of high-minded ideals and sweaty-pitted compromise forged in reaction to news events.

For example, almost 20 percent of the harassment section (47 of its 247 words) is dedicated to claims that victims of tragedies are "crisis actors":

[Do not] target victims or survivors of violent tragedies by name or by image, with claims that they are

  • Lying about being a victim of an event
  • Acting/pretending to be a victim of an event
  • Otherwise paid or employed to mislead people about their role in the event.

Why spell all this out here? Perhaps because of the bad press the company drew recently, after conspiracy theories that survivors of the Parkland shooting were crisis actors spread across the platform.

In the child-abuse section, the guidelines note specifically that videos depicting “tossing, rotating, or shaking of an infant (too young to stand) by their wrists/ankles, arms/legs, or neck” will be considered videos of child abuse. Why is the parenthetical “too young to stand” necessary? Wouldn’t doing the same thing to a 2-year-old qualify? The phrasing suggests that there is some specific case where this was relevant, even if it is hard for us to imagine what it might have been. And it implies that the document records some subset of the exceptions and difficult decisions that the company has come to.

The forum for such decisions is known. Monika Bickert, Facebook's vice president of global policy management, has described a regular meeting at the company as a "mini legislative session," in which teams from across Facebook come together to agree on what to include in the community guidelines.

If the policy meeting writes the legislation, the content moderators then try to apply the law to individual cases. This "legal system," to keep with the governmental metaphor, metes out decisions. But what kind of institutional memory does Facebook preserve of particularly hard calls or mistakes? And what does the escalation process look like when an individual content moderator cannot make a judgment?

Over the last decade, Facebook users have become accustomed to the existence of these documents, but guidelines operating at this scale are historically unprecedented. The closest analogue I can think of is the covenants, conditions, and restrictions that sometimes govern real-estate developments. Except that this planned community serves not a few hundred people, but billions.

These guidelines are needed because the social platforms have created new conditions for humans, and there is no plausible mechanism for people to work things out in the ways that they have in the past. The platforms turned relationships into entities with infinite memory, searchability, and concreteness. To build the social graph, to create models of the human social world, the platforms had to fix what we say to each other in the normal course of life into text, photos, and videos.

While boundaries of acceptable discourse have always existed, they could remain fuzzy and vague, human-scale. In a real-world community, no "Community Guidelines" beyond actual laws exist, because the people themselves police those boundaries, not a quasi-governmental entity in the form of a corporation's content moderators. The guidelines are for the third party (i.e., Facebook) that has inserted itself between and inside and around human communication.

It is true that a document like this is necessary for Facebook to function. It is true that Facebook must hire even more than the 7,500 content moderators it now employs. It is true that this is a nearly impossible job that will leave many people unsatisfied with the decisions that Facebook makes.

But all these dilemmas only exist because Facebook has centralized so much power within its network. It’s important not to normalize this power, even as Facebook becomes more transparent about how the company wields it.


