Facebook’s handling of Alex Jones is a microcosm of its content policy problem

[Photo caption: Twitter CEO Jack Dorsey and Facebook COO Sheryl Sandberg testify to a Senate committee on foreign influence operations]
A revealing cluster of emails reviewed by Business Insider and Channel 4 News offers a glimpse into the fairly chaotic process by which Facebook decides what content crosses the line. In this instance, a group of Facebook executives went hands-on to determine whether an Instagram post by the conspiracy theorist Alex Jones violated the platform’s community standards.
To make that determination, 20 Facebook and Instagram executives hashed it out over the Jones post, which depicted a mural known as “False Profits” by the artist Mear One. Facebook began debating the post after it was flagged by Business Insider on Wednesday for attracting antisemitic comments.
The company removed 23 of the post’s 500 comments that it deemed in clear violation of Facebook policy. Later in the conversation, some of the UK-based Instagram and Facebook executives on the thread provided more context for their US-based peers.
Last year, a controversy over the same painting erupted when British politician Jeremy Corbyn argued in support of the mural’s creator after the art was removed from a wall in East London due to what many believed to be antisemitic overtones. Because of that, the image and its context are likely better known in the UK, a fact that came up in Facebook’s discussion over how to handle the Jones post.
“This image is widely acknowledged to be anti-Semitic and is a famous image in the UK due to public controversy around it,” one executive said. “If we go back and say it does not violate we will be in for a lot of criticism.”
Ultimately, after some back and forth, the post was removed.
According to the emails, Alex Jones’ Instagram account “does not currently violate [the rules]” as “an IG account has to have at least 30% of content violating at any given time as per our regular guidelines.” That fact might prove puzzling once you know that Alex Jones got his main account booted off Facebook itself in 2018, and the company did another sweep for Jones-linked pages last month.
Whether you agree with Facebook’s content moderation decisions or not, it’s impossible to argue that they are consistently enforced. In the latest example, the company argued over a single depiction of a controversial image even as the same image is literally for sale by the artist elsewhere on both Instagram and Facebook. (As any Facebook reporter can attest, these inconsistencies will probably be resolved shortly after this story goes live.)
The artist himself sells the image’s likeness on a T-shirt on both Instagram and Facebook, and numerous depictions of the same image appear under various hashtags. And even after the post was taken down, Jones displayed it prominently in his Instagram story, declaring that the image “is just about monopoly men and the class struggle” and decrying Facebook’s “crazy-level censorship.”
It’s clear that even as Facebook attempts to make strides, its approach to content moderation remains reactive, haphazard, and probably too deeply preoccupied with public perception. Some cases of controversial content are escalated all the way to the top while others languish, undetected. Where the line is drawn isn’t particularly clear. And even when high-profile violations are determined, it’s not apparent that those case studies meaningfully trickle down to clarify smaller, everyday decisions by content moderators on Facebook’s lower rungs.
