When it comes to moderating content, Mark Zuckerberg probably gets the sense that Facebook is damned if it does and damned if it doesn’t.
The social network has been ridiculed in the past for not doing enough to police its platform — which, despite a scandal-filled 2018, still closed the year with 2.3 billion monthly users.
Now, Facebook is once again being skewered by its critics — despite hiring thousands of content moderators in recent years — after The Verge published a harrowing look at what it’s like to spend all day scanning the worst Facebook has to offer: racism, violence, bestiality, self-harm and murder.
For the roughly 1,000 moderators at Cognizant’s Phoenix site, a vendor used by Facebook, it’s all part of the job. And unsurprisingly, the work takes a significant mental toll. Some current and former employees told The Verge they drink on the job or smoke weed to “numb their emotions.” Others said they had developed “PTSD-like symptoms.” One former moderator, scared that angry ex-colleagues might return to the office, said he brought a gun to work each day.
While this was met with understandable horror by many tech reporters, former Facebook director and current Wired writer Antonio García Martínez argued that this is the painful but obvious byproduct of Facebook reluctantly stepping up its moderation.
“The same [Facebook] critics who call on the company to take on responsibility for moderating content (an operational job they don’t want, and had to be pressed to perform), will of course be shocked, shocked at the human cost in reviewing billions of pieces of random content,” Martinez tweeted.
Martinez’s reaction raised a valid question: What do we even expect Facebook to do?
“Social media companies like Facebook are between a rock and a hard place,” tech ethicist David Ryan Polgar told TheWrap. “If they’re more proactive and remove content, the other side says, ‘How dare you? You’re not the government.’ But then if they’re more libertarian and say it’s a marketplace of ideas, then they’re not handling their social responsibility.”
Facebook has beefed up its moderation efforts since it was roundly blamed for letting Russian trolls run bogus political ads during the 2016 U.S. presidential election. The company also added several thousand moderators in 2017 in response to videos depicting murders and teens taking their own lives. Other social platforms like Twitter have followed suit in the last two years. Currently, Facebook employs about 15,000 reviewers around the world.
In a wide-ranging discussion with Harvard professor Jonathan Zittrain earlier this month, Facebook chief Mark Zuckerberg touched on the difficulty of moderating billions of posts each day. Zuckerberg said the company wants to make sure “borderline content,” or content that approaches violating its rules, doesn’t become the most shared content on Facebook. At the same time, he acknowledged that the company doesn’t want to be in the business of determining what is and isn’t true.
“I believe very strongly that people do not want Facebook and that we should not be the arbiters of truth in deciding what is correct for everyone in the society,” Zuckerberg told Zittrain. “I think people already generally think that we have too much power in deciding what content is good.”
The challenge Facebook is “grappling with,” Zuckerberg said, is striking a balance between “free expression” on the one hand, and “safety” on the other, where users “rightfully have an expectation of us that we’re going to do everything we can to stop terrorists from recruiting people or people from exploiting children.”
Like Martinez, Polgar said this was a challenge Zuckerberg only took on to silence the company’s critics.
“Zuckerberg and [Twitter chief] Jack Dorsey, if they could shift away this power, they absolutely would,” Polgar said. “Think about how much time and PR nightmares they’re stuck in because they’re trying to clean up behavior online.”
It’s a responsibility Facebook is likely stuck with, though, now that it’s begun moderating in the first place. And if that’s the case, there are a few options that can be explored to make this undesirable job a bit more palatable.
The first fix, Polgar said, would be for Facebook to pay its moderators better. As The Verge pointed out, Cognizant reviewers earn less than $29,000 per year, roughly one-eighth of what the average Facebook employee makes. If they’re going to sift through disgusting and depressing content all day, he thinks they should at least be paid decently.
Another option would be to provide more mental health resources for its moderators. A Facebook rep told TheWrap the company was looking to “include even more regular and comprehensive focus groups” with its vendor employees than it currently does, but didn’t elaborate on specific programs it would implement.
Still, these changes wouldn’t be a panacea. Humans would still bear the mental brunt of the work. Artificial intelligence could ease that burden somewhat.
A person familiar with Facebook’s moderation efforts said the company already depends on AI to target much of the content it’s trying to eradicate, including posts related to terrorism and pornography. AI also helps to prioritize the most important posts for moderators to review, like content related to self-harm. But a future where human moderators are replaced entirely by AI is unlikely.
“There will most likely always be a need for humans as part of this process,” the person said, because of the “nuance” needed to review posts.
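To picture what “prioritizing the most important posts” might look like in practice, here is a minimal, purely hypothetical sketch of a review queue in which AI-flagged posts are ranked so that the most urgent categories, such as self-harm, reach a human moderator first. The category names, weights and function names are illustrative assumptions, not a description of Facebook’s actual systems.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical category weights: self-harm and terrorism reports jump the queue.
# These names and numbers are illustrative, not Facebook's actual policy values.
CATEGORY_WEIGHTS = {
    "self_harm": 1.0,
    "terrorism": 0.9,
    "graphic_violence": 0.7,
    "hate_speech": 0.6,
    "nudity": 0.4,
}

@dataclass(order=True)
class QueuedPost:
    priority: float
    post_id: str = field(compare=False)

def priority_score(category: str, classifier_confidence: float) -> float:
    """Combine a category weight with the AI classifier's confidence."""
    return CATEGORY_WEIGHTS.get(category, 0.2) * classifier_confidence

def enqueue(queue: list, post_id: str, category: str, confidence: float) -> None:
    # heapq is a min-heap, so negate the score to pop the highest priority first.
    heapq.heappush(queue, QueuedPost(-priority_score(category, confidence), post_id))

def next_for_review(queue: list) -> str:
    return heapq.heappop(queue).post_id

# Usage: the classifier flags two posts; the self-harm report reaches a human first.
queue: list = []
enqueue(queue, "post_123", "nudity", 0.95)
enqueue(queue, "post_456", "self_harm", 0.80)
assert next_for_review(queue) == "post_456"
```

Even in this toy version, the final call on each post still lands with a person, which is the “nuance” problem the source describes.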
This week has shined a light on the bind Facebook finds itself in. Moderating billions of posts each day is a near-impossible task that comes with a substantial human cost, and improving the effort by paying moderators better or hiring more of them would be pricey. But abandoning moderation would be a PR catastrophe and would let the ugliest content float around Facebook, which isn’t feasible from a business standpoint, as some users and advertisers would flee.
Zuckerberg made his choice to fight the worst content on Facebook, but the halfhearted attempt has once again provided ammunition for the company’s critics — the same critics Facebook was looking to placate by focusing on content moderation in the first place.