Life as a Facebook moderator: ‘People are awful. This is what my job has taught me’

Chris Gray, one of the former employees suing the tech giant, gives talk about gruelling work

Chris Gray, a former Facebook content moderator, outside the company’s office at Grand Canal Dock. Photograph: Alan Betson

Chris Gray was a community operations analyst for Facebook. His job was content moderation – reviewing user-reported content that ranged from puppies (animal rights activists frequently reported "puppies for sale" posts) to pornography and torture, rape, murder and child abuse. Gray is one of a number of former content moderators taking legal action against Facebook for psychological trauma experienced as a result of this work.

Last week, facilitated by technology ethics collective Tech Won’t Build It and TU Dublin, Gray decided to take his story to the people: he gave a public talk about his experiences while working for Facebook and live-streamed the event for any and all to watch.

Gray said that the flagged content he dealt with was a mixed bag: tone-deaf jokes, a photograph of a young child dressed as Adolf Hitler that might constitute hate speech, topless images, two users reporting each other in the midst of a petty argument – and then, inevitably, the graphic, the upsetting and the inhumane would surface.

“You’ll be scrolling through stuff like this and making decisions and then you get a truckload of very scared people being unloaded by men with machine guns somewhere in the Middle East. They’re lining them up and a trench has been dug in the ground. And you know what’s going to happen but you have to keep watching until the shooting starts and even after to ensure that you make the right decision,” he explains.


The “right decision” is a minefield, according to Gray. He spent most of his time at Facebook on what is known as the high priority queue: bullying, hate speech, and other content that has to be dealt with within 24 hours of the report being filed.

“This stuff comes in and, bang, it pops up on my queue as quickly as that. I’m making decisions and it’s just, bang, and the next one, bang, and the next one.”

Gray and other content moderators work according to Facebook’s implementation standards, a list of rules they must be familiar with and follow. At the time he worked there – just over two years ago – this list ran to more than 10,000 words of “very detailed, densely-written text”, but since then moderators have spoken about what Gray refers to as “mission creep”: the rule book gets bigger as more rules are added over time.

“And then you see something and you have a question because [after consulting the implementation standards] you’re not sure. So you look to the Known Questions document and that’s another 50,000 words of clarification. And there’s another document called Operational Guidelines, which is 5,000 to 6,000 words telling you how to do your job.

“Then there’s all the training material: 18 PowerPoint presentations with a wall of words on every slide. So can you imagine the cognitive load, the amount of stuff that has to be going on in your head to process this content?” he explains.

Guidelines

I asked Gray if, during his time at Facebook, he was privy to information on who developed and amended these guidelines and how they were arrived at. He said moderators were simply given the guidelines and told to follow them and never remove them from the premises.

Facebook's official word on this doesn't offer much insight: "Our content policy team writes these rules, drawing on their expertise in everything from counterterrorism to child sexual exploitation – and in close consultation with experts around the world," said Ellen Silver, vice-president of operations, back in 2018.

A document leaked by a Facebook employee to the New York Times in 2018 showed how complex judgments on problematic content appeared to be reduced to binary decisions that were not always internally consistent. And there is a hierarchy to the decision-making process: it is not enough to take down illegal, offensive or upsetting content, you must take it down for the right reason.

“The decision making is super granular,” says Gray. “I’ve stated to my lawyers that there were about a hundred possible decisions to make on any given piece of content while I was there. I’ve just seen a news report saying that it’s now 250.


“If you make the right action for the wrong reason, eg you deleted it because there was a naked man in the image but that naked man was doing something which is also illegal. One of those actions is more important than the other and you have to choose the right one. Otherwise you get it wrong and then you’re in an argument with your auditor.”

The auditors or superiors monitor a sample of reports from each moderator. It is in the moderator’s interest to try to get the point back by defending their decision because they have to meet a target of 98 per cent accuracy, Gray says. And it is in the auditor’s interest not to row back on their decision because they also have auditors reviewing their decisions.

“A typical sample that is audited would have been 200-250 tickets a month, so on average you might be flagged for one mistake a week. You spend the week trying to get those back. You’re not focused on your work, you’re focused on justifying your decisions.”

Again, Silver offered Facebook’s official line on this back in 2018: “A common misconception about content reviewers is that they’re driven by quotas and pressured to make hasty decisions. Let me be clear: content reviewers aren’t required to evaluate any set number of posts – after all, nudity is typically very easy to establish and can be reviewed within seconds, whereas something like impersonation could take much longer to confirm.”

“We provide general guidelines for how long we think it might take to review different types of content to make sure that we have the staffing we need, but we encourage reviewers to take the time they need.”

Gray’s voice breaks as he tells the audience about the moment he felt there was something deeply wrong with this system. He was debating with his auditor the reason for taking down a particular image. (Warning: the next paragraph contains upsetting imagery).

“There’s a baby lying on the ground and somebody has their foot on the baby’s chest. The baby’s eyes are closed and its arms are out flat. For me that baby has just suffered a violent death. My auditor wasn’t sure that the baby was dead and so we get into an argument about how you can tell the baby is dead.

“And we know each other. This is not an acrimonious exchange. We’re just having a chat about who’s right and who is wrong and the fact that we’re discussing the dead baby doesn’t matter and you’re losing your humanity.”

Gray talks about his time as a Facebook moderator as being on the front line. At first glance this might seem hyperbolic, but when you consider the consequences of taking down certain material, it becomes apparent that these workers are asking themselves questions often asked by disaster survivors. Is there anything I could have done differently? Could I have saved that person’s life?

In May 2019, a 16-year-old girl in Malaysia posted a poll on Instagram asking her "friends" if she should live or die. The majority of followers – 69 per cent – voted for death and she took her own life.

“Looking at that as a content moderator, my first thought is why was that not taken down? It wasn’t taken down because it wasn’t reported. Not only did the majority of this girl’s friends vote that she should die, nobody actually took any action to protect their friend, to show any concern for welfare.

“And the takeaway I want you to remember from this kind of crap is people are awful. This is what my job has taught me. People are largely awful and I’m there behind my desk doing my best to save the world. But if no one tells the moderator what’s going on, they don’t know.”

"Mark Zuckerberg, my boss [at the time], he has an opinion on what I think about this, but then he goes on to say that there's a whole lot of us and we might have different experiences. Which is just a polite way of saying when you've got a lot of people, somebody's going to be affected by this stuff."

Fallout

It may be the case that Facebook and the companies it uses to hire content moderators (CPL, Accenture and others) are beginning to see the fallout from exposing employees to this kind of content in a high-pressure work environment, supported only by “wellness” offerings that are not licensed to treat PTSD or other psychological trauma.

A current job description for “community operations analyst” on the CPL website states: “Candidates in this position must be able to deal with extreme, graphic and sensitive content.” Meanwhile, moderators employed by Accenture were asked to sign a document acknowledging that PTSD could be part and parcel of the job.

The document says: “I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to post-traumatic stress disorder (PTSD).” Furthermore, the document clarified that the wellness coach offered to employees “is not a medical doctor and cannot diagnose or treat mental disorders”.

Gray himself worked for Facebook through a CPL contract and later found out that his counsellor wasn’t employed by CPL but by a company that provides outsourced corporate wellness services to tech companies.

Staff emails would arrive with offers of wellness coaching and yoga but the thinking, he said, was: “I haven’t got time for this. I’ve got targets to meet, I’ve got work to do. My boss is breathing down my neck. There’s 15,000 tickets in the high priority queue.”

Reading Gray’s account alongside Silver’s official Facebook post on content moderation is jarring: “We care deeply about the people who do this work. They are the unrecognised heroes who keep Facebook safe for all the rest of us. We owe it to our reviewers to keep them safe too,” she adds.

I think most people would agree with this. Facebook owes it to content moderators to keep them safe. But Gray’s account, and those of other moderators around the world, seems to suggest this is not the case. Gray says that by talking about it publicly he is doing his best to raise awareness that could lead to positive change in how this important work is carried out.