17 votes

The Trauma Floor: The secret lives of Facebook moderators in America

2 comments

  1. alyaza
    pretty much everything about this is terrible! just some snapshots:

    For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it’s her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world’s largest social network. None of the trainees have seen it before, Chloe included. She presses play.
    The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.
    Returning to her seat, Chloe feels an overpowering urge to sob. Another trainee has gone up to review the next post, but Chloe cannot concentrate. She leaves the room, and begins to cry so hard that she has trouble breathing.
    No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for 15,000 content reviewers around the world, today is just another day at the office.

    The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”

    Moderators in Phoenix will make just $28,800 per year — while the average Facebook employee has a total compensation of $240,000.

    When Miguel has a question, he raises his hand, and a “subject matter expert” (SME) — a contractor expected to have more comprehensive knowledge of Facebook’s policies, who makes $1 more per hour than Miguel does — will walk over and assist him. This will cost Miguel time, though, and while he does not have a quota of posts to review, managers monitor his productivity, and ask him to explain himself when the number slips into the 200s.
    From Miguel’s 1,500 or so weekly decisions, Facebook will randomly select 50 or 60 to audit. These posts will be reviewed by a second Cognizant employee — a quality assurance worker, known internally as a QA, who also makes $1 per hour more than Miguel. Full-time Facebook employees then audit a subset of QA decisions, and from these collective deliberations, an accuracy score is generated.
    Miguel takes a dim view of the accuracy figure.
    “Accuracy is only judged by agreement. If me and the auditor both allow the obvious sale of heroin, Cognizant was ‘correct,’ because we both agreed,” he says. “This number is fake.”
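
    Miguel's complaint is easy to make concrete. Here is a minimal sketch (hypothetical data, not Cognizant's actual system) of why scoring "accuracy" as moderator/auditor agreement, with no ground truth, will happily count a shared wrong call as correct:

    ```python
    # Hypothetical illustration of agreement-based "accuracy".
    # Each tuple: (moderator's call, auditor's call, the call that was actually right)
    decisions = [
        ("allow",  "allow",  "remove"),   # both miss an obvious drug sale -> still "accurate"
        ("remove", "remove", "remove"),
        ("allow",  "remove", "remove"),   # only disagreement registers as an error
        ("remove", "remove", "remove"),
    ]

    agreement = sum(mod == aud for mod, aud, _ in decisions) / len(decisions)
    ground_truth = sum(mod == right for mod, _, right in decisions) / len(decisions)

    print(f"agreement-based accuracy: {agreement:.0%}")    # 75%
    print(f"accuracy vs. ground truth: {ground_truth:.0%}")  # 50%
    ```

    The first number is the one that gets reported; the second would require someone to establish what the right call actually was, which the audit chain described above never does.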

    13 votes
    1. nacho

      From Miguel’s 1,500 or so weekly decisions

      The thing I struggle with the most is this number.

      Do the 15,000 human moderators at Facebook really only manually moderate 22.5 million pieces of content a week? That number is unfathomably low.

      This basically means Facebook is mostly an unmoderated platform. Or, even worse, that moderation is left almost entirely to algorithms that get it oh-so-wrong with regularity.
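
      The 22.5 million is just the article's two figures multiplied out; a quick sanity check (assuming Miguel's ~1,500 weekly decisions are typical of all 15,000 reviewers):

      ```python
      # Back-of-the-envelope from the article's figures
      reviewers = 15_000          # content reviewers worldwide
      weekly_decisions = 1_500    # Miguel's approximate weekly count

      print(f"{reviewers * weekly_decisions:,} decisions per week")  # 22,500,000
      ```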


      As a volunteer moderator on reddit, I've regularly moderated many times more than 1,500 items in a week. And that's with a life and a separate full-time job that I don't reddit at.

      But at the pay the article describes, I can see why things would take way more time.

      I wonder what moderation is like for the employees at snapchat, tumblr, instagram, youtube, reddit, etc. I'm left with far more questions after reading this piece than I thought I'd have.

      4 votes