28 votes

Bodies in seats: At Facebook’s worst-performing content moderation site in North America, one contractor has died, and others say they fear for their lives

9 comments

  1. [8]
    rkcr
    Link

    I used to think that using algorithms to moderate was dumb because algorithms can't possibly detect the subtleties of human interaction. But I feel like I'm starting to swing around to "...but maybe we should do it anyways" to save moderators from having a horrible life.

    8 votes
    1. [6]
      unknown user
      Link Parent

      I'd try and tackle it from another angle: we need to seriously beef up how we respond to people who post nasty/hateful/violent content on the internet. If that involves police visits, fining people, and jail time in serious cases, so be it. People need to be responsible for their actions, and if they won't be, society needs to hold them responsible.

      9 votes
      1. [5]
        JakeTheDog
        Link Parent

        China has a pretty effective system - social credit (with real consequences) and exclusively government-sanctioned social media platforms.

        2 votes
        1. [4]
          CALICO
          Link Parent

          China's system is also an absolute nightmare, considering it's used as a means of suppressing dissent.

          4 votes
          1. [3]
            JakeTheDog
            Link Parent

            Which is my point. Who is going to ensure that it won't end up like that?

            3 votes
            1. [2]
              unknown user
              Link Parent

              Slippery slope fallacy. It's entirely possible for reasonable countries to implement reasonable laws and keep them stable over time. New Zealand's laws around free speech spring to mind. I think taking the argument to an illogical extreme—China—is not interpreting my argument in good faith at all.

              1 vote
              1. JakeTheDog
                Link Parent

                I'm aware of the fallacy. That's why I'm explicitly not saying it's inevitable; I'm asking how we would ensure it wouldn't emulate China. Clearly it's not an illogical extreme (technically, it's logical), since there are plenty of countries with their own ambitions to control discourse (e.g. Iran and Russia's independent internet). And with the Overton window shifting, it can quickly become normalized.
                There's plenty of opinion to go around in these discussions but not enough concrete policy or science. I'm looking for the latter.

    2. FZeroRacer
      Link Parent

      I actually think it's more important than ever to have proper human moderators. The problem with Facebook's approach is twofold:

      1. They don't properly compensate their moderators. By compensation I mean hazard pay, health benefits, and a non-toxic work environment.
      2. They allow far too much unrestrained growth. The fact that they have so few barriers to making posts and sharing vile shit means the amount of bile you see grows exponentially. This, combined with bad actors and a laissez-faire outlook from upper management, means moderators are exposed to far more terrible shit than they should be due to the policies of Facebook.

      An algorithmic approach only obfuscates the actual problems and provides a convenient smokescreen for companies to hide behind when their algorithm is conveniently wrong.

      8 votes
  2. Bullmaestro
    (edited)
    Link

    Filthy offices? Cutthroat mismanagement? Minimal job security? Targets not being met by a big margin? Workers struggling to cope with the high workload and heavy stresses of their job?

    Cognizant definitely sound like a typical outsourced call centre to me.

    3 votes
  3. Comment removed by site admin
    Link