14 votes

The impossible job: Inside Facebook’s struggle to moderate two billion people

6 comments

  1. [3]
    nacho
    Link

    Yet another longread on moderation on large sites that has missed the main point and gotten bogged down in the conversations the platforms themselves want to have.

    They don't want to acknowledge that they're publishers of the largest volume of content in the world, and therefore that they reasonably should have the largest teams of moderators (editors) in the world to cope with the scale of content they choose to publish.


    But the need for content moderation is better looked at as a systemic issue in Facebook’s business model.

    Bang on the money.

    “The fundamental reason for content moderation—its root reason for existing—goes quite simply to the issue of brand protection and liability mitigation for the platform,” says Sarah T. Roberts.

    Also bang on the money.

    But then we get to the part where the media just seems to swallow whole the problems Facebook, reddit and the like have with their business models:

    Facebook has a “policy team” made up of lawyers, public relations professionals, ex-public policy wonks, and crisis management experts that makes the rules. They are enforced by roughly 7,500 human moderators, according to the company.

    And at the end of October, Facebook said that by the end of 2018, it would hire 10,000 more people, including contractors, to double its staff working on improving its community safety. (It’s unclear if the company is counting the 4,000 additional hires to remove ads and do community moderation as part of its push to bring on 10,000 people to tend to Facebook community safety.) Facebook currently has more than 23,000 employees, so these numbers are no drop in the bucket.

    says this article

    Wait. Hang on a second. Facebook has 23,000 employees and the company is worth around 500 billion dollars?

    With 2 billion users, the company is worth $250 per user, or, here's where things get crazy, 21.7 million dollars per employee.


    Why is it that newspapers have huge editorial teams and staff to moderate all the content on their platforms?

    For comparison, the New York Times had roughly 3,700 employees at the start of 2018 and was valued at a handful of billion dollars. That's on the order of 1.3 million dollars per employee.
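
    As a rough sanity check on those ratios (these are ballpark 2018 numbers, and I'm assuming about $5 billion for the Times' "handful of billion"):

    ```python
    # Back-of-envelope check of the value-per-user and value-per-employee figures.
    # All inputs are rough, publicly reported 2018 ballpark numbers.
    fb_market_cap = 500e9       # ~$500 billion
    fb_users = 2e9              # ~2 billion users
    fb_employees = 23_000

    nyt_market_cap = 5e9        # assumption for "a handful of billion"
    nyt_employees = 3_700

    print(f"Facebook value per user:     ${fb_market_cap / fb_users:,.0f}")
    print(f"Facebook value per employee: ${fb_market_cap / fb_employees / 1e6:.1f} million")
    print(f"NYT value per employee:      ${nyt_market_cap / nyt_employees / 1e6:.1f} million")
    # -> roughly $250 per user, ~$21.7 million per Facebook employee,
    #    ~$1.4 million per Times employee
    ```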

    Basically, the whole of Facebook's model is broken. Everyone in media knows this. They need to hire a truckload more people if they are to moderate their content.

    In 2009, Facebook had just 12 people moderating more than 120 million users worldwide.

    The media seems to consistently buy the social media platforms' framing of moderation: that it's about having the right politics and rules, the right engineered scripts and automated systems. No. You need to start by having enough people to go through all the manually flagged content with the appropriate level of thoroughness for each situation.

    Then you need the capacity to have automated systems flag a bunch of potentially problematic content, and to sift through all of that too.

    This is a manpower issue. Not an automation issue. Not an issue of having the wrong rules. You need to start by getting your site under control; then you can look at automating more and more of a site that's under control to rationalize and save costs.

    10 votes
    1. [2]
      Comment deleted by author
      Link Parent
      1. zendainc
        Link Parent

        Oh man, I listened to this episode just yesterday.

        It really gave me much more respect for the sheer size of the moderation job Facebook has.

        I think more credit needs to be given to them for their efforts to ensure they live by guidelines that encompass the whole world, rather than just Western/American ethics and ideals.

        1 vote
    2. kashprime
      Link Parent

      It's another article that points to the need for the same kind of user moderation and metamoderation that Tildes hopes to implement.

      The only way I think they can do this is to empower millions of its users to regulate the site. Recent reports of an internal 'trustworthiness' score show they are likely already working on it.

      2 votes
  2. demifiend
    Link

    This is a manpower issue. Not an automation issue. Not an issue of having the wrong rules. You need to start by getting your site under control; then you can look at automating more and more of a site that's under control to rationalize and save costs.

    All of this is true, but I think the problem with Facebook runs deeper. I think this is also an antitrust issue. Facebook should be forced by law to use open protocols and publish their APIs and data schemas so that other sites can interoperate with and compete with Facebook. Likewise with Google, Twitter, Amazon, Apple, and Microsoft.

    Furthermore, any code that isn't strictly for personal use on one's own hardware should be open source. Enough of this bullshit.

    4 votes
  3. [2]
    Tenar
    (edited)
    Link

    I'm gonna read the article in a minute, and will edit & update this comment afterwards, but first thought after just reading the headline: this can be easily fixed with a federated model like the ones Mastodon, diaspora*, etc. use.

    Wow, turns out I was kind of right, kind of wrong. It isn't just that they don't use a better system (federation would allow a much better mod/user ratio), it's that they don't really use their own system right! I mean, Facebook has revenue in the tens of billions of dollars per year. What would hiring 100,000 mods cost? Like, a billion max? Of course this "isn't doable" for them, because they don't care to do it. The truth is that they prefer outrageous amounts of profit over throwing a whole bunch of manpower at it. Basically what @nacho said earlier.

    The second issue is intertwined with this: the sheer arrogance they have. If only we had better flowcharts, if only our training were better so that mods wouldn't burn out, if only we had better live-video tools, ad nauseam. Maybe decide that live-streaming video isn't worth the hassle (or the few extra bucks it gets you). The article mentions this tidbit (quote from Zuckerberg):

    “Facebook stands for bringing us closer together and building a global community. When we began, this idea was not controversial,” he wrote. “My hope is that more of us will commit our energy to building the long term social infrastructure to bring humanity together.”

    Wowiiieeee. It's easy to say Stallman was right, but a quick search turns up (OT: search engines suck at old content): a 2011 video in Spanish specifically about Facebook; a 2012 discussion on him being right; his own list of anti-Facebook material that's been kept up to date for a while now; and an old news link that mentions Facebook (and Google+, lol).

    I'm not trying to fanboy here; there's lots that rms can be critiqued for. I'm just saying he's a well-known and public critic of Facebook's bs, and has been for a looooong time. "This idea was not controversial"? Bullshit. Facebook and Zuckerberg haven't listened to critique. They still don't, really. They don't tend to respond until the media picks up a story, because up until that point it's not hurting their revenue. The model, the capitalist underpinning of maximizing profits over fucking genocide, is what's wrong here.

    4 votes
    1. toobsock
      Link Parent

      The salaries for 100k employees making minimum wage would be about $1.5 billion a year. That's not including health insurance or the office space for this army of workers. I'm not saying you need 100k workers to do this job effectively, but it doesn't seem as easy as you make it out to be.
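
      The arithmetic, for anyone who wants to tweak the assumptions (US federal minimum wage, full-time hours, base pay only):

      ```python
      # Rough payroll math for 100,000 moderators at the US federal minimum wage.
      # Ignores benefits, office space, tooling, and management overhead.
      min_wage_per_hour = 7.25    # USD, federal minimum
      hours_per_year = 40 * 52    # full-time
      headcount = 100_000

      annual_pay = min_wage_per_hour * hours_per_year
      total_payroll = annual_pay * headcount
      print(f"Per moderator: ${annual_pay:,.0f}/year")
      print(f"Total payroll: ${total_payroll / 1e9:.1f} billion/year")
      # -> about $15,080 per moderator and roughly $1.5 billion in base pay alone
      ```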

      2 votes