The impossible job: Inside Facebook’s struggle to moderate two billion people
Link information
This data is scraped automatically and may be incorrect.
- Title: Here's How Facebook Is Trying to Moderate Its Two Billion Users
- Published: Aug 23, 2018
- Word count: 9,455 words
Yet another longread on moderation at large sites that misses the main point and gets bogged down in the conversations the platforms themselves want to have.
They don't want to acknowledge that they're publishers of the largest amount of content in the world, and therefore that they reasonably should have the largest sets of moderators (editors) in the world to cope with the scale of content they choose to publish.
Bang on the money.
Also bang on the money.
But then we get to the part where the media just seems to swallow whole the problems Facebook, reddit and the like have with their business models:
says this article
Wait. Hang on a second. Facebook has 23,000 employees and the company is worth around 500 billion dollars?
With 2 billion users, the company is worth $250 per user, or, here's where things get crazy, $21.7 million per employee.
Why is it that newspapers have huge editorial teams and staff to moderate all the content on their platforms?
For comparison, the New York Times had roughly 3,700 employees at the start of 2018 and was valued at a handful of billion dollars, on the order of $1.3 million per employee.
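The per-user and per-employee figures above can be sanity-checked with a quick calculation. Note that the $500 billion Facebook valuation, 23,000 headcount, and 2 billion users are the thread's own round numbers, and the NYT valuation of $4.8 billion is an assumed stand-in for "a handful of billion":

```python
# Rough valuation-per-head arithmetic from the comment above.
# All inputs are the thread's ballpark figures, not audited data.

fb_valuation = 500e9      # dollars, approximate market cap
fb_employees = 23_000
fb_users = 2e9

nyt_valuation = 4.8e9     # assumed value for "a handful of billion"
nyt_employees = 3_700     # "3700ish" at the start of 2018

print(f"FB value per user:      ${fb_valuation / fb_users:,.0f}")
print(f"FB value per employee:  ${fb_valuation / fb_employees / 1e6:.1f}M")
print(f"NYT value per employee: ${nyt_valuation / nyt_employees / 1e6:.1f}M")
```

Under these assumptions the gap really is about an order of magnitude: roughly $21.7 million of market value per Facebook employee against roughly $1.3 million per NYT employee.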
Basically, the whole of Facebook's model is broken. Everyone in media knows this. They need to hire a truckload more people if they are to moderate their content.
The media consistently buys the social media platforms' framing of moderation: that it's about having the right politics and rules, the right engineered scripts and automated systems. No. You need to start by having enough people to go through all the manually flagged content with the appropriate level of thoroughness for each situation.
Then you need to have the capacity to have automated systems flag a bunch of potentially problematic content and to sift through all of that too.
This is a manpower issue. Not an automation issue. Not an issue of having the wrong rules. You need to start by getting your site under control, then you can look at automating more and more of your site that's under control to rationalize and save costs.
Oh man, I listened to this episode just yesterday.
It really gave me much more respect for the sheer size of the moderation job Facebook has.
I think that more credit needs to be given to them for their efforts to ensure they live by guidelines which encompass the whole world, rather than just western/American ethics and ideals.
It's another article that points to the need for the same kind of user moderation and metamoderation that Tildes hopes to implement.
The only way I think they can do this is to empower millions of its users to regulate the site. Recent reports of an internal 'trustworthiness' score show they are likely already working on it.
All of this is true, but I think the problem with Facebook runs deeper. I think this is also an antitrust issue. Facebook should be forced by law to use open protocols and publish their APIs and data schemas so that other sites can interoperate with and compete with Facebook. Likewise with Google, Twitter, Amazon, Apple, and Microsoft.
Furthermore, any code that isn't strictly for personal use on one's own hardware should be open source. Enough of this bullshit.
I'm gonna read the article in a minute and will edit & update this comment afterwards, but my first thought after just reading the headline: this can be easily fixed with a federated model like Mastodon, diaspora*, etc. use.
Wow, turns out I was kind of right, kind of wrong. It isn't just that they don't use a better system (federation would allow a much better mod/user ratio), it's that they don't really use their own system right! I mean, Facebook has revenue in the tens of billions of dollars per year. What would hiring 100,000 mods cost? A billion, max? Of course this isn't doable for them, because they don't care to do it. The truth is that they prefer outrageous amounts of profit over throwing a whole bunch of manpower at it. Basically what @nacho said earlier.
The second issue is intertwined with this: the sheer arrogance they have. If only we had better flowcharts, if only our training was better so that mods wouldn't burn out, if only we had better live-video tools, ad nauseam. Maybe decide that live-streaming video isn't worth the hassle (or the few extra bucks it gets you). The article mentions this tidbit (quote from Zuck):
Wowiiieeee. It's easy to say Stallman was right, but a quick search (ot: search engines suck at old content) turns up: a 2011 video in Spanish specifically about Facebook; a 2012 discussion on him being right; his own list of anti-Facebook material that's been kept up to date for a while now; an old news link that mentions Facebook (and Google+, lol).
I'm not trying to fanboy here; there's lots that rms can be critiqued for. I'm just saying he's a well-known and public critic of Facebook's bs, and has been for a looooong time. "This idea was not controversial"? Bullshit. Facebook and Zuckerberg haven't listened to critique. They still don't, really. They don't tend to respond until the media picks up a story, because up until that point it's not hurting their revenue. The model, the capitalist underpinning of maximizing profits over fucking genocide, is what's wrong here.
The salaries for 100k employees making minimum wage would be $1.5 billion a year. That's not including health insurance or office space for this army of workers. I'm not saying you need 100k workers to do this job effectively, but it doesn't seem as easy as you make it out to be.
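The $1.5 billion figure checks out as a payroll-only estimate, assuming the US federal minimum wage of $7.25/hour and a 2,080-hour full-time year (both assumptions on my part; benefits and office space are excluded, as the comment notes):

```python
# Back-of-the-envelope payroll for 100k moderators at US federal
# minimum wage. Excludes benefits, management, and facilities.

workers = 100_000
hourly_wage = 7.25            # assumed: US federal minimum wage
hours_per_year = 40 * 52      # 2,080 full-time hours

annual_payroll = workers * hourly_wage * hours_per_year
print(f"Annual payroll: ${annual_payroll / 1e9:.2f} billion")
```

That comes to about $1.51 billion per year in wages alone, so the "like, a billion max" estimate above is already on the low side before any overhead.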