14 votes

Inside Facebook’s War on Hate Speech: An exclusive embed with Facebook’s shadow government

8 comments

  1. [4]
    nacho Link

    What a fascinating read.

    But also, how depressing it is that someone with this level of insight falls into my biggest pet-peeve of a faux concern regarding hate speech:

    Of all the prohibited content on the platform, hate speech is by far the hardest to adjudicate. First, there isn’t an obvious way to define it. In the U.S., there’s no legal category for hate speech; in Europe, it’s delineated pretty specifically.

    Every developed country in the world except the US has some form of law banning hate speech on the books.

    There's legal jurisprudence for all of those issues, with extensive troves of case law outlining in minute detail how each definition works, how it can be applied in practice, what limits it has, and which issues it gets "wrong."

    You just have to look anywhere outside the US.


    This article gets a lot of things about Facebook very, very right. The campus feeling being a "privatization of a public space" and the irony of Facebook's metaphor for how they deal with privacy as a "three sided coin" being literally money.

    However, the author gets the major point wrong: the whole conversation about whether "men are trash" is something you want on your site or not is treated as some large conundrum.

    The use of 'nigga' and 'nigger' varying by continent isn't a huge unsolvable issue.

    Different EU nations having different hate speech laws isn't some large issue of what to ban where, either.


    What all legal systems aim to accomplish is a match between the rules of society and the feeling of "getting things right."

    All of them resort to huge and complicated sets of laws because common sense isn't codified in a 40 page document.

    If Facebook wants to get common sense right, they need to accept that it requires qualified arbiters (read: not cheap) who get substantial training so they get things right.

    The site will also need tonnes of in depth rules if they're gonna govern all types of speech for billions of people.


    On top of that, if Facebook really wants to be somewhere everyone wants to participate, or at least feels safe participating, then Bickert can't have it both ways.

    “We really do want to give people room to share their political views, even when they are distasteful.”

    is directly related to the "social climate" changing, where "Immigration into Europe. Ethnic tension. Sharply contested elections. The rise of hot speech." is different from what things were like five years ago.

    That's what allowing that hate into normal discourse means.


    I really struggle with why so many people choose allowing hate speech as their free speech hill to die on.

    We already have a ton of regulations against different types of speech: libel, slander, national secrets, fighting words, incitement to violence, intimidation/threats, lying/false advertising, copyright/trademark infringement, child porn, involuntary pornography, and the list goes on and on.

    Free speech isn't all or nothing. Everyone sane agrees we need speech regulations on a sliding scale. Why is it so important to allow people to hate on others?

    Why does Facebook as a private company feel it's so super important not to err on the side of removing more hate than less hate? Would their community become too constructive and inclusive?

    Why do American journalists never seem to ask Facebook why it's so important for them to include hate in situations where it's not newsworthy?

    Why is it so important to let an African call someone "nigger" or an American to call someone "nigga"? Why in the world are those the hills these companies are choosing to die on?


    The haters must be a really, really profitable group in some way. Or be thought to be that profitable by the shareholders.

    Otherwise it'd never be Facebook's or Reddit's or any of the other tech-bro companies' hills to die on.

    10 votes
    1. [2]
      Deimos Link Parent

      It was in a completely different context (related to Elon Musk continuing to tweet things about Tesla after repeatedly getting in trouble with the SEC over it), but I thought this excerpt in particular was a pretty good summary of how tech-industry people often try to approach things and the mismatch between that and reality (MTS stands for "Mystery Twitter Sitter"):

      Musk: Show me what in these three tweets is illegal or misleading.

      MTS: Look, as your lawyer, I am telling you that this seems like a bad idea, and you should at least wait until after your contempt-of-court hearing to do anything that might look like a violation of your settlement with the Securities and Exchange commission.

      Musk: Nope! I am a smart guy and I like to get into the details of every aspect of my business. I second-guess expert engineers all the time, and often it works out for me; I’m not going to do whatever you tell me just because you are a lawyer. I think these tweets are fine. If you don’t, you have to explain to me, specifically, how they violate the settlement.

      MTS: The world is not as black and white as many tech founders wish it was, and the legal system is not just a list of unambiguous written rules applied in a mechanical fashion. Whether you like it or not, regulators and courts operate in large areas of discretion; they have lots of ways to make life more difficult for you and for the company that you manage as a fiduciary for others, and they are used to being treated with a certain amount of deference by the people they regulate. Here they have you dead to rights on a technicality—you didn’t get your “500,000 cars” tweet pre-approved, as you promised you would, and it had to be corrected—and how they respond to that will depend on your overall attitude and behavior. My job as a lawyer is not just to look up rules and show them to you; it is to make predictions, grounded in research but also in experience and a certain professional connoisseurship, about how officials will react to particular fact patterns, and to advise you on the wisest course of action in shaping their reaction. (This is sometimes called “legal realism.”) My expert advice to you is that the benefit to you, and to your company and its workers and shareholders, of sending out these inscrutable late-night tweets is very low, while the risk of further antagonizing the SEC and the courts seems pretty high. I am not giving you a formal legal opinion that it is illegal for you to tweet “California.” I am just telling you that it’s dumb.

      A lot of tech people think there always needs to be some unambiguous process involved in making decisions (an algorithm), and don't want to accept that it's not always possible, and sometimes you just need to apply subjective judgment to get a good result. This conflict comes up a ton related to these types of moderation-like decisions.

      8 votes
      1. [2]
        Comment deleted by author
        Link Parent
        1. [2]
          Comment deleted by author
          Link Parent
          1. [2]
            Comment deleted by author
            Link Parent
            1. [2]
              Comment deleted by author
              Link Parent
              1. Heichou Link Parent

                I think I'm too uninformed to be commenting on this matter. My apologies

                2 votes
    2. TheInvaderZim Link Parent

      I disagree with your idea that the hater-hill is profitable. It's the tolerist dogma that's the problem. Tech companies are made up almost entirely of corrupted 'change the world' ideals, from the top down. Somehow, being tolerant of all walks of life equates to tolerating intolerance, which is why Trump's election somehow avoided riots from the tens of millions of people he outright attacked during the campaign.

      Then take that and combine it with the insane amount of entitlement our society grants people just for them being able to count to ten and scream loud enough, and you get the clusterfuck we've found ourselves in. Turns out when the only qualification for policymaking is being alive, you get a lot of morons attacking the system from both sides. Who'd have thought?

  2. [3]
    Deimos Link

    I'm not sure why there's such a flood of articles this week about Facebook's moderation/policy teams (including the excellent article from The Verge), but this one is a good read too.

    5 votes
    1. dubteedub Link Parent

      One thing that is brought up a lot in these articles is Facebook's insane policy document on how it moderates content. The way it is structured, you are not allowed to say something like "we should kill white men," as white men fall under the company's hate-speech protections, however, saying "we should kill black children" is allowed and not covered by their rules.

      NPR's Radiolab did a really great piece on that whole issue about a year ago that I found incredibly frustrating from a moderation standpoint.

      Back in 2008 Facebook began writing a document. It was a constitution of sorts, laying out what could and what couldn’t be posted on the site. Back then, the rules were simple, outlawing nudity and gore. Today, they’re anything but.

      How do you define hate speech? Where’s the line between a joke and an attack? How much butt is too much butt? Facebook has answered these questions. And from these answers they’ve written a rulebook that all 2.2 billion of us are expected to follow. Today, we explore that rulebook. We dive into its details and untangle its logic. All the while wondering what does this mean for the future of free speech?

      6 votes
    2. stephen Link Parent

      Maybe it's a bunch of journalists who started research after the Zuck/Sandberg hearings and took a while to complete their work?

      2 votes
  3. quinns Link

    I’ve run into this same kind of extremist hate speech en masse; there are bubbles of it on fb that you can find if you know where to look. When I simply questioned someone making the assertion that all but a tiny percent of men are rapists or abusers, I saw them come out in droves with comments like “men are trash,” that I was a “rape apologist,” etc. I tried to point out how this language would just create a divide, and that some of them even knew of me in real life and that I was obviously not the enemy, but they only got quiet. I think the scariest part is how viciously some people believe in things like this. It seemed like it was almost entertainment to them, tearing me apart, until I managed to prove to them I was a human. Usually hate exists to serve itself in little echo chambers, and what we see is just the ice above the water: a scary thought.

    2 votes