23 votes

YouTube said it was getting serious about hate speech. Over six weeks later, why is it still full of extremists?

21 comments

  1. [9]
    Sand
    Link

    Because it is impossible to moderate a site with millions of videos being uploaded every day. The article even says this. And then people get mad when non-hateful videos are automatically demonetized.

    10 votes
    1. [4]
      Deimos
      Link Parent

      It's impossible to do perfectly, but they can certainly do a lot better. Google isn't some poor downtrodden startup without resources, they just announced $10 billion in profits this quarter.

      We need to stop acting like "we built a platform too large for us to effectively manage" is an excuse, or even worse, treating it like a laudable accomplishment.

      27 votes
      1. [2]
        Sand
        Link Parent

        I suppose so, they can certainly do better, but I really don't think it's a big surprise that they aren't doing a very good job at it. There are lots of things they are doing poorly.

        4 votes
        1. [2]
          Comment deleted by author
          Link Parent
          1. moriarty
            Link Parent

I think the reason they're doing a bad job is that there's no monetary penalty attached to it. In the case of copyright violations, when they faced fines, they quickly straightened up and built the mechanisms to moderate it. They should do the same thing here.

            3 votes
      2. Amarok
        Link Parent

        Simple steps to get started:

        1. Publish clear and concise guidelines for what is and isn't acceptable. I see nothing but total confusion out there on my channels about what is and is not allowed. It seems even youtube staff is confused on this stuff and can't give straight answers. They do not seem to be outright asshats about it, though - just overworked and delayed as fuck trying to keep up with it all.

        2. Instead of videos, focus on accounts directly. Moderate the channels rather than each individual piece of content. Users who break the rules go away. Users who stick to the rules get the benefit of the doubt.

This game of algorithmic whack-a-mole will go nowhere. To even begin you'd need an algorithm that has a near-perfect understanding of every language, including a non-trivial level of nuance. That's not happening any time soon, sorry.

        But they won't. They aren't interested. Dealing with moderation is a shit task that they want nothing whatsoever to do with and have no idea how to do well. It's all busywork, some of it remarkably nasty, and it's well beyond the scope of human labor.

        Heck, even if they did somehow magic a moderation machine into existence, how well do you think it'll catch on with the users? It's no small ask to change your content guidelines mid-stride without pissing off half your users.

        I see a lot of 'youtube is kill' out there right now. Everyone's ready to move their content elsewhere, as soon as there's a viable alternative or two. Mostly this is rage over youtube prioritizing media outlet content such as NBC/FOX over independent content. Youtube isn't trying to send views to their users any longer - they are trying to capture them and make them watch whoever pays youtube the most money for the privilege. That's a calcified, dead website walking.

        People are being randomly unsubscribed from channels, not notified of videos from channels they subscribe to, and having junk content they aren't interested in shoved in their faces instead of what they want. The stupid is strong with this company lately.

        I look forward to whatever the next iteration brings. I think we're in for an era of major turnover on the internet over the next ten years. This generation of sites is starting to crack, and when the ad bubble goes, so will they.

        3 votes
    2. alyaza
      Link Parent

      Because it is impossible to moderate a site with millions of videos being uploaded every day. The article even says this.

      right, but gizmodo also literally gave them a list of extremist channels compiled by anti-hate groups and channels who were endorsed by stormfront, and youtube didn't do much either, as the lede describes:

      With that in mind, we used lists of organizations promoting hate from the Southern Poverty Law Center, Hope Not Hate, the Canadian Anti-Hate Network, and the Counter Extremism Project, in addition to channels recommended on the white supremacist forum Stormfront, to create a compendium of 226 extremist YouTube channels earlier this year.

      While less than scientific (and suffering from a definite selection bias), this list of channels provided a hazy window to watch what YouTube’s promises to counteract hate looked like in practice. And since June 5th, just 31 channels from our list of more than 200 have been terminated for hate speech. (Eight others were either banned before this date or went offline for unspecified reasons.)

      Before publishing this story, we shared our list with Google, which told us almost 60 percent of the channels on it have had at least one video removed, with more than 3,000 individual videos removed from them in total. The company also emphasized it was still ramping up enforcement. These numbers, however, suggest YouTube is aware of many of the hate speech issues concerning the remaining 187 channels—and has allowed them to stay active.

and later on the article describes significant inconsistencies in who got nixed and who didn't among those they actually took action against, so it doesn't seem to be just an issue of scale--it seems to also be them dragging their feet, inconsistently applying their supposed rules, and overall just not seeming to want to actually get rid of many of the genuinely hateful groups that make use of their platform openly and obviously.

      13 votes
    3. [3]
      moriarty
      Link Parent

      And yet they still do it with copyright infringement violations.

      4 votes
      1. [2]
        Sand
        Link Parent

        Copyright infringement is usually removed by a bot. Or by whoever holds the copyright.

        1 vote
        1. moriarty
          Link Parent

Yes, whoever holds the copyright reports it and a bot takes it down. The same can be done here. There's a massive torrent of reports for hate speech and violence; they just need to be willing to do something about it.

          2 votes
  2. [9]
    nothis
    Link

Heh, I don't envy Google right now. There's increasing pressure in the US, and European countries are starting to solve this with laws, which is certainly not their desired outcome. Problem is: They'd have to take a stand. They'd have to commit to a political stance (yes, being a racist conspiracy theorist has a political affiliation). If they followed through, they'd lose a ton of traffic, and since the alt-right has a tight grip on internet opinion (even the non-radical right-leaning crowd), they could brand this as "censorship gone mad" and "#leaveYoutube" with ease.

    4 votes
    1. [8]
      The_Fad
      Link Parent

      As far as I'm aware youtube as a platform is driven primarily by Let's Plays, How-to Videos, and personalities creating content for children/teenagers. Would I be incorrect in assuming they'd be fine even if they did lose all their alt-right views?

      5 votes
      1. [3]
        Comment deleted by author
        Link Parent
        1. alyaza
          Link Parent

          I'd prefer that hateful content to stay on YouTube rather than have it flee to somewhere where there will be absolutely no stopping its spread.

          keeping hateful content on youtube is a very good way to ensure that those views spread without end, as they have done for literally the past four years more or less unabated.

          10 votes
        2. The_Fad
          Link Parent

          As someone with a daughter who uses Youtube literally every day, I am going to have to respectfully disagree. I can see your point, though.

          5 votes
      2. [4]
        moocow1452
        (edited )
        Link Parent

Yes and no, since Google is priming audiences and selling ads based on engagement. Obviously, if they kick a bunch of their alt-right talent off, the audience goes with them, the "principled advocates" tox and dox everyone's comment sections over it, and they would probably have to kick off or at least answer for PewDiePie, which would exponentially aggravate the first two.

Edit: YouTube wants mindless views at all hours. The alt-right wants that too. For YouTube to harm that would be to harm itself, and YouTube plays shy with its numbers, so it's safe to assume that if they are making a profit, it's only just, but it has an unchallengeable market share in "place to put short and long videos." Kick off your undesirables, and that advantage might just go with it.

        1 vote
        1. [3]
          The_Fad
          Link Parent

Wait, is PewDiePie an alt-right talking head now?

          1. Crespyl
            Link Parent

            No, he's mostly just a bit of an ass who makes content that is probably not always suitable for children (which I gather is a large portion of his audience).

            There are likely real issues with his content, but lumping him in with genuine alt-right pundits only serves to give more ammunition to the "trigger-happy censors" argument.

            6 votes
          2. moocow1452
            (edited )
            Link Parent

He's... questionable. As in "made a video about Fiverr where he paid guys to say, 'Death to all Jews,' then doubled down when the WaPo said it was a bad idea, conflated losing major advertisers with being censored, then made the rounds on alt-right-adjacent 'free thinker' talk shows, and has major audience crossover with, and a pipeline to, harder stuff" questionable.

Plus he has the sort of Alex Jones fanbase where he encourages crazier and crazier stunts, but writes off the troublesome incidents as the work of bad actors.

            6 votes
      3. nothis
        Link Parent

Do we really still have to play coy about the grip the alt-right has on the internet? I mean, I certainly agree that they're no majority on youtube. But maybe 10%? 5%? A loud 5% with some overlap with other groups? There'd be some damage that bloc could do (harassing youtube users, anti-Google campaigns, "now more than ever!" type of spam, ...) and probably a full percentage point drop in viewership. Not something youtube wants.

        Basically, I don't believe Google truly doesn't have the means to stop this (they're frickin' Google, FFS). So they have to have another reason, and my money is on one of the above. One more, slightly meta reason I could imagine is that any hard measures coming this late would inevitably lead to false-positive removals and those are always a PR nightmare.

  3. [4]
    Comment deleted by author
    Link
    1. [3]
      Deimos
      Link Parent

      No comments in here are labeled Noise. Do you have "Collapse old comments when I return to a topic" enabled on this settings page? https://tildes.net/settings/comment_visits

      3 votes
      1. [3]
        Comment deleted by author
        Link Parent
        1. [2]
          Deimos
          Link Parent

          Yeah, that means it's a new comment that wasn't there the last time you visited the topic.

          If you have that setting on, all comments you've seen before (except ones that are the direct parents of new comments) will be collapsed when you come back to the thread.

          5 votes
          1. [2]
            Comment deleted by author
            Link Parent
            1. Deimos
              Link Parent

Somewhat—comment visits aren't tracked by default, because that impacts people's privacy since it means the site specifically stores the last time they visited each topic's comments. So you have to opt in to the overall feature, but the "collapse old comments" behavior is enabled by default when you do that.

              3 votes