11 votes

In a new report, Penn State political scientists suggest that radicalization on YouTube is driven more by the communities that form around right-wing content than by the recommendation engine

3 comments

  1. moocow1452
    Link

    I think this article is saying that "YouTube didn't create the alt-right from whole cloth so much as allowed it to network and reach an audience," which passes a sniff test, but then it implies "the algorithm had nothing to do with them getting their numbers." As in, the conventional understanding that YouTube's radicalization process works by pointing users toward ever more engaging videos is incorrect, or the alt-right infrastructure exists in a vacuum from the rest of YouTube and any recommendation is totes for reals human agents only, no runaway algorithms here?

    3 votes
  2. szferi
    Link

    I just watched Tom Scott's presentation on related topics at the Royal Society: https://www.youtube.com/watch?v=leX541Dr2rU It summarizes well, among other things, what the driving forces behind radicalization might be and the current state of affairs in online communication, especially on YouTube.

    2 votes
    1. moocow1452
      Link Parent

      This should probably be its own post; interesting material.