Maybe I'm missing something here. This is framed as algorithmic political bias, but isn't it possible that there was simply more right-wing content than left-wing content or other related content? Or couldn't right-wing content have been deliberately produced to be more algorithm-friendly? Or, more simply, were users watching more political content because of the upcoming election? FUD is often inherently more emotionally potent and attention-grabbing than alternative political messages.
I mean, for all we know YouTube has explicit right-wing content boosters built in. The fact that we don't know is the root issue, not this specific outcome, which has tons of moving variables. Even if YouTube wanted to do a deep dive into how this occurred, I'm not sure it could come up with many specifics, given the algorithm's black-box nature.
isn't it possible that there was simply more right-wing content
Perhaps I'm misreading, but doesn't the following excerpt directly address this with metrics showing that right-wing content is extremely disproportionately recommended compared to its apparent share of videos?
Only 6.4% of videos from search results mention politicians from the right-wing party, whereas 19.3% of the first “Up Next” recommendations relate to its politicians.
YouTube's search results have been going downhill for years, so I don't exactly trust those numbers. Even if they're accurate, I would expect a spike in interest in political content to create a spike in recommendations for that content.