8 votes

A study on the online "filter bubble" found that liberals and conservatives were actually recommended similar stories on Google News, representing a fairly homogeneous set of mainstream news sources

3 comments

  1. shiruken
    (edited)

    This recently published study in Computers in Human Behavior sampled N=168 people to examine what content the Google News algorithm presented to liberal and conservative users.

    Nechushtai, E. & Lewis, S. C. What kind of news gatekeepers do we want machines to be? Filter bubbles, fragmentation, and the normative dimensions of algorithmic recommendations. Computers in Human Behavior (2018).

    Abstract: Machines are increasingly aiding or replacing humans in journalistic work, primarily in news distribution. We examined whether news recommendation engines contribute to filter bubbles and fragmented news audiences by asking a diverse set of real-world participants (N = 168), using their personal Google accounts, to search Google News for news about Hillary Clinton and Donald Trump during the 2016 U.S. presidential campaign and report the first five stories they were recommended on each candidate. Users with different political leanings from different states were recommended very similar news, challenging the assumption that algorithms necessarily encourage echo chambers. Yet we also found a very high degree of homogeneity and concentration in the news recommendations. On average, the most recommended five news organizations comprised 69% of all recommendations. Five news organizations alone accounted for 49% of the total number of recommendations collected. Out of 14 organizations that dominated recommendations across the different searches, only three were born-digital, indicating that the news agenda constructed on Google News replicates traditional industry structures more than disrupts them. We use these findings to explore the challenges of studying machine behavior in news from a normative perspective, given the lack of agreed-upon normative standards for humans as news gatekeepers. This article suggests that, because there is no one agreed-upon standard for humans as news gatekeepers, assessing the performance of machines in that role is doubly complicated.
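    To make those concentration figures concrete, here's a minimal sketch (in Python, with made-up data and outlet names, not the authors' code) of how a "top-five share" like the 69% and 49% numbers can be computed. The per-user vs. pooled distinction is my reading of the abstract: each participant reported five stories per candidate, so ten recommendations per user.

    ```python
    from collections import Counter

    # Hypothetical data: each user reports the news organization behind the
    # first five stories recommended for each of the two candidates (10 total).
    recommendations = {
        "user_1": ["NYT", "CNN", "WaPo", "NYT", "Fox",
                   "CNN", "NYT", "Politico", "BBC", "CNN"],
        "user_2": ["CNN", "NYT", "BBC", "CNN", "WaPo",
                   "NYT", "CNN", "WaPo", "Fox", "NYT"],
        "user_3": ["Fox", "NYT", "CNN", "WaPo", "HuffPost",
                   "NYT", "Fox", "CNN", "NYT", "WaPo"],
    }

    def top_k_share(sources, k=5):
        """Fraction of recommendations coming from the k most-recommended outlets."""
        counts = Counter(sources)
        top_k_total = sum(n for _, n in counts.most_common(k))
        return top_k_total / len(sources)

    # Within-user concentration, averaged across users (the ~69% figure
    # reads as this kind of per-user average).
    per_user = [top_k_share(srcs) for srcs in recommendations.values()]
    print(f"mean per-user top-5 share: {sum(per_user) / len(per_user):.0%}")

    # Concentration across the pooled dataset (the ~49% figure reads as
    # the top five outlets' share of all recommendations collected).
    pooled = [s for srcs in recommendations.values() for s in srcs]
    print(f"pooled top-5 share: {top_k_share(pooled):.0%}")
    ```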

    2 votes
  2. annadane

    The problem is that websites shouldn't be doing this kind of personalization to begin with, or at the very least it should be opt-in. When you go on Facebook you don't want to see more of some friends' posts and fewer of other friends' posts just because you liked more of their posts. When you go on Google you expect it to be neutral and give you the most relevant results. When you go on YouTube you don't want the same 5 videos recommended to you constantly.

    1 vote
    1. shiruken

      When you go on Google you expect it to be neutral and give you the most relevant results.

      That is exactly what this study indicates Google News does. Users are presented with a homogeneous set of news sources regardless of their political inclinations. The algorithm isn't creating a per-user filter bubble of sources; it presents the same large media outlets to everyone.

      Now if the problem is that websites like Breitbart or InfoWars aren't showing up in the results, then maybe that's an indication of how far a user has deviated from the majority of news consumers.

      4 votes