15 votes

Facebook showed this ad to 95% women. Is that a problem?

4 comments

  1. deciduous

    An important point brought up in this video is that algorithmic bias in ad targeting platforms isn't the result of bad design or programming. The problem is fundamental to how data-driven algorithms work. There really isn't a good way to build a targeting algorithm that doesn't reflect the biases (racial, gender, or otherwise) of society.

    As long as the data you use is biased, which real-world data will be, the system will be as well.
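    A toy sketch of that point (every number here is invented): a targeting "model" that does nothing smarter than learn per-group click rates from historical data will reproduce whatever skew the history contains, with no malicious code anywhere.

```python
from collections import defaultdict

# Invented click history: (group, clicked). The skew is in the data,
# not in the logic below.
history = ([("women", 1)] * 90 + [("women", 0)] * 110 +
           [("men", 1)] * 5 + [("men", 0)] * 195)

shown, clicks = defaultdict(int), defaultdict(int)
for group, clicked in history:
    shown[group] += 1
    clicks[group] += clicked

# Learned per-group click rates, and the group the "model" will chase.
rates = {g: clicks[g] / shown[g] for g in shown}
target = max(rates, key=rates.get)
print(rates)   # {'women': 0.45, 'men': 0.025}
print(target)  # women -- the bias in the data becomes the policy
```

    Nothing in that code mentions gender as a concept; the skewed history alone is enough to produce a skewed policy.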

    10 votes
    1. arp242

      This is essentially the "Filter Bubble" we were already talking about 10 years ago, right? The context at the time was mostly about politics and such, but it's basically the same thing.

      You see this on YouTube really clearly; watch one or two videos about topic X and it'll appear in your recommendations forever. I miss the old YouTube that was much more like reddit in the sense of "browse around, see what you find".
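      That recommendation feedback loop is easy to simulate (a toy urn-style sketch, nothing to do with YouTube's actual system): recommend in proportion to what was already watched, and each watch reinforces that weight, so early random picks get amplified.

```python
import random

random.seed(0)
watches = {"X": 1, "Y": 1}  # start with no real preference between two topics
for _ in range(1000):
    # Recommend in proportion to what was already watched...
    topic = random.choices(list(watches), weights=list(watches.values()))[0]
    # ...and each watch reinforces that topic's weight.
    watches[topic] += 1
print(watches)  # path-dependent: the final split depends on the early picks
```

      The split you end up with is an accident of the first few choices, which is exactly the "watch one or two videos and it's in your recommendations forever" effect.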

      2 votes
  2. trunicated

    Facebook is in the game of getting results for their advertisers. If advertisers are paying money and not getting results, then they're going to stop paying. Thus, there's an incentive for Facebook to make sure that advertisements are delivered to users who are more likely to click on them, and part of that is by analyzing the content of the advertisements and providing them to users they believe will be more likely to click on them. This is beneficial not only for Facebook, but for the advertisers themselves (as they will see more people interacting with the advertisements they've paid for).

    One of the examples they showed was an advertisement for makeup. It went to 98% women and 2% men. Is it unreasonable to believe that, in today's society, an advertisement for makeup has an overwhelmingly better chance of working on someone who identifies as a woman than on someone who identifies as a man? That seems extremely straightforward to me, as the distribution looks to reflect US societal norms.

    Another example they use relates to political advertising. Democrats strongly tended to see a Sanders advertisement while Republicans strongly tended to see a Trump advertisement. Is this an issue? From the perspective of the customer (the advertiser), absolutely not. Paying to have my "Feel The Bern" coaster shown to Republicans is money wasted. Democrats are much more likely to purchase my items.

    From the perspective of Facebook? Absolutely not. If they want to be able to charge as much as possible for their advertising space, they need to be able to guarantee a high level of interaction with advertisements. If half the folks who see the ad will never click it, then CPM is going to be abysmal and unable to sustain the business.
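    The CPM point is just arithmetic. A quick back-of-the-envelope sketch (every number here is hypothetical): if advertisers effectively pay per click, revenue per thousand impressions scales directly with click-through rate.

```python
# All numbers hypothetical, for illustration only.
cost_per_click = 0.50   # what an advertiser pays per click, in dollars
untargeted_ctr = 0.001  # 0.1% of random users click
targeted_ctr = 0.01     # 1% of well-matched users click

def cpm(ctr):
    """Effective revenue per 1000 impressions at a given click-through rate."""
    return ctr * cost_per_click * 1000

print(f"untargeted: ${cpm(untargeted_ctr):.2f} per 1000 impressions")  # $0.50
print(f"targeted:   ${cpm(targeted_ctr):.2f} per 1000 impressions")    # $5.00
```

    A 10x difference in click-through rate is a 10x difference in what the ad space is worth, which is the whole incentive for targeting.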

    From the perspective of the user seeing the advertisement? Absolutely not. I believe it's already established that untargeted, random advertisements don't work, as the early days of the internet showed us. To take it a step further, showing polarizing figures like Sanders and Trump to those who vehemently oppose them would, at best, do nothing. At worst, it would cause users to stop using the site, which is certainly bad for the other two groups (and likely bad for everyone who is friends with the Republican who sees many Sanders advertisements).

    As to the question posed in the title of the video: "Is that a problem?", I would say no. The video even states that the results don't mean that Facebook is directly basing predictions on things like gender or race. Instead, the machine has learned the social biases of the US from the people in that society, and is directing content based on those biases. It's holding a mirror up to the society itself, and the authors of the video are asking whether what we're seeing is okay.

    From Facebook's perspective, I would say yes. From the advertiser's perspective, I would say yes. From the user's perspective, I would say yes.

    Does this mean it's perfect? Absolutely not. As called out in the video, this sort of system can create a nasty feedback loop, amplifying biases much worse (in my opinion) and much more quickly than society would on its own. However, this may be why we're seeing a 98%/2% divide in the makeup advertisement rather than 100%/0%. There may be a system in place that serves the advertisement to people the algorithm wouldn't normally consider, as a self-check on whether the targeting is still appropriate. I could see a system that consistently tests on small groups of people who aren't considered "optimal" for an advertisement, perhaps even pivoting if results from those experiments are positive. Unfortunately, that sort of logic is Facebook's secret sauce, and not something they're likely ever to reveal.
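    If such a self-check exists, the simplest version would look like epsilon-greedy exploration from the multi-armed-bandit literature. A sketch of the idea (not Facebook's actual system; the rates are made up):

```python
import random

random.seed(42)

def choose_audience(rates, epsilon=0.02):
    """Epsilon-greedy targeting: mostly show the ad to the group with the
    best observed click rate, but a small fraction of the time pick a group
    at random so the estimates can self-correct if tastes change."""
    if random.random() < epsilon:
        return random.choice(list(rates))  # explore
    return max(rates, key=rates.get)       # exploit

# Hypothetical observed click rates per group.
rates = {"women": 0.45, "men": 0.025}
picks = [choose_audience(rates) for _ in range(10_000)]
print(picks.count("women") / len(picks))  # ~0.99, roughly a 98/2-style split
```

    A small, deliberate exploration rate would produce exactly the kind of near-total-but-not-total skew the video observed, though that's speculation on my part.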

    I think there's much more to explore here in the context of machine learning and how it can't be biased itself; it can only reflect the biases in the data it's given. Like Tay, the chatbot Microsoft launched that started spewing vile nonsense. I can only assume it got its training data from Xbox Live chat. But that's not the fault of the bot, or the data (not directly, anyway); it's the fault of the society that created that data. If you want to change the data, you need to change the society that created it.

    I guess my overall point is that no, it's not a problem in itself. But it's exposing a potentially larger problem. Data itself cannot be responsible for what it says. How it was collected or interpreted? Sure. But we don't get mad at math when 2 + 2 = 4, even if that means that 2 people crashed into 2 other people at highway speeds, leading to 4 deaths.

    5 votes
  3. triple8

    Great video. Personally I have no problem with targeted ads. I'd rather see an ad for a cool new product I'm likely to buy than some dumb ad for "hot singles in my area". The era of online ads before algorithms and personal data were available to companies was irritating, and I think targeted ads are a step in the right direction.

    1 vote