41 votes

Most of my Instagram ads are for drugs, stolen credit cards, hacked accounts, counterfeit money, and weapons

11 comments

  1. [4]
    Amun

    For the last few months Instagram has served me a constant stream of ads for hard drugs, stolen credit cards, hacked accounts, guns, counterfeit cash, wholesale quantities of weed, and Cash App scams, as well as a Russian-language job posting seeking paid-in-cash massage therapists. Nearly all of these advertisements link directly to Telegram accounts where the drugs or illegal services can be purchased.

    Like many of Meta’s algorithmic rabbit holes, my journey into this world started with a single, curious click. After years of being served primarily ads for surf brands, clothes, and productivity apps, I got an Instagram ad with a hooded man standing in front of a Chase Bank ATM holding a giant stack of cash: “I got half a million worth of sauce in my iCloud,” the ad’s caption said.

    While some of the ads are subtle, many of them are not. Lots of the ads use clear language and imagery about what they’re selling on Instagram itself.

    I asked Laura Edelson, a researcher at New York University who specializes in social media ad spending, if there was a way of estimating just how prevalent these sorts of ads are on Instagram and Facebook, and how much money Meta might be making from them. She said that “Meta makes ads shown on Facebook/Instagram transparent through its Ad Library while they are active only.” Edelson suggested that the large number of ads for illegal content that can be trivially found “certainly isn’t promising.”

    Regardless of whether the accounts are scams or actually selling drugs, guns, and credit cards, the ads shouldn’t be on Instagram under its own terms of use and are a content moderation failure.

    The Meta spokesperson said that “the prevalence of content that violates our Restricted Goods and Services Policy is about 0.05 percent of content viewed on Facebook and Instagram. In other words, out of every 10,000 views of content on Facebook and Instagram, we estimate no more than 5 of those views contained content that violated this policy,” and added that “Views of content violating these policies are very infrequent, and we remove much of this content before people see it.”

    While this overall percentage may sound low, Facebook is one of the largest ad companies in the world and has billions of users viewing billions of pieces of content.
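
    (An aside from me, not the article: to get a feel for what that rate means at Meta's scale, here's a rough back-of-envelope sketch. The daily-view volume is a made-up illustrative number; only the 5-in-10,000 rate comes from Meta's statement above.)

    ```python
    # Back-of-envelope: what a 0.05% violation rate means in absolute terms.
    # ASSUMPTION: daily_views is a hypothetical figure for illustration only;
    # Meta's statement gives the rate, not a view volume.
    violation_rate = 5 / 10_000        # Meta's stated ceiling: 5 per 10,000 views
    daily_views = 100_000_000_000      # hypothetical: 100 billion content views/day

    violating_views = violation_rate * daily_views
    print(f"{violating_views:,.0f} potentially violating views per day")
    # Under these assumptions: 50,000,000 per day.
    ```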

    Previous studies have found that Facebook’s ad marketplace has vulnerabilities. Last year, for example, a joint study between Global Witness and NYU’s Cybersecurity for Democracy found that “Facebook either failed to detect, or just ignored, death threats against election workers contained in a series of ads submitted to the company.” That study also found that YouTube and TikTok more consistently rejected or quickly removed the researchers’ test ads.

    “Just because it’s off-platform doesn’t mean it should be an excuse,” Karan Lala, a founding fellow at the Integrity Institute, which was created by former members of Facebook’s Integrity team, told me. Lala has previously studied the prevalence of spam accounts that advertise on Meta’s platforms. “There’s things like weed photos—that’s something that should just be getting caught. If I were an engineer on the integrity team, I would want to know why our systems aren’t catching them.”

    This is worrisome. I don't use Meta (FB or IG or even WA), so I can't vouch for these findings personally. But there are so many users of these platforms who may have. Maybe even some of you have experienced something like this.

    This article is followed by an update piece: "Instagram Throttles 404 Media Investigation Into Drug Ads on Instagram, Continues to Let People Advertise Drugs."

    26 votes
    1. chiliedogg

      All online ads should require human review.

      I review every physical sign and billboard ad placed in my city, and my reviews go way beyond what an ad review would require, since I'm reviewing lighting, structure, design, zoning requirements, etc., in addition to content. My review costs the customer about 250-300 bucks, including multiple reviews and physical inspections.

      Human review of online ads could probably be done for less than 10 dollars per ad.
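
      A quick sketch of the arithmetic behind that guess; both numbers here are illustrative assumptions on my part, not figures from my sign-review work above:

      ```python
      # Rough cost model for human review of a single online ad.
      # ASSUMPTIONS: hourly_cost and minutes_per_ad are illustrative guesses.
      hourly_cost = 30.0      # hypothetical fully-loaded reviewer cost, $/hour
      minutes_per_ad = 5      # hypothetical time to review one online ad

      cost_per_ad = hourly_cost * minutes_per_ad / 60
      print(f"${cost_per_ad:.2f} per ad")
      # Under these assumptions: $2.50 per ad, comfortably under 10 dollars.
      ```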

      16 votes
    2. updawg

      It's kind of worrisome, but the author admits they're only seeing those ads because they went looking for them. I'm not saying it's great to have companies selling ads to criminals, but it sounds like removing them isn't exactly necessary to protect our society.

      8 votes
    3. MaoZedongers

      Wow, Facebook's vetting system being worse than YouTube's is really saying something.

      Most of those are probably scams, though, and the people using them will get nothing but humiliation.

      1 vote
  2. [3]
    Wolf_359

    My guess would be scams and/or honeypots for law enforcement and counter-terrorism.

    I can't speak to this issue in particular but I can comment on something semi-related. I know it's been said before but the alt-right pipeline on YouTube and other social media is way too easy to fall into. I'll be watching stand-up comedy and then Joe Rogan shows up in my feed. From there, it's all over. Next thing I know I can't escape Jordan Peterson, Richard Spencer, etc. Or I'll be watching a news clip about the war in Ukraine and they'll interview AOC. From there, it'll be nonstop videos of "liberal AOC getting HUMILIATED AND OWNED BY BIG BRAIN BEN SHAPIRO."

    Whether it's ads or video recommendations, I don't know about these algorithms man...

    23 votes
    1. [2]
      balooga

      Seems likely to me. Actual black markets are operating online, and they're easy enough for motivated customers to find, but they're not parading around in Meta's panopticon, that's a certainty. This looks like LEO bait for the clueless.

      7 votes
      1. R3qn65

        Perfect use of the word panopticon.

        1 vote
  3. DeaconBlue

    I think that this line of the article is worth a bit of a side tangent, because it applies not only to Meta/Alphabet/Whoever, but to every private entity's space.

    Regardless of whether the accounts are scams or actually selling drugs, guns, and credit cards, the ads shouldn’t be on Instagram under its own terms of use and are a content moderation failure.

    Content moderation is not (generally) a law. It is a set of guidelines on how users are expected to behave. Advertisers are not users. The owners of the space are well within their rights to arbitrarily enforce, or not enforce, almost any rules they want. Even within the overall set of "users," we know that content moderation is still a sliding scale: on most platforms, more prominent users are allowed to break more rules than a fresh account.

    In this particular case there might actually be applicable laws against advertising illegal substances, but the general case of "advertiser breaking user Terms of Service" is a very silly thing to complain about.

    9 votes
  4. [3]
    arrza

    I don't see what the big deal is. This is the free market at work. Society clearly thinks this is OK, or at least tolerable if it means they get to see Janey's makeup tips.

    Plus, we're only talking about drugs and guns, hardly the most pernicious ills of society.

    If it were so bad, we'd get the government to intervene in some way. The moral panic types are either unaware, or they approve.

    2 votes
    1. [2]
      NoblePath

      Oh my! [Edit: I just realized that perhaps your comment was satire, often difficult to distinguish in internet comments. My reply is directed at anyone, yourself included, who may have read it literally]

      I agree this is the free market at work. Pretty hard disagree on everything else.

      Just because something is visible, or even prevalent, does not mean "society clearly thinks this is OK." It was, and in some places still is, very easy for a 12-year-old to buy crack in the projects. Only the most debased dealer thinks this is OK.

      Guns in America are certainly among the most pernicious ills of society. There's lots of discussion on this.

      I'm not even sure where to begin with your last comment. Government can, and occasionally does, intervene successfully to curtail social ills. But government's failure to act at all, let alone successfully, is highly variable across time and issue. Government certainly should regulate public spaces for the public good, but the list of absent and ineffective regulation is very, very long. East Palestine train derailment, anyone? Cigarettes? Lead in gasoline? Coal ash? Forever chemicals? Asbestos? High-fructose corn syrup? The Iraq War?

      3 votes
      1. arrza

        My comment wasn't satire; it was a mixture of cynicism, apathy, pragmatism, and helplessness in the face of what lies before us.

        Large scale social media is a failed experiment. It has nothing good to offer society. Its continued existence is a blight on humanity.

        Nothing will be done by the American government to rein it in because of several systemic issues. The current state of our federal government is that it can barely manage to pass a budget to fund its own existence. Furthermore, our legislature is fully captured by these corporations, which only makes the task harder, if not impossible.

        If I were king, I'd smite these companies and their websites with impunity, and then salt the earth they stood on so they could never come back. That's never going to happen, so I just don't participate on them, which is the best I can do.

        3 votes