15 votes

On popular online platforms, predatory groups coerce children into self-harm

3 comments

  1. skybrian
    Link
    From the article (which has a trigger warning, but I won’t quote the more disturbing parts):

    Unlike many “sextortion” schemes that seek money or increasingly graphic images, these perpetrators are chasing notoriety in a community that glorifies cruelty, victims and law enforcement officials say. The FBI issued a public warning in September identifying eight such groups that target minors between the ages of 8 and 17, seeking to harm them for the members’ “own entertainment or their own sense of fame.”

    The group that targeted the Oklahoma girl and others interviewed for this report is called “764,” named after the partial Zip code of the teenager who created it in 2021. Its activities fit the definition of domestic terrorism, the FBI recently argued in court.

    While lawmakers, regulators and social media critics have long scrutinized how Facebook and Instagram can harm children, this new network thrives on Discord and the messaging app Telegram — platforms that the group 764 has used as “vessels to desensitize vulnerable populations” so they might be manipulated, a federal prosecutor said in court recently.

    Discord, a hub for gamers, is one of the most popular social media platforms among teens and is growing fast. The platform allows anonymous users to control and moderate large swaths of its private meeting rooms with little oversight.

    Telegram — an app that includes group chats and has more than 800 million monthly users — allows for fully encrypted communication, a feature that protects privacy but makes moderation more challenging. Telegram delegates most moderation to leaders of groups on the platform, intervening in some instances when posts violate its policies.

    The platforms say deterring these groups is an urgent priority. But after creating the spaces that predators from around the globe use to connect with one another and find vulnerable children, even removing thousands of accounts each month has proved insufficient. The targeted users start new accounts and swiftly reconvene, according to interviews with victims.

    After reporters sought comment, Telegram shut down dozens of groups the consortium identified as communication hubs for the network.

    Discord has filed “many hundreds” of reports about 764 with law enforcement authorities, according to a company spokeswoman, speaking on the condition of anonymity for fear of retaliation from 764-affiliated groups. The company removed 34,000 user accounts associated with the group last year, many of them assumed to be repeat offenders, she said.

    [Discord] uses artificial intelligence to detect predatory behavior and scans for abusive text and known sexually explicit images of children in the platform’s public spaces, the spokeswoman said. It shuts down problem accounts and meeting spaces and sometimes bans users with a particular IP address, email or phone number, though the spokeswoman acknowledged that sophisticated users can sometimes evade these measures.

    Court and police records show that Discord struggled to keep Cadenhead off its platform.

    Starting in November 2020, the company spokeswoman said, Discord noticed that child sexual abuse material was being uploaded from IP addresses — a set of numbers that identify a device used to connect to the internet — that investigators later traced back to Cadenhead. The company sent authorities reports about illegal images on 58 different accounts operated by Cadenhead, well into 2021, the spokeswoman said.

    The Discord spokeswoman said that each time one of Cadenhead’s accounts was flagged, it was shut down and banned. She acknowledged that the company banned only some of the IP addresses used by Cadenhead, saying that it used such bans only when they were deemed tactically appropriate. She said sophisticated predators often have 50 to 100 accounts, some stolen or purchased, to evade enforcement actions.

    The reports from Discord prompted the investigation that led to his arrest on child pornography charges in July 2021. Speaking later to a juvenile probation officer, Cadenhead said that his server attracted as many as 400 members who routinely posted shocking images, including videos of torture and child pornography. It was also “quite common” for members to groom victims and extort them by threatening to distribute compromising images, Cadenhead told the officer. Sometimes their motivation was money, and other times they did it “just for power,” the officer wrote in a report to the court after Cadenhead pleaded guilty.

    Cadenhead, now 18 and serving an 80-year prison sentence for possession with intent to promote child pornography, did not respond to a letter requesting an interview. His parents did not return messages. Chris Perri, a lawyer for Cadenhead, said he may challenge the sentence based on “potential mental health issues.”

    A nonprofit that directs reports of abuse against children from social media companies to law enforcement said it saw a sharp increase in this type of exploitation last year. Fallon McNulty, director of the CyberTipline at the National Center for Missing and Exploited Children, said the center received hundreds of reports of minors extorted into hurting themselves last year and continues to receive dozens each month.

    A Roblox spokesperson said the platform is aware of the groups’ activities.

    7 votes
  2. AlexeyKaramazov
    Link
    Where are the free speech, anti-regulation, open internet, free market ideologues on this one?

    3 votes