30 votes

PSA: Disinformation and the over-representation of false flag events on social media.

I've noticed lately that on certain social media websites, particularly Reddit and Facebook, there has been an uptick in articles about fake hate crimes and false rape reports. The comments on these articles especially fan the flames on the subjects of homophobia, racism, and sexism. While the articles themselves are still noteworthy and deserving of attention, the amount of attention they've been receiving has been disproportionately high (especially considering how relatively unknown the individuals involved are), and the discourse on those articles has been particularly divisive.

On top of that, there are clear disinformation campaigns going on to attack current Democratic presidential candidates in the U.S. It seems pretty clear that we're having a repeat of the last presidential election, with outside parties stoking the flames of discrimination and disinformation on social media in order to further ideological divisions, and the consumers of that media readily falling for it.

I would caution readers to be mindful of the shifting representation of historically controversial or contentious topics moving forward. Even if the articles themselves are solidly factual, take note of how frequently you're seeing them, whether they concern known contentious topics, and how they're affecting online discourse.

In short: make sure that you can still smell bullshit even when it's dressed up in pretty little facts.

6 comments

  1. [3]
    hereticalgorithm
    (edited)
    Yeah, I've noticed the upswing in articles about fake rape & hate crime reports too. While at first I attributed it to circlejerking, that sort of hive-mind witch-hunt mentality can be easily induced by an actor fairly familiar with social media.

    For instance, /r/cringepics deliberately incited outrage against themselves as an April Fools' joke, and it ended up being much bigger than expected. A seriously motivated and/or paid actor could do a lot more damage. Just keep posting articles about fake hate crimes & rape reports at the right time, and even if someone catches on, way more people are gonna be convinced than those who realize something's wrong (going off the rule of thumb that only about 10% of viewers read the comments).

    There was a post on /r/trashy of a screenshot about a book, written by a pseudonymous author claiming to be a feminist, about using false rape accusations to get power. All of the other suggested books were MGTOW stuff, and given the ease of self-publication, it was almost certainly written by one of their own (either to profit off outrage buys, or to spread propaganda).

    Some people in the comments caught on that it was BS, but anyone seeing it on the front page who didn't think about it too hard would have been left with an impression that this was more of a serious thing than it actually was. The moderator response was to remove, lock and tag the post - after the damage was already done.

    Personally, I'd like to see a "retraction" feature on social media sites that pushes an alert to people who saw a false/misleading post (or even content based off of it), instead of just quietly deleting it... Assuming the platform allows us to trace that graph, there's then the question of who can send those retractions (moderators are the obvious candidates on Reddit and Tildes, but what about Twitter or Facebook?), and how users receive them. Honestly, it might even just generally be a nice feature for catching updates on posts.

    13 votes
    1. [2]
      Emerald_Knight
      You know, I have mixed feelings about the idea of a "retraction" feature. Because of the potential immediacy of update notifications, I could see it being abused quite easily by users who disagree with the submission and have the requisite moderation privileges. It could very easily devolve into ideological arguments resulting in many notifications being sent to users unnecessarily, effectively becoming a thread of "super comments". Any attempts at resolving the issue would then likely result in a complaint over "taking sides" and there would likely be a fair bit of drama involved. The ability to send out those notifications would be fantastic, but setting in place a model of moderation that would be immune or even just resistant to this sort of abuse and the resulting fallout would be unfeasible at best.

      I feel like the only way to somewhat reasonably do this would be to integrate with some reputable, neutral, third-party fact-checking system and periodically audit link submissions via an automated process. I'm not sure, though. Perhaps someone else could come up with a way to keep the abuse of such a system at bay without keeping them out of the hands of human users, but I can't come up with anything.

      Granted, if we were considering this for Tildes, then the Tildes philosophy is to trust by default and punish abuse, but I feel like there's a certain degree of potential abuse that can't be ignored so readily. There's definitely a pretty big difference between allowing the silent editing and deleting of submissions (which other users could review and revert) and being given a public broadcasting feature for users to have particularly "loud" arguments over petty disagreements. It just wouldn't be worth the hassle.

      2 votes
      1. hereticalgorithm
        (edited)
        Yeah, that's the hard part, but it's fundamentally the same problem with trusting moderators in general (pinned posts for instance are super posts), just with stronger tools. I don't think integrating w/ a third party fact-checking system is a solution, because that only offloads the problem of trust somewhere else.

        I'm thinking that a checks & balances system (as this is fundamentally a political problem, like it or not) could do the trick for a tool requiring as much caution as retractions.

        First, a clear policy on what exactly makes a post retraction-worthy (perhaps with different tiers of retraction). Ultimately this should be figured out in public, open discussion (time to post this on ~tildes?) to ensure that it represents the userbase. As a starting point, I think the line for retractions isn't just posts that are incorrect (because people can and will disagree with those), but posts that are crafted by actors (antagonistic to the interests of the userbase) in such a way that they distort the very context/frame of the post and our ability to contest it.

        For instance, a post from an anti-vaxxer about how vaccines cause autism, while factually incorrect, still honestly frames it as their own viewpoint. Even if they titled it "WHAT DOCTORS DON'T WANT YOU TO KNOW", which misrepresents the medical viewpoint, that is still a statement being made in the context of "an anti-vaxxer is posting this". Statements framed like these can be opposed directly.

        Meanwhile, that book example not only presents a blatantly misogynistic claim (women regularly abuse sexual assault claims to gain power), but the post frames the context as "a feminist in the metoo movement is presenting this". Thus disagreement with the surface-level content causes people, without their knowledge, to actually agree with the ideology ultimately being pushed here. Astroturfing would also likely be grounds for retraction.

        Second, the fact that moderators (perhaps requiring a vote amongst themselves) have to be the ones to issue a retraction is already a check. Limiting retraction power prevents it from being used for spam, and builds this system on top of the existing technical & social tools used for moderation.

        Third, feedback from users on whether a retraction was necessary. This could be a vote attached to the retraction, visible both in the messages sent to users and on the post being retracted (honestly, this sounds a lot like a pinned comment, but one that goes into people's inboxes). This also makes a degree of user consensus in the comments section an informal requirement before moderators even consider retracting.

        For reasons similar to the second check, I don't see a separate technical measure to boot off moderators who issue unpopular retractions as necessary. The fact that vote totals on a retraction would be visible plays into already existing checks, in both directions - popular retractions affirm consensus, unpopular ones signal dissent (suggesting that moderators need to change things up, users should leave, or that the policy is bad).
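        The three checks above could be sketched roughly like this. To be clear, this is a minimal illustration, not a real platform's API: every name here (`Post`, `Retraction`, `MOD_QUORUM`, `issue_retraction`) is invented for the example, and the "graph" of who saw a post is reduced to a simple viewer set.

        ```python
        from dataclasses import dataclass, field

        MOD_QUORUM = 2  # hypothetical: a retraction needs at least this many moderator votes (check 2)

        @dataclass
        class Post:
            post_id: int
            viewers: set = field(default_factory=set)   # users who saw the post (the traceable graph)

        @dataclass
        class Retraction:
            reason: str                                   # should cite the public retraction policy (check 1)
            mod_votes: set = field(default_factory=set)   # moderators voting to retract (check 2)
            feedback: dict = field(default_factory=dict)  # user -> was this retraction warranted? (check 3)

            def approved(self):
                return len(self.mod_votes) >= MOD_QUORUM

        def issue_retraction(post, retraction):
            """Push an alert to everyone who saw the post, rather than silently deleting it."""
            if not retraction.approved():
                return []  # quorum not reached; nothing is sent
            return [(user, "Retracted post %d: %s" % (post.post_id, retraction.reason))
                    for user in sorted(post.viewers)]
        ```

        The visible feedback tally (check 3) would then live alongside the retraction itself, much like a pinned comment that also lands in people's inboxes.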

        2 votes
  2. [2]
    nacho
    Essentially, are publishers interested enough in newsworthiness and their role in setting the agenda?

    I think the answer to that is too often one of pageviews and attention rather than importance.

    Publications often have editorial priorities and agendas.


    Outrage culture, looking for strong negative emotional reactions to content, is also a sure way to get eyeballs and attention.

    We superficially consume or scroll past a lot of content and headlines without taking the time to read the full story.

    Agenda-pushing is a big deal. It's important. As a lot of people are no longer interested in civics and just get the headlines or soundbites, how well is the Fourth estate doing its job?

    What goes viral? How does it happen?

    In an age where anti-elitism is strong in certain groups, being knowledgeable, or pointing out that some views are less valid than others, can often be held against you. Truth isn't always a matter of opinion.


    Assuming the role of a victim gives a substantial advantage in many situations. It also creates tribalism.

    I think a lot of Democratic presidential candidates are getting deserved and important scrutiny. However, some things are blown out of proportion: they get more attention than their importance merits.

    Many of them have serious flaws that will be exposed during a general election campaign anyway. Of course there are vested interests against specific policies or people, but it's also very tempting to suggest external influence and tomfoolery rather than accepting that a lot of the criticism stems from the US being a highly polarized country where people just have different views on what's right for the country.

    3 votes
    1. Emerald_Knight
      Just some brief feedback: I've noticed that your rhetoric is frequently very "choppy". It feels like there are a lot of short, unanswered questions and soundbites that aren't being elaborated on very much. Also, there are sentences that are connected and that are intended to be continuations of the same thought but aren't being grouped together within the same paragraph. It makes it pretty difficult to figure out how your thoughts are organized, so it's often difficult to tell what message you're intending to convey.

      I don't mean to criticize; I just want you to be aware that the structure of your rhetoric could be making it difficult for people to understand you, and could be undermining the effectiveness of your arguments.


      With that out of the way, I would like to address this point in particular:

      . . . it's also very tempting to suggest external influence and tomfoolery rather than accepting that a lot of the criticism stems from the US being a highly polarized country where people just have different views on what's right for the country.

      These two aren't mutually exclusive. In fact, the entire reason external actors are engaging in this sort of divisive campaign in the first place is precisely because of the long-standing political and ideological polarization in this country. They're taking advantage of our weaknesses and attacking them relentlessly, causing the polarization to become worse. We already have evidence that they were doing this previously for that very purpose.

      This is the entire reason that I felt the need to post this topic in the first place--their tactics are insidious, giving an extra push in visibility to subjects that are legitimate but politically polarizing. It's important to be able to recognize when it's just the natural polarization that has historically been present in American political discourse, and when external actors are influencing that polarization to drive a wedge. This sudden uptick deviates heavily from the norm, which strongly suggests that external actors are at play once again.

      6 votes
  3. gyrozeppeli
    Outrage sells. Why do you think PETA is the way that it is? Obviously they planned on criticizing a well-known and liked community figure (Steve Irwin) who died suddenly, since it would push people's buttons and get them talking about PETA.

    This tradition is as old as time itself, I believe. If you've ever interacted with a small community, gossip happens quite often because people eat it up.

    1 vote