15 votes

Additional steps Twitter is taking ahead of the 2020 US Election

19 comments

  1. [6]
    nacho
    Link

    This is nice and all, but it's at least a couple years too late to make much of a difference in this election. The lies and misinformation have been stewing and whipping up crowds in such an insular manner that those people will not have their minds changed now.

    If I were conspiratorial, I'd assume this is just to cover their backs when social media gets investigated after the election, possibly in combination with fears of much stricter regulation of what gets posted to a platform you run.

    15 votes
    1. [5]
      mrbig
      Link Parent

      Wouldn’t you say that, while clearly imperfect, late, and insufficient, this effort tends to the positive? Wouldn’t it be in our best interest to praise such effort, with the goal of encouraging similar (and possibly superior) policies?

      6 votes
      1. [4]
        nacho
        Link Parent

        If a restaurant takes some puny measures to get a little less abysmal health rating from inspectors, praise is not in order if the restaurant is still not safe to eat in.

        This is nice and all. It's better than nothing. That is true.

        The company has had terrible anti-cheating and anti-fraud policies for years. That context needs to be present so people don't walk away with the take-home message that Twitter is fighting this in a reasonable way, because they're not.


        Doing anything else leads to acceptance and normalization of how terribly Facebook, Twitter, Youtube and the rest are doing at actually protecting democracy against interference and manipulation. The same goes for all other conversation about them, too.

        All these companies have invested huge amounts of effort and money in normalizing the idea that content on their platforms goes unmoderated. That is not normal. It shouldn't be normal, and politicians should regulate them properly.

        4 votes
        1. [3]
          mrbig
          Link Parent

          You can of course provide a nuanced assessment of the situation that accomplishes the purpose of encouraging improvement without excusing abhorrent behavior.

          3 votes
          1. [2]
            nacho
            Link Parent
            • Exemplary

            I think the result of such a nuanced assessment is to conclude that this is a tiny drop of cleaning agent in a vast, self-made swamp of misleading nonsense.

            If a nuclear power plant were to take measures so that the new nuclear waste they release into their surroundings is slightly less radioactive than what came before, great! How should we react to such a change?

            The way to encourage further improvement both for the nuclear plant and Twitter isn't to be content with such a modicum of effort, but to demand not only that they stop all meaningful new radioactive release into the environment, but also clean up their past mess.


            That's the position Twitter is in. Not acknowledging that present state would in my mind be a mistake.

            Twitter, Facebook, Youtube, Instagram, Snapchat, Reddit and all the rest of these huge platforms only seem to enact policy changes in favor of actually curating some of their content when they face large-scale media pressure. Therefore the only reasonable response is not to back down and not to acquiesce when they present too-little-too-late half-measures as meaningful change and recognition of their culpability.

            Using the very words of this press release from Twitter itself, what is the correct nuanced assessment of a platform that deems itself to be playing

            a critical role around the globe by empowering democratic conversation, driving civic participation, facilitating meaningful political debate, and enabling people to hold those in power accountable. But we know that this cannot be achieved unless the integrity of this critical dialogue on Twitter is protected from attempts — both foreign and domestic — to undermine it.

            Why is this change coming now, cleverly timed for the US election and maximum exposure there, rather than before a host of other important national elections all through 2020, 2019, 2018, 2017, 2016 and 2015?

            Even in that context we give Twitter way more benefit of the doubt than they deserve regarding how well they understand their own platform and how they've dealt with all the huge previous misinformation campaigns they've harbored while doing literally nothing.

            The only reasonable way of encouraging Twitter to do more is to continue being clear, unambiguous and vocal in demanding that they take responsibility for doing much more, and the fact that they aren't should be top news and high on the national political agenda for new legislation all the time. That is what they respond to, so we must speak their language.

            12 votes
            1. mrbig
              (edited )
              Link Parent

              If a nuclear power plant were to take measures so that the new nuclear waste they release into their surroundings is slightly less radioactive than what came before, great! How should we react to such a change?

              I believe that’s an extremely pessimistic viewpoint that is not in consonance with reality. It is also incredibly persuasive. The whole argument is persuasive, yet it contains unjustified categorical conclusions. These mistakes are at the foundation of your reasoning and risk taking it down with them. I literally cannot counter everything you said from my tiny smartphone. But that’s my impression.

              1 vote
  2. [13]
    JXM
    Link

    So they're still making it incredibly easy for people to spread false information and they're patting themselves on the back because they made users click an extra button before they can retweet something?

    4 votes
    1. LukeZaz
      Link Parent

      Hey, every additional step is a step that many might not take. You can't stop people from being assholes, but you can make it harder, and sometimes it takes surprisingly little to let people's laziness convince them not to be a dick.

      8 votes
    2. [11]
      skybrian
      Link Parent

      I’m having trouble imagining a social network of Twitter’s scale where it’s difficult to spread false information, through cut-and-paste if not resharing. How do you imagine it would work?

      2 votes
      1. [8]
        JXM
        Link Parent

        You know, delete tweets with a known URL that contains false information and kick people who repeatedly share false info off the platform...

        There are tons of problems with this approach, but I’m not paid millions of dollars per year to figure out this stuff.
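
        To make that concrete, here is a napkin sketch of that kind of rule in Python. The blocked domains, the strike threshold, and every name in it are hypothetical; this is not anything Twitter actually runs, just an illustration of the basic mechanism. The hard part is everything around it (keeping the list current, link shorteners, screenshots, appeals).

          from urllib.parse import urlparse

          # Hypothetical inputs: domains flagged as misinformation sources and a
          # per-user strike counter. None of this reflects Twitter's real systems.
          BLOCKED_DOMAINS = {"example-misinfo.test", "fake-news.test"}
          MAX_STRIKES = 3
          strikes = {}

          def moderate_tweet(user_id, urls):
              """Delete tweets linking to blocked domains; suspend repeat offenders."""
              if any(urlparse(u).netloc.lower() in BLOCKED_DOMAINS for u in urls):
                  strikes[user_id] = strikes.get(user_id, 0) + 1
                  return "suspend_account" if strikes[user_id] >= MAX_STRIKES else "delete_tweet"
              return "allow"

          # A third offending tweet from the same user crosses the threshold.
          for _ in range(3):
              action = moderate_tweet("user_42", ["https://example-misinfo.test/story"])
          print(action)  # suspend_account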

        5 votes
        1. [7]
          skybrian
          Link Parent

          If you know this is a hard problem, maybe consider that contempt might not be a helpful take on the situation?

          7 votes
          1. [5]
            kfwyre
            Link Parent

            Does Twitter deserve contempt for letting the issue get so bad in the first place though? Part of what has made this into such a hard problem has been longstanding inaction on their part. They have helped to create the current conditions that they are now forced to respond to.

            7 votes
            1. [4]
              skybrian
              Link Parent

              There was enormous growth at the beginning, when Twitter couldn’t even prevent the site from going down. (Remember the fail whale? It was named in 2008 and discontinued in 2013.)

              In retrospect, they should have been wary of growth, but social networks weren’t known to be so toxic, and growth is usually considered a sign of success, so a Cassandra saying that they should drastically limit growth wouldn’t have been listened to.

              So, they’ve had a tiger by the tail for a long time, and the nature of the tiger only gradually became apparent. This doesn’t excuse their drift and inaction since then, but unless the idea is to kick 99% of their users off and essentially start over, the problem has always been moderation at massive scale.

              3 votes
              1. [3]
                kfwyre
                (edited )
                Link Parent

                but social networks weren’t known to be so toxic

                I'm going to push back on this, not because it isn't true (I think it is), but because I think its truth masks a bigger issue. Ultimately, I think this idea is the key to understanding why some people look at networks like Twitter and give them the benefit of the doubt, while others, like me, don't. From my perspective, social media platforms have always had pretty significant toxicity, but I think many people have been insulated from it, which has created a genuine belief that it doesn't exist or that it's a recent development, even though that belief is false.

                Speaking from my experiences, with the exception of Tildes (thanks @Deimos and all those who make this such a great place to be -- yourself included, skybrian!), social media has never not been toxic. Unfortunately, my attempts to raise this issue, along with the efforts of many like me with similar experiences, simply haven't been heard, and it's not for lack of trying. There was a long period of time, probably over a decade ago now, where even mentioning on reddit that the word "fag" could be hurtful, for example, was an invitation to widespread downvotes and an inbox full of hatred. My initial time on the site was punctuated with seeing the word plastered fondly in threads all over the site, along with plenty of other overtly homophobic stuff, much of it far more damaging than just a single slur. It's also worth noting that, at the time, reddit was considered one of the more progressive places on the internet. You can imagine what things were like elsewhere.

                Furthermore, reddit was quite unlike forums of the past, eschewing old conventions such as flat conversations, small scopes and user counts, and the absence of mechanisms for group feedback or for decontextualizing posts. reddit's new mechanisms not only allowed homophobic posts to exist, but then demonstrated that these posts were in fact endorsed by the broader community. They were socially valuable. These mechanisms also allowed dogpiling on unpopular opinions, leaving minority opinions vulnerable to social DDOS attacks, all while the reward structures and cross-pollination of the site's design allowed posts of that type to percolate throughout the culture of the site, well beyond the scope of the original post. It gave homophobia not just an uncomfortable place to exist but a valuable and embedded spot in the site's zeitgeist itself. The same mechanisms that benefitted discriminatory posts also allowed those who attempted to check or oppose them to be silenced, ignored, or run off.

                Plenty of users would have felt that there wasn't a toxicity problem then, even with all of this going on out in the open. It didn't impact them, and users like me were both shut down on the platform itself and outright discouraged from using it in the first place. If anything were to be seen as toxic by the reddit of the time, it likely would have been me, trying to come in and silence free speech and be a wet blanket on everyone else's good time, when all I wanted was to not be everyone else's punching bag.

                It's not that the toxicity wasn't there; it's that its effects were disproportionately felt. I've spoken as a gay guy, and honestly, though my experience online has been bad, it's not even close to the worst of what's out there. I'm still, after all, a cis white guy. Women, people of color, and trans people have been the targets of hatred on social media the likes of which I cannot fathom. Some of the stuff they've been and continue to be regularly subjected to makes my experiences look mild in comparison.

                I think the change we've seen over the years with modern social media is that toxicity is now more widespread, and people who were previously insulated from its effects are now having to experience and confront it in ways they didn't use to. This is why it seems "new", even though it really isn't -- it's just further evolution of a pre-existing malignancy. I also think this has come about because platforms have failed to take seriously the toxicity they did have, and that failure wasn't out of ignorance but out of continued, deliberate neglect. They've always pointed to people as the problem, pushing the issues down to the individual user level, absolving themselves of responsibility. They've also used the escape clause that society gives them: these issues exist in real life too, after all, so the platforms pretend they are simply a mirror -- reflective of worldly issues -- rather than complicit in perpetuating them, or even in outright creating new issues themselves.

                There are many people who have been sounding the alarm on this for a long time now, but the platforms seem more interested in turning up the radio to drown out the noise of the ringing rather than dealing with the cause of the bell going off in the first place. And, honestly, we're kind of tired of saying the same things over and over and seeing nothing come of it. Toxicity isn't new to us, and we know what happens when it's left unchecked because we've already been dealing with it all our lives.

                It is immensely hard not to give a cold shoulder and a terse "I told you so" to Twitter in moments like this, when we've literally been telling them for well over a decade now, and have gone almost entirely unheard. I'll always celebrate progress, and I certainly appreciate small steps in the right direction, but to me, that's all this is, and it's very hard for me to ignore the larger context in which it sits: an out-of-control digital fire left to rage unabated, and all because the fire department wouldn't even take our 911 calls back when it was a series of much smaller, more isolated blazes. Rather than putting out the tiny fires we were calling in, they simply looked at all the stuff that wasn't burning and assessed that there wasn't an issue, never once considering the reality that all of that unburnt space, too, might catch one day.

                It didn't have to be this way, and the people who have been getting burned have been letting platforms know that every step of the way. I think the platforms bear significant responsibility for being unwilling to listen. This is also what makes Tildes so significant, and why I'm here in the first place. It's a rare instance of a platform designed by someone who did.

                9 votes
                1. [2]
                  skybrian
                  Link Parent

                  I'm not in touch with how the online gay community evolved and I'm also assuming everything you say is true, but I am wondering if many people might have been paying more attention to some of the more positive aspects of the Internet: the rise of specialist online communities, where people can get in touch with others like themselves and know that they are not alone. It seems like this was important? Also, maybe it seemed more likely that censorship would be used against LGBT people?

                  I do remember that around the early Obama administration, there was quite a lot of optimism about the positive aspects of the Internet. There were a lot of congratulatory articles about how social networks help people organize, with Arab Spring as an example. (For some reason, it wasn't considered so important that people can organize for evil ends, too.)

                  3 votes
                  1. kfwyre
                    (edited )
                    Link Parent

                    Those are good points, and my earlier comment is actually a bit unjust because I didn't speak to the positive aspects of the internet, particularly in regards to LGBT people. I portray it as something of a wasteland, but the reality was far from that -- r/lgbt was one of the first places I ever felt "at home" online! Prior to that my hangouts consisted mostly of disparate blogs I followed -- good in terms of receiving input on LGBT topics, but not great for output or self-expression. It was all very one-way, and reddit changed that for me. I met my husband on reddit, so clearly the site does have some value!

                    The problem is that my positive experience there was degraded, from day one, by all the other elements of reddit that I identified. Much of online community organizing for marginalized minorities has been about finding our own spaces, and reddit's subreddit system very much enabled that in ways that hadn't been possible prior, but it also created lots of other issues which I talked about earlier. Then, as time carried on and we raised those issues, they went mostly unaddressed. The place where I met my husband was awash in homophobic slurs at the time when we started talking. As heartwarming as it is that we met online, the darker contexts dampen that shine a bit: our beautiful queer love started on a platform where people felt justified in calling others "fags". That's how widespread it was at the time, and while others can look and say "it was a different time back then!", it was never a different time for us. It was our time, and we chose to spend it there, at a place where our dignity was regularly maligned, because even back then it was still better than most of our other options.

                    There definitely was a period of time when the internet and its platforms had this ultra-positive, change-the-world forward motion to them, but I also think that existed mostly in techy spaces, among people who were exploring, contributing to, and even charting the course for its direction. I don't want to take that experience away from anyone, as I have no doubt that it's the lived truth for many people here (I assume you as well, given that you used to work for Google!), but I also think it's a limited perspective. Not untrue, but not a full picture, either.

                    Alongside all of that positivity were people like me, who were included in that thrust and genuinely did benefit from it, but who also saw the bumps we hit along the way and the concerns that many were turning a blind eye to.

                    5 votes
          2. JXM
            Link Parent

            They have had over a decade to figure it out and have only taken the barest minimum of steps to stop it. They’ve also repeatedly let their platform be used to harass and send death threats to people.

            I’d say that’s worthy of a little contempt.

            5 votes
      2. [2]
        nacho
        Link Parent

        With an appropriate amount of staff policing the platform.

        For some reason facebook, twitter, youtube et al. have managed to seed the thought that platforms should largely be checked by algorithm rather than by employees.

        This is where regulation comes in.

        2 votes
        1. skybrian
          Link Parent

          I was unable to find any numbers for Twitter, but it looks like Facebook had about 15,000 content moderators in the US as of March. (source)

          I also saw an article about a report recommending 30,000, but I still think that assumes mostly algorithmic moderation. Taking a wild guess, they'd probably need 10x more to do a good job by hand?

          For comparison, Amazon has about 120,000 warehouse workers and Walmart has over a million employees in the US, and you couldn't really say that when you go to a Walmart there's a lot of staff around.
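
          Spelling that back-of-the-envelope arithmetic out as a tiny sketch (nothing here beyond the numbers above plus the same wild guess):

            # Rough staffing estimate using only the figures mentioned above.
            reported_moderators = 15_000      # Facebook's US content moderators as of March
            recommended_moderators = 30_000   # figure from the report mentioned above
            manual_multiplier = 10            # wild guess: mostly-manual review vs. algorithmic triage

            mostly_manual = recommended_moderators * manual_multiplier
            print(f"Reported: {reported_moderators:,}; recommended: {recommended_moderators:,}")
            print(f"Mostly by hand: roughly {mostly_manual:,} moderators")
            # Roughly 300,000 people, i.e. more than double Amazon's ~120,000 US
            # warehouse workers mentioned above for comparison.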

          Those are places where customers actually pay for things. There is an assumption that social media can be done for free and I think Facebook and Twitter would have to start charging a subscription fee.

          That might not be a bad thing, but I think there would be a lot of complaints.

          I also wonder about a world where we've got so many people working as content moderators for big, impersonal firms. It seems like moderation is best done by people who are actually familiar with the community. For something like Reddit, the most natural way to do it would be paid hosts, but Twitter and Facebook have a different structure.

          4 votes