31 votes

How the biggest decentralized social network is dealing with its Nazi problem

42 comments

  1. [29]
    tesseractcat
    Link

    Personally, I'm totally fine with instances blocking instances they disagree with; however, apps blocking instances makes me uncomfortable, much like I would feel uncomfortable if a browser developer blocked websites it disagreed with. If anyone has reasoning for or against the in-app blocking of instances, I would be interested in hearing it.

    23 votes
    1. [6]
      cfabbro
      (edited )
      Link Parent
      • Exemplary

      https://en.wikipedia.org/wiki/Paradox_of_tolerance

      Less well known is the paradox of tolerance: Unlimited tolerance must lead to the disappearance of tolerance. If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed, and tolerance with them. — In this formulation, I do not imply, for instance, that we should always suppress the utterance of intolerant philosophies; as long as we can counter them by rational argument and keep them in check by public opinion, suppression would certainly be unwise. But we should claim the right to suppress them if necessary even by force; for it may easily turn out that they are not prepared to meet us on the level of rational argument, but begin by denouncing all argument; they may forbid their followers to listen to rational argument, because it is deceptive, and teach them to answer arguments by the use of their fists or pistols. We should therefore claim, in the name of tolerance, the right not to tolerate the intolerant.

      And given the fact that Gab was being used to spread Holocaust denial, anti-immigrant and anti-Semitic rhetoric, and calls for violence against said groups, and was being actively used as a recruitment tool for neo-Nazi and other right-wing domestic terrorism groups... and also the fact that it already fostered one mass shooter, an act largely celebrated by the Gab community... I would say it clearly falls into the latter type of intolerance that we can't just fight with rational arguments, but instead need to suppress at all costs lest our ability to be tolerant of anyone be undermined and eventually destroyed.

      See: https://en.wikipedia.org/wiki/Gab_(social_network)

      31 votes
      1. [5]
        tesseractcat
        Link Parent

        So do you believe that Gab's website should be blocked within open source projects such as Chromium or Firefox? If you do, do you believe that other sites with similar content should be blocked as well? If so, what do you believe the process for determining which sites get blocked should be?

        11 votes
        1. [4]
          cfabbro
          (edited )
          Link Parent

          To answer your first two questions: yes, if it comes to that. Although thankfully most hate speech platforms have trouble finding registrars and hosts anyway, because they inevitably violate the Terms of Service. E.g. https://en.wikipedia.org/wiki/The_Daily_Stormer#Site_hosting_issues_after_the_2017_Unite_the_Right_rally

          And to answer your third question, hate speech laws already exist in pretty much every Western country outside the US, and publications and websites have already been ordered by various courts to be barred from distribution and shut down for violating said laws. For a recent and still ongoing example from my neck of the woods, see: https://www.cbc.ca/news/canada/toronto/your-ward-promoting-hate-1.4990806

          So there are already plenty of examples available for how to determine what constitutes hate speech, hate propaganda, promotion of terrorism, etc. And applying those standards to another platform, app, open source project, etc. isn't unreasonable or unrealistic, IMO.

          p.s. Tildes itself has already done so too, in case you didn't know:
          https://docs.tildes.net/code-of-conduct

          Do not incite or encourage harm against people, including by posting hate speech or threats.

          20 votes
          1. Bullmaestro
            Link Parent

            I used to be for protecting free speech at all costs, but a few things changed my mind recently, the biggest of which was a friend being doxxed by 8chan and harassed on an almost daily basis for an entire year.

            Her Snapchat, Discord, Facebook and Instagram have been bombarded with predatory messages, rape threats and death threats. She's had the SWAT team called out on her at least once and she's had dozens of unwanted pizzas delivered to her address. On top of this, she's had several stalkers, many of whom originated from 8chan.

            As far as I'm concerned, countries like the US really need to rethink their free speech and free expression legislation because these laws alone are defending extremism. Such laws are essentially the reason why sites like 8chan, Voat and Gab can exist.

            24 votes
          2. [2]
            Adys
            Link Parent

            I think you missed the point @tesseractcat was trying to make.
            We do have recourse against sites like these. Legal recourse, usually, but more often, like you said, the services they use will bail on them as they end up threatening business, breaking TOS, or simply being dicks to the service.
            But changing the tools themselves (the browsers) to block such sites is a huuuuge leap. We do already have the unsafe site lists, and those are restricted to phishing and malware distribution. Beyond that, blocking legal speech is a massively bad idea. Let it be illegal but don't touch the tools, you know? Especially since the people interested will simply use different tools.

            Remember the EU upload filter legislation and how bad an idea that is.

            10 votes
            1. cfabbro
              (edited )
              Link Parent

              And I think you missed the point I was trying to make.

              blocking legal speech is a massively bad idea

              I never said legal speech should be blocked, but hate speech is not legal speech in the vast majority of the Western world. The US is virtually alone in allowing that garbage to remain legally protected, and propagate as a result.

              Especially since the people interested will simply use different tools.

              Let them do that then. And as with most things that you add an additional barrier to, the number of people that actually go out of their way to use the new tool will fall... which is precisely the point and to our collective benefit in the case of hate speech.

              Remember the EU upload filter legislation and how bad an idea that is.

              Non-sequitur. That was poorly thought out and many aspects of it were technically infeasible. Hate speech legislation, however, is already in place in most countries, and has decades of legal precedent behind it. Further, requiring browsers to block sites found guilty of violating said laws is not a massive leap, especially since DNS providers, registrars, hosting providers and ISPs in said countries are already bound by the requirement to comply with court-issued takedown orders. And with the Christchurch Call getting the support that it is, many social media platforms will also likely soon be similarly legally required to remove hate speech in most countries, e.g. via Canada's planned Digital Charter. So why shouldn't browsers also be included to help those countries deal with hate speech hosted outside their jurisdiction?

              14 votes
    2. [18]
      alyaza
      Link Parent

      however apps blocking instances makes me uncomfortable

      there is really zero good reason to allow people to access a place that is literally just far-righters talking about how much they want to kill all the niggers and kikes and establish a white ethnostate free of the degenerates and whatever else unless you're somehow compelled to do so by law, which nobody is. maybe if gab doesn't want to be deplatformed, it should do something about all the people who would gladly string people they deem inferior up and throw them in gas chambers.

      9 votes
      1. [17]
        unknown user
        Link Parent

        Gab might be an obvious shithole, but who gets to decide on more particular cases? How do we react if a major browser decides to block LGBTQ+ websites? What if all major text editors refused to allow editing files with racist words in them? Should we all wear glasses that hide far right content? Should areas with mostly far righters be deprived of roads and public transport?

        Deplatforming is good. What instances do is good. But tools themselves should not have ideas. If they do, it is a very dystopian future. Browsers or Mastodon clients aren't a good platform to implement censorship on. All it takes is changing one line in a single text file, and you've worked around the censorship.

        6 votes
        1. [4]
          mat
          Link Parent

          How do we react if a major browser decides to block LGBTQ+ websites?

          I suspect that would cause a significant loss of users for that browser. But if the browser makers want to implement that kind of filter, why shouldn't they? If you're anti-censorship you should be in favour of letting people - including people who make browsers - express themselves however they choose.

          Your other examples are obviously preposterous although if I could get some of those anti-fascist glasses that would be good.

          Deplatforming is good.

          We agree on that, certainly.

          But tools themselves should not have ideas. If they do, it is a very dystopian future.

          You say that as if it's entirely self-evident but I can't see any reason why tools shouldn't express ideas. People have ideas. People make tools. Tools are just expressions of human will, literally the entire purpose of a tool is the turning of an idea into an action. I don't think it follows at all that if someone chooses to limit fascists from using their tool that we're heading to a dystopia. It might even mean the opposite.

          Browsers or Mastodon clients aren't a good platform to implement censorship on. All it takes is changing one line in a single text file, and you've worked around the censorship.

          Eh, it might do. I assume you're talking about forking the code and releasing a new version. Sure, the hardcore will do that. But it's not the hardcore who deplatforming works on, it's the casuals, who might not even know there's censorship to work around, or might not be bothered to install a new browser or whatever. There seems to be some evidence that reducing casual exposure to harmful memes (and I mean that word in its original sense, not silly pictures) is an effective way to reduce the spread of fascism. I might even argue that a browser is the perfect place to do that.

          Also, can we stop using the negatively loaded word 'censorship' to describe the silencing of fascists, as if that's a bad thing to do? "Filtering" is a little more neutral.

          9 votes
          1. [3]
            unknown user
            Link Parent

            Censorship is censorship. It is important to be aware that it is a band-aid solution, and that other actions are necessary, like education, evangelising, and general social progress. This sort of euphemism would only confuse us.

            Apart from that, I'm not anti-censorship. I just think that censorship should not happen at the level of clients. They should remain neutral. Just like a wrench does not discriminate between nuts to tighten, a browser should not discriminate between websites to visit. Because then the stronger, not the ethically better, will determine the rules. That is really dangerous in this world we live in where far right is gaining wind in its sails again.

            "Filtering" is something the user decides on. I have uBlock Origin with very strict filters on. I filter out politics on Tildes via not subscribing to ~news, and do not subscribe to ~lgbt in order to avoid long discussions (dealing with procrastination issues). I don't subscribe to /r/Turkey b/c it's a racist nationalist sub. Etc etc. This is filtering. When someone else does it for me, it is censorship. I'm fine with it when it happens on the hosting, ActivityPub federation and NIC level; I don't want it to happen on the ISP and browser level (and Tusky essentially is a single purpose browser).

            5 votes
            1. [2]
              mat
              Link Parent

              If I sold wrenches and a guy came in wearing swastika and 1488 tattoos, you can be certain he wouldn't be leaving with any of my tools. OK, his Nazi nuts might get tightened somehow but I'm damned if I'm going to make it easy for him.

              Imagine for a moment you make a client. You find out your software is being used to spread far-right ideas. You are, at that point, in part responsible for what happens as a result of your software existing. We know that Gab et al have been to some extent implicated in creating a culture which has led to events like Charlottesville, the Christchurch shooting and more. You, as a human being, can choose to do nothing, or you can choose to try to counter the problem. Which choice would you make? Because I know what I'd do, and it's not stand by shrugging my shoulders with vague pronouncements about how "tools can't have ideas"

              After all, "it has been said that for evil men to accomplish their purpose it is only necessary that good men should do nothing." (hard to attribute that one, but still)

              "Filtering" is something the user decides on.

              That depends where you live. My internet is filtered at a national level and I'm pretty OK with that (filter is child porn, list provided by the IWF). Do you consider filtering images of child sexual abuse censorship? Or is it, like the fact I can't go down the shops and pick up a trolleyful of explosives and firearms, a perfectly reasonable restriction which helps a society remain safe?

              7 votes
              1. unknown user
                Link Parent

                Censorship is censorship. It is not inherently a bad thing. Filtering CSA is censorship; it is a good and useful kind. Censorship is about publications, not physical goods like weapons. Restrictions (IMO a total ban) on the sale of weapons are a good thing. Still a restriction, but a good and useful one. The problem with the kind of censorship you linked to is that it is generally abused for other kinds of censorship. In the UK it is way more than CSA. I live in Turkey, where the censorship capabilities are used to silence the opposition and hide the govt's wrongdoings.

                There is a dilemma with censorship: we definitely need some of it, but once we have the mechanisms in place, it almost never stays at the level we need it. Thus deplatforming is more important and effective, all the while not requiring compromising citizens' privacy.

                Imagine for a moment you make a client. You find out your software is being used to spread far-right ideas. You are, at that point, in part responsible for what happens as a result of your software existing.

                No, the author of software is not responsible at all. Just like browser makers are not responsible for the production and consumption of CSA imagery, just like the manufacturers of the hardware used aren't, or the constructor of the property in which the abuse happens or the imagery is consumed isn't. That is, unless the software was written expressly for propagating racism, CSA or other sorts of hate & abuse.

                If I sold wrenches and a guy came in wearing swastika and 1488 tattoos, you can be certain he wouldn't be leaving with any of my tools. OK, his Nazi nuts might get tightened somehow but I'm damned if I'm going to make it easy for him.

                This is actually not relevant, but just for the sake of it: if the guy came in with those things covered, or you did not know their meanings (I just looked up 1488; I did not know the 88 part before and thought it was a date), you would've sold him the wrench. Is that participation in hate crime or racism?

                2 votes
        2. [12]
          alyaza
          Link Parent

          Gab might be an obvious shithole, but who gets to decide on more particular cases?

          why is it worth speculating at all? we're not there, we most likely never will be, and people telling gab to fuck off and die is probably not going to get us to literally any hypothetical you just named because those hypotheticals are ridiculous (and, honestly? probably get dealt with by the free market in a matter of weeks on the off chance they somehow do happen).

          6 votes
          1. [11]
            unknown user
            Link Parent

            Those hypotheticals are not very realistic (except that policies against minorities happen all the time, e.g. in opposition municipalities; though it's generally the far-righters that oppress in IRL situations), yes, but they are meant to illustrate the point.

            My belief is that this whole Tusky thing is one big futile act of virtue-signalling by the app's devs. The Apple ones might have a point w.r.t. the App Store banning them. But apps are not platforms, they are clients. This sort of blocking won't stop those who want to participate, and no-one accidentally becomes a far-righter because they accidentally chose a shitty far right instance and accidentally registered an account there.

            1. [7]
              alyaza
              Link Parent

              This sort of blocking won't stop those who want to participate, and no-one accidentally becomes a far-righter because they accidentally chose a shitty far right instance and accidentally registered an account there.

              this is a pretty silly take considering that plenty of people--quite possibly the majority of people who become far-righters, nowadays, in fact--get sucked into seemingly innocuous rabbit holes and come out of those rabbit holes thinking the deep state pedophile (((jewish))) conspiracy sacrifices children and drinks their blood behind closed doors and uses it to run the world or that the only solution to the Minority Question is a final solution. not everybody knows what gab is; gab itself does tend to try to be a bit mask-on about things even if its userbase does not; and moreover not everybody runs screaming to the exits when they see vehement racism because some people aren't actively far-righters or even right-wing but are sympathetic to racist views and would potentially fit in on gab.

              5 votes
              1. [6]
                unknown user
                Link Parent

                First of all, you can choose better adjectives than "silly" and "ridiculous" when referring to the words of just another well-meaning tildestrian like you, can't you?

                For the actual topic itself, I wouldn't mind if this was about marking these instances as bad, or just making it harder to get to them. For what you say about people getting sucked into rabbit holes and coming out terrorists just because, I'm willing to change my view if you can provide some evidence that this thing happens.

                3 votes
                1. [2]
                  alyaza
                  Link Parent

                  First of all, you can choose better adjectives than "silly" and "ridiculous" when referring to the words of just another well-meaning tildestrian like you, can't you?

                  i mean, it is a silly take, though. we've seen radicalization happen on a massive scale, especially since 2016, through those exact means; it's happened with trump supporters and broader conservative movements, it's happened with conspiracies like QAnon and Pizzagate; it's happened with the skeptic-to-fascist pipeline exemplified by people like sargon on youtube, and so on, and often blocking the means through which people get siphoned into rabbit holes like that which turn them into crypto-fascists or card-carrying neo-nazis is the best or only realistic solution beyond a certain scale. alternative approaches like deprogramming people are not something most people are able to do, and even people who are skilled in it can only do so much with so many people.

                  For what you say about people getting sucked into rabbit holes and coming out terrorists just because, I'm willing to change my view if you can provide some evidence that this thing happens.

                  see here, or here, or things like this, or any number of the links just provided by @Micycle_the_Bichael. it's neither new nor infrequent.

                  7 votes
                  1. unknown user
                    Link Parent

                    It might be a silly take, but you can be a better interlocutor and call it "mistaken", "misguided", "misinformed" or even "ignorant" or "hyperbolic". "Silly" or "ridiculous", especially when you use it towards your interlocutor, is patronising and deriding. They might be fun to use if all you care about is winning an argument, but the language you use is of utmost importance if you actually want to change someone's mind. I am an open-minded individual and won't ignore information just because, but the vast majority of people out there will.

                    I'll disengage from this discussion here, and read the links I've been provided later. I don't think it'll change my mind too much, for I'm already pro-censorship-and-deplatforming in this regard and I do agree that exposure to shitty communities can cause radicalisation even for less likely people; I just think it is superfluous and futile to deal with it on the browser/client itself. I do support deplatforming, but I don't think what Tusky did constitutes deplatforming: it is more than that and it goes too far IMO.

                    7 votes
                2. [3]
                  Micycle_the_Bichael
                  (edited )
                  Link Parent

                  Here are some articles from this year based specifically on the YouTube algorithm sending people down the alt-right rabbit hole:

                  I don't really want to get into a discussion or a debate about what the correct answer to dealing with these problems is, but I did want to add some sources to support the common (in my social circles at least) claim that YouTube and the internet as a whole can send people down radicalization rabbit holes. I think one could also argue that the 8chan post on Tildes a couple days/weeks ago would also be a good supporting source (I can't find it on mobile but maybe @alyaza can have more success).

                  6 votes
            2. [3]
              cfabbro
              (edited )
              Link Parent

              they are meant to illustrate the point.

              What point, that you can reduce an argument to absurdity, rely on the slippery slope fallacy and manufacture completely unrealistic scenarios?

              This sort of blocking won't stop those who want to participate

              Yes it can, but the point of blocking is not to stop the hardcore devotees who will find another way no matter what; it's about stopping the casual users who can't be bothered to find another app that doesn't block their favorite little hate speech community. And as we have seen with deplatforming, that's the vast majority of their users. E.g., look at the minuscule number of users in /v/fatpeoplehate compared to /r/fatpeoplehate when it got banned. If you put up even a small roadblock to their access, the majority of those hateful spaces' audience vanishes because they're not really that invested in them... yet.

              no-one accidentally becomes a far-righter because they accidentally chose a shitty far right instance and accidentally registered an account there.

              You're right, the way they become indoctrinated is slowly, over time, through extended exposure to that crap. Which blocking those hateful communities would also prevent in the vast majority of cases.

              3 votes
              1. Micycle_the_Bichael
                (edited )
                Link Parent

                And as we have seen with deplatforming, that's the vast majority of their users.

                Another couple of great examples: Look up how many page hits there are on Alex Jones and Milo Yiannopoulos articles now. Again, when I have more time I'll find my source on this, but Milo himself talked about how much harder it is for him to make money since being deplatformed. We have concrete examples of it driving down interaction and views, which in isolation might not be a solution (there are other alt-right talking heads) but it is an insanely effective strategy.

                Edit: Wish it wasn't Mashable and Vox, but here are two articles on Milo's deplatforming effect.

                Question: Is it Vox or VICE that is super bad at reporting facts? I know both have a left lean, but I know one is a bit biased and one is trash.

                5 votes
              2. unknown user
                Link Parent

                What point, that you can reduce an argument to absurdity, rely on the slippery slope fallacy and manufacture completely unrealistic scenarios?

                Why the fury? I'm just using some metaphors to illustrate my point. Also, reducing to absurdity is just another rhetorical/philosophical device that can be useful when reasoning.

                I am not relying on anything and (by implication) not participating in a game using this tactic or that to beat arguments.

                I have also already declared support for deplatforming and certain kinds of censorship.

                The point I am arguing is: the client app, the browser, is not the right place for this sort of censorship. Let's be realistic: X wants to access bad-mastodon.social, and Tusky blocks it. Will they not download Just Another App™, but give up instead? Also, when logging in on Tusky or Fedilab (my client of choice, since way before all this nonsense), you have to literally type the domain name of the instance you want to log in to. Do you really believe someone who does not know what Gab is will type whatever-gabs-domain-be.social in that box and create an account on that instance? And do you really believe it is worth it, given that it takes nothing but a few seconds to find another client and download it, and that there always will be another client? And finally, do you really, really believe that someone geeky enough to have ever heard of Mastodon and figured out how to get an account on it will be unable to download another app to go to whatever instance they want, or just to navigate to the instance itself? This is all nonsense. This is just putting a lone door out in the open, locking it, and saying it'll deter the thieves and the marketers and the JWs.

                Just download Tusky or Fedilab, or try to sign up for a Mastodon account, to see it for yourself. The scenario where someone inadvertently signs up for a bad instance is basically impossible. And if you sign up on a good one, stuff like Gab should be blocked by the instance itself already.

                3 votes
    3. Eva
      Link Parent

      I disagree with all of the reasoning of the other people in the replies to this, though I largely agree with the end.

      It's free and open-source software.

      The developers have a right to make their software do whatever they want.

      Gab, also, has the right to either fork or shut up—and they did! Their (paid) fork was incredibly successful.

      The developer gave them that irrevocable right by making it FLOSS software.

      It doesn't matter if it's Mozilla, Google, the Servo guys (also Mozilla with a caveat), the Pale Moon/Basilisk guys, the Edge team, one of the five people still working on KHTML, the people working on Apple's fork of KHTML, whatever; they can make their software do whatever.

      Anyone who cares enough can compile the software on their own, maintain a fork.

      That's the value of FLOSS software.

      6 votes
    4. NaraVara
      Link Parent
      I’m down with apps blocking instances and I don’t see it as being all that different from instances blocking them. Everyone knows the score going in and as long as there is a plurality of apps out...

      I’m down with apps blocking instances and I don’t see it as being all that different from instances blocking them. Everyone knows the score going in and as long as there is a plurality of apps out there I don’t think the “slippery slope” problems of censoring valid ideas will happen.

      The problem, though, is just one of scale. I actually think the decentralized approach with active moderation is the right one. It keeps the "community management" muscles trained and strong so they can play that game of whack-a-mole. If it's centralized and people rely on Gargron to do it all, he won't keep up.

      2 votes
    5. [2]
      mrbig
      Link Parent
      But it’s not an app like Chrome or Safari, that resides on your device. It’s a platform.

      But it’s not an app like Chrome or Safari, that resides on your device. It’s a platform.

      1. tesseractcat
        Link Parent

        I may have been unclear. I'm talking about the blocking of certain instances at the app level. In the article they mention the app Tusky, which is an app that you install on your device to access Mastodon.

        E: Tusky and other Mastodon apps are like web browsers, except instead of interacting with HTML or other web protocols, they interact with Mastodon/ActivityPub protocols.
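
        To make that concrete, here is a minimal sketch of the kind of hard-coded, client-side instance check being debated in this thread. It is purely illustrative (not Tusky's actual code), and the domain name and the show_error helper are placeholders:

        ```python
        # Hypothetical sketch only; not Tusky's real implementation.
        # A client ships with a hard-coded set of instance domains it refuses to log in to.

        BLOCKED_INSTANCES = {"example-blocked-instance.social"}  # placeholder domain

        def is_blocked(instance: str) -> bool:
            """Return True if the domain typed on the login screen is on the blocklist."""
            domain = instance.strip().lower()
            if domain.startswith("https://"):
                domain = domain[len("https://"):]
            domain = domain.rstrip("/")
            return any(domain == d or domain.endswith("." + d) for d in BLOCKED_INSTANCES)

        # The login flow would then refuse to continue, e.g.:
        # if is_blocked(user_input):
        #     show_error("This instance is not supported.")  # show_error is hypothetical
        ```

        The same sketch illustrates the counter-argument raised elsewhere in the thread: a fork that edits the one line defining the blocklist removes the block entirely.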

        6 votes
  2. ubergeek
    Link

    I just blocked the instance from 3 servers I admin.

    15 votes
  3. [10]
    Bullmaestro
    Link

    Gab: Joins open decentralised social networking platform.

    Mastodon: surprisedpikachu.png

    I don't know what Mastodon honestly expected when they decided to release an open source microblogging platform with an emphasis on self-hosted servers where each isolated community determines its own moderation.

    9 votes
    1. [9]
      Micycle_the_Bichael
      (edited )
      Link Parent

      Literally my first thoughts when I read about mastodon: “the tech behind this is really interesting.... but it’s definitely going to be abused by Nazis and White Nationalists”

      9 votes
      1. [8]
        mftrhu
        Link Parent

        Meh. The tech has been around for literally a decade, first with Identi.ca and then with GNU Social, which used to form the backbone of the Fediverse before Mastodon got publicized. GNU Social is also, imho, much easier to host than Mastodon, and much less resource hungry. Diaspora is about as old, and there are much, much better platforms around for them - I2P, Tor and GNUNet all give you anonymity on top of various microblogging implementations.

        They won't give you visibility, or a soapbox, but Mastodon - the Fediverse - isn't really doing that, either. You can set up your own server, sure, but most of the Fediverse is going to block you if you are a "free speech zone" - if you refuse to moderate your instance - let alone if you explicitly encourage hate speech, harassment, or host Nazis. They just end up shouting about "censorship" at each other, and that's not different from their other echo chambers.

        Reddit is a much better platform for them. Their content will bubble up to the front page, there is no way for subreddit mods to block users coming in from other subreddits, and even if there were, creating a new one is trivial - less costly than buying or getting a domain and setting up a Mastodon server.

        12 votes
        1. [7]
          Douglas
          Link Parent

          there is no way for subreddit mods to block users coming in from other subreddits

          Some subreddits have filters that auto-ban users who comment in other specific subreddits so as to prevent them from coming in. For example, I believe if you so much as make one comment in /r/libertarian, /r/conservative, or something similar to that, you're auto-banned from /r/fuckthealtright.

          This was a jarring experience for me, as I just meant to debate someone I'd seen talking about something, but was banned from some subs I enjoyed because of it. Fortunately getting off the blacklist was easy.

          and even if there were, creating a new one is trivial

          You're right; even the Masstagger browser extension could barely keep up with it all. It might work if the Masstagger list auto-fed into the same filter that bans users from participating in hateful subs.

          4 votes
          1. [6]
            Bullmaestro
            Link Parent

            A few other places like r/OffMyChest and r/TwoXChromosomes run bots that automatically trawl a user's account history and auto-ban them if they have ever posted to any controversial subs like r/TheRedPill, r/KotakuInAction, r/TumblrInAction, r/The_Donald, r/ImGoingToHellForThis, etc. What surprises me more is that 2XC was a Reddit default sub in the past, and they too are doing this.
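
            For a sense of how simple such a bot is, here is a rough, hypothetical sketch using PRAW; the subreddit names, credentials and the 100-comment lookback are placeholders, not any sub's actual bot:

            ```python
            # Hypothetical auto-ban bot sketch using PRAW; names are placeholders.
            # The bot account would need to moderate "examplehomesub" with ban permissions.
            import praw

            reddit = praw.Reddit(client_id="...", client_secret="...", username="...",
                                 password="...", user_agent="autoban-bot-sketch")

            WATCHED_SUBS = {"examplecontroversialsub"}  # participation here triggers a ban
            home = reddit.subreddit("examplehomesub")   # the sub this bot moderates

            # Watch new comments in the home sub and trawl each commenter's recent history.
            for comment in home.stream.comments(skip_existing=True):
                author = comment.author
                if author is None:
                    continue  # deleted accounts have no history to check
                recent = author.comments.new(limit=100)
                if any(c.subreddit.display_name.lower() in WATCHED_SUBS for c in recent):
                    home.banned.add(author, ban_reason="participated in a watched subreddit")
            ```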

            Funny thing is... running auto-banning bots is against the Reddit rules (Sections 4, 8 & 11 of Reddit's moderator guidelines) but the admins really couldn't give a shit about enforcing their own rules equally.

            That being said, I hope that Reddit ends up facing a decline soon as I really don't like how Spez and Kn0thing are running the site. Even right now I feel like they're one major scandal or bad site revamp away from losing users in a revolt.

            2 votes
            1. [5]
              alyaza
              Link Parent

              That being said, I hope that Reddit ends up facing a decline soon as I really don't like how Spez and Kn0thing are running the site. Even right now I feel like they're one major scandal or bad site revamp away from losing users in a revolt.

              they've been 'one away' for half a decade or more, at this point, and they've come out fine. arguably, some of their scandals have had them come out better than they were previously by excising the dipshits nobody liked and exposing some of the systematic failures of the reddit system which they've subsequently patched over; people also need a place to go if reddit is going to die, and basically every reddit alternative is a crack den for the exact sorts of people who make reddit as a community garbage to begin with. it's bound to happen eventually as all websites are finite, but reddit to me is more likely to slowly bleed out than to violently and catastrophically extinguish itself. it's been through the wringer too many times for something like that to happen, especially with almost no viable alternatives or replacements to speak of.

              8 votes
              1. Micycle_the_Bichael
                Link Parent

                Yeah, IMO if the reddit redesign that most mods and users hate, and that broke a ton of stuff, didn't sink the site, realistically there won't be a "death blow"; instead it will be, like you said, a slow bleeding out.

                1 vote
              2. [3]
                DevNull
                Link Parent
                "and basically every reddit alternative is a crack den for the exact sorts of people who make reddit as a community garbage to begin with" Really?

                "and basically every reddit alternative is a crack den for the exact sorts of people who make reddit as a community garbage to begin with"

                Really?

                1 vote
                1. alyaza
                  Link Parent

                  absolutely, yes. raddle is full of crazy people who were too far-left and off the rails for even the far-left subreddits (some of which unironically support places like the DPRK). voat is self-explanatorily bad because it's a haven for neo-nazis and pedophiles. saidit is an ideological shitshow which is so easy to manipulate it's not even funny, and its basis is absolutist libertarianism which has really never gone well for any social media site in human history. hackernews is maybe the one exception, but it's increasingly indistinguishable from the reddit community at large and it suffers from the same problems a lot of reddit does both ideologically and demographically on top of that. most of the others are so irrelevant as to not even matter, but inevitably suffer from the same issues as reddit because they're often literal clones of reddit or they get hijacked and turned into a pile of radioactive waste by whoever adopts them.

                  4 votes
                2. mftrhu
                  Link Parent

                  HackerNews has a much narrower focus than Reddit, and nonsense pops up pretty quickly when threads move to things like politics or medicine.

                  Voat is the cesspit that we all know and love.

                  Imzy died after a year or two.

                  Raddle is very leftist - which I don't think is a bad thing - but it's also tiny, smaller than Tildes, which doesn't help with the drama. There has been a lot of it, to the point that even my mostly inactive ass noticed it.

                  Lobsters is pretty good - possibly because it's invite-only - but with a focus similar to HackerNews and with a much smaller community.

                  Notabug is an attempt at creating a decentralized, anonymous reddit alternative. Their approach is kind of interesting, they are using proof of work to avoid the usual issues with anonymous votes, but that also means that the website flat-out won't work without JS. Also, the chat for t/whatever just posted - CW for edgy homophobia/antisemitism - "Sounds gay and like it needs to be shot up with AIDS" "Saudis and Jews are evil" "jews did 911" "jews are gay"

                  Going through the /r/RedditAlternatives list, Snapzu doesn't seem to be bad, but there's also basically no activity on there.

                  As of right now, Saidit features

                  Brainwashing for All: Holocaust Education to be Mandatory in American Schools
                  submitted 22 hours ago by Venom from dailystormer.name

                  on its front page, sitting at +8, while the admin said, a month or two ago

                  We don't want this place to be as far right as voat, we kind of hope to hit more of a centrist middle-ground between reddit and voat. That's the group that doesn't really have a place to speak their mind, so that's why we created saidit.

                  At least three of the sites on that list are built around cryptocurrency of some kind (Sapien, Yours, SteemIt). The rest market themselves as free speech zones, are dead, and/or were just thrown up in an afternoon by some random, so there isn't a lot of choice here.

                  2 votes
  4. [2]
    DevNull
    Link

    I am disturbed by this thread and by how often I read people equating "far right winger" with "someone who promotes hate speech and violence" (or, to be blunt, equating "right wingers" with racist bigots and hateful people), and many just outright use "conservative" as the synonym.
    I do not promote or even tolerate hateful ANYTHING, and I abhor violence in all its forms.
    I also abhor the current trend towards this thing called political correctness, and I despise social justice warriors who wish to dictate that I must agree with and support every small special interest group that gets its feelings hurt or feels that a joke that offends them should be banned.
    I see the movie Idiocracy becoming reality in the US. And that's the one thing I do hate.

    1 vote
    1. alyaza
      Link Parent

      I am disturbed by this thread and by how often I read people equating "far right winger" with "someone who promotes hate speech and violence" (or, to be blunt, equating "right wingers" with racist bigots and hateful people), and many just outright use "conservative" as the synonym.

      i honestly cannot name a single far-righter or far-right movement which didn't do one or both of "promote hate speech" or "promote violence", so i think it's pretty reasonable to equate those, not least because the primordial far right movement which spawned most modern far-right movements is nazism (which is predicated on genocide) and being far-right pretty much comes with both of those things on account of what the parlance "far-right" means in political science thanks to nazism. you might have a case with conservative, but increasingly conservative parties fold like a house of cards to far-right movements or ideas, either because far-right movements cannibalize their voters and they move to the right to try and win them back (as is happening in much of europe) or because far-right ideologues push the margins within the party itself and reconstitute the party's positions entirely (like the american republicans), so even that's not especially unreasonable of a comparison anymore for some conservative movements, and it doesn't show signs of getting better.

      5 votes