I want to make a defense of substack. I've read this article, Popehat's piece and the open letter from Substack writers and I am not convinced.
Substack is not a "Nazi Bar". Voat, Parler, Truth Social and so many other sites which claim "free speech" are Nazi bars. But Substack genuinely has a wide range of interesting non-Nazi voices to read, and generally (in my experience) provided an outlet for quality discourse.
Substack doesn't provide a platform that isn't already available to Nazis. Unless you believe that core internet infrastructure providers should censor certain views--and I do not--Nazi websites will always exist. Unlike Musk's Twitter, Substack doesn't force, or even suggest, any Nazi content onto users. Nazis have no more presence or reach than if Substack were to ban them.
Nazis will always be on Substack. You can't ban Nazis, it isn't possible. You can ban Nazi content but short of mind reading you cannot ban an anonymous user for views they hold. If Nazi content is banned, Nazis will dance just within the lines of what is allowed, suggesting ideas but never saying them. This is perhaps even more dangerous than explicit Nazism as it can serve to radicalise.
Effective moderation at scale will always cause collateral damage. If you want to root out Nazism, legitimate ideas will end up being censored. When Facebook banned Covid misinformation, allegations of a lab leak were censored alongside 5G conspiracies. Now that the lab leak allegations are taken more seriously, the question "if they censored this, what else are they hiding?" becomes a gateway into deep conspiracies. Today, pro-Palestine content is being caught up in attempts to filter antisemitism.
The metaphor of the Nazi bar is explicit that everyone in the bar doesn't have to be a Nazi, the bar just doesn't kick the Nazis out. Given that, it's definitely a Nazi bar.
Why is it not an issue that sexually explicit/sex work content isn't allowed? Why is it okay to accept credit cards for Nazi propaganda but not erotica?
I'd personally feel a bit different about their "principled stance" if they were transparent about how much money Nazis made them, and demonetized them or, at a minimum, donated their own proceeds. They also had a separate incident involving a racist guy on their podcast, and though I'd have to look it up again to find the specifics, it was a similarly bad vibe. Deplatforming is effective. They'd rather make money off Nazis than avoid having "Nazi" and "Substack" in the same sentence. And that's their call.
Payment processors get really squeamish when NSFW content is involved because many countries have created regulations that require them to be as cautious as possible. If payments used for sex trafficking or illegal content go through them, it starts creating serious liability issues. Moderating that kind of content is extremely tricky; slipping up even once can cost a company millions of dollars (it's not just a slap on the wrist).
I'm aware, it's still a point of hypocrisy. Beyond that I believe they ban erotica entirely, which based on Amazon's Kindle Unlimited selection is perfectly allowable by the payment processors. So, they truly don't have an excuse.
EDIT: In the interest of accuracy: if Popehat's analysis is correct, they do allow erotica, but they also prohibit a number of things that are legal but "harmful", including doxxing and self-harm content.
They have things they don't want associated with their name or making money from it. Nazi content isn't that.
From my history as a merchant account payment processor I really see porn bans as an existential matter for anyone doing payment processing rather than a philosophical issue. It’s not hypocrisy if there is a legitimate and enormous business necessity.
This isn’t the reason payment processors get squeamish about sex. It’s because sex is a high risk, high chargeback sector. So everything is more expensive (eg. insurances don’t cover it, those that do are more expensive).
There are payment processors that are ok with sex. Those payment processors are more expensive.
It's a mix of both. It's only "high risk, high chargeback" because our society shames sexuality and nudity. You don't charge back a bad DVD or video game, so why a sex video?
This actually isn’t what’s driving chargebacks of adult content. The main driver is deeply deceptive billing practices on the part of adult website owners, cross-selling and negative option billing being the worst offenders. Unfortunately, merchant account coding is the blunt instrument used to deal with this, which means you have to do everything in your power to avoid getting reclassified.
If sexual material didn't range from hardcore fetish acts to a female nipple (depending on country and culture), there might be some merit to being that cautious. But given that even the Supreme Court decades ago couldn't properly articulate a line for obscenity, it seems more like a blanket ban on anything involving nudity than a proper way to protect victims of sex trafficking.
My comment is only regarding why payment processors get uncomfortable around sexual content. I agree the current approach is bad.
In my opinion, many laws which have been passed to "protect children" are doing more harm than good. I hate puritanical reactionaries and their vendetta against pornography.
Eh. I wouldn't disagree entirely... there's a lot of bad law passed in the name of "protect the children." Payment processing laws are problematic way outside the scope of just sex. I think some of the ways the under-13 laws are written are also problematic, though I generally agree with the philosophy there. I think proper mandatory reporting laws WRT user interaction would do much more. I'm starting to come around to the idea of mandatory identification for any site that caters to people under 16.
However, "I can't look at tits on the internet because it's very difficult to moderate user-submitted sexual content properly" is hardly doing more harm than good IMO.
Maybe if there wasn't so much sexualization online, we'd have a healthier stance on it in the real world. I'm not saying there isn't a place for it online, but I do kind of agree that it should be more separated than it often is. It's been much harder to find games on Steam since they opened the floodgates. I had to filter out dozens of tags to hide most of the thinly-veiled porn, and many legit games like Witcher 3 get caught up. There's a huge difference in how these games are marketed... even some of the most violent games still have fairly kid-friendly stills that I can browse past on the TV.
I'm not opposed to people playing thinly-veiled porn games. But I do want that content properly isolated from non-sexual content, for the same reason I don't shit in my shower. Appropriate spaces for appropriate activities.
Less porn online, more porno orgy BDSM theaters where they can check ID at the door and bounce the rapists.
Maybe if there wasn't so much sexualization online, we'd have a healthier stance of it in the real world.
I don't think the bulk of Western history bears this hypothesis out. We certainly did not have healthier stances on sex and sexuality prior to the internet.
I think in this particular respect porn has both positive and negative aspects. I think it balances out to "neutral but complicated".
I'm from a very religiously conservative background, though, so that definitely influences my perspective on it. I'm the only one of my siblings who's cohabitated with someone prior to marriage, and my younger sisters both got married before they could legally drink (whereas I married at the ripe old age of 25 for the more traditional reason of "wow would it be easier to sort out that EU visa if we got married instead of just living together").
The aspects that I think generally tip it towards the 'more bad than good' boil down to:
Unrealistic expectations about what sex is like.
Fetishizing the young (see how prominent and popular "barely legal" through 21 is). I think this stokes the fires of the fetishization of adolescents.
Fostering sexuality via anonymity. This one is a complicated thought, and I'm probably not explaining myself well. It's considered OK to jerk off to a video of sex on the internet (reasonable). It's creepy AF to download non-sexual pics/vids from friends' social feeds for jerkoff material. But I've encountered a few too many people who think it's OK to download non-sexual pics/vids from people they don't know... because they don't know that person. And this is the kind of otherizing behavior that makes catcalling an acceptable behavior to assholes.
And this one is also a bit old-person... but given my last point, I feel we should encourage people to explore their sexuality via personal connections more than via an algorithm trying to monetize them (if that's not already the case, it soon will be).
I'm certainly not calling for a ban. But I'd like to see it relegated strictly behind age-validated logins, preferably with an equitable pay model.
It would be, I think, a net positive to cut off teenagers from porn. But I think it would be helpful to have proper sex education videos (which conservatives would definitely class as porn) that are accessible to teens over 13, showing what healthy sex and sexuality look like.
Good luck selling the "We're cutting off teenagers from all but good porn" campaign though.
Overall I think most of the issues you describe predate the widespread proliferation of porn via the internet and that the internet just makes these human behaviors more visible than they were before. Other than algorithms potentially monetizing you (something I honestly think is a much bigger problem OUTSIDE of the sphere of sexual content), I can think of examples of these things back to Classical times. I think it's more sensible to focus on making sure that everything is well-labeled, safe from people who don't want to see it/aren't looking for it, and consensual for all parties involved, rather than trying to prevent teens who want to view porn from viewing porn. The latter is a futile exercise and has always been, and preventing access to porn would only increase creepy things like masturbating to non-sexual content. I think the fundamental issues re: sex and sexuality are cultural ones that long predate and are largely independent from the existence of the internet as a place to very easily find porn.
I think I like Popehat’s take on the Nazi bar analogy with Substack:
With respect to my friend Mike Masnick, I think that makes it [Substack] a bit less like his Nazi bar analogy and more like a Nazi-tolerant banquet hall. You can have your niece’s quinceañera or your parents’ 50th anniversary there and probably won’t feel much of an impact from the fact that they’re always booked solid on April 20 unless you think about it.
I don’t know where I stand on the issue. I think there’s a difference between banning a user because they’re a Nazi, and banning a user because they post Nazi rhetoric, and I acknowledge that it’s probably a fine line that’s hard to define - especially with the constant and ever-changing dogwhistles.
Banning a user from the platform for promoting Nazi ideology on the platform I can understand as banning someone for hate speech. I personally would be okay with banning someone from the platform for posting Nazi rhetoric on a different platform, but I can understand not doing so.
The metaphor of the Nazi bar is explicit that everyone in the bar doesn't have to be a Nazi, the bar just doesn't kick the Nazis out. Given that, it's definitely a Nazi bar.
Well, no, in the metaphor, a bar eventually becomes a nazi bar due to a series of events that starts with not kicking out nazis. It's not like if you have a nazi or two in your bar, you're instantly a nazi bar.
Currently, substack is not a nazi bar. Most people there are not nazis. If the metaphor's theory holds true, eventually, more nazis will come there, which will edge out normal people, and eventually no normal person would ever tolerate being there because of how many nazis there are, and the only people left will be nazis; hence, nazi bar.
In the original story (transcribed on Reddit from the tweets), the bartender points out that you cannot have even a single one hanging around, no matter how polite an individual, and immediately kicks the guy out, because the next step is they bring their friends. If you don't kick out Nazis 1 and 2, you end up with Nazis 12 and 25.
you have to nip it in the bud immediately. These guys come in and it's always a nice, polite one. And you serve them because you don't want to cause a scene. And then they become a regular and after awhile they bring a friend. And that dude is cool too.
Substack is just at the point where everyone is noticing the Nazis and complaining to the bartender, who has decided that it's better not to make decisions about whether Nazis are allowed or not, and who is perfectly willing to accept the Nazis' money.
If you want to say that it's not technically a Nazi bar yet, ok, sure, whatever, this is all a metaphor anyway. A bar that tolerates Nazis and lets them pay to reserve the back room for weekly Nazi night is not, in my mind, significantly better or more principled. Because I don't want to be in a bar with Nazis, like, as a general rule. And if the bartender is letting them stay, then this isn't a bar I want to be in.
The metaphor isn't about principles, it's about practicality. That's the whole point.
Whether you agree with letting people say whatever they want to say regardless of how distasteful you find it is an entirely separate issue.
The point of the Nazi bar metaphor is that if you don't nip it in the bud, eventually, your space will become a haven for distasteful people, and those will be the only people who come there. Substack hasn't gotten there yet, so thus it's not a nazi bar. Whether you're okay hanging out in a place where a few nazis hang out is an entirely different conversation.
So they're not nipping it in the bud, meaning it's gonna become...
The practicality is that you toss them out and don't engage in arguments about fairness upfront. Or you become a ....
Like I said if you want to argue that it's not quite there yet, ok, sure whatever. It's a Nazi Bar to Be. It's a bar really dedicated to making sure everyone, even Nazis, are welcome. Whatever.
That is still bad. Practicality says you should remove them immediately. So does principle.
It's a Nazi bar because they aren't banning Nazis.
Substack is not a neutral pipe. They're already moderating various content, and intentionally not banning Nazis.
You can't ban what's in someone's heart, but you can ban their content. Having them have to dance around what they say is so, so much better than allowing them to speak it openly. The two aren't equivalent situations.
Yes, but I think I'll be ok with some plausibly-but-not-actually Nazi content being banned along with the actual Nazis. In fact, that solves a lot of point 3 as well.
Substack’s hate speech policy says “Substack cannot be used to publish content or fund initiatives that incite violence based on protected classes. Offending behavior includes credible threats of physical harm to people based on their race, ethnicity, national origin, religion, sex, gender identity, sexual orientation, age, disability or medical condition.” That’s ambiguously broader than the First Amendment standard, under which unprotected incitement is only speech intended and likely to cause imminent lawless action. It’s also deliberately open-ended, since it says “offending behavior includes.” It means whatever Substack chooses it to mean.
Substack prohibits what’s commonly called “doxxing.” “You may not publish or post other people's private information (such as a personal phone number or home address) without their express authorization and permission. We also prohibit threatening to expose private information or incentivizing others to do so.” That’s fine. That’s Substack’s choice. But the First Amendment generally protects that conduct. Is it bad to find a terrible person and post their phone number so people can call and denounce them? Maybe. That’s a value judgment.
Substack prohibits “harmful” activities: “We don’t allow content that promotes harmful or illegal activities, including material that advocates, threatens, or shows you causing harm to yourself, other people, or animals.” That’s vastly broader than the First Amendment exception for incitement. I think Substack is aiming here at stuff like crush videos and people who promote eating disorders, but those are protected by the First Amendment.
Finally, the big one: “We don’t allow porn or sexually exploitative content on Substack, including any visual depictions of sexual acts for the sole purpose of sexual gratification. We do allow depictions of nudity for artistic, journalistic, or related purposes, as well as erotic literature, however, we have a strict no nudity policy for profile images.” This is perfectly fine. It’s within Substack’s First Amendment rights to choose this policy. It’s a choice about branding, about ease of moderation, and about vibe.
It is illegal to allow child porn on your site. Full stop.
If you are not validating the adult content and sex work on your site, you can be charged for any child porn or sex trafficking that occurs on it.
Basically every other rule falls under a similar subset of “you can and will be charged for this”
Hate speech is not illegal in any jurisdiction they are in. This is why they have the incitement-to-violence clause/protected classes stuff. Hate speech inciting violence is illegal, and they can be found liable for not policing it. Hate speech such as “X are terrible and the reason everything is wrong with the world” is not illegal.
Substack is available in Germany, which absolutely has legal requirements for websites to remove certain Nazi content from their site (as viewed in Germany) and to have easy ways of reporting it. People who know Substack's legal requirements in the US far better than you (e.g., the American lawyer who goes by Popehat and who posted the quoted portion of the comment you replied to) have pointed out that their moderation far exceeds their legal requirements under US law, but it also is just patently untrue that hate speech isn't regulated in "any jurisdiction they are in" when it absolutely is.
They are a US based company was more my point, but yes Germany is a thing, and I'm curious to see how they handle this. To my knowledge, they've done nothing so far.
Okay? And not all adult content is CSAM. Consumption of legal adult content does not lead to a desire to seek out or consume CSAM.
Meanwhile, statements like "X is horrible and the reason why the world sucks" have historically led to incitements of violence and then actual violence. And that is why many platforms don't let it creep up. It's the same line of reasoning, except history points to one topic being much more of an issue.
The thing is: platforms can't fuck up with CSAM even once. If sexual material of a 17 year old was sent to an email list through Substack, they've opened themselves up to huge liability issues.
Whether it makes sense or not, laws around sexual content are extremely strict and punishing in many countries. Racism isn't regulated the same way as sexual abuse.
sounds like a "simple" filtering issue on all media, then. Which they already have. If we're being realistic: that content is probably already captured and pre-screened far before it ever goes...
If sexual material of a 17 year old was sent to an email list through Substack, they've opened themselves up to huge liability issues.
sounds like a "simple" filtering issue on all media, then. Which they already have.
If we're being realistic: that content is probably already captured and pre-screened far before it ever goes into production. You can fast-track some trusted accounts in long good standing, but uploads from anyone else need to be checked anyway, be it illegal or "legal but against the rules" content.
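To sketch what I mean by pre-screening (purely illustrative; I don't know Substack's internals, and real platforms match against curated industry databases like PhotoDNA rather than the made-up blocklist here):

```python
# Hypothetical sketch of hash-based upload pre-screening, not any platform's
# actual pipeline. Uses the open-source `imagehash` library; the blocklist
# is an empty placeholder standing in for a curated database.
from PIL import Image
import imagehash

KNOWN_BAD_HASHES: set[imagehash.ImageHash] = set()  # placeholder blocklist
MAX_DISTANCE = 5  # Hamming-distance threshold; tuning this is the hard part


def screen_upload(path: str) -> bool:
    """Return True if the image clears the hash check, False to quarantine it."""
    upload_hash = imagehash.phash(Image.open(path))
    for bad_hash in KNOWN_BAD_HASHES:
        # ImageHash overloads `-` to return the Hamming distance.
        if upload_hash - bad_hash <= MAX_DISTANCE:
            return False  # near-duplicate of known-bad material
    return True  # passed this check; novel material still needs review
```

Hash matching only catches already-known material, which is the point: the tools have to exist regardless, so a blanket ban doesn't remove the need for them.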
Having a blanket ban is one of convenience, not necessity. It discourages abuse but does not eliminate the need for the tools already being utilized. I normally wouldn't call out such decisions as lazy (Tildes does the exact same thing, after all), but I will apply much more scrutiny to a company that claims to be "supporting individual rights and civil liberties" while allowing ideas that have historically quashed said rights and liberties (and lives, many lives).
I knew this would be the first thing the moment I said it, and the short response is that allowing adult content under a hands-off moderation strategy is legally not possible.
If you allow ANY adult content you have to be ready to jump through about a million hoops or risk being arrested for distributing child porn.
They have to do that anyway? If someone was going to post something that nasty, they’re already willing to break rules far more severe than anything Substack enforces, so I don’t think a “no porn” rule will stop them. Substack still has to watch out just as much.
Surely you also have to remove CSA content even if you ban adult erotica (and it would not be trivial to moderate regardless of whether adult erotic content is allowed).
You do, but it's much, much harder than people think if you allow any sort of adult content. There's a whole lot of "I have no personal experience with this, but I think it should be easy" in this thread, and it just isn't.
If you allow real human adult content, ESPECIALLY in the newly arrived age of AI image nonsense, you will have an entire department dedicated to running it, because doing anything less risks actual jail time, not just the usual slap-on-the-wrist fines.
The difference between "we ban anything that even looks like adult content" and "we will allow legal adult content" is massive. The first one is much easier to accomplish because there's no wiggle room. You ban anything that looks like it could cause issues, done. The second will require verifying literally every account and image, and NOT doing that is why pornhub had to nuke millions of videos.
It's funny that you say there's a lot of "have no personal experience with this, but I think it should be easy" here, because I work as a data scientist training classifiers to detect inappropriate content. Granted, we're in a different domain, so "inappropriate" includes decidedly more boring things that are non-compliant with financial regulations in addition to obvious stuff like sexual content. I personally have scraped comments from 4chan and labelled them to use as training data for our sexual harassment classifier (was not a fun task btw). So while I don't run a company like Substack, I've definitely got some experience with the technical side of detecting content like this.
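To make that concrete, here's a toy sketch of the shape of such a pipeline in scikit-learn. The CSV file and column names are made up for the example; a real system needs far more data, better features, and careful evaluation:

```python
# Toy sketch of a text-content classifier, not our production system.
# "labelled_comments.csv" is a hypothetical file of hand-labelled examples.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("labelled_comments.csv")  # columns: "text", "label" (0/1)
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42
)

# TF-IDF features into a class-weighted logistic regression: a common baseline.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),
    LogisticRegression(max_iter=1000, class_weight="balanced"),
)
clf.fit(X_train, y_train)

# The hard part is never the fit; it's the labelling and the error analysis.
print(classification_report(y_test, clf.predict(X_test)))
```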
Detecting sexual content, or even just nudity, in photos is way harder than you think it is. iirc, shortly before I started at my workplace, our team attempted and gave up on implementing a nudity detector for video calls because it wasn't reliable enough and had too many embarrassing false positives. It's a MUCH MUCH harder task when you include non-photographic images like drawn art, not to mention text, though Substack's rules don't seem super clear about how heavily they moderate textual erotica. A company like Substack would not be able to effectively moderate adult content on their platform solely through automated means, even if they took the most hardline stance possible against anything even adjacent to nudity (and they're not doing that).
The long and short of it is that Substack is going to need similar levels of vigilance and non-automated review of content to prevent CSAM from being hosted on their platform regardless of how much adult content they actually allow. The government certainly isn't going to care whether legal adult content is allowed on the site or not if they find CSAM. Not allowing legal adult content might make the job slightly easier for their human moderators by making it less of a judgment call (though their leeway on artistic nudity undoes that imo), but that doesn't really translate to much difference in terms of their legal risk or even the cost of moderation.
Detecting sexual content or even just nudity in photos is way harder than you think it is
I think you're misunderstanding my position. It is relatively easier, not actually easy, and again, it's the massive legal portion of this not so much the tech portion.
To host adult content, they can only automate verification so much, or they wind up potentially liable for any CSAM/trafficking/etc. If the stance is that ALL adult content is a violation of the site's policies, they instantly save themselves millions in dealing with payment processors and law enforcement entities. They will still have to police these things and pass on information as required, but the burden, expectation, and audit is much, much lower.
At the bare minimum you don't have the huge problem of people seeing such content and not reporting it, because they think it's allowed, and the issue of attempting to verify every single user who's putting up adult content.
If I say adult content is allowed, let someone post photos, and DON'T verify who those photos are of, whether the subjects are of legal age, and who posted them, I'm in a world of trouble. So now I must verify every single user to a much higher standard and police all content they post.
If I say adult content is not allowed, and someone posts photos, and I never verified them, that's just one more way they're seen as breaking the rules. The obligation of verification was never put on me in the first place, because I never said I'd be hosting that kind of content anyway. I still need a reporting system and some checks on hosted images, but since adult content isn't expected to be hosted, I'm not forced to verify every single account, and the standard of negligence required for me to be liable is much, much higher.
Finally, there aren't a lot of people trying to hide this kind of content in places that don't allow adult content, especially since there are SO many places that do allow adult content in legal grey areas. If you just straight up disallow the content, your burden gets much easier, because no one is going to accidentally share that kind of content either (which happens waaaaaay more than people think).
There's so much more to this than just the programmatic classification
Some amount of non-Nazi content is already being banned. So the choice is ban Nazi content as well as the other stuff, or don't ban Nazi content while banning some other things. As such, the question is just: Does Substack want Nazi content on their site, and the answer they're giving is "Yes."
Substack is not a "Nazi Bar". Voat, Parler, Truth Social and so many other sites which claim "free speech" are Nazi bars. But Substack genuinely has a wide range of interesting non-Nazi voices to read, and generally (in my experience) provided an outlet for quality discourse.
I think you missed the whole point of the Nazi Bar thing. At first there's only one or two well-behaved Nazis. But over time, the other people leave, because they don't want to hang out around Nazis, and more Nazis come in, and once there's enough of them they get a whole lot less well-behaved.
Right now, Substack is a place that also has Nazis, but with this move they'll inevitably become Substack, the Nazi place. The majority of all those other voices you find interesting will leave over time, either of their own volition, because people really hate hanging out around fucking Nazis, or because the Nazis will actively push every other view point out as soon as they feel like they can.
I can't speak to the other sites, but I was among the people that abandoned Reddit 6 or 8 outrages ago, and joined (among other places) Voat. When it was brand new, it was nice. It was created and run--apparently, at least--by a reasonable, idealistic person who simply believed in unfettered freedom of speech.
And, yeah, exactly this happened to it. Nazis, trolls, right-wing conspiracy nuts crept in, everyone else left.
I always like pulling out this passage to describe the issue.
"The moral of the story is: if you’re against witch-hunts, and you promise to found your own little utopian community where witch-hunts will never happen, your new society will end up consisting of approximately three principled civil libertarians and seven zillion witches. It will be a terrible place to live even if witch-hunts are genuinely wrong." - Scott Alexander
I had similar views on more laissez-faire moderation before I read through that and realized that such idealism, especially for a new, growing site, was simply that: an ideal. The contented, normal citizens won't move off their Reddits or Twitters, so the entire exercise is one of response bias: the people frustrated enough to want to move will disproportionately be witches (if you ban witch hunts). You can only really oppose that with strict moderation, and that is exactly why Tildes didn't become another Voat.
I would modify your statement to say that laissez-faire moderation does work in the rare circumstances where either the entire market is growing or the established competitor has a sudden and complete collapse. Both circumstances lead to a situation where the influx of new users dilutes the Nazi presence to a degree that they end up sequestered in their own hellholes.
If the collapse or growth stalls too soon, you end up with the evaporative cooling of Voat. Trying to pull them out after they've taken root leads down the road of Reddit, with inane rules meant to chase dogwhistles, as well as opening the door to a petulant insistence on the equal application of rules that kills off advertising-unfriendly fun zones. But that still doesn't root out the problem. At least from my understanding, racist rhetoric was way worse there 10 years ago, but it didn't inspire real-world violence until after it got put into dogwhistles. One of the parent comments has a point that it's easier to radicalize the normies through dogwhistles. Personally, I additionally feel there is a scale and organization to modern online racism that wasn't present in 2012. When it's the same four users spamming slurs at each other, it's not worth the hassle to investigate anything. Once numbers grow to the millions, the threshold for rhetoric that could inspire stochastic violence, intentionally flamed or otherwise, drops massively.
I’ve seen this phenomenon play out on the internet several times over the years. It doesn’t even take extreme stuff like white supremacy — if nastiness is allowed in a space, that space will inevitably devolve into a cesspit as the nasty (of whatever flavor) attract their own kind and push everybody else out. Moderation is an absolute must because without it, communities will fester and decay. Turns out that most reasonably well adjusted people prefer to spend their time around other reasonably well adjusted people.
This is an imagined scenario of what might happen, but it doesn't seem to have happened yet for Substack, for me anyway. I've seen no sign of anything bad on Substack (on the blogs I read and occasional glances at Notes) other than these articles calling it out.
So it's more like knowing that there are Nazi websites on the Internet, somewhere, than having them in the room. Okay but don't link to them?
Substack is getting a reputation problem, but the direct cause (for now) is people calling this stuff out. That's a different problem from the Nazis themselves causing a reputation problem from their own interactions with other people.
So for me it's still "wait and see." Maybe the owners should be more preemptive about something that might happen to their website, but as a reader I can deal with it later. I still have favorite blogs hosted on Substack and I will keep reading and linking them, because there's no sign of trouble on those blogs and I would miss reading them if I stopped.
(I even read one blog, Marginal Revolution, that's not on Substack but has a terrible comment section. I don't know that it's actual Nazis, but I avoid reading the comments and I avoid linking directly to it on Tildes. It's still a good source of links to other websites, which sometimes I share here.)
I also have an inactive blog on Substack. It gets zero comments so it's very easy to moderate. Moving it would be more trouble than it's worth.
More generally, I think people are overly concerned with imagining how things might get worse in situations when there's little reason to plan ahead. For any website, there's some chance that it will go downhill and end badly. Leaving early on rumors when you're having a good time seems like a way of missing out?
Substack is getting a reputation problem, but the direct cause (for now) is people calling this stuff out. That's a different problem from the Nazis themselves causing a reputation problem from their own interactions with other people.
If people finding out what Substack is doing causes a reputation issue, the problem isn't with the whistleblowers.
Platforming Nazis is a bad thing in and of itself, even if the Nazis are well-behaved (for now), because doing so leads to more Nazis. Increasing their reach increases the number of people vulnerable to that ideology who will see them. The farther they are from the fringes of the internet, the more legitimate their position appears, and the easier it gets for random people to stumble across them. One of the most effective measures for dealing with an infection is quarantine.
And I invite you to consider how it feels to be a member of one of the many groups that Nazis hate. Over there, in the same room but sitting quietly in a little square of tape is someone who wants to murder you. They will do anything they can to gain power until they can murder you, and everyone like you. Do you see how that might make members of that group feel unwelcome or uncomfortable on the platform, even if the section of the room over here is fine? Quite a lot of marginalized people are driven away when you allow the voices of the people who want to kill them equal weight as their own, to the detriment of people and platform both.
For many people, the best time to get off a sinking ship is now, even if they could potentially swing by the buffet real quick before leaving. If you want to stick around to get as much as you can out of the experience, that's entirely within your rights. But many people will want to exit before any unpleasantness, and it'll be much more difficult to bring in anyone new.
Maybe you're right that people are panicking early, and Substack will turn things around somehow. But I've never even heard of a place getting better after adding Nazis into the mix, so personally I'm going to give it a wide berth.
Nobody needs a reason to stop using a website. It could be just on vibes. I’m explaining why I don’t find other people’s arguments persuasive after thinking about them a bit. In particular, I’m uncomfortable with argument by analogy when the analogy is vivid, but doesn’t work very well. The “Nazi bar” thing is a meme that’s going around, and I don’t think it really gets at what’s going on, at least for Substack.
I think you’re leaning too much on scary metaphor. There are no scary possible murderers in the same room as you on the Internet. There’s no physical proximity. Using the Internet isn’t much like riding a subway or a bus, or sitting in a bar.
There are real problems with harassment on the Internet, and sometimes people organize harassment or worse in the real world, but this doesn’t seem to be what the conversation around Substack’s terms of service is about?
The “Nazi bar” thing is a bad meme that’s going around that doesn’t really get at what’s going on.
It seems very apt to me, without resorting to arguing over the semantics of the statement. What do you think is "really going on"?
but this doesn’t seem to be what the conversation around Substack’s terms of service is about?
As far as I see it the issue people have with the terms of service is that, in as many words, they don't debar Nazis from using Substack. I don't think there's any need to go into why, given the examples in the Atlantic article of "Nazis" on Substack and the very real, very recent effects of dehumanising rhetoric online, that is a bad thing. The problems being spoken about arise very directly from Substack's actions (via the Nazis they continue to platform). So the conversation seems clear to me - or am I misunderstanding?
Okay, here's what I think is going on, avoiding the use of metaphors.
According to the Atlantic article, Nazis are using substacks to organize and fundraise. That's definitely very bad, and I don't know why they don't do more about it.
Some people think of this as reason enough not to use Substack. Fair enough, I consider this similar to not wanting to go to Chick-fil-A because they support conservative causes.
I think the "Nazi bar" metaphor started out being about how moderators should kick out some participants to avoid making things worse for everyone participating and getting a bad reputation. But bloggers who use Substack seem to have the tools they need to moderate comments to their blogs? Perhaps substacks are getting confused with subreddits?
In some sense the confusion comes from Substack trying to have it both ways - they mostly provide infrastructure to writers and stress their writers' independence as separate businesses, but they also heavily promote their own brand and try to cross-promote writers with things like Notes. When their brand gets tarnished then it can affect bloggers who use their infrastructure. For a website using some lower-level service provider like AWS or Google Cloud, the name of the service provider doesn't appear anywhere on the website and saying "that's a different website; nothing to do with us" works a lot better.
I think the bar analogy is useful and mostly correct when dealing with very social sites like Reddit. When the bad users sort of leak into the other communities. I remember the Star Trek subreddit got pretty racist at one point when Reddit was very loose with the moderation.
I am not sure the analogy quite fits Substack, at least not to the same degree. Individual Substack users are not really part of the same global community. I get a single newsletter in my email inbox and see nothing else; I interact with no one and don't see any recommendations for other substacks to follow. At least, that is how I use it. It's sort of like how I could subscribe to a WordPress-hosted blog, and while some nazi blogs might also be running WordPress, things are still fairly separated.
To stay in the bar analogy, this issue is more akin to my bar serving the same brand of beer as the nazi bar down the street.
I do still think Substack is doing something highly criticizable here by knowingly profiting from and hosting racist content. It is not unreasonable to leave the platform because of this. I just think it is somehow different from other social websites being ruined by the nazi bar effect, and it is a bit of a stretch to predict the same for Substack. The effect is negative, but in a different way.
I really don’t think they did.
The Nazi bar argument is lazy and doesn’t track with reality. There are areas like 4chan which host a wiiiiiiiide variety of content and calling anything trying to solve the moderation problem in a hands off way “a Nazi bar” just lumps so much good with the bad in a reductive way
Yes many people who don’t use things have opinions on them but it’s often not accurate in my experience.
I also don’t use 4chan, but I’m not sure I’ve heard more hate speech or gay furry porn from the users I’ve known who do.
The issue with 4chan isn't that "all 4chan users are bigots", but rather "bigotry is omnipresent".
I tried /tg/ a few years ago because I liked TTRPGs, and basically every time anything to do with women, LGBT people, or non-white people came up there were plenty of people dropping slurs or making disparaging comments. It was bad for my mental health so I left.
It's a hostile environment, because that's the community it has fostered. People there might usually not be bigots, but they also largely don't really care about the kinds of bigotry there.
It's not about all the individual users. It never is. Not with Reddit, not with 4chan. Hell, not with Tildes. It's the environment and what is tolerated.
You don't avoid the metaphorical Nazi bar because every person there is a Nazi, you avoid it because the Nazis are allowed to hang out there. @jess said it as well: the bigotry becomes omnipresent. And much in the same way that middle school kids pick up "gay" as an equivalent of "stupid" from each other, it's part of why casual homophobia and racism and the like are so prevalent online. I'm sure 4chan has its good spaces, but I never felt safe there. See also AOL chatrooms, and other wretched hives of scum and villainy.
The Nazi bar argument is lazy and doesn’t track with reality.
In reality, a couple of bad apples ruin the barrel. In reality, the vast majority of people do not commit violent crimes, yet every first-world country has a prison system for the minority of bad apples (with varying degrees of effectiveness).
There may be some minimal positive benefit to hanging around a Nazi bar, but life is short, and why take the risk? At best you are annoyed, and at worst you are killed. Not worth the attempt if you can identify and avoid it.
calling anything trying to solve the moderation problem in a hands off way “a Nazi bar” just lumps so much good with the bad in a reductive way
Likewise, I don't think "solving the moderation problem" is a good rationale to allow hate content on a site. Nor to skirt the line. I guess giving up is indeed a solution, but it's not like they are even doing that:
Our content guidelines do have narrowly defined proscriptions, including a clause that prohibits incitements to violence. We will continue to actively enforce those rules while offering tools that let readers curate their own experiences and opt in to their preferred communities. Beyond that, we will stick to our decentralized approach to content moderation, which gives power to readers and writers.
It's less removing all lines and more drawing the most fragile of lines in the sand and believing that they will step in before things go too far. Their official statements don't give me much faith that they will indeed step in.
So, to be clear about this, it's kinda not.
Or at least, that's not what prophet is asking for. Prophet is asking for them to ban what he deems to be Nazi content, which to quote, is
I use “Nazis” throughout in this post as shorthand to refer to an array of right-wing bigots and assholes with the secure knowledge that doing so will offend and annoy the people I intend to offend and annoy. Merry Christmas!
I don't know if that's your definition, and of course just about anyone will say "of course you should ban all the bigots" and then we get into the fun classification game where it turns out people don't always agree on what that is. Do I think Shapiro is an asshole? Absolutely. Does everyone? No. Is what he says something that should be banned? Depends, and I suspect someone like Prophet would disagree and say "always".
This exact vague name-calling categorization is part of why these discussions suck, because the literal source of this topic IS NOT talking about Nazis, just everyone they deem too far right to be acceptable, up to and including Nazis. Like any other concern/moral-panic argument, there's a lot of nastiness in the details, and the inherent dishonesty of "lol, I'll annoy people I don't like and just group them all as Nazis" is already a point against them in my eyes.
Are people against abortion "Nazis"? What about people with pro-law-enforcement views? How about people who voted for Trump but wouldn't again? What if they listen to Joe Rogan? I don't know, but I sure as hell have met people who would gladly group all of those under "Nazi", and I think that's shortsighted, stupid, irresponsible, and actively dangerous.
The lab leak theory has never actually been a viable, or widely accepted, theory about COVID. There have been "experts" who have said that's what happened, and it has been explored thoroughly, but, because it's pretty much impossible to prove the exact origins of COVID, it's impossible to disprove.
Allowing conspiracy theories to prosper and flourish without push back has caused irreparable harm to this country over the last few years, hell, some people have been pushing a civil war because of an easily disproven conspiracy theory.
I am of the mindset that giving medical advice without proper credentials should not be protected by the First Amendment. If you tell people, for example, to take ivermectin instead of a vaccine or approved medications, you should be liable for what happens to them.
"Lab leak" has been used to mean everything from "engineered as a bioweapon" to "sloppy containment of natural virus." It's a meaningless term at this point. Speculation, even misinformed...
"Lab leak" has been used to mean everything from "engineered as a bioweapon" to "sloppy containment of natural virus." It's a meaningless term at this point.
Speculation, even misinformed speculation, about the origins of a virus is not the same as medical advice on how to treat the virus. There are overlaps in the people who believe both types of misinformation, but they're not inherently linked.
Exactly the problem. It can mean anything from "China was fucking around with things it shouldn't have been and now the world is finding out" to "Fauci paid Xi to create a killer virus to assassinate Trump." But there's very little evidence that the virus came from anything other than normal animal-to-human transmission. Yet people (including the president of the United States) pushed it as absolute truth and used it to drum up racism and anti-science propaganda.
Speculation about the origins of the virus is not medical information, but conspiracies about how to treat it most certainly are. While not the same thing, they're usually pushed by the same people in the same conversations. It's like how not everyone who believes we didn't land on the moon is a flat earther, but there's usually some crossover, and they use "the firmament" as a reason why we can't reach the moon.
It continues to amaze me just how many Nazis remain in this world. I don’t mean bigoted neofascist Trump-worshippers — the past 7 years have revealed the extent of that — but actual swastika-waving literal unmasked white supremacist capital-N Nazis. I never encounter these people in everyday life (as far as I know, anyway) but every now and then they’ll crawl out of the woodwork into some unsuspecting corner of the internet. And there are a ton of them, and they just love to parrot reprehensible things at each other endlessly.
It just blows my mind, as a person with a moral compass, that these people even exist and in such numbers. That they weathered out the Nuremberg trials and Elie Wiesel and Anne Frank and Indiana Jones and Schindler’s List and that Are We The Baddies? sketch, and still they’re doubling down and chaining themselves to this indefensible worldview. What must be going on in the brain of someone like that?
More to the point, does this mean it will never go away? If those things failed to completely eradicate it, is eradication even possible?
My theory is that they hold it because it's a reprehensible world view.
Some are like teenagers in that they want to shock people. Others feel lonely and want to strike back at society however they can. Many find themselves unattractive, uninteresting, and unintelligent. Thus, they feel the need to build their self-worth, and knowing "the real super-secret truth" is the only way they know how to make themselves feel superior.
Probably a million other reasons someone would get into this ideology in today's world. But I imagine most end up there for reasons directly or indirectly related to how abhorrent and unpopular these views are.
One of the trademarks of fascism is the aestheticization of politics. People spout these horrible ideas not because they genuinely believe in them, but because of the social power it allows them to wield. You have these magic words that have the power to make someone wildly upset.
yeah, I've seen this with flat earthers as well. quite a few don't really seem to believe the BS, but know aligning like this gives them a community and belonging. Or spite from people they never liked anyway.
I guess even traditional cults will have people who are just there for the food and company and pretend to parrot whatever the leader is selling
It's not that so many "remain" in this world. New ones are being created daily, probably by the thousands, thanks in part (quite literally) to decisions like this one by Substack.
40-50 years ago, there were many, many fewer Nazis (both in the open and in hiding -- I can't prove it, but I'll stand by that assertion) than there are today ... and that's because the vast majority of adults walking around had lived through Hitler and WWII. One way or another, people remembered it, personally.
Now, it's history. It's getting "fuzzy". People can find all kinds of fringe theories and evidence suggesting it was exaggerated, misrepresented, or flat-out made up. Even without the outrage-inducing algorithms of the FB-circle, people were going to forget what really happened, and start to believe what they want to believe, instead.
There's not that many. You just see them on the Internet because this is the only place they can proliferate. They're not hanging out at bars. They're not building community in your malls. They're barely here, and you only pay attention because you're on the Internet too.
And they’re signal boosted because “nazi” gets more clicks than “really shitty people who actually make up a larger subsection”
And because "Nazi" has become more and more the de facto term for racist/alt-righter/fascist, when there are oh so many awful varieties of all those things, and many of them aren't Nazis, and some of them even hate Nazis.
To quote Popehat:
I use “Nazis” throughout in this post as shorthand to refer to an array of right-wing bigots and assholes with the secure knowledge that doing so will offend and annoy the people I intend to offend and annoy. Merry Christmas!
Generally speaking, people who insist on whining when you don't observe the right taxonomical distinctions between right-wing extremists are themselves not worth engaging with.
Which is exactly what the comment was talking about.
And bluntly, people like prophet only make the issue worse. I've known people who lived through WWII (although children at the time) who lost family to the Nazis. And I've watched idiots on the internet call them Nazis because they dared to be right of them on the issue.
If your stance is “well I’ll devolve into name calling to piss people off, cause confusion, AND in the process embolden a dangerous minority by inflating their numbers” then you’re not helping
Some people are just that evil. Deciding to embrace bigotry because it appeals to them.
Some people, however, are just beaten down. Without options, without great choices, without opportunities to advance or better their situations in life. These people want Something Better. If nothing else, if more money or a better life isn't going to happen for them, they at least want to feel better. Things might be shit, but at least they might have a few friends.
Some of these beaten down people get picked up by the extremes. Because those extremists will listen. Will commiserate. Sympathize. Validate the feelings of the person who's down. What often happens from "the normal sectors" of society is everyone else has their own problems, or isn't being paid to care (about your problems), or these normal sectors are under the control of groups who just don't give a fuck about "little people."
Extremism historically rises when people are low on options. Especially options for advancing their lots in life. Most people really hate feeling like they're trapped without life options, which today basically means trapped without money since everything costs money.
That's one of the big ways extremists thrive. Again, some people are evil pieces of shit who arbitrarily hate others for no valid or justifiable reason. Most of these evil pieces of shit will always be evil pieces of shit.
But some of the people being picked up under the extremist banners are just being swept in by a facade of care and concern. "Sure buddy, things are tough, I feel you, it's horrible how you work and work and can't get anywhere. You know what I think? It's their fault. They're who's to blame for how shit your life is."
That's going to get listens from some people. From people who, if they weren't backed into the bottom of the barrel with no hope, would keep walking because they've got options. But when they don't, some of them listen to shit like that, and it sounds pretty good. Especially when they're welcomed in by the speakers. They like feeling like someone's listening to them, someone's feeling their pain, rather than just shrugging and telling them "fuck off everyone's got problems."
The way to reduce extremism is to reduce inequity in society. But that's a hard solution, and leaders definitely don't want to hear about it. Easier to just let the little people fight amongst themselves.
It's worth noting that this is an ideology that took root and then dominated a country of ~100 million people, which we're not even a century removed from. The fact that it still compels a sizable minority is a depressing but perhaps not outrageous conclusion.
I cannot stand calls for mandatory stand-taking. It's exhausting and unproductive. "Why won't you take a stand on this very important issue which your voice will not meaningfully affect?" It has major "so when did you stop beating your wife?" energy when used by a real journalist.
Asking people to actually act on the values they go out of their way to claim to have, and then thinking differently of them when they do not. How dreadful!
Substack fills a niche and competes with old-school roll-your-own platforms like combining Wordpress and maybe Patreon. I'm not sure how easily many authors with sizeable Substack audiences can migrate to a different platform. Perhaps I'm ignorant and there are other options I haven't heard of.
Buttondown.email is the best other option I've seen - a few tech newsletters I read use it, and they've been actively pushing a) their "migrate from Substack" workflow, and b) their "no nazis" stance
Based on Buttondown's own comparison, it seems the one big feature it's (intentionally) missing is a comments section. Unfortunately, the comments section is an essential feature for many Substack writers. Some writers have intentionally curated a Q&A culture with their paid subscribers.
I clicked the piece's link to Hamish McKenzie's apologia for Substack's approach and boy oh boy, worst take evah! Popehat already addressed some of this, but here's some more:
Our content guidelines do have narrowly defined proscriptions, including a clause that prohibits incitements to violence.
The idea of explicitly mentioning the prohibition of violence (as opposed to mentioning, say, the prohibition of unsolicited marketing) is to suggest that Nazi views are okay, as they are just views, not calls for violence. Which is beyond stupid. The Nazis are literally Nazis. The Nazis did the Holocaust. Really blows my mind that I have to state this fact.
But there is also this:
There also remains a criticism that Substack is promoting these fringe voices. This criticism appears to stem from my decision to host Richard Hanania, who was later outed as having once published extreme and racist views, on my podcast, The Active Voice. I didn’t know of those past writings at the time, and Hanania went on to disavow those views. While it has been uncomfortable and I probably would have done things differently with all the information in front of me, I ultimately don’t regret having him on the podcast. I think it’s important to engage with and understand a range of views even if—especially if—you disagree with them. Hanania is an influential voice for some in U.S. politics—his recent book, for instance, was published by HarperCollins—and there is value in knowing his arguments. The same applies to all other guests I have hosted on The Active Voice, including Hanania’s political opposites.
Popehat had a good take here:
I dislike McKenzie’s apologia for Substack’s policy and for Richard Hanania because it has a sort of detached, sociopathic philosophy popular with techbrahs that all differences of opinion are equal — that a dispute over whether black people are human is like a dispute over the best programming language or whether Rocky Road is the best ice cream. This, too, is a value judgment. It’s not one I share.
As said, I like this take. And yet, I sort of feel that it is slightly naive in assuming McKenzie to have the best of intentions. There are just so many alarm bells going off in my head when I read McKenzie's piece. To me, it sorta suggests that the decision to host Nazi stuff is ... sort of intentional?
It's true that "Hanania went on to disavow those views", but what else would you expect when an influential conservative has his white supremacy background exposed? Really curious that McKenzie just takes his word for it.
"who was later outed as" ... who in their right mind would unironically use the word "outed" in the context of a conservative having his white supremacy backround exposed?
Yeah, I appreciated his take. It's a shame because I literally had just considered getting more into substack for my reading/news/etc but yeah I don't want to hang out at the Nazi bar
Substack needs a competitor that cares about these issues. I have found creative, interesting, unknown voices at substack. I don't want to lose them, and I don't want to support Nazis
Ugh - so I did a little search on Ghost, the open-source newsletter platform best poised to compete with Substack, which hosts 404 Media, The Browser, Quillette, and other significant names. The Ghost Foundation's Terms of Service prohibit content that
promotes discrimination, bigotry, racism, hatred, harassment, abuse or harm against any individual or group;
Nonetheless, it took no time at all to find newsletter content on Ghost that blames Jews for the Ukraine/Russia conflict in starkly racist terms. Link, follow at your own peril. In addition to the violation of Ghost's Terms of Service, I'm pretty sure this would be banned content under German law, as would be the newsletters Katz found on Substack.
I don't have the rhetorical gifts necessary to analyze and prescribe solutions for the wholesale dereliction of responsibility by current platforms. We know the genocide of Rohingya people in Myanmar was directly promoted by Meta. We know that X is amplifying Nazis.
But the newsletter platforms are trying to pretend they're the equivalent of a magazine section at a bookshop - the Nazis are present, but only if you actively look for them. With the exception of the Substack podcast promotion, there's no algorithm or other activity pushing Nazi content in your face. Edit: Upon information from /u/DefinitelyNotAFae, it seems Substack does use a recommendation algorithm, which has resulted in users being solicited for unsought misinformation newsletters.
You can go on Amazon or Project Gutenberg and merrily find Mein Kampf, the writings of Osama Bin Laden, or any number of other odious materials for passive consumption. But Substack, Ghost, et al. aren't just newsletter publishers. They provide means for authors and publishers to interact with their audiences in real time, to collect revenue, to direct readers to other forums for organizing. It's more like a bazaar where most booths sell flowers and groceries, but there's a tiny shady section where you can pay to learn how to shoot people and practice your gun skills on live human targets, including the other market visitors if you're discreet about it and don't use your real name. Personally, I wouldn't sell there, or patronize the place.
Edit: Curiously, Kagi and Google searches don't immediately deliver hate/obvious misinformation content on Medium.com. Is anyone aware of whether they're actively moderating?
Yet another really effective site has been holed below the water line by its own owners, who hold the most common and popular shitty notions of modern tech-lords and their shitty blinkered venture capital bros. I’m skeptical of any other viable site’s ability to not already be similarly compromised, or to eventually become so. And, as previously alluded, this whole country is getting pretty damn Nazi-normalizing these days, as our Hitler-quoting former president and almost-certain Republican nominee demonstrates. Democratic leader Chuck Schumer’s response to the Hitler quotations was basically “yeah but hey the guy has a point about the border,” so holy shit, if that’s the best the opposition party has got, it seems like Nazi adjacency just might not be optional anymore no matter where you are.
And that’s what Nazis want: to be everywhere, so that you can’t go anywhere.
This most of all: making everybody else leave whatever spaces they enter is exactly what Nazis demand and expect, and they have a clear pattern of going to any platform where people of good intent congregate, in order to chase them out, to splinter communities, to position good intent as fringe and themselves as mainstream—which allows them to move out the margins of permission on the violence they intend.
But the newsletter platforms are trying to pretend they're the equivalent of a magazine section at a bookshop - the Nazis are present, but only if you actively look for them.
I actually don't disagree with this comparison, because I think most independent bookshops would shy away from selling Nazi magazines once it was pointed out to them. Customers would certainly complain in much the same way they do about Substack -- by doing this, you're directly profiting off the Nazi rhetoric and amplifying their ability to reach their audience. Even before you consider the effects this has on your clientele (I wouldn't patronize a bookshop that sold Nazi magazines), imo you are directly implicated in it as soon as you begin profiting off it without even attempting to filter it.
I'm actually curious how Substack handles the legal requirements of a country like Germany, where tolerance for Nazi content is legally pretty damn low and where there are requirements for users to be able to report violations of NetzDG, which requires you to very promptly remove open Nazi shit for at least German users (this was very effective on Twitter before Musk, for instance, and iirc Musk's Twitter has been subject to fines/legal action in Germany due to its hesitance to enforce this).
Is that accurate, that there's no promotion? When I'd poked at Substack recently it did suggest other newsletters to me, and one person in the comments of Popehat's post claimed to have multiple "gender critical" newsletters pushed at them. (Which, as they also pointed out, usually come from people quite ideologically aligned with the Nazis.)
I'll correct what I wrote based on your information. I don't use the Substack app or browser page; I subscribe to exactly one paid Substack e-mail newsletter from Adam Tooze, and will be writing him to ask if he can relocate or otherwise protest.
No worries I am not super familiar with the service. I literally just started subscribing to free ones earlier this year. And I'm not totally sure how it all works.
Best bet is self-hosting Ghost if you plan to monetize your writing to the general public. If you're seeking a blogging platform, I quite like WriteFreely in the Fediverse, but building reach is a very different process there.
I will say that if you plan on running a newsletter, it's probably worth paying for an established site so that they can deal with e-mail whitelisting management.
There are paid versions of both Ghost (see Nazi adjacency problem mentioned above) and WriteFreely as well, but you asked for "free"... there's not much out there that I know of which offers adequate features (like newsletter mailing if desired) without charges.
Buttondown looks great (privacy first, no monetization of users), but again, the free tier is highly feature-limited.
I want to make a defense of substack. I've read this article, Popehat's piece and the open letter from Substack writers and I am not convinced.
The metaphor of the Nazi bar is explicit that everyone in the bar doesn't have to be a Nazi, the bar just doesn't kick the Nazis out. Given that, it's definitely a Nazi bar.
Why is it not an issue that sexually explicit/sex work content isn't allowed? Why is it okay to accept credit cards for Nazi propaganda but not erotica?
I'd personally feel a bit different about their "principled stance" if they were transparent about how much money Nazis made them, and demonetized them or at a minimum, donated their own proceeds. But they had a separate incident involving a racist guy on their podcast and though I'd have to look it back up to find the specifics, it was a similarly bad vibe. Deplatforming is effective. They'd rather make money off Nazis than avoid having "Nazi" and "Substack" in the same sentence. And that's their call.
Payment processors get really squeamish when NSFW content is involved because many countries have created regulations that require them to be as cautious as possible. If payments used for sex trafficking or illegal content go through them, it starts creating serious liability issues. Moderating that kind of content is extremely tricky; slipping up even once can cost a company millions of dollars (it's not just a slap on the wrist).
I'm aware, it's still a point of hypocrisy. Beyond that I believe they ban erotica entirely, which based on Amazon's Kindle Unlimited selection is perfectly allowable by the payment processors. So, they truly don't have an excuse.
EDIT: In the interest of being correct, if Popehat's analysis is correct they do allow erotica, but also do prohibit a number of things that are legal but "harmful" including doxing and self-harm content.
They have things they don't want associated with their name or don't want to make money from. Nazi content isn't among them.
From my history as a merchant account payment processor I really see porn bans as an existential matter for anyone doing payment processing rather than a philosophical issue. It’s not hypocrisy if there is a legitimate and enormous business necessity.
I noted that other non-porn, non-illegal things are banned. That's hypocritical. I was incorrect about erotica, which would also have been hypocrisy.
This isn't the reason payment processors get squeamish about sex. It's because sex is a high-risk, high-chargeback sector. So everything is more expensive (e.g., most insurers don't cover it, and those that do are more expensive).
There are payment processors that are ok with sex. Those payment processors are more expensive.
It's a mix of both. It's only "high risk, high chargeback" because our society shames sexuality and nudity. You don't charge back a bad DVD or video game, so why a sex video?
This actually isn't what's driving chargebacks of adult content. The main driver is deeply deceptive billing practices on the part of adult website owners, cross-selling and negative-option billing being the worst offenders. Unfortunately, merchant account coding is the blunt instrument used to deal with this, which means you have to do everything in your power to avoid getting reclassified.
If sexual material didn't range from hardcore fetish acts to a female nipple (depending on country and culture), there might be some merit to being that cautious. But given that even the Supreme Court decades ago couldn't articulate a proper line for obscenity, it seems more like a blanket ban on anything involving nudity than a proper way to protect victims of sex trafficking.
My comment is only regarding why payment processors get uncomfortable around sexual content. I agree the current approach is bad.
In my opinion, many laws which have been passed to "protect children" are doing more harm than good. I hate puritanical reactionaries and their vendetta against pornography.
Eh. I wouldn't disagree entirely... there's a lot of bad law passed in the name of "protect the children." Payment processing laws are problematic way outside the scope of just sex. I think some of the ways the under-13 laws are written are also problematic, though I generally agree with the philosophy there. I think proper mandatory reporting laws WRT user interaction would do much more. I'm starting to come around to the idea of mandatory identification for any site that caters to people under 16.
However, "I can't look at tits on the internet because it's very difficult to moderate user-submitted sexual content properly" is hardly doing more harm than good IMO.
Maybe if there wasn't so much sexualization online, we'd have a healthier stance on it in the real world. I'm not saying there isn't a place for it online, but I do kind of agree that it should be more separated than it often is. It's been much harder to find games on Steam since they opened the floodgates. I had to filter out dozens of tags to hide most of the thinly-veiled porn, and many legit games like Witcher 3 get caught up. There's a huge difference in how these games are marketed... even some of the most violent games still use fairly kid-friendly stills that I can browse past on the TV.
I'm not opposed to people playing thinly-veiled porn games. But I do want that content properly isolated from non-sexual content, for the same reason I don't shit in my shower. Appropriate spaces for appropriate activities.
Less porn online, more porno orgy BDSM theaters where they can check ID at the door and bounce the rapists.
I don't think the bulk of Western history bears this hypothesis out. We certainly did not have healthier stances on sex and sexuality prior to the internet.
I suppose, but I doubt the porn is helping.
Religion is dying the slow death, that should help.
I think in this particular respect porn has both positive and negative aspects. I think it balances out to "neutral but complicated".
I'm from a very religiously conservative background, though, so that definitely influences my perspective on it. I'm the only one of my siblings who's cohabitated with someone prior to marriage, and my younger sisters both got married before they could legally drink (whereas I married at the ripe old age of 25 for the more traditional reason of "wow would it be easier to sort out that EU visa if we got married instead of just living together").
The aspects that I think generally tip it towards the 'more bad than good' boil down to:
I'm certainly not calling for a ban. But I'd like to see it relegated strictly behind age-validated logins, preferably with an equitable pay model.
It would be, I think, a net positive to cut off teenagers from porn. But I think it would be helpful to have proper sex education videos (that conservatives would definitely class as porn) that are accessible to teens over 13, showing what healthy sex and sexuality look like.
Good luck selling the "We're cutting off teenagers from all but good porn" campaign though.
Overall I think most of the issues you describe predate the widespread proliferation of porn via the internet, and the internet just makes these human behaviors more visible than they were before. Other than algorithms potentially monetizing you (something I honestly think is a much bigger problem OUTSIDE the sphere of sexual content), I can think of examples of these things going back to Classical times. I think it's more sensible to focus on making sure that everything is well-labeled, safe from people who don't want to see it/aren't looking for it, and consensual for all parties involved, rather than trying to prevent teens who want to view porn from viewing porn. The latter is a futile exercise and always has been, and preventing access to porn would only increase creepy things like masturbating to non-sexual content. I think the fundamental issues re: sex and sexuality are cultural ones that long predate, and are largely independent from, the existence of the internet as a place to very easily find porn.
Good luck.
Whatever scheme you come up with here will have collateral damage. And the collateral damage is the only thing it achieves.
I think I like Popehat’s take on the Nazi bar analogy with Substack.
I don’t know where I stand on the issue. I think there’s a difference between banning a user because they’re a Nazi, and banning a user because they post Nazi rhetoric, and I acknowledge that it’s probably a fine line that’s hard to define - especially with the constant and ever-changing dogwhistles.
Banning a user from the platform for promoting Nazi ideology on the platform I can understand as banning someone for hate speech. I personally would be okay with banning someone from the platform for posting Nazi rhetoric on a different platform, but I can understand not doing so.
Nazi guests from other events don't generally have free rein to barge into my niece's quinceañera and make comments.
I think the concern is the promotion of Nazi rhetoric on the platform specifically though.
And they're not barring those people.
Well, no, in the metaphor, a bar eventually becomes a nazi bar due to a series of events that starts with not kicking out nazis. It's not like if you have a nazi or two in your bar, you're instantly a nazi bar.
Currently, substack is not a nazi bar. Most people there are not nazis. If the metaphor's theory holds true, eventually, more nazis will come there, which will edge out normal people, and eventually no normal person would ever tolerate being there because of how many nazis there are, and the only people left will be nazis; hence, nazi bar.
In the original story (transcribed on Reddit from the tweets), the bartender points out that you cannot have even a single one hanging around, no matter how polite an individual, and immediately kicks the guy out, because the next step is they bring their friends. If you don't kick out Nazis 1 and 2, you end up with Nazis 12 and 25.
Substack is just at the point where everyone is noticing the Nazis and complaining to the bartender. Who has decided that it's better not to make decisions about whether Nazis are allowed or not, and he's perfectly willing to accept the Nazis' money.
If you want to say that it's not technically a Nazi bar yet, okay, sure, whatever; this is all a metaphor anyway. A bar that tolerates Nazis and lets them hang out and pay to reserve the back room for weekly Nazi night is not, in my mind, significantly better or more principled. Because I don't want to be in a bar with Nazis, like, as a general rule. And if the bartender is letting them stay, then this isn't a bar I want to be in.
The metaphor isn't about principles, it's about practicality. That's the whole point.
Whether you agree with letting people say whatever they want to say regardless of how distasteful you find it is an entirely separate issue.
The point of the Nazi bar metaphor is that if you don't nip it in the bud, eventually, your space will become a haven for distasteful people, and those will be the only people who come there. Substack hasn't gotten there yet, so thus it's not a nazi bar. Whether you're okay hanging out in a place where a few nazis hang out is an entirely different conversation.
So they're not nipping it in the bud, meaning it's gonna become...
The practicality is that you toss them out and don't engage in arguments about fairness upfront. Or you become a ....
Like I said if you want to argue that it's not quite there yet, ok, sure whatever. It's a Nazi Bar to Be. It's a bar really dedicated to making sure everyone, even Nazis, are welcome. Whatever.
That is still bad. The practicality is that you should remove them immediately. So is the principle.
What content is Substack moderating that isn't required by law?
Popehat lays it out (on Substack)
It is illegal to allow child porn on your site. Full stop.
If you are not validating the adult content and sex work on your site, you can be charged for any child porn or sex trafficking that occurs on it.
Basically every other rule falls under a similar subset of “you can and will be charged for this”
Hate speech is also illegal in various areas, but they don't seem to be making a stand on that front. Why not?
Hate speech is not illegal in any jurisdiction they are in. This is why they have the incitement-to-violence clause/protected-classes stuff. Hate speech inciting violence is illegal, and they can be found liable for not policing it. Hate speech such as "X are terrible and the reason everything is wrong with the world" is not illegal.
Substack is available in Germany, which absolutely has legal requirements for websites to remove certain Nazi content from their site (as viewed in Germany) and to have easy ways of reporting it. People who know Substack's legal requirements in the US far better than you (e.g., the American lawyer who goes by Popehat and who posted the quoted portion of the comment you replied to) have pointed out that their moderation far exceeds their legal requirements under US law, but it also is just patently untrue that hate speech isn't regulated in "any jurisdiction they are in" when it absolutely is.
They are a US based company was more my point, but yes Germany is a thing, and I'm curious to see how they handle this. To my knowledge, they've done nothing so far.
Okay? And not all adult content is CSAM. Consumption of legal adult content does not lead to a desire to seek out or consume CSAM.
Meanwhile, statements like "X is horrible and the reason why the world sucks" have historically led to incitements of violence and then actual violence. And that is why many platforms don't let it creep in. It's the same line of reasoning, except history points to one topic being much more of an issue.
The thing is: platforms can't fuck up with CSAM even once. If sexual material of a 17 year old was sent to an email list through Substack, they've opened themselves up to huge liability issues.
Whether it makes sense or not, laws around sexual content are extremely strict and punishing in many countries. Racism isn't regulated the same way as sexual abuse.
sounds like a "simple" filtering issue on all media, then. Which they already have.
If we're being realistic: that content is probably already captured and pre-screened far before it ever goes into production. You can fast-track some trusted accounts in long good standing, but any uploads from anyone else need to be checked anyway, be it illegal or "legal but against the rules" content.
Having a blanket ban is one of convenience, not necessity. One that discourages, but does not eliminate, the need for tools already being utilized. I normally wouldn't call such decisions lazy (Tildes does the exact same thing, after all), but I will be much more critical of a company that claims to be "supporting individual rights and civil liberties" while allowing ideas that have historically quashed said rights and liberties (and lives, many lives).
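To sketch what I mean by fast-tracking (every name and threshold here is invented for illustration, not how any real platform does it):

```python
# Hypothetical upload gate: trusted accounts in long good standing are
# fast-tracked, everyone else's media is held for pre-screening.
from dataclasses import dataclass

@dataclass
class Account:
    account_id: str
    age_days: int
    moderation_strikes: int

def is_trusted(account: Account) -> bool:
    # Assumed policy: a year old with a clean record skips the queue.
    return account.age_days >= 365 and account.moderation_strikes == 0

def handle_upload(account: Account, media: bytes) -> str:
    if is_trusted(account):
        return "published"        # fast-tracked, still reportable later
    return "held_for_review"      # goes to the pre-screening queue

# A week-old account gets screened before anything goes live.
print(handle_upload(Account("new-user", 7, 0), b"..."))  # held_for_review
```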
Sex workers and other adult content.
I knew this would be the first thing the moment I said it, and the short response to that is: allowing adult content with a hands-off moderation strategy is legally not possible.
If you allow ANY adult content you have to be ready to jump through about a million hoops or risk being arrested for distributing child porn.
It is a massive difficulty
They have to do that anyway? If someone was going to post something that nasty, they’re already willing to break rules far more severe than anything Substack enforces, so I don’t think a “no porn” rule will stop them. Substack still has to watch out just as much.
Surely you also have to remove CSA content even if you ban adult erotica (and it would not be trivial to moderate regardless of whether adult erotic content is allowed).
You do, but it's much, much harder than people think if you allow any sort of adult content. There's a whole lot of "I have no personal experience with this, but I think it should be easy" in this thread, and it just isn't.
If you allow real human adult content, ESPECIALLY in the newly arrived age of AI image nonsense, you will have an entire department dedicated to running it, because to do anything less risks actual jail time, not just the usual slap-on-the-wrist fines.
The difference between "we ban anything that even looks like adult content" and "we will allow legal adult content" is massive. The first one is much easier to accomplish because there's no wiggle room: you ban anything that looks like it could cause issues, done. The second will require verifying literally every account and image, and NOT doing that is why Pornhub had to nuke millions of videos.
It's funny that you say there's a lot of "have no personal experience with this, but I think it should be easy" here, because I work as a data scientist training classifiers to detect inappropriate content. Granted, we're in a different domain, so "inappropriate" includes decidedly more boring things that are non-compliant with financial regulations in addition to obvious stuff like sexual content. I personally have scraped comments from 4chan and labelled them to use as training data for our sexual harassment classifier (was not a fun task btw). So while I don't run a company like Substack, I've definitely got some experience with the technical side of detecting content like this.
Detecting sexual content, or even just nudity, in photos is way harder than you think it is. IIRC, shortly before I started at my workplace our team attempted and gave up on implementing a nudity detector for video calls because it wasn't reliable enough and had too many embarrassing false positives. It's a MUCH MUCH harder task when you include non-photographic images like drawn art. Not to mention text, though Substack's rules don't seem super clear about how heavily they moderate textual erotica. A company like Substack would not be able to effectively moderate adult content on their platform solely through automated means, even if they took the most hardline stance against anything even adjacent to nudity possible (and they're not doing that).
The long and short of it is that Substack is going to need similar levels of vigilance and non-automated review of content to prevent CSAM from being hosted on their platform regardless of how much adult content they actually allow. The government certainly isn't going to care whether legal adult content is allowed on the site or not if they find CSAM. Not allowing legal adult content might make the job slightly easier for their human moderators by making it less of a judgment call (though their leeway on artistic nudity undoes that imo), but that doesn't really translate to much difference in terms of their legal risk or even the cost of moderation.
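To make the classifier side concrete, here's a toy sketch of the kind of text model I'm describing (TF-IDF plus logistic regression); the labelled examples are stand-ins, and a real system needs thousands of labelled comments and careful threshold evaluation:

```python
# Toy inappropriate-content classifier: TF-IDF features plus
# logistic regression trained on hand-labelled comments.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "you people are subhuman",             # labelled 1 = inappropriate
    "great write-up, thanks",              # labelled 0 = benign
    "go back where you came from",         # 1
    "interesting point about moderation",  # 0
]
labels = [1, 0, 1, 0]

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(comments, labels)

# predict_proba exposes a score you can threshold; tuning that
# threshold is exactly the false-positive trade-off described above.
print(clf.predict_proba(["new comment to screen"])[:, 1])
```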
I think you're misunderstanding my position. It is relatively easier, not actually easy, and again, it's the massive legal portion of this not so much the tech portion.
To host adult content they can only automate verification so much or wind up potentially liable for any CSAM/Trafficking/etc. If the stance is ALL adult content is a violation of the sites policies they instantly save themselves millions in dealing with payment processors and law enforcement entities. They will still have to police these things and pass on information as required, but the burden, expectation, and audit is much much lower.
At the bare minimum you don't have the huge problem of people seeing such content and not reporting it, because they think it's allowed, and the issue of attempting to verify every single user who's putting up adult content.
If I say adult content is allowed, let someone post photos, and DON'T verify who those photos are of, whether the subjects are of legal age, and who the user posting them is, I'm in a world of trouble. So now I must verify every single user to a much higher standard and police all content they post.
If I say adult content is not allowed, and someone posts photos, and I never verified them, that's just one more way they're seen as breaking the rules. The obligation of verification was never put on me in the first place, because I never said I'd be hosting that kind of content anyway. I still need a reporting system and some check of images hosted, but as adult content isn't expected to be hosted, I'm not forced to verify every single account, and the standard of negligence required for me to be liable is much, much higher.
Finally, not a lot of people try to hide this kind of content in places that don't allow adult content, especially since there are SO many places that do allow adult content in legal grey areas. If you just straight up disallow the content, your burden gets much lighter, because no one is going to accidentally share that kind of content either (which happens waaaaaay more than people think).
There's so much more to this than just the programmatic classification
For fuck's sake stop saying "child porn". These are images of child sexual abuse.
I quite strongly disagree that shifting Nazis to another website is the right decision, given that some amount of non-Nazi content is being banned as well.
Some amount of non-Nazi content is already being banned. So the choice is ban Nazi content as well as the other stuff, or don't ban Nazi content while banning some other things. As such, the question is just: Does Substack want Nazi content on their site, and the answer they're giving is "Yes."
I think you missed the whole point of the Nazi Bar thing. At first there's only one or two well-behaved Nazis. But over time, the other people leave, because they don't want to hang out around Nazis, and more Nazis come in, and once there's enough of them they get a whole lot less well-behaved.
Right now, Substack is a place that also has Nazis, but with this move they'll inevitably become Substack, the Nazi place. The majority of all those other voices you find interesting will leave over time, either of their own volition, because people really hate hanging out around fucking Nazis, or because the Nazis will actively push every other view point out as soon as they feel like they can.
I can't speak to the other sites, but I was among the people that abandoned Reddit 6 or 8 outrages ago, and joined (among other places) Voat. When it was brand new, it was nice. It was created and run--apparently, at least--by a reasonable, idealistic person who simply believed in unfettered freedom of speech.
And, yeah, exactly this happened to it. Nazis, trolls, right-wing conspiracy nuts crept in, everyone else left.
Always like pulling out this passage to describe the issue.
I had similar views on more laissez-faire moderation before I read through that and realized that such idealism, especially for a new, growing site, was simply that: an ideal. The content, normal citizens won't move off their Reddits or Twitter, so the entire exercise is one of response bias; the people frustrated enough to want to move will disproportionately be witches, if you ban witch hunts. You can only really oppose that with strict moderation, and that is exactly why Tildes didn't become another Voat.
I would modify your statement to say that laissez-faire moderation does work in the rare circumstances where either the entire market is growing or the established competitor has a sudden and complete collapse. Both circumstances lead to a situation where the influx of new users dilutes the Nazi presence to the degree that they end up sequestered in their own hell holes.
If the collapse or growth stalls too soon, you end up with the evaporative cooling of Voat. Trying to pull them out after they've taken root leads one down a road of Reddit with inane rules meant to chase dogwhistles, as well as opening the door to a petulant insistence on the equal application of rules that kills off advertising-unfriendly fun zones -- but that doesn't root out the problem. At least from my understanding, racist rhetoric was way worse there 10 years ago, but it didn't inspire real-world violence until after it got put into dogwhistles. One of the parent comments has a point that it's easier to radicalize the normies through dogwhistles. Personally, I additionally feel like there is a scale and organization to modern online racism that wasn't present in 2012. When it's the same four users spamming slurs at each other, it's not worth the hassle to investigate anything. Once numbers grow to the millions, the threshold for rhetoric that could inspire stochastic violence, intentionally fanned or otherwise, drops massively.
I’ve seen this phenomenon play out on the internet several times over the years. It doesn’t even take extreme stuff like white supremacy — if nastiness is allowed in a space, that space will inevitably devolve into a cesspit as the nasty (of whatever flavor) attract their own kind and push everybody else out. Moderation is an absolute must because without it, communities will fester and decay. Turns out that most reasonably well adjusted people prefer to spend their time around other reasonably well adjusted people.
This is an imagined scenario of what might happen, but it doesn't seem to have happened yet for Substack, for me anyway. I've seen no sign of anything bad on Substack (on the blogs I read and occasional glances at Notes) other than these articles calling it out.
So it's more like knowing that there are Nazi websites on the Internet, somewhere, than having them in the room. Okay but don't link to them?
Substack is getting a reputation problem, but the direct cause (for now) is people calling this stuff out. That's a different problem from the Nazis themselves causing a reputation problem from their own interactions with other people.
So for me it's still "wait and see." Maybe the owners should be more preemptive about something that might happen to their website, but as a reader I can deal with it later. I still have favorite blogs hosted on Substack and I will keep reading and linking them, because there's no sign of trouble on those blogs and I would miss reading them if I stopped.
(I even read one blog, Marginal Revolution, that's not on Substack but has a terrible comment section. I don't know that it's actual Nazis, but I avoid reading the comments and I avoid linking directly to it on Tildes. It's still a good source of links to other websites, which sometimes I share here.)
I also have an inactive blog on Substack. It gets zero comments so it's very easy to moderate. Moving it would be more trouble than it's worth.
More generally, I think people are overly concerned with imagining how things might get worse in situations when there's little reason to plan ahead. For any website, there's some chance that it will go downhill and end badly. Leaving early on rumors when you're having a good time seems like a way of missing out?
If people finding out what Substack is doing causes a reputation issue, the problem isn't with the whistleblowers.
Platforming Nazis is a bad thing in and of itself, even if the Nazis are well-behaved (for now), because doing so leads to more Nazis. Increasing their reach increases the number of people vulnerable to that ideology who will see them. The farther they are from the fringes of the internet, the more legitimate their position appears, and the easier it gets for random people to stumble across them. One of the most effective measures for dealing with an infection is quarantine.
And I invite you to consider how it feels to be a member of one of the many groups that Nazis hate. Over there, in the same room but sitting quietly in a little square of tape is someone who wants to murder you. They will do anything they can to gain power until they can murder you, and everyone like you. Do you see how that might make members of that group feel unwelcome or uncomfortable on the platform, even if the section of the room over here is fine? Quite a lot of marginalized people are driven away when you allow the voices of the people who want to kill them equal weight as their own, to the detriment of people and platform both.
For many people, the best time to get off a sinking ship is now, even if they could potentially swing by the buffet real quick before leaving. If you want to stick around to get as much as you can out of the experience, that's entirely within your rights. But many people will want to exit before any unpleasantness, and it'll be much more difficult to bring in anyone new.
Maybe you're right that people are panicking early, and Substack will turn things around somehow. But I've never even heard of a place getting better after adding Nazis into the mix, so personally I'm going to give it a wide berth.
Nobody needs a reason to stop using a website. It could be just on vibes. I’m explaining why I don’t find other people’s arguments persuasive after thinking about them a bit. In particular, I’m uncomfortable with argument by analogy when the analogy is vivid, but doesn’t work very well. The “Nazi bar” thing is a meme that’s going around, and I don’t think it really gets at what’s going on, at least for Substack.
I think you’re leaning too much on scary metaphor. There are no scary possible murderers in the same room as you on the Internet. There’s no physical proximity. Using the Internet isn’t much like riding a subway or a bus, or sitting in a bar.
There are real problems with harassment on the Internet, and sometimes people organize harassment or worse in the real world, but this doesn't seem to be what the conversation around Substack's terms of service is about?
It seems very apt to me, without resorting to arguing over the semantics of the statement. What do you think is "really going on"?
As far as I see it the issue people have with the terms of service is that, in as many words, they don't debar Nazis from using Substack. I don't think there's any need to go into why, given the examples in the Atlantic article of "Nazis" on Substack and the very real, very recent effects of dehumanising rhetoric online, that is a bad thing. The problems being spoken about arise very directly from Substack's actions (via the Nazis they continue to platform). So the conversation seems clear to me - or am I misunderstanding?
Okay, here's what I think is going on, avoiding the use of metaphors.
According to the Atlantic article, Nazis are using substacks to organize and fundraise. That's definitely very bad, and I don't know why they don't do more about it.
Some people think of this as reason enough not to use Substack. Fair enough, I consider this similar to not wanting to go to Chick-fil-A because they support conservative causes.
I think the "Nazi bar" metaphor started out being about how moderators should kick out some participants to avoid making things worse for everyone participating and getting a bad reputation. But bloggers who use Substack seem to have the tools they need to moderate comments to their blogs? Perhaps substacks are getting confused with subreddits?
In some sense the confusion comes from Substack trying to have it both ways - they mostly provide infrastructure to writers and stress their writers' independence as separate businesses, but they also heavily promote their own brand and try to cross-promote writers with things like Notes. When their brand gets tarnished then it can affect bloggers who use their infrastructure. For a website using some lower-level service provider like AWS or Google Cloud, the name of the service provider doesn't appear anywhere on the website and saying "that's a different website; nothing to do with us" works a lot better.
I think the bar analogy is useful and mostly correct when dealing with very social sites like Reddit. When the bad users sort of leak into the other communities. I remember the Star Trek subreddit got pretty racist at one point when Reddit was very loose with the moderation.
I am not sure the analogy quite fits Substack. At least not to the same degree. Individual Substack users are not really part of the same global community. I get a single newsletter in my email inbox and I see nothing else. I interact with no one and don't see any recommendations for other substacks to follow. At least that is how I use it. Sort of like how I could subscribe to a WordPress-hosted blog, and while some Nazi blogs might also be running WordPress, things are still fairly separated.
To stay in the bar analogy, this issue is more akin to my bar serving the same brand of beer as the nazi bar down the street.
I do still think Substack is doing something highly criticizable here by knowingly profiting from and hosting racist content. It is not unreasonable to leave the platform because of this. I just think it is somehow different from other social websites being ruined by the Nazi bar effect, and it is a bit of a stretch to predict the same for Substack. The effect is negative, but in a different way.
I really don’t think they did.
The Nazi bar argument is lazy and doesn't track with reality. There are areas like 4chan which host a wiiiiiiiide variety of content, and calling anything that tries to solve the moderation problem in a hands-off way "a Nazi bar" just lumps so much good with the bad in a reductive way.
As a non-4chan user, many of us feel the same way about 4chan and the like and wouldn't touch it with a ten foot pole for a variety of reasons.
Yes many people who don’t use things have opinions on them but it’s often not accurate in my experience.
I also don’t use 4chan but I’m not sure I’ve heard more hate speech or gay furry porn from the users I’ve known who do
The issue with 4chan isn't that "all 4chan users are bigots", but rather "bigotry is omnipresent".
I tried /tg/ a few years ago because I liked TTRPGs, and basically every time anything to do with women, LGBT people, or non-white people came up there were plenty of people dropping slurs or making disparaging comments. It was bad for my mental health so I left.
It's a hostile environment, because that's the community it has fostered. People there might usually not be bigots, but they also largely don't really care about the kinds of bigotry there.
It's not about all the individual users. It never is. Not with Reddit, not with 4chan. Hell, not with Tildes. It's the environment and what is tolerated.
You don't avoid the metaphorical Nazi bar because every person there is a Nazi; you avoid it because the Nazis are allowed to hang out there. @jess said it as well: the bigotry becomes omnipresent. And much in the same way that middle school kids pick up "gay" as an equivalent of "stupid" from each other, it's part of why casual homophobia and racism and the like are so prevalent online. I'm sure 4chan has its good spaces, but I never felt safe there. See also AOL chatrooms. And other wretched hives of scum and villainy.
In reality, a couple of bad apples ruin the barrel. The vast majority of people do not commit violent crimes, yet every first-world country has a prison system for the minority of bad apples (with varying degrees of effectiveness).
There may be some minimal positive benefit to hanging around a Nazi bar, but life is short, so why take the risk? At best you are annoyed, and at worst you are killed. Not worth the attempt, if you can identify and avoid it.
Likewise, I don't think "solving the moderation problem" is a good rationale to allow hate content on a site. Nor to skirt the line. I guess giving up is indeed a solution, but it's not like they are even doing that:
It's less removing all lines and more drawing the most fragile of lines in the sand and believing that they will step in before things go too far. Their official statements don't give me much faith that they will indeed step in.
This is all we are asking for and frankly it doesn't seem like a big ask. Of course you can't ban Nazis, but that was never the point.
So, to be clear about this, it's kinda not.
Or at least, that's not what prophet is asking for. Prophet is asking for them to ban what he deems to be Nazi content, which, to quote, is
I don't know if that's your definition, and of course just about anyone will say "of course you should ban all the bigots" and then we get into the fun classification game where it turns out people don't always agree on what that is. Do I think Shapiro is an asshole? Absolutely. Does everyone? No. Is what he says something that should be banned? Depends, and I suspect someone like Prophet would disagree and say "always".
This exact vague name-calling categorization is part of why these discussions suck, because the literal source of this topic IS NOT talking about Nazis, just everyone they deem too far right to be acceptable, up to and including Nazis. Like any other concern/moral-panic argument there's a lot of nastiness in the details, and the inherent dishonesty of "lol I'll annoy people I don't like and just group them all as Nazis" is already a point against them in my eyes.
Are people against abortion "Nazis"? What about people with pro-law-enforcement views? How about people who voted for Trump but wouldn't again? What if they listen to Joe Rogan? I don't know, but I sure as hell have met people who would gladly group all of those under "Nazi", and I think that's shortsighted, stupid, irresponsible, and actively dangerous.
The lab leak theory has never actually been a viable, or widely accepted, theory about COVID. There have been "experts" who have said that's what happened, and it has been explored thoroughly; but because it's pretty much impossible to prove the exact origins of COVID, it's also impossible to disprove.
Allowing conspiracy theories to prosper and flourish without push back has caused irreparable harm to this country over the last few years, hell, some people have been pushing a civil war because of an easily disproven conspiracy theory.
I am of the mindset that giving medical advice without proper credentials should not be protected by the First Amendment. If you tell people, for example, to take ivermectin instead of a vaccine or approved medications, you should be liable for what happens to them.
Exactly the problem. It can mean anything from "China was fucking around with things it shouldn't have been and now the world is finding out" to "Fauci paid Xi to create a killer virus to assassinate Trump." But there's very little evidence that the virus came from anything other than normal animal-to-human transmission. Yet people (including the President of the United States) pushed it as absolute truth and used it to drum up racism and anti-science propaganda.
Speculation about the origins of the virus is not medical information, but conspiracies about how to treat it most certainly are. While not the same thing, they're usually pushed by the same people in the same conversations. It's like how not everyone who believes we didn't land on the moon is a flat earther, but there's usually some crossover, and they use "the firmament" as a reason why we can't reach the moon.
I'm not so sure about "irreparable harm" if we're talking about the country and not things that happened to individuals. It seems too soon to say.
Proliferation of anti-vax and distrust of science propaganda is absolutely doing harm to the country.
Yes, remove the word "irreparable" and I agree - it seems true and uncontroversial.
That's fair; I may be a bit more pessimistic than you are.
It continues to amaze me just how many Nazis remain in this world. I don’t mean bigoted neofascist Trump-worshippers — the past 7 years have revealed the extent of that — but actual swastika-waving literal unmasked white supremacist capital-N Nazis. I never encounter these people in everyday life (as far as I know, anyway) but every now and then they’ll crawl out of the woodwork into some unsuspecting corner of the internet. And there are a ton of them, and they just love to parrot reprehensible things at each other endlessly.
It just blows my mind, as a person with a moral compass, that these people even exist and in such numbers. That they weathered out the Nuremberg trials and Elie Wiesel and Anne Frank and Indiana Jones and Schindler’s List and that Are We The Baddies? sketch, and still they’re doubling down and chaining themselves to this indefensible worldview. What must be going on in the brain of someone like that?
More to the point, does this mean it will never go away? If those things failed to completely eradicate it, is eradication even possible?
My theory is that they hold it because it's a reprehensible world view.
Some are like teenagers in that they want to shock people. Others feel lonely and want to strike back at society however they can. Many find themselves unattractive, uninteresting, and unintelligent. Thus, they feel the need to build their self-worth, and knowing "the real super-secret truth" is the only way they know how to make themselves feel superior.
Probably a million other reasons someone would get into this ideology in today's world. But I imagine most end up there for reasons directly or indirectly related to how abhorrent and unpopular these views are.
One of the trademarks of fascism is the aestheticization of politics. People spout these horrible ideas not because they genuinely believe in them, but because of the social power it allows them to wield. You have these magic words that have the power to make someone wildly upset.
Yeah, I've seen this with flat earthers as well. Quite a few don't really seem to believe the BS, but know that aligning like this gives them a community and a sense of belonging. Or it's spite toward people they never liked anyway.
I guess even traditional cults will have people who are just there for the food and company and pretend to parrot whatever the leader is selling.
I doubt it will ever go away, but that doesn't mean you stop stomping it out wherever it shows up.
It's not that so many "remain" in this world. New ones are being created daily, probably by the thousands, thanks in part (quite literally) to decisions like this one by Substack.
40-50 years ago, there were many, many fewer Nazis (both in the open and in hiding -- I can't prove it, but I'll stand by that assertion) than there are today ... and that's because the vast majority of adults walking around had lived through Hitler and WWII. One way or another, people remembered it, personally.
Now, it's history. It's getting "fuzzy". People can find all kinds of fringe theories and evidence suggesting it was exaggerated, misrepresented, or flat-out made up. Even without the outrage-inducing algorithms of the FB-circle, people were going to forget what really happened, and start to believe what they want to believe, instead.
There aren't that many. You just see them on the Internet because this is the only place they can proliferate. They're not hanging out at bars. They're not building community in your malls. They're barely here, and you only pay attention because you're on the Internet too.
And they’re signal-boosted because “Nazi” gets more clicks than “really shitty people who actually make up a larger subsection”.
And because “Nazi” has become more and more the de facto term for racist/alt-righter/fascist, when there are oh so many awful varieties of all those things, and many of them aren’t Nazis and some of them even hate Nazis.
To quote Popehat:
Generally speaking people who insist on whining when you don't observe the right taxonomical distinctions between right wing extremists are themselves not worth engaging with.
Which is exactly what the comment was talking about.
And bluntly, people like prophet only make the issue worse. I’ve known people who lived through WWII (although children at the time) who lost family to the Nazis. And I’ve watched idiots on the internet call them Nazis because they dared to be to the right of them on the issue.
If your stance is “well, I’ll devolve into name-calling to piss people off, cause confusion, AND in the process embolden a dangerous minority by inflating their numbers,” then you’re not helping.
Some people are just that evil. Deciding to embrace bigotry because it appeals to them.
Some people, however, are just beaten down. Without options, without great choices, without opportunities to advance or better their situations in life. These people want Something Better. If nothing else, if more money or a better life isn't going to happen for them, they at least want to feel better. Things might be shit, but at least they might have a few friends.
Some of these beaten-down people get picked up by the extremes. Because those extremists will listen. Will commiserate. Sympathize. Validate the feelings of the person who's down. What they often get from "the normal sectors" of society instead is that everyone else has their own problems, or isn't being paid to care (about yours), or those sectors are under the control of groups who just don't give a fuck about "little people."
Extremism historically rises when people are low on options. Especially options for advancing their lots in life. Most people really hate feeling like they're trapped without life options, which today basically means trapped without money since everything costs money.
That's one of the big ways extremists thrive. Again, some people are evil pieces of shit who arbitrarily hate others for no valid or justifiable reason. Most of these evil pieces of shit will always be evil pieces of shit.
But some of the people being picked up under the extremist banners are just being swept in by a facade of care and concern. "Sure buddy, things are tough, I feel you, it's horrible how you work and work and can't get anywhere. You know what I think? It's their fault. They're who's to blame for how shit your life is."
That's going to get listens from some people. From people who, if they weren't backed into the bottom of the barrel with no hope, would keep walking because they've got options. But when they don't, some of them listen to shit like that, and it sounds pretty good. Especially when they're welcomed in by the speakers. They like feeling like someone's listening to them, someone's feeling their pain, rather than just shrugging and telling them "fuck off everyone's got problems."
The way to reduce extremism is to reduce inequity in society. But that's a hard solution, and leaders definitely don't want to hear about it. Easier to just let the little people fight amongst themselves.
It's worth noting that this is an ideology that took root in and then dominated a country of some 70 million people, and we're not even a century removed from it. The fact that it still compels a sizable minority is a depressing but perhaps not outrageous conclusion.
It is impressive how many progressive authors post on Substack and are ignoring this issue.
Upton Sinclair — 'It is difficult to get a man to understand something, when his salary depends on his not understanding it.'
Same shit with xitter.
I cannot stand calls for mandatory stand-taking. It's exhausting and unproductive. "Why won't you take a stand on this very important issue which your voice will not meaningfully affect?" It has major "so when did you stop beating your wife?" energy when used by a real journalist.
Asking people to actually act on the values they go out of their way to claim to have, and then thinking differently of them when they do not. How dreadful!
Substack fills a niche and competes with old-school roll-your-own setups like combining WordPress with Patreon. I'm not sure how easily authors with sizeable Substack audiences can migrate to a different platform. Perhaps I'm ignorant and there are other options I haven't heard of.
edit: spelling
Buttondown.email is the best other option I've seen - a few tech newsletters I read use it, and they've been actively pushing a) their "migrate from Substack" workflow, and b) their "no Nazis" stance.
Based on Buttondown's own comparison, it seems the one big feature it's (intentionally) missing is a comments section. Unfortunately, the comments section is an essential feature for many Substack writers. Some writers have intentionally curated a Q&A culture with their paid subscribers.
It’s mentioned in the article, but Popehat’s full piece is a good read. Worth noting it is hosted on Substack, though.
I clicked the piece's link to Hamish McKenzie’s apologia for Substack’s approach and boy oh boy, worst take evah! Popehat already addressed some of this, but here's some more:
The idea of explicitly mentioning the prohibition of violence (as opposed to mentioning, say, the prohibition of unsolicited marketing) is to suggest that Nazi views are okay, as they are just views, not calls for violence. Which is beyond stupid. The Nazis are literally Nazis. The Nazis did the Holocaust. Really blows my mind that I have to state this fact.
But there is also this:
Popehat had a good take here:
As said, I like this take. And yet, I sort of feel that it is slightly naive in assuming McKenzie to have the best of intentions. There are just so many alarm bells going off in my head when I read McKenzie's piece. To me, it sorta suggests that the decision to host Nazi stuff is ... sort of intentional?
Here's Richard Hanania's wiki page. Juicy stuff, give it a read.
It's true that "Hanania went on to disavow those views", but what else would you expect when an influential conservative has his white supremacist background exposed? Really curious that McKenzie just takes his word for it.
"who was later outed as" ... who in their right mind would unironically use the word "outed" in the context of a conservative having his white supremacist background exposed?
Yeah, I appreciated his take. It's a shame, because I had literally just considered getting more into Substack for my reading/news/etc., but yeah, I don't want to hang out at the Nazi bar.
Substack needs a competitor that cares about these issues. I have found creative, interesting, unknown voices on Substack. I don't want to lose them, and I don't want to support Nazis.
All Substack links posted to Tildes are tagged "substack", allowing users to filter them out of their feeds if they so wish.
Ugh - so I did a little search on Ghost, the open-source newsletter platform best poised to compete with Substack, which hosts 404 Media, The Browser, Quillette, and other significant names. The Ghost Foundation's Terms of Service prohibit content that
Nonetheless, it took no time at all to find newsletter content on Ghost that blames Jews for the Ukraine/Russia conflict in starkly racist terms. Link, follow at your own peril. In addition to the violation of Ghost's Terms of Service, I'm pretty sure this would be banned content under German law, as would be the newsletters Katz found on Substack.
I don't have the rhetorical gifts necessary to analyze and prescribe solutions for the wholesale dereliction of responsibility by current platforms. We know the genocide of Rohingya people in Myanmar was directly promoted by Meta. We know that X is amplifying Nazis.
But the newsletter platforms are trying to pretend they're the equivalent of a magazine section at a bookshop - the Nazis are present, but only if you actively look for them. With the exception of the Substack podcast promotion,
there's no algorithm or other activity pushing Nazi content in your face.***
*** Upon information from /u/DefinitelyNotAFae, it seems Substack does use a recommendation algorithm, which has resulted in users being solicited with unsought misinformation newsletters.
You can go on Amazon or Project Gutenberg and merrily find Mein Kampf, the writings of Osama Bin Laden, or any number of other odious materials for passive consumption. But Substack, Ghost, et al. aren't just newsletter publishers. They provide means for authors and publishers to interact with their audiences in real time, to collect revenue, to direct readers to other forums for organizing. It's more like a bazaar where most booths sell flowers and groceries, but there's a tiny shady section where you can pay to learn how to shoot people and practice your gun skills on live human targets, including the other market visitors if you're discreet about it and don't use your real name. Personally, I wouldn't sell there, or patronize the place.
Edit: Curiously, Kagi and Google searches don't immediately deliver hate/obvious misinformation content on Medium.com. Is anyone aware of whether they're actively moderating?
Edit 2: A.R. Moxon said it best in The Reframe:
I actually don't disagree with this comparison, because I think most independent bookshops would shy away from selling Nazi magazines once it was pointed out to them. Customers would certainly complain in much the same way they do about Substack -- by doing this, you're directly profiting off the Nazi rhetoric and amplifying their ability to reach their audience. Even before you consider the effects this has on your clientele (I wouldn't patronize a bookshop that sold Nazi magazines), imo you are directly implicated in it as soon as you begin profiting off it without even attempting to filter it.
I'm actually curious how Substack handles the legal requirements of a country like Germany, where tolerance for Nazi content is legally pretty damn low and where there are requirements for users to be able to report violations of NetzDG, which requires you to very promptly remove open Nazi shit for at least German users (this was very effective on Twitter before Musk, for instance, and iirc Musk's Twitter has been subject to fines/legal action in Germany due to its hesitance to enforce this).
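For what it's worth, mechanically this usually amounts to geo-restricting reported content for German users while a review runs against the statutory deadlines. Here's a purely hypothetical Python sketch of that flow; every name and structure in it is my own assumption, not Substack's (or anyone's) actual implementation:

```python
# Hypothetical sketch of a NetzDG-style complaint flow. Illustrative only:
# the classes, deadline handling, and geo-blocking are assumptions, not any
# platform's real code.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone


@dataclass
class Post:
    post_id: str
    blocked_regions: set[str] = field(default_factory=set)


@dataclass
class NetzdgReport:
    post: Post
    reported_at: datetime

    def review_deadline(self, manifestly_unlawful: bool) -> datetime:
        # NetzDG's headline deadlines: 24 hours for "manifestly unlawful"
        # content, 7 days for other unlawful content.
        window = timedelta(hours=24) if manifestly_unlawful else timedelta(days=7)
        return self.reported_at + window


def handle_report(report: NetzdgReport) -> None:
    # Conservative default: hide the post in Germany pending review rather
    # than risk blowing the statutory deadline.
    report.post.blocked_regions.add("DE")


def is_visible(post: Post, viewer_region: str) -> bool:
    return viewer_region not in post.blocked_regions


post = Post("abc123")
handle_report(NetzdgReport(post, datetime.now(timezone.utc)))
assert not is_visible(post, "DE")
assert is_visible(post, "US")
```

The notable part is that nothing here requires banning an account globally, which is presumably how platforms square German law with a hands-off policy everywhere else.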
Is that accurate, that there's no promotion? When I poked at Substack recently, it did suggest other newsletters to me, and one person in the comments of Popehat's post claimed to have multiple "gender critical" newsletters pushed at them. (Which, as they also pointed out, usually come from people quite ideologically aligned with the Nazis.)
I'll correct what I wrote based on your information. I don't use the Substack app or browser page; I subscribe to exactly one paid Substack e-mail newsletter from Adam Tooze, and will be writing him to ask if he can relocate or otherwise protest.
No worries I am not super familiar with the service. I literally just started subscribing to free ones earlier this year. And I'm not totally sure how it all works.
That's a shame. I was thinking of creating a blog there. What is the best free alternative?
Best bet is self-hosting Ghost if you plan to monetize your writing to the general public. If you're seeking a blogging platform, I quite like WriteFreely in the Fediverse, but building reach is a very different process there.
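And if you do end up self-hosting Ghost, its Admin API makes scripted publishing pretty painless. A minimal sketch, assuming a Ghost 5.x install plus the requests and PyJWT packages; the site URL and Admin API key below are placeholders you'd replace with values from your own install's Integrations page:

```python
# Minimal sketch: create a draft post on a self-hosted Ghost instance via
# its Admin API. Assumes Ghost 5.x; SITE and ADMIN_KEY are placeholders.
import time

import jwt  # PyJWT
import requests

SITE = "https://blog.example.com"    # placeholder: your Ghost install
ADMIN_KEY = "KEY_ID:HEX_SECRET"      # placeholder: from Settings -> Integrations


def ghost_token(admin_key: str) -> str:
    """Build the short-lived JWT the Ghost Admin API expects."""
    key_id, secret = admin_key.split(":")
    now = int(time.time())
    return jwt.encode(
        {"iat": now, "exp": now + 300, "aud": "/admin/"},  # 5-minute lifetime
        bytes.fromhex(secret),
        algorithm="HS256",
        headers={"kid": key_id},
    )


# ?source=html tells Ghost to accept HTML for the post body.
resp = requests.post(
    f"{SITE}/ghost/api/admin/posts/?source=html",
    json={"posts": [{"title": "Hello from the API",
                     "html": "<p>Draft created via script.</p>",
                     "status": "draft"}]},
    headers={"Authorization": f"Ghost {ghost_token(ADMIN_KEY)}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["posts"][0]["url"])
```

The same token scheme works for the rest of the Admin API (pages, members, and so on), which I'd guess is what migration tooling generally builds on.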
I will say that if you plan on running a newsletter, it's probably worth paying for an established site so that they can deal with e-mail whitelisting management.
I would definitely not self-host anything, not a tech person. I guess there's always wordpress.com, blogger, etc.
There are paid versions of both Ghost (see Nazi adjacency problem mentioned above) and WriteFreely as well, but you asked for "free"... there's not much out there that I know of which offers adequate features (like newsletter mailing if desired) without charges.
Buttondown looks great (privacy first, no monetization of users), but again, the free tier is highly feature-limited.
Paragraph.xyz is the “web3 substack”
I would avoid it solely on the basis of not enabling any more of that crypto garbage, which is what web3 is.
WordPress is surprisingly ok
Unfortunate typo in the submission headline. If only they were "turning in" their Nazi welcome sign.
Whoops. Fixed.