Substack is removing some publications that express support for Nazis, the company said today
Link information
- Title: Substack says it will remove Nazi publications from the platform
- Author: Casey Newton
- Word count: 1212 words
Good on Platformer for pushing the issue, and Substack's response is at least favourable... but it feels like Substack is doing this because of the bad publicity rather than out of any sort of altruism. When this all blows over in, oh I don't know, a week from now, you'll see Substack go back to its lazy approach.
it's a good sign that peer pressure works, but yeah, I wouldn't let up on Substack over such a puny response.
From what I've seen this morning, it looks like they're kicking out 5 publications with fewer than 100 followers, while the literal dozens that remain, and actually make them money with the bulk of followers, will happily stay. This is nothing more than empty posturing, and is in fact a declaration: "We WANT to continue being the Nazi bar, they're some of our best customers!"
Not wanting to defend Substack, but maybe the bigger ones are smart enough to not openly call for violence?
Promotion of white supremacy, promotion of racism, is inherently violent even if the people doing it don't say "Kill these people", and so it should always be removed even if it's dressed up in scientific racist language.
Substack are not doing this, and it's bad that they're not doing it.
While I can agree with this in theory, I think OP's point was more towards the idea of using clearer legal standards. You can, legally, stand on a public street and talk about how you think the Nazis were right so long as you don't call for violence. OP seems to be suggesting that Substack is applying the same standard.
I have no idea if this is the case, but seems worth at least identifying.
...in the US. But in the US it's not even strictly illegal to call for violence as long as it isn't an imminent threat (and imminent does a lot of work there, I doubt even the vilest shit on substack qualifies). This has never been about any concrete legal standard imo
I agree with everything you're saying. Just pointing out it might be the standard they're using to curate content.
Yeah, agree, them covering their bile with a civil facade makes little difference. Sorry I was unclear.
I deleted my previous comment because it looked like an attack, but I still have the feeling that this is quite subjective.
Unless you kick out all the Nazis, you are still a Nazi bar.
Nazis are evil people who advocated for genocide of the Jewish people and other groups they viewed unfavorably. There is no subjectiveness to this; no ifs or buts. They are a cancer to society and deserve no room to platform their horrible beliefs. If you identify as a member of a group that supports such things, you are a horrible person who deserves to be silenced.
Unfortunately this issue isn't helped by people using the word Nazi to mean, "people I don't agree with".
Quite literally from the last time this was discussed:
https://popehat.substack.com/p/substack-has-a-nazi-opportunity
So... now Nazi is "array of right-wing bigots and assholes". I've been called both, and I'm absolutely sure I'm not a Nazi. I'm also absolutely sure I'm not right wing or a bigot (I am an asshole, but I do try not to be).
It makes discussing this topic very difficult because there's a large difference between "anyone not ethnically pure should be enslaved and killed as they are less than human" and "I don't think people should have abortions", but there are absolutely people conflating the two.
So in reality there is a conversation about "should Substack allow Nazis", but people like the linked article are trying to piggyback off that into "and all these other people I don't like".
The second part is extremely concerning because this kind of wide reaching conglomeration of beliefs is, frankly, dumb and dangerous from a bunch of different directions.
I think that's a reductionist way to look at things. The thing you quoted is absolutely not "people I don't agree with". People who don't think universal healthcare or UBI are good ideas are one thing, but people who are looking to take away rights from women, queer people, and other minorities are direct threats to the way we live our life. We've seen countless times that arguing over terminology is a losing proposition. Listen to the intention of what people are saying; the words don't matter.
I generally agree with this sentiment, yet its vague brush also reinforces the previously mentioned "people I don't agree with" definition. People who are anti-abortion can absolutely be described as "a direct threat to the way we live our life" from a pretty common perspective, yet that covers (at least) literally hundreds of millions of people.
There has to be a way to accommodate speaking with that large portion of society without reducing every platform that doesn't instantly ban them to a "Nazi bar". You have to look at the actual behavior beyond that; otherwise it's a massive assumption of bad faith towards people who could barely have formed an opinion on the topic. Teenagers, or people just starting down the pipeline, likely shouldn't be immediately labelled and isolated from the only people who could help pull them back.
I think it's far too easy to assume the wrong intention behind someone's words. I've had it done to me on this very site, and it's something growing more pervasive online that's making more and more topics "very difficult" to discuss.
I understand and sympathize with what you're saying, but I also think that the distinction is effectively meaningless. I think that a lot of the people in the margins you describe are effectively already "poisoned" with the rhetoric of the extremists we are addressing, and will eventually succumb to it regardless of whether or not we point out that they have been poisoned.
In any case, these words aren't meant for an audience of Nazis. If you see this kind of rhetoric and assume that they are talking to you, you either actually are one of the extreme people they are talking about or you have self-selected yourself to become one.
But more than anything, arguing semantics just weakens the arguments. It's a recipe for losing, and losing these things has measurable negative effects.
I’ve been called a Nazi because I advocate for trans people competing in an open division instead of against cisgender women. This even while I support every other conceivable trans right: bathroom policies, gender-affirming care for children, low barriers to obtaining hormone therapy and surgery for adults, strong privacy rights for children with regard to parental notification, and so on.
I'm sorry that happened to you, but that's a very different context, so I'm not sure it's relevant to what we're talking about.
My point is, you can’t please everyone. One person’s hate speech can be another person’s reasonable position. Maybe some day in the future I will come to see my position as violating the rights of a vulnerable minority but right now I think I’m clearly and obviously in the right, while there are others who see what I say as intolerable hate speech.
And this is why I'm glad the first amendment protects the speech of these people in the US, as horrible as they are.
Because it demonstrates that it will also protect me.
That is not universally true, and should not be regarded as such.
It's true often enough for me that
is too subjective a threshold for me to be comfortable using to censor opinions.
No one is censoring opinions here. None of the quotes you've referred to have been deleted or removed.
Discussion and disagreement are not censorship. I wholeheartedly disagree with your stances here, but my refusal to agree with you is not the same as censorship.
But we digress. "One person's hate speech can be another person's reasonable position" is not universally true; even you agreed with that.
The entire point of this discussion is what threshold substack should be using to censor opinions.
Okay.
Are you willing to go to bat that Nazis can be considered as having a reasonable position?
I'm not here to defend Nazis. However, the comment I initially replied to was talking about
and that's not Nazis, and "taking away rights" is a pretty broad concept. There are legitimate policy disputes that absolutely fall under that description depending on how one characterizes rights. For instance, abortion is certainly in there, depending on how one characterizes the life and rights of a fetus, versus the life and rights of a living woman.
I think a reasonable line for platform level censorship should stay away from ephemeral concepts like rights and focus on concrete things like advocating genocide, or the targeted killing of specific individuals.
Right.
Like banning Nazis.
Which they aren't doing.
Like I said, I’m not here to defend Nazis. However, this discussion is veering into banning groups that I wouldn’t personally characterize as Nazis.
There’s a reason I personally find “right-wing authoritarian” to be a useful term. It doesn’t have the same zing as fascist, but it cuts through the semantics a lot better.
How so?
Define Nazis. Is it a supporter of the far-right totalitarian socio-political ideology and practices associated with Adolf Hitler and the Nazi Party in Germany? Or is it any racist? Or any critic of LGBT+ people?
In the context of this discussion, it is actual Nazis.
There are actual Nazis on substack.
https://www.theatlantic.com/ideas/archive/2023/11/substack-extremism-nazi-white-supremacy-newsletters/676156/
The previous comment indirectly defines Nazis as "Promotion of white supremacy, promotion of racism". I'm not sure that we should call any racist, any white-supremacy advocate, a Nazi.
Okay. But that's not what I'm pointing at.
Substack has actual Nazis on the platform, utilizing it for fundraising and promoting their message.
And I agree. I disagree with the previous comment, to which I replied.
Emphasis mine. I think it's worth noting that Substack is just reinterpreting its existing policies a bit more broadly to remove calls for violence. Regardless of where you fall on this free speech debate, removing content calling for violence is a legal requirement in many jurisdictions. My primary concern is still whether the Notes feature will end up promoting dangerous extremist content.
Yup, my read on this is that this move is to weather the current bad publicity storm and hope most people move on. They aren’t changing their guidelines, and “re-interpretation” at best renders the guideline meaningless. Substack is the latest Nazi bar.
As he often does, Freddie deBoer wrote a rant about writers I don't know and don't care about doing bad things. It's mostly pretty terrible so I won't share it top-level, but it does seem to have some interesting (though not very current) examples:
These Rules About Platforming Nazis Sure Seem Arbitrary and Incoherent! (Freddie deBoer)
...
...
As someone who's writing some open source software that anyone at all might use to publish whatever terrible things they like (or it wouldn't be open source), I'm glad people mostly don't blame the software developers, but I fear it won't last.
I think Substack crossed the line once they evolved into having active recommendations and pushing specific content at people. And of course directly allowing them to monetize. I'm not sure how much Wordpress.com does that, but I think there is a substantial difference between merely hosting and outright helping with profiteering and promotion. Casey Newton also wrote a bit about that a few days ago here.
Were they caught actively recommending Nazi blogs or handling payments for them? In the article, Newton seems to be arguing that it could happen:
Platformer is leaving Substack, in a blow to the company. Looks like their wet fart of a response was seen for what it was.
A reminder that Substack articles posted to Tildes can be filtered. All posts are tagged "substack".

Really needs a more accurate title. Substack is removing five Nazi blogs, not all of them.

I changed the topic title to the article lede, which is more accurate.