Looks like the controversy we previously discussed made the New Yorker. The author (Gideon Lewis-Kraus) attempts to explain what Slate Star Codex is all about and the history of the rationalist movement, which is quite a task.
I think he does a reasonable job and is trying to be fair. I have some criticisms, spots where an argument is summarized in a way that's a bit clumsy or unsympathetic, but for someone not previously familiar with Scott Alexander's work, I'd rather they started with this article than with RationalWiki.
A quibble with the headline: I suspect that "Silicon Valley" might be one of those terms that's too vague to be all that useful for analysis, and that "war" is the wrong metaphor, as it assumes there are easily distinguished sides.
This has also been covered in Reddit's /r/slatestarcodex https://old.reddit.com/r/slatestarcodex/comments/ho6g2b/slate_star_codex_and_silicon_valleys_war_against/ and the fellow splinter community /r/TheMotte https://old.reddit.com/r/TheMotte/comments/hoadqs/slate_star_codex_and_silicon_valleys_war_against/
It seems there are a lot of people in the slatestarcodex topic saying they think the article is pretty fair, while the people in the Motte topic seem more angry about it. (Some of each in both places, though.)
This article feels like two separate pieces have been smashed together with little care for how the paragraphs are distributed. There's the article that serves as a primer on what the rationalist community is and why people care about Scott Alexander / SSC, and there's the article that aims to critique major themes in Scott's writing and examine SSC's commentariat. I really wish the author had just split the two themes out here.
The title is also pretty orthogonal to the article: we lose (and come back to) the thread about Silicon Valley several times, and we lose the thread about a war against the media almost immediately.
And then the third article, which explores the perceived war going on between Silicon Valley and traditional media.
EDIT: This comment had three votes when I made a change at this point: I don't know if there's any intended malice, and I don't mean to comment on that, but I felt the article did a good job of pointing out the potential motivations of both sides of this specific culture war.
It is just astounding to me that a group that started around atheism, skepticism, and logic is now largely lumped in with a blend of evangelical fanatics, white supremacists, misogynists, homophobes, transphobes, islamophobes, and more.
I think a piece of this may have to do with the notion of "going to war with the allies you have." Various flavors of xenophobes and bigots found arguments in the "rationalist universe" that they could tailor to their purposes (I'd argue that it's easy to see why). Slowly members of the "xenophobic community/ies" become part of the rationalist one, ideas are exchanged, and perhaps one finds itself defended by the other when an idea is criticized. Over time the lines get blurrier.
A recent article on Bellingcat about the "Boogaloo movement" takes pains to discuss that the question of "is this a racist/white-nationalist movement?" remains...unsettled and ambiguous at best:
Reaction to [a series of racist posts] was not universal [...] The point here is not that the Boogaloo movement is wholly or authentically anti-racist, but that there appears to be a very active struggle within some parts of this movement as to whether or not their dreamed-of uprising will be based in bigotry.
But more specifically on the topic of finding support from unlikely allies, I'm consistently reminded of the story of Candace Owens. Today we know her as an associate of Charlie Kirk and one of the most prominent young, black pro-Trump pundits/activists/whatever. But in 2016 she was just a person on the internet who occasionally wrote anti-conservative (and anti-Trump) blog posts. Having suffered from bullying, Owens launched a Kickstarter for a website called "Social Autopsy", which would basically let you share examples of cyber-bullying, creating something of a directory of internet bullies.
Backlash to "Social Autopsy" was pretty swift, and people like Zoe Quinn and Randi Lee Harper—critics and targets both of the still-simmering Gamergate movement—were among the more vocal detractors. Quinn had already worked on several anti-bullying/harassment initiatives and her argument that this was a bad idea came from a place of genuine concern, but pretty soon the Gamergater enemies of Quinn et al came to Owens' defense. And so Owens went to war with the allies she had. Soon enough she was one of them, and the Owens we know today is, by her own admission, the product of this experience:
“I became a conservative overnight,” Owens said in 2017 on an online talk show hosted by Libertarian political personality Dave Rubin. “I realized that liberals were actually the racists. Liberals were actually the trolls.” [...]
“Social Autopsy is why I’m conservative,” Owens later explained. [source]
(An exhaustive and exhausting account of the drama was also written by Jesse Singal, who engaged in the controversy quite a bit on Twitter at the time. It's not worth the read, but I leave it here for reference.)
The author of the New Yorker story does a good job of delineating where the SSC/rationalist drama of the day intersects with broader American cultural tribalism and/or political entrenchment. My point here is that, insofar as the "grey tribe" doesn't fit neatly within the bounds of the "blue" one—even if it's purportedly something of a bastard child—we shouldn't be surprised when the grey takes on more aspects of the red ("the enemy of my enemy...").
I can tolerate anyone except the outgroup. I'd link it, but you know...
I also think it's funny how so many people who claim to follow rationality and logic are so quick to jump at baseless conspiracy theories like those listed here.
There's something of a fallacy around "logic and reason." If you convince yourself that you're a skeptic with infallible judgement, you actually become more prone to falling for conspiracies, cults, that sort of thing. In a way, being self-assured quickly turns to hubris. To be honest, I feel like knowing this concept is a bit of a risk, because it could be used to justify one's own superiority, furthering the trap, which in a way re-exemplifies the problem. As a side note, another axiom I'd take a whack at is: the louder somebody preaches something, the more likely they are to not actually practice it.
A lot of people I know, and know of, who buy into conspiracies start with what seems like innocent Socratic questioning, like the rational skeptic types. A simple "what if?" in regards to an observable truth, given the wrong justification, can cause a fair amount of damage. I have a couple of examples of these little grains of truth that led to greater conspiracies: 5G occupies a frequency band used to detect water, potentially damaging weather forecasting; we don't have the original tapes from the Moon landing; and researchers did create viruses in labs to study SARS and understand how it could be worse, and had been doing those experiments for a decade.
I think I've written something about this somewhere on reddit/tildes before but it's worth noting how conspiracy theorists often state their arguments in the language of skepticism, rigor, demands for proof/evidence, and refusal to be easily persuaded—only to abandon all of those principles when it comes to the overarching dogmas which drive their theories. They're indeed extremely skeptical, but not of the holes in their own theories.
As to why proponents of conspiracy theories can't/won't see/acknowledge this hypocrisy/fallacy—I think conspiracism is often about comfort and control.
On comfort: Things that people make conspiracies about are often world-changing events where the accepted explanation is simply too simple to live up to its complex consequences. If a single guy with a gun, working alone, can kill the president of the US and send the world into chaos, this suggests an extreme fragility of the state of the world. Perhaps we'd rather believe that anything with massive consequences must itself have been perpetrated on a massive scale—yes, the world has been changed, but it took a massive undertaking to do it.
On control: To be skeptical is to be mistrustful. Many explanations for massive world events come to us by way of our government (think the 9/11 Commission, the Warren Commission, etc.) or major media institutions. If you don't trust those institutions, then believing their explanations for big events is a tough pill to swallow. Believing in conspiracy theories often means crafting or building upon them, connecting them to others, and talking to fellow believers about it. This gives the conspiracist a sense of control over, and participation in, uncovering/building the narrative. I think this is what's so attractive about QAnon: "Q" actively talks to his followers (idk what they're called) and often gives them breadcrumbs and clues, telling them to investigate certain things, rather than relaying explicit statements or predictions.
edit: Weird, right after writing this I logged onto Twitter and saw a Tweet (thread) making the exact same argument about the appeal of QAnon.
I feel the same way. It seems like the people who remained after everyone else left went to an extreme on the intellectual/rational side, and I honestly think that turned into a double-edged sword. I always thought their obsession with "effective altruism" was the perfect example: removing compassion from the equation removes the primary reason most people donate to begin with. While, sure, Bill Gates fighting malaria has made the world a better place, he could have spent just 20% of his net worth to end homelessness in the United States.
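(A rough back-of-envelope on that last claim, since the arithmetic is easy to check. Both figures below are my own assumptions for illustration, not numbers from the article or this thread: roughly $110B for Gates' approximate mid-2020 net worth, and the oft-cited ~$20B/year estimate for housing the US homeless population.)

```python
# Back-of-envelope for the "20% of net worth" claim above.
# Both inputs are rough assumptions, not sourced figures.
gates_net_worth = 110e9  # ~US$110B, Gates' approximate mid-2020 net worth
annual_cost = 20e9       # oft-cited ~US$20B/year estimate to house the US homeless

share = annual_cost / gates_net_worth
print(f"One year of that spending: ~{share:.0%} of the net worth")
# -> One year of that spending: ~18% of the net worth
```

Under those assumptions the "just 20%" figure roughly holds for a single year of such spending; whether one year of spending would actually end homelessness is a much stronger claim.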
Yeah, that is about what I expected from the rationalist community.
Uh, no, wait. I'm not sure you have a clear idea of what the rationalist community is? It's a bit hard to tell who is in the rationalist community since they aren't checking memberships or anything, but as far as I can tell it's not these two. Their reactions are their own.
Anyone who's paid attention to Y Combinator and Hacker News knows who Paul Graham is. I've enjoyed reading some of his essays. I haven't heard of any ties between him and Scott Alexander, although they very probably have read each others' work so there is at least influence that way, as fellow writers. I haven't heard anything about Graham being a member of the rationalist community, and it seems more accurate to say that he's a founding member of the Y Combinator community? Or more broadly, Silicon Valley startup community. He has his own themes and catch-phrases that are different from the sort of thing you see on LessWrong.
I hadn't previously heard of Balaji Srinivasan. He's got a blue check so I guess some people know him, but I don't see any obvious ties to Scott Alexander or to the rationalist community. It's unsurprising that a random crypto person might read Slate Star Codex and have strong opinions.
I'm not sure I have all that clear an idea of the rationalist community either, but I have read some of the Sequences and I thought Harry Potter and the Methods of Rationality was often a great read (though over-long). I've tried to read LessWrong off and on, but much of the writing there doesn't interest me, so I just visit occasionally. This is one of the reasons I subscribed to Slate Star Codex instead: he can actually write, and I'd rather read the good stuff. I don't think of myself as a member of the rationalist community, since I don't think any of the more prominent members even know who I am, and I've never met anyone in person. I'm more of a fan, and I think there has been some influence on my thinking and writing.
So, I think of the actual rationalists as people who have posted on LessWrong sometimes, have read some of the Sequences, and have adopted some of their lingo, and have gone on to get to know each other in real life. There used to be courses sometimes, and there are other events.
There is some overlap between the rationalists and cryptocurrency, in that I've seen a few LessWrong posters congratulate themselves on noticing Bitcoin early and profiting from this. It's thought that this is the sort of thing that rationalist thinkers should be able to do, to notice unconventional trends early and actually act on that knowledge. But I wouldn't say that crypto is all that important to rationalists in general and I don't get the vibe that they're enthusiasts.
I haven't seen any overlap with Gamergate, or much video game discussion at all. There is certainly a theme that "social justice warriors" are often thought to be a menace, but this is more closely related to the idea that culture war discussions are tempting but not very useful, and best avoided. (Also, your attempt to draw ties between different people who have little to do with each other would itself be considered harmful to civil dialog and an example of what's wrong with that sort of discussion.)
More generally, the article sometimes has a tone of blaming Scott Alexander for his fans, and I think that's problematic. I think it's better to judge him by what he writes, and by how he uses his writing to try to influence what people do, while understanding that he has no real power over them beyond moderating his own forums. And wanting civility and niceness is not just a pose, it's a pretty core theme.
You're talking about a "flurry of harassment and hate" and I see evidence of people writing angry letters, canceling their New York Times subscriptions, and maybe deciding not to talk to their reporters. As protests go, this seems... pretty civilized? Isn't it the same thing that happens whenever The New York Times does something that people don't like? Or is it that doxing Scott Alexander isn't a legitimate thing to protest, in your view?
People aren't exactly taking to the streets. As far as I can tell, not much is happening, but if you have evidence that protest about Slate Star Codex is getting too rough, then sure, it would be worth hearing.
The Anil Dash post doesn't support your argument. That post is about taking responsibility for moderating your own website's comment section, and Scott Alexander does (did) a reasonable job of that for his own website. He is also a moderator of r/slatestarcodex, though there are other moderators as well and I don't know how active he is. He's not a moderator of r/themotte or r/sneerclub and doesn't have any say over what goes on there.
I don't see the paradox of tolerance as applying here either. This is a philosophical argument that unlimited tolerance cannot last because intolerant people will take over. But LessWrong and r/slatestarcodex are moderated forums and discussion is pretty well under control. (It's not true of Twitter, but then Twitter is out of control in all sorts of terrible ways.)
You say that you see normal people fleeing the rationalist/skeptic community. But that is not a single community, which might explain why we see different things. I don't think any further discussion of what various communities are like is worthwhile unless you actually say which forums you have in mind; it's just too vague to talk about otherwise.
I was going to take some time to see what's happening on Twitter and get some questions answered. (Like, who are these people?) But it's been a couple of days and I can't bring myself to do the research. Suffice it to say that people being vile on Twitter is pretty unsurprising, and you're right that it's naive not to expect this.
It seems like for any moderated forum, you can consider two different views of it. There is the moderated view, the one that readers want and the moderators aspire to give them, where people are mostly pretty polite, make reasonable arguments, and so on. Then there is the spambox, containing all of the vile crap that got filtered out or deleted. Anil Dash makes an argument for the moderated view: that we should not allow vile hate to fester in a website's comments section, and that we have a responsibility to get rid of it. And I pretty much stick with that, which is why I read moderated websites like Tildes and Hacker News, and my Twitter feed is pretty carefully curated. (Plus I use the realtwitter redirect trick.)
But that doesn't mean the unmoderated view doesn't exist. The spam and hate and unproductive flamewars still happened, even if you can't always see them. Even for Tildes, if the deleted topics and comments were saved on a separate webpage and we were able to judge Tildes by them, it wouldn't be a pretty picture. And of course when there is manual moderation, the moderators see the spambox view, the stuff regular readers never see. You might call this the moderator's-eye view. On the big sites with flagged posts there are huge teams of people who only see this view, and I understand that it can be traumatizing.
I will still insist that the moderated point of view is valid too, that it's okay to judge a community by its best writing and its best posts and decide that there is a lot of good in it.
The thing is, the vile stuff doesn't go away, and it can easily migrate elsewhere. On Twitter, everyone has effectively got their own blog and it's up to them to block anything in their replies that they don't like, and even if you block it, everyone else still sees it. Twitter has backup moderation, but it only goes so far.
I think as I just demonstrated, the people on Scott Alexander's website seem to be pretty mean and far from polite when it suits them.
Apparently some of Scott Alexander's readers are pretty hateful, but they aren't "people on Scott Alexander's website" when they do it on Twitter. I guess you could call them supporters.
Unfortunately, once something becomes a larger controversy it spreads all over the place and limiting discussion to moderated forums doesn't work. We can see this most clearly in politics, where if you judged Bernie Sanders by what his worst supporters do, or Andrew Yang by what his worst supporters do, then you would conclude that they are both terrible people. And I'm saying we shouldn't do that. (I think it would be difficult to judge whose supporters are worse on average since it would require statistics and arbitrary methodological choices.)
Also, to be clear, r/sneerclub is apparently a group for people who hate rationalists or just want to sneer at things in general, so that's definitely going to be a spambox view of things. You might compare it to the "Shit Hacker News Says" Twitter account. I don't read it and only heard of it recently.
The r/themotte group is kind of different in that the "culture war" thread used to be in r/slatestarcodex but Scott Alexander asked them to go away and cut ties - sort of a friendly divorce, making them the official outcasts, I guess.
I'm not sure what celebrities (or even just celebrities on the Internet) should do about the fact that some of their supporters probably will go into other people's forums and do nasty things? Condemn them harder? I guess we should be thankful that we're not celebrities. It seems like there is a transition, where people you never met become supporters and do nasty things in your name, and I hope I never get there.
I've never been "deep" into those communities but the current situation also saddens me since I believe they're generally promoting good ideals.
Their problem comes down to refusing to see that the real-world result of science- and logic-based content can be irrational. That seems ironic, but it's definitely true. You're creating a culture where anything that receives the "rational" seal of approval can no longer be questioned, because counter-arguments would be perceived as "irrational". And that's so easy to abuse.
I see a bit of an analogy with an argument against surveillance: while, in theory, it leads to more security, if someone corrupt gets behind that wall of security, that person now has a disproportionate amount of power. Large parts of the rationalist community got taken over by right-wing ideologues waving charts or statistics to support their politics, which look like they vaguely fit the requirements but fall apart under scrutiny (or at least have a deeply negative impact on society when bubbling up in the wrong context). Disproving a vaguely logical-sounding theory is hard, and I doubt a majority of people on those sites are professional scientists or experts in the fields they're commenting on. That opens the floodgates to right-wingers (or plain assholes) who just like the sound of "objectively true", and some random people who were just looking for a way to make sense of the world see the trend, can't find an obvious flaw in the arguments, and go along with it since it fits their ideal of "fact-based opinion".