Seriously... looking at the author's other articles immediately raises a ton of red flags, and makes me seriously doubt the veracity of this article (which similarly feels like it's trying to incite a witch-hunt):
Wikipedia’s “Supreme Court” Enforces Sweeping Ban on Pro-Hamas Edit Gang
WIRED’s UK Gang Rape Disinformation
How Soros-Backed Operatives Took Over Key Roles at Wikipedia
How CNN’s Failure in Syria Was Great for its Genocidal Dictator
How Wikipedia’s Pro-Hamas Editors Hijacked the Israel-Palestine Narrative
I'd doubt the veracity more if they didn't bring receipts that match my shifting moderation experiences on Reddit. I haven't seen many in-depth discussions explaining how grassroots astroturfing campaigns are organized and executed.
What receipts? The handful of screenshots from the subreddit Discord group, none of which seemed particularly damning, AFAICT? I also don't remember seeing any actual evidence directly linking RNN to the named moderators, though I could have missed that, I suppose. I'd go back and reread it all, but I've reached the site's free view limit, so the paywall now prevents me from doing so, and archive.is doesn't work on the site.
I'm mostly referring to the content going from RNN or Samidoun to the subreddits. I don't think being part of a Discord server is itself damning, but it's a little different when said moderators are directing brigading campaigns from the server.
The moderators don't have to post the content to shape the narrative. Though in at least one case, a mod has used content from Samidoun:
In December, after Samidoun was designated a terror entity, an account tied to the network posted an image from a rally organized by Samidoun in Brazil with the title “Free Palestine!” and the caption “Today in São Paulo BR.” A post like this would normally be an unambiguous example of free expression if it weren’t for the fact that Samidoun is a front organization for the PFLP designed to fundraise from a Western audience. In another example, one of the network’s highly influential moderators, u/Sabbah, directly sources Samidoun content about a demonstration it organized in New York City in a r/Palestine post.
shifting moderation experiences on Reddit.
This is one of the (many) things that drove me away from reddit. In the olden days, it seemed like deleted posts were rare and included an explanatory comment from the mod. Then they became common, then ubiquitous. If I left the front page cached and tried to look at stories hours later, the submissions would be removed by a moderator half the time. Never with an explanation, but I'm guessing it's because they were reported as bots.
In any case, there's no transparency, which makes me feel like their moderation is not in my best interest (or if it is, it's only when their own best interests tend to overlap with mine).
I can't speak for every mod of course, but for me personally, it's just easier that way, to not leave a comment after a removal. Because leaving a comment, which of course notifies the commenter like any other comment, tended to invite abuse. When I first started modding, and for a long time after, I would always leave a comment explaining which rule was broken.
But eventually people started arguing back. Even when the removal was 100% a clear rule violation. Sometimes I'd only leave a warning, not even removing the comment. But that would even get people riled up. Sometimes enough to start abusing me and my fellow mods via modmail.
So I started to just remove comments, lock threads, and even ban, without saying anything. I know from a user perspective that's annoying. I've been on the receiving end a few times on other subs. But I was simply an unpaid volunteer. No one was paying me to deal with the lottery of users' bad behavior (well... full disclosure, I was able to get some reddit shares; still not enough to cover the hours and hours over the years of modding that I did).
I love how this is in equal parts showing how consent can indeed be manufactured, but also malding over the fact that a bunch of terminally online far leftists with entirely too much time are outdoing state-sponsored Hasbara on the propaganda front.
I also love that, every time, the issue is that they're promoting views that may have been shared by "US-designated terrorist organizations" while refusing to engage with the actual substance of those views.
The promotion of content from foreign terror organizations on Reddit, including by top moderators, raises serious legal concerns. U.S. “material support” laws prohibit aiding terror entities, including spreading propaganda
Citation needed. That would be a first amendment disaster and many people would have been in jail by now for expressing support for Hamas on X and similar websites.
all performed in service of radical ideologies that seek nothing more than totalitarian control of how we think and what we think about.
EDIT: If there's one thing you can gain from this article, it's that, once again, nobody is immune to propaganda. If the zeitgeist can be controlled, it can certainly influence your own opinions. Keep that in mind and consume things critically.
For me, the ideological bent of the article is secondary to the methods discussed. I agree with you that nobody is immune to propaganda and manufactured consent. In this specific case, it's not just one side of the political spectrum either. Some surprising coalitions can come together to artificially promote content that pushes particular views. Even with a better search engine, it's getting harder to find honest, organic discussions online.
I can agree with that, but sadly I think that's just the nature of online spaces that grow big. People will compete for influence on those spaces, whether it's a billionaire buying a website he constantly gets roasted on or a bunch of coordinated netizens with mod powers on a Discord server. Here's the Harris-Walz campaign doing the same thing, reported on by an equally unsavory source: https://thefederalist.com/2024/10/29/busted-the-inside-story-of-how-the-kamala-harris-campaign-manipulates-reddit-and-breaks-the-rules-to-control-the-platform/
Yeeeeah, I recall that happening contemporaneously. Even with a coordinated astroturfing campaign, some catch on and others don't. I guess that's how it goes with marketing.
Even with the article’s slant, this looks like young Western activists coordinating, mostly because they are primarily communicating via Discord, something a Middle East state-sponsored initiative would never do.
This article dances around a pretty critical conclusion, though. If this is something online activists can do effectively, I guarantee you there are state-sponsored groups trying to do the same thing. You do not need to censor speech if you can algorithmically silence it, and control over ideas is an authoritarian golden goose.
I stay off of Reddit, Twitter, Facebook, and the like. Articles like this one convince me that the most reasonable future for online discourse will be a combination of well-known, deanonymized public figures with microphones and invite-only, anonymous communities like Tildes.
You cannot let anonymous actors coordinate to algorithmically manipulate information without leaving the free exchange of ideas ripe for manipulation. At least news organizations have known figureheads who risk reputational damage with their words. Random anonymous accounts have nothing to lose.
On Tildes, even though it’s anonymous, the small size means I get a sense of who the regular commenters are and what their perspectives are. People still risk reputational damage, albeit to a lesser extent. There are still avenues for exploitation, but this is why you need a benevolent dictator (and the site’s reputation depends on their behavior).
I’m not exactly sure how to construct a healthy system of discourse without some trust factor.
Pirate Wires is not a source I'd usually cite, but this is some in-depth investigative journalism with receipts for every claim. Regardless of anyone's personal views (seriously, I don't want another discussion on I/P), I think this article goes a long way towards explaining some of the changes in Reddit's hivemind feel over the past few years.
While astroturfing exploded around 2016 with Russia's social media campaigns, this kind of ideological moderation designed to create echo chambers is more recent. Powermods with an ideology are nothing new, but the sheer level of coordination is something different. The internet feels worse with the proliferation of ideological echo chambers artificially promoting talking points. I personally want moderation that's favourable to civil discussions, not heavy-handed ideological purists with an agenda.
The article is total propaganda, as is much of that site.
Ah yes, I love me some projection to top it all off. https://www.hrw.org/report/2023/12/21/metas-broken-promises/systemic-censorship-palestine-content-instagram-and
What a load of crock.