Reddit is testing warnings when submitting to highly-moderated subreddits and encouraging users to post in other subreddits
Link information
This data is scraped automatically and may be incorrect.
- Title: Reddit tests showing warnings before posting on highly moderated subreddits
- Authors: Aamir Siddiqui
- Published: Aug 27 2019
- Word count: 578 words
There seems to be a somewhat reasonable idea hiding underneath this—attempting to warn users that they should make sure they're posting in an appropriate subreddit—but it's been implemented in a really bad and tone-deaf way. There are so many better ways that they could have tried to get that kind of information across without basically framing it as "the mods of this subreddit are jerks that will probably remove your post, you should go somewhere else".
There are a lot of additions/tests lately that seem very poorly thought-out and will almost surely end up causing more issues than they solve (see also today: giving mods the ability to send a message to all subscribers that comes through as an admin-labeled message). I think this warning in particular is a good example of a change motivated by a conflict that I expect we're going to see more and more: reddit needs to maximize their growth and engagement metrics, but moderation can hurt those.
Specifically, it's a really discouraging experience for a new user to have their first attempts at posting removed, especially if it happens silently (the default) and the user just thinks their posts were totally ignored. If new users don't keep posting, reddit's growth will die out. I guarantee they're worried about this, and they've already implied they're going to remove mods' abilities to set karma minimums with AutoModerator (that's the VP of Product), because those impact all new users and a lot of subreddits use them as a barrier for trolls/spammers.
The problem is that they're really going about trying to fix this the wrong way. Mods don't remove a huge number of posts and set karma minimums because they want to do those things. They do it because the tools to inform users of the rules before posting and to block trolls/throwaways/spammers are almost totally nonexistent, so they have to. reddit should be addressing this by giving mods the tools they need so that the new-user-unfriendly methods aren't necessary any more, not further hobbling the few tools they do have and putting big scary warnings on their subreddits for moderating "too much".
Have they never checked /about/spam/ on ANY major sub before? Do they not consider that most of the large subs this will impact the most have at least one programming-savvy person on their mod teams who can replicate that with their own bots? 'Shortsighted' is putting it mildly.
Screenshot of the warning in action. This is such a bad faith feature. They should inform users that they are submitting to a highly-moderated subreddit and display the submission rules, not encourage the users to go elsewhere.
Not to mention this sort of thing could be weaponized, so to speak. I could see one community brigading another community with inflammatory posts that mods are forced to remove, with the intention of eliciting the notification to post in other subs.
Yeah it's painting the moderation as if it's a bad thing. It's not.
Admin response in the /r/ModSupport thread about it: https://www.reddit.com/r/ModSupport/comments/cwmqnj/this_community_has_a_medium_post_removal_rate/eyddsoo/
But from the warning they give:
I don't see how an automatic "consider posting elsewhere" message isn't consistent with trying to move members and posts into other communities. If this really is about helping users follow subreddit rules better (and reducing the strain on mods) then this seems like a terrible way of going about it. My guess: high engagement is better for Reddit's business goals and lots of low effort comments is an easier way to drive that, and users getting their posts deleted in places like /r/AskHistorians risks driving away those high engagement low effort posters.
I know that admin comment blamed it on bad copy that's being changed but that just seems like a cop-out. I'm not sure how that warning could possibly be interpreted as anything but an effort to undermine moderators of communities with even modest anti-spam and trolling measures in place.
File this under "Reddit doesn't understand Reddit" and add it to the pile. What is it with them and Twitter, being so off-base on what makes their highly-successful platform highly-successful?
I'd imagine that they would like it to be very successful in a very particular and monetizable way, and are willing to overlook problematic elements of the tools and the userbase in order to focus on how it could be made so much better.
It's typically only the customer service team that's trying their darnedest to listen to and communicate customer needs higher up the chain (assuming the team isn't outsourced/isn't working remotely, which typically makes it become just another number to the company), and I don't think either Twitter or Reddit have customer service teams. So you just kind of get this unfiltered spitball of ideas making their way out the door.
People seem to not realise that this is an A/B test. Reddit developers love their A/B tests! They love trying things out in random ways on their ~~guinea pigs~~ users. (I hated having to explain this to users who suddenly found a new feature that no one else seemed to know about.)

In this case, it's an A/B/C test. Selected users trying to post to high-moderation subreddits will be randomly directed to one of three outcomes:
A = no message
B = message saying "This community has a high post removal rate, please read the community rules"
C = message saying "This community has a high post removal rate, consider selecting a similar community with a lower post removal rate"
(Read the block of code in the middle of the article.)
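The article's code isn't reproduced here, but this kind of A/B/C split is usually done with deterministic bucketing: hash the user ID plus an experiment name into one of the arms, so each user consistently sees the same variant without storing any per-user state. A minimal sketch (the variant labels and function names are my own, not Reddit's):

```python
import hashlib

# Hypothetical labels for the three outcomes described above.
VARIANTS = ["control_no_message", "warning_read_rules", "warning_post_elsewhere"]

def assign_variant(user_id: str, experiment: str = "post_removal_warning") -> str:
    """Deterministically bucket a user into one of the three test arms.

    Hashing (experiment name + user ID) keeps each user in the same arm
    across sessions, and different experiments bucket independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

# The same user always lands in the same arm of the same experiment.
assert assign_variant("user123") == assign_variant("user123")
```

With a roughly uniform hash, each arm receives about a third of exposed users, which is what makes comparing the arms' posting behaviour meaningful.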
The admin quoted in the article even says this:
And, they've already conceded they made a mistake:
In other words, Reddit is trying out a couple of different approaches to see what works. They do this all the time. It's just a test. And this particular part of this particular test failed.
CC: @Deimos
I think we all get that it's a test, the issue here is that the "consider these other communities instead" suggestion is so incredibly bad that it's amazing they even bothered to test it. It's yet another little peek into how backwards Reddit's understanding of their own platform actually is.
The people in this thread don't seem to get that. They seem to be reacting as if this is an actual feature which has been fully implemented.
I know that it's a test, but the only reason to A/B test something is because you're considering implementing it. The test results become worthless if they change anything about it significantly after the test, so it needs to be in the form they're considering implementing during the test. They started testing this, which means they believed it was in a state that could be implemented if the test went well.
We know it's a test, but they still chose a completely tone deaf way of implementing it. That's what's being criticized. The idea itself isn't inherently a bad one.
I appear to have been misled by remarks such as "This is such a bad faith feature." and "it's been implemented".
Well the comment describing it as a bad faith feature was from the submitter, and they reported it as a test in their title. And Deimos mentioned it as being implemented but also described it as an "addition/test". So I think it's all well understood.
edit: Corrected myself.
It looks like they've added this warning to the redesign now too, still using the exact same wording that they admitted was bad and promised to change.
On one side, I feel this might be unfair to overworked moderators, and that Reddit is fixing one mistake with another. On the other side, I feel vindicated for trying to post quality and appropriate content in subs with stringent, excessive or vague rules that seem to be applied in a random and subjective manner.
I am curious to know if this as well as anything else Reddit is trying to or will be implementing is another way for them to circumvent mods.
I am thinking about Blackout 2015. I remember multiple subreddits joining together and setting themselves to private as a protest against the admins (I am simplifying here). I could be mistaken, but I also remember reading elsewhere that many of the mods were warned/threatened that should anything like that ever happen again, the admins would forcibly remove all participating mods and replace them with other people. Something like that would obviously also upset a lot of users, but if the admins just quarantine subreddits or encourage users to participate in other subreddits, it sort of gets around that.
This is really something I'd need to see evidence of before believing. It doesn't fit the admin's prior M.O. where they let mods run almost entirely unleashed.
Thought my memory was completely making things up at first, but I managed to find where I had heard that. It turns out I was slightly incorrect though.
From here by dubteedub: https://tildes.net/~tech/c3v/r_changemyview_moderators_announce_that_they_are_launching_their_own_site_at_changeaview_com#comment-2zhz
I'm not privy to moderator discussions, so I cannot verify if it is simply conjecture among the mods that participated in the protest or if they actually have some messages implying something like that would happen.
Thanks for sharing your source.
I didn't participate in the protest, though I do follow ModTalk occasionally. However I suspect that most of this discussion would have taken place in the super-secret, default-only discussion chats. (I understand they use Slack today)
I actually like this feature, even though it is probably possible to improve it.
Moderation is a good thing, but removing posts is not an end in itself, and moderators need to be held accountable somehow. Telling users how selective the sub they're targeting is also has the potential to drastically reduce angry users and misunderstandings.
See also https://old.reddit.com/r/RedditMinusMods
What you see on reddit is almost 80% not what happens organically on reddit.
https://old.reddit.com/r/technology/comments/apu3oz/with_the_recent_chinese_company_tencent_in_the describes pretty well how the arbitrary censorship works on reddit.
I see you just registered - if you're on Tildes looking for a site without that kind of "censorship" (moderation is not censorship), you're probably not going to be happy here. This is not an "anything goes" site by any means.
Sure. I don't think these things are a dichotomy. I think HN is slowly drifting away from long-form intellectual discussions (I don't think they're trying very hard to make that their mission like you seem to be), but with the rules they do try to enforce (like no ad hominem or calling people shills), I think they do a slightly cleaner job than Reddit, in that they do so publicly and while citing specific rules.
The point I was making was wrt the arbitrary part rather than suggesting anything goes.
What happens organically on Reddit is spam. And more spam. And lots of subreddit-rule-breaking submissions. Subreddits were created to allow for topic-specific communities and moderators were empowered to run them as they see fit (within the sitewide Reddit rules). The concept of "organic Reddit" died the day subreddits were created. If you don't like what you see in your feed, it's very straightforward to subscribe to the subreddits you support.
Anyone that believes that moderation on Reddit changed after receiving the Tencent investment has been reading too much r/conspiracy and r/WatchRedditDie. Full stop.
The OP of that submission is a particularly egregious user who aggressively modmails to demand his comments and submissions be restored based on his personal interpretation of subreddit rules. He frequently tries to claim that his revolutionary experiences with self-performed human fecal transplants are a medical miracle being suppressed by the academic and medical elite of Reddit.
It sounds like you and that poster are making the same points. I'm not sure which points you're arguing against.
Not sure whether his beliefs on fecal transplants validate or invalidate his description of the content control model of subreddits.
Is it really censorship, though? To me, the intent of censorship is a concerted effort by a government to prevent citizens from exercising their freedom of speech. To me, moderation is not censorship. It is community curation to keep content not in keeping with the intents of a community off that community. By that definition, Apple engages in censorship when it prevents poorly built and crafted apps from being on their app store.
At least in my view, the term "censorship" is a loaded, emotive word often used in completely incorrect contexts by those who have a vested interest in bad faith arguments.
What you say is correct in most cases. But lately, I started to think that, even though corporations are definitely distinct from governments, monopolies and oligopolies can sometimes function as de facto censors because they actually possess the power to silence political dissent.
And I believe they do.
But, as you exemplified very well, in most cases that is not what is going on.
And yet you are free to create your own self-hosted website for any opinion you might want to hold, and nobody will stop you from doing so.
It seems to me like you are requiring private organizations to provide you a platform where you can freely spread any idea to a very specific and very vast audience, which is simply unreasonable.
What I'm trying to say is that companies should be limited in their freedom to suppress dissent through the creation of monopolies and oligopolies. Private entities can and do have a nefarious influence on democracy, and there must be sanctions against their power to coerce public opinion. IMHO, ignoring that is either irresponsible or naive.
It's interesting how this thought seems to have only become prevalent with the internet, rather than the near Monopoly of entertainment apparatuses we have in other media.
I'd say that the internet gave birth to a whole other level of ~~monopoly~~ oligopoly. Never before have we had such a small number of companies directly controlling the media on a global level.

And yet conservatives seem to only care about Facebook and Google, who are not directly controlled by any media outlet or moneyed interest but themselves.
They definitely care a lot about Twitter too.
I should have generalized it to "social media". 😜
Well, no other media feels that much like a two-way street. It's unreasonable to demand to be given a platform in film or print as a lone individual, but on the internet (or any social media site therein), a two way road and therefore expression is much more feasible.
From a different angle: The internet replaced a lot of in-person communication, and now that communication is being censored. That doesn't sound as harmless as a newspaper refusing to print your opinion.
But it is. You're always welcome to use Telegram for secure chat or even roll your own website to publish your thoughts.
Sure. Terminologies are tricky. I think reddit is somewhere in between. I don't think spez is sitting behind censors' monitors trying to cherrypick a narrative, nor do I think there are benevolent wise masters moderating us to become better versions of ourselves.
I just think it's problematic for a small group of individuals to be given power to control a large group, with little mechanism for accountability and transparency, while operating in semi-covert fashion. I don't think that's very dogmatic.