Overmoderation?
How does everyone feel about...the other platform and overmoderation? It seems like a lot of content and comments get removed due to borderline rule infractions and/or hair-trigger enforcement.
Wondering how you all feel about that and what it means for this platform. Is it an inevitable outcome of a large community, or did the system itself lead to such an outcome?
Or is my perception way off?
Yeah, I've never heard a serious argument for "over"-moderation on reddit other than, "I posted this random link and mods removed it because it broke some boring rule!" Meanwhile, subreddits turn to absolute shit if mods aren't actively enforcing rules.
I've run into a few subs that were over-moderated by power-tripping mods. /r/MLB had that problem, I believe, which led to the rise in popularity of /r/baseball.
I think it's a pretty common experience to see numerous threads on reddit unnecessarily locked by mods that clearly just wanted to feel powerful. The point of having moderators is to moderate the discussion, not shut it down. The obvious response to seeing some objectionable content in an otherwise productive discussion is to remove the comments and penalize the posters, but it absolutely seems like the default response across most of reddit has become to lock the entire thread because "you guys can't play nice".
It's a sort of overmoderation and undermoderation at once - an abuse of moderator powers for the purpose of gratifying the moderator rather than actually bettering the community.
Alternatively, they've removed about a hundred comments saying that celebrity X who hanged themselves 'was an inconsiderate asshole (he left his children behind, that fuck!)' and 'people who commit suicide are vile filth', and just decided that the thread was spiraling out of control and wasn't worth saving.
It goes both ways :/
I don't think you've ever been to /r/bitcoin then :)
Although arguably it isn't really overmoderation but rather outright censorship.
An important thing to understand about modern toxicity and trolling is how organized it is. Not organized in the sense of coordinated campaigns (though there is that), but organized in the sense that bad behavior can be supported by entire communities and become baked into community norms that wide swaths of people agree on.
I remember a case on reddit where a girl shaved her head to raise money for charity, was subjected to a massive harassment campaign and accusations of fakery. When they learned that it wasn't fake, instead of taking any responsibility for it, the community converged on the lesson that "you shouldn't be trying to raise money or posting personal info." This wasn't a norm at the time on reddit, but it has become one, in part because it served as an escape hatch that let people avoid taking responsibility for harassment and instead accept it as an inevitable norm.
In terms of moderation, there will be important points in the history of any budding online community where it faces choices about what kind of place it wants to be. Even reasonable efforts at moderation with legitimate purposes will get politicized, escalated to issues of free speech, censorship, and tyranny, and there will be no way to have a reasonable debate over reasonable cases.
I think the fix has to be twofold. One part is just trying by whatever means we have to evolve a healthier community. Another part is to recognize that these kinds of debates will flare up and people, even swaths of self-reinforcing communities, will become enraged even with reasonable moderation. Dealing with toxicity on the modern internet means that sometimes you simply have to face them down and tell them they're wrong, and not let them dictate what the culture's going to be.
I have to agree on this one. I used to be a mod for a niche but fast-growing subreddit on the other site for a while, and a big part of the reason I quit was that we couldn't even stick to our own rules out of a constant fear that we would be perceived as too strict and overbearing. Whenever I called for actually enforcing our rules, other mods would go 'well, but...'. Meanwhile we had trolls running around the sub, spreading weird conspiracy theories about the mods, calling people names, and crying about censorship whenever one of their posts got deleted.
I don't know, maybe I'm too old school, but in every other online community I was a part of (forums, MMO guilds, etc.), if a person couldn't respectfully (not unquestioningly - constructive criticism is fine) participate in a community, they were banned and the community was better for it. Now everyone seems to think this is way too harsh.
Has there been any discussion on topics that could devolve into what incels and T_D became?
On the opposite end of the spectrum, I don't want another r/politics. The top-rated comments on many of the posts on there are some variation of "Traitor!" and "Fuck That Guy!". Even though it might feel good knowing that everyone else feels like blasting obscenities at the person you don't like, it absolutely adds nothing to the conversation. The reason I want to get away from Reddit is that when I walk into the real world, those types of thoughts permeate my mind when I'm having conversations with people. I might not say those things out loud, but I feel like people are being unreasonable if they don't share the same attitude. My fault for being easily manipulated by groupthink, I guess, but at least I'm trying to get away from it.
I almost wouldn't mind extremely heavy handed moderation in that regard. One sentence declarations of opinions about people should be wiped out.
I do think many people, including myself, will agree with that :)
Right there with you on that - and it's not just things on the political spectrum. There is something very satisfying about righteous indignation, and the internet has gotten incredibly efficient at amplifying minor local news and casting it as a major player in "the culture war" (which, as a case in point, is that even a thing?). I've become exhausted with outrage porn. It tends to be based on a limited view of the facts and obscures real issues that take more nuance to discuss. It's not healthy, and I think almost everyone is susceptible to it. Rather than try to figure out if it's OK in moderation, I think it may be best to try to curate the conversation and help all of us keep things in perspective.
On the question of overmoderation, I don't fall into the camp that holds that every opinion deserves to be heard or that every pun deserves to be read. Everyone deserves to be treated with the respect they are willing to show, but if a position is not backed up by solid reasons, or a comment is just part of the same "I did nazi that coming" threads, then I think I'm ok without them.
Any discussion of prevention? I'm not comfortable with censorship, especially political censorship, but I don't like T_D's level of shit.
Well spoken!
I couldn't agree more. I firmly believe in the right to have opinions, and be free to share those opinions without fear of government penalty... But when your speech is hurting others, we as a society have an obligation to shut that shit down.
I think the problem is that you can just go make a new subreddit. You've got shitty mods who don't do their job and then shitty mods making new subreddits. There are so many subreddits that a lot of mods are shitty. It's a never-ending loop of shit.
I probably fall into the category of moderators that you think overmoderate, but IMO, we have to overmoderate to make up for the lack of administrative tools/features on reddit, especially given the overall size. I hate it too. All of our filters are hacked together to fill the gaps, and it becomes insanely frustrating to users. We actually just relaxed a couple of filters in one of my subs (/r/listentothis) for that very reason.
I hope that ~ evolves into a place where a user can confirm compliance with all posting rules before actually submitting. On reddit, it's a shoot-first, ask-questions-later model, which is further complicated by their anti-spam rate limiting. That said, I always try to moderate with common sense and by weighing the benefits of a decision. Say a post with popular breaking news hits the front page but has a small rule infraction - I'd leave it up, then comment explaining to users why it wasn't removed. Since the site's algorithm could prevent it from reaching the front page after the user corrects the error and resubmits, it's worth preserving the original. Then my comment in the thread is a great opportunity to educate the userbase and reiterate our rules.
Man, this exactly. Part of the reason "overmoderation" is a problem is that Reddit's tools are simultaneously too weak and too blunt to moderate effectively. Let's say that a subreddit wants to enforce quality titles. Titles are important and color the way a post is received and discussed. Whelp, too bad: a moderator's only recourse is to remove the post and ask OP to resubmit, which is inevitably seen as being heavy-handed.
Reddit's mod tools are built to stop trolls, not educate users or foster discussion. At the same time, they're intentionally weak because moderators have no human oversight. The tools over there don't contain many affordances that assume users are willing to learn and mods are willing to teach, both acting in good faith, and the lack of said affordances is a self-reinforcing cycle.
I think this is a great way of putting it. There will be some complex issues where excess of moderation is tangled together with the legitimate and pressing need to curb toxicity. Spending the effort to develop sophisticated tools equal to the complexity of the problem (which imo nobody on the face of the earth has done yet), or spending the time and energy to go over all the granular special cases to give them the merit they, in some sense, "deserve", can quickly leave you buried.
I'm no fan of overzealous moderation, but I think one positive change to community norms would be increased tolerance of those excesses when they occur in the context of policing toxic behavior. To me that's a special case where our tolerance for moderation mistakes needs to be at its highest. My go-to example of this is the context of reddit banning /r/fatpeoplehate, which caused an eruption of people not concerned about toxicity, but extremely concerned about the technical consistency of how moderation was enforced. Instead, the response should have been "oh, well, we understand there was a serious problem here, and that context makes the mod actions more understandable." Meanwhile, where there's no context of curbing toxicity (such as overzealous mods at, say, /r/MLB), there doesn't need to be as much latitude or tolerance for the excesses of overzealous mods.
This is all from my experience as a moderator of two former "defaults".
To really understand this, you have to understand the following:
Being a moderator of a default is essentially being an unpaid help-desk employee volunteering for an employer who actively ignores any of the ideas you have to make both your and the users' experience better... for years at a time... and then, when the employer actually decides to make a change, it usually actively breaks a lot of things that you, the moderator, count on to make your miserable existence moderately tolerable.
All the while you have massive, RAMPANT amounts of spam and self-promotion to deal with, and no real tools with which to do so - a "workload" that would require hundreds of people volunteering their time to properly deal with (assuming nobody wants to make an unpaid job out of moderation!) - and users who generally think you're a piece of shit just trying to keep them down.
All the while, you really just wanted to talk to some people about music.
Hence, it's a lot easier and less time-consuming to just nuke anything in the mod-queue that looks questionable.
So the "strategies" to fix this are as follows:
Recruit those hundreds of people (e.g., /r/science). This can work, but then moderation becomes a whole different task, more like management - and there are now more missing tools in your life! Again, didn't I just want to listen to music?
Write bots that are artificially limited by the fact that they aren't running "at submission". Go this way, and you end up spending a whole bunch of your time writing bots only to realize that a) it sucks for the user that they have like 2-3 different bots that can remove their submission for different reasons and b) since the bots don't run at submission, there's no way to unify things.
The kicker is that 99% of this could have been avoided if you could have just made a change to the website itself. In my example, the literal scourge of my existence is users sending modmail complaining about automod not recognizing the title format (for /r/listentothis). Why do we have a title format? Well, not only does it cut out a bunch of extraneous info ("My friend's new single, I hope you love it as much as I do reddit!") from the titles, it's essential for our bots to recognize the artist and song submitted. Why do we need this? Well, we have to do lookups on external music databases to make sure that the users aren't spamming their own (sorry users) crappy bands all the time or reposting Kanye 24/7.
If we were able to make a custom submission form, or, even better, let mods define a regex-based title format that pulled info from external databases and populated it for you, a good 75-80% of our modmail would simply disappear. But on that other website, your requests are completely and utterly ignored.
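Just to illustrate the idea (in Python, with a made-up pattern - this is a sketch of the concept, not our actual automod config), an at-submission check could look something like this:

```python
import re

# Hypothetical at-submission validator, loosely modeled on an
# "Artist -- Track [genre]" title convention. The pattern and
# function name here are invented for illustration.
TITLE_PATTERN = re.compile(
    r"^(?P<artist>.+?)\s*--\s*(?P<track>.+?)\s*\[(?P<genre>[^\]]+)\]"
)

def validate_title(title: str) -> tuple[bool, str]:
    """Check a submission title before accepting it, instead of
    removing the post after the fact."""
    match = TITLE_PATTERN.match(title)
    if not match:
        return False, "Title must look like: Artist -- Track [genre]"
    # With artist/track parsed out, the site itself could run the
    # external-database lookup (self-promotion/repost check) here,
    # at submission time, rather than in after-the-fact bots.
    return True, f"ok: {match['artist']} / {match['track']}"

print(validate_title("My friend's new single, I hope you love it!"))  # rejected
print(validate_title("Aphex Twin -- Avril 14th [ambient]"))           # accepted
```

The user gets immediate, actionable feedback and never eats a removal, and the mods never see the modmail.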
Hence, I'm not worried about over-moderation on ~s. Simply freeing up huge amounts of volunteered time spent on thankless, repetitive tasks will immensely increase the quality of moderation. Adding a changelog-based system for moderator review should clean up most of the rest of the problems.
This is a really insightful look at being a mod and has changed my perspective on the matter, to be honest
A question - if modding is a tedious and thankless job, why do you do it?
The main reason is that I'm good friends with some of the people I mod with (sitting around and digging deep into obscure music tends to do that), and up until recently (i.e., before ~), quitting meant hanging them out to dry (who would fix the bots when/if they break?). We've done some things I'm quite proud of, and I wouldn't want to see all of that go up in flames.
I'm barely aware that mods exist in most subreddits that aren't called AskHistorians. Reddit to me seems to be a big mess of people shouting worthless half sentences out into the void that no one pays attention to unless you're spouting a meme that makes it to the top.
This isn't to shit on Reddit mods too much; many are doing the best that they can. But as a whole, the Reddit platform has so little moderator presence that it really hurts the site. This site, though not fully what I would personally prefer, is at least trying to maintain a moderator presence everywhere, and I appreciate that.
I certainly get the worthless half sentences into the void part
I don't generally have too hard of a time on Reddit. There are certain communities that tend to be overzealous in their moderation (cough politics cough), but most IME are fine. The main issue I see is inconsistency in applying moderation. The stereotypical example is banning trolls and inflammatory comments, but only enforcing it against dissenting viewpoints (such as conservative views on /r/politics). Make the moderation truly neutral (like /r/NeutralPolitics) and the communities tend to be pretty good.
I think the /r/politics team has it rough. Most moderating teams on reddit have too much content to review personally and end up relying too much on user reports. That means the /r/politics team, out of the gate, is relying on the largely conservative-hostile community to neutrally and accurately report rule-breaking content. Of course that doesn't happen. The people who frequent /r/politics often struggle to accurately read anything that dissents from their personal point of view. Almost anyone who posts a dissenting opinion gets downvoted massively; mods are no exception. Those sorts of posts and comments are also MUCH more likely to be reported than someone expressing a sentiment that agrees with the community consensus but which also breaks the rules. Mods will have to go out and find that sort of content manually, which takes a lot more time and effort.
So how do you fix a community that has learned to be assholes to one another? It's really, really tough to try to make a top-down solution to that sort of problem.
Downvotes are fine to me. Hey, it's the community; whatever. My issue is mods not fairly arbitrating between parties. I got a week-long ban there for responding to someone calling all folks in flyover country bigots with "Fuck off. Sincerely, flyover country." Somehow, my comment was inflammatory and deserved an instant ban with no warning, while the other was totally fine. Evidently, denigrating half of the country (that doesn't agree with your politics) as bigots is fine discourse. Was mine inflammatory? Sure. Would I have happily accepted at least a warning if the other party did too? Sure.
The problem was that I took it to the mod team, and they would have none of it. Asked why it was a ban and not a warning, and they wouldn't give a reason. Asked how the other comment could be seen as not inflammatory, and they didn't care. They absolutely selectively enforced their viewpoint. It isn't just a matter of what gets reported and what doesn't; it's a matter of what the individual mods there think deserves to stay.
Not to turn this into a rant about /r/politics (that week-long ban made me realize there was no reason to stay subbed; I unsubbed and have had a much more pleasant feed since) - it's just the "main" place I've had issues. For comparison, I've also run into trouble in /r/PoliticalDiscussion... but the mod team there was willing to engage me in a conversation about the comment thread. I agreed I was a little over the top and promised not to do it again, and they gave a warning to the other guy, too. So there are definitely good mod teams out there... /r/politics just isn't one of them.
Yeah I'm not trying to say that the team is even-handed in all or most things (I happen to think a lot of the way they've structured policy to be incredibly by-the-book lends itself to exactly the context-deaf enforcement you're complaining about). But I do think you end up with mod teams that reflect their communities in most cases (especially because it is the community that drives reports to the mod team), and the /r/politics community is really awful.
It's kind of infuriating when you try to participate in a sub and get a post or thread removed for very oddly specific rules. I understand that mods can run their subs as they please, but sometimes it just seems much too heavy-handed.
Echo chambers are also a large problem, on both sides of any debate. I've seen both ends of it and the sheer inability to even consider opposing arguments logically is abysmal. But that's just unfortunately how people are.
My first posts on Reddit fell into this trap of oddly specific rules, and I've been a lurker ever since. Being here is like a fresh start, and I'm not afraid to comment or what have you, which is refreshing, because in subreddits I always feel behind the trend or sort of lost. I don't know what bearing this has on the conversation about moderation, but I thought I'd say it.
I agree with you and it has everything to do with moderation. Subs that have a thousand flairs, title requirements, post and link requirements, allowed/disallowed content (outside of what would make sense to a normal user) all cut down on activity and make participating a nuisance.
I once posted something I thought was discussion-worthy on /r/android (after I had already learned questions should be directed elsewhere) and it was removed for not being "thought provoking" or whatever. So I just quit trying to participate there lol.
I think communities need to really decide how heavily they want to moderate, because overmoderation or undermoderation can really make or break a community.
I'm hoping that with tildes and its smaller, focused communities, less moderation will allow for more discussion and better "bubbling" of stuff to the top (for lurkers especially lol). I just hope every new individual can find themselves a home. I don't like tribalism like on T_D, but there is something nice about having a little corner that you can call your own, and that can influence and be influenced by other little corners with the advertised "bubbling up" of content.
Yeah, a well-moderated sub usually has a lot of good discourse, responses below the vote threshold, and few removed comments, in my opinion.
Agreed. But it's an art that is hard to perfect tbh. Especially when it's so easy to allow bias to guide your moderating decisions.
You don't know why things were removed, so it's hard to judge as a user, imo.
I'd say that's part of the problem. The user should always know why their post was removed.
The user, generally, yes; everyone else, not so much.
Reddit has a major issue with overmoderation, in my opinion. When you only have a hammer, every problem begins to look like a nail. /r/subredditcancer and /r/undelete are great repositories of examples of this sort of thing - some false positives of course, but generally speaking it's a widespread thing on any large subreddit. It gets worse when the people form irc channels or discord servers in private in which they badmouth users, and form a group identity that is opposed to the users or criticism of moderation. (I was even a part of that sort of thing when I was a moderator for a minecraft server - there were users with legitimate grievances (some were legitimately toxic trolls, but not all) and I painted them with a broad brush as being problematic. In retrospect I regret that tremendously.)
A great example is the explainlikeimfive subreddit. You are not allowed to post short explanations. A five-year-old (and most redditors) does not have a long attention span. Sometimes you don't need a novel to explain something. But if your response is too short, not only will automoderator remove it - if you attempt to circumvent it by putting in stuff to fluff it out, you will be banned for doing so by a human moderator, one who could have evaluated the original removal and determined that the comment was fine. Because the policy says no short submissions, the whole thing gets warped into something like a person with a pipe shoved through their kneecap still managing to walk around and go about their work.
The rule was made for a good reason initially, but the consequences are something completely different. Some moderators will rationalize and say that it's still good for xyz reasons, but I don't believe they are being objective.
Hell, you could look at the incident that prompted me to create /u/publicmodlogs - the /r/technology tesla fiasco. If that isn't an instance of overmoderation, I don't know what is.
The inverse sentiment, i.e., users disparaging mods or not understanding the differences between mods and admins (particularly, getting upset about things mods have zero control over), is also the bread and butter of reddit.
This is a great example that shows both the disconnect between the user and the moderator and how the tools available to a moderator are woefully inadequate to the task of their moderation.
Let's break it down and see how much you and I agree:
Let's say, for the sake of argument, that ELI5 originally banned short top-level comments because: a) they tend to be off-topic, jokes, or memes, b) they tend to be inadequate answers or c) a combination of both. In my experience, this holds pretty true but your experience may differ.
Clearly, the hammer-meets-nail approach is to simply whack any comment shorter than X characters. But clearly this has its own issues, namely that not every answer needs five paragraphs of explanation -- sometimes you could even just link to a previous answer, and that would be better than what you'd come up with.
Now, someone comes along and recognizes that in this case, this rule is dumb. It just doesn't work for the answer they made:
Some users might message the mods and ask for an exception -- this is a bad system: who knows if a mod is online right now, and by "later" your comment is irrelevant on a high-traffic subreddit.
Other users (presumably, you :P) start filling in the required length with Lorem Ipsum or the like -- also a bad solution, given that you've escalated an issue with a dumb rule into flagrantly breaking a dumb rule, and you can easily anger a mod, who on reddit may be the equivalent of a god of the sub (unless by some miracle they do something awful enough to bring the admins to bear, like... say... not having any sort of moderating activity for years, and then you still have to beg :P). Personally, I wouldn't ban you for that kind of shenanigans, but I would be annoyed, and if you kept doing it after I told you to knock it off, well...
Most users, though, probably just go somewhere else and don't care that their comment was removed. Why bother throwing your complaint into the void if nothing will come of it (e.g., mailing /r/reddit about issues)?
The mods should "just approve" things that break dumb rules for the right reasons. Except now you've created a situation where mods are working on tedious tasks for no pay and with no discernible improvement to user behavior or the moderator experience. On a decent-sized subreddit it's not uncommon to have hundreds of things get reported in a single thread (if no one's online managing the queue). Who the hell has the time to go and do that? Especially when you're treated (at best) as a glorified, unpaid help-desk employee.
A tiny minority of users might try to start a discussion about the rules, and how to change them to better suit the sub. That's an ok line of thought, but what if most short top-level comments on reddit aren't worth viewing? Should you change the rules such that a bunch more garbage gets through so that the needles in the haystack aren't summarily removed?
In none of these "solutions" have we reached any sort of good outcome for the user OR the moderator.
So, where does the moderator go in their toolbox from here?
Well, on reddit nowhere; there's nothing you can do really, short of adding even more moderators, with no way to coordinate (except for Slack/Discord) and no way to meaningfully change user behaviour.
On ~s, well, maybe we could build in some sort of sentiment analysis to auto-tag short top-level comments as jokes (if applicable) or determine whether they're "serious" or not. Maybe we could incorporate a user's trust rating into a sliding scale for whether a short top-level comment gets removed. There are a lot of things out there one could try... but not on reddit.
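To make the sliding-scale idea concrete, here's a rough sketch - every name and threshold below is invented for illustration, not an actual Tildes mechanism:

```python
# Hypothetical sliding-scale filter for short top-level comments:
# instead of a hard character minimum, weigh comment length against
# the author's trust score. All numbers are placeholders.

BASE_MIN_CHARS = 200  # length a zero-trust account must meet

def allow_short_comment(length: int, trust: float) -> bool:
    """trust is assumed to be in [0.0, 1.0]; higher trust shrinks
    the effective minimum length."""
    effective_minimum = BASE_MIN_CHARS * (1.0 - trust)
    return length >= effective_minimum

print(allow_short_comment(length=50, trust=0.0))  # False: new account, filtered
print(allow_short_comment(length=50, trust=0.9))  # True: trusted user gets latitude
```

A short quip from a long-standing contributor survives, while the same quip from a throwaway gets filtered - crude, but it's a judgment the blunt length rule can't make at all.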
Very good description that matches my experiences. The 'mods are Nazis' narrative is too easy and one-dimensional, and doesn't help move toward a solution.
I think I probably feel the same as you on this topic
As one of the people who regularly performs this over-moderation on the other platform, I'm in favour. :)
Depends on the size; the "other platform" is massive compared to Tildes, so eventually, with enough people, the average quality of content goes down, the average quality of mods goes down, etc., etc.
I don't think overmoderation is too bad if it's done properly - people kept within the rules for the better experience of all, not just one commenter shut down because they did something incorrect. The biggest issue with sites where the user base "votes" on comments is that you end up overshadowing all the smaller comments, either for being too late or for being disagreeable.
For instance, AskHistorians is super strict with their moderation. They have a complete absence of shitposting because of it, and enjoy a reputation for having really good answers as a result.
Depends on the topic really
I hate it when a subreddit is overmoderated. When you post a comment and it instantly gets removed because of xyz, or when a thread doesn't meet one of the 7 requirements to stay up, it's just really frustrating. There needs to be a middle ground: basic rules for users to follow, but not so many that you have to tiptoe and triple-check dozens of rules to make sure your post stays up. That's the problem I had with some subreddits. Some of them had so many over-detailed rules that everything feels too mechanical, because nobody wants to get their posts removed.
I have a fairly large subreddit that only has three rules.
There have only been two incidents in the six years the subreddit has existed.
I think moderation should be automated. If a comment or post is racist or offensive, users should be able to mark it as offensive and explain why. With a certain number of such marks, the system should notify the OP, and then after a threshold, either delete it or warn other users before opening the post or comment.
Each user's offensive-post score should also be tracked, as well as their offensive-report score.
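Something like this, roughly - the thresholds and action names below are just placeholders, not a final design:

```python
# Illustrative sketch of the report-threshold escalation described
# above. The stages and numbers are invented for illustration.

NOTIFY_AT = 3    # reports before the OP is notified
SHIELD_AT = 10   # reports before other users get a warning screen

def handle_reports(report_count: int) -> str:
    """Map a running report count to a moderation action."""
    if report_count >= SHIELD_AT:
        return "warn-before-viewing"  # or queue for deletion/review
    if report_count >= NOTIFY_AT:
        return "notify-author"
    return "no-action"

for count in (1, 3, 10):
    print(count, handle_reports(count))  # no-action, notify-author, warn-before-viewing
```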
I just tried to give an example. If a post does not belong in the group, the same method can be applied.
That is true. I am just trying to put an idea out there.
I'm one of a dozen people moderating a niche sub with some 14,000 people, most of whom expect consistency of quality in the posts.
Posts that outright do not follow the guidelines are removed, but posts that seem relevant at first glance, despite not actually belonging in the sub, are kept if they are debated and discussed in the comments. I find it a shame to delete actual interaction or discussion even if the post is irrelevant. A few users have a problem with that, but it's not a deal-breaker.
I believe that moderation should cater to the expectations of the community while enforcing the rules that make it unique. Mods who consistently take things personally are irresponsible if their sub aspires to offer more than just stroking the moderator's ego.