The future of moderation on Tildes
It seems like a large percentage of us are also moderators on Reddit -- myself included.
It seems that there's a generally negative attitude toward moderators on Reddit, which I totally get. Moderation on Reddit is flawed. Community members feel a sense of ownership in the community (which they should have), but bad moderators can ruin that. How do you guys think moderation should be handled here?
Here's a link from the docs that describes current plans: https://docs.tildes.net/mechanics-future
It highlights plans for a reputation system, which I think is the right way to go.
I also just realized that the same discussion was posted 18 days ago, but perhaps discussion with some of the newer users is worthwhile nonetheless:
https://tildes.net/~tildes/6e/community_moderators
Activity doesn't necessarily mean the user is the best at being a moderator-type; it feels like a different skill set is involved. Posting clever, snarky comments doesn't necessarily make for a good mediator and impartial, fair judge. By posting a lot of content, you can become biased towards your own content.
Activity will be a factor, but not the only one. If the system ends up rewarding people for doing a lot of low-effort posting, that means we've done it wrong and need to change it.
Yeah, this is actually a hard problem to get right from an algorithmic standpoint. I add a lot of mods to r/science on reddit, and we then promote good comment mods to "lt. mods" if they show activity, a clear understanding of the rules, and the proper disposition. You have to have all of those; if one is missing, it doesn't work. I'm not sure how you judge 'disposition' without being a human.
In my case, I ended up as a mod of /r/crypto because the previous one decided it took too much time for him, and so he reached out to me. It was partially with the motivation that I was very active and frequently directed off-topic posters to the right subreddit (like telling them /r/codes would be a better fit if somebody posted a code challenge). Also because I was interested in the topic and understood the basics of the field (you can't moderate a technical topic you don't understand...). So he asked me if I wanted to be a mod, and I accepted.
It wasn't submissions or in-depth knowledge (compared to some of the professionals that frequent the sub, I'm a complete noob), or high karma, etc, that led to it. Just activity and consistent contribution, making an effort to raise the quality of the sub. Persistence is another important trait.
Hacker News has a similar approach to, for example, reporting privileges. Those who frequently use it correctly (reporting only low-quality submissions) get more trust.
But that only covers a person's capability of being a good mod. It doesn't cover their judgement, which likely takes a human to evaluate. Will they abuse their powers? Are they "home blind" or biased?
I wouldn't make somebody mod if they often let their emotions get out of hand, if they shitpost too much, are too critical without being constructive, or encourage low quality submissions, etc. And that's regardless of their other abilities.
They might be an excellent mod on one topic they take more seriously, and could still be a terrible mod on another.
You also need respect for your community, regardless of how well you use the moderation tools. Your judgement of what to allow and not might be perfect, but you need to communicate your decisions to the community in a respectful way too. Or let somebody else do the talking...
This is a good point. Someone that gets a high amount of reputation from posting pun threads shouldn't necessarily get more abilities than someone that posts few, high-quality posts. Do you think comment/post tagging would help differentiate this and provide the ability to weigh certain types of content with regards to how they contribute to a user's reputation?
I used to moderate on a few forums back in the old days of PHPBB and SMF, and my style was almost always new topics and starting discussions/linking interesting news. I very rarely posted to a thread 'just because' but would generally only come in when I felt a flamewar was imminent.
We always had a few users that would comment in absolutely every thread solely to chase the comment count, and I guess that will become the same on Tildes at some point.
Gauging quality over quantity of contribution is always going to be difficult by its very nature, but maybe a 'majority-rules' type system could work. Occasionally show a comment to a user without its existing tags; if the user tags the comment in a way that correlates with the actual tags, that user could be considered more 'trusted'.
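To make that concrete, here is a rough sketch of what the correlation check could look like. Everything in it (the function, the consensus threshold, the reward/penalty values) is hypothetical, not anything Tildes actually implements:

```python
from collections import Counter

def trust_delta(user_tags, community_tags, reward=1.0, penalty=0.5):
    """Compare a user's 'blind' tags against the tags the rest of the
    community applied, and return a trust adjustment.

    user_tags      -- tags this user applied while the existing tags were hidden
    community_tags -- Counter of tags applied by everyone else
    """
    if not community_tags:
        return 0.0  # nothing to compare against yet
    # Treat any tag with at least half of the top tag's count as "consensus".
    consensus = {tag for tag, count in community_tags.items()
                 if count >= max(community_tags.values()) * 0.5}
    matched = sum(1 for tag in user_tags if tag in consensus)
    missed = len(user_tags) - matched
    return matched * reward - missed * penalty

# Example: the community overwhelmingly tagged the comment "joke".
community = Counter({"joke": 7, "troll": 1})
print(trust_delta({"joke"}, community))            # agrees with consensus -> +1.0
print(trust_delta({"troll", "flame"}, community))  # disagrees -> -1.0
```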
The idea of not displaying a comment's tags to all users in a consistent manner isn't one that I love, but that could be a decent method of building a trust rating. I agree with @hackitfast that getting users to tag content will be difficult, maybe rewarding users for tagging things with the "exemplary vote" that's been discussed?
Traditional forums and Reddit are very old in the wider context of the internet. I'm optimistic that we've learned enough in the last twenty years and made enough advances in technology to solve some of the inherent issues in the format.
That's exactly what I'm hoping for in Tildes :)
For some reason tagging feels 'bulky' to me, for two reasons. For one, I don't think the majority of users want to go out of their way to hit 'Tag' and then click on the appropriate tag, and secondly, the colored aesthetic of the tags sticks out a lot.
Fixing the first issue would require laying out all of the tags near the vote button (which would be 'bulky' but could possibly be fixed with an on-hover menu), and the second one may be harder to approach.
Some form of user reputation could possibly be gauged by analyzing the vote patterns of "regular users" against those of "irregular users". Regular users might receive X votes per Y weeks after Z posts, while irregular users might receive W votes per Y weeks after Z posts. Even with a tagging system in place, this also depends on the kind of community that enters tildes and decides to do the tagging/voting, so it would have to be a little more sophisticated than that and rely on a number of variables to make a determination (percentage probabilities, maybe? I'm not quite sure).
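As a purely illustrative sketch of what comparing those vote rates might look like (the thresholds and names are invented, and a real system would need many more signals than this):

```python
def votes_per_post(total_votes, posts, weeks):
    """Average votes a user attracts per post, normalised per week of activity."""
    if posts == 0 or weeks == 0:
        return 0.0
    return total_votes / posts / weeks

def looks_irregular(user_rate, community_median_rate, tolerance=3.0):
    """Flag users whose vote rate is wildly above or below the community norm.

    Only a first-pass signal; tag history, account age, and who the votes
    came from would all need to feed into a real determination.
    """
    if community_median_rate == 0:
        return False
    ratio = user_rate / community_median_rate
    return ratio > tolerance or ratio < 1 / tolerance

print(looks_irregular(votes_per_post(960, 30, 2), 5.0))  # 16 votes/post/week vs median 5 -> True
print(looks_irregular(votes_per_post(120, 20, 3), 5.0))  # 2 votes/post/week -> False
```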
IIRC, the docs mention that the default UI is the mobile interface. If that is the case, I think on hover menus should be avoided on desktop in order to maintain consistency across interfaces.
I for one have not yet accessed ~ on desktop, only on mobile, which is the platform I've used the most for years now. But my laptop sucks... I need to get a new one soon!
Ah damn you're right, on hover couldn't be used then. Some way to clean up the tagging options would definitely be welcomed though.
There's already a "Joke" comment tag, and I think that comment/post tagging could work as part of a reputation system. Something like: if a user is consistently upvoted in a community, then it's likely they're posting quality content, but if their comments are often tagged as "Flame" or "Troll", then they probably aren't fit to moderate.
Depending on the community, the Flame or Troll tag could also be abused and may not be an accurate indicator. For example, when I posted on a popular gaming forum that I didn't think Breath of the Wild was as life-changing and genre-bending as everyone made it out to be, I was eviscerated and called a troll simply for stating my opinion.
If we grow to the point where ~ has tons of users that are passionate about a subject, someone's reputation shouldn't be in the hands of those who have a dissenting opinion. All that serves to accomplish is basically self-censorship, for fear of possible retribution from others and the hivemind.
Exactly. Someone has to be the bad guy sometimes, and take feedback from the users and be a good discussion starter and participant. It's important that people are willing to take feedback from each other and strive toward being better. To build a healthy community, not just give in to everyone in order to be popular and liked.
Case in point: Me. I post here a shit load and I have 0 illusions that I would make a good moderator.
Here are my personal thoughts:
I like the idea of a reputation system. If you're familiar with the Stack Exchange network, that's how content moderation is largely handled there.
While it seems to do a good job, it also seems to put too much power in the hands of the "old guard" and the people that have been around for a while. What results is a certain level of hostility toward new users. While I don't think that would happen here to such a degree, since Stack Exchange is a Q&A network, it's something to consider.
I like the idea of having a hybrid between reddit and Stack Exchange. Trusted users would have certain abilities that new users would have to earn.
Like Stack Exchange it would be tiered. Anyone can comment and post to start becoming part of the community. The ability to upvote other users would come very quickly, i.e. after only having made a couple of quality comments and/or posts.
Comment tagging would follow a little down the line. Tildes doesn't have downvotes (which I like). Still, there might be certain comments that are spam or otherwise rule-breaking. Users with higher reputation wouldn't be able to downvote per se, but given enough votes from users with this privilege, the comment could be hidden and reported to full-time moderators (same goes for spam posts).
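A minimal sketch of how that threshold-based hiding could work; the threshold, data structures, and function names are made up for illustration, not a description of Tildes' actual mechanics:

```python
HIDE_THRESHOLD = 5  # hypothetical number of trusted flags needed

def apply_trusted_flag(comment, flagger, moderation_queue):
    """Record a flag from a high-reputation user; hide and report the comment
    once enough trusted users agree, instead of exposing a public downvote."""
    if not flagger.get("can_flag"):
        return
    comment["trusted_flags"].add(flagger["name"])
    if len(comment["trusted_flags"]) >= HIDE_THRESHOLD and not comment["hidden"]:
        comment["hidden"] = True
        moderation_queue.append(comment)  # surfaces it for full-time moderators

comment = {"id": 42, "trusted_flags": set(), "hidden": False}
queue = []
for name in ["a", "b", "c", "d", "e"]:
    apply_trusted_flag(comment, {"name": name, "can_flag": True}, queue)
print(comment["hidden"], len(queue))  # True 1
```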
Full time moderators are what I'm not sure about. Should it be required that users have a certain amount of reputation in the group to become a moderator? Should there be a built-in activity requirement? That's going to be tough to figure out.
Here's a Douglas Adams quote that basically summarizes my opinion on this:
This is definitely true, which is why I've found that the best moderators are those who are sought out. I moderate r/Android and all of our moderators have come from active community members and related reddits. We've found some great mods this way, and on top of that have an active chatroom where everyone gets along.
Interesting. I'd like to see how well this works.
I think a tiered system makes sense and it’s interesting to think about the highest and lowest tiers necessary. For example, I never even considered hiding voting from people who do not have some minimum score (though it makes sense!). As for the highest tier (people who can flat out remove posts or even edit them), I don’t think reaching that should be automated. Maybe the search for candidates should be automated but not the actual picks.
PS: That being said, I kinda like the idea of implementing voting among mods. Like, any mod-tier user can hide a post from public initially but then a review process takes place where all mods can internally vote to remove or keep.
Systems where some form of consensus is needed to confirm an initial decision if there is a conflict are one of the best ways to handle day-to-day moderation situations in my opinion. That way you can ensure that some form of common agreed upon guideline is applied consistently.
Post was removed and other mods disagree? Have a quick vote.
User was banned, believes it was unjust, and challenges it without being a dick? Let a couple of mods have a look on the spot to figure out whether the ban should be downgraded, upgraded, or is appropriate as-is.
There's something else we've been missing in these moderator discussions: an actual democratic process.
It's impossible to find out what the users want with mere normal threads, because so many of them are never going to see it. How exactly are we supposed to moderate fairly if we haven't even got the ability to ask the users for their opinions and preferences on policy? I think we need to invent a voting/polling feature of some kind to solve this problem. In fact I think such a feature is overdue - we never see it in any other kind of forum.
I'd like to have some special class of poll built into the site that can actually reach a quorum of voters. We might use this to resolve disputes in the mod teams as well. I'd go beyond the usual surveys and question-response formats too, and try to implement a proper Condorcet method. It's arguably the strongest, fairest known voting mechanism: people rank their preferences, and it sorts out both the most popular and the least popular of the options, and even handles ties properly.
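For anyone unfamiliar, the heart of a Condorcet method is just head-to-head comparison of ranked ballots. A bare-bones sketch (it assumes every ballot ranks all candidates and ignores cycle resolution, which real methods like Schulze or Ranked Pairs handle):

```python
def condorcet_winner(ballots, candidates):
    """Return the candidate who beats every other candidate head-to-head,
    or None if no such candidate exists (a cycle).

    ballots -- list of rankings, each a list of candidates, most preferred first
    """
    def beats(a, b):
        a_wins = sum(1 for ranking in ballots
                     if ranking.index(a) < ranking.index(b))
        return a_wins > len(ballots) - a_wins

    for candidate in candidates:
        if all(beats(candidate, other) for other in candidates if other != candidate):
            return candidate
    return None  # no Condorcet winner: fall back to a cycle-breaking method

ballots = [
    ["alice", "bob", "carol"],
    ["alice", "carol", "bob"],
    ["bob", "alice", "carol"],
]
print(condorcet_winner(ballots, ["alice", "bob", "carol"]))  # alice
```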
I'll give you an example of just how difficult this is to do at all on reddit.
That, I think, is what moderation should look like.
I'd be on board with trying this out, but I'm hesitant to fully support politicizing the site in such a way. Do we really want users campaigning for moderator positions? Do we trust voters to be educated about all of the candidates? What if it's a small group, where someone with multiple accounts could game the system to gain control of it? What if the day before voting someone posts fake information to discredit someone running in the election? What if a candidate makes promises and doesn't follow up to them? What if a new electee is at odds with the rest of the moderation team? What if a group has custom systems in place (bots, for example) that a new electee doesn't know how to operate? What if the previous mod who was voted out sabotages a custom bot that was in place or doesn't hand it over to the new mod? What if a group with 100k users brigades another group with 500 users to elect their own "guy" and take over the group? What if the current moderators need more help to deal with a sudden influx of subscribers? What if a moderator has something tragic happen irl and needs to take a break from moderation? What if the moderation team is forced to do something by the admins that's unpopular with subscribers?
Maybe there's a way to solve all of this, but the more I think about it, the more I honestly don't think this is the right solution. Sure, there are reddits on reddit with moderation teams that don't act in the best interest of the subscribers. But sometimes they need to make unpopular decisions that a vocal minority of users might take issue with. Should moderators be forced to make decisions only if they think those decisions are "safe"? Will moderators be motivated to invest significant time into the group if they can be voted out at any time?
I'm not interested in people campaigning for moderation positions, or in a democratic process that the mod team can't decide to override, in the event the subscribers are clearly having an aneurysm - and I've seen that happen many times. If we surrender this to a direct democracy, we're going to make it a bigger mess. Meritocracy must remain the greater part of our democratic systems here.
What I'm interested in with a system like this is both informing the voters and gathering their feedback - both preferences they vote for, and opinions or clarifications they have on why they vote a certain way. Once you have that information, I think the correct consensus course of action becomes quite obvious - even nuanced if you do it well. Read that thread above I linked from listentothis and you can see how we used this process to make decisions. I might have edited that in while you were typing your response, took me a while to dig it out of reddit's ancient history.
Thanks for the clarification -- after reading it, I'm totally on board. I apologize if I misinterpreted your initial reply. I agree that meritocracy is key and populism should be avoided.
Another thing I just thought of is how the initial moderators are chosen for a group. On reddit, whoever thinks of creating a group is automatically in charge. This not only leads to squatters, but also to reddits moderated by people that don't have the best interests of the community in mind. Finding some sort of democratic or meritocratic way of choosing the initial moderators for a group would be a huge step in solving the moderation issue on reddit.
I would go a bit further than that, even. Both the initial mods and an initial set of guidelines are good things to pick collectively. What I mean is that if, for example, ~music.underground forms, the community can decide that this is a group specifically for underground music, and that becomes a kind of "eternal" guideline that the mods cannot change; they're serving the community by enforcing what is on- and off-topic, for example.
Enforcing and developing these or further guidelines should be up to the moderation team as a dedicated group, the important part is keeping the spirit of the initial idea alive.
In the event of a conflict between mods and their userbase something like that could potentially give a higher authority (admins) a way of moderating the moderators. If there are clear standards you have something they can be judged against in the end.
If one of the mods in the above example starts working for Sony and promoting their content that would be a clear violation of the initial guideline. On reddit, if that mod is high up on the list, the sub would be done for and users would have to migrate.
I don’t believe tildes should be a “democratic” place. It’s built around some very specific ideals, which should not be up for debate. Maybe mod decisions (or very subgroup-specific things not related to tildes, specifically) would benefit from some voting process, but never general policies. I think it’s dangerous to even suggest to users that they “own” the place.
You mention Condorcet - what type would you use? What about using range voting? From the article:
I prefer ranking to range voting when it comes to picking between multiple options, however we might even have both types. The reason I say that is because range voting would make the typical best-of activity in the music subs into a breeze - /r/metal is famous for their meticulous score-based process, and it works very well.
My approach to making it democratic is to separate the conversation and propagation of messages from the moderation.
https://roamingaroundatrandom.wordpress.com/2014/06/01/a-decentralized-hash-chained-discussion-system/
I think democracy has its flaws.
I've noticed that in a democratic team, it paves the way for a certain contingent to gain popularity and push ideas that aren't good for the community as a whole, or even for the small team. There need to be checks and balances in play.
I think there are a lot of great ideas floating around here. I love the idea of a reputation system giving people more influence instead of allowing certain groups of seasoned mods to become power mods.
I like the voting idea, but I also think there should be a way to hold future mods accountable for their actions. If someone is getting a little power-happy or showing bias, then I think there should be some kind of way for the masses (non-mods) to "impeach" these elevated people/mods or whatever their role/title is.
I think it's important to not let the people in power have absolute power. That's how a lot of communities turn to crap in my opinion.
I am also not a fan of "power mods". Maybe we should implement a limit on the number of groups that can be moderated at a time, potentially tied to reputation.
On reputation, specifically: In most systems, there is basically no way to fall back to low reputation once you've reached a certain critical mass. Maybe reputation should not even be a simple accumulative score, but a squished range, so that reaching an ever-higher reputation gets harder and harder, while losing reputation remains relatively easy. This would result in more severe punishment for people abusing high trust, relative to someone in the middle of the range, which feels appropriate.
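A toy illustration of that asymmetry, assuming reputation lives in a bounded 0-1 range; the curve and the constants are invented for the example, not a concrete proposal:

```python
def update_reputation(rep, delta_good=0.0, delta_bad=0.0):
    """Bounded reputation in [0, 1].

    Gains shrink as you approach the top (diminishing returns),
    while losses are applied at full strength, so high-trust users
    have the most to lose from abusing that trust.
    """
    rep += delta_good * (1.0 - rep)   # squashed gain
    rep -= delta_bad                  # linear loss
    return min(max(rep, 0.0), 1.0)

print(update_reputation(0.50, delta_good=0.10))  # 0.55 -> mid-range user gains 0.05
print(update_reputation(0.90, delta_good=0.10))  # 0.91 -> high-trust user gains only 0.01
print(update_reputation(0.90, delta_bad=0.10))   # 0.80 -> but loses the full 0.10
```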
The "impeachment" process is a very interesting idea. That would be a great way to hold mods accountable and reduce their reputation accordingly, potentially weighted by the number of vote participants.
Edit: Typo.
I abhor power mods on reddit, and part of the reason I was excited to move to tildes was to get away from those who abuse their powers. Auto-banning people that post to subs you don't like, or banning people that have opinions you disagree with, and responding with a mute to anyone who asks why a decision was made - not what I want in a community, not conducive to healthy discussion.
I really like the ideas being thrown around in this thread about a reputation system and impeachment, I'm less into the concept of a democratic process. If the latter were to happen I'm almost certain there would be immediate abuse of the system.
I'd like to keep the democratic aspect of it focused on gathering opinions and measuring the value of ideas, rather than on some form of nebulous electoral system. It's fantastic for information, but not necessarily good for decision-making. Let the trust system handle the 'election' aspects, and the democratic system handle the informational aspects. I think the people who are selected by the trust system can use the democratic system to find consensus and make good decisions. In the (I expect) rare event there's a genuine conflict between the two, we can look at admin intervention or vote-of-no-confidence systems.
If you're voting to 'elect' people, they can lie in public, misrepresent themselves. That's far harder to do with the trust system, since you'd have to invest a lot of time and energy to get there actually working on behalf of the community you intend to sabotage - and once you get to the higher levels, you're sharing the power with a larger group just like you. It's hard to see how bad apples can abuse that system.
Also, if the democratic process is more idea/debate-focused, you have to defend your ideas for people to take them seriously. It's not about people, it's about solutions instead.
I'm very much in favor of the trust system; it sounds much less abusable than any sort of election or old-guard selection. I'm really looking forward to seeing how things develop!
Banning probably needs to be more democratic or group oriented. The details TBD. The problem with this is brigading and mob factions forming. I keep wondering if the entire idea isn't backwards though - central banning seems the problem when instead perhaps it needs to be on the reader's side (banning a person per forum). Then it becomes a question of if a one-dimensional vote/reputation score is too narrow - perhaps reputation needs more axes/degrees of orthogonal scoring. For example: flaws in logic, vs. political orientation vs. relevance/trolling/seriousness. Just a crazy idea.
Perhaps handicapping trolls or rudeness is another way - not a ban but posts start negative in score. Then if legitimately good content or ideas are presented they will eventually get voted up to visibility. Again, perhaps a one-dimensional scoring system is too simple and causing the problem.
I know you mentioned that activity isn't the only thing, but I want to give a case study of myself. I'm a fairly active moderator and I feel that I am professional, kind, and judicious in my decisions. I even help rein in some of the other moderators when they begin to become unprofessional or unkind. I moderate communities that are important to me because they are part of my daily work life. /r/web_design, /r/webdev, etc. However, I don't contribute to these communities via posting threads (text post/links) much for a number of reasons. I am much more likely to comment, if I do contribute.
Between the time it takes to reply through modmail on the communities and review reports and spam, I don't have much time to do anything else. Do you care that you may be missing out on moderators who have personable skills and good judgement, but are a bit more passive when it comes to contributing? If you do care, what are indicators that you can track to include them in a person's reputation?
My thinking is that "behind the scenes" moderation would count equally (and perhaps more) compared to simply commenting. That's to say that a moderator that contributes content to the group but not moderation wouldn't be considered an active moderator.
A lot of people have floated the idea of a democratic process for choosing moderators, but my problem with this is just what you mentioned -- that not all moderators have a visible presence in the group, even if they're busting ass behind the scenes. Users would be much more inclined to elect people they recognize.
My other issue with moderation being completely democratic is that visions for the community become a popular vote, and when there are growth pains, you get ejected because things take time, especially in large communities. It’s a tough problem to solve! I’m also a football official, so I experience being the focus of everyone’s hatred. Even if I’m right on a call, there is a massive amount of disinformation, and if my job as an official were based on audience vote, there would be no officials. In that field, it’s important to have a review by experts in that field, not a popularity vote. A hierarchy, as mentioned elsewhere, is an interesting concept. Moderators of moderators that move up based on votes of people who have experience in that field, etc. Obviously that route has its own issues. Got to find a happy medium.
Edit: mobile typos
I was offered a mod position once and I didn't take it, because the old guard was the furthest thing from trustworthy. They'd obsess over a time before most people had arrived and were pretty malcontent. I could chat with just about anyone, so it didn't particularly bother me.
I believe the “old guard” narrative simplifies the issue a bit too much. Yes there (strangely) seem to be mods who manage to find some authority kick in the ridiculous process of banning people for calling each other a poopy face. But there also definitely is a shift in quality that comes with subreddits growing bigger.
It’s not just nostalgia or a sense of “owning the place”. There is definitely a shift in quality as a subreddit reaches a certain size, fewer people read (or care about) content rules and get mad when you remove stuff for breaking them (“But everybody else does it, too! The mods are on a power trip!”).
Tildes strikes me as a place that would enforce certain rules quite strictly. It’s not relevant right now because there are like a hundred people, handpicked for being passionate about the idea. But it will be, if it grows.
I agree with the size thing. Once you pass 300 or so active users who more or less know each other, it's difficult to form a consensus without moderation. You can adopt an eat-your-own attitude, but that gets ridiculous with so many weird fights over time.
One thing Reddit did was thrust smaller communities into the spotlight for political or advertising reasons, and that made for lower quality. Sometimes the older users did not feel safe posting after that happened.
I think everyone is giving some good ideas on here, and it's definitely a good step in the right direction.
A couple thoughts though:
Democratically 'electing' moderators is tricky, because it gives people who have an agenda a way to take charge of communities that could potentially affect their interests. Once the site opens up, a determined group could probably work to get friendly moderators onto their topic with minimal organization.
I do agree that users should have a say in who moderates (at a lower level than the full-timers or admins, obviously), so some sort of community recommendations stacking up to give someone low-level moderator abilities would be a cool feature, where anything big they did would have to be approved by the next tier of moderator.
I don't know, just my opinion I guess, but that's a system I think would work nicely.
I used to mod some pretty large subreddits in my ~8 years on the site: SRD, RickAndMorty, GooglePlayDeals, Awwwtf. Honestly, the hostility towards myself and the other mods (plus the amount of spam and off-topic/shitposts) made me hate doing it, and I just ended up resigning. Everyone thinks you're the power-tripping bad guy when all you're trying to do is maintain the quality of the content in the subreddit. Now I just mod a teeny tiny subreddit, and for the last four years it's mostly been OK.
So I just read some of the top parts of the Community Moderators? thread, and I had an idea I wanted to share. Why not make mods, and even some of the highest voting tiers, elected?
It could work like this:
I think having the vote weight count for this is super important. It means that the older members of a community can defend from an influx of low effort newcomers, but if the culture of a ~ changes, then newcomers can bring new moderators and high power voters in even if an entrenched minority is making a stink.
I'm not sure what a fair way to decide how many votes someone needs should be. Perhaps a certain fraction of the weighted total power of the subscribers, with those not subscribing not being allowed to vote? It would also be good if you counted as "registered" to vote if you are subbed when the requests for mods/power voters are sent out, to prevent brigading of the process entirely.
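To make "a certain fraction of the weighted total power of the subscribers" concrete, here's one hypothetical way it could be computed; the 25% quorum and the vote weights are arbitrary examples, not a proposal:

```python
def passes_weighted_quorum(votes_for, subscribers, quorum_fraction=0.25):
    """Check whether supporters' combined vote weight clears a fraction of
    the total weight held by everyone who was subscribed (and therefore
    'registered') when the call for mods/power voters went out.

    votes_for   -- dict of username -> vote weight, supporters only
    subscribers -- dict of username -> vote weight, the registered electorate
    """
    eligible_support = sum(weight for user, weight in votes_for.items()
                           if user in subscribers)  # latecomers/brigaders don't count
    total_weight = sum(subscribers.values())
    return total_weight > 0 and eligible_support >= quorum_fraction * total_weight

subscribers = {"old_hand": 3.0, "regular": 2.0, "newcomer": 1.0, "lurker": 0.5}
print(passes_weighted_quorum({"old_hand": 3.0}, subscribers))                   # 3.0 >= 1.625 -> True
print(passes_weighted_quorum({"newcomer": 1.0, "outsider": 9.0}, subscribers))  # 1.0 <  1.625 -> False
```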
I do think community members should be involved with moderation in some way, but politicizing moderation seems like such a fine line. All I can picture is the drama it might create.
There is already drama in many other moderation systems; do you think that a voting method would create more drama? I feel like seeing that a new moderator had the right combination of experienced old guard and newcomers behind them, or seeing that an older moderator had stopped contributing and as such couldn't muster support, would reduce drama somewhat compared to people simply being selected. In my mind, I'm less likely to complain about a transparent and democratic voting process than I am about someone being picked from above.
The point by /u/Amarok about using ranked voting to help deal with limited positions is good, though the more informal review process suggested below by /u/UristKerman is good as well.
Overall, this whole thing is exciting because it is a community we can help improve and push towards a place where people can discuss freely. If something we try doesn't work out, we can go back and try something else.
I agree, I'm really excited to be able to have this discussion at all.
All I would say is consider all the reddits you subscribe to on reddit. Are you informed enough about the users on them or even the moderators enough to vote on who should moderate the communities?
What if a group like r/fatpeoplehate brigades a smaller community dedicated to weight loss and takes it over? My gut feeling is that whatever system is put in place could be gamed. How can users "run" for moderatorship if current moderators can stifle any sort of post that resembles campaigning?
There could be a solution to all this, but it won't be easy. Requiring a certain amount of reputation to vote would solve a lot of problems, but there are still so many to consider.
The systems we aim to have in place are already going to make brigading into almost a non-issue. In order to have any real vote weight and access to trusted features in any given group, you're going to need to participate within that group for a longer period of time to gain access, and that participation has to be well received by the group or your standing won't improve much.
So if a group here on tildes does decide to brigade another, they'll have low-weight votes that the targeted community can easily out-class with its own. The brigaders will have no access to the reporting and tagging and abuse features within the target group either, but that group can and will use those features to defend themselves against the brigades. The likely outcome of a tussle like this is that the invaders end up heavily 'flame' and 'troll' tagged, embarrassingly unvoted in the comments and submissions, and utterly outclassed in the discussions they attempt to engage in within that targeted group.
Also, that's just the users defending the place - the mods are going to see this brigade happening in real-time and be given the tools to bitch-slap all of the brigaders hard over it, since that's ban-worthy behavior. They'll be able to report that to the admins if they like. If a community continues to make a nuisance out of itself with this bad behavior, that entire community can be banned, along with anyone they've ever invited, if it comes to that.
Thanks for the reply. I agree, the crux of this system and indeed many of the systems on Tildes is going to be the reputation system. I don't envy the persons tasked with developing and implementing it!
Let's say that either the existing group is excessively protective of its status quo, perhaps even maliciously rejecting new members from joining (superiority complex, etc.), or alternatively that a slow, silent takeover eventually succeeds. And so a tilde group for one topic is essentially held hostage by a small group.
Then where would new members go when they want to talk about that topic?
I'd say similar concerns also apply for all controversial topics, where one subgroup may want to exclude the others.
I don't really see a way to make it "fair" without allowing parallel communities.
I think you raise an interesting point here. Some of the referendum ideas that people have been floating could help solve an overly cloistered community.
As for the slow, silent takeover: I think there was an idea that posting in sister ~s would boost your trust (e.g. an NFL general ~ and a team-specific ~), so a similar thing could make you lose a bit of trust if you have a lot of activity in another group that would be antagonistic towards the one you just subscribed to (e.g. a group for flat earthers and a group that laughs at flat earthers). That would be a huge discouragement to this kind of attack, as it would really be best to make a new account, and then you would also have to make the account seem real by not just posting in one group to gain trust.
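A purely hypothetical sketch of that trust adjustment; the group names, weights, and caps are all invented for illustration:

```python
# Hypothetical: groups that boost or dampen trust carried into ~flat_earth.
RELATED = {"~science.geology": +0.5}          # sister groups lend some credibility
ANTAGONISTIC = {"~flat_earth_mockery": -0.5}  # heavy activity here is a red flag

def carried_in_trust(activity, base_trust=0.0):
    """Adjust the starting trust a new subscriber carries into a group,
    based on where the rest of their well-received activity lives.

    activity -- dict of group name -> number of well-received contributions
    """
    trust = base_trust
    for group, count in activity.items():
        if group in RELATED:
            trust += RELATED[group] * min(count, 10) / 10       # capped boost
        elif group in ANTAGONISTIC:
            trust += ANTAGONISTIC[group] * min(count, 10) / 10  # capped penalty
    return trust

print(carried_in_trust({"~science.geology": 20}))     # 0.5  -> capped boost
print(carried_in_trust({"~flat_earth_mockery": 50}))  # -0.5 -> starts below a fresh account
```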
At the end of the day, though, I think that no matter how good of a discouragement you have, it's always possible for a sufficiently organized and well disciplined group to game the system. The goal is just to make it hard enough that those types of groups will arise rarely, if at all, and can be managed by any real humans that are helping to run the site.
I don't think there will ever be a mechanism to reject new users. On reddit they do that with bans - such as banning someone for even posting to a specific group, usually using bots. If we find that happening, the ones doing the banning and their bots are going to end up with the bans, I think. The admins here aren't planning to sit around and let things fester until they become toxic and ridiculous like reddit. Any time the moderation system or a moderation team goes off the rails, admins will set it right again.
That assumes the admins will always understand the problem and make the right call.
It does. Someone has to be in charge, and we've seen what happens when the users run things.
Best way to avoid making mistakes is to have threads about it, when in doubt.
As a sidenote, this hypothetical where people fundamentally disagree with the admins' decision is precisely what spawns sites like Tildes (and voat, but let's ignore that one). I know it's going to be rare (if the admins are competent), but that's what happens when somebody sufficiently motivated disagrees with the final call of the top authority. And it tends to split communities.
It's part of why I want to break out of the dependence on central servers and move to a protocol based scheme. Something IRC-like, in that you don't move even if the servers do. To you, everything is aggregated, you just add another domain when somebody decides to split.
Also, you're voting for the admins in a way - with your dollars/donations. If tildes does something unpopular, people can withhold their donations.
Do we really want “election cycles”? “Vote xXxDickMaster69xXx”-Stickers? Mods elected for how well they can play an audience rather than how good they are at calmly judging posts that break a rule? I don’t think I’d like that.
That is a major issue with an elected system online. I think vote weighting and the trust system could be used to mitigate some of those problems. Those most likely to be good mods are suggested as mods by the trust system. Those best able to judge mods' ability (are active and make good quality submissions) have the most say on putting mods into place as well as removing them if there is a problem.
I'm not so sure about elections themselves, but some kind of periodic review taking into account the trust system as well as people's opinions would be useful, similar to the concept discussed below by /u/UristKerman.
Empower the user to forcibly remove a moderator. Either via some sort of user-triggered referendum or call to action to the admins.
Empower the user to override mod decisions. Either via some sort of user-triggered referendum or call to action to the admins. A lot of times, things get removed by mods that aren't supposed to be removed.
Empower the user to appeal mod bans. Either via some sort of user-triggered referendum or call to action to the admins. Kind of what they did at karma-court, but legit this time.
I like the idea of including decay, weighting the contributions, and semi-automating a regular (i.e. periodic) process.
Maybe something like: every six months, the status of current moderators is reviewed by the community. People can make suggestions, discuss any holes in the moderator coverage, and highlight any mods who have made contributions (or conversely, any mods who have fallen off the face of the earth). They can also discuss making anybody who is not a mod, but is an exemplary contributor, into a mod or other such thing.
You could also include metrics in that discussion (post amount, impact of posts, posts that are tagged, etc), and look at them over time, too. Might be a bit of work to build a bot to drag all that out, but could be helpful in bringing a bit of objectivity into the mod discussion - not as a requirement, but as guidelines.
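The bot could stay fairly simple; here's the sort of per-cycle summary it might pull together, with all field names invented for the example:

```python
from datetime import datetime, timedelta

def review_summary(posts, months=6):
    """Summarise each moderator's activity over the review window.

    posts -- list of dicts like {"author", "created", "votes", "tags"}
    """
    cutoff = datetime.now() - timedelta(days=30 * months)
    recent = [p for p in posts if p["created"] >= cutoff]
    by_author = {}
    for p in recent:
        stats = by_author.setdefault(p["author"], {"posts": 0, "votes": 0, "tagged": 0})
        stats["posts"] += 1
        stats["votes"] += p["votes"]
        stats["tagged"] += 1 if p["tags"] else 0
    return by_author  # post count, total impact, and how often their posts were tagged

posts = [
    {"author": "mod_a", "created": datetime.now(), "votes": 12, "tags": []},
    {"author": "mod_a", "created": datetime.now(), "votes": 3, "tags": ["joke"]},
    {"author": "mod_b", "created": datetime.now() - timedelta(days=400), "votes": 50, "tags": []},
]
print(review_summary(posts))  # mod_b's old post falls outside the six-month window
```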
I think having some sort of periodic 'how are things running' thread for each community is a good idea. We did that throughout all of /r/listentothis' history. People gave good feedback, and the rules we developed from them with the community involvement helped make sure that things worked the way everyone wanted them to work, within reddit's admittedly flawed mechanics. It's also very useful to have this log of conversations as a defense against trolls and subversive types, because as a mod you don't have to say, "Take my word for it" - instead you can link back to those threads and say, "The community has spoken." Then they have to bring up their issues during the next cycle and defend them in front of the entire community.