  • Showing only topics with the tag "moderation".
    1. Should deleting comments be the standard behaviour, or can we consider a less censored approach by default?

      I often stumble into threads with entire comment chains deleted. I assume most people here have faced the same situation as well, either here or on reddit.

      I'd like to see a move to locking comments rather than deleting them by default. That would mean no further replies to the comment or any other comment in that chain, no one being able to delete or edit their comments, no one being able to add or remove votes on a comment, etc.
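
      To make this concrete, here is a minimal sketch of how a locked state could gate replies, edits, and votes. Every name in it is hypothetical, not anything Tildes actually implements:

      ```python
      from dataclasses import dataclass
      from enum import Enum
      from typing import Optional


      class CommentState(Enum):
          ACTIVE = "active"
          LOCKED = "locked"    # visible but frozen: no replies, edits, or vote changes
          REMOVED = "removed"  # reserved for truly egregious content


      @dataclass
      class Comment:
          author: str
          state: CommentState = CommentState.ACTIVE
          parent: Optional["Comment"] = None

          def chain(self):
              """Yield this comment and all of its ancestors."""
              node = self
              while node is not None:
                  yield node
                  node = node.parent


      def can_reply(comment: Comment) -> bool:
          # Locking freezes the whole chain: one locked ancestor blocks all replies.
          return all(c.state is CommentState.ACTIVE for c in comment.chain())


      def can_edit(comment: Comment, user: str) -> bool:
          return comment.author == user and comment.state is CommentState.ACTIVE


      def can_vote(comment: Comment) -> bool:
          return comment.state is CommentState.ACTIVE
      ```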

      I understand that for particularly egregious comments removal is completely necessary (especially when it goes hand in hand with banning users), but a lot of the time comments are deleted as a means to prevent long argumentative back-and-forth chains that spam and derail topics, as well as antagonize users.

      In a lot of cases I feel like deleting the comment only further serves to hide what is unacceptable behaviour (even if that behaviour should be obvious), rather than setting an example for the userbase.

      30 votes
    2. Feature proposal: Real-time moderation transparency page (vote in comments)

      Proposal:
      Create a new page where all users can view all moderation actions. This would make transparency a core part of the platform, hopefully avoiding any misunderstandings about mod actions.

      A new page, maybe called tildes.net/moderation, is available to all registered users. I am not sure where the link should appear on the site; maybe on the user's profile sidebar?

      This page contains a table of all moderation actions taken. The actions may include: deleted topics, deleted comments, tag modification, moved topics, edited topic titles, banned user, locked topics. (This raises the question of what the possible mod actions are; they would need to be codified.)

      Very roughly, the table columns might include: Date, User (being mod'ed), Mod Action (a list of possible mod actions), Mod Action Reason (either a text field or a list of possible reasons for this action), Link (null if the action is a deleted topic).

      I think that the user who did the moderating should not be publicly listed for now, to avoid drama?
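
      Sketching the proposed table as a data model also makes the codification point concrete. This is purely illustrative; every name below is made up, and the action list is just the one above:

      ```python
      from dataclasses import dataclass
      from datetime import datetime
      from enum import Enum
      from typing import Optional


      class ModAction(Enum):
          # The codified list of possible actions (from this post; likely incomplete).
          DELETED_TOPIC = "deleted topic"
          DELETED_COMMENT = "deleted comment"
          TAG_MODIFICATION = "tag modification"
          MOVED_TOPIC = "moved topic"
          EDITED_TITLE = "edited topic title"
          BANNED_USER = "banned user"
          LOCKED_TOPIC = "locked topic"


      @dataclass
      class ModLogEntry:
          date: datetime
          user: str                   # the user being mod'ed
          action: ModAction
          reason: str                 # free text, or drawn from a per-action list
          link: Optional[str] = None  # null if the action is a deleted topic
          moderator: str = ""         # stored, but not publicly shown, per the above
      ```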


      Some of the related topics (please make a top-level comment with any others):

      Could we have a stickied list of all bans with reasons included?

      Daily Tildes discussion - our first ban


      Please vote for the comment which best reflects your position on this proposal.
      As a bonus question, please make a top-level comment if you have a general comment about my format of voting on comments. Would you prefer a straw poll on a third-party platform? Is there a cleaner way to do this?

      Edit: added "banned user" to the actions list; I probably missed others, so let me know. Also added the obvious "locked topics".

      23 votes
    3. Let's discuss tags, again

      There's been some discussion around tags since users were given tag-editing privileges, such as Tag Use and Article Tags.

      I've noticed a few things about tags and rather than make a topic for each one I thought I'd make a few top level comments instead, hopefully with others doing the same for anything tag related they'd like to see discussed.

      20 votes
    4. Tildes code of conduct

      The Tildes code of conduct says:

      Do not post anyone's sensitive personal information (related to either their real world or online identity) with malicious intent.

      Can you change that to just say "don't post personal info"? Even if it's not done with malicious intent, it should still be removed to protect people's privacy.

      Also, while the Tildes terms of service does say not to post spam, I think it should say that in the code of conduct as well.

      Edit: I mean posting personal info without consent, not public information.

      Telling someone how to contact a company would be fine but not posting someone's address.

      12 votes
    5. Could we have a stickied list of all bans with reasons included?

      In the interest of transparency (and a little bit in statistics) it would be really cool to have a master banlist or at least a thread with links to all ban-worthy posts. This would help new users understand what isn't acceptable in the community and allow for community discussion on what could be considered an unjustified ban or a weird influx of bad behavior. This wouldn't be super viable when the site goes public, but would be a neat implementation in Tildes' alpha state.

      14 votes
    6. Moderator tools: what do you have and what should be the immediate priorities?

      I don't want to get too high in the clouds with moderating philosophy. Instead I want to talk about action steps that can be taken in the very near term to improve moderating. Especially so long as Deimos is the only one with most of the moderating tools at his disposal, I think it's crucial to make moderating as painless as possible.

      So far it looks like Deimos has these moderating tools available to him:

      1. User bans
      2. Comment removal
      3. Thread locking/removal
      4. Title/tag editing (and this ability is shared by many of us as well)

      Am I missing anything?

      The three next tools I would hope are coming next are:

      • A reporting mechanism, where users can report comments and threads that they think should be removed.
      • A feedback mechanism for reports, telling users that a report they gave was acted on.
      • A note-taking system for the moderator-type person, shareable with all other moderator-type persons at that level, with an expiration date of probably around 30 days.

      Now I'll talk about why. First, the reporting mechanism. While it's still possible to keep up with everything that gets posted, I don't necessarily think it's the best use of Deimos' time to read literally everything, especially as the site expands its userbase and presumably its activity level and depth. The reporting system at first should probably just be a button, maybe eventually with a pop-up field allowing the user a brief description of why they're reporting, and a queue that gets populated with the comments and threads that get reported.

      Coinciding with a report queue/option should probably be an easy, rudimentary system for providing feedback to those whose reports led to moderating action. At first, an automated message saying something like "thank you for reporting recently. Action has been taken on one of your recent reports" without any relevant links would do fine, and we can leave the particulars of how much detail to add for later discussions.
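
      As a purely illustrative sketch of how little plumbing a first iteration of the report queue and feedback loop might need (all names hypothetical, not Tildes' actual code):

      ```python
      from dataclasses import dataclass, field
      from datetime import datetime


      @dataclass
      class Report:
          reporter: str
          target_url: str        # the reported comment or thread
          description: str = ""  # optional brief reason from the eventual pop-up field
          created: datetime = field(default_factory=datetime.utcnow)


      REPORT_QUEUE: list[Report] = []


      def file_report(reporter: str, target_url: str, description: str = "") -> None:
          """Called when a user presses the report button."""
          REPORT_QUEUE.append(Report(reporter, target_url, description))


      def resolve_report(report: Report, action_taken: bool) -> None:
          """After a moderator handles a report, close the feedback loop."""
          REPORT_QUEUE.remove(report)
          if action_taken:
              # Deliberately vague, as suggested above: no links to the content.
              send_message(report.reporter,
                           "Thank you for reporting recently. "
                           "Action has been taken on one of your recent reports.")


      def send_message(user: str, text: str) -> None:
          # Stand-in for the site's private messaging system.
          print(f"to {user}: {text}")
      ```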

      The last thing I think should help considerably in the immediate term is a time-limited user tracking tool for the moderator-type person. As things scale, it isn't always going to be feasible to spend mental bandwidth remembering each username and the relevant history associated with their behavior. A good note-taking tool with an auto-timed expiration date on notes would be a good way to address what can easily become a hugely mentally taxing role at almost any scale. This tool should let Deimos take a discrete note for himself (and other moderators at that permission level and higher) connected to a user, regarding any questionable threads or comments that were yellow/red flags, or any other moderator action taken against a user within the last X days/months (the particulars matter less to me than that there is an expiration date on these notes).

      This should let the moderator-type person focus on the broader history of the users they're looking at before making a decision, without having to go searching for every relevant comment from the past 30 days. Fewer problematic users at scale should fall through the cracks, and more users that might just be having a bad day can be let off with comment removals and/or warnings.
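
      A rough sketch of what such expiring notes could look like; the 30-day lifetime is just the figure mentioned above, and all names are invented:

      ```python
      from dataclasses import dataclass, field
      from datetime import datetime, timedelta

      NOTE_LIFETIME = timedelta(days=30)  # the particulars matter less than expiry


      @dataclass
      class ModNote:
          username: str
          text: str  # e.g. a permalink to a yellow/red-flag comment, plus context
          created: datetime = field(default_factory=datetime.utcnow)

          @property
          def expired(self) -> bool:
              return datetime.utcnow() - self.created > NOTE_LIFETIME


      def active_notes(notes: list[ModNote], username: str) -> list[ModNote]:
          """What a moderator sees when reviewing a user: only recent history."""
          return [n for n in notes if n.username == username and not n.expired]
      ```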

      Are these priorities fair? Are there design elements you would want to see in the immediate term that would help reduce the burden of moderating? Are there problems with these tools I'm suggesting that you would want to see addressed?

      19 votes
    7. Moderators of Reddit, tell us about your experiences in fostering quality discussion and content (or failures to do so)

      Since the moderator community here is quite large, I figure we would have quite a lot of interesting perspectives here on Tildes. Feel free to chip in even if you're not a moderator, or, god forbid, moderate such subs as T_D. Having a range of perspectives is, as always, the most valuable aspect of any discussion.

      Here are some baseline questions to get you started:

      • Did your subreddit take strict measures to maintain quality à la r/AskHistorians, or was it a karmic free-for-all like r/aww?

      • Do you think the model was an appropriate fit for your sub? Was it successful?

      • What were the challenges faced in trying to maintain a certain quality standard (or not maintaining one at all)?

      • Will any of the lessons learnt on Reddit be applicable here in Tildes?

      29 votes
    8. On Reddit moderation - it's a matter of scale.

      I apologize in advance for what's probably going to be a very rambly post. This has been stewing on my mind for a while now and I just need to get it out.

      I've been on reddit a long time, 11 years as of today in fact. In that time, I've watched the site grow from a small community of mostly tech nerds to one of the biggest sites on the web. I've also moderated many communities, from small niche subs (/r/thecure, /r/makeupaddictioncanada) to some of the biggest subs on the site (/r/worldnews, /r/gaming). I've modded communities that have exploded in popularity, growing from 25k to 100k to 500k and beyond, and seen how those communities change.

      When you're in a subreddit of say, 10k users, there's more community engagement. You know the users, the users know the mods, and you know when people are engaging in good faith. The mods themselves are basically just another user with a bit more control. People coming in just to cause shit are generally downvoted to death and reported quickly, and taken care of - it's a community effort to keep things civil. Modding a community like that is piss easy, you can generally check every thread yourself and see any nastiness easily before it becomes a problem, and the users themselves are more invested in keeping things on topic and friendly. Disagreements are generally resolved amicably, and even when things get heated it's easy enough to bring things back to center.

      Then the community starts to grow, and gather more users. Ok, you adjust, maybe add another mod or two, the users are still engaged and reporting threads regularly. Things stay more or less the same. The growth continues.

      At 50k, 100k, 250k, etc you notice differences in the community. People argue more, and because the usernames they're arguing with aren't known to them, they become more vitriolic. Old regulars begin drifting away as they feel sidelined or just lose interest.

      At 1M a major shift happens and the sub feels more like a free for all than a community. As a mod, you can't interact as much because there's more traffic. You stop being able to engage as much in the threads because you have to always be "on" and are now a representative of the mod team instead of a member of the community. Even if you've been there since day one, you're now a mod, and seen by some as "the enemy". Mods stifle free speech after all, removing posts and comments that don't fit the sub rules, banning users who are abusive or spammers. Those banned users start running to communities like SRC, decrying the abuse/bias/unfair treatment they've gotten at the hands of X sub mod team. Abusive modmails and PMs are fairly regular occurrences, and accusations of bias fly. The feeling of "us vs them" is amplified.

      Once you get above 10M users, all bets are off. Threads hit /r/all regularly and attract participants from all over reddit. These threads can attract thousands of comments, coming at the rate of several hundred every minute. Individual monitoring of threads becomes impossible. Automod can handle some of it, but we all know automod can be slow, goes down sometimes, and can't handle all the nuances of actual conversation. You've outgrown any moderation tools reddit provides, and need to seek outside help. Customized bots become necessary - most large subreddits rely on outside tools like SentinelBot for spam detection, or snoonotes for tracking problem users. Harassment is a real problem - death threats, stalking, and doxxing are legitimate issues and hard to deal with. I won't even touch on the issues like CP, suicidal users, and all the other shit that comes along with modding communities this large.

      I wish I had some solutions, but I really don't know what they are. We all know the tools we have as moderators on reddit are insufficient, but what people often overlook is why - the community is just too large for unpaid volunteers to moderate with the limited tools we have.

      39 votes
    9. How will moderation on existing groups work?

      So far, I haven't seen much moderation aside from bans, etc. dealt out by the admins (unless I'm wrong here and a moderation system is currently in place; please correct me if so), but how will this work once Tildes is fully released to the public? Will people who show interest in a certain community be reached out to and asked?

      13 votes
    10. [Suggestion] Use sortition for moderation?

      In governance, sortition (also known as allotment or demarchy) is the selection of political officials as a random sample from a larger pool of candidates. The logic behind the sortition process originates from the idea that “power corrupts.” For that reason, when the time came to choose individuals to be assigned to empowering positions, the ancient Athenians resorted to choosing by lot. In ancient Athenian democracy, sortition was therefore the traditional and primary method for appointing political officials, and its use was regarded as a principal characteristic of true democracy.

      Today, sortition is commonly used to select prospective jurors in common law-based legal systems and is sometimes used in forming citizen groups with political advisory power (citizens' juries or citizens' assemblies).

      The mechanics would be something like this: users report a post/comment; when there are enough reports, the system randomly selects 3/5/7/... currently active users and asks them to determine whether the reported post contravenes the rules. The decision is then taken automatically by majority rule.
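
      A toy sketch of those mechanics; the report threshold and panel size are placeholders:

      ```python
      import random

      REPORT_THRESHOLD = 5  # reports needed before a jury is convened
      PANEL_SIZE = 5        # odd, so a majority always exists


      def convene_jury(active_users: list[str], report_count: int) -> list[str]:
          """Once enough reports accumulate, draw a random panel of active users."""
          if report_count < REPORT_THRESHOLD:
              return []
          return random.sample(active_users, PANEL_SIZE)


      def decide(votes: list[bool]) -> bool:
          """Majority rule: True means the panel found the post breaks the rules."""
          return sum(votes) > len(votes) / 2
      ```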

      Why?

      1. It's the only system that scales (to my knowledge). More users mean more content to moderate, but since the users are also the moderators, the system works at any scale. Systems that don't scale lead to all kinds of problems when the number of users becomes large.
      2. It's very robust to manipulation. As moderators are chosen randomly it's very hard to coordinate or try to influence the decisions.
      3. It promotes a participatory attitude and a sense of responsibility in the users. There's no "them against us" (the bad mods against the users).

      21 votes
    11. Mod cultures - What do we want?

      Right now there's a lot of discussion ongoing about community culture, building Tildes' attitudes as a community into something that is solid enough to withstand waves of new users without being disrupted too heavily by newcomers that have yet to learn the culture.

      But what of mod culture?

      This topic isn't only for those that have mod experience; there are plenty of users with experience talking to mods that have their own negative stories. Over on reddit, the actions of one mod team affect the brand image of all mod teams on the entirety of reddit. One bad action by a mod in a default subreddit, backed up by the other mods in that subreddit, becomes (in the eyes of users) the behaviour of all "reddit moderators".

      Often I see mods making things far, far worse by being one of the most combative and hostile in-groups on the site: talking to users the way the worst teacher in school talked to teenagers (as if they were four-year-olds), not listening to anything a user is actually saying, and dismissing them outright simply because one is the user and the other is the moderator. I understand some of it comes from difficult interactions with genuinely toxic individuals who waste enormous quantities of time better put towards other things. However, what I see are moderators approaching every interaction with every user critically, as if they were almost certainly the same old toxic user. This is not the case.

      This is exceptionally important here on Tildes, because it won't be a mistake to let the actions of one moderator colour your image of the other moderators on the site. When the site holds responsibility for moderator actions through oversight and control, the actions of any moderator are going to be considered the actions of the site and of the rest of the mods.

      So, how do we want our mods to talk to users? How do we want them to interact with users? What controls can be put in place to recognize quality moderation? What can stop quippy mods that shut down valid discussion with one-line reductive answers? Etc.

      What is good moderation and what is a good moderator?

      Personally, what I try to apply to my own behaviour is to actually LISTEN to people and act as an equal, or at least present the appearance of listening. The thing that bothers people most is feeling like something they care about is being dismissed.

      What are the many issues that you've seen in moderator behaviour (in front of and behind the scenes), and in what ways can Tildes go about things differently to stop them?

      19 votes
    12. What's the plan for deciding moderation policies that go beyond removing trolls?

      So I noticed the entire front page getting clogged with "question" type posts, ranging from "what are your favorite..." to "pls help me choose..." type posts. This might be mainly due to "activity" sorting (sorting by votes is a little better), but that's still the default and doesn't change the general dominance. I took this screenshot earlier and I did not see a non-question post without scrolling. None of them were from ~talk, either.

      I know people have different views on this, but I remember from my brief time moderating that it's generally a good idea to restrict these types of posts, for the simple reason that people love to dump their "favorite" lists, which makes these types of threads dominate the frontpage, while they tend to always produce the same responses (intuition might suggest they produce great discussion, but that's usually not the case). They're best pushed into specific subreddits (subgroups?).

      I think this is a rather small and specific issue, but it might be a taste of future difficulties with voting/moderation. Banning content for being disruptive/abusive is one thing, but the best places I know for discussion also ban via more subtle rule sets. They take measures (often at the cost of facing a ton of backlash from users seeing their posts removed for "unfair" reasons) that keep one type of post from taking over the frontpage, potentially drowning out more interesting ones. I'm still trying to picture how this would translate from Reddit's moderation model to Tildes'.

      One way would be to open up a subgroup for any sufficiently large category of posts and give moderation the option to move posts to a subgroup that people can opt out of. Another is very diligent tagging and filtering. My concern is that neither could produce the complex, fine-grained type of moderation that distinguishes really good subreddits (yea, they exist!) from spammy ones. "Hide all posts tagged 'question'" could hide "what's your favorite...?" type posts, but also posts that ask a really deep and interesting question. So would you filter "question && favorite"? That turns filtering into almost a scripting job. It doesn't seem reasonable to expect users to put this much effort into content filtering, and it wouldn't help "shape" discussion culture, as the default (no filters?) would keep most users jumping from one "favorite game/band/movie/programming language" post to the next.
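
      To make the "scripting job" point concrete: even the simplest all-of-these-tags filter already reads like code. Purely illustrative:

      ```python
      def hidden_by_filter(post_tags: set[str], required: set[str]) -> bool:
          """Hide a post only if it carries *all* of the filtered tags."""
          return required.issubset(post_tags)


      # "question && favorite" hides list-dumping threads but keeps deep questions:
      hidden_by_filter({"question", "favorite"}, {"question", "favorite"})    # True
      hidden_by_filter({"question", "philosophy"}, {"question", "favorite"})  # False
      ```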

      So far, it seems rules are set site-wide based on mostly removing blatantly off-topic, bad faith or trolling content. As the groups grow, however, I believe it's absolutely vital to also allow more subtle policies (think "only original sources for news articles" or "only direct links to movie trailers", etc). As groups branch off into further subgroups, it might suddenly also be reasonable to have very specific rules like "no more individual posts about hype topic X, keep discussion in the hub thread until Friday".

      The only way I can see this work out (and maybe I lack imagination) is via a "meta" section for each group that allows whoever is decided to be part of the moderator group to decide upon and clearly formulate rules specific to it. It could be a wiki-like thing, it could involve voting on changes, maybe automation via "default tag filters", etc. Other users could see the policies mods have decided upon and maybe even "opt out" from moderation actions being considered in filtering, to have no reason to be paranoid about "censored" content.

      Am I too pessimistic about tagging/voting solving this on its own? Am I too stuck on doing it "the reddit way" (albeit with hopefully better tools)? I just really believe it's subtle moderation like this that might make or break Tildes in the long run.

      TL;DR: How would more subtle or group-specific moderation policies be decided? Just tags+votes? Should there be a "meta" section for each group where mods can agree upon specific rules?

      8 votes
    13. Tumblr unfollowed me from a thousand blogs

      One of my friends said, "hey, why did you unfollow me?" I checked my following list (which is hidden really deep in the GUI) and saw I went from following 2k people (as of when I checked a few months back) to following 600. WHAT HAPPENED? So now I'm freaking out, frantically making sure I didn't lose anyone.

      5 votes
    14. How do you think social networks should handle hate speech?

      A bit of context: in July 2017, Germany implemented the Netzwerkdurchsetzungsgesetz, a law which allows German authorities to fine social media companies with over 2 million users if they persistently fail to remove obvious hate speech within 24 hours, and all other cases within a week. A write-up of the law and background information. Information about the definition of hate speech in Germany.

      I am interested in your opinion: is this governmental overreach that infringes on the freedom of speech, or is it a long-needed step to ensure that people feel safe and that current German law is finally being followed?

      16 votes
    15. Standard procedure to deal with someone that seems like a troll

      There is a user I do not wish to name, to prevent a witch hunt and in case I am wrong. In the past two days I have seen them post two fairly contentious topics, but nothing was wrong with the topics themselves. The user, however, includes a flame-bait sentence in each of these posts, e.g. "I am against homosexual marriage". He then waits for a few heated responses and then edits out the flame-bait sentence.

      This makes it look like an innocuous post is suddenly full of hotheads immediately starting fights based on their assumptions and not on what the user posted.

      How do we deal with what seems like a troll that operates like this? I won't be posting on his posts anymore as you shouldn't feed the troll, but he definitely got me the first time and it's unreasonable to expect everyone to always be on the lookout for this.

      Edit: to everyone saying I am jumping the gun by accusing him of being a troll: that may very well be, which is why I declined to name the user. Even if it's not intentional, it's causing problems if we want this to be a place for high-quality discussion. Messaging @deimos has been suggested as an option and is probably the best choice for now, but it will not scale. What should be our solution to this issue going forward that scales?

      23 votes
    16. The rise of Reddit's megathreads

      I originally posted this as a comment here but thought it might deserve its own discussion.

      I think that the rise of megathreads/ultrathreads/collections of threads on reddit has been a large detriment to the site.

      I'm a mod for a few large subreddits that utilize them (and I know a good portion of people reading Tildes right now are as well), and as time goes on I've started to dislike them more and more.

      At first they were great - they seemed to silo off all the posts and noise that happened around an event, and made the lives of mods easier. Posts that should've been comments could now be removed, and the user could be pointed towards the megathread. Users could go back to the post and sort by new to see new posts, and know that they'd all have to do with that one topic.

      I believe that this silo actually hurts the community, and especially the discussion around that original megathread, more than it helps. As modteams I think we underestimate the resilience of our communities, and their ability to put up with "noise" around an event.

      The fact that we are in a subreddit dedicated to that cause should be silo enough - each post in that subreddit should be treated as an "atomic" piece of information, with the comments being branches. By relegating all conversation to a megathread we turn top level comments into that atomic piece of information, and subcomments into the branches.

      But that's just a poor implementation of the original! There are some edge cases where this might make sense (take /r/politics; it wouldn't make sense to have 9 of the top 10 posts just be slightly reworded posts on the same issues), but I think this can be remedied by better duplication rules (consider all posts on a certain topic to be a repost, unless the new post has new or different information).

      There is something to be said about the ability to generate a new, blank sheet of conversation with a post, that is not marred with previous information or anecdotes. New comments on a megathread post don't have that luxury, but new posts do.

      Additionally, I feel like the way reddit originally conditioned us to treat posts is to view them once and then not check them again (unless we interacted with someone in them or got a notification). This prevents potentially great (but late) content from gaining visibility, as a non-negligible portion of the population will still be browsing the subreddit, but will never click the post again.

      24 votes
    17. Comment tags: suggestions

      I just showed up yesterday to this great experiment, and find myself with some fresh-minted drama over politics and bans to ingest. While I wouldn't presume to propose a solution to the issues raised in and by those threads, I found myself looking to the comment tagging system and finding some space to improve conversation.

      My intent (as I believe is the intent of this community) is to help foster constructive discussion without outright banning inflammatory topics. I believe that simply ignoring controversial issues because of the problems they raise is at best stifling potentially useful discourse and at worst intellectually dishonest.

      Tags I'd like to see:

      • "Citation Requested" As a tag, it would be a more constructive way of saying "I don't believe you"
      • "Disreputable Source" / "Source Disputed" is a civil way of pointing out issues
      • "Reported" would be a tricky implementation, but useful as a way of flagging comments for removal. Should ideally only be applied to eg. doxxing or incitement

      There should also be a moderation feature for removing tags that are no longer relevant or were incorrectly applied. Alternatively, the display of comment tags could depend on a critical mass of "reputation points", which would allow, say, 100 people with 1 "troll-tagging rep" each to get a comment flagged, or 2 people with 50 troll-tagging rep to do so. This of course depends on the reputation system being fleshed out, and carries the very real danger of creating power users.
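
      That reputation-gated display boils down to a weighted threshold. Using the numbers from the example above (everything here is hypothetical):

      ```python
      DISPLAY_THRESHOLD = 100  # critical mass of summed tagging reputation


      def tag_is_displayed(tagger_reputations: list[int]) -> bool:
          """Show the tag once the summed rep of everyone who applied it crosses
          the bar: 100 users with 1 rep, or 2 users with 50 rep, both qualify."""
          return sum(tagger_reputations) >= DISPLAY_THRESHOLD


      assert tag_is_displayed([1] * 100)
      assert tag_is_displayed([50, 50])
      assert not tag_is_displayed([10, 20])
      ```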

      EDIT:

      @jgb pointed out that this is already a lively discussion; see these.

      Tags I missed that came up in other discussions:

      • "Insightful" as a positive, almost a super-upvote
      • "Solved" for a comment that resolves an issue

      And, according to @cfabbro, @deimos is working on a public activity audit that can then be built upon to improve moderation.

      13 votes
    18. What defines a toxic user

      Posting this here because I'm also wondering how this will affect moderation policy on Tildes going forward.

      As a former Reddit Moderator this has been something I've pondered for a long time: how does one define what a toxic user is in such a way that it can be easily understood as a community standard? I'll post the definition I defaulted to below. But I'd be most interested in knowing how other people think about this.

      26 votes
    19. Mod tools growing with user 'tools'

      So, I'm new here and looking around, but I haven't seen this addressed yet (though I could be wrong! Happy to be linked if I missed something).

      One common failure I've seen in online communities of various sorts is that moderation tools aren't grown in parallel with user tools and abilities; rather, they lag behind and are often, in the end, built by third parties. This is the case with Reddit, but also in a bunch of other areas (e.g. online gaming, where admin tools were often built to provide functionality that users realised was needed but the makers did not).

      I get the impression there are plenty of reddit mods here, so can we discuss the key features needed to moderate communities that would be better built in than coming from third-party tools (RES, toolbox)? A lot of these aren't needed with 100 users, but with a million they become pretty crucial.

      My initial thoughts:

      • Something not dissimilar to the automod
      • Group user tagging (shared tagging visible to all mods, tags can be linked to specific discussions/comments)
      • Ability to reply as a 'tilde' not as an individual
      • Ability to have canned responses/texts (for removals, for replies to user contacts)
      • Some sort of ticket-like system for dealing with user contacts to mods (take inspiration from helpdesk ticket systems)
      • (added) Space per tilde for storage (tags, notes, bans, canned text, etc.) of reasonable size

      Plenty more to add I am sure but wanted to open the discussion.
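
      To illustrate a couple of the items above (group user tagging and per-tilde storage), here is a rough sketch; every name is invented:

      ```python
      from dataclasses import dataclass, field


      @dataclass
      class SharedUserTag:
          username: str
          label: str  # e.g. "repeat rule-breaker", visible to all mods of the group
          linked_comments: list[str] = field(default_factory=list)  # permalinks


      @dataclass
      class GroupModSpace:
          """Per-tilde storage for tags, canned responses, notes, and the like."""
          group: str
          user_tags: list[SharedUserTag] = field(default_factory=list)
          canned_responses: dict[str, str] = field(default_factory=dict)

          def tags_for(self, username: str) -> list[SharedUserTag]:
              return [t for t in self.user_tags if t.username == username]
      ```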

      10 votes
    20. The future of moderation on Tildes

      It seems like a large percentage of us are also moderators on Reddit, myself included.

      It seems that there's a generally negative attitude toward moderators on Reddit, which I totally get. Moderation on Reddit is flawed. Community members feel a sense of ownership in the community (which they should have), but bad moderators can ruin that. How do you guys think moderation should be handled here?

      Here's a link from the docs that describes current plans: https://docs.tildes.net/mechanics-future

      It highlights plans for a reputation system, which I think is the right way to go.

      I also just realized that the same discussion was posted 18 days ago, but perhaps discussion with some of the newer users is worthwhile nonetheless:
      https://tildes.net/~tildes/6e/community_moderators

      31 votes