Showing only topics with the tag "moderation".
    1. I just came across this field of 13 admin-removed comments and frankly it left me feeling rather unsettled. That's a lot of content to just nuke all at once. Contextually, the thread up to that point was genial and non-controversial, so it seems especially odd that there's just this black hole there. What struck me most was how opaque the moderation was. There is no indication of what kind of content was removed, why it was removed, who removed it, or when it happened.

      Then I scrolled down and at the very bottom I found what I guess is meant to address these concerns, a comment from Deimos:

      Sigh, I saw this thread was active and thought it was going to have an actual on-topic discussion in it. Let's (mostly) start this over.

      It's not always clear online, so I want to say that I'm not rage-posting or bellyaching about censorship or any of the usual drama that tends to crop up on sites like Tildes from time to time. I trust Deimos' moderation and give this the benefit of the doubt. What I'm actually doing, I guess, is making a feature request about better annotation for removed comments.

      Would it make sense to show a note (like Deimos' comment) in-thread, at the position of the deleted content, instead of down at the bottom of the page, unattached to anything relevant? In my opinion, some kind of "reason" message should always be provided with any moderation activity as a matter of course, even if it's just boilerplate text chosen from a dropdown menu.

      Also, would a single bulk annotation for all of the related removals make for better UX than 13 separate ones? I think that would be both easier to read and easier for Deimos to generate on the backend.

      I feel like we may have had this conversation previously, but I couldn't find it. Apologies if I'm beating a dead horse.

      14 votes
    2. I often stumble into threads with entire comment chains deleted. I assume most people here have faced the same situation, either here or on reddit.

      I'd like to see a move to locking comments rather than deleting them by default. That would mean no further replies to the comment or any other comment in that chain, no one being able to delete or edit their comments, no one being able to add or remove votes on a comment, etc.
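
      A minimal sketch of what locking a whole chain could mean, in Python (entirely hypothetical; this is not Tildes' actual data model):

      ```python
      from dataclasses import dataclass, field


      @dataclass
      class Comment:
          author: str
          text: str
          locked: bool = False  # set on the whole chain instead of deleting
          children: list["Comment"] = field(default_factory=list)

          def lock_chain(self) -> None:
              """Lock this comment and everything beneath it: no new
              replies, no edits or deletions, no vote changes."""
              self.locked = True
              for child in self.children:
                  child.lock_chain()

          def reply(self, comment: "Comment") -> None:
              if self.locked:
                  raise PermissionError("comment chain is locked")
              self.children.append(comment)
      ```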

      I understand that for particularly egregious comments removal is completely necessary (especially when it goes hand-in-hand with banning users), but a lot of the time comments are deleted as a means to prevent long argumentative back-and-forth chains that spam and derail topics, as well as antagonize users.

      In a lot of cases I feel like deleting the comment only further serves to hide what is unacceptable behaviour (even if that behaviour should be obvious), rather than setting an example for the userbase.

      31 votes
    3. Proposal: Create a new page where all users can view all moderation actions. This would make transparency a core part of the platform, hopefully avoiding any misunderstandings about mod actions.

      A new page, maybe called tildes.net/moderation, would be available to all registered users. I am not sure where the link to it should appear on the site; maybe in the user's profile sidebar?

      This page contains a table of all moderation actions. The actions may include: deleted topics, deleted comments, tag modification, moved topics, edited topic titles, banned users, locked topics. (This raises the question of what the full set of possible mod actions is; they would need to be codified.)

      Very roughly, the table columns might include: Date, User (being mod'ed), Mod Action (from a list of possible mod actions), Mod Action Reason (either a text field or a list of possible reasons for this action), Link (null if the action is a deleted topic).
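
      Equally roughly, here is a sketch of that row shape in Python; every name is hypothetical and nothing here reflects Tildes' real schema:

      ```python
      from dataclasses import dataclass
      from datetime import datetime
      from enum import Enum
      from typing import Optional


      class ModAction(Enum):
          """The codified set of possible mod actions."""
          DELETED_TOPIC = "deleted topic"
          DELETED_COMMENT = "deleted comment"
          TAG_MODIFICATION = "tag modification"
          MOVED_TOPIC = "moved topic"
          EDITED_TOPIC_TITLE = "edited topic title"
          BANNED_USER = "banned user"
          LOCKED_TOPIC = "locked topic"


      @dataclass
      class ModLogEntry:
          """One row of the proposed tildes.net/moderation table."""
          date: datetime
          user: str                   # the user being mod'ed
          action: ModAction
          reason: str                 # free text, or chosen from preset reasons
          link: Optional[str] = None  # null if the action is a deleted topic
      ```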

      I think that the user who did the moderating should not be publicly listed for now, to avoid drama?


      Some of the related topics (please make a top-level comment with any others):

      Could we have a stickied list of all bans with reasons included?

      Daily Tildes discussion - our first ban


      Please vote for the comment which best reflects your position on this proposal.
      As a bonus question, please make a top-level comment if you have a general comment about my format of voting on comments. Would you prefer a straw poll on a 3rd party platform? Is there a cleaner way to do this?

      Edit: added "banned user" to actions list, I probably missed others, let me know. Also added the obvious locked topics.

      23 votes
    4. There's been some discussion around tags since users were given tag-editing privileges, such as Tag Use and Article Tags.

      I've noticed a few things about tags, and rather than make a topic for each one I thought I'd make a few top-level comments instead, hopefully with others doing the same for anything tag-related they'd like to see discussed.

      20 votes
    5. Tildes' code of conduct says:

      Do not post anyone's sensitive personal information (related to either their real world or online identity) with malicious intent.

      Can you change that to just say don't post personal info? Even if it's not done with malicious intent, it should still be removed to protect people's privacy.

      Also, while the Tildes terms of service does say not to post spam, I think it should say that in the code of conduct too.

      Edit: I mean posting personal info without consent, not public information.

      Telling someone how to contact a company would be fine but not posting someone's address.

      12 votes
    6. In the interest of transparency (and a little bit of statistics) it would be really cool to have a master banlist, or at least a thread with links to all ban-worthy posts. This would help new users understand what isn't acceptable in the community and allow for community discussion on what could be considered an unjustified ban or a weird influx of bad behavior. This wouldn't be super viable when the site goes public, but would be a neat implementation in Tildes' alpha state.

      14 votes
    7. I don't want to get too high in the clouds with moderating philosophy. Instead I want to talk about action steps that can be taken in the very near term to improve moderating. Especially so long as Deimos is the only one with most of the moderating tools at his disposal, I think it's crucial to make sure moderating is as painless as possible.

      So far it looks like Deimos has these moderating tools available to him:

      1. User bans
      2. Comment removal
      3. Thread locking/removal
      4. Title/tag editing (and this ability is shared by many of us as well)

      Am I missing anything?

      The next three tools I hope are coming are:

      • A reporting mechanism, where users can report comments and threads that they think should be removed.
      • A feedback mechanism for reports, telling users that a report they gave was acted on.
      • A note-taking system for the moderator-type person, shareable with all other moderator-type persons at that level, with an expiration date probably around 30 days.

      Now I'll talk about why. First, the reporting mechanism. While it's still possible to keep up with everything that gets posted, I don't necessarily think it's the best use of Deimos' time to read literally everything, especially as the site expands its userbase and presumably its activity level and depth. The reporting system at first should probably just be a button, maybe eventually with a pop-up field allowing the user to give a brief description of why they're reporting, and a queue that gets populated with comments and threads that get reported.

      Coinciding with a report queue/option should probably be an easy, rudimentary system for providing feedback to those whose reports led to moderating action. At first, an automated message saying something like "Thank you for reporting recently. Action has been taken on one of your recent reports", without any relevant links, would do fine, and we can leave the particulars of how much detail to add for later discussions.
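
      A rough sketch of that queue-plus-acknowledgment flow; every name here is invented for illustration, and `notify` stands in for whatever messaging mechanism the site has:

      ```python
      from dataclasses import dataclass, field
      from datetime import datetime
      from typing import Callable


      @dataclass
      class Report:
          reporter: str
          target_id: str         # the reported comment or thread
          description: str = ""  # optional pop-up field text
          filed: datetime = field(default_factory=datetime.utcnow)


      report_queue: list[Report] = []


      def file_report(reporter: str, target_id: str, description: str = "") -> None:
          """The report button: push onto the queue the moderator works through."""
          report_queue.append(Report(reporter, target_id, description))


      def act_on(report: Report, notify: Callable[[str, str], None]) -> None:
          """After a report leads to action, send the boilerplate acknowledgment."""
          report_queue.remove(report)
          notify(report.reporter,
                 "Thank you for reporting recently. "
                 "Action has been taken on one of your recent reports.")
      ```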

      The last thing I think should help things considerably in the immediate term is a time-limited user tracking tool for the moderator-type person. As things scale, it isn't always going to be feasible to use mental bandwidth remembering each username and the relevant history associated with their behavior. A good note-taking tool with an auto-timed expiration date on notes would be a sensible way to address what can easily become a hugely mentally taxing role at almost any scale. This tool should let Deimos take a discrete note for himself (and other moderators at that permission level and higher) connected to a user regarding any questionable threads or comments that were yellow/red flags, or any other moderator action taken against a user within the last X days/months (the particulars don't matter to me as much as that there is an expiration date on these notes). This should let the moderator-type person focus on the broader history of the users they're looking at before making a decision, without having to go searching for every relevant comment from the past 30 days. Fewer problematic users at scale should fall through the cracks, and more users that might just be having a bad day can be let off with comment removals and/or warnings.
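
      As an illustration of the expiring-notes idea (the names and the 30-day window are placeholders, not anything Tildes has committed to):

      ```python
      from dataclasses import dataclass, field
      from datetime import datetime, timedelta

      NOTE_LIFETIME = timedelta(days=30)  # placeholder expiry window


      @dataclass
      class ModNote:
          username: str
          text: str  # e.g. a link to a yellow/red-flag comment
          created: datetime = field(default_factory=datetime.utcnow)

          def expired(self, now: datetime) -> bool:
              return now - self.created > NOTE_LIFETIME


      class ModNotebook:
          """Per-user notes, shareable with mods at the same level or higher."""

          def __init__(self) -> None:
              self._notes: dict[str, list[ModNote]] = {}

          def add(self, note: ModNote) -> None:
              self._notes.setdefault(note.username, []).append(note)

          def recent_history(self, username: str) -> list[ModNote]:
              """Return only unexpired notes, pruning anything older."""
              now = datetime.utcnow()
              kept = [n for n in self._notes.get(username, [])
                      if not n.expired(now)]
              self._notes[username] = kept
              return kept
      ```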

      Are these priorities fair? Are there design elements you would want to see in the immediate term that would help reduce the burden of moderating? Are there problems with these tools I'm suggesting that you would want to see addressed?

      19 votes
    8. Since the moderator community here is quite large, I figure we would have quite a lot of interesting perspectives here on Tildes. Feel free to chip in even if you're not a moderator, or, god forbid, moderate such subs as T_D. Having a range of perspectives is, as always, the most valuable aspect of any discussion.

      Here are some baseline questions to get you started:

      • Did your subreddit take strict measures to maintain quality, à la r/AskHistorians, or was it a karmic free-for-all like r/aww?

      • Do you think the model was an appropriate fit for your sub? Was it successful?

      • What were the challenges faced in trying to maintain a certain quality standard (or not maintaining one at all)?

      • Will any of the lessons learnt on Reddit be applicable here in Tildes?

      31 votes
    9. So far, I haven’t seen too much moderation aside from bans, etc. dealt out by the admins (if a moderation system is currently in place, please correct me), but how will this work once Tildes is fully released to the public? Will people who show interest in a certain community be reached out to and asked?

      13 votes
    10. In governance, sortition (also known as allotment or demarchy) is the selection of political officials as a random sample from a larger pool of candidates. The logic behind the sortition process originates from the idea that “power corrupts.” For that reason, when the time came to choose individuals to be assigned to empowering positions, the ancient Athenians resorted to choosing by lot. In ancient Athenian democracy, sortition was therefore the traditional and primary method for appointing political officials, and its use was regarded as a principal characteristic of true democracy.

      Today, sortition is commonly used to select prospective jurors in common law-based legal systems and is sometimes used in forming citizen groups with political advisory power (citizens' juries or citizens' assemblies).

      The mechanics would be something like this: users report a post/comment; when there are enough reports, the system randomly selects 3/5/7/... currently active users and asks them to determine whether the reported post contravenes the rules. The decision is then taken automatically by majority rule.
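
      A minimal sketch of those mechanics, assuming a jury of 5 and treating the report threshold as a placeholder:

      ```python
      import random

      REPORT_THRESHOLD = 5  # reports needed before a jury is convened (assumed)
      JURY_SIZE = 5         # odd, so a strict majority always exists


      def convene_jury(active_users: list[str], reporters: set[str]) -> list[str]:
          """Randomly select jurors from currently active users,
          excluding anyone who filed one of the reports."""
          pool = [u for u in active_users if u not in reporters]
          return random.sample(pool, JURY_SIZE)


      def remove_post(juror_votes: list[bool]) -> bool:
          """True (remove the reported post) on a strict majority."""
          return sum(juror_votes) > len(juror_votes) / 2
      ```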

      Why?

      1. It's the only system that scales (to my knowledge). More users mean more content to moderate, but since the users are also the moderators, the system works at any scale. Systems that don't scale lead to all kinds of problems when the number of users becomes large.
      2. It's very robust to manipulation. As moderators are chosen randomly, it's very hard to coordinate or try to influence the decisions.
      3. It promotes a participatory attitude and a sense of responsibility in the users. There's no "them against us" (the bad mods against the users).

      18 votes
    11. Right now there's a lot of discussion ongoing about community culture: building Tildes' attitudes as a community into something solid enough to withstand waves of new users without being disrupted too heavily by newcomers that have yet to learn the culture.

      But what of mod culture?

      This topic isn't only for those that have mod experience; there are plenty of users with experience talking to mods that have their own negative stories. Over on reddit, the actions of one mod team affect the brand image of all mod teams on the entirety of reddit. One bad action by a mod in a default subreddit, backed up by the other mods in that subreddit, becomes (in the eyes of users) the behaviour of all "reddit moderators".

      Often I see mods making things far, far worse by being one of the most combative and hostile in-groups on the site: talking to users the way the worst teacher in school talked to teenagers, as if they were 4-year-olds, not listening to anything a user is actually saying, and dismissing them outright because they're the user and they're the moderator. I understand some of it comes from difficult interactions with genuinely toxic individuals that waste enormous quantities of time better put towards other things. However, what I see are moderators approaching every interaction with every user with criticism, as if they are almost certainly the same old toxic user. This is not the case.

      This is exceptionally important here on Tildes, because it won't be a mistake to take the actions of one moderator and let them colour your image of other moderators on the site. When the site holds responsibility for moderator actions through oversight and control, the actions of any moderator are going to be considered the actions of the site and of the rest of the mods.

      So, how do we want our mods to talk to users? How do we want them to interact with users? What controls can be put in place to appreciate quality moderation? What can stop quippy mods who shut down valid discussion with one-line reductive answers? Etc.

      What is good moderation and what is a good moderator?

      Personally, what I try to apply to my own behaviour is to actually LISTEN to people and act as an equal, or at least present the appearance of listening. The thing that bothers people most is feeling like something they care about is being dismissed.

      What are the many issues that you've seen in moderator behaviour (in front of and behind the scenes), and in what ways can Tildes go about things differently to stop them?

      19 votes
    12. So I noticed the entire front page getting clogged with "question" type posts, ranging from "what are your favorite..." to "pls help me choose..." type posts. This might be mainly due to "activity" sorting (sorting by votes is a little better), but that's still the default and doesn't change the general dominance. I took this screenshot earlier and I did not see a non-question post without scrolling. None of them were from ~talk, either.

      I know people have different views on this, but I remember from my brief time moderating that it's generally a good idea to restrict these types of posts, for the simple reason that people love to dump their "favorite" lists, which makes these types of threads dominate the frontpage, while they tend to always produce the same responses (intuition might suggest they produce great discussion, but that's usually not the case). They're best pushed into specific subreddits (subgroups?).

      I think this is a rather small and specific issue, but it might be a taste of future difficulties with voting/moderation. Banning content for being disruptive/abusive is one thing, but the best places I know for discussion also ban via more subtle rule sets. They take measures (often at the cost of facing a ton of backlash from users seeing their posts removed for "unfair" reasons) that keep one type of post from taking over the frontpage, potentially drowning out more interesting ones. I'm still trying to picture how this would translate from Reddit's moderation model to Tildes'.

      One way would be to open up a subgroup for any sufficiently large category of posts and give moderation the option to move posts to a subgroup that people can opt out from. Another is very diligent tagging and filtering. My concern is that neither could produce the complex, fine-grained type of moderation that distinguishes really good subreddits (yea, they exist!) from spammy ones. "Hide all posts tagged 'question'" could hide "what's your favorite...?" type posts, but also posts that ask a really deep and interesting question. So would you filter "question && favorite"? That turns filtering into almost a scripting job. It doesn't seem reasonable to expect users to put this much effort into content filtering, and it wouldn't help "shape" discussion culture, as the default (no filters?) would keep most users jumping from one "favorite game/band/movie/programming language" post to the next.
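
      To make the "scripting job" concern concrete, a filter like "question && favorite" is essentially a subset test over a post's tag set (a sketch, not an actual Tildes feature):

      ```python
      def hidden_by_filter(tags: set[str], required: set[str]) -> bool:
          """A "question && favorite" style filter: hide a post only when it
          carries every tag in the filter, not just one of them."""
          return required <= tags  # subset test


      posts = [
          {"title": "What's your favorite game?", "tags": {"question", "favorite"}},
          {"title": "A deep question about ethics", "tags": {"question"}},
      ]

      visible = [p for p in posts
                 if not hidden_by_filter(p["tags"], {"question", "favorite"})]
      # keeps the deep question, hides the favorites-list post
      ```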

      So far, it seems rules are set site-wide based on mostly removing blatantly off-topic, bad faith or trolling content. As the groups grow, however, I believe it's absolutely vital to also allow more subtle policies (think "only original sources for news articles" or "only direct links to movie trailers", etc). As groups branch off into further subgroups, it might suddenly also be reasonable to have very specific rules like "no more individual posts about hype topic X, keep discussion in the hub thread until Friday".

      The only way I can see this working out (and maybe I lack imagination) is via a "meta" section for each group that allows whoever is decided to be part of the moderator group to decide upon and clearly formulate rules specific to it. It could be a wiki-like thing, it could involve voting on changes, maybe automation via "default tag filters", etc. Other users could see the policies mods have decided upon and maybe even "opt out" from moderation actions being considered in filtering, to have no reason to be paranoid about "censored" content.

      Am I too pessimistic about tagging/voting solving this on its own? Am I too stuck on doing it "the reddit way" (albeit with hopefully better tools)? I just really believe it's subtle moderation like this that might make or break Tildes in the long run.

      TL;DR: How would more subtle or group-specific moderation policies be decided? Just tags+votes? Should there be a "meta" section for each group where mods can agree upon specific rules?

      8 votes
    13. I just showed up yesterday to this great experiment, and find myself with some fresh-minted drama over politics and bans to ingest. While I wouldn't presume to propose a solution to the issues raised in and by those threads, I found myself looking to the comment tagging system and finding some space to improve conversation.

      My intent (as I believe is the intent of this community) is to help foster constructive discussion without outright banning inflammatory topics. I believe that simply ignoring controversial issues because of the problems they raise is at best stifling potentially useful discourse and at worst intellectually dishonest.

      Tags I'd like to see:

      • "Citation Requested" As a tag, it would be a more constructive way of saying "I don't believe you"
      • "Disreputable Source" / "Source Disputed" is a civil way of pointing out issues
      • "Reported" would be a tricky implementation, but useful as a way of flagging comments for removal. Should ideally only be applied to eg. doxxing or incitement

      There should also be a moderation feature for removing tags that are no longer relevant or were incorrectly applied. Alternatively, the display of comment tags could be reliant upon a critical mass of "reputation points", which would allow, say, 100 people with 1 "troll-tagging rep" to get a comment flagged, or 2 people with 50 troll-tagging rep to do so. This is of course dependent upon the reputation system being fleshed out, and has the very real danger of creating power users.
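
      That reputation-weighted display reduces to a simple sum: a tag shows once the combined rep of its taggers crosses a fixed bar (a sketch using the numbers from the example above):

      ```python
      DISPLAY_THRESHOLD = 100  # total rep needed before a tag is displayed


      def tag_visible(tagger_reps: list[int]) -> bool:
          """100 users with 1 troll-tagging rep, or 2 users with 50,
          both clear the same bar."""
          return sum(tagger_reps) >= DISPLAY_THRESHOLD


      assert tag_visible([1] * 100)     # 100 users x 1 rep
      assert tag_visible([50, 50])      # 2 users x 50 rep
      assert not tag_visible([10, 20])  # 30 total: not shown
      ```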

      EDIT:

      @jgb pointed out that this is a lively discussion; see these.

      Tags I missed that came up in other discussions:

      • "Insightful" as a positive, almost a super-upvote
      • "Solved" for a comment that resolves an issue

      And, according to @cfabbro, @deimos is working on a public activity audit that can then be built upon to improve moderation.

      13 votes
    14. Posting this here because I'm also wondering how this will affect moderation policy on Tildes going forward.

      As a former Reddit moderator, I've pondered this for a long time: how does one define what a toxic user is in such a way that it can be easily understood as a community standard? I'll post the definition I defaulted to below. But I'd be most interested in knowing how other people think about this.

      26 votes
    15. So, new here and looking around, but I haven't seen this addressed yet (though I could be wrong! Happy to be linked if I missed something).

      One common failure I've seen in online communities of various sorts is that moderation tools don't get grown in parallel with user tools and abilities; rather, they lag behind and are often, in the end, built by third parties. This is the case with Reddit, but also in a bunch of other areas (e.g. online gaming, where admin tools were often built to provide functionality that users realised was needed but makers did not).

      I get the impression there are plenty of reddit mods here, so can we discuss the key features needed to moderate communities that would be better built in than coming from third-party tools (RES, toolbox)? A lot of these aren't needed with 100 users, but with a million they become pretty crucial.

      My initial thoughts:

      • Something not dissimilar to the automod
      • Group user tagging (shared tagging visible to all mods, tags can be linked to specific discussions/comments)
      • Ability to reply as a 'tilde' not as an individual
      • Ability to have canned responses/texts (for removals, for replies to user contacts)
      • Some sort of ticket-like system for dealing with user contacts to mods (take inspiration from helpdesk ticket systems)
      • (added) space per tilde for storage (tags, notes, bans, canned text etc) of reasonable size.

      Plenty more to add I am sure but wanted to open the discussion.

      10 votes
    16. It seems like a large percentage of us are also moderators on Reddit, myself included.

      It seems that there's a generally negative attitude toward moderators on Reddit, which I totally get. Moderation on Reddit is flawed. Community members feel a sense of ownership in the community (which they should have), but bad moderators can ruin that. How do you guys think moderation should be handled here?

      Here's a link from the docs that describes current plans: https://docs.tildes.net/mechanics-future

      It highlights plans for a reputation system, which I think is the right way to go.

      I also just realized that the same discussion was posted 18 days ago, but perhaps discussion with some of the newer users is worthwhile nonetheless:
      https://tildes.net/~tildes/6e/community_moderators

      32 votes