  • Showing only topics with the tag "moderation"
    1. Crazy idea to help stop the spreading of untruthful news

      One of the main issues with news on social media is the spread of fake or false news. This happens on every platform that allows news sharing, and if Tildes continues to gain popularity, it will likely happen here too. I had an idea: what if Tildes had a group of fact checkers who verify whether news is truthful, and block posts that link to untrustworthy news sites? It could work like a three-strikes rule: if a news source has three articles posted that contain misinformation, the source would be blocked (and the posts removed).

      This is just an idea, feel free to highlight any issues with it.
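      The three-strikes rule described above could be sketched roughly as follows. This is a minimal illustration only: the class, names, and threshold are hypothetical and do not reflect any actual Tildes code.

```python
# Hypothetical sketch of a three-strikes rule for news sources.
STRIKE_LIMIT = 3

class SourceTracker:
    """Tracks confirmed-misinformation strikes per news domain."""

    def __init__(self, limit=STRIKE_LIMIT):
        self.limit = limit
        self.strikes = {}      # domain -> number of confirmed-false articles
        self.blocked = set()   # domains that reached the limit

    def record_strike(self, domain):
        """Called when fact checkers confirm an article was misinformation."""
        self.strikes[domain] = self.strikes.get(domain, 0) + 1
        if self.strikes[domain] >= self.limit:
            self.blocked.add(domain)

    def is_blocked(self, domain):
        return domain in self.blocked

tracker = SourceTracker()
for _ in range(3):
    tracker.record_strike("example-fake-news.test")
```

      One open design question such a sketch makes visible: whether a strike should apply to the domain or to the individual article, and who adjudicates the fact check.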

      10 votes
    2. Looking for opinions on how to moderate a community

      Hello.
      I moderate a reddit sub with about 450 thousand people, and we have had trouble with transgender people facing abuse from idiots in two different threads. In one of them, a woman chimed in and it got ugly (4 bans in the first 12 comments); in the other, a trans woman took part and got shit for it (that thread also saw a few users banned).

      Now, each of them had a very different approach. The first got defensive and stopped participating, while the second took the time to respond to the stupid but not offensive ones, trying to educate them.

      So even though this is something that bothers me a lot and makes me considerably angry, I realised that maybe I should take a more nuanced view, and actually ask for more opinions on how to handle this, instead of simply applying my own standards and maybe making things worse and/or missing a chance to make things better. And since Tildes has always provided me with intelligent, thoughtful and interesting points of view and opinions, I thought this would be the best place for this question.

      And so here I am, asking anyone who would care to give an opinion: what would a good moderator do? How harsh or lenient should we be with ignorant but not offensive comments? Should we get involved at all if the discussion is not offensive? What would make our sub a nicer place for everyone? Any other thoughts?

      Thank you very much to all.

      20 votes
    3. Should deleting comments be the standard behaviour, or can we consider a less censored approach by default?

      I often stumble into threads with entire comment chains deleted. I assume most people here have faced the same situation, either here or on reddit.

      I'd like to see a move to locking comments rather than deleting them by default. That would mean no further replies to the comment or any other comment in that chain, no one being able to delete or edit their comments, no one being able to add or remove votes to a comment, etc.

      I understand for particularly egregious comments removal is completely necessary (especially when it goes hand-in-hand with banning users), but a lot of times comments are deleted as a means to prevent long argumentative back-and-forth chains that spam and derail topics, as well as antagonize users.

      In a lot of cases I feel like deleting the comment only further serves to hide what is unacceptable behaviour (even if that behaviour should be obvious), rather than setting an example for the userbase.
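      The difference between locking and deleting can be sketched as a distinct comment state. This is purely illustrative; the states, class, and method names are hypothetical, not Tildes' actual data model.

```python
# Illustrative sketch: "locked" freezes a comment, "deleted" removes it.
class Comment:
    def __init__(self, text):
        self.text = text
        self.state = "visible"   # "visible" | "locked" | "deleted"
        self.votes = 0

    def lock(self):
        """Freeze the comment: text stays readable, nothing can change."""
        self.state = "locked"

    def delete(self):
        """Remove the comment entirely; readers lose the context."""
        self.state = "deleted"
        self.text = None

    def reply(self, text):
        if self.state != "visible":
            raise PermissionError("no replies to locked or deleted comments")
        return Comment(text)

    def vote(self):
        if self.state != "visible":
            raise PermissionError("votes on this comment are frozen")
        self.votes += 1

c = Comment("an argumentative comment")
c.lock()
assert c.text is not None  # still readable, unlike a deleted comment
```

      The point of the sketch is that a locked comment keeps its text visible as a record of unacceptable behaviour, while a deleted one hides it.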

      30 votes
    4. Feature proposal: Real-time moderation transparency page (vote in comments)

      Proposal:
      Create a new page where all users can view all moderation actions. This would make transparency a core part of the platform, hopefully avoiding any misunderstandings about mod actions.

      A new page, maybe called tildes.net/moderation, is available to all registered users. I am not sure where the link should appear on the site; maybe in the user's profile sidebar?

      This page contains a table of all possible moderation actions. The actions may include: deleted topics, deleted comments, tag modification, moved topics, edited topic titles, banned user, locked topics. (This raises the question of what the possible mod actions are; they would need to be codified.)

      Very roughly, the table columns might include: Date, User (being mod'ed), Mod Action (a list of possible mod actions), Mod Action Reason (either a text field or a list of possible reasons for this action), Link (null if the action is a deleted topic).

      I think the user who did the moderating should not be publicly listed for now, to avoid drama.
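      One row of the proposed table could be sketched as a simple record type. Everything here is a guess derived from the proposal above (field names, the action list, the example values); it deliberately omits the acting moderator, per the note about avoiding drama.

```python
# Hypothetical sketch of one moderation-log entry for the proposed page.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Candidate set of codified mod actions, taken from the proposal text.
MOD_ACTIONS = {
    "deleted_topic", "deleted_comment", "tag_modification",
    "moved_topic", "edited_topic_title", "banned_user", "locked_topic",
}

@dataclass
class ModLogEntry:
    date: datetime
    user: str                 # the user being moderated
    action: str               # one of MOD_ACTIONS
    reason: str               # free text, or a codified reason
    link: Optional[str]       # None if the action deleted the topic

    def __post_init__(self):
        if self.action not in MOD_ACTIONS:
            raise ValueError(f"unknown mod action: {self.action}")

entry = ModLogEntry(
    date=datetime(2018, 6, 1),
    user="example_user",
    action="locked_topic",
    reason="repetitive argument chain",
    link="/~tildes/123",
)
```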


      Some of the related Topics: (please make a top-level comment with any others)

      Could we have a stickied list of all bans with reasons included?

      Daily Tildes discussion - our first ban


      Please vote for the comment which best reflects your position on this proposal.
      As a bonus question, please make a top-level comment if you have general comment about my format of voting on comments. Would you prefer a straw poll on a 3rd party platform? Is there a cleaner way to do this?

      Edit: added "banned user" to actions list, I probably missed others, let me know. Also added the obvious locked topics.

      23 votes
    5. Let's discuss tags, again

      There's been some discussion around tags since users were given tag-editing privileges, such as Tag Use and Article Tags.

      I've noticed a few things about tags and rather than make a topic for each one I thought I'd make a few top level comments instead, hopefully with others doing the same for anything tag related they'd like to see discussed.

      20 votes
    6. Tildes code of conduct

      Tildes code of conduct says

      Do not post anyone's sensitive personal information (related to either their real world or online identity) with malicious intent.

      Can you change that to just say don't post personal info? Even if it's not done with malicious intent it should still be removed to protect people's privacy.

      Also, while the Tildes terms of service do say not to post spam, I think the code of conduct should say that as well.

      Edit: I mean posting personal info without consent and not public information.

      Telling someone how to contact a company would be fine but not posting someone's address.

      12 votes
    7. Could we have a stickied list of all bans with reasons included?

      In the interest of transparency (and a little bit in statistics) it would be really cool to have a master banlist or at least a thread with links to all ban-worthy posts. This would help new users understand what isn't acceptable in the community and allow for community discussion on what could be considered an unjustified ban or a weird influx of bad behavior. This wouldn't be super viable when the site goes public, but would be a neat implementation in Tildes' alpha state.

      14 votes
    8. Moderator tools: what do you have and what should be the immediate priorities?

      I don't want to get too high in the clouds with moderating philosophy. Instead I want to talk about action steps that can be taken in the very near term to improve moderating. Especially as long as Deimos is the only one with most of the moderating tools at his disposal, I think it's crucial to make moderating as painless as possible.

      So far it looks like Deimos has these moderating tools available to him:

      1. User bans
      2. Comment removal
      3. Thread locking/removal
      4. Title/tag editing (and this ability is shared by many of us as well)

      Am I missing anything?

      The three next tools I would hope are coming next are:

      • A reporting mechanism, where users can report comments and threads that they think should be removed.
      • A feedback mechanism for reports, telling users that a report they gave was acted on.
      • A note taking system for the moderator-type person, shareable with all other moderator-type persons at that level, with an expiration date probably around 30 days.

      Now I'll talk about why. First, the reporting mechanism. While it's still possible to keep up with everything that gets posted, I don't necessarily think it's the best use of Deimos' time to read literally everything, especially as the site expands its userbase and presumably activity level and depth. The reporting system at first should probably just be a button, maybe eventually with a pop-up field allowing the user to give a brief description of why they're reporting, and a queue that gets populated with comments and threads that get reported.

      Coinciding with a report queue/option should probably be an easy, rudimentary system for providing feedback to those whose reports led to moderating action. At first, an automated message saying something like "thank you for reporting recently. Action has been taken on one of your recent reports" without any relevant links would do fine, and we can leave the particulars of how much detail to add for later discussions.
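      The report queue plus automated feedback described above could be sketched like this. The class, queue layout, and message wording are all illustrative assumptions, not a real design.

```python
# Minimal sketch of a report queue with automated reporter feedback.
from collections import deque

class ReportQueue:
    def __init__(self):
        self.pending = deque()   # reports awaiting moderator review
        self.outbox = []         # (username, message) feedback to send

    def report(self, reporter, item, description=""):
        """A user presses the report button, optionally with a description."""
        self.pending.append({"reporter": reporter, "item": item,
                             "description": description})

    def act_on_next(self):
        """A moderator handles the oldest report; feedback is queued."""
        report = self.pending.popleft()
        self.outbox.append((report["reporter"],
                            "Thank you for reporting recently. Action has "
                            "been taken on one of your recent reports."))
        return report

q = ReportQueue()
q.report("alice", "comment/42", "personal attack")
handled = q.act_on_next()
```

      Note the feedback message deliberately omits links to the particulars, matching the suggestion to defer that level of detail.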

      The last thing I think should help things considerably in the immediate term is a time-limited user tracking tool for the moderator-type person. As things scale, it isn't always going to be feasible to use mental bandwidth remembering each username and the relevant history associated with their behavior. A good note-taking tool with an auto-timed expiration date on notes would be a good way to address what can easily become a hugely mentally taxing role at almost any scale. This tool should let Deimos take a discrete note for himself (and other moderators at that permission level and higher) connected to a user regarding any questionable threads or comments that were yellow/red flags, or any other moderator action taken against a user within the last X days/months (the particulars don't matter to me as much as that there is an expiration date to these notes). This should let the moderator type person focus on the broader history of the users they're looking at before making a decision, without having to go searching for every relevant comment from the past 30 days. Fewer problematic users at scale should fall through the cracks and more users that might just be having a bad day can be let off with comment removals and/or warnings.
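      The expiring note-taking tool could look something like the sketch below. The class, the 30-day default, and the storage layout are assumptions for illustration only.

```python
# Sketch of per-user moderator notes that expire automatically.
from datetime import datetime, timedelta

class ModNotes:
    """Moderator notes about users, dropped after a time-to-live."""

    def __init__(self, ttl_days=30):
        self.ttl = timedelta(days=ttl_days)
        self._notes = {}  # username -> list of (timestamp, text)

    def add(self, username, text, now=None):
        now = now or datetime.utcnow()
        self._notes.setdefault(username, []).append((now, text))

    def get(self, username, now=None):
        """Return only notes younger than the TTL; expired ones are dropped."""
        now = now or datetime.utcnow()
        fresh = [(t, txt) for t, txt in self._notes.get(username, [])
                 if now - t <= self.ttl]
        self._notes[username] = fresh
        return [txt for _, txt in fresh]

notes = ModNotes()
start = datetime(2018, 6, 1)
notes.add("some_user", "borderline comment in ~news", now=start)
# 40 days later the note has expired:
notes.get("some_user", now=start + timedelta(days=40))  # -> []
```

      Expiring on read, as here, is the simplest approach; a real system would probably also prune on a schedule so stale notes never linger in storage.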

      Are these priorities fair? Are there design elements you would want to see in the immediate term that would help reduce the burden of moderating? Are there problems with these tools I'm suggesting that you would want to see addressed?

      19 votes
    9. Moderators of Reddit, tell us about your experiences in fostering quality discussion and content (or failures to do so)

      Since the moderator community here is quite large, I figure we would have quite a lot of interesting perspectives here on Tildes. Feel free to chip in even if you're not a moderator, or god forbid, moderate such subs as T_D. Having a range of perspectives is, as always, the most valuable aspect of any discussion.

      Here are some baseline questions to get you started:

      • Did your subreddit take strict measures to maintain quality, à la r/AskHistorians, or was it a karmic free-for-all like r/aww?

      • Do you think the model was an appropriate fit for your sub? Was it successful?

      • What were the challenges faced in trying to maintain a certain quality standard (or not maintaining one at all)?

      • Will any of the lessons learnt on Reddit be applicable here in Tildes?

      29 votes
    10. On Reddit moderation - it's a matter of scale.

      I apologize in advance for what's probably going to be a very rambly post. This has been stewing on my mind for a while now and I just need to get it out.

      I've been on reddit a long time, 11 years as of today in fact. In that time, I've watched the site grow from a small community of mostly tech nerds to one of the biggest sites on the web. I've also moderated many communities, from small niche subs (/r/thecure, /r/makeupaddictioncanada) to some of the biggest subs on the site (/r/worldnews, /r/gaming). I've modded communities that have exploded in popularity, growing from 25k to 100k to 500k and beyond, and seen how those communities change.

      When you're in a subreddit of say, 10k users, there's more community engagement. You know the users, the users know the mods, and you know when people are engaging in good faith. The mods themselves are basically just another user with a bit more control. People coming in just to cause shit are generally downvoted to death and reported quickly, and taken care of - it's a community effort to keep things civil. Modding a community like that is piss easy, you can generally check every thread yourself and see any nastiness easily before it becomes a problem, and the users themselves are more invested in keeping things on topic and friendly. Disagreements are generally resolved amicably, and even when things get heated it's easy enough to bring things back to center.

      Then the community starts to grow, and gather more users. Ok, you adjust, maybe add another mod or two, the users are still engaged and reporting threads regularly. Things stay more or less the same. The growth continues.

      At 50k, 100k, 250k, etc you notice differences in the community. People argue more, and because the usernames they're arguing with aren't known to them, they become more vitriolic. Old regulars begin drifting away as they feel sidelined or just lose interest.

      At 1M a major shift happens and the sub feels more like a free for all than a community. As a mod, you can't interact as much because there's more traffic. You stop being able to engage as much in the threads because you have to always be "on" and are now a representative of the mod team instead of a member of the community. Even if you've been there since day one, you're now a mod, and seen by some as "the enemy". Mods stifle free speech after all, removing posts and comments that don't fit the sub rules, banning users who are abusive or spammers. Those banned users start running to communities like SRC, decrying the abuse/bias/unfair treatment they've gotten at the hands of X sub mod team. Abusive modmails and PMs are fairly regular occurrences, and accusations of bias fly. The feeling of "us vs them" is amplified.

      Once you get above 10M users, all bets are off. Threads hit /r/all regularly and attract participants from all over reddit. These threads can attract thousands of comments, coming at the rate of several hundred every minute. Individual monitoring of threads becomes impossible. Automod can handle some of it, but we all know automod can be slow, goes down sometimes, and can't handle all the nuances of actual conversation. You've outgrown any moderation tools reddit provides, and need to seek outside help. Customized bots become necessary - most large subreddits rely on outside tools like SentinelBot for spam detection, or snoonotes for tracking problem users. Harassment is a real problem - death threats, stalking, and doxxing are legitimate issues and hard to deal with. I won't even touch on the issues like CP, suicidal users, and all the other shit that comes along with modding communities this large.

      I wish I had some solutions, but I really don't know what they are. We all know the tools we have as moderators on reddit are insufficient, but what people often overlook is why - the community is just too large for unpaid volunteers to moderate with the limited tools we have.

      39 votes
    11. How will moderation on existing groups work?

      So far, I haven’t seen too much moderation aside from bans, etc. dealt out by the admins (unless a moderation system is currently in place that I’m not aware of; please correct me if I’m wrong), but how will this work once Tildes is fully released to the public? Will people who show interest in a certain community be reached out to and asked?

      13 votes