  • Showing only topics with the tag "moderation".
    1. Long story, but I've ended up becoming the admin of a group on Facebook (the previous admin stepped down in a rush, and added me as he left). And the group has an existing group chat associated...

      Long story, but I've ended up becoming the admin of a group on Facebook (the previous admin stepped down in a rush, and added me as he left). And the group has an existing group chat associated with it.

      Is it possible to "moderate" this group chat? Specifically, as an admin of the group, can I remove unsavoury/unwanted messages from the chat associated with the group? It looks like I can't.

      Can even the creator of a group chat do this? If I close the group chat and create a new one, will I (as its creator) be able to remove unsavoury/unwanted messages from that new chat?

      I've done some searching via Google, and I'm not finding anything to indicate that this is possible. If someone posts something unsavoury in a group chat, it looks like the only option is to remove the person from the chat - but the unsavoury messages can't be deleted.

      Please tell me that's wrong!

      6 votes
    2. I just came across this field of 13 admin-removed comments and frankly it left me feeling rather unsettled. That's a lot of content to just nuke all at once. Contextually, the thread up to that...

      I just came across this field of 13 admin-removed comments and frankly it left me feeling rather unsettled. That's a lot of content to just nuke all at once. Contextually, the thread up to that point was genial and non-controversial, so it seems especially odd that there's just this black hole there. What struck me most was how opaque the moderation was. There is no indication of what kind of content was removed, why it was removed, or specifically who did the removal or when it happened.

      Then I scrolled down and at the very bottom I found what I guess is meant to address these concerns, a comment from Deimos:

      Sigh, I saw this thread was active and thought it was going to have an actual on-topic discussion in it. Let's (mostly) start this over.

      It's not always clear online so I want to say that I'm not rage-posting or bellyaching about censorship or any of the usual drama that tends to crop up on sites like Tildes from time to time. I trust Deimos' moderation and give this the benefit of the doubt. What I'm actually doing, I guess, is making a feature request about better annotation for removed comments.

      Would it make sense to show a note (like Deimos' comment) in-thread at the position of the deleted content? Instead of down at the bottom of the page or unattached to anything relevant? In my opinion some kind of "reason" message should always be provided with any moderation activity as a matter of course. Even if it's just boilerplate text chosen from a dropdown menu.

      Also, would a single bulk-annotation for all of the related removals make for better UX than 13 separate ones? I think that would be both easier to read, and easier for Deimos to generate on the backend.

      I feel like we may have had this conversation previously, but I couldn't find it. Apologies if I'm beating a dead horse.

      14 votes
    3. Hello. I moderate a reddit sub with about 450 thousand people and we have had trouble with transgender people facing abuse from idiots in two different threads. In one of them, a woman chimed in...

      Hello.
      I moderate a reddit sub with about 450 thousand people and we have had trouble with transgender people facing abuse from idiots in two different threads. In one of them, a woman chimed in and it got ugly (4 bans in the first 12 comments), in the other a trans woman took part and got shit for it (also featured a few users banned).

      Now, each of them had a very different approach. The first got defensive and stopped participating, while the second took the time to respond to the stupid but not offensive ones, trying to educate them.

      So even if this is something that bothers me a lot and makes me considerably angry, I realised that maybe I should take a more nuanced view on this, and I should actually ask for more opinions on how to handle this, instead of simply applying my own standards and maybe making things worse and/or missing a chance to make things better. And since Tildes has always provided me with intelligent, thoughtful and interesting points of view and opinions, I thought this would be the best place for this question.

      And so here I am, asking anyone that would care to give an opinion: what would a good moderator do? How harsh or lenient should we be with ignorant but not offensive comments? Should we get involved at all if the discussion is not offensive? What would make our sub a nicer place to everyone? Any other thoughts?

      Thank you very much to all.

      20 votes
    4. I often stumble into threads with entire comment chains deleted. I assume most people here have faced the same situation as well, either here or on reddit. I'd like to see a move to locking...

      I often stumble into threads with entire comment chains deleted. I assume most people here have faced the same situation as well, either here or on reddit.

      I'd like to see a move to locking comments rather than deleting them by default. That would mean no further replies to the comment or any other comment in that chain, no one being able to delete or edit their comments, no one being able to add or remove votes to a comment, etc.
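      A rough sketch of what those lock semantics might look like, purely as an illustration (the state names and helper functions are my own invention, not anything Tildes actually implements):

```python
from enum import Enum, auto

class CommentState(Enum):
    NORMAL = auto()
    LOCKED = auto()   # frozen in place: still visible, but immutable
    REMOVED = auto()  # reserved for truly egregious content

# A locked comment stays readable, but nothing about it can change:
def can_reply(state: CommentState) -> bool:
    return state is CommentState.NORMAL

def can_edit_or_delete(state: CommentState) -> bool:
    return state is CommentState.NORMAL

def can_change_votes(state: CommentState) -> bool:
    return state is CommentState.NORMAL

def is_visible(state: CommentState) -> bool:
    return state is not CommentState.REMOVED
```

      Locking an entire chain would just mean applying LOCKED to every comment in it, which keeps the record public while stopping the back-and-forth.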

      I understand for particularly egregious comments removal is completely necessary (especially when it goes hand-in-hand with banning users), but a lot of times comments are deleted as a means to prevent long argumentative back-and-forth chains that spam and derail topics, as well as antagonize users.

      In a lot of cases I feel like deleting the comment only further serves to hide what is unacceptable behaviour (even if that behaviour should be obvious), rather than setting an example for the userbase.

      30 votes
    5. Proposal: Create a new page where all users can view all moderation actions. This would make transparency a core part of the platform, hopefully avoiding any misunderstandings about mod actions. A...

      Proposal:
      Create a new page where all users can view all moderation actions. This would make transparency a core part of the platform, hopefully avoiding any misunderstandings about mod actions.

      A new page, maybe called tildes.net/moderation, is available to all registered users. I am not sure where the link to it should appear on the site; maybe on the user's profile sidebar?

      This page contains a table of all possible moderation actions. The actions may include: deleted topics, deleted comments, tag modification, moved topics, edited topic titles, banned user, locked topics. (This raises the question of what the full set of possible mod actions is; they would need to be codified.)

      Very roughly, the table columns might include: Date, User(being mod'ed), Mod Action(a list of possible mod actions), Mod Action Reason (either a text field, or a list of possible reasons for this action), Link (null if action is a deleted topic.)
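      Purely as an illustration of that table (the field and action names below are my own guesses, not an actual Tildes schema), each row might be modelled as:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional

class ModAction(Enum):
    # The codified set of possible actions, per the list above
    DELETED_TOPIC = "deleted topic"
    DELETED_COMMENT = "deleted comment"
    TAG_MODIFICATION = "tag modification"
    MOVED_TOPIC = "moved topic"
    EDITED_TITLE = "edited topic title"
    BANNED_USER = "banned user"
    LOCKED_TOPIC = "locked topic"

@dataclass
class ModLogEntry:
    date: datetime
    target_user: str            # the user being mod'ed
    action: ModAction
    reason: str                 # free text, or one of a list of canned reasons
    link: Optional[str] = None  # null when the action is a deleted topic
```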

      I think that the user who did the moderating should not be publicly listed for now, to avoid drama?


      Some of the related Topics: (please make a top-level comment with any others)

      Could we have a stickied list of all bans with reasons included?

      Daily Tildes discussion - our first ban


      Please vote for the comment which best reflects your position on this proposal.
      As a bonus question, please make a top-level comment if you have general comment about my format of voting on comments. Would you prefer a straw poll on a 3rd party platform? Is there a cleaner way to do this?

      Edit: added "banned user" to actions list, I probably missed others, let me know. Also added the obvious locked topics.

      23 votes
    6. There's been some discussion around tags since users were given tag-editing privileges, such as Tag Use and Article Tags I've noticed a few things about tags and rather than make a topic for each...

      There's been some discussion around tags since users were given tag-editing privileges, such as Tag Use and Article Tags.

      I've noticed a few things about tags and rather than make a topic for each one I thought I'd make a few top level comments instead, hopefully with others doing the same for anything tag related they'd like to see discussed.

      20 votes
    7. Tildes code of conduct says Do not post anyone's sensitive personal information (related to either their real world or online identity) with malicious intent. Can you change that to just say don't...

      Tildes code of conduct says

      Do not post anyone's sensitive personal information (related to either their real world or online identity) with malicious intent.

      Can you change that to just say don't post personal info? Even if it's not done with malicious intent it should still be removed to protect people's privacy.

      Also, while the Tildes terms of service does say not to post spam, I think the code of conduct should say that as well.

      Edit: I mean posting personal info without consent and not public information.

      Telling someone how to contact a company would be fine but not posting someone's address.

      12 votes
    8. In the interest of transparency (and a little bit in statistics) it would be really cool to have a master banlist or at least a thread with links to all ban-worthy posts. This would help new users...

      In the interest of transparency (and a little bit in statistics) it would be really cool to have a master banlist or at least a thread with links to all ban-worthy posts. This would help new users understand what isn't acceptable in the community and allow for community discussion on what could be considered an unjustified ban or a weird influx of bad behavior. This wouldn't be super viable when the site goes public, but would be a neat implementation in Tildes' alpha state.

      14 votes
    9. I don't want to get too high in the clouds with moderating philosophy. Instead I want to talk about action steps that can be taken in the very near term to improve moderating. Especially so long...

      I don't want to get too high in the clouds with moderating philosophy. Instead I want to talk about action steps that can be taken in the very near term to improve moderating. Especially so long as Deimos is the only one with most of the moderating tools at his disposal, I think it's crucial to make sure moderating is as painless as possible.

      So far it looks like Deimos has these moderating tools available to him:

      1. User bans
      2. Comment removal
      3. Thread locking/removal
      4. Title/tag editing (and this ability is shared by many of us as well)

      Am I missing anything?

      The three next tools I would hope are coming next are:

      • A reporting mechanism, where users can report comments and threads that they think should be removed.
      • A feedback mechanism for reports, telling users that a report they gave was acted on.
      • A note taking system for the moderator-type person, shareable with all other moderator-type persons at that level, with an expiration date probably around 30 days.

      Now I'll talk about why. First, the reporting mechanism. While it's still possible to keep up with everything that gets posted, I don't necessarily think it's the best use of Deimos' time to read literally everything, especially as the site expands its userbase and presumably activity level and depth. The reporting system at first should probably just be a button, maybe eventually with a pop-up field allowing the user to briefly describe why they're reporting, and a queue that gets populated with comments and threads that get reported.
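      A minimal sketch of that button-plus-queue idea, assuming nothing about how Tildes is actually built (every name here is hypothetical):

```python
from collections import deque

class ReportQueue:
    """Reports land in a queue that the moderator drains in order."""

    def __init__(self):
        self._queue = deque()

    def report(self, item_id, reporter, note=""):
        # `note` is the optional brief "why I'm reporting this" text
        self._queue.append((item_id, reporter, note))

    def next_report(self):
        # Returns the oldest unhandled report, or None when caught up
        return self._queue.popleft() if self._queue else None
```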

      Coinciding with a report queue/option should probably be an easy, rudimentary system for providing feedback to those whose reports led to moderating action. At first, an automated message saying something like "thank you for reporting recently. Action has been taken on one of your recent reports" without any relevant links would do fine, and we can leave the particulars of how much detail to add for later discussions.

      The last thing I think should help considerably in the immediate term is a time-limited user tracking tool for the moderator-type person. As things scale, it isn't always going to be feasible to use mental bandwidth remembering each username and the relevant history associated with their behavior. A good note-taking tool with an auto-timed expiration date on notes would be a good way to address what can easily become a hugely mentally taxing role at almost any scale.

      This tool should let Deimos take a discrete note for himself (and other moderators at that permission level and higher) connected to a user regarding any questionable threads or comments that were yellow/red flags, or any other moderator action taken against a user within the last X days/months (the particulars don't matter to me as much as that there is an expiration date on these notes). This should let the moderator-type person focus on the broader history of the users they're looking at before making a decision, without having to go searching for every relevant comment from the past 30 days. Fewer problematic users at scale should fall through the cracks, and more users who might just be having a bad day can be let off with comment removals and/or warnings.
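      The expiring-notes idea might look something like this (a hypothetical sketch; the 30-day window is just the example figure from above, and all of the names are assumptions):

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional

NOTE_TTL = timedelta(days=30)  # assumed expiry window

@dataclass
class ModNote:
    username: str
    text: str
    created: datetime = field(default_factory=datetime.utcnow)

class NoteBook:
    """Per-user moderator notes that silently age out after NOTE_TTL."""

    def __init__(self):
        self._notes = []

    def add(self, username: str, text: str) -> None:
        self._notes.append(ModNote(username, text))

    def for_user(self, username: str, now: Optional[datetime] = None) -> list:
        now = now or datetime.utcnow()
        # Expired notes are dropped before anything is returned
        self._notes = [n for n in self._notes if now - n.created < NOTE_TTL]
        return [n for n in self._notes if n.username == username]
```

      The moderator only ever sees the recent history, and old flags vanish on their own rather than following a user around forever.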

      Are these priorities fair? Are there design elements you would want to see in the immediate term that would help reduce the burden of moderating? Are there problems with these tools I'm suggesting that you would want to see addressed?

      19 votes
    10. Since the moderator community here is quite large, I figure we would have quite a lot of interesting perspectives over here in Tildes. Feel free to chip in even if you're not a moderator, or god...

      Since the moderator community here is quite large, I figure we would have quite a lot of interesting perspectives over here in Tildes. Feel free to chip in even if you're not a moderator or, god forbid, moderate such subs as T_D. Having a range of perspectives is, as always, the most valuable aspect of any discussion.

      Here are some baseline questions to get you started:

      • Did your subreddit take strict measures to maintain quality ala r/AskHistorians, or was it a karmic free-for-all like r/aww?

      • Do you think the model was an appropriate fit for your sub? Was it successful?

      • What were the challenges faced in trying to maintain a certain quality standard (or not maintaining one at all)?

      • Will any of the lessons learnt on Reddit be applicable here in Tildes?

      30 votes
    11. I apologize in advance for what's probably going to be a very rambly post. This has been stewing on my mind for a while now and I just need to get it out. I've been on reddit a long time, 11 years...

      I apologize in advance for what's probably going to be a very rambly post. This has been stewing on my mind for a while now and I just need to get it out.

      I've been on reddit a long time, 11 years as of today in fact. In that time, I've watched the site grow from a small community of mostly tech nerds to one of the biggest sites on the web. I've also moderated many communities, from small niche subs (/r/thecure, /r/makeupaddictioncanada) to some of the biggest subs on the site (/r/worldnews, /r/gaming). I've modded communities that have exploded in popularity, growing from 25k to 100k to 500k and beyond, and seen how those communities change.

      When you're in a subreddit of say, 10k users, there's more community engagement. You know the users, the users know the mods, and you know when people are engaging in good faith. The mods themselves are basically just another user with a bit more control. People coming in just to cause shit are generally downvoted to death and reported quickly, and taken care of - it's a community effort to keep things civil. Modding a community like that is piss easy, you can generally check every thread yourself and see any nastiness easily before it becomes a problem, and the users themselves are more invested in keeping things on topic and friendly. Disagreements are generally resolved amicably, and even when things get heated it's easy enough to bring things back to center.

      Then the community starts to grow, and gather more users. Ok, you adjust, maybe add another mod or two, the users are still engaged and reporting threads regularly. Things stay more or less the same. The growth continues.

      At 50k, 100k, 250k, etc you notice differences in the community. People argue more, and because the usernames they're arguing with aren't known to them, they become more vitriolic. Old regulars begin drifting away as they feel sidelined or just lose interest.

      At 1M a major shift happens and the sub feels more like a free for all than a community. As a mod, you can't interact as much because there's more traffic. You stop being able to engage as much in the threads because you have to always be "on" and are now a representative of the mod team instead of a member of the community. Even if you've been there since day one, you're now a mod, and seen by some as "the enemy". Mods stifle free speech after all, removing posts and comments that don't fit the sub rules, banning users who are abusive or spammers. Those banned users start running to communities like SRC, decrying the abuse/bias/unfair treatment they've gotten at the hands of X sub mod team. Abusive modmails and PMs are fairly regular occurrences, and accusations of bias fly. The feeling of "us vs them" is amplified.

      Once you get above 10M users, all bets are off. Threads hit /r/all regularly and attract participants from all over reddit. These threads can attract thousands of comments, coming at the rate of several hundred every minute. Individual monitoring of threads becomes impossible. Automod can handle some of it, but we all know automod can be slow, goes down sometimes, and can't handle all the nuances of actual conversation. You've outgrown any moderation tools reddit provides, and need to seek outside help. Customized bots become necessary - most large subreddits rely on outside tools like SentinelBot for spam detection, or snoonotes for tracking problem users. Harassment is a real problem - death threats, stalking, and doxxing are legitimate issues and hard to deal with. I won't even touch on the issues like CP, suicidal users, and all the other shit that comes along with modding communities this large.

      I wish I had some solutions, but I really don't know what they are. We all know the tools we have as moderators on reddit are insufficient, but what people often overlook is why - the community is just too large for unpaid volunteers to moderate with the limited tools we have.

      39 votes
    12. So far, I haven’t seen too much moderation aside from bans, etc. dealt out by the admins (unless I’m wrong here and a moderation system is currently in place, please correct me if I’m wrong), but...

      So far, I haven’t seen too much moderation aside from bans, etc. dealt out by the admins (unless I’m wrong here and a moderation system is currently in place, please correct me if I’m wrong), but how will this work once Tildes is fully released to the public? Will people who show interest in a certain community be reached out to and asked?

      13 votes
    13. In governance, sortition (also known as allotment or demarchy) is the selection of political officials as a random sample from a larger pool of candidates. The logic behind the sortition process...

      In governance, sortition (also known as allotment or demarchy) is the selection of political officials as a random sample from a larger pool of candidates. The logic behind the sortition process originates from the idea that “power corrupts.” For that reason, when the time came to choose individuals to be assigned to empowering positions, the ancient Athenians resorted to choosing by lot. In ancient Athenian democracy, sortition was therefore the traditional and primary method for appointing political officials, and its use was regarded as a principal characteristic of true democracy.

      Today, sortition is commonly used to select prospective jurors in common law-based legal systems and is sometimes used in forming citizen groups with political advisory power (citizens' juries or citizens' assemblies).

      The mechanics would be something like this: users report a post/comment; when there are enough reports, the system randomly selects 3/5/7/... currently active users and asks them to determine whether the reported post contravenes the rules. The decision is then taken automatically by majority rule.

      Why ?

      1. It's the only system that scales (to my knowledge). More users mean more content to moderate, but since the users are also moderators, the system works at any scale. Systems that don't scale lead to all kinds of problems when the number of users becomes large.
      2. It's very robust to manipulation. As moderators are chosen randomly it's very hard to coordinate or try to influence the decisions.
      3. It promotes a participatory attitude and a sense of responsibility in the users. There's no "them against us" (the bad mods against the users).
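      The mechanics above can be sketched in a few lines (a toy model; the report threshold and jury size are arbitrary assumptions):

```python
import random

REPORT_THRESHOLD = 5  # assumed number of reports that convenes a jury
JURY_SIZE = 5         # odd, so a majority always exists

def convene_jury(active_users, reporters):
    # Jurors are drawn at random from currently active users, excluding
    # the reporters themselves so they can't stack the vote.
    pool = [u for u in active_users if u not in reporters]
    return random.sample(pool, JURY_SIZE)

def decide(votes):
    # votes: list of booleans, True meaning "this contravenes the rules"
    return sum(votes) > len(votes) / 2
```

      Random selection is what gives point 2 its force: since the jury is unknown in advance, there is no fixed set of moderators to lobby or brigade.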
      18 votes