Daily Tildes discussion - starting some moderation
Alright, this is very late today, but I had some other things to get through first.
If you missed it, I locked this topic earlier today (which involved quickly hacking together a lock method because I didn't have one). There was nothing wrong with the subject itself, and some reasonable discussion did happen in it, but overall it was disappointing to see it start devolving into the same old tired arguments, and it was unlikely to go anywhere productive if it had continued. I don't want to focus on that specific post though, so let's (please) try not to turn this thread entirely into a debate about it.
The thing that I'd rather discuss is that I think this marks the first time I've done any sort of "strong" moderation-like action that wasn't also associated with banning a user (and there have still only been a few of those total). This shouldn't be a shocking or surprising event—introducing some moderation was inevitable if we want to have any hope of maintaining quality, and I'm honestly impressed that we managed to make it a month before it was necessary. From this point, I'm probably going to start doing it a little more (especially as we continue growing), and at least for the near future the actions should mostly be restricted to:
- re-tagging topics (and I'll give other people the ability to do this as well)
- editing topic titles
- moving topics between groups
- (hopefully rarely) locking topics, or removing topics/comments
The main thing I'm working on finishing up now is a sort of "topic log" that will show which actions were taken on a topic, and who took them. So for example, once this is deployed, you'll be able to see things like "Deimos added tags x, y, z" or "Deimos changed title to ...".
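To make the shape of that concrete, here's a minimal sketch of what a per-topic action log could look like. This is purely illustrative: the class and field names are hypothetical and not Tildes' actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a per-topic action log entry; the names here
# are illustrative assumptions, not Tildes' real data model.
@dataclass
class TopicLogEntry:
    actor: str        # who took the action, e.g. "Deimos"
    action: str       # e.g. "tag", "edit_title", "move", "lock", "remove"
    details: str      # human-readable description of what changed
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def __str__(self) -> str:
        # Rendered the way the post describes: "Deimos added tags x, y, z"
        return f"{self.actor} {self.details}"

log = [
    TopicLogEntry("Deimos", "tag", "added tags x, y, z"),
    TopicLogEntry("Deimos", "edit_title", "changed title to ..."),
]
for entry in log:
    print(entry)
```

Storing the structured `action` alongside the free-text `details` would let the site both render readable lines and filter the log by action type later.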
So what I'd like to talk about in this thread is just general thoughts on moderation—would you like to see a bit stricter moderation to try to set the bar a bit higher to start? How aggressively should I move topics if I think they don't fit? Do you think we need some sort of global log to list topics that are removed? Opinions on those sorts of questions are welcome, so I can take it all into account as I figure out how I want to approach it.
I'm fine with taking a fairly hard line on stamping out bullshit. I'm also fine with topics being moved to somewhere more suitable when appropriate, that's really not a big deal at all.
I haven't thought much about public access to mod actions. I think it might be nice if it were available somewhere for those who are interested in the hows and whys of what gets done.
On that other thread, all the "reply" buttons are gone from the comments.
That's how I noticed it. I wanted to make a comment, but couldn't. Did a quick search for "deimos" and found a comment where he said he was locking it. If there's a clear indicator that "<user> locked this topic <relative timestamp> ago", that would be great. Especially once we get more moderators, and more moderator power, transparency will go a long way.
Also, reply to myself, I think something broke up there. Leaving it for @Deimos to take a look later on.
Yeah, I think the markdown parser or HTML sanitizer I'm using is... picky about closing tags for some reason, so if you use something that looks like an HTML tag and don't close it, it'll do it for you. It's weird, I need to look into it: https://gitlab.com/tildes/tildes/issues/7
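For illustration, text that merely *looks* like an HTML tag really is tokenized as one by standard HTML parsers, which is why a balancing sanitizer ends up inventing a closing tag. A small sketch using Python's stdlib parser (not the actual parser Tildes uses, just a demonstration of the tokenizing behaviour):

```python
from html.parser import HTMLParser

class TagBalanceChecker(HTMLParser):
    """Track which tags were opened but never closed."""
    def __init__(self):
        super().__init__()
        self.open_tags = []

    def handle_starttag(self, tag, attrs):
        self.open_tags.append(tag)

    def handle_endtag(self, tag):
        if tag in self.open_tags:
            self.open_tags.remove(tag)

checker = TagBalanceChecker()
# A comment that mentions "<user>" without ever closing it:
checker.feed("I wrote <user> in my comment and never closed it")
print(checker.open_tags)  # ['user'] — parsed as a dangling open tag
```

A sanitizer that insists on well-formed output then has two choices for that dangling `<user>`: escape it as literal text or auto-close it, and the latter is what produces the surprising `</user>` in the rendered comment.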
That's a great idea, something like an ongoing diff or git log of changes to a page over time would be brilliant. It should include moderation actions and notes, which would mean it would always be transparently clear what has occurred and why specific action was taken.
This has been gone over dozens of times so far. Having a public log of moderation is a bad idea and nobody with moderation experience supports it at all.
Even more importantly - It's not possible to add a git log of changes to a page over time. Users editing their own comments should be a permanent and irreversible change that is not visible to other users.
Things that get removed by moderators should not be visible to users. That goes against the entire point of removing things. If you remove 40 comments of hatespeech for the benefit of people that don't want to see a large amount of it and leave the note "removed for hatespeech" in its place then it arguably has exactly the same effect as if the hatespeech were there anyway, it shows a tonne of hate existing. Making it go away entirely is far far better for the benefit of users.
Contrary to popular belief mod abuse is actually rare, and where it does occur it is usually negligible in effect anyway.
In a top-down system where Tildes takes responsibility for the actions of moderators, a moderator will no longer be able to get away with the kinds of actions or behaviours you see from mods on reddit, because their actions will be monitored by Tildes itself. I can tell you with absolute certainty that good mods will not allow bad mods to get away with their behaviour. If good mods on reddit could report bad moderation to reddit itself and have those mods removed for it, they absolutely would.
I mod a couple of medium-sized subreddits myself. I am at least one instance of someone with an interest in discussing the idea, in theory.
Why would a diff or change log not be possible for a page?
That's a subjective opinion which may be correct for some people and not others.
I've personally seen many instances of deleted comments that were in no way related to hate speech. Sometimes it's really not obvious why a comment was removed, outside of a biased position on the part of the removing mod.
So it doesn't exist, but if it does it's rare, and even if it's not that rare its effect is negligible, and even if its effect isn't negligible it's not possible on Tildes? Pardon me if I have misunderstood, but that does seem to be quite circular logic, and it sounds like it's being argued to prove a position rather than to think through and discuss the best steps forward.
Honestly, I was more excited about the technology that could be created to form a living document tracking change over time. I think it's an interesting topic.
Because:
These should all be very obvious to a person with moderator experience. These happen in every subreddit with over 20k subs. Users make mistakes and post things they should not. These aren't always noticed by the users and they need removing for their own protection. These are sometimes noticed by the users and they edit them themselves, etc etc. Either way doesn't matter - both should be permanent changes that aren't visible to others.
The negatives far outweigh the gains.
Why doesn't any of the above prevent Wikipedia from keeping page change history?
I really do appreciate your points. But in the spirit that tildes wants to be something better than reddit, rather than a straight clone, I don't see any harm in discussing ideas that might not be a good fit for reddit specifically
Wikipedia's page change history does not keep moderator actions.
This should be fairly obvious because otherwise you'd have the personal addresses of thousands of people littered throughout the page histories of various pages for notable individuals, celebrities, etc. But it also doesn't keep vandalism actions, hatespeech, and other things.
There's a difference between something being removed (but kept in history) because it fails to meet Wiki's content guidelines on how an encyclopedic article should be written vs something being removed because it breaks Wiki's rules.
But now we've come full circle to the claim that "It's not possible to add a git log of changes to a page over time."
You've stated that "it isn't possible to be" and you have used an argument that "it shouldn't be"
Transparency is a powerful tool. One of the problems we face as a community is deciding what constitutes vandalism, hate speech and unwanted content and "other things". The community should be a part of that decision-making process and not a limited number of overseers.
Making that process invisible causes difficulty and resentment. Any process of moderation should be transparent to the community. While I accept that might be difficult to achieve, it's not the poisonous goal you describe.
Your point about rules vs guidelines is well made. Perhaps that is something that should be looked at first: making a clear distinction between the two. How might that distinction be defined? How could the community have input on the definition?
This isn't Reddit. I would suggest to you not to assume that policies necessary on Reddit will be necessary here. If we are forced to take all the same precautions and make all the same tradeoffs, Tildes will not be a new project--it will be a Reddit clone, with all of the same problems.
We don't have any groups with 20k subs. We won't for a very long while. If these issues start happening and Tildes is unable to find more egalitarian and transparent ways of handling them, THEN we should apply the Reddit strategy. Prove to me that other solutions don't work and then I'll be happy to support yours. Until then, I think it is unwise to pigeonhole a project into some of the same design choices its predecessor was criticized for.
We have groups with 4000 subs. The site is growing and will grow rapidly. We're only a few months away from it, and failure to plan ahead for a behaviour we KNOW is going to occur would be pretty naive.
Learn based on the history of others. Don't repeat mistakes in the naive but hopeful belief that it somehow might not happen again. It will happen again.
Prove that they DO before making errors. It has already been tried all over reddit. It caused more issues than it solved anywhere that it was tried.
What issue is it trying to solve? It doesn't really solve any issue. It doesn't improve much at all yet it brings forth countless problems.
That's the exact same thing I'm saying to you. Reddit is the wrong way. Don't assume we have to do it the Reddit way. I don't WANT to be like Reddit. The whole point is being DIFFERENT than Reddit.
That should honestly depend on the group, I'd say. In reddit terms /r/askhistorians deletes boatloads of comments, and probably very few are for hate speech. This is fine, but to prove that comments aren't deleted out of any sense of bigotry, it might be good to show what the comments were--particularly as a way of cataloging what lay users believed when the post was made.
Then you run into the issue of savvy bigots abusing the system with comments that appear to the casual user, who doesn't know any better, to be reasonable participation but are in fact veiled bigotry or bait.
And when they're visible to other users, they then take it and use it as evidence to casual users of the bad modding taking place. It gets used as a recruiting means.
Christ, can you give me ANY evidence of ANY of your claims? You're a zealot, but I can at least respect a zealot who has some studies or primary sources to back them up.
Anyone with active mod experience of the larger subs on reddit will give you the same story since discord became the organisational norm. We've all seen and experienced it first hand.
Just go find a brigading Discord group and lurk it a while. They talk tactics fairly frequently because they believe they're in a safe space.
You can hand wave it away because you don't like the sound of it. But it is what it is.
As for name calling and personal attacks - I don't think that's very appropriate here and I urge you to reconsider your phrasing in future as behaving and talking to people like that is not good participation.
Zealot is an appropriate word for someone who continues to not provide evidence for their extraordinary claims. It's not an insult--it's my honest appraisal of your behavior and attitude. I also gave you an easy way to earn my respect--I literally outlined it for you. Now I apologize if you took it as an insult, but I myself felt insulted by what seemed like a deliberate strategy of obfuscation and redirection--you never provided anything to show that what you say is true.
"Just go lurk a brigading Discord group" is the citational equivalent of "I read it in the paper the other day," or more accurately, "Just go open a textbook." Do you have a link? Or the name of a specific group? Are you afraid they're going to sue you for libel if you call them out in a closed, invite-only forum? You can PM me if you'd rather do that. All you need to do to get me to respect your opinion in any fashion is to provide a link or screenshots or something. I don't know what the hell "a brigading group" is or where they advertise themselves. I don't do that kind of thing. That should be obvious, or I wouldn't be interested in tildes.
Am I open to believing you if you provide evidence? Of course. But I'm sorry, if you are making a claim of conspiracy, that takes evidence. And if you expect tildes to be better than Reddit or Voat or 4chan or whatever social media sites frustrated you enough to try an alpha with hardly any content, then I'm asking you to jump on board with me and raise the level of discussion with a little proof behind our claims.
I don't know those people or have any means of contacting them to verify this. Imagine if you said, "Ask any professor of psychology about psychological topic X. They'll tell you Y." Now imagine you said that to someone who is not and does not plan to go to college but is genuinely curious. Is that a proper source?
Here's the real rub--you are not just making a claim in order to change someone's mind and then hope they'll make decisions down the road based a little more on your way of thinking. You are targeting specific policies and design decisions in a young project that many people are invested in (and at least one person is financially invested in), and you are making claims in order to affect those decisions. There is nothing wrong with that--that's part of how these things work. But consider the magnitude of what you're asking. Consider the hard work and effort that Deimos put into this project. Yeah, it'll be open source and someone might fork it if it goes to shit. But will it be the same project? Will it be the project that you want? I would guess that out of all the possibilities, it is unlikely that another project just like this one or just like your dream version will be chosen.
So not only the forum for all the people who have joined, but also all the time and money Deimos has spent, and all of the good community you're seeing here and hoping to nurture, and all of the culture that this project has, could be wasted or negatively affected by a bad decision here.
I'm not saying you're pushing for a bad decision. I'm saying that your claim warrants a few links.
In future, you should stop writing at the point where you double down with a second personal attack, because the other person is going to stop reading right there.
Please rethink how you're going to participate on Tildes in future. This isn't the participation anyone wants to see here. Be better.
Regardless of this exchange clearly ending on a useless note I sincerely hope you have a nice day.
You as well. I think it would be better if we did not reply to each other in the future, including in other threads. I never say things like that on Reddit, but...this is a very small forum.
Are you afraid that this place is going to turn into Voat or something? I think your fears of bigotry and whatever else are blown significantly out of proportion. The culture of a site determines what is acceptable there--this is a far more liberal site in general than Reddit. I don't foresee alt-righters coming in here and having any success whatsoever.
They're already here. They already use the same tactics here that are in use in communities on reddit that have high-moderation goals and standards of behaviour.
The issue in general is that moderating behaviour is considered "liberal". As is visible in your own comment "far more liberal site in general than Reddit".
A place that people consider "liberal" naturally attracts those that are anti-liberal to come and cause trouble.
What kind of trouble though? Well that depends on what kind of methods can work and what kind of methods can not work. But the main example is demonstrating to us that "liberal" methods (moderating) can't work, by trying to use moderation against the site itself.
They are already here, doing just that.
How?
They join discussions and use savvy, well rehearsed, well understood ways of participating in discussion while appearing like an "innocent" user just asking questions. They deliberately lead conversation into highly-inflammatory areas and bait people that would be easily emotionally triggered by those areas into rising to the user emotionally. They then use this to report said user and have moderators forced into taking action against them.
See an LGBT topic? Ask "innocent" questions about autogynephilia and watch it cause someone who's gay or trans to explode after just a few responses. The baiter succeeds in injecting hateful fringe theories rejected by the medical community into the discussion, and gets a perfectly good person banned due to their emotional reaction to hate.
This has already occurred here and is ongoing. They are savvy. They have already learned these tactics from participating in /r/worldnews. It took a long time for moderators there to become aware of this tactic and how to handle it, the way it is handled is still poor even now.
I won't be naming names or pointing at example threads though as it's not necessary. Deimos has privately been made aware of individuals and has clearly adjusted tactics slightly after realising what is going on.
This will only grow and become harder and harder to handle as the site scales.
Honestly this sounds a bit like a conspiracy theory to me. I find it unlikely that there is a centralized, concerted effort to teach alt-righters to ask just the right questions to piss people off. I literally have no idea what "autogynephilia" is, and to be frank, if it has to do with sexual orientation or gender, I think there are other moral issues which are a little more important and wide-reaching. Many people, especially young people, don't always understand things that the LGBT community believes or refers to. It's not something they are taught in school. Perhaps they are asking a question because they really don't know, and when you explain, they still disagree with you. I find that drastically more likely than some kind of planned conspiracy.
The very nature of this beast is vague and terrifying. It is the PERFECT object of paranoia, fear, and overreaction. It is the liberal equivalent of immigrants. The threat cannot be seen--it's online, and it's anonymous. It cannot be described--it changes its tactics and adapts. Evidence cannot be provided--that would be risky, and it may cause memetic spread. The threat is clever--all threatening incidents are likely to be part of a grander plan. Do you see? This is EXACTLY the kind of object which fascists obsess over. This is EXACTLY the kind of bogeyman that the Nazis made of the Jews. There is no presentation of evidence, or due process, or anything. It's FUD. You have no evidence for any of your claims.
This is a total lack of personal responsibility. Look. I'm a staunch atheist, and the secular principles that I care about are often at odds with the church. I have sat through many church services in which my beliefs and principles have been mocked and scorned, right in front of me, and with foolish arguments to boot. I've never enjoyed it, but neither have I done anything to piss those people off or start arguments. That's in person, in the heat of the moment. I won't deny that people can say cruel things online, but if you do not choose to walk away, and if you choose to retaliate and escalate, that is your fault, not just theirs.
The only people that will visit it are people who have had moderator actions against them. So the only participants will be endless amounts of people that all have grumpy feelings towards the mods because they've had their things moderated.
This turns such places into a hellhole of people that all honestly rightfully deserved the moderation they received but all back one another up in their indignation at having been moderated.
Which of course, all of these users pejoratively call "censorship" instead of "moderation".
It doesn't really solve an issue that can't be addressed by going to an Admin(here on tildes). But it introduces many more issues.
I'm not a big fan of locking threads, but I definitely understand that it can become necessary if the quality of conversation is declining to a combative level. Would it be possible to introduce a "cool down" moderator action? If it becomes clear that a group of users are getting heated (high frequency commenting, more combative language), a mod could suspend them all from commenting for something like an hour or two. It might be a way to allow others to continue having a good conversation.
What if the cool down period pertained to the thread and not the users? If a heated argument has started, the option of locking the thread for a certain period of time might help. It might also just kill the thread. I'm not really sure how that'd play out. I don't think I participate in groups anywhere else online where this is done.
Permanently locking it kills the thread for sure. I think temporarily locking it is a possibility, though. Maybe the thread could be locked temporarily, but only for certain users? Then you only hit the "offending" users, and only on that particular thread, leaving the rest of the community untouched.
That sounds significantly more complex than temporarily locking the whole thread. Maybe I'm too incremental, but all I want is something that is a little better than Reddit in most areas. I don't want the moon, just something nice, you know?
That's a good idea. I might increase the period to a few hours to give people a bit more time to cool down.
Let's be clear about this, though: this is basically a short-term temporary ban, albeit under a different name.
I wonder if it would be possible to suspend users on a specific thread so they could continue to contribute in other places.
It would certainly be possible to do this technically. Each thread has a unique URL, so there could be two "cool down" options:
Cool down on all Tildes for X hours.
Cool down on current thread for X hours.
I feel like this is an advanced feature, which might be implemented after an initial Tildes-wide "cool down" feature (which would be easier to build).
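The two options above could be modeled with one mechanism by treating a site-wide cool-down as a topic-scoped one with no topic. A minimal sketch, assuming an in-memory store and illustrative function names (none of this is Tildes' actual code):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: per-user cool-downs, either site-wide or scoped
# to a single topic (identified by the short ID in its URL, e.g. "2c4").
# Key: (username, topic_id or None); value: expiry time.
cooldowns = {}

def apply_cooldown(username, hours, topic_id=None):
    """Apply a cool-down; topic_id=None means site-wide."""
    expiry = datetime.now(timezone.utc) + timedelta(hours=hours)
    cooldowns[(username, topic_id)] = expiry

def can_comment(username, topic_id):
    """A user may comment unless a site-wide or topic-scoped
    cool-down is still active for them."""
    now = datetime.now(timezone.utc)
    for scope in (None, topic_id):
        expiry = cooldowns.get((username, scope))
        if expiry is not None and expiry > now:
            return False
    return True

# Example: snooze one user on one thread for two hours.
apply_cooldown("alice", hours=2, topic_id="2c4")
print(can_comment("alice", "2c4"))   # False: cooled down on this thread
print(can_comment("alice", "xyz"))   # True: other threads unaffected
print(can_comment("bob", "2c4"))     # True: other users unaffected
```

Because expired entries are simply ignored, nothing needs to "unlock" a user; the restriction lapses on its own, which matches the temporary, low-ceremony spirit of a cool-down better than a ban that must be manually lifted.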
Do you mean specific users, or every user that participated in the devolving thread?
Probably at the discretion of the mod. A lot of times, these sorts of things are two users going at each other. Snoozing them for a while would likely do the job.
That's an interesting idea! You'd have to hit the sweetspot of them cooling off, though, and not just getting angry at the mod for stepping in.
I was going to suggest this same feature. The ability to ban or temporarily suspend users from specific topics. As you said, most of the time it is 2 or a handful of users getting into it. This could be a good solution.
The ability to lock comment threads would also accomplish this.
At the cost of preventing other people from engaging in discussion if it is never unlocked.
Sure, but they could always reply elsewhere if what they have to say is on a more productive level. The types of threads I would lock as a mod are those with two users going at it. Nothing productive, and a lot of work to stifle.
You'd have to watch out for people continuing the argument in different places.
I absolutely hate locking topics. I understand why it's done but it has the effect of stomping out reasonable discussion along with it, the feeling of being silenced is not a good one. I don't have a better solution.
Here's what I'm envisioning in the case of a discussion page that has flaming or bad behavior, and a human moderator with limited time is in charge of handling it:
If after it's unlocked there are individual users breaking the rules, those users get whatever is the next level of punishment for rule violations. If after it's unlocked there are enough rule-breaking actions that cleanup takes more than X minutes, temp-lock the thread again. I was tempted to say permanently lock it, but I don't believe that is wise, since there may still be users who are participating without breaking the rules, and it is unfair to them to lock it permanently. To do so would grant power to any cabal who wanted to lock things down, especially if low-trust users (e.g. sockpuppet accounts) could prompt the lock.
Now if there was not a human moderator in charge, if it was purely driven by algorithms and distributed moderation by many many users (thus the time investment for moderation for any given individual is much less), then there should ideally be no full locks for pages - only locks (and warning, elevations of punishments) for specific rule-violating users.
I think half the problem is that sometimes people aren't specifically breaking the rules, but threads get locked because the conversation devolves into something not necessarily warn- or ban-worthy, yet nevertheless not constructive.
I don't agree with the notion of "being silenced" through thread locks. If the thread were also removed, I would, but a thread lock still leaves the conversation that happened open for perusal and doesn't prevent further conversation on the topic from occurring; it just signals that a particular instance of it has gone wrong and approaches may need to be re-evaluated.
You're right that it doesn't stop someone from discussing further by opening another thread, but as you said, the thread is still open for browsing while my reply to the conversation is silenced. Aside from PM'ing the user involved, I can't engage them in conversation: what has been said is set in stone by the lock, and what I want to reply is silenced.
My thoughts on all of those:
No. I love the idea of leaving it up to the community in general, although I assume most of those mechanics aren't in place yet. But I believe there should be as little direct one-person-makes-the-decision involvement as possible. I know you don't mean Tildes to be a bastion of free speech, but let's not retread the ground of putting too much power into too few hands, a la Reddit.
This I'm rather fine with, considering the topics are just moving, not disappearing. Unless I'm mistaken, this is also one of the things the community itself will be in charge of once we get further along. If anything, I'd just make sure to allow the community to undo any move you make. Not that I think you'll be abusive, of course; I just like the idea of letting users decide collectively.
Absolutely. Transparency is key in avoiding corruption, wherever it may stem from.
I have two issues with this statement:
There are good things about reddit moderation and there are bad. We shouldn't throw out everything because the bad things were too prominent to notice the good ones. Moderation being separated from admins was a good thing. Mods being essentially gods of their subs was bad. There are no repercussions for bad modding and no rewards for good modding. That is a really hard problem to solve, IMO.
I am in favor of strict moderation. I know this isn’t reddit, but I ended up unsubscribing from all subreddits that weren’t heavily moderated.
I think starting off very strict is the way to go, especially when the site is growing. Sites develop collective characteristics quickly, so going in aggressively at the beginning makes sense. Once the community settles down you should be able to ease off, and the community can better police itself. But right now, it's a Wild West.
I agree with the notion of starting stricter; in my experience it actually helps a community grow towards a place where it self-polices and mod intervention becomes increasingly rare.
Bullshit.
If someone is being antagonistic or bigoted, and I step in as a moderator to remove their comment attacking someone else, or their racist/homophobic/bigoted rant, they're quite likely to turn that antagonism on me. This has happened to me many times on Reddit. Very few people calmly accept removal of their post or comment as a good thing. Most people dispute it to some degree, and some of these people turn nasty.
Making a moderation log public only increases the number of people who can attack moderators: not only the individual whose post/comment was removed, but all the people who think that nothing should be removed ("Free speech!"), and all the people who agree with the removed post/comment ("But that person was right!").
I don't disagree with your comment, but it would have been better to just leave the "Bullshit." line out. The actual content stays exactly the same, and it feels far less antagonistic. Here's an interesting blog post by the mods of /r/changemyview about exactly this subject: https://changemyview.net/2018/03/28/thats-bullshit-rude-enough-for-removal-a-multi-mod-perspective/
Okay. You set the bar even higher than I do. Good to know.
I'll use something politer next time. Is "Rubbish." acceptable? Or should I open with "I strongly disagree"?
I prefer "Poppycock.", please.
But seriously, I think it just depends what kind of reaction you want. Like that CMV post goes over, starting with "Bullshit." feels pretty hostile and will probably make the person you're replying to immediately "dig in" and feel defensive before they've even gotten to any of the content of the comment. Then when they reply, they'll probably feel compelled to use something at least vehement, and it's pretty easy to end up with it escalating.
I'm not trying to change someone else's view in this case. I'm trying to express my view that I disagree with their view. If I was trying to change their view, I'd take a different approach.
But: point taken. I won't open with "bullshit" or anything similar (even though I like dramatic openings!). From now on, I'll just launch straight into my tirade explaining why I disagree with someone else's view.
I think what's important to remember here is that Algernon is projecting the fact that he's had this discussion a thousand times: not here, but on reddit, where it comes up all the time.
Algernon's a pretty measured person and well spoken most of the time here, but moderators of reddit have a few trigger-topics and have been conditioned by reddit to VERY aggressively defend that position because anything short of macho-assertive-aggression on the topic can fail to garner upvotes in a thread. This can then spiral into a whole section of a community rallying to the side of public logs/messages which is a terrifying and dangerous prospect for reddit mods. In particular because, honestly, 80%+ of reddit's mod teams are really very very rude behind the scenes to users and it would make them look pretty awful. But also because of the very good justifications here that you've seen @deimos agree with.
Not saying that it's right for Algernon to have failed to filter himself properly here. But some context on why the topic provokes such a strong reaction is probably useful here in providing understanding of the human and for the person emotionally speaking behind those words due to experiences and conditioning.
It's not impossible for mods to filter themselves, but it's easy to forget yourself and forget the venue, and given that he has no responsibilities here I can see why he would speak more candidly or be less filtered. I tend to do a pretty OK job at it. I'm not endorsing it, but understanding it can help a lot in these discussions, to avoid some poor language choice causing a spiral of two users going at each other tit for tat in increasingly aggressive ways.
I wasn't angry or trying to be impolite. I was just being emphatic. Saying "bullshit" is not that big a deal here in Australia. <sigh>
Yes, I'm for a harder line of moderation, especially when things devolve into personal attacks.
If we want to create a better version of the online community, it is important that there is a strong system in place to keep things above water level.
But, and this is a big but, the need for transparency goes hand in hand with that strong moderation. If you want to treat your users as grown-ups, with strict implementation of the rules, you also need to treat them as adults when it comes to discussing and giving access to what is being done.
Probably very much so, because there's so much that is still up in the air, as Tildes is so new.
I imagine that, once everyone grows more comfortable with Tildes, it will be less and less necessary.
Yes, that kinda goes hand in hand with the first part of the answer. Transparency will go a long way towards making sure things continue to run smoothly.
We are all in this together, and we all want to help implement your vision of what Tildes can and should be.
A publicly accessible mod log is great. That's something I've run up against many times.
I mentioned this in a reply above, but I think the option to lock individual comment threads rather than an entire topic would be quite useful and a more targeted moderation tool.
I think given your limited time to handle moderation as the only admin/mod on the site, locking the thread was a reasonable thing to do. In an ideal world, I would have preferred that the individuals who were perceived to be disruptive or breaking the rules be warned and/or themselves temporarily locked from commenting, so that the other users who were not causing problems could continue engaging. I sketched out how I envision situations like this being handled in the future, with more moderators, in another comment here: https://tildes.net/~tildes.official/2c4/daily_tildes_discussion_starting_some_moderation#comment-pkx
I have no problems with you moving posts as you see fit at this time. I am subscribed to most groups so it doesn't really affect me that much, and I trust your judgment on where things are best placed.
As far as whether "strict moderation" is appropriate or desired, it depends on what that would involve, and how much time and energy you have for it.
I absolutely think a public log of any official actions you take would be nice to have, and essential as other users are granted power to perform such actions. Even if it is something as simple as a csv or plaintext file. Having a public moderation log lends credibility to the moderating user, as there is a guarantee that the moderator is not abusing their power, or else they would be caught by users who monitor those logs. It is not a concern in the case where you are the only one performing such actions, because I don't believe you have any reason to conceal your actions, and trust is very important to you. It is only really a concern for other users who have been granted these powers, especially if they've been granted through a system that users may not fully understand or which has the possibility of being gamed.
edit: I'd also say that having very clear policy guidelines for moderation is a good idea too. On the minecraft server I was an administrator for, we had a shared google doc that outlined a whole process for land claim disputes, codified from a series of discussions about what was and was not fair and reasonable. It made moderating those situations a breeze, whereas they had previously been something I dreaded for how complicated they could be. Obviously they would only be guidelines, so flexible, but still useful, especially if the users know what to expect; that helps to reinforce the credibility of the mod via the log.
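The "simple csv or plaintext file" idea could look something like this minimal sketch. Everything here (field names, action names, topic identifiers) is my own assumption for illustration, not anything Tildes actually implements:

```python
import csv
import datetime
import io

# Hypothetical append-only moderation log as a plain CSV file.
# Field names are illustrative, not an actual Tildes schema.
FIELDS = ["timestamp", "actor", "action", "target", "reason"]

def log_action(fh, actor, action, target, reason=""):
    """Append one moderation action as a CSV row."""
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writerow({
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "actor": actor,
        "action": action,
        "target": target,
        "reason": reason,
    })

# Build an in-memory log with a header row and two example actions.
buf = io.StringIO()
buf.write(",".join(FIELDS) + "\n")
log_action(buf, "Deimos", "retag", "topic/2c4", "added tags x, y, z")
log_action(buf, "Deimos", "lock", "topic/2b9", "thread devolving")

# Anyone can parse the public file back into records.
rows = list(csv.DictReader(io.StringIO(buf.getvalue())))
assert [r["action"] for r in rows] == ["retag", "lock"]
```

The appeal of this format is exactly what the comment above suggests: it needs no special tooling to publish or to audit.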
So about that specific thread and how specific cherry picked details from it invalidate all arguments ever.
No but seriously, locked threads are the absolute height of bullshit and I hate that the concept is a part of the internet whatsoever. If people say bad things, they get punished. If the topic of a post is not good, it gets removed.
There is no acceptable (i.e. not removed) topic which forces people to say bad things. None. So the concept of locked threads is so deeply flawed that the only possible response I have for it is - if you're doing it, you are doing the wrong thing. Always, no possible way around it, you are doing something wrong.
As for the other parts of that list, retagging should be an organic process like voting, and should not be a moderator action for the most part. The only time it should be a moderator action is if a tag is malicious, i.e. obviously deceptive or completely off-base. Editing topic titles would be fine so long as (unlike comment edit history) the history of the topic title is prominently visible.
As for moving topics between groups... for now, I suppose. It would mostly depend on exactly how the relationships between posts, groups and tags end up working, because there's a whole lot of maybes in there where it does or doesn't make sense for that to be a thing.

It gets very complicated very quickly when you think of things like diverse groups, where one group has some culture and another group has a different culture, but a post is relevant to both. Does that end up in both as a single entity, or is it duplicated? Does it have a primary group, with tags putting it in secondary groups? Do people browse ~ by tags, by groups, or by groups-and-tags? Are there two sets of comments or just one? If just one, are the votes unified, or is there a voting structure in group A and a parallel voting structure in group B?

If it's parallel, that's super messy. If it's unified, suddenly group cultures will clash, like an automated brigade where group A can't talk about anything related to a larger group B and keep it about the topic as it relates to A, because it will get tagged as relevant to B, and suddenly all the voting and activity represents the topic as it relates to group B, with A shunted off to the side.
It's super complicated and all I am sure of is that anyone saying they have the answer is the only one you can be sure is wrong.
I've locked a thread once. We're two mods in /r/crypto (cryptography), I was the only one online, it was nearing bedtime, and our sub suddenly got hit with a thread full of Assange drama, flooded by users from subs 10x+ larger, with things heating up, every question repeated dozens of times by different people, and speculation everywhere.
No thank you, I'm not staying up late to handle that. And I'm definitely not letting that go unsupervised. And I had nobody else to call in with such a short notice. So locking it is.
I'd be impressed for you to find an argument for why that was the wrong call.
a) you did something wrong: not having enough mods to ensure coverage
b) what was causing the drama? If it was the thread, don't lock it, delete it.
A) impossible to know in advance. It came out of nowhere. You're effectively saying that literally ALL tiny communities are doing something wrong for being unprepared for a literal 1000x jump in activity. We had never had to have extra moderators for 24/7 coverage before. We had never even been close. The automoderator rules had been sufficient until then.
Keeping the thread open would have required at least 3-4 mods available at any given time, hawking over the moderation & comments logs. Probably a total of like 10 people (taking turns) with knowledge of the sub's topic and culture and with proper moderation skills, plus good communication.
That's not remotely plausible. The only other option is closing the entire subreddit. That thread just couldn't have been left unsupervised.
B) there was valuable and relevant information in there, but people in the thread were starting conflicts, there was harmful speculation, and insults were everywhere. It was a mess. Allowing people to still see it but not add anything more was IMHO the best choice. The useful information managed to reach the top, and there were links out to other threads in other subreddits for people to follow if they wanted to. However, all the conflict WOULD have spilled over into the rest of the subreddit if unchecked.
a) This is why ~ is the better platform design. There won't be a case where that is a problem, because the moderation level of the highest scale of activity (root groups) will always be available to handle rule-breaking/spam moderation in child groups if it becomes an issue. This is what I mean by doing it wrong, and I should have worded that much better - not that you as a moderator were doing the wrong thing, but that the need for locking a thread in general indicates that something, somewhere, is broken.
b) Again, difference between Reddit and ~ being the cause of the problem. Google is a better link aggregator than Reddit or ~ could ever hope to be in a billion years. What it isn't however is a source of discussion on those links. If there is no discussion, there is no need for the link to exist.
A) the main difference is that Tildes can't guarantee being able to sustain a certain atmosphere/culture. There's, for example, no guarantee that /r/crypto wouldn't eventually be co-opted by cryptocurrency fans, away from us cryptography fans, especially given their larger numbers. This is not necessarily a good thing for all communities.
Moderators from ~comp, which would be the likely root, likely wouldn't understand what rules a cryptography group needs.
B) I don't see what argument you're making here. Why don't you look at my subreddit for a while to see how it works in practice?
The crypto thing is the mistake of naming something ambiguously. Crypto is both cryptography and cryptocurrency. If you want a cryptography tilde... you'll probably end up with ~comp.cryptography (or a ~tech root). Cryptocurrency fans get to play over in ~comp.cryptocurrency. This is a non-issue.
As for root mods not understanding child cultures, that is what sidebar rules are for. Clear, unambiguous rules. If parent ~ mods can't follow them in a shit-hits-the-fan situation, your rules are wrong.
Wrong. The ambiguity came later; the subreddit was created before Bitcoin was even released. Nobody in computing/tech had used crypto to mean anything other than cryptography until years later, when Bitcoin and altcoins started to get popular.
It's not our fault that others later added another meaning.
No.
As an example, an experienced cryptography mod would understand that stream ciphers don't allow key + IV reuse and would then warn somebody giving bad advice, so that people don't follow it erroneously. Same with recommendations to use ECB block mode, MD5 for hashing, etc. The parent group's mods can't be expected to understand such details.
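For anyone unfamiliar with why key + IV reuse is so dangerous: when a stream cipher's keystream is reused, XORing the two ciphertexts cancels the keystream and leaks the XOR of the plaintexts, which is often enough to recover both messages. A toy demonstration, using a hash-based stand-in keystream (this is NOT a real cipher, just an illustration of the XOR structure all stream ciphers share):

```python
import hashlib
import secrets

def xor_keystream(key: bytes, length: int) -> bytes:
    """Toy deterministic keystream: SHA-256 of key || counter.
    For illustration only -- not a real cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    ks = xor_keystream(key, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

key = secrets.token_bytes(16)
p1 = b"attack at dawn!!"
p2 = b"retreat at noon!"
c1 = encrypt(key, p1)  # same key and (implicit) IV...
c2 = encrypt(key, p2)  # ...reused for a second message

# XOR of the two ciphertexts cancels the keystream entirely,
# leaking p1 XOR p2 without ever touching the key:
leaked = bytes(a ^ b for a, b in zip(c1, c2))
assert leaked == bytes(a ^ b for a, b in zip(p1, p2))
```

This is the kind of detail a specialist mod would catch in bad advice and a generalist mod would not.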
You're making the mistake of assuming everything can be condensed into simple rules, and that intricate knowledge of the field isn't necessary.
To moderate a cryptography sub, you need to be able to tell competent people and safe software from charlatans and snake oil. This can't be left to popular opinion. Algorithms can't trivially evaluate this.
None of those specialist problems are a time sensitive issue, and the inability to rename subs is yet another Reddit issue, which is not relevant here.
How is bad security advice not time sensitive? Especially in a popular thread?
And we shouldn't need to change the name. Only a verification that the submitter actually read the rules.
Because if you're taking security advice from social media you've already lost. Nobody is going to die, the page isn't filling up with spam or personal attacks (which the parent mods would be more than capable of handling) and in however many hours until a native mod returned there's no reason to care about anything more minor or specific than that.
You are carrying a sense of 'ownership' of subs from Reddit that has no place here - leave it there, because it is toxic. If the world changes and there is a massive community around say 'technocracy' which keeps overriding ~tech, the solution isn't to go power tripping mod, it's to rename ~tech to ~technology and have ~technocracy. Getting hung up on 'waaa we were here first' is just classic mods thinking too highly of themselves.
People already take bad advice from the web. That's how MD5 is perpetuated, it's why people use CBC unauthenticated, it's why people use salts and IVs wrong, etc...
This has ALREADY caused massive troubles. Hacked websites, leaked passwords, stolen money, leaked data.
This is a genuine problem.
Bad advice right when a new critical vulnerability like efail and heartbleed gets published could destroy entire companies.
Meanwhile our sub has actual professional cryptographers from places like Google and more. In our sub people actually have a real chance to get correct advice.
And I intend to make sure that the people with actual competence keep their visibility over bad advice and spam.
It's not even ownership. I don't care who runs it. I just care about their competence. Yes, actually.
If you visited the sub you wouldn't find much at all that qualifies as toxic. And I can completely honestly say I've gotten praise more often than complaints, by a ratio of at least 3x or so, no exaggeration. That's because I'm active and proactive, trying to halt conflicts before they flare up and keeping the quality high (and I know the field well enough to do so).
The main problem I see here is that out of the people who know the field as well as me, most of them either lack the particular set of soft skills that makes one a good mod, or they don't want to be one (or lack the time). And the other people with mod skills won't recognize bad advice or snake oil, and can't do it alone.
IMHO the correct solution isn't to allow the place with a long history to be co-opted. At most I'd lock the original and create two new ones, with redirects to both (alternatively rename the original, create a locked page with redirects in its place). This isn't about ownership, it's about disruption. Just because you're louder, you don't automatically get to displace people who already were there. Also, it ruins old bookmarks and confuses people.
This is awesome, thank you so much.
Random scattered thoughts on a global log of removed posts:
In the post you're replying to, Deimos wrote this:
If Deimos and his designated assistants can edit topic titles, there should be no need to remove a topic because of its title.
Part of the problem with not producing a log of removals is that it is invisible to the average user if a removal has happened - so without that log, nobody can know whether a removal was contentious or not unless they themselves were involved. This isn't such a big deal with Deimos, but if this expands to other users who may not be as virtuous, I would think having a public log of those removals would be absolutely necessary for the community to be sure things are functioning smoothly. Trust is not just an algorithm, it's something that exists in hearts and minds - a public log reinforces credibility, which is a really great thing for anyone in a position of responsibility to have.
Logging removals publicly is not really feasible. In the cases of users with spammy or malicious intent (linking to shock sites and such for example) you don't want those links surviving for your good users to possibly fall prey to. There may also be legal issues depending on the content and removal reasons. The latter is why things such as Warrant Canaries exist.
You can log the removal action (and reason) without retaining the content of the removed comment/post.
I think a public record of moderator actions is extremely important for the sake of accountability. It can be as simple as "Mod X removed comment Y for spam" or directly on a removed item "Removed by Mod X for violation of Rule N"; this protects users from malicious intent while reducing the temptation for community mods to abuse their powers.
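A sketch of what such a content-free removal record might look like. Storing only a hash of the removed text is my own assumption here: it keeps shock links and illegal content out of the public log while still letting the original author verify which item an entry refers to:

```python
import datetime
import hashlib

def removal_record(mod: str, item_id: str, rule: str, content: str) -> dict:
    """Hypothetical public removal-log entry: records who, what, and
    why, but only a SHA-256 digest of the removed text, so the content
    itself never reappears in the log."""
    return {
        "when": datetime.datetime.utcnow().isoformat(),
        "moderator": mod,
        "item": item_id,
        "rule": rule,
        "content_sha256": hashlib.sha256(content.encode()).hexdigest(),
    }

rec = removal_record("ModX", "comment-123", "spam", "BUY CHEAP PILLS")
assert "BUY" not in str(rec)  # the removed content is not in the log
```

Whether to show the moderator's name or just "a moderator" is then a separate policy choice layered on the same record.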
For many actions and with a more humane humanity, I'd agree. Aside from the possible legal/jurisdictional things, there are a great many users on the Internet that would still see that as censorship and/or silencing them. It's a tough call and I understand why many sites opt to not publish such things, use some form of "deleted" placeholder, or a mix. The resulting griefing of being more forward can be brutal.
[clarified last sentence]
I agree that those people exist, but I think that kind of negative reaction is only amplified by having their words simply be anonymously "disappeared" without comment. Unrecorded deletion/removal by moderators only leads to paranoia, to users building their own tools to figure out what exactly was removed, and to pestering questions from them and passers-by about what happened and why. One example is the heavily moderated /r/science, which I think is valuable in no small part due to its heavy use of moderation, but the first few times I encountered "graveyards" full of nothing but opaque "[removed]" comments were rather upsetting, before I took the time to find the moderator comments explaining their approach and got used to that community.
If reddit provided better tools and visibility into moderator actions, for example by collapsing such "nuked" threads by default, and gave more information in the placeholders "[removed by subreddit moderator: offtopic]" the situation would be much clearer and leave less room for paranoia. /r/science is a fairly benign example, but the same principle holds true for more contentious topics, and the more contentious the topic, the more important transparency is.
There's a fair discussion to be had about whether moderator identities should be public ("A group Moderator"/"A site Admin" instead of "Mod X/Admin Y"), but I feel strongly that the actions and reasons should always be public.
In your first paragraph you hit on something I think we both agree on: context is a major factor. I've tried to keep my parts of the discussion general because reddit is not the only example of sites facing such issues by a long shot, but I agree that there have been some issues with its moderation tools.
Those aside, every moderation action also has the possibility of behind the scenes consequences. Should someone in a righteous mood garner multiple actions, there's a good chance they'll message or file a support issue to complain. For every one of those you rebuff or ignore, there's a chance you gain a user with griefing as a hobby. Even large sites with seemingly deep pockets such as Facebook, Twitter, and yes Reddit have trouble staffing support/QA teams with the skill and patience to handle their volume of support contact.
What it comes down to in my opinion is that sites need to weigh such public facing (and possibly disrupting) features as they grow. For the vast majority of sites, it's terribly impractical due to all of these accumulating factors until they get to the kind of scale to support it.
I think so long as there's no personal information or illegal content, it isn't an issue. Users will know from the fact that the content is in the removal log that there's a good chance it is something to exercise caution over, especially if there is a bit of in-line text explaining the mod action. The risk of users clicking a shock site or spam link while digging through a removal log is outweighed by the risk of legitimate content removed through moderator misconduct being obfuscated (which would make the log practically useless for deterring misconduct around removals).
The sort of people who mistrust moderators aren't going to be mollified by seeing a log of what moderators have done: quite the opposite. If someone starts out with the attitude that moderators remove legitimate content, what they see in a mod log will merely confirm that attitude, because the decision to remove something is always going to be subjective. There will always be someone that claims a particular removed post or comment is legitimate content - even if the post is a rabid rant advocating the murder of all X people or if it's child pornography.
The people who believe moderators are removing legitimate content are often the people who believe in extreme free speech, because they think everyone should have the right to say anything anywhere they want without limits. In that worldview, all content is seen as legitimate, and all removals are therefore confirming the idea that mods are censoring people.
Mod logs will confirm to the free-speech crowd that moderators are censoring people, and they'll confirm to the anti-hate-speech crowd that moderators are removing problematic content. Just like in those ink-blot tests, everyone will see what they want to see and no one will agree.
/r/conspiracy, and around 300+ other subreddits have had public moderation logs for a while via /u/publicmodlogs. I haven't kept up with them all closely, but afaik they haven't had the issues you're predicting - or at least not to a debilitating point to where they've popped up on my radar. I think /r/neutralpolitics has a similar setup as well - or at least they did have, and they're pretty strict about moderation generally.
Admittedly, my opinion is coloured by my experiences as a moderator. In many cases, no matter how transparent I was about what I'd removed and why, I still copped attacks. In fact, because I've usually been one of the more transparent moderators in the various subreddits I've moderated - by leaving replies when I remove a comment, explaining why it was removed - I'm also usually one of the most attacked moderators in those subreddits.
For example, I used to moderate a history-related subreddit. Here are some common exchanges from my time there:
"I've removed this comment because Holocaust denial is not permitted here."
"You're censoring the truth! The Jews lie!"
"This answer has been removed because of racism."
"But Aboriginals/Indians/Africans really ARE stupider than white people - look at the historical evidence. They didn't make civilisations!"
Then there was the time I moderated a subreddit for political discussion. When I removed a comment by a right-wing person because they insulted someone, I was accused of being a left-wing sympathiser. When I removed a comment by a left-wing person because they insulted someone, I was accused of being a right-wing sympathiser. My politics seemed to change from hour to hour depending on whose comment I'd just removed.
In that context, I don't see the benefit of public moderation logs. In my experience, it's just a way to give the mod-haters more ammunition.
Well, it's probably not going to be a popular opinion, but we can always ban the people who flip their shit when they get moderated. If they are starting trouble, and then when they get called on it their brains shut down and they go into asshole mode, why do we want them to remain members of the community?
I'm all for warnings/reform chances, but at some point if it's a pattern of behavior that can't be corrected, there are better uses of our time than trying to 'reform' immature people.
You seem to be making two completely different points here. First you claim that people that are suspicious of mods will definitely not be persuaded by a moderation log, but then you claim that even with a moderation log, "someone" will still take issue. These are not the same idea.
Case in point: I am very wary of moderators, especially on Reddit, because I feel that abuse of power and removal of valid content is the norm rather than the exception, and I myself believe that a moderation log would help allay my fears.
It would be tough to balance that without creating a place for drama aggregation or rubbernecking in my opinion. In my couple of decades of experience as a web developer, I'm constantly amazed at what users can find their way to and popularize.
That said, I would like to see a www that could handle such transparency. My point isn't that such a thing is bad or not an ideal to strive toward, but that it's simply not practical.
I assume this appears on each individual post, rather than in some massive central log? (Is "topic" equal to "post"?)
Of course! I'm a very activist moderator on Reddit. I've been involved in building up a couple of subreddits from start, and one of those required some active curation of content in the early days to set in place the culture that we wanted in the long term. I think of it like training a climbing plant. You can put it next to a trellis, but you still have to guide its tendrils in and out of the framework, and even chop off a few miscreant offshoots. Eventually, it'll grow up the trellis, but you have to guide it in the early days.
Fairly aggressively. Show people where things go now before bad habits set in. It's harder to retrain 30,000 tilders doing the wrong thing than to train 3,000 tilders to do the right thing.
Not really. You should simply add a notification on the removed post itself. I think it's courteous to let a poster know you've removed their post, partly so they don't wonder why the activity on their post stopped, and partly so they'll know what to do right next time they post.
As an aside, I really wish moderators understood this. I’ve almost cried tears of frustration trying to explain this concept to other people when offering suggestions on how to deal with the negative effects of not pruning content.
I am a moderator and I do understand this. And I've worked with other moderators who understand this, too.
Hahahahaha sorry. That sentence was supposed to say more moderators.
Yes, in the sidebar of each post (or at least that's where I have it for now, I may move it later if that seems too cramped for the info).
Cool.
I'm very fine with stricter moderation, as well as setting the bar a bit high to begin with. Move the topics you know have to be moved, and let the rest be. I see good reason for having a list of topics that are closed/removed, to give examples of where the line goes. It would also add to transparency/accountability.
Am I the only one here who doesn't get that upset at people saying something obnoxious or stupid online? I hope this site isn't going to turn into a nanny state where no one is allowed to ever say anything naughty. I'll certainly try to follow whatever rules the admin(s) want to have because it's their site, their rules, but my hope is this site won't get to the point where every bit of text that upsets anyone is viewed as "toxic" like it's going to kill people if it isn't removed from view. Eating lead paint chips and huffing asbestos fibers is toxic, not text on your computer screen that you can turn off.
This is an extremely naive and insensitive point of view. It's not just "text on your computer screen", it's communication between humans and can be just as hurtful as any other form. If you've never been the target of any sort of online harassment or anything similar that's very fortunate for you, but don't act so dismissive of other people's experiences.
That's okay with me that you or others here think I'm being insensitive. Maybe you're right, although if I am it's certainly not intentional. What I wrote above is my honest opinion, and I'm hoping I'll be allowed to continue to state my honest opinion here. And fwiw I have been on the receiving end of some very nasty comments and harassment on Reddit due to being an active poster and mod over there, but I willingly accept that annoyance in exchange for being able to share my honest opinion and read the honest opinions of others, even if I think they're completely wrong-headed. To be clear, I'm talking about genuine opinions expressed in good faith that I may think are ridiculous or awful in some way, not deliberate trolling or insults simply for the sake of insulting.
I could see how your first comment could be interpreted in a worse way, but what you wrote here is how I read it, and I think Deimos was kinda presumptuously dismissive and deflective. There's a massive difference between, e.g., posting unarguably hateful comments to incite chaos, attacking people personally, etc., and people feeling bothered/hurt by others being insensitive with word choices, or being uncomfortable seeing people argue for ideas they perceive as threatening to their identity.
I took the Tildes docs to be clear that the latter wouldn't be grounds for anything, but can't blame you for your concern. Whatever is considered hate speech here, I hope it's far away from some of the more sensitive ideas I've seen posted.
And I really hope that kinda stuff isn't gonna be the norm.
I've spent a lot of time over the last few days talking (privately) with people that are upset about things that have been said, so the "they're just words on a screen, they can't hurt you" type of argument (which a lot of people try to use to justify truly awful behavior) may have made me jump to something more drastic than @RespectMyAuthoriteh actually meant.
However, the fact that even in their follow-up they refer to harassment as an "annoyance" makes me still unsure whether they actually recognize the level of impact it can have on someone.
I think he's still talking about specific kinds of harassment, like the low end of the spectrum mentioned by @Mumberthrax, (say, calling someone a fucking moron whose beliefs make the world shittier in a heated argument v relentlessly spamming someone's inbox with slurs and telling them to kill themselves) but I can't speak for him so I'll speak for me on something related.
While every abusive asshole definitely does use some 'it's just words, get over it' nonsense, I haven't seen that here yet (and I'd have no issue with you shutting it down if/when it does happen), but I've definitely seen people use 'words can hurt' to justify neutering dialogue.
I've seen some people describe their reactions ranging from upset to literally 'horrified' that people are discussing things like questioning pronouns, race and crime, arguing that businesses should be able to refuse service on any basis, not respecting the terminology they prefer, etc.
I'm sure those things genuinely upset some people and some would prefer to never see such subjects debated.
I'd just hope whatever solutions people are offered for such problems aren't site administration related. And that in general, someone being upset isn't how the lines are defined. That's how I understood the docs, at least.
I think maybe you two are talking about different things along the spectrum of misbehavior. If we are having a disagreement but remain respectful, that is like a 1. If you start talking about how you know where I live and are going to come and do terrible things to me and my family, that's a 10. I get the impression that @RespectMyAuthoriteh is imagining things closer to 2 or 3, possibly even up to 7 (think 4chan's /b/, where shit talking is the norm and means basically nothing to the vast majority active there).
edit: disclaimer: I have had my fair share of encounters with actual toxic users elsewhere on the internet and it fucking sucks - especially when they stay just within the letter of the law and still seem to relish in disruption and annoyance.
Depends in what sense you mean. If it's something like
then sure. Once users start taking things too personally, threads end up being an endless argument. The majority of the issues don't stem from the topic but from us, the users.
If it doesn't fit, move it, I say. That's the whole point of the different categories, right? Just make it so we can easily find it once it has been moved. If a topic suddenly vanishes and you aren't subscribed to wherever it was moved... you won't see it again.
Absolutely. Makes ~ more transparent.
I'm not a fan of locking threads. If someone has done something ban-worthy, ban those individual users. If they haven't, but they're getting close, warn them. But I don't think locking threads achieves anything.
Yes. A log of all mod actions is a good idea.
It only takes a small number of people to create a dysfunctional culture. You need to set the tone early to make sure people i) know what's expected ii) aren't surprised when the bans start happening iii) know that moderator action is a request for them to step back a bit.
I'd welcome much stricter moderation.
In general, I think more granular modding options are the way to go. Most of the suggestions here are exactly that.
So if you can time-out someone instead of banning, that's good. If you can ban someone from a thread but not from a ~group, that's good. If you can lock just a sub-thread instead of "nuking" it or locking the whole post, that's good too. Pausing a thread is a pretty nifty idea I hadn't seen before this thread.
Additionally to that, if we ever get user-moderators, and user created ~groups, a feature I'd love is a "democratic modding" mode. Where upon creating a ~group, it can be tagged as a democratic group, and moderators are elected periodically, with some criteria for voting. This would be optional, allowing groups to keep the old "mods invite mods" style if they prefer. The feature would allow the community to be more active in controlling their group.
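Purely as an illustration of how the granular actions suggested above (time-outs, thread bans, subtree locks, pauses) could be modeled, here's a minimal Python sketch. None of these names come from Tildes itself; `ActionType`, `ModAction`, and all their fields are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class ActionType(Enum):
    TIMEOUT = auto()       # temporary site-wide restriction on a user
    THREAD_BAN = auto()    # ban a user from one thread, not the whole ~group
    SUBTREE_LOCK = auto()  # lock a single comment subtree instead of the topic
    TOPIC_LOCK = auto()    # lock the whole topic
    TOPIC_PAUSE = auto()   # "pause": temporarily stop new comments

@dataclass
class ModAction:
    action: ActionType
    topic_id: str
    target_user: Optional[str] = None   # None for topic-level actions
    comment_id: Optional[str] = None    # set only for subtree locks
    expires_at: Optional[float] = None  # None means the action is permanent

def is_active(action: ModAction, now: float) -> bool:
    """An action applies until its expiry (if any) has passed."""
    return action.expires_at is None or now < action.expires_at
```

The point of the `expires_at` field is the "temporary by default" idea: a time-out or thread pause lapses on its own, so moderators don't have to remember to undo it.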
Can we stay away from hyper-strict moderation, à la HN, at least? When moderation gets stricter, bias becomes a lot more evident.
Reading your comment, it's a little hard for me to tell if you view HN moderation as hyper-strict (and thus to be avoided) or as a successful example to be emulated.
In general, I've always felt that HN does a fairly good job of keeping discussion on track, and the (remarkably few) moderators do well at being present and communicative when someone starts to push the boundaries of acceptable discourse. The site has its share of problems, but compared to most subreddits, for example, I've always felt it manages to foster informative and constructive dialogue (in the limited range of subjects that are considered on-topic there).
HN moderation is, IMO, hyper-strict.
Most of what I think has been said already, I'm just adding this to help tally opinions.
I think there does need to be strong moderation while the userbase is growing. Once an identity has been established by the rules, non-harmful unwanted content (pun threads) will largely be stomped out by the users.
Harmful content (attacking a user without benefitting the discussion) should continue to be moderated.
When possible, this moderation should only affect the users involved. As others have said, this could be a temporary ban on the users, a temporary lock on a specific comment thread, or a combination. In extreme cases, temp-lock the full thread.
I think mod actions should be temporary when possible. Obviously repeated actions require a stronger reaction.
I think there should be a public log of mod actions. I'd split them into 'soft' actions and 'hard' actions. Soft actions are editing titles, tagging, and moving threads. These can be done quietly with a single-line entry in the log.
Hard actions are bans, locks, and removals. These should log a specific reason for each case. Perhaps also add a stickied comment to locked threads? Due to potential harassment, these actions should not show the mod's username. This way, the actions can be questioned and discussed without allowing personal attacks on the mod.
Edit:
When removing comments, perhaps replace the comment with a rough statement of what it was? That way, discussion can continue along the thread without losing context. A comment removed for being fluff will spawn different responses than a comment removed for being racist.
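The soft/hard split described above could be sketched roughly like this in Python. The action names, the `LogEntry` fields, and the rendering are all my own invention, not anything Tildes actually implements; only the policy (soft actions are quiet one-liners, hard actions require a reason and hide the moderator's name) comes from the comment:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical action names; only the soft/hard split is from the discussion.
SOFT_ACTIONS = {"edit_title", "retag", "move"}
HARD_ACTIONS = {"ban", "lock", "remove"}

@dataclass
class LogEntry:
    action: str
    actor: str                   # the moderator's username
    reason: Optional[str] = None

def public_view(entry: LogEntry) -> str:
    """Render an entry for the public log: soft actions show the actor in a
    single line; hard actions must carry a reason and hide the moderator's
    username so the action can be discussed without personal attacks."""
    if entry.action in SOFT_ACTIONS:
        return f"{entry.actor}: {entry.action}"
    if entry.reason is None:
        raise ValueError("hard actions must be logged with a reason")
    return f"[moderator]: {entry.action} ({entry.reason})"
```

For example, `public_view(LogEntry("retag", "Deimos"))` yields `"Deimos: retag"`, while a lock without a stated reason is rejected outright.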
Most of my opinions have been stated by others already (yes, strict moderation is a must for a site that wants non-fluff & good discussions, warn users, maybe a one-day ban or something, etc etc)
Two things I'd like to add which I don't think have been mentioned here yet.
From what I've seen of you this likely won't be an issue, but it's worth repeating: make sure you take the same approach with selecting mods and creating a moderation policy/culture as you are taking now; some transparency etc is good.
I'd argue for some additional mod actions: keeping discussion centralised where applicable; fixing links to point to more relevant articles (HN does this when "news" sites say "Bloomberg said" and then give a bad summary; it makes much more sense to link to the original source); not allowing duplicate threads (for news or album reviews, for example); etc.
My thoughts on moderation remain largely unchanged: the best communities are the ones that take care to curate themselves, and that requires moderation proactive enough to not let things spiral out of control before any action is taken.
Aside from the features you've planned, I would propose another feature, something of a holdover from legacy forums: a moderator "warning" button for specific comments or threads, which displays a reason for the warning on the post/comment itself rather than in a moderator reply. Warnings themselves need not have any real effect; the idea is twofold:
1.) Give people a chance to examine and adjust their actions if they are deemed to be an issue
2.) Remind other users visibly of what the rules are and how/when they are enforced
I'm strongly in favor of retagging. The tagging system is pointless if we allow users to ignore the tags for reasons nobody else is informed of ahead of time. I have visceral reactions to certain subjects (Rick and Morty being one of them). I absolutely do not want to see topics of any kind involving that subject. Other people aren't always going to understand why I would feel so strongly about it, and that's fine! But if anyone knows the annoyance of not being able to escape certain topics, I'm sure they'll agree that a functional tagging system is important.
The reverse is also true. If I want to read everything on a subject I have great interest in, I’m going to feel incredibly frustrated if I have to go looking in other tags to find it. That’s completely unsustainable moving forward. I’m sure some exceptions I haven’t thought of will occur. The rules can be further clarified when those situations happen.
What's your plan for making it scale? Right now, in the alpha stage, it's fine to give lots of power to the only moderator (the site admin), because we can probably trust him to make the right decisions on moderation. He's got more skin in the game on Tildes than any other user.
But what happens when the userbase grows? What happens if it grows exponentially? My worry, once you start delegating power to other moderators, is that it's difficult to discern between abuse of the thread-locking ability (censorship of alternative views) and legitimate moderation. The question is: who watches the watchers?
What's your thought on a report button? It seems like a common feature on moderated social sites.
Have you read the page about the (vague) plans for a "trust system"? https://docs.tildes.net/mechanics-future
I'm sure that we'll need some sort of reporting/flagging before long.
Better to nip things in the bud early on. I think if something doesn't fit you should move it into the category you think best fits to establish some kind of baseline.
I'm not sure how you're planning to connect trust levels to edits and other mod actions, but I guess you would need a report feature for every entry in a topic log or any other moderation log.
When topics get heated, you could also (temporarily) prevent them from updating on the activity list. This can function as a pseudo-lock in some situations.
In regards to the global, public log: the developers should assume there needs to be one, but they have to protect mods against harassment and never-ending objections. Leveled read access to individual data points would do that.
The visualization of the log matters a lot as well. There is little use for one when it is nothing more than an overflow of information. For instance, there is no need to gather logs relating to comments outside of their respective threads, and you may want to limit the visibility outside of topics to actual reports, excluding normal edits.