Reddit announces a "revamp" of quarantined subreddits, then quarantines multiple major subreddits
Link information
This data is scraped automatically and may be incorrect.
- Title: Revamping the Quarantine Function · r/announcements
- Published: Sep 27, 2018
I don't even understand what the actual change they're announcing is, given that they're framing it as a "revamp". After it was originally introduced years ago, quarantine has only been used a handful of times as far as I know. I figured they had just given up on it.
The only real difference that the post seems to explain is the addition of an appeal process. So they're... looking to un-quarantine more subreddits?
Edit: looks like this was actually an extremely unclear way of announcing that they were going to quarantine a bunch of major subreddits. So far, at least the following have been quarantined: /r/CringeAnarchy, /r/WatchPeopleDie, /r/Ice_poseidon, /r/TheRedPill, /r/FULLCOMMUNISM, /r/braincels, /r/911truth
If you go to the wiki, there's a list of quarantined subs, but it was last updated 2 years ago: https://www.reddit.com/r/reclassified/wiki/quarantined
I think the big thing is that it will now explicitly show you that you are entering a quarantined subreddit.
It always did that. I was involved in the original development for quarantine, I remember that every quarantined subreddit had an interstitial page and people had to specifically "opt in" to seeing each of them individually.
Oh, I guess I've never come across one. In theory the ability to appeal is a good thing; I'm curious whether there will be any transparency in that process.
Well the point of quarantining was so you wouldn't just stumble across it. It's such a terrible idea though. The equivalent of sweeping a problem under the rug.
As expected the comment section is a total shitshow. Anti-T_D and anti-anti-T_D commenters duking it out over who can downvote each other the fastest, resulting in a sea of "controversial" comments. Admin replies buried under a mountain of downvotes so nobody will ever get a chance to read them. Fun times. :/
At least it makes me appreciate the fact that there are no downvotes on Tildes that much more.
You know, I really dislike the liberal leanings of Tildes, but I'm just glad everyone is respectful here.
Such as?
Here's a survey /u/Kat did earlier on in the site's "history" that included a political-leaning section; it's pretty clearly left-leaning.
Edit: I tried directly linking to the political leaning section but the element for it seems to have a different ID every page load, so you'll have to scroll down a bit to reach it.
Ooh, nice! I didn't see that originally...
Can you link them by any chance? RES' comment navigator can only find visible comments :/
Now that the comment section is well beyond the 500-comment limit (and I don't have reddit gold... sorry, "premium"... anymore), the downvoted comments have been hidden "below the fold", so it will be hard to find them no matter what (which was kind of my point)... but just from memory I was able to find one:
https://www.reddit.com/r/announcements/comments/9jf8nh/revamping_the_quarantine_function/e6qxjxg/?context=1
And looking through /u/landoflobsters' profile there are a few others you can see as well, e.g.:
https://www.reddit.com/r/announcements/comments/9jf8nh/revamping_the_quarantine_function/e6r2mjo/?context=1
/r/Announcements uses Q&A sort, so every top-level comment should have a direct admin response. Just minimize each after viewing the admin's comment.
It's really bizarre to me that reddit is still using /r/reddit.com modmail as their primary point of contact for support. You'd think they'd invest in a proper ticketing system that isn't quite as garbage. It's like if Facebook support was done all through a single Facebook page's messages...
All the messages sent to /r/reddit.com modmail get forwarded into a Zendesk ticketing system in the background. The admins don't actually use modmail, they just use Zendesk and their replies get forwarded back into reddit messages as well.
So from their perspective it is a ticketing system, but it's definitely still pretty weird from a user perspective for "how do I contact the admins?" "Oh, just go to this weird old subreddit that's been dead and locked for years, and message the mods there."
A conversation like this has probably occurred at some point or another:
Technical debt. Legacy code. Both phrases that are physically painful to think about.
Yeah, it gets more and more evident that most admins don't even use reddit the way most users/mods do. I wonder if the new modmail ever got finished after they forced a horrible half-built system and then abandoned it.
Yep! I spent some time trying to educate various helpers in /r/Help to not refer people to the modmail in /r/reddit, because I knew it was confusing to the general population - most of whom joined after /r/reddit was shut down and had never heard of it.
I usually just provide the direct link to the /r/reddit.com modmail so it isn't quite so confusing for people. I have even had to do that a bunch of times recently for several shadowbanned accounts requesting Tildes invites... so far two of which were reversed by the admins as a result, allowing me to finally process their requests.
A few helpers did that. It's helpful in the short term, but I'm a fan of the "teach a man to fish" method: "Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime."
Giving someone a link to the /r/reddit modmail helps them here and now. But what about when they have another problem in a month's time? They have to dig through their user history to find that post in /r/Help where the helpful person gave them that funny link to that dead subreddit. Meanwhile, if you give someone the admins' email address, that's easy to remember (contact@reddit.com). Or if you guide them to the 'contact us' menu at the bottom of every page, they'll be able to find it again for themselves next time.
Thing is, and I speak from the perspective of only having to contact the admins a few times a year, but it's been my experience that they are way more responsive through /r/reddit.com modmail. No idea why.
Well, if, like Deimos says, it's being forwarded on to an official ticketing system, that'd be why. That ticketing system is probably the only 'official' company mechanism that has paid eyes on it 24/7. All of the other, less official means are at the whims of the employees, which means they'll be monitored less stringently or not at all.
Right, but in theory contact@reddit.com goes to the same endpoint. I've used both—modmailing /r/reddit.com is more responsive.
That's a good point. But I suspect people are much less likely to want to send a formal email tied to their real identity in some way than to simply use internal messaging, which probably discourages quite a few people from using it. The 'contact us' page is a bit of a confusing mess requiring people to specifically click "message the admins - something else" or get nowhere, so I don't like pointing people there. And finally, while the reddithelp contact form is far more straightforward and convenient, it sometimes confuses people simply because it's on an oddly separate domain which may appear like a phishing attempt.
Reddit really needs to come up with a better internal way to contact them. :/
Yes. Yes, it is.
Fuck yes!
Yeah, that definitely is the weirdest part. It shouldn't be too hard to set up a link somewhere in preferences or something that says 'contact admins'
/r/reddit.com still works, but you now get an auto-reply suggesting you use https://www.reddit.com/report instead.
Or in the new design they use the drop down menu on their username, go to "help center", then use the bold "contact support" link.
In other words, they don't show ads to their members, and are instead subsidized by all the normal subreddits that do show ads. Since when was "doesn't get advertised to" a punishment?
I have a strong suspicion that quarantines were necessary because they didn't want advertisers seeing their ads next to clear hate speech. It comes off to me as a way of protecting their revenue stream, not sacrificing it.
Yeah, this is how I see it too. Plus it's not like the users of t_d, et al, only stick to hate subs. They're viewing ads in other subreddits, buying Reddit Gold, bringing in revenue... just not in the nastiest places. If Reddit really valued ideals over revenue they would ban the subreddits and their most prolific users.
Found this floating about in one of the various Reddit/mod slacks
The sad demise of /r/CringeAnarchy
Oh there we go. They're quarantining multiple major subreddits now.
One of these things is not like the others.
Someone hypothesized that they just added it to hinder alt-right whataboutism
FC also has a quiet bit of genocide denial. It's not a good sub, and I speak as a leftist.
I am honestly not surprised to see /r/Ice_poseidon on that list. I wasn't even aware of who he was before doing the Tildes invite threads, but I investigated after noticing that one thing quite a few of the users I opted not to send invites to had in common was frequenting that subreddit. That and /r/opieandanthony as well.
It turns out that asshole celebrities often have equally shitty people as their fans, who could have guessed?
Given the amount of criticism Reddit has been under for bias and stifling free speech, I'm surprised by this announcement. I don't really see the value added here.
EDIT: I found out that this is just adding appeals, which isn't as bad. But it has certainly created some uproar in the comments for some reason.
Personally, I think it's the excess of free speech that causes a lot of Reddit's problems; specifically the spread of misinformation and hurtful commentary levelled at other redditors and non-redditors alike.
Then again, I'm not a U.S. citizen, so I find a lot of U.S. concepts, such as "free speech", very weird.
I like to make a distinction between how we talk and what we talk about.
We should be free to talk about anything at all, with no limits - that's what free speech means.
It's the how part that we're all getting stuck on. Calling someone a cunt is not the same as calling them out on their opinions and debating them. When you call someone a cunt, the conversation is pretty much over - nothing of value will follow that comment. That's the problem, and also the solution.
Keep the conversation healthy, but don't restrict the topics being discussed. Just block bad behaviors that corrupt the conversation. Technically, this means you are censoring the person who is behaving badly - but only until they stop acting like a child and grow up a little.
That's basically where Tildes is trying to go - with a dash of 'good taste' in the mix (no porn communities, dis-incentivizing low-effort content, etc).
The real reason this never happens online is simple - drama drives more traffic than anything else does, and all of those sites need the traffic to 'grow' so they can make money. That's why places like Reddit and Youtube will continue to intentionally misunderstand the problem - their salaries depend on them not understanding it.
I disagree. IMO there are certain topics and ideas that shouldn't be allowed to be broadcast in public (and on this site) as they are just blatant attempts to hide abhorrent, hateful beliefs behind dog-whistles and "reasonable" debate while being used as recruiting tools for hate groups. e.g. holocaust denial, transgenderism being a "mental illness", race "realism" aka "scientific" racism, etc.
p.s. This is precisely why Canada prosecutes public promotion of holocaust denial under our hate propaganda laws, and I agree with that approach entirely.
Frankly, I think if you keep the 'how' clean, these mental viruses won't arise in the first place. They depend on misinformation and fear/rage-inducing rhetoric for their appeal and to propagate through a community. The best disinfectant is sunlight and a conversation about exactly how and why these beliefs are wrong - which requires talking about them. Make the topic forbidden and the place might get a bit cleaner, but no minds will be changed and no one will learn anything.
Really? How's that ideology working out for reddit? Because from my perspective things there have gotten steadily worse because of that same ridiculous belief of spez's, and the alt-right numbers have been growing on the site as a result of it, not shrinking. Whereas the bans have actually been proven to be effective. So IMO "how" you allow those issues to be discussed isn't the problem at all... it's allowing those abhorrent ideas to be broadcast in the first place that is.
Like I said, you bias the system towards facilitating an actual conversation - that's curating the 'how' aspect. In practice, that's working out for reddit just fine everywhere it's applied - which is a tiny group of places that are mostly part of the depthhub network.
Uhhhh.... you do know that /r/askhistorians (a member of depthhub) has banned holocaust denial questions/debate, right?
And I very much suspect race realism bullshit would also similarly be removed instantly from /r/askscience as well.
p.s. You can read about the AskHistorians ban (and their reasoning) here:
https://www.reddit.com/r/AskHistorians/comments/90p2m0/meta_askhistorians_now_featured_on_slatecom_where/
Yes, because holocaust denial is ridiculous and is used on reddit almost exclusively for soapboxing (which is a behavior that should be carefully watched regardless of topic). They didn't ban talking about the holocaust, though.
You have just moved the goalpost, IMO. I never said all talk of the holocaust should be removed, merely denial of it (as well as the dog-whistles), whereas you clearly stated you think there should be "no limits" as to what can be discussed so long as things are kept civil in tone.
And as to AskHistorians, not all holocaust denial is so blatantly obvious. They have banned more than just blatant denialism but also all the standard denialist dog-whistles and strategies for "debate" lead-ins as well (e.g. questioning the numbers killed, modern medicine benefits resulting from it, Zyklon-B development, etc). Even anything that they even remotely suspect is borderline bad-faith, they leave a 1,600 word retort to denialism, remove all the comments but theirs and then lock the thread.
The other two topics I mentioned (transgenderism being a "mental illness", race "realism" aka "scientific" racism) are very similar to holocaust denial in almost every way, including acting as dog-whistles.
I guess the difference is that I don't see 'holocaust denial' as a topic. I see 'holocaust' as a topic, and denial of it just one smaller part of that conversation. I have no objections to blocking it because that position is flatly untenable - we have proof and that's the end of that part of the conversation. People are free to take issue with that proof, but not free to bring that nonsense into a larger discussion with the goal of pushing some kind of agenda.
Both of your other examples are also positions that are flatly not supported by evidence, so it's the same story there. These are rhetorical weapons used to corrupt conversations, and there are a lot of them. None of them can stand up to sunlight - the instant you throw cold hard facts on the discussion, those positions evaporate. The people holding them have no choice then but to resort to discussion-killing behaviors because if they don't, they can't 'win.' When they do that in a place that cares about civility, they get warned, and if they persist, they get removed.
The answer to this crap is Snopes. Collect all of the evidence/facts/hard data in one place, make it easy to read and understand, and then make sure that everyone knows the link. That way whenever these distractions rear up, the people promoting them are forced to rebut the entire body of evidence. When they fail, they'll end up looking ridiculous, and lose any social credit they have because of it. That'll make them less likely to do it again in the future, or they'll leave and do it somewhere else where they can get away with it.
When we have a Tildes wiki operating, I think if we have short links to wiki pages that people can easily remember and use in the comments it'll throw this process into overdrive, and give people dedicated to facts and honest discussions an edge rebutting this sort of nonsense. I could type up a five paragraph response to a holocaust denial comment - and source it, and linkify it - or I could just remember the last time I saw this and make my reply "See $holocaust-denialism." Tildes can auto-convert that to a link to a page in the wiki that does a better job and has been built by dozens of people, updated every time the topic comes up with new information.
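The shortlink idea above could be sketched roughly like this. Everything here is an assumption for illustration (the `$slug` token syntax, the wiki URL scheme, and the function name are all hypothetical, not actual Tildes code):

```python
import re

WIKI_BASE = "https://example.org/wiki/"  # placeholder domain, not a real wiki URL

# Match "$" followed by a lowercase slug of letters, digits, and hyphens,
# e.g. "$holocaust-denialism".
SHORTLINK_RE = re.compile(r"\$([a-z0-9]+(?:-[a-z0-9]+)*)")

def expand_shortlinks(comment_markdown: str) -> str:
    """Replace each $slug token with a markdown link to the matching wiki page."""
    def to_link(match):
        slug = match.group(1)
        return f"[{slug}]({WIKI_BASE}{slug})"
    return SHORTLINK_RE.sub(to_link, comment_markdown)

print(expand_shortlinks("See $holocaust-denialism for the full rebuttal."))
```

In practice the conversion would presumably happen in the comment renderer, so the stored comment keeps the short, memorable token while readers see a proper link.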
So yes, I do think we need these to be discussed here - if only to build those wiki pages, so that anyone who comes here will end up getting a solid presentation of real information. Once that's done, those discussions will end as fast as they begin, and become a non-issue. Don't worry though, there will always be more bullshit to debunk tomorrow, so there's plenty of work to look forward to. ;)
Your idea about building a wiki full of evidence, facts, and hard data is idealistic and nice - but naive. These people don't give a shit about your supposed "facts". Posting evidence to counter their claims only plays into their hand. You're engaging with them. You're allowing them to question your facts. You're putting the initiative in their hands. You're following their agenda. That discussion where you present facts and they innocently "ask questions" about those facts is just letting them put their talking points in front of an audience.
Sunlight isn't that strong a disinfectant. It's not instantaneous. While it's slowly and gradually reducing the size of a mould patch in one area, some spores escape and infect other areas. You need to quarantine that shit, rather than try to expose it.
That's not really where I was intending to go with the 'here are the facts' pages.
I think of them more like a clue-by-four to use to beat users like that into submission. It's not up for questioning or debate, here it is and this is the way things are run and why. If we've decided a topic is settled, then it's settled, and nagging ceaselessly about it after being informed is grounds for user discipline. That topic page serves as an educational record that gets everyone including new users on the same page as the rest of the group - without having to explain it a hundred times in comment threads. Just a quick link instead whenever the topic pops up.
It's not there for the idiot who is beating on his dead horse, whatever that horse happens to be that day. It's there for the people who come by that thread later and are wondering why the thread is locked and that user is muted or banned. It's public record, and somewhat like a legal precedent in a way, since it's setting the tone for the moderation. Issues arise, the group talks through them, develops consensus, implements new rules or guidelines (or new conventions, labels, and even governance systems) to solve them.
I think this whole discussion is getting a bit muddled because 'holocaust denial' is the topic. I'm looking at this from the perspective of dealing with a major issue in a community and having the group itself handle it well. I honestly couldn't care less about the exact topic, I tend to think in the abstract. There will be new dog whistles in the future, and groups are going to want to ban all sorts of content and behaviors for good reasons and bad ones. I like having that public record to point back at and say, "This is why things are the way they are." I think it's a far better mandate than you can get from fiat.
Hope that explains better where I'm coming from.
I still think it's naive to believe that a wiki page of facts is going to shut down someone whose whole purpose for posting is to challenge those facts. As long as you allow them to present their talking points, someone else is reading those talking points. All those conspiracy theorists and fake-news peddlers already have the perfect responses for your fact-dump: you're just buying into the mainstream delusion, or you're shutting down honest discussion. And, out there in the big wide world, someone else is reading those comments, and believing them instead of your facts.
The wiki page doesn't shut him down.
The mods shut him down with a mute or locked thread, or admins with a ban. Then they link to the wiki or past discussion. Everyone reading the thread will know what went down and why.
I am honestly a bit surprised this is so hard for people to grasp. This stuff seems very simple to me. :)
So... just like Reddit, then? :P
Because this three-line comment you just wrote is very different to your previous explanation of how this would work:
You were previously saying that the evidence itself would be enough to end these discussions. However, now you're saying that moderators will need to step in to shut down those discussions, using the evidence only as a justification for their actions after the fact. These are two very different scenarios.
There are no rules without the discussion about those rules. The group won't follow any other authority. That's just how groups work. That's where the mandate comes from. When those rules are put into practice, having the past discussions that created them as a handy reference is a good way to educate the users and explain why action is being taken.
Holocaust denial is a bit of a special case, since we don't really need to have that discussion to decide to block the behavior. It's been had in askhistorians, and that link shared in this very thread, exactly like I said people would do with the wiki pages. This is normal forum behavior old as usenet. It'd be so much easier to do that and remember using wiki shortlinks, no?
Let's talk through this in excruciating detail and figure out where I'm losing you and cfabbro.
Now, here's Bob coming into this hypothetical clusterfuck of a thread, which is probably locked by now (or will be shortly). He's new, never been here before. He doesn't know the rules. He's seeing mod actions for the first time, and they just censored a user - but look, there's a link. Ah, this user has a history of promoting $whatever-bs and here's another handy link on why that position is bullshit because it pops up from time to time. Looks like everyone in this group decided to block the behavior and here are their reasons. Now Bob knows why things happened this way, can read to learn about the issue, and has a handy reference to rebut that particular flavor of bullshit in the future.
Next rulebreaking thread on the same topic (five days later, from a completely different user) is locked in minutes with the same results and same link. This time, lots of users posted the link, because they remember it from last time - they beat the mods, who arrive shortly to deal with it by locking the thread. The discussion ended as fast as it began, and became a non-issue. That's as solved as the problem can get in an online forum.
The evidence and past discussion ends the debate about if the content is allowed or not. It then educates new users and gets them all on the same page about the group's goals/governance. It's all out in the open. Mods have now been given the mandate to shut that topic/rule violation down going forward.
This is all so simple and clear to me I feel like I've walked into a bizarro world where basic moderation is like quantum physics to some people. This is how communities have always self-moderated in the past since the time of BBS systems. All we're talking about here is formalizing the process and putting together systems to help facilitate it.
If that 'discussion' about what rules and why never happens, I'm going to have a problem with it, and basically turn into goldfish on this topic. That's where my 'free speech' line is. Without that discussion it's random-power-user fiat, and that will end badly, like it always has, because those power users will have biases and make mistakes. The group has a shot at finding a better solution due to more perspectives being present.
So far Tildes has spent a rather terrifying amount of time discussing this stuff. All the systems and rules have been up for debate and refinement, and the few by-fiat rules here are simple common sense I think most anyone can get behind that were well-explained by Deimos in his announcement and docs pages. That's exactly how it should be. What I've described is how you scale it up to a hundred thousand plus communities, past the point where a single person has enough time/brainpower to keep up.
With all due respect, you did not explain your proposal clearly until now. You assumed @cfabbro and I understood certain parts of it that simply weren't clear.
I'm in total agreement with you on Steps 1 through 5.
As for Step 6 ("Oops - can't post any new comments for x days/weeks."), this is purely hypothetical, as I already indicated. There has been no hint from Deimos that there will be any posting limit on new Tildes accounts.
There will always be users who engage the bullshitter. Whether they don't know about your hypothetical wiki page of bullshit-stoppers, or whether they just love a good ol' rough'n'tumble, they'll dive right in, even while other people are posting links to the bullshit-stopping wiki page.
The bullshitter isn't going to stop bullshitting just because you post some facts at them. That has never stopped these people in the past, and it's not going to stop them in the future. I was once witness to the ultimate slap-down of scientific racism in /r/AskHistorians (paging @mundane_and_naive who might be interested in that thread), but it didn't stop those people coming back over and over again.
The only thing that stops this discussion is the mods arriving and locking the thread.
I really think you're a little too idealistic about how people behave. Having some wonderful wiki page full of bullshit-stoppers isn't going to stop the bullshitters coming, or stop people engaging with those bullshitters. Only the mods will achieve that.
Now, maybe those mods are acting on previous discussions among the community, so everyone is on the same page. But only their moderation actions will stop the bullshitters; the bullshit-stopping wiki page won't do a bloody thing to stop bullshit.
That's fair, given how scattered all the discussions about moderation solutions are floating around in multiple separate threads. I shouldn't be expecting everyone to keep up with them all.
I think it's inevitable there will be something done. It's no burden at all to honest users who only ever go through being a newbie one time. They'll forget it even happened and not care in the slightest. It's a bane for dishonest ones that have to do it over and over like Sisyphus every day. That metric is too tempting and effective to pass up unless we can dream up better ways to do it. Frankly, the entire moderation system fails if there's no disincentive to being banned, so that means there will be disincentives of some kind. If the site's ever charging for access (like SA and The Well) that's also effective without using time limits. The banned users get to pay to get in, again and again.
Sure. Some people can't resist troll bait. User labels can make it difficult to reply to a malicious comment, though. "This user appears to be a troll, are you sure you want to feed it?" etc. You'd have to click through before replying. Deimos mentioned that in the latest thread about labels. I'm pretty sure we can find a label-based solution to shutting down bullshitters real fast, before mods get involved (and probably calling the mods' attention to the thread/comment in the process).
As I've said repeatedly, he'll stop when the mods make him, and then he'll go away - for a long time. So will his friends. If they are stupid enough to die on that hill, they are making it easy for us. Meanwhile everyone else gets to become informed. The wiki page is there for them, not for the bullshitter.
I think 1% of the people cause 90% of the problems. Once they are effectively designed out of the system we'll see what's left.
You are misunderstanding and misrepresenting @Amarok. They are basically saying that facts when invoked will lead free speech abusers into uncivil behaviour, allowing us to pick them up and put them out of the door quite easily, while not disallowing all discussion around sensitive topics.
You should read Amarok's excruciatingly detailed explanation.
A facts page makes it harder to derail the conversation. Anyone who wants to legitimately argue the other side has to do the work of tearing down that page to get anyone to take them seriously, and that won't be easy if the page has been through multiple discussions, improving every time. Those facts pages could end up becoming pretty badass resources on controversial topics.
The malcontents likely will regress to bad behaviors since that's really their only option if they can't counter the facts. Then the hammer comes down and they get muted or banned.
Facts win in real life. It's about time they started winning on the internet somewhere, too.
They don't evaporate though, that's the damn problem! And also precisely why the subject has been banned by AskHistorians! It's perfectly summed up by the AskHistorians mods in the Haaretz article, IMO:
Sunlight doesn't disinfect these abhorrent ideas... it just allows their proponents to plant their insidious seeds into the minds of everyone who looks at them!
We'll see about that. Tildes isn't reddit, reputation matters here. Over there, I can spam askhistorians with crap like that all day, every day, forever, and there's nothing they can do about it except post rants which fume impotently about the problem.
There isn't nothing they can do about it. When I was on that mod team, we used to remove that crap with utmost prejudice, and ban the people posting it (and I assume they're still doing that). We did the same thing with "race realism" crap, which often started with the innocent-seeming question "Why didn't Africans develop civilisation?"
Yet there's nothing stopping them from posting it again on reddit. Here there should be something, even if it's just a waiting period. Over there you're basically clicking to remove asshat content as fast as they can click to repost it.
Seems to me like that's wasting moderator time. The way to beat the asshats is to make sure that for every action a mod/admin has to take, the other party has to go through far more to get around it. That way they have to waste effort ten or a hundred to one compared to the mods. That gives the mods a fighting chance again.
Are you proposing that undesired behaviour by a user in one group on Tildes should affect their ability to participate in all groups on Tildes? Because, if someone posts unsupported crap in ~humanities.history.ask and the mods of that sub-group take action against that user, that user is free to post their crap again elsewhere on Tildes unless those mods can enforce some sort of restriction across the whole of Tildes.
Is that really the way to go? Do we want the moderators of one group/sub-group being able to affect a user's ability to participate everywhere else on Tildes? Things I've read about Tildes' future direction have hinted and implied that the ability to restrict a user's participation on the whole of Tildes is going to be restricted to a very small coterie of ultra-high mods - not just the mid-level mods of groups and sub-groups. In other words, it'll be like Reddit in that way: moderators can ban a user from their own subreddit, but only admins can ban the user from the whole website.
I expect the mods of any given group will have the ability to mute users for a time period within those groups only. Giving mods banning capability is probably a bad idea - we've seen what happened on reddit. I wrote a bit about this before.
That said, if a user is constantly making a pain in the ass out of himself in certain groups, he's likely to get banned from the site, rather than just the group. When someone acts like an ass in the produce section, they get kicked out of the supermarket, not just the produce section. When people are asked to shape up, the ones who do won't have any problems. The ones who don't won't be here very long.
... which is exactly the same situation that currently occurs on Reddit, that you're describing as a waste of moderators' time. A moderator bans a person from one subreddit/group, but the person can still post in other subreddit/groups. The mods of ~humanities.history.ask can stop their problem child from posting in their group, so the problem child takes their crusade to other groups on Tildes. Eventually, the lower-level mods of a few groups have to appeal to the higher-up mods to get the problem child banned site-wide.
Unless a low-level mod of a sub-group can stop a user posting across the whole of Tildes, then we have exactly the same problem you're complaining about on Reddit: "there's nothing stopping them from posting it again on reddit". Regardless of what you call the various functions ("mute", "temp ban", "permaban", "kick out"), the operation of those functions appears to be the same across both platforms: low-level mods can restrict a user's ability to post in a specific location, but only high-level mods can block the user from posting on the whole website. It's the same basic split of responsibility and functionality on both websites.
On reddit, I can create a new account in one minute.
On Tildes, it's not that simple, and it should never be that simple. That user loses all trust, and that can come with posting restrictions or even having new users wait a couple weeks before they get posting privileges.
Explain to me how that's the same. It looks to me like the difference between no punishment at all and a couple weeks sitting in the corner, unable to do anything except fume.
The user posts something vile like that one time. Warning. The next time, ban. Done deal. They don't get to take their little crusades on the road. That's already happened here a couple of times. Accounts will have a report card eventually, just like we already have on reddit, so keeping track of this stuff across dozens of communities/groups/mods/interactions should be no problem. When the user crosses the line, it'll get noticed and sent on to whatever group/team Tildes eventually has for handling that sort of discipline.
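The warn-then-ban "report card" described above amounts to a tiny state machine: incidents accumulate across all groups, the first one draws a warning, and a repeat triggers a site-wide ban. A hedged sketch of that idea (all names here are invented for illustration; Tildes has no such class):

```python
# Sketch of the "report card" idea: incidents are tracked across every
# group, not per-group, so a crusader can't just move to a new group.
class ReportCard:
    def __init__(self):
        self.incidents = []   # list of (group, reason) across all groups
        self.banned = False

    def record(self, group, reason):
        """First incident anywhere -> warning; any repeat -> site-wide ban."""
        if self.banned:
            return "already banned"
        self.incidents.append((group, reason))
        if len(self.incidents) == 1:
            return "warning"
        self.banned = True
        return "banned"

card = ReportCard()
print(card.record("~humanities.history", "holocaust denial"))  # warning
print(card.record("~news.germany", "holocaust denial"))        # banned
```

The key design choice is that `incidents` is global to the account rather than scoped to one group, which is exactly what distinguishes this from Reddit's per-subreddit bans.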
I'm pretty sure that someone who's here to spread discord and disinformation is not interested one little bit in whether they can apply comment labels or edit topic tags.
These things are extremely hypothetical. Sure, some Tildeans have proposed these as possibilities, but I've never seen Deimos even hint that he endorses the idea that someone might create a new Tildes account but not be able to post until later. Let's not mistake Tildeans' wish lists for actual Tildes functionality.
Who does the banning? The local mods of a sub-group, or the higher-up mods? And how is that different to moderators of a subreddit banning an account from their subreddit, and then appealing to admins to get that account suspended on the whole of Reddit? Local mods can block users from posting in local groups/subreddits, but only higher-up mods can block users from posting on the whole website.
Please explain to me how a moderator of ~humanities.history.ask can identify a Holocaust denier in their sub-group and then block that denier from posting their crap in ~news.germany. If they can't do that, then this is exactly the same problem as on Reddit: a moderator of /r/AskHistorians cannot block a denier from posting their crap in /r/Germany.
We're talking on two tangents - let's take it over here where I talk it through.
He hasn't. Right now since it's invite only we're completely protected from users coming right back to continue their crap, so we don't need to deal with it.
Once Tildes isn't invite only, we'll be needing to do something about it - some system or mechanism that prevents someone from logging right back into the site on a new account to continue their bullshit. What that is, we'll discover when we have a daily discussion about it someday and see what people come up with.
Here's an idea: maybe something like user flags that only certain high-level trusted users can see, edit, or add to. Then ~news.germany would be perfectly capable of filtering out individuals labelled with certain user flags from other sub-groups, like holocaust-denier, neo-nazi, or known-troll, and whatever other flag they wanted to deny posting or reading access to.

Yes, this is off-topic. If you want some reading material, start with the Why is Africa "less developed"? section of the FAQ over at /r/AskHistorians. If you have follow-up questions, you could post them there, or even try our own ~humanities group here on Tildes (which is an umbrella group that includes history).
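The user-flags proposal above can be sketched in a few lines: trusted users attach flags to accounts, and each group declares which flags it refuses. This is a minimal illustration under the commenter's assumptions, with every name (`UserFlags`, `may_post`, the flag strings beyond those in the comment) invented for the example:

```python
# Flags that hypothetical high-trust users are allowed to assign.
FLAGGABLE = {"holocaust-denier", "neo-nazi", "known-troll"}

class UserFlags:
    """Central registry of flags on accounts, editable only by trusted users."""

    def __init__(self):
        self._flags = {}  # username -> set of flag strings

    def add(self, username, flag):
        if flag not in FLAGGABLE:
            raise ValueError(f"unknown flag: {flag}")
        self._flags.setdefault(username, set()).add(flag)

    def flags_for(self, username):
        return self._flags.get(username, set())

def may_post(registry, username, blocked_flags):
    """A group denies posting if the user carries any flag it blocks."""
    return not (registry.flags_for(username) & blocked_flags)

registry = UserFlags()
registry.add("troll42", "known-troll")

# ~news.germany might choose to block all three flags:
news_germany_blocked = {"holocaust-denier", "neo-nazi", "known-troll"}
print(may_post(registry, "troll42", news_germany_blocked))       # False
print(may_post(registry, "regular_user", news_germany_blocked))  # True
```

Each group supplies its own `blocked_flags` set, so ~news.germany can filter deniers flagged elsewhere without its mods having any site-wide ban power.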
(I've labelled your comment as 'Offtopic', but I can't label my own comment. Could you please...?)
But that feeds back into what @cfabbro specifically said: some topics you just have to ban outright for those very same reasons. Holocaust denial is almost exclusively soapboxing or politically motivated whitewashing, but still done under the guise of healthy skepticism and historical discussion. More damningly, it most often comes from groups tangential to, or outright overlapping with, the kinds of people who want to discuss "race realism", "preservation of white culture", or "Men's Rights".
More importantly, the argument for why banning Holocaust denial is bad, and why AH is wrong for doing it, rests on the exact same reasoning: that sunlight is the best disinfectant, and that the best response to Holocaust deniers is to educate them, or the people listening to them, about why they are wrong. And as @cfabbro said: that didn't work for Reddit, and it didn't work for AH, so they decided to ban it outright, no matter how respectful, well-articulated, or sourced the denial replies are.
Like I said, this isn't Reddit. The solution comes from giving the group the tools to defend itself from hostile activities. If someone goes around Tildes posting holocaust denial after being shown the evidence, they'll get warned; if they fail to shape up, they'll get banned (a real ban, not Reddit's fake bans). That'll get taken care of by the group's own members using distributed democratic power.
Jumping straight to 'let's censor everything we find contentious, subversive, or disagreeable' brings its own problems. So will importing all of reddit's traditions and projecting reddit's failures onto a new place that hasn't got reddit's limitations or spineless administration. :)
The problem comes when someone has to decide what's allowed and what isn't. I don't trust any regulating body to do it: not today, not in 20 years, and most likely never. Censorship is what it is; I disagree with it on principle, not on implementation. Everyone should be able to voice their opinion, especially in a respectful manner; if you disagree, you can always counter their points. And the worst part of censorship is not even that it violates someone's right to speak: it also violates everyone's right to listen. I would never, in my right mind, let anyone decide what I am allowed to read or listen to, and speech restriction effectively does this as well. In fact, if someone tells me NOT to read or listen to something, you can be pretty sure that's exactly what I'm going to read or listen to next.
A call for violence is a call for violence; I don't know what else to say here. It's the "overt" part I have a problem with, since if you ask the right people, anything can be violence. If we are willing to agree on any limitation of free speech and free expression, those limits must be set in a very clear way, beyond any reasonable doubt about what they mean. There can be no "overt" here, because any ambiguity in limitations can and will be used to cover more and more ground. For example, in my country we have free speech written into the constitution on paper, but we also have a law that forbids blasphemy and "extremist" speech. It was sold to the public as a way to fight nazis and anyone else who tries to stir up hatred, but now it's a decade later, a few administrations have changed, and with each iteration they've added more and more to what counts as extremist. So now anyone who protests reforms or expresses political dissent is an extremist, and even citing the constitution is "extremist" if the judge rules so. And with that one single law we've lost all our freedom of speech.
See, you've done it again: you used the word "may". Only if there is a clear threat in what's being said can you act preventively; if there is a "may" at play, then someone acting on it is punishable, but the speech itself should not be. You have to respect my right to be exposed to any ideas, even the ones you consider dangerous.
I don't think it's as ambiguous as you make it out to be. If we aren't joking with each other, me saying "I'm going to come to your house and burn it down" or "I will punch you right now" is pretty clear.
I don't think that anything should be censored, a call to violence should be investigated by the law enforcement. One exception here that I'm willing to consider is when some sensitive information is exposed - for example, someone's home address that isn't public, some form of an ID number, credit card numbers - that kind of stuff.
That's almost an oxymoron - like 'civil war'. You aren't being civil if you're calling for someone's face to get bashed in, by definition.
The subtlety can be a problem - it's possible to craft responses that appear perfectly reasonable but use weasel words and other common deceptive tactics to muddy the waters. It's not hard to craft responses with a mild air of condescension that triggers emotional responses putting people at a disadvantage in a discussion. The best trolls don't appear to be trolls at all.
Vigilance is the solution. Someone will see through the bullshit and put the troll in their place. Wouldn't it be nice if the forum itself was built to facilitate that happening, rewarding the vigilant and punishing the troll? ;)
Beating the trolls means playing a long game - they can slide by an incident now and then, but eventually their bad behavior pattern becomes clear over time.
Just FYI: the "civil" in "civil war" means occurring among fellow citizens, not polite. It's ultimately from the Latin "civis", meaning citizen.
Hah, TIL. That term annoys me much less now. :)
This is awesome. I have never thought of it this way. Specifically the idea that how and what we talk about can be different. I have often thought that the idea of unacceptable words and political correctness are stupid and simply attempts at censorship. Thank you for refining my views.
I still think political correctness is stupid and unnecessary most of the time, but the idea that a voice could be silenced without it truly being censorship is new to me.
Agreed unreservedly; and applying the "free speech" label does nothing to separate the concept and the implementation, as you mention.
I'm not from the US either, but even I understand that free speech is really important. In the case of Reddit they know where they draw the line, but they aren't very open about it or about the enforcement of the line.
But I find your comment that "free speech" as a concept is weird even weirder. It's not originally from the US, and the reason a lot of countries have it enshrined in law is simple: people are not rational beings, and you can't trust them to be rational about anything. When limitations are put not just on one website but on a society as a whole, someone gets to decide what is acceptable to say and what isn't. In the case of Reddit or Tildes, we know who those people are and where the line is supposedly drawn, and if we don't like it we are free to leave; but as Reddit demonstrated, you can't trust people not to wield any kind of power in their own self-interest. Any power exercised by a government unchecked turns into a tool of oppression that the rulers use against the ruled. Since the Age of Enlightenment, it's been shown that it's far better to deal with the consequences of free speech than with the consequences of censorship exercised by an authority.
Freedom of speech in the First Amendment was always so ambiguous.
"Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."
Not to mention people take the Bill of Rights like it's the same as the Ten Commandments and try and apply it to their lives because they are "patriotic".
I hope this means they plan to actually quarantine more subreddits. The idea is okay, not as good as closing subreddits that violate the Content Policy, but an improvement on the current process of ignoring them. I hope to see more action from the mods, because the current approach is unsustainable imo.
I was hoping to see an announcement that they'd quarantined TD.
I avoid anything political and as such r/TD has never bothered me.