EFF launches "TOSsed Out", a new project to highlight ways that Terms of Service and other rules are unevenly and unthinkingly applied to people by online services
12 votes -
The subtle economics of private World of Warcraft servers: Anarchy, order and who gets the loot
5 votes -
Wikipedia’s refusal to profile a Black female scientist shows its diversity problem
13 votes -
Study finds Reddit’s ban of its most toxic subreddits worked
17 votes -
Is Tildes 18+?
I was thinking about posting this to ~news, but suddenly I've realised that I don't know if the word “fuck”, or any of the Seven Dirty Words, are allowed in titles. Is Tildes adults-only? Should people write something like “f***” in titles instead?
11 votes -
Labor demands Facebook remove 'fake news' posts about false Australian death tax plans
9 votes -
An internet for kids: Instead of regulating the internet to protect young people, give them a youth-net of their own
12 votes -
Mod annotations for removed comments
I just came across this field of 13 admin-removed comments and frankly it left me feeling rather unsettled. That's a lot of content to just nuke all at once. Contextually, the thread up to that point was genial and non-controversial, so it seems especially odd that there's just this black hole there. What struck me most was how opaque the moderation was. There is no indication of what kind of content was removed, why it was removed, who did the removal, or when it happened.
Then I scrolled down and at the very bottom I found what I guess is meant to address these concerns, a comment from Deimos:
Sigh, I saw this thread was active and thought it was going to have an actual on-topic discussion in it. Let's (mostly) start this over.
It's not always clear online so I want to say that I'm not rage-posting or bellyaching about censorship or any of the usual drama that tends to crop up on sites like Tildes from time to time. I trust Deimos' moderation and give this the benefit of the doubt. What I'm actually doing, I guess, is making a feature request about better annotation for removed comments.
Would it make sense to show a note (like Deimos' comment) in-thread at the position of the deleted content, instead of down at the bottom of the page, unattached to anything relevant? In my opinion some kind of "reason" message should always be provided with any moderation activity as a matter of course, even if it's just boilerplate text chosen from a dropdown menu.
Also, would a single bulk-annotation for all of the related removals make for better UX than 13 separate ones? I think that would be both easier to read, and easier for Deimos to generate on the backend.
I feel like we may have had this conversation previously, but I couldn't find it. Apologies if I'm beating a dead horse.
13 votes -
Reddit’s /r/Piracy is deleting almost ten years of history to avoid ban
33 votes -
Crazy idea to help stop the spreading of untruthful news
One of the main issues with news on social media is the spread of fake or false news. This happens on every platform that allows sharing news. If Tildes continues to gain popularity, this will likely happen on Tildes too. I had an idea: what if Tildes had a group of fact checkers who verify whether the news is truthful, and block posts that link to untrustworthy news sites? It could be like a three-strikes thing: if a news source has three articles posted that contain misinformation, it would be blocked (and the posts removed).
This is just an idea, feel free to highlight any issues with it.
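The three-strikes mechanism described above could be sketched roughly as follows. This is a hypothetical illustration only: every name here (`SourceTracker`, `report_misinformation`, the strike limit) is invented for this example, not anything from the Tildes codebase.

```python
# Hypothetical sketch of a three-strikes source blocker; all names are invented.
from collections import defaultdict

STRIKE_LIMIT = 3


class SourceTracker:
    """Tracks fact-checker strikes per news domain and blocks repeat offenders."""

    def __init__(self, strike_limit: int = STRIKE_LIMIT):
        self.strike_limit = strike_limit
        self.strikes = defaultdict(int)  # domain -> confirmed misinformation count
        self.blocked = set()

    def report_misinformation(self, domain: str) -> bool:
        """Record one confirmed strike; return True if the domain is now blocked."""
        if domain in self.blocked:
            return True
        self.strikes[domain] += 1
        if self.strikes[domain] >= self.strike_limit:
            self.blocked.add(domain)
        return domain in self.blocked

    def is_blocked(self, domain: str) -> bool:
        return domain in self.blocked
```

One open issue this sketch makes visible: someone has to decide what counts as a "confirmed" strike, which is the hard (human) part of the proposal.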
10 votes -
The internet's hidden rules: An empirical study of Reddit norm violations at micro, meso, and macro scales
19 votes -
How Facebook's hour of inaction enabled the Christchurch video to spread
8 votes -
Inside YouTube’s struggles to shut down video of the New Zealand shooting
11 votes -
Anti-Muslim hate speech is absolutely relentless on social media even as platforms crack down on other extremist groups
6 votes -
Why tech companies failed to keep the New Zealand shooter’s extremism from going viral
9 votes -
Tumblr suffers 150 million drop in traffic after porn ban
30 votes -
Twitter has ambitious plans to change the way we tweet by limiting snark and improving the "health" of interactions, but so far it's gone nowhere
15 votes -
The Comment Moderator Is The Most Important Job In The World Right Now
28 votes -
The life of a comment moderator for a right-wing website
27 votes -
Inside Facebook’s war on hate speech: An exclusive embed with Facebook’s shadow government
14 votes -
YouTube bans comments on videos of children
35 votes -
RIP Culture War Thread - /r/slatestarcodex's regular thread for debating polarizing issues showed the difficulties and risks of hosting those conversations
39 votes -
YouTube and demonetization: The hammer and nail of content moderation
8 votes -
The trauma floor - The secret lives of Facebook moderators in America
17 votes -
Facebook decided which users are interested in Nazis — and let advertisers target them directly
10 votes -
Reddit rules and the banning of certain anime communities
21 votes -
Maintaining trust and safety at Discord with over 200 million people
14 votes -
In screening for suicide risk, Facebook takes on tricky public health role
9 votes -
Inside Facebook’s secret rulebook for global political speech
10 votes -
How fascist sympathizers hijacked Reddit’s libertarian hangout
29 votes -
Sam Harris drops Patreon, rips 'political bias' of 'Trust and Safety' team's bans
17 votes -
Slack is banning users who have visited US-sanctioned countries (including Iran and Cuba) while using its app
20 votes -
A third of Wikipedia discussions are stuck in forever beefs
18 votes -
Curbing hate online: What companies should do now
8 votes -
Looking for opinions on how to moderate a community
Hello.
I moderate a reddit sub with about 450 thousand people and we have had trouble with transgender people facing abuse from idiots in two different threads. In one of them, a woman chimed in and it got ugly (4 bans in the first 12 comments); in the other, a trans woman took part and got shit for it (that thread also featured a few users getting banned). Now, each of them had a very different approach. The first got defensive and stopped participating, while the second took the time to respond to the stupid but not offensive ones, trying to educate them.
So even though this is something that bothers me a lot and makes me considerably angry, I realised that maybe I should take a more nuanced view on this, and that I should actually ask for more opinions on how to handle this, instead of simply applying my own standards and maybe making things worse and/or missing a chance to make things better. And since Tildes has always provided me with intelligent, thoughtful and interesting points of view and opinions, I thought this would be the best place for this question.
And so here I am, asking anyone who would care to give an opinion: what would a good moderator do? How harsh or lenient should we be with ignorant but not offensive comments? Should we get involved at all if the discussion is not offensive? What would make our sub a nicer place for everyone? Any other thoughts?
Thank you very much to all.
20 votes -
'Rank socialism': Facebook removes senator's official page over hate speech
8 votes -
Text of u/DivestTrump's post about T_D and Russia propaganda that was deleted
51 votes -
Twitter was going to ban Alex Jones — until its CEO stepped in and protected him
19 votes -
Should deleting comments be the standard behaviour, or can we consider a less censored approach by default?
I often stumble into threads with entire comment chains deleted. I assume most people here have faced the same situation as well, either here or on reddit.
I'd like to see a move to locking comments rather than deleting them by default. That would mean no further replies to the comment or any other comment in that chain, no one being able to delete or edit their comments, no one being able to add or remove votes to a comment, etc.
I understand for particularly egregious comments removal is completely necessary (especially when it goes hand-in-hand with banning users), but a lot of times comments are deleted as a means to prevent long argumentative back-and-forth chains that spam and derail topics, as well as antagonize users.
In a lot of cases I feel like deleting the comment only further serves to hide what is unacceptable behaviour (even if that behaviour should be obvious), rather than setting an example for the userbase.
30 votes -
Feature proposal: Real-time moderation transparency page (vote in comments)
Proposal:
Create a new page where all users can view all moderation actions. This would make transparency a core part of the platform, hopefully avoiding any misunderstandings about mod actions.
A new page, maybe called tildes.net/moderation, would be available to all registered users. I am not sure where the link should appear on the site; maybe on the user's profile sidebar?
This page contains a table of all moderation actions. The actions may include: deleted topics, deleted comments, tag modification, moved topics, edited topic titles, banned users, locked topics. (This raises the question of what the full set of possible mod actions is; they would need to be codified.)
Very roughly, the table columns might include: Date, User(being mod'ed), Mod Action(a list of possible mod actions), Mod Action Reason (either a text field, or a list of possible reasons for this action), Link (null if action is a deleted topic.)
I think that the user who did the moderating should not be publicly listed for now, to avoid drama?
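The row layout suggested above could be modeled something like this. This is a hypothetical sketch only: the field names mirror the proposed columns, and `ModLogEntry` and `MOD_ACTIONS` are invented names, not real Tildes code.

```python
# Hypothetical sketch of one row in the proposed tildes.net/moderation table.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# The proposed (codified) set of mod actions from the proposal above.
MOD_ACTIONS = {
    "deleted_topic", "deleted_comment", "tag_modification",
    "moved_topic", "edited_topic_title", "banned_user", "locked_topic",
}


@dataclass
class ModLogEntry:
    date: datetime
    user: str                    # the user being moderated
    action: str                  # one of MOD_ACTIONS
    reason: str                  # free text, or a canned reason from a list
    link: Optional[str] = None   # null when the action deleted the topic itself
    # Note: deliberately no "moderator" field, per the proposal's
    # suggestion not to publicly list who performed the action.

    def __post_init__(self):
        if self.action not in MOD_ACTIONS:
            raise ValueError(f"unknown mod action: {self.action}")
```

Validating the action against a fixed set is what forces the "codify the possible mod actions" question the proposal raises.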
Some of the related topics (please make a top-level comment with any others):
- Could we have a stickied list of all bans with reasons included?
- Daily Tildes discussion - our first ban
Please vote for the comment which best reflects your position on this proposal.
As a bonus question, please make a top-level comment if you have a general comment about my format of voting on comments. Would you prefer a straw poll on a third-party platform? Is there a cleaner way to do this?
Edit: added "banned user" to actions list, I probably missed others, let me know. Also added the obvious locked topics.
23 votes -
Let's discuss tags, again
There's been some discussion around tags since users were given tag-editing privileges, such as Tag Use and Article Tags.
I've noticed a few things about tags and rather than make a topic for each one I thought I'd make a few top level comments instead, hopefully with others doing the same for anything tag related they'd like to see discussed.
20 votes -
The impossible job: Inside Facebook’s struggle to moderate two billion people
14 votes -
Tildes code of conduct
Tildes code of conduct says
Do not post anyone's sensitive personal information (related to either their real world or online identity) with malicious intent.
Can you change that to just say don't post personal info? Even if it's not done with malicious intent it should still be removed to protect people's privacy.
Also, while the Tildes terms of service do say not to post spam, I think the code of conduct should say that as well.
Edit: I mean posting personal info without consent and not public information.
Telling someone how to contact a company would be fine but not posting someone's address.
12 votes -
Could we have a stickied list of all bans with reasons included?
In the interest of transparency (and a little bit of statistics) it would be really cool to have a master banlist, or at least a thread with links to all ban-worthy posts. This would help new users understand what isn't acceptable in the community and allow for community discussion on what could be considered an unjustified ban or a weird influx of bad behavior. This wouldn't be super viable when the site goes public, but would be a neat implementation in Tildes' alpha state.
14 votes -
Moderator tools: what do you have and what should be the immediate priorities?
I don't want to get too high in the clouds with moderating philosophy. Instead I want to talk about action steps that can be taken in the very near term to improve moderating. Especially so long as Deimos is the only one with most of the moderating tools at his disposal, I think it's crucial to make sure moderating is as painless as possible.
So far it looks like Deimos has these moderating tools available to him:
- User bans
- Comment removal
- Thread locking/removal
- Title/tag editing (and this ability is shared by many of us as well)
Am I missing anything?
The three next tools I would hope are coming next are:
- A reporting mechanism, where users can report comments and threads that they think should be removed.
- A feedback mechanism for reports, telling users that a report they gave was acted on.
- A note taking system for the moderator-type person, shareable with all other moderator-type persons at that level, with an expiration date probably around 30 days.
Now I'll talk about why. First, the reporting mechanism. While it's still possible to keep up with everything that gets posted, I don't necessarily think it's the best use of Deimos' time to read literally everything, especially as the site expands its userbase and, presumably, its activity level and depth. The reporting system at first should probably just be a button, maybe eventually with a pop-up field allowing the user to give a brief description of why they're reporting, plus a queue that gets populated with the comments and threads that get reported.
Coinciding with a report queue/option should probably be an easy, rudimentary system for providing feedback to those whose reports led to moderating action. At first, an automated message saying something like "thank you for reporting recently. Action has been taken on one of your recent reports" without any relevant links would do fine, and we can leave the particulars of how much detail to add for later discussions.
The last thing I think would help considerably in the immediate term is a time-limited user tracking tool for the moderator-type person. As things scale, it isn't always going to be feasible to use mental bandwidth remembering each username and the relevant history associated with their behavior. A note-taking tool with an auto-timed expiration date on notes would be a good way to address what can easily become a hugely mentally taxing role at almost any scale. This tool should let Deimos take a discrete note for himself (and other moderators at that permission level and higher) attached to a user, regarding any questionable threads or comments that were yellow/red flags, or any other moderator action taken against that user within the last X days/months (the particulars don't matter to me as much as that there is an expiration date on these notes). This should let the moderator-type person review the broader history of the users they're looking at before making a decision, without having to go searching for every relevant comment from the past 30 days. Fewer problematic users at scale should fall through the cracks, and more users who might just be having a bad day can be let off with comment removals and/or warnings.
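The expiring-notes idea above could be sketched like this. It is a hypothetical illustration: `ModNotes`, the 30-day TTL default, and the method names are all invented for the example.

```python
# Hypothetical sketch of auto-expiring moderator notes; all names are invented.
from datetime import datetime, timedelta

NOTE_TTL = timedelta(days=30)  # the "X days" the post leaves unspecified


class ModNotes:
    """Per-user notes visible to moderators, auto-expiring after a TTL."""

    def __init__(self, ttl: timedelta = NOTE_TTL):
        self.ttl = ttl
        self._notes = {}  # username -> list of (created_at, text)

    def add(self, username: str, text: str, now: datetime) -> None:
        """Attach a note to a user, timestamped at `now`."""
        self._notes.setdefault(username, []).append((now, text))

    def for_user(self, username: str, now: datetime) -> list:
        """Return only notes younger than the TTL, dropping expired ones."""
        fresh = [(created, text)
                 for (created, text) in self._notes.get(username, [])
                 if now - created < self.ttl]
        self._notes[username] = fresh
        return [text for (_, text) in fresh]
```

Passing `now` explicitly (rather than calling the clock inside) keeps the expiry behavior easy to test; a real implementation would more likely expire rows in the database.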
Are these priorities fair? Are there design elements you would want to see in the immediate term that would help reduce the burden of moderating? Are there problems with these tools I'm suggesting that you would want to see addressed?
19 votes -
Moderators of Reddit, tell us about your experiences in fostering quality discussion and content (or failures to do so)
Since the moderator community here is quite large, I figure we would have quite a lot of interesting perspectives here on Tildes. Feel free to chip in even if you're not a moderator, or god forbid, moderate such subs as T_D. Having a range of perspectives is, as always, the most valuable aspect of any discussion.
Here are some baseline questions to get you started:
- Did your subreddit take strict measures to maintain quality à la r/AskHistorians, or was it a karmic free-for-all like r/aww?
- Do you think the model was an appropriate fit for your sub? Was it successful?
- What were the challenges faced in trying to maintain a certain quality standard (or not maintaining one at all)?
- Will any of the lessons learnt on Reddit be applicable here on Tildes?
29 votes -
Facebook blunders its way through the world and deals with the consequences later. In Myanmar, that strategy has had deadly consequences.
12 votes -
Twitter puts Alex Jones's account in "read-only mode" for a week, so he can't tweet, retweet, or like content
11 votes -
Andreas Schou - On Moderation
6 votes