28 votes -
The Trauma Floor: The secret lives of Facebook moderators in America
7 votes -
Facebook failed to delete 93% of posts containing speech violating its own rules in India
8 votes -
Who has your back? Censorship edition 2019 - Report by the EFF that assesses major tech companies' content moderation policies
8 votes -
GitHub shocks top developer: Access to five years' work inexplicably blocked
24 votes -
YouTube just banned supremacist content, and thousands of channels are about to be removed
14 votes -
Publishers that closed their comments sections made a colossal mistake
9 votes -
Community based tag-curation
This was inspired by this post where the user tagged the post as "sugges" rather than "suggestions." Since tags decline in utility with minor spelling mistakes like this, I wonder if there could...
This was inspired by this post where the user tagged the post as "sugges" rather than "suggestions."
Since tags decline in utility with minor spelling mistakes like this, I wonder if there could be a way for nitpicky grammarians, like myself, to just go through and edit broken tags, add relevant tags, prune unnecessary ones, etc.
I guess it would be sort of a moderation responsibility, but I expect we would prefer they focus on content moderation. Tag editing is low-key enough that people with this responsibility probably wouldn't need to be vetted as thoroughly or held to the same kind of community standards of behavior that a mod would be. We'd just have to trust them to not be pranksters or abusive with it (e.g. making tags like "this poster is a doodyhead").
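To illustrate the kind of check I'm imagining, here's a rough sketch (the tag list, function name, and similarity cutoff are all made up for the example, not anything Tildes actually uses):

```python
# Rough sketch: flag tags that look like typos of existing, established tags.
# "known_tags" and the cutoff are invented for illustration only.
import difflib

known_tags = {"suggestions", "tech", "music", "ask", "humanities"}

def suggest_tag_fixes(tag, candidates=known_tags):
    """Return likely intended tags for a possibly misspelled tag."""
    if tag in candidates:
        return []  # already a known tag, nothing to fix
    # difflib's default cutoff (0.6) is generous enough to catch truncations
    return difflib.get_close_matches(tag, candidates, n=3)

print(suggest_tag_fixes("sugges"))  # -> ['suggestions']
```

Something like this wouldn't replace human curation, but it could at least surface "probably a typo" tags for whoever ends up with the tag-editing responsibility.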
8 votes -
Microsoft Windows Terminal YouTube video removed for copyright claim
12 votes -
Confessions of a Reddit 'Karma Whore': My years-long journey to the top of Reddit's karma leaderboards has only made me feel more alone
21 votes -
Facebook acknowledges Pelosi video is faked but declines to delete it
22 votes -
Facebook's third Community Standards Enforcement Report, covering Q4 2018 and Q1 2019
2 votes -
EFF launches "TOSsed Out", a new project to highlight ways that Terms of Service and other rules are unevenly and unthinkingly applied to people by online services
12 votes -
The subtle economics of private World of Warcraft servers: Anarchy, order and who gets the loot
5 votes -
Study finds Reddit’s ban of its most toxic subreddits worked
17 votes -
Wikipedia’s refusal to profile a Black female scientist shows its diversity problem
13 votes -
Is Tildes 18+?
I was thinking about posting this to ~news, but I suddenly realised that I don't know if the word “fuck”, or any of the Seven Dirty Words, are allowed in titles. Is Tildes adults-only? Should people write something like “f***” in titles instead?
11 votes -
Labor demands Facebook remove 'fake news' posts about false Australian death tax plans
9 votes -
An internet for kids: Instead of regulating the internet to protect young people, give them a youth-net of their own
12 votes -
Mod annotations for removed comments
I just came across this field of 13 admin-removed comments and frankly it left me feeling rather unsettled. That's a lot of content to just nuke all at once. Contextually, the thread up to that point was genial and non-controversial, so it seems especially odd that there's just this black hole there. What struck me most was how opaque the moderation was. There is no indication of what kind of content was removed, why it was removed, or specifically who did the removal or when it happened.
Then I scrolled down and at the very bottom I found what I guess is meant to address these concerns, a comment from Deimos:
Sigh, I saw this thread was active and thought it was going to have an actual on-topic discussion in it. Let's (mostly) start this over.
It's not always clear online so I want to say that I'm not rage-posting or bellyaching about censorship or any of the usual drama that tends to crop up on sites like Tildes from time to time. I trust Deimos' moderation and give this the benefit of the doubt. What I'm actually doing, I guess, is making a feature request about better annotation for removed comments.
Would it make sense to show a note (like Deimos' comment) in-thread at the position of the deleted content? Instead of down at the bottom of the page or unattached to anything relevant? In my opinion some kind of "reason" message should always be provided with any moderation activity as a matter of course. Even if it's just boilerplate text chosen from a dropdown menu.
Also, would a single bulk-annotation for all of the related removals make for better UX than 13 separate ones? I think that would be both easier to read, and easier for Deimos to generate on the backend.
I feel like we may have had this conversation previously, but I couldn't find it. Apologies if I'm beating a dead horse.
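To make the idea a bit more concrete, here is a rough sketch of what a single in-thread annotation might carry; every name and the canned reasons below are hypothetical, not taken from the actual Tildes code:

```python
# Hypothetical data model for an in-place removal note covering a bulk removal.
from dataclasses import dataclass
from datetime import datetime

# Canned reasons a moderator could pick from a dropdown instead of typing.
REMOVAL_REASONS = {
    "off_topic": "off-topic for this thread",
    "noise": "low-effort noise",
    "malice": "hostile or abusive",
}

@dataclass
class RemovalNote:
    reason_key: str
    removed_by: str
    removed_at: datetime
    comment_ids: list          # one note can cover a bulk removal

    def render(self):
        count = len(self.comment_ids)
        plural = "s" if count != 1 else ""
        return (f"{count} comment{plural} removed by {self.removed_by} "
                f"on {self.removed_at:%Y-%m-%d}: {REMOVAL_REASONS[self.reason_key]}")

note = RemovalNote("off_topic", "Deimos", datetime(2019, 6, 1), list(range(13)))
print(note.render())
# -> "13 comments removed by Deimos on 2019-06-01: off-topic for this thread"
```

The point of the single note covering multiple comment IDs is exactly the bulk-annotation idea above: one line of context rendered where the black hole currently is.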
13 votes -
Reddit’s /r/Piracy is deleting almost ten years of history to avoid ban
33 votes -
Crazy idea to help stop the spreading of untruthful news
One of the main issues with news on social media is the spread of fake or false news. This happens on every platform that allows sharing news. If Tildes continues to gain popularity, this will likely happen on Tildes. I had an idea: what if Tildes had a group of fact-checkers who check whether the news is truthful, and block posts that link to untrustworthy news sites? It could be a three-strikes thing: if a news source has three articles posted that contain misinformation, the source would be blocked (and the posts removed).
This is just an idea, feel free to highlight any issues with it.
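As a very rough sketch of the mechanics I have in mind (the strike limit and the idea of keying on the link's domain are just my assumptions, nothing Tildes actually has):

```python
# Toy sketch of the three-strikes idea: count fact-check failures per domain
# and block the domain once it hits the limit.
from collections import Counter
from urllib.parse import urlparse

STRIKE_LIMIT = 3
strikes = Counter()   # domain -> number of fact-check failures
blocked = set()       # domains that have hit the limit

def record_strike(url):
    """Called when fact-checkers mark a posted article as misinformation."""
    domain = urlparse(url).netloc
    strikes[domain] += 1
    if strikes[domain] >= STRIKE_LIMIT:
        blocked.add(domain)

def is_blocked(url):
    return urlparse(url).netloc in blocked

for _ in range(3):
    record_strike("https://example-tabloid.test/fake-story")
print(is_blocked("https://example-tabloid.test/another-story"))  # True
```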
10 votes -
The internet's hidden rules: An empirical study of Reddit norm violations at micro, meso, and macro scales
19 votes -
How Facebook's hour of inaction enabled the Christchurch video to spread
8 votes -
Inside YouTube’s struggles to shut down video of the New Zealand shooting
11 votes -
Anti-Muslim hate speech is absolutely relentless on social media even as platforms crack down on other extremist groups
6 votes -
Why tech companies failed to keep the New Zealand shooter’s extremism from going viral
9 votes -
Tumblr suffers 150 million drop in traffic after porn ban
30 votes -
Twitter has ambitious plans to change the way we tweet by limiting snark and improving the "health" of interactions, but so far it's gone nowhere
15 votes -
The Comment Moderator Is The Most Important Job In The World Right Now
28 votes -
The life of a comment moderator for a right-wing website
27 votes -
Inside Facebook’s war on hate speech: An exclusive embed with Facebook’s shadow government
14 votes -
YouTube bans comments on videos of children
35 votes -
RIP Culture War Thread - /r/slatestarcodex's regular thread for debating polarizing issues showed the difficulties and risks of hosting those conversations
39 votes -
YouTube and demonetization: The hammer and nail of content moderation
8 votes -
The trauma floor - The secret lives of Facebook moderators in America
17 votes -
Facebook decided which users are interested in Nazis — and let advertisers target them directly
10 votes -
Reddit rules and the banning of certain anime communities
21 votes -
Maintaining trust and safety at Discord with over 200 million people
14 votes -
In screening for suicide risk, Facebook takes on tricky public health role
9 votes -
Inside Facebook’s secret rulebook for global political speech
10 votes -
How fascist sympathizers hijacked Reddit’s libertarian hangout
29 votes -
Sam Harris drops Patreon, rips 'political bias' of 'Trust and Safety' team's bans
17 votes -
Slack is banning users who have visited US-sanctioned countries (including Iran and Cuba) while using its app
20 votes -
A third of Wikipedia discussions are stuck in forever beefs
18 votes -
Curbing hate online: What companies should do now
8 votes -
Looking for opinions on how to moderate a community
Hello.
I moderate a reddit sub with about 450 thousand people and we have had trouble with transgender people facing abuse from idiots in two different threads. In one of them, a woman chimed in and it got ugly (4 bans in the first 12 comments); in the other, a trans woman took part and got shit for it (that one also featured a few users being banned). Now, the two women took very different approaches: the first got defensive and stopped participating, while the second took the time to respond to the stupid but not offensive comments, trying to educate them.
So even though this is something that bothers me a lot and makes me considerably angry, I realised that maybe I should take a more nuanced view on this, and actually ask for more opinions on how to handle it, instead of simply applying my own standards and maybe making things worse and/or missing a chance to make things better. And since Tildes has always provided me with intelligent, thoughtful and interesting points of view and opinions, I thought this would be the best place for this question.
And so here I am, asking anyone who would care to give an opinion: what would a good moderator do? How harsh or lenient should we be with ignorant but not offensive comments? Should we get involved at all if the discussion is not offensive? What would make our sub a nicer place for everyone? Any other thoughts?
Thank you very much to all.
20 votes -
'Rank socialism': Facebook removes senator's official page over hate speech
8 votes -
Text of u/DivestTrump's post about T_D and Russia propaganda that was deleted
51 votes -
Twitter was going to ban Alex Jones — until its CEO stepped in and protected him
19 votes