Tumblr's problem is that they're not good at policing their platform. The porn ban came as a result of them being too lazy to properly moderate their platform and purge CSAM, not due to pressure from Apple or Google. Think about it: if Apple or Google were puritanical, they would have deplatformed Reddit long ago.
I have definitely seen pornographic content on Tumblr since the ban.
This is the usual problem: perfect moderation is economically infeasible, which is why new 18+ content continued to appear there despite the ban.
It is nearly impossible to tell an 18+ picture of an 18y0d-old (legal) from an 18+ picture of a 17y364d-old (illegal, and often punishable by real prison sentences and up to lifetime placement on sex offender registries in certain jurisdictions).
A lot of the softcore non-nude stuff is stolen from various TikTok and Instagram profiles. Surely a TinEye or Google Image search would locate the original source? Or some may depict the names of any teams.
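The reverse-image-search idea can be sketched in a few lines. This is an illustrative toy, not TinEye's or Google's actual algorithm: a minimal "average hash" perceptual fingerprint over a downscaled grayscale image, where near-duplicate re-uploads (recompressed or lightly edited copies) hash to nearly identical bits.

```python
# Toy perceptual hashing sketch (assumption: real engines use far more
# robust variants, but the near-duplicate idea is the same).

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255), e.g. an 8x8 downscale."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # One bit per pixel: 1 if brighter than the image's mean, else 0.
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Count of differing bits; a small distance suggests a re-upload."""
    return sum(a != b for a, b in zip(h1, h2))

original  = [[10, 200], [15, 190]]
reupload  = [[12, 198], [15, 192]]   # slightly recompressed copy
unrelated = [[200, 10], [190, 20]]

print(hamming(average_hash(original), average_hash(reupload)))   # 0 bits differ
print(hamming(average_hash(original), average_hash(unrelated)))  # 4 bits differ
```

A platform could hash every upload this way and flag anything within a small Hamming distance of a known original for review, which is cheap compared to human-moderating everything.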
Nudity isn't the only issue here. Sexualised non-nude content of minors (a.k.a. "jailbait") is also banned per Tumblr's terms of service, although it sits in a legal grey area in many places and is arguably as morally repugnant as actual CSAM. Some nations, such as the UK and Switzerland, make it illegal to view, possess, and distribute such images and punish it similarly to how they'd punish CSAM.
I'd argue the inverse: Tumblr is a very good argument for why automation is a terrible thing. Reddit, at the very least, has community moderators who can snuff out such content quickly despite it being a far larger platform than Tumblr, though that model presents issues of its own.
Machine learning is in its infancy, and not even big tech giants like Amazon, Apple, Microsoft, Meta, or Twitter can get it right.
When you see game companies like Blizzard, Valve and Riot cut staff and employ machine learning algorithms to police their games, you end up with toxic cesspools where the report system is weaponised to unfairly reprimand others. This has happened with World of Warcraft, League of Legends and Dota 2, and is the main reason why Heroes of the Storm failed to retain players.
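That weaponisation is easy to see with a hypothetical sketch (no real game's actual system): a naive automated verdict that acts on raw report counts punishes whoever a coordinated brigade targets, while even a crude mitigation that weights each report by the reporter's past accuracy discounts the brigade.

```python
# Hypothetical illustration: naive report-count automation vs. a
# reporter-reputation weighting. All names and numbers are made up.

REPORT_THRESHOLD = 3.0

def naive_verdict(reporters):
    """Auto-punish once the raw report count crosses the threshold."""
    return len(reporters) >= REPORT_THRESHOLD

def weighted_verdict(reporters, accuracy):
    """Weight each report by how often that reporter was right before
    (unknown reporters get a neutral 0.5)."""
    score = sum(accuracy.get(r, 0.5) for r in reporters)
    return score >= REPORT_THRESHOLD

# Five accounts mass-reporting an innocent player:
brigade = ["a", "b", "c", "d", "e"]
# Their past reports were almost never upheld:
history = {r: 0.1 for r in brigade}

print(naive_verdict(brigade))              # True:  innocent player punished
print(weighted_verdict(brigade, history))  # False: brigade discounted
```

Reputation weighting isn't free either: it still needs some human review to establish which past reports were actually valid, which is exactly the staffing cost these companies are trying to cut.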
So why do these companies automate their processes? Because staffing is a major expense, and cutting staff is a very good way to increase profits and make your fat-cat shareholders happy.
This is likely what Elon Musk is gonna do with Twitter, btw.
It's weird, because this change opens them back up to those issues. It's just as illegal to have nudes of minors on your platform as it is to have porn proper, so I'm not sure what the deal is.
if you want to know more about that, our CEO Matt recently explained why it’s not feasible for us to safely and successfully support porn communities at this time
(The emphasis is mine.) Hopefully, they will eventually understand that nudity should be allowed unless the platform is explicitly aimed at an audience largely made up of minors (e.g. Duolingo), and that rules can restrict nudity to softcore, just like DeviantArt's.
They're not coming back.