It's so much worse than this. These people actually go around inventing challenges and dares and whatnot, using fake accounts to convince kids to do whatever it is that satisfies their cravings. The kids fall for it because they are kids, and because their zero-view channels are now getting comments and views from these people, so they think they're gaining popularity. This has been going on for years and years, totally unchecked.
Nestle and Disney have also paused their advertising on YouTube now in response.
This article makes it seem like YT is all over this problem with solutions and actions. Bullshit PR. This has been going on for years, and if you've frequented /r/elsagate even a few times, you'd know.
And I'm sorry if this is too off-topic, but I'm just now learning that my longtime physician's first name is... August?
It's so sad to see this kind of thing happening across YouTube, and the wormhole the YouTube algorithm sends you down. Reminds me of the Polish teen triathlete who posted this gif on Instagram a few days ago; it got stolen from her, posted to Reddit and 9gag, and shared over 2 million times across Poland. Crotch shots from the gif and pictures from Google searches showed up all over the place, especially on 9gag (but not Reddit, thank God, although I do see your comment about /r/ElsaGate, @Fin), as I learned when I sent the above Reddit link to her to let her know. She was horrified by what people were posting, all from a simple fun gif that got a few reposts. I hope that nothing like this ever plagues Tildes (@Deimos), and I sure hope all the platforms across the internet get rid of this terrible content.
Eesh. And I thought YouTube couldn't get worse when they completely fucked over animators/content creators. Now they're harboring kiddy-diddlers and profiting off of it. It's at least a tad humorous that YouTube's complete autonomy is bringing this to light by not taking videos down for once. And they likely won't be taking these videos down unless several more companies pull their ads from the platform. This is the result of YT's sheer negligence. As long as people keep watching, money will keep rolling in, and they know it. I really hope something more involved will be done about it, but I know better than to expect more than failure from them. What an awful situation.
It's so bizarre that, on one hand, seemingly breathing the wrong way can result in a video being demonetized, yet Elsagate and these sorts of videos flourish. What a weird time to live in.
It's almost malicious, lol. Filters listen for "bad" words and flag with abandon. It's completely autonomous, and as such, easy to work around if you don't mind having zero integrity.
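To make that concrete, here's a minimal, purely hypothetical sketch of what a keyword-based flagger amounts to. None of this reflects YouTube's actual system; the blocklist and function name are invented for illustration. It shows why blind word-matching both over-flags harmless videos and misses anything that simply avoids the exact words.

```python
# Purely hypothetical sketch of a naive keyword-based flagger.
# Not YouTube's actual system: the blocklist and function are invented
# for illustration only.

BLOCKLIST = {"gun", "drugs", "kill"}  # illustrative "bad" words only

def flag_for_demonetization(title: str, description: str) -> bool:
    """Return True if any blocklisted word appears anywhere in the metadata."""
    text = f"{title} {description}".lower()
    return any(word in text for word in BLOCKLIST)

if __name__ == "__main__":
    # Over-flags: "skill" contains "kill", so an innocent video gets hit.
    print(flag_for_demonetization("Top 10 skill moves", "Football tutorial"))     # True
    # Under-flags: exploitative content with bland metadata sails right through.
    print(flag_for_demonetization("Fun gymnastics challenge", "Kids having fun"))  # False
```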
I found out about this over this weekend (Sunday or Monday) and I'm a bit surprised. For all that Google claims to continue to "improve" their platforms, this is disgusting. And I'm sure they know about it.
At this point, I only use Google for my core services (email, drive, photos). Everything else I use another service for if possible. I just wish that Google weren't required in order to use parts of the internet; there is a definite monopoly in some cases and not enough competition. It's a similar situation with Facebook, though that's slowly changing (many games and services relied on Facebook for a while, and it was the only method of saving progress or signing in).
The best I think we can do is continue to educate people about what companies like this are doing and demystify the background workings. People lie; researching topics is important for everyday discourse and for forming opinions on important matters.
Kudos to the guy who made the video. That's work I'm glad I didn't have to do.
There are tons of open source and privacy-oriented alternatives to Google's services. I implore you to check them out. Privacytools.io is a good resource.
Duckduckgo.com is actually very good.
It's improved a lot for Norwegian searches in the last few years. They've also added a toggle to limit the search results to only pages in your country, which helps a lot sometimes.
I've had to use Google's support channels many-a-time, and the only times I've had issues is when it came to getting support as a developer for Google Play. Developer support sucks, but for their paid service's support, I can only say good things.
Please, for the love of god, hire fucking humans to address these problems. I know that even with tons of real people on board, this is a hard problem to solve given that it seems to occur on low view-count videos that attract no outside attention otherwise, but you will not be able to automate the solution here. Even in these most clear cut cases where no one disagrees with the need to have this shit nuked from orbit, tech companies can't take the slightest bit of responsibility for what they let happen on their platforms. Maybe this will make enough of a wave to force Google to actually do something, but I doubt it. Most likely just another performative move without changing anything fundamental about how they moderate and manage their website.
The only other thing would be Heroes 2.0, but I don't think anyone will be happy with that.
See this article on the topic of humans filtering the web for the rest of us. Granted, the article is five years old, but it shows there's already a human element to filtering content on the major platforms. Whether by humans or automation, though, it is impossible (and likely unsustainable) to catch everything.
If a platform cannot exist without housing and enabling pedophilia, it should not exist. I recognize that it isn't a good financial decision to do what needs to be done to address the problem in any way, but that's the problem, isn't it? And a few slipping by is much different than a long-term widespread problem. This isn't a new thing that needs to be stamped out, it's just now that advertisers are being publicly dragged into it.
I'm not sure what you really wanted me to get from the article. It's tough work and the people who do it deserve to be well-paid. I seriously sympathize with those horror stories. But the horror of the removal job just reinforces that it needs to be done, and it needs to be done more.
I mean, I get what you're saying and I agree with you in principle. But the postal service enables pedophilia. The phone company does. Heck, the local power utility does. I wish there was a way to surgically remove just the evil applications of every technology, but realistically there will always be bad actors finding ways to use benign tools to do bad things. We can't throw the baby out with the bathwater and nuke every system that can't prevent it from happening.
If the limit on addressing those problems comes from profitability rather than some moral principle, then I stand by what I said. You can't quite solve the problem with the postal service, because the methods to do so would require invasions of privacy that can't really be justified.
However, the thing stopping YouTube and every other platform is that doing what needs to be done would be too expensive, and I'd rather see the platform somehow forced into doing those things with the possibility of going under as a result than seeing it not try as hard as it possibly can.
There's a huge difference between being unable to properly address a problem without compromising the core of what the technology is and does, and simply not doing more because it wouldn't be cost-effective. I understand the urge to play the mature pessimist and act like the most basic moderation is a far-off dream, but it really doesn't have to be. It's the bare minimum.
Pardon my ignorance; what is Heroes 2.0?
YouTube Heroes
Or better yet, Folding Ideas' video on the subject
YouTube Heroes was (is?) a "points"-based user contribution/moderation system that YouTube attempted (is still attempting?). They released a video announcing it a few years ago, but it completely blew up in their faces. Immediately afterwards they subtly and discreetly edited the original video (I would assume to try to clarify things and stem the outrage), but people noticed the changes had been made without being acknowledged, and that backfired even worse... so then they deleted the video entirely (and maybe cancelled the program?).
AFAIK nobody really knows if the Heroes program is still being worked on or not, but there are still some other related ones that Google has in place but kept rather silent about (e.g. trusted flagger program, contributors program, etc.)... probably because they don't want a repeat of the shitstorm that the Heroes announcement caused.
I would assume that by "Heroes 2.0", @Whom is suggesting that if they did cancel the program due to the backlash, they should try again.
p.s. You can watch the original YouTube Heroes announcement video here:
https://www.youtube.com/watch?v=TlHY7oMYrHM
Original Video: https://www.youtube.com/watch?v=O13G5A5w5P0
Reddit post with further commentary from OP: https://old.reddit.com/r/videos/comments/artkmz/
To anyone who didn't realize it before, let this be the clear warning that AI/ML needs to be purged from social media sites.
As someone who gets paid to train AI/ML, I can tell you the current state of things is that AI is fucking awful at doing what we want it to do. It also gets fed tons of garbage data, as many AI trainers (including Google) cheap out and set wages so low that only workers in India and other developing countries will accept the jobs. I'm not trying to be racist here, but the data those operations output is not up to par, though I can't blame the workers given what they're paid.
I can't think of a single big company that I haven't done this work for, and having seen it from my side really gives me doubts about the quality of their AI.
Pinterest is a notable example of a platform that has figured out how to get decent data, but even their AI is pretty awful.
Is it true that the more narrow, specific, and well-defined the task is, the better shot you have at creating something useful?
From a mod tools perspective - do you think there could ever be value in using these tools, with a herd of them each aimed at one very narrow task, as a collective mechanism that just brought things to the attention of humans, so that the humans can use them as a sort of radar to laze through massive amounts of data?
Could these humans, through simple 'good find' and 'bad find' mechanics, potentially be able to train these tools in some useful way to make them better as they use them?
Yes. I find the best thing to do is to ask a simple question such as "is this a cat?" and work through that. Asking people to label what an object is goes much less well than asking a simple yes-or-no question about whether it is X, and you should keep that yes-or-no question the same for long periods of time, or people will get tripped up.
Basically, you have to go through the data more slowly to get good data, and most people don't want to do that, because working at that pace costs more money.
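To make the "herd of narrow detectors plus human radar" idea from the questions above concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the detectors, the threshold, and the "good find"/"bad find" feedback store are invented for illustration and don't reflect any real moderation system. Each detector answers one narrow, fixed question with a score; anything over the threshold goes into a human review queue; and the reviewer's yes/no verdict is appended to a labeled dataset that could later be used to retrain the detectors, which is exactly the kind of slow, consistent yes/no data described above.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

# Hypothetical narrow detectors: each answers a single, fixed question with a
# score in [0, 1]. Real systems would use trained models; these keyword
# heuristics are stand-ins so the example runs on its own.
def scores_timestamp_spam(comment: str) -> float:
    return 1.0 if ":" in comment and any(ch.isdigit() for ch in comment) else 0.0

def scores_grooming_phrases(comment: str) -> float:
    phrases = ("how old are you", "send me")  # illustrative only
    return 1.0 if any(p in comment.lower() for p in phrases) else 0.0

@dataclass
class TriagePipeline:
    detectors: List[Tuple[str, Callable[[str], float]]]
    threshold: float = 0.5
    review_queue: List[Tuple[str, str, float]] = field(default_factory=list)
    labeled_data: List[Tuple[str, bool]] = field(default_factory=list)

    def scan(self, comment: str) -> None:
        """Run every narrow detector; queue the comment if any score is high."""
        for name, detector in self.detectors:
            score = detector(comment)
            if score >= self.threshold:
                self.review_queue.append((comment, name, score))
                break  # one hit is enough to ask a human

    def human_verdict(self, comment: str, is_bad: bool) -> None:
        """Record a reviewer's yes/no ('good find'/'bad find') answer.

        The accumulated labels are the simple, consistent yes/no data the
        comment above argues trainers actually need."""
        self.labeled_data.append((comment, is_bad))

if __name__ == "__main__":
    pipeline = TriagePipeline(detectors=[
        ("timestamp_spam", scores_timestamp_spam),
        ("grooming_phrases", scores_grooming_phrases),
    ])
    for c in ["nice video!", "2:37 wow", "how old are you? send me a pic"]:
        pipeline.scan(c)
    # A human works through the queue, answering one fixed yes/no question each time.
    for comment, detector_name, score in pipeline.review_queue:
        pipeline.human_verdict(comment, is_bad=True)
    print(pipeline.review_queue)
    print(pipeline.labeled_data)
```

The key design choice the sketch tries to capture is that the models never act on their own: they only rank and route, while the binding decision and the training signal both come from the human answering a fixed yes/no question.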
It has like three good use cases. And to be quite frank, I don't think that even the really helpful ones like Google Maps are enough of a benefit to justify all of the damage that AI has done and is capable of doing.
So it appears that Google's solution is to...demonetize videos that get comments from pedophiles? What the hell?
They don't even want to pretend like this isn't just about the advertisers to them. Even completely nuking the videos, while not the right thing to do, could at least be sold as a necessity to solve a serious problem. But all we get is that if pedophiles decide they're gonna target your kid, you can't make money anymore because it'll make YouTube look bad to advertisers.
That's upsetting, but I can't see a solution to this besides closing the website.