Two devs automated the process of generating and publishing "garbage" mobile slot machine games on Google Play, and made over $50,000
This is hilarious (and kind of depressing). If you ever wonder why there are so many shitty apps on the app stores, this is exactly why—you can literally generate garbage like "3D Inexperienced Great Horned Owl Slots" (actual example from the article), throw ads on it, and still end up making a lot of money if you scattershot enough variants onto the store.
It reminds me of the "Something is Wrong on the Internet" article talking about how many children's videos on YouTube seem to be getting algorithmically generated but still attracting a ton of views due to using popular keywords.
That's only part of the problem, though. In fact, I'd call that merely a symptom; the real problem in my eyes (besides the shady devs who do this) is that people still download these garbage apps, or perhaps I should say the lack of awareness of these kinds of apps and how bad they are.
You could publish one million of these to the play store, but if nobody downloaded them then you'd just be wasting your time. The fact that publishing hundreds of clones to the play store is profitable merely shows us that a significant percentage of people do not care enough about what they're putting on their phones, and that's very, very bad.
And even assuming the majority of the people installing these are kids who don't know any better, then that's still bad because it shows a lack of oversight on the parents' part when it comes to the digital space.
And of course shady devs fully take advantage of this sort of space, and it works.
Of course I also wholeheartedly wish there were effective policies in place to stop this predatory behavior, but I really do feel like we have a problem on both ends here, and I think more people need to be informed about this type of stuff if we want to go anywhere with it.
It might not even be real people downloading these apps. It'd be a hell of a scam, but those devs could be paying bot farms to click the ads. In that situation, the losers are the advertisers.
I don't know if you could use bots for this; Google is actually pretty good at detecting interactions from bots on their platforms and filtering them out. At most I feel like you could use a click farm with real humans (which sadly is a real thing), but I can't imagine the return would outweigh the investment.
For sure Google is good at detecting it, but it still happens. NY Mag did a story on it a couple months ago in response to a DOJ press release about $36 million in ad fraud. One of the networks defrauded was Google: http://nymag.com/intelligencer/2018/12/how-much-of-the-internet-is-fake.html
This doesn't seem to be a completely isolated case. The Google Play store has many studios that seem to make one game template and then just spit it out again and again with new art. It makes the store incredibly hard to navigate, and because they target common keywords, search almost feels pointless. I get what Google is doing here, but I don't think it's the right direction to keep going in if they want to improve the quality of the content on their storefront—which I don't believe is their goal.
This issue strikes me as an entire category: as Deimos mentioned, algorithmically generated kids' videos, AI-generated targeted-ad T-shirts, knock-off products on Amazon Marketplace, investment sites triggered to post pre-written articles in response to price-data changes, etc.
It goes way too far to attribute it to just a few bad actors when there's such a clear economic incentive to do this shit. What would y'all call it?
I'm reminded of an Intransigence article that identifies the "wall of noise" we experience online as a method for creating artificial scarcity.
I'd call it the cobra effect: it's what happens when good old-fashioned human ingenuity (and/or greed, for the more cynical among us) meets a market where the incentives aren't quite perfectly aligned with the intended outcomes. The latest iteration just happens to have robots, because we live in the future now.