The platform will know your idea is pregnant far before you will.

I think this alludes to the Target pregnancy scandal, which was debunked. People have been saying "AI can predict human behavior" since that article (from 2012). Unprofitable disasters like the Metaverse, unappealing advertisements, and a lack of strong evidence make me skeptical.
Before LLMs, a company couldn’t just absorb your idea and ship it. Ideas needed programmers, and programmers worked in meat-space-and-time, i.e. they were a limited resource, expensive and slow.
Before LLMs, big companies absorbed and shipped ideas from smaller companies after launch: this famously happened in 2002, when Apple's Sherlock 3 for Mac OS X replicated Karelia's Watson. Another example is Trader Joe's ripping off smaller brands by feigning interest in stocking their products to get samples. Nowadays, I most frequently see big companies steal smaller companies' ideas via acqui-hire (e.g. OpenClaw and Moltbook).
I think it's unlikely a big company would steal an idea before it demonstrated success; such an idea may not succeed at all, and big companies tend to avoid risk, plus there are many other promising ideas. Meanwhile, acqui-hiring or outright buying a rising startup is relatively cheap for a billion-dollar company, though life-changing for the startup founder(s); most deals are in the single- to triple-digit millions.
I think you're missing the analogy of the dark forest. All it takes is one bad actor to do this for it to become the dominant strategy. The existence of one "hunter" mandates a response that proliferates. It is, quite literally, "if you can't beat them, join them."
I think you are unnecessarily discounting the absolutely trivial risk-to-reward ratio that mass data-center compute offers. Why not spend a few tens of dollars in electricity to make a duplicate of something that could gross millions a year? Code ten million projects for half a billion dollars, and if 1% take off, you break even. It's trivial to implement "clean room" rebuilds of existing products with AI and change them modestly to avoid IP/copyright limitations, all while letting your competition do the expensive experimentation and optimization for you... More importantly, you deny a competitor their unchallenged market, even as you take a slice. What's the response for a competitor once this happens once or twice? Well, they have to do it too. And so do all of their competitors. It is an inherent race to the bottom for digital properties, one where already-thin margins are whittled away as duplicates propagate.
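The break-even claim above can be sanity-checked with a quick back-of-envelope calculation; the figures (ten million projects, half a billion dollars, a 1% hit rate) are taken directly from the comment, not from any real data:

```python
# Back-of-envelope check of the mass-cloning bet described above.
# All figures are the commenter's hypotheticals, not real numbers.
projects = 10_000_000            # clones produced
total_cost = 500_000_000         # total spend in dollars
hit_rate = 0.01                  # fraction of clones that "take off"

successes = int(projects * hit_rate)
cost_per_project = total_cost / projects       # dollars per clone
break_even_gross = total_cost / successes      # revenue each winner must gross

print(cost_per_project)   # 50.0
print(break_even_gross)   # 5000.0
```

So at $50 per clone, each of the 100,000 winners only needs to gross about $5,000 for the whole bet to break even, which is the arithmetic behind the "race to the bottom" worry.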
Or.
Cartelization is another possibility. A handful of tech giants could, hypothetically, just agree to snipe new innovators, while protecting their own. In such a scenario, margins can remain robust, but innovation from the outside will be absorbed. Why bother to buy a promising app for a few million dollars, when 5 companies can spend $1k on apps, swamp the market, and destroy the original concept? One or two of those apps will survive, the other companies take the modest loss, and move on, while the winning companies take profits. Unfortunately, taken to a maximalist extreme, this will still eventually fail due to economic degradation, as the paths for individuals to earn money reduce, while extraction increases...hello Black Mirror.
I'm not the author, but I think the author raises thought-provoking points. It may be worthwhile to look into the "Genesis" Project, and the goals (explicit and implicit) it sets out for technology, engineering, medicine, and science, then complement it with Curtis Yarvin's concept of technofeudalism, which is supported by deep-pocketed donors. The dream is a fully vertically integrated economy where the "no money only spend" meme is a reality. In short, CEOs have autonomy and absolute freedom to do as they wish, while others are serfs. (A brief article on Yarvin). Put another way, whether literally or figuratively, "the humans will be discarded." Of course, all the AI absolutist supporters assume they will be the CEOs, and don't think about the inherent absurdism of a self-propagating money machine that turns us all into paperclips, but, hey, they are smart, just look at all the money they have...
I agree. This has been a long-standing concern with search engines. Yet here we are, doing Google searches to find out if anyone has ever thought of our latest great idea.

I would want to believe that our new AI overlords really will "steal" all our ideas, but nothing ever happens, and I don't see how this is different from what our old search overlords could have done. If there is a reason this time is different, I'm open to hearing it.
Yes, but the Target pregnancy hoax is also much closer to reality these days.

This interaction is becoming increasingly commonplace:
1. Two people have a conversation in private.
2. One or more of them starts getting targeted ads directly related to the conversation.
3. People presume the phone is listening in.
While (3) might not be true, the fact that (2) happens as frequently as it does is obscene.