28 votes
Sam Altman’s second coming sparks new fears of the AI apocalypse
Link information
- Authors
- Peter Guest, Morgan Meaker, Jaina Grey, Julian Chokkattu, Steven Levy, Thor Benson, Paresh Dave, Will Knight
- Published
- Nov 22 2023
- Word count
- 2066 words
I am personally very skeptical of claims by AI companies that they can be trusted to regulate themselves via "mandatory self-regulation through codes of conduct". Regulatory capture is bad enough; we don't need tech companies to start the game on the final square of the board. We need regulation not only around the big matters, such as erosion of democratic institutions and manipulation of public sentiment (aka a superpowered propaganda machine; see Oops! We Automated Bullshit for the suggestion that AI has automated politicians' ability to sway voters using content-free speech), but also around the more mundane problems of "job displacement, around discrimination, around transparency and accountability."
And for all the non-profit declarations from OpenAI, I believe this quote from Satya Nadella expresses all we need to know: "I’ll be very, very clear: We’re never going to get back into a situation where we get surprised like this, ever again." Microsoft and money are solidly in control.
What regulations can we even propose right now? The only thing I can think of that needs addressing is the IP concerns. Outside of that, does it make any sense to regulate when you don’t know what problems will arise?
Nope, especially when we still have a generation of politicians who don't understand tech and have a history of ignoring or misunderstanding the people who do.
We could have a whole regulatory body made up of more capable regulators. But they’d mostly be sitting around right now. Maybe for years to come.
This seems entirely unfair considering that Sam Altman has repeatedly called for regulation of AI. He testified before the Senate Judiciary Committee about it! They want to be regulated. OpenAI's voluntary commitments are only a substitute for regulation because we don't have regulation yet.
(Meanwhile, they get a lot of heat from libertarians on the other side for being too pro-regulation.)
Altman’s calls for regulation struck me more as an angle on regulatory capture than a good-faith effort to guide ethical AI development. OpenAI made a strong showing with their first-mover advantage, then Sam decided it would be beneficial for them to pull up the ladder behind them.
Apparently there's no way to win this one.
Yes, regulation has often been used as a way to limit competition: lobby for complex rules that only bigger companies can handle, making it difficult for new companies to enter the market.
It's been going on for long enough that it's part of the standard playbook. No executive in a for-profit industry is calling for regulation unless they're planning to get in the room and shape the laws.
Anything else would be grossly irresponsible with respect to investors or shareholders.
OpenAI tried to self-regulate, and look what happened.
Mirror, for those hit by the paywall:
https://archive.is/64H59