What's your p(doom)?
Now that ChatGPT's been around for long enough to become a quotidian fixture, I think most of us have realized that we're closer than expected to generalized artificial intelligence (or at least a reasonable facsimile of it), even when comparing to just a couple years ago.
OG AI doomers like Eliezer Yudkowsky seem a little less nutty nowadays. Even for those of us who still doubt the inevitability of the AI apocalypse, the idea has at least become conceivable.
In fact, the concept of an AI apocalypse has become mainstream enough to gain a cute moniker: p(doom), i.e. the (prior) probability that AI will inflict an existential crisis on humanity.
So for funsies, I ask my dear tilderinos: what is your p(doom)? How do you define an "existential crisis" (e.g., 90%+ of the population lost)? Why did you choose your prior? How would you change public policy to address your p(doom)?