The question seems extraordinarily vague. There are many kinds of artificial intelligence and many ways to be green. But considering that there is AI that runs on mobile phones, I think the answer has to be yes.
How much power good LLMs use in practice is another question.
A comparison that I think makes sense would be the cost of ChatGPT versus watching Netflix. How many seconds do you need to watch a movie for it to use the same amount of power as a typical ChatGPT query?
It isn't just a question, it's the title of an article about a non-profit focused on climate action engaging with gen-AI features.
Netflix never had to build out massive new energy capacity to run whatever it needed, let alone nuclear plants. I agree that streaming is energy-intensive, but from the outside (and considering it's a bigger market in users than AI), I think this comparison misses the point.
Arguably, most of the expensive processing (video decoding) happens on the user's device, which lets Netflix distribute the work. And at least in some communities, nuclear plants do charge users' smartphones.
OpenAI's work is entirely centralized, both for model training and chat inference, so they absorb all the power costs associated with it. Training will likely always be concentrated in data centers, but it seems plausible that inference will move to local devices. New hardware made video decoding practical on phones, and the same could become true of AI applications as well.
Not only that, but even though video decoding is done client-side, Netflix still had to switch to colocation a long time ago, precisely because their computation, storage, and bandwidth requirements were far too high to keep centralized. So sure, they didn't have to build more energy capacity directly, but the data centers all over the world that now host their services absolutely did. Is it as much as AI collectively uses now? Probably not, but they likely still use a lot of energy. It's just that their usage is spread out over hundreds or thousands of data centers worldwide, which is why there was no need to build a nuclear plant for themselves.
> and considering it's a bigger market in users than AI

Are you sure about that? Genuine question. I haven't checked recently (by which I mean, probably not in half a year), but even without counting Google's "2.3 billion [involuntary] AI users" from their language-model search summary feature, firms like OpenAI did have some pretty insane usage numbers, including unique user counts.
Yeah, Google was missing from my thinking, but YouTube alone (2+ billion users?) is a streaming service bigger than any of the others.
It's certainly true that the AI boom has increased demand for data centers, electrical power, and computer components.
But that's a broad, industry-wide trend. Beyond that, I think it's also useful to compare the marginal cost of serving user requests.
Typical estimates look like 0.3 watt-hours per ChatGPT query versus 120 watt-hours per hour of watching Netflix, with wide error bars. At that ratio, an hour of streaming equals about 400 queries, so it seems safe to say you could do well over 100 ChatGPT queries before using more energy than watching a movie on Netflix.
By the way, 120 watt-hours per hour is 120 watts, so Netflix is like keeping an old-fashioned incandescent light bulb on.
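If you want to sanity-check the arithmetic, here's a rough back-of-envelope sketch. The 0.3 Wh and 120 Wh figures are just the wide-error-bar estimates quoted above, and the two-hour movie length is an assumption:

```python
# Back-of-envelope energy comparison, using the rough estimates above.
CHATGPT_WH_PER_QUERY = 0.3   # estimated Wh per ChatGPT query (wide error bars)
NETFLIX_WH_PER_HOUR = 120.0  # estimated Wh per hour of Netflix streaming
MOVIE_HOURS = 2.0            # assumed length of a typical movie

movie_wh = NETFLIX_WH_PER_HOUR * MOVIE_HOURS
queries_per_movie = movie_wh / CHATGPT_WH_PER_QUERY
seconds_per_query = 3600 * CHATGPT_WH_PER_QUERY / NETFLIX_WH_PER_HOUR

print(f"One movie: {movie_wh:.0f} Wh, or about {queries_per_movie:.0f} queries")
print(f"One query is roughly {seconds_per_query:.0f} s of streaming")
# -> One movie: 240 Wh, or about 800 queries
# -> One query is roughly 9 s of streaming
```

Which also answers the earlier question directly: under these estimates, a typical ChatGPT query costs about the same energy as nine seconds of streaming.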
I don't really think the premise makes that much sense. Would someone say that Silksong is a "green" game and Expedition 33 a polluting one because the latter has more intensive graphics?
I guess you could, but I'm not sure it's very useful.
In the end, what is called "AI" is mostly just a bunch of matrix multiplications. It's neither inherently good nor bad for the environment, because it's a bunch of matrices. And so is computer graphics, which is why GPUs are used for both.
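To make the "it's all matrices" point concrete, here's a toy sketch. The shapes and values are made up for illustration and bear no resemblance to a real model or renderer:

```python
# Toy illustration: a neural-network layer and a 3D graphics transform
# are both just matrix multiplications.
import numpy as np

# "AI": one dense layer, y = W @ x (hypothetical weights and input)
W = np.random.randn(4, 3)   # made-up layer weights
x = np.random.randn(3)      # made-up input activations
y = W @ x                   # an inference step: a matrix multiply

# "Graphics": rotating a 3D vertex is also a matrix multiply
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])  # rotation about z
vertex = np.array([1.0, 0.0, 0.0])
rotated = R @ vertex        # a render step: also a matrix multiply

print(y, rotated)
```

Both workloads bottom out in the same dense linear algebra, which is exactly what GPUs are built to accelerate.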