Makes me wonder what they are doing to reach these figures.
Because I can run many models at home, and it doesn’t require me to pour bottles of water on my PC, nor does it show up on my electricity bill.
Most of these figures are guesses along a spectrum of “educated,” since many models, like ChatGPT, are effectively opaque to everyone and we have no idea what the current iteration’s architecture actually looks like. But MIT did do a very solid study not too long ago that looked at the energy cost of various queries across various architectures. Text queries for very large GPT models actually had a higher energy cost than image generation using a normal number of iterations on Stable Diffusion models, which is pretty crazy. Anyhow, you’re looking at per-query energy usage somewhere between 15 seconds of microwaving at full power and riding a bike a few blocks. When tallied over the immense number of queries being serviced, it does add up.
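To see why it adds up, here’s a back-of-envelope sketch of the microwave comparison above. The microwave wattage and daily query volume are illustrative assumptions, not measured values:

```python
# Back-of-envelope: energy per query using the "15 seconds of microwaving" comparison.
# All constants below are illustrative assumptions, not measured figures.

MICROWAVE_WATTS = 1000            # assumed full-power microwave draw
SECONDS_PER_QUERY = 15            # low end of the per-query comparison
QUERIES_PER_DAY = 1_000_000_000   # assumed daily query volume, for scale

# Energy per query in watt-hours: power (W) * time (s) / 3600 (s/h)
wh_per_query = MICROWAVE_WATTS * SECONDS_PER_QUERY / 3600

# Aggregate daily energy in gigawatt-hours
gwh_per_day = wh_per_query * QUERIES_PER_DAY / 1e9

print(f"{wh_per_query:.2f} Wh per query")   # ~4.17 Wh
print(f"{gwh_per_day:.2f} GWh per day")     # ~4.17 GWh/day at 1B queries
```

So a single query is trivial (a few watt-hours), but at the assumed billion-query scale it reaches gigawatt-hours per day, which is city-scale consumption.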
That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids and convert to non-GHG sources and it doesn’t matter; there are other concerns with AI that are far more pressing (like deskilling effects and the inability to control mis- and disinformation).
Well, most of the carbon footprint for models is in training, which you probably don’t need to do at home.
That said, even with training they are not nearly our leading cause of pollution.
The article says that training o4 required an amount of energy equivalent to powering San Francisco for three days.
Basically every tech company is using it… It’s millions of people, not just us…
Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.