• blaue_Fledermaus@mstdn.io · 1 day ago

    Makes me wonder what they are doing to reach these figures, because I can run many models at home and it doesn’t require me to pour bottles of water on my PC, nor does it show up on my electricity bill.

    • FatCrab@slrpnk.net · 12 hours ago

      Most of these figures are guesses along a spectrum of “educated,” since many models, like ChatGPT, are effectively opaque to everyone and we have no idea what the current iteration’s architecture actually looks like. But MIT did a very solid study not too long ago that looked at the energy cost of various queries across various architectures. Text queries to very large GPT models actually had a higher energy cost than image generation with a typical number of iterations on Stable Diffusion models, which is pretty crazy. Anyhow, you’re looking at per-query energy usage somewhere between 15 seconds of microwaving at full power and riding a bike a few blocks (a rough conversion is sketched after this comment). Tallied over the immense number of queries being serviced, it does add up.

      That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids and convert to non-GHG sources, and it doesn’t matter; there are other concerns with AI that are far more pressing (like deskilling effects and the inability to control mis- and disinformation).
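
      For a sense of scale, here is a rough Python sketch turning those comparisons into watt-hours. The microwave wattage and cycling power below are assumed illustrative values, not figures from the MIT study.

      ```python
      # Back-of-envelope: convert the per-query comparisons above into watt-hours.
      # All inputs are illustrative assumptions, not measured figures.

      MICROWAVE_POWER_W = 1_100   # assumed "full power" for a household microwave
      MICROWAVE_SECONDS = 15      # low end of the range quoted above

      CYCLING_METABOLIC_W = 300   # assumed metabolic power for casual cycling
      CYCLING_SECONDS = 120       # "a few blocks" at ~15 km/h, roughly two minutes

      def to_wh(power_w: float, seconds: float) -> float:
          """Energy in watt-hours for a given power draw over a duration."""
          return power_w * seconds / 3600

      low = to_wh(MICROWAVE_POWER_W, MICROWAVE_SECONDS)   # ~4.6 Wh
      high = to_wh(CYCLING_METABOLIC_W, CYCLING_SECONDS)  # ~10 Wh
      print(f"Per-query estimate: {low:.1f} to {high:.1f} Wh")
      ```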

    • Artisian@lemmy.world · 23 hours ago

      Well, most of the carbon footprint for models is in training, which you probably don’t need to do at home.

      That said, even counting training, they are nowhere near our leading cause of pollution.

      • REDACTED@infosec.pub · 18 hours ago

        The article says that training o4 required an amount of energy equivalent to powering San Francisco for three days.
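
        Putting a rough number on that comparison (the city-wide consumption figure below is an assumed round value, not something taken from the article):

        ```python
        # Rough conversion of "powering San Francisco for 3 days" into energy units.
        # The daily consumption figure is an assumed illustrative value.

        SF_DAILY_ELECTRICITY_GWH = 15   # assumed: a few TWh/year spread over 365 days
        DAYS = 3

        implied_training_energy_gwh = SF_DAILY_ELECTRICITY_GWH * DAYS
        print(f"Implied training energy: ~{implied_training_energy_gwh} GWh")  # ~45 GWh at these assumptions
        ```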

      • very_well_lost@lemmy.world · 17 hours ago

        Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.
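
        To put a rough figure on that volume (both numbers below are illustrative assumptions, not reported measurements):

        ```python
        # Scale an assumed per-query energy cost by a "billions per day" query volume.

        QUERIES_PER_DAY = 9e9   # assumed: "more than there are humans on Earth"
        WH_PER_QUERY = 0.3      # assumed small per-query figure for a search-style model

        daily_gwh = QUERIES_PER_DAY * WH_PER_QUERY / 1e9   # Wh -> GWh
        print(f"~{daily_gwh:.1f} GWh per day at these assumptions")  # ~2.7 GWh/day
        ```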