In the late 2000s a lot of businesses pushed to stop printing emails, and people used to add a footer: ‘Please consider the environment before printing this email.’

Considering how bad LLMs/‘AI’ are for power consumption and water usage, a new useless email footer tag should be made.

      • Trihilis@ani.social · 3 months ago

        So I’m not saying RTO is worse than AI or vice versa. But do you have any data to back up that statement? I’ve been seeing nothing but news about AI data centers being an absolute nightmare for the planet, and even more so when corrupt politicians let them be built in places that already have trouble maintaining normal water levels.

        I get both are bad for the environment.

        Edit: thanks for all the sources everyone. TIL

          • Showroom7561@lemmy.ca · 3 months ago

          Proof was during COVID:

          In many megacities of the world, the concentration of PM and NO2 declined by > 60% during the lockdown period. The air quality index (AQI) also improved substantially throughout the world during the lockdown. SOURCE

  • SolidShake@lemmy.world · 3 months ago

    I don’t think regular people really understand the power needed for AI. It’s often assumed that we just have it, without any thought about where it comes from.

    • fading_person@lemmy.zip · 3 months ago

      People keep telling us that AI energy use is very low, but at the same time, companies keep building more and more giant, power-hungry datacenters. Something simply doesn’t add up.

      Sure, a small local model can generate text at low power usage, but how useful will that text be, and how many people will actually use it? What I see is people constantly moving to the newest, greatest model, and using it for more and more things, processing more and more tokens. Always more and more.

      • jj4211@lemmy.world · 3 months ago

        Each datacenter is set to handle millions of users, so it concentrates all the little requests into very few physical locations.

        The tech industry further amplifies things with ambient LLM invocation. You do a random Google search, and it implicitly runs an LLM query unasked. When a user is using an LLM-enabled code editor, it makes LLM requests every few seconds of typing to drive the autocomplete suggestions. Often it has to submit a new LLM request before the old one has even completed, because the user typed more while the LLM was chewing on the previous input.

        So each LLM invocation may be reasonable, but impact-wise they are concentrated into very few places, and invocations are amplified by the tech industry being overly aggressive about overuse for the sake of ‘ambient magic’. A rough sketch of that autocomplete pattern is below.
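
        Roughly, a hypothetical editor integration looks something like this (the endpoint URL, payload shape, and function names are made up, just to illustrate how every keystroke can turn into a request that supersedes the previous one):

        ```typescript
        // Sketch of an "ambient" LLM autocomplete loop: each keystroke, after a
        // short debounce, fires a completion request and aborts any request
        // still in flight. Endpoint and payload are hypothetical.
        let inFlight: AbortController | null = null;
        let debounceTimer: ReturnType<typeof setTimeout> | undefined;

        async function requestCompletion(buffer: string): Promise<string | null> {
          // Cancel the previous request if the user typed before it finished.
          inFlight?.abort();
          inFlight = new AbortController();

          try {
            const res = await fetch("https://llm.example.com/v1/complete", {
              method: "POST",
              headers: { "Content-Type": "application/json" },
              body: JSON.stringify({ prompt: buffer, max_tokens: 32 }),
              signal: inFlight.signal,
            });
            const data = await res.json();
            return data.completion ?? null;
          } catch (err) {
            if ((err as Error).name === "AbortError") return null; // superseded by newer input
            throw err;
          }
        }

        // Called by the editor on every keystroke.
        function onKeystroke(buffer: string): void {
          clearTimeout(debounceTimer);
          debounceTimer = setTimeout(() => {
            requestCompletion(buffer).then((suggestion) => {
              if (suggestion) {
                // show ghost-text suggestion
              }
            });
          }, 300); // even with a 300 ms debounce, steady typing keeps generating requests
        }
        ```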

    • cdf12345@lemmy.zip · 3 months ago

      “If everyone is littering, it’s not a big deal if I throw the occasional can on the ground”

      • Artisian@lemmy.world · 3 months ago

        I mean, it depends on the email. If you spend more time answering it yourself than the AI would, you almost certainly emit more greenhouse gases, use more fresh water and electricity, and burn more calories. Depending on the email, you might have also decreased net happiness generally.

        Do we care about the environment or not? Please, oppose datacenters in deserts and stop farming alfalfa where water supplies are low. But your friend using AI to answer an email that could have been a Google search is not the problem.