Net-zero emission goals went out the window with AI.

  • kitnaht@lemmy.world

    In this world, we obey the laws of thermodynamics. I’d love to know how these 3 bottles of water are “consumed”. More than likely, the water is simply being used for cooling, which doesn’t consume it at all; it just makes it warmer.

    • Zikeji@programming.dev

      Yeah the article is disingenuous at best. There are many things wrong with generative AI, but this is just a lousy approach.

      If I make a PC, put in a water cooling loop, and use it to run an LLM - sure, water is circulating, but that water isn’t just vanishing lol.

      • iAmTheTot@sh.itjust.works

        My friend, you are naive at best if you think AI data centers are using closed loop water cooling. Look up evaporative cooling towers. It’s “consumed” in the sense that it is evaporated.

        • Zikeji@programming.dev

          I specifically avoided saying they did because I wasn’t knowledgeable on the topic. But I agree, I could equally be accused of being disingenuous by phrasing it in a way that could lead people to assume they use closed loops.

          I did look those up, and while evaporative cooling isn’t the only method used, it also doesn’t evaporate all the water on each pass, only a portion of it (granted, “a portion” is all I found at a quick look, which isn’t actually useful).

          I do agree though, the water usage is excessive, and even though that water only “changes form”, it’s still removed from a water source and only some of it may make its way back in.

        • Pup Biru@aussie.zone

          and it’s still absolute crap… the heat produced by 100 words of GPT inference is negligible - it CERTAINLY doesn’t take 3L of water evaporating to cool it
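
          A rough sanity check of that claim (a minimal sketch; the latent heat of vaporization is a standard figure, but the per-response inference energy is an assumed illustrative number, not a measured one):

          ```python
          # Back-of-envelope check: heat needed to evaporate 3 L of water
          # vs. an assumed energy cost for one ~100-word GPT response.
          LATENT_HEAT_KJ_PER_KG = 2260        # latent heat of vaporization of water
          WATER_KG = 3.0                      # 3 litres ~ 3 kg

          evaporation_kwh = LATENT_HEAT_KJ_PER_KG * WATER_KG / 3600   # kJ -> kWh, ~1.9 kWh
          assumed_response_kwh = 0.0003       # ~0.3 Wh per response (assumption for illustration)

          print(f"Evaporating 3 L takes ~{evaporation_kwh:.2f} kWh of heat")
          print(f"That is ~{evaporation_kwh / assumed_response_kwh:,.0f}x the assumed inference energy")
          ```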

    • groet@feddit.org

      It consumes the resource of “purified, available water”, which is used up once it is no longer purified or becomes unavailable (if evaporated). In the same way, nothing ever “consumes” energy; it just makes it unusable.

    • LostXOR@fedia.io

      The water simply vanishes, consumed by the AI’s ever growing need for H Y D R A T I O N. /s

  • peanuts4life@lemmy.blahaj.zone

    Wait… What? The article seems to imply that the water is consumed, but it’s referencing the water used in cooling loops.

    • Nawor3565@lemmy.blahaj.zone

      Data centers don’t have “water cooling loops” that are anything like the ones in consumer PCs. To maximize cooling capacity, a lot of the systems use some sort of evaporative cooling that results in some of the water just floating away into the atmosphere (after which point it would need to be purified again before it could be used for human consumption).

      It also seems, from what I can find, like some data centers just pipe in clean ambient-temperature water, use it to cool the servers, and then pipe it right back out into the municipal sewer system. That is even more stupid, because you’re taking potable water, sending it through systems that should be pretty clean, and then mixing it with waste water. If anything, it should be considered “gray water”, which is still fine to use for things like flushing toilets.

      • AndrewZabar@lemmy.world

        As with everything else, we need the government to regulate it because otherwise the corporations don’t really give a shit.

      • morbidcactus@lemmy.ca

        I would be really surprised if anyone is cooling data centres with city water except in an emergency; that’s so unbelievably expensive (I could see water drawn directly from a lake, though that has its own issues too). I recall saving millions just by adjusting a fill target on an evaporative cooling tower so it wouldn’t overfill (levels were really cyclic, and the targets weren’t tuned for them), and that was only a fraction of what it would have cost if we’d used pure city water.

        • Todd Bonzalez@lemm.ee

          This is correct. You don’t need potable water for cooling systems. Releasing vapor returns natural water to where it came from, without adding any more heat to the environment than you already were.

          The environmental cost of AI needs to be measured in gigawatt hours, distributed over different energy generation methods.

          Adding heat to the system isn’t a big deal if you’re powered by solar energy, for example.

    • Umbrias@beehaw.org

      Water supply is a limited resource; everyone here appears to be focusing on the wrong thing. When a data center uses water in its cooling loops, that water is made inaccessible to everything else, such as agriculture, natural habitats, and drinking. It does not matter (directly) whether the water is technically potable after use. Very little water ever leaves the Earth system, yet drought exists.

  • Affidavit@lemm.ee

    Huh. I run a LLM locally on my own machine. Not looking forward to my next water bill.

    • nexguy@lemmy.world

      Have you checked your computer’s gallons per hour? I’m thinking of getting an electric myself.

  • ninjabard@lemmy.world

    It’s almost like these “services” are an unnecessary blight that benefits only those who profit financially from them.

    • desktop_user@lemmy.blahaj.zone

      that is what a service always was and will continue to be: live service games, service jobs, telco service. this isn’t new, it just affects more people.

  • fubarx@lemmy.ml

    The Excel spreadsheet that calculates this has so many ‘assumption’ cells.

  • Bob Robertson IX@lemmy.world

    We need municipal datacenters that can be integrated with the municipal water department and electrical grid. Use the hot water to provide ‘on tap’ hot water for local businesses that need it.

    • RubberDuck@lemmy.world

      Yup, way more of these kinds of solutions are needed. But data centers usually add stuff to the water to make it cool better, which makes it undrinkable. The whole ordeal just shows water and power are too cheap for these kinds of uses.

  • nexguy@lemmy.world

    How many bottles of water does generating a bottle of water consume? Checkmate water bottle and water bottle related statistical analysis enthusiasts.

  • okamiueru@lemmy.world

    Can anyone explain the conversion from “a bottle of water” to something like kWh?

    • Affidavit@lemm.ee

      Sorry for the delayed response, it took me a while to do the calculations but I finally figured it out:

      It’s magic.

      I hope this helps.

  • SpikesOtherDog

    I’m not 100% down with these numbers. The Verge has a breakdown of energy usage for generation and training, and you could argue that demand is responsible for training.

    I would also argue that energy usage would be directly related to water usage. Unless there is passive cooling unrelated to the energy generation, the evaporation would be directly related to the energy cost.

    I didn’t collect sources while I was compiling this, but I found that it takes about 0.3 kWh to generate an AI image - about the same as fully charging a smartphone. Roughly:

    - 1 kWh is ~860 kcal (one Calorie = kcal = 1,000 calories), so 1 image is ~282 Calories.
    - 1 Calorie heats 1 liter of water by 1 degree, and it takes ~540 Calories to vaporize 1 liter, so about 2 images vaporize a liter of water.
    - There are ~30,000 liters in an 18-foot above-ground pool with 4 feet of water.
    - As of August 2023, people had generated almost 15.5 billion AI images in total, and each day sees approximately 34 million new AI-generated images.
    - That works out to ~17 million liters vaporized daily, about 500 swimming pools, which puts my numbers at ~250k swimming pools vaporized so far.
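
    A minimal sketch of that arithmetic, using the figures quoted above (the 0.3 kWh per image and the image counts are the estimates cited here, not measured values); it roughly reproduces the ~17 million liters/day and ~250k pools numbers:

    ```python
    # Back-of-envelope version of the estimate above.
    KWH_PER_IMAGE = 0.3                 # quoted estimate, disputed later in the thread
    KCAL_PER_KWH = 860                  # 1 kWh ~ 860 kcal
    KCAL_TO_VAPORIZE_LITER = 540        # latent heat of vaporization, ~540 kcal per liter
    LITERS_PER_POOL = 30_000            # ~18 ft above-ground pool with 4 ft of water
    IMAGES_PER_DAY = 34_000_000         # quoted daily image count
    IMAGES_TOTAL = 15_500_000_000       # quoted cumulative count (August 2023)

    liters_per_image = KWH_PER_IMAGE * KCAL_PER_KWH / KCAL_TO_VAPORIZE_LITER  # ~0.5 L, i.e. ~2 images per liter

    daily_liters = IMAGES_PER_DAY * liters_per_image
    total_liters = IMAGES_TOTAL * liters_per_image

    print(f"~{daily_liters / 1e6:.0f} million liters/day (~{daily_liters / LITERS_PER_POOL:.0f} pools/day)")
    print(f"~{total_liters / LITERS_PER_POOL:,.0f} pools in total")
    ```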

    • BrikoX@lemmy.zipOPM

      <…> I found that it takes about 0.3 kWh to generate an AI image <…>

      There isn’t really a set power usage per image, since different models will take different amounts of time. There are hundreds of different factors and optional toggles that can increase or reduce the time needed.

      • SpikesOtherDog

        I was using a pretty broad brush. Some of the figures were higher, some were lower, but many were consistently around 0.3. Feel free to take it with a grain of salt. Even if I’m over by 50%, it is still a large number.

        • 0laura@lemmy.dbzer0.com

          my GPU can use 80 watts max and it takes 10 seconds to produce an image. that’s about 0.00022 kWh, which is about 1,300 times less than what you said.

    • 0laura@lemmy.dbzer0.com

      it takes about 10 seconds for me to generate an image, my GPU can use 80 watts max. that’s 800 watt seconds or 0.0002222 kWh, if my math is correct
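
      For reference, that conversion as a quick sketch (assuming the 80 W / 10 s figures above):

      ```python
      # 80 W for 10 s, converted from watt-seconds to kWh
      kwh_per_image = 80 * 10 / 3_600_000
      print(kwh_per_image)  # ~0.000222 kWh per image on this GPU
      ```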

      • SpikesOtherDog

        That’s a good point. I’m guessing my numbers might refer more to cloud providers than individuals with smaller data sets.

        • 0laura@lemmy.dbzer0.com

          yea, i feel like there’s a lot of misinformation around ai, both from ai bros and ai haters. i’m definitely not a fan of big corporations stealing the work of small artists, but it seems like most places have just become an ai hate circlejerk.