• Armok_the_bunny@lemmy.world · ↑120 ↓5 · 24 days ago

    Cool, now how much power was consumed before even a single prompt was run, in training that model? How much power is consumed on an ongoing basis adding new data to those models, even without user prompts? Also, how much power was consumed per query before AI was shoved down our throats, and how many prompts does an average user make per day?

    • Grimy@lemmy.world · ↑49 ↓17 · 24 days ago

      I did some quick math with Meta’s Llama model, and the training cost was about a flight to Europe’s worth of energy. Not a lot when you consider how many people use it compared to the flight.

      Whatever you’re imagining the impact to be, it’s probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.

      • Damage@feddit.it · ↑56 ↓3 · 24 days ago

        If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?

        • Imacat@lemmy.dbzer0.com · ↑13 ↓2 · 24 days ago

          To be fair, nuclear power is cool as fuck and would reduce the carbon footprint of all sorts of bullshit.

        • Armok_the_bunny@lemmy.world · ↑7 ↓1 · 24 days ago

          Volume of requests, plus power consumption unrelated to the requests themselves, at least I have to assume. It certainly doesn’t help that Google has forced me to make a request to their AI every time I run a standard search.

          • Rentlar@lemmy.ca · ↑17 · 24 days ago

            Seriously. I’d be somewhat less concerned about the impact if it were only used voluntarily. Instead, AI is compulsively shoved into every nook and cranny of every digital product simply to justify its own existence.

            The power requirement for training is ongoing, too: mere days after Sam Altman released a very underwhelming GPT-5, he began hyping up the next one.

            • zlatko@programming.dev · ↑5 · 24 days ago

              I also never saw a calculation that took into account my VPS costs. The fckers scrape half the internet, warming up every server in the world connected to it. How much energy is that?

        • douglasg14b@lemmy.world · ↑3 ↓3 · 24 days ago

          That’s not small…

          Jet fuel is pretty damn energy dense. A Boeing 777 might burn 45,000 kg of fuel on a long-haul flight, at a density of ~47 MJ/kg. Which comes out to… roughly 600 MWh.

          Or about 60 US households’ electricity usage for a year.

          It’s an asinine way to measure it, to be fair: not only is it incredibly ambiguous, but almost no one has any reference for how much energy a flight actually is.
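          The napkin math above can be sanity-checked in a few lines (the fuel load and energy density are rough assumptions, not exact specs, and the household figure is the approximate US average):

```python
# Rough check: energy in one long-haul 777 fuel load, in MWh and "house-years".
fuel_kg = 45_000           # assumed fuel burn for one long-haul flight
mj_per_kg = 47             # approximate energy density of jet fuel

total_mj = fuel_kg * mj_per_kg            # 2,115,000 MJ
total_mwh = total_mj / 3_600              # 1 MWh = 3,600 MJ -> ~587 MWh

house_mwh_per_year = 10.5                 # rough average US household electricity
houses = total_mwh / house_mwh_per_year   # ~56 households for a year
```

          So “about 600 MWh” and “about 60 houses” both hold up, give or take.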

      • taiyang@lemmy.world · ↑9 · 24 days ago

        I usually liken it to video games, ya. Is it worse than nothing? Sure, but that flight or road trip, etc., is a bigger concern. Not to mention that even before AI we had industrial energy and water usage that isn’t sustainable… almonds in California alone are a bigger problem than AI, for instance.

        Not that I’m pro-AI cause it’s a huge headache from so many other perspectives, but the environmental argument isn’t enough. Corpo greed is probably the biggest argument against it, imo.

      • boor@lemmy.world · ↑1 · 19 days ago

        Please show your math.

        One Nvidia H100 DGX AI server consumes 10.2 kW at 100% utilization, meaning that about 42 days’ use of one server is equivalent to the electricity consumption of the average US home in one year. This is just a single 8-GPU server; it excludes the electricity required by the networking and storage hardware elsewhere in the data center, let alone the electricity required to run the facility’s climate control.

        xAI alone has deployed hundreds of thousands of H100 or newer GPUs. Let’s SWAG 160K GPUs = ~20K DGX servers = >200MW for compute alone.
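        Both figures check out on a napkin (server power from the DGX H100 spec; the household number is the rough US average, an assumption):

```python
# 42-day equivalence and the >200 MW fleet estimate, checked.
server_kw = 10.2              # DGX H100 at full utilization
house_kwh_per_year = 10_500   # rough average US household electricity

hours = house_kwh_per_year / server_kw   # ~1,029 hours
days = hours / 24                        # ~43 days

servers = 160_000 // 8                   # 160K GPUs in 8-GPU servers
total_mw = servers * server_kw / 1_000   # ~204 MW for compute alone
```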

        H100 is old. State of the art GB200 NVL72 is 120kW per rack.

        Musk is targeting not 160K, but literally one million GPUs deployed by the end of this year. He has built multiple new natural gas power plants which he is now operating without any environmental permits or controls, to the detriment of the locals in Memphis.

        This is just one company training one typical frontier model. There are many competitors operating at similar scale and sadly the vast majority of their new capacity is running on hydrocarbons because that’s what they can deploy at the scale they need today.

        • Grimy@lemmy.world · ↑1 · 19 days ago

          I should have specified it was an earlier Llama model; they have since scaled up to much more than a flight or two. You are mostly right, except for how much a house uses: it’s about 10,500 kWh per year, so you’re off by a factor of a thousand. In one hour the server uses about eight hours’ worth of a house’s electricity, which is still a lot though, especially when you consider Musk’s 1 million GPUs.

          https://kaspergroesludvigsen.medium.com/facebook-disclose-the-carbon-footprint-of-their-new-llama-models-9629a3c5c28b

          Their first model took about 2,600,000 kWh; a transatlantic flight takes about 500,000 kWh. So the actual napkin math is about 5 flights. I had done the math around 2 years ago, but yeah, I was mistaken and should at least have specified it was their first model. Their more recent ones have been a lot more energy-intensive, I think.
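          For what it’s worth, the ratio is easy to check (training energy per the linked article; the flight energy is the rough figure assumed above):

```python
# "Training cost in flights": first-generation Llama vs one transatlantic flight.
training_kwh = 2_600_000   # LLaMA training energy, per the linked article
flight_kwh = 500_000       # assumed energy of one transatlantic flight

flights = training_kwh / flight_kwh   # ~5.2 flights
```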

          • boor@lemmy.world · ↑2 · 19 days ago

            Thanks for catching that; you are right that the average US home uses 10.5 MWh/year, not kWh. I was mistaken. :)

            Regarding the remainder, my point is that the scale of modern frontier-model training, and the total net-new electricity demand that AI is creating, is not trivial. Worrying about traditional sources of CO2 emissions like air travel and so forth is reasonable, but I disagree with the conclusion that AI infrastructure is not a major environmental and climate-change concern. The latest projects are on the scale of 2–5 GW per site, and the vast majority of that new electricity capacity will come from natural gas or other hydrocarbons.

      • douglasg14b@lemmy.world · ↑2 ↓5 · 24 days ago

        A flight to Europe’s worth of energy is a pretty asinine way to measure this. Is it not?

        It’s also not that small a number, being roughly 600 MWh of energy.

        However, training cost is considerably less than cumulative prompting cost, which makes your argument rather one-sided.

        Similarly, the numbers released by Google seem artificially low. Perhaps their TPUs are massively more efficient, given that they are ASICs, but they did not disclose which model the measurement is for. It could be their smallest, least capable, most energy-efficient model, which would be disingenuous.

  • sbv@sh.itjust.works · ↑46 · 24 days ago

    In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
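    The microwave comparison is easy to verify, assuming a typical ~1,000 W microwave (the wattage is an assumption; Google only gave the per-prompt figure):

```python
# "One second of microwave time" per median Gemini text prompt.
prompt_wh = 0.24      # Google's median Gemini text prompt
microwave_w = 1_000   # assumed typical microwave power draw

seconds = prompt_wh / microwave_w * 3_600   # ~0.86 s, i.e. "about one second"
```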

    • unmagical@lemmy.ml · ↑31 ↓1 · 24 days ago

      There are zero downsides when mentally associating an energy hog with “1 second of use time of the device that is routinely used for minutes at a time.”

      https://xkcd.com/1035/

      • ArbitraryValue@sh.itjust.works · ↑8 ↓2 · 24 days ago

        With regard to sugar: when I started counting calories I discovered that the actual amounts of calories in certain foods were not what I intuitively assumed. Some foods turned out to be much less unhealthy than I thought. For example, I can eat almost three pints of ice cream a day and not gain weight (as long as I don’t eat anything else). So sometimes instead of eating a normal dinner, I want to eat a whole pint of ice cream and I can do so guilt-free.

        Likewise, I use both AI and a microwave. My energy use from AI in a day is apparently less than the energy I use to reheat a cup of tea, so the conclusion that I can use AI however much I want without significantly affecting my environmental impact is the correct one.

        • Victor@lemmy.world · ↑8 · 24 days ago

          You should probably not eat things based on how many calories they have or don’t have, but on how much of their nutrients you need, and how little other dangerous shit they contain. Also, eat slowly until you’re full and no more. Also, move a lot.

          We shouldn’t need calculators for this healthy lifestyle.

          The reason for needing to know which foods are healthy is because… well, we forgot.

          • ArbitraryValue@sh.itjust.works · ↑5 ↓1 · 24 days ago

            I’m not saying that ice cream is healthier than a normal dinner, just that if I really crave something sweet then the cost to my health of eating it periodically is actually quite low, whereas the cost of some other desserts (baked sweets are often the worst offenders) is relatively high. That means that a lot can be gained simply by replacing one dessert with a different, equally tasty dessert. Hence my ice cream advocacy.

        • unmagical@lemmy.ml · ↑5 ↓1 · 24 days ago

          On a “respond to an individual query” level, yeah, it’s not that much. But before it can respond, the data center has to be constructed, the entire web scraped, the models trained, and the servers run continually regardless of load. There are also way too many “hidden” queries across the web in general, from companies trying to summarize every email or product.

          All of that adds to the energy cost. This equivocation is meant to make people feel less bad about the energy impact of using AI, when so much of the cost is in building AI.

          Furthermore, that’s the median value: the one that falls right in the middle of the distribution of queries. There’s a limit to how much less energy a query to the left of the median can use; there’s significantly more runway to the right of the median for excess energy use. This also only accounts for text queries; image and video generation use a lot more.

          • ArbitraryValue@sh.itjust.works · ↑2 · 24 days ago

            Your points are valid, but I think that building AI has benefits beyond simply enabling people to use that AI. It advances the state of the art and makes even more powerful AI possible. Still, it would be good to know about the amortized cost per query of building the AI in addition to the cost of running it.

        • dohpaz42@lemmy.world · ↑4 ↓1 · 24 days ago

          Individually you’re spot on. Your AI use doesn’t matter. But, and this is where companies tend to leave off, when you take into account how many millions (or billions) of times something is done in a day (like AI prompts), then that’s when it genuinely matters.

          To me, this is akin to companies trying to pass the blame to consumers when it’s the companies themselves who are the biggest climate offenders.

          • ArbitraryValue@sh.itjust.works · ↑3 · 24 days ago

            I don’t see why this argument works better against AI than it does against microwaves. Those are used hundreds of millions of times a day too.

            • dohpaz42@lemmy.world · ↑1 · 24 days ago

              You’re right. But if I had to pick a why, I’d go with how microwaves at least provide a service for households by heating up food.

              AI’s only viable service (at the time of this writing) is a replacement for viagra for techbros when they need to get an erection.

    • DarkCloud@lemmy.world · ↑10 · 24 days ago

      The article also mentions that each query evaporates 0.26 millilitres of water… or “about five drops”.

    • ganksy@lemmy.world · ↑5 · 24 days ago

      In addition:

      This report was also strictly limited to text prompts, so it doesn’t represent what’s needed to generate an image or a video.

    • plyth@feddit.org · ↑1 · 22 days ago

      The human brain runs on about 100 watts. At 0.24 Wh per prompt, the equivalent would be about 400 questions per hour: roughly 7 per minute, or one every 9 seconds or so. That’s close to human capacity.
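      The brain comparison works out like this (the 100 W brain estimate is the commonly cited rough figure; the per-prompt energy is Google’s median):

```python
# How many median Gemini prompts equal a brain's continuous power draw?
brain_w = 100       # rough resting power of the human brain
prompt_wh = 0.24    # Google's median Gemini text prompt

prompts_per_hour = brain_w / prompt_wh           # ~417
prompts_per_minute = prompts_per_hour / 60       # ~7
seconds_per_prompt = 3_600 / prompts_per_hour    # ~8.6 s
```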

  • rowrowrowyourboat@sh.itjust.works · ↑35 ↓1 · 24 days ago

    This feels like PR bullshit to make people feel like AI isn’t all that bad. Assuming what they’re releasing is even true. Not like cigarette, oil, or sugar companies ever lied or anything and put out false studies and misleading data.

    However, there are still details that the company isn’t sharing in this report. One major question mark is the total number of queries that Gemini gets each day, which would allow estimates of the AI tool’s total energy demand.

    Why wouldn’t they release this? Even if each query uses minimal energy, countless queries a day would add up to a huge amount of energy.

    Which is probably what’s happening and why they’re not releasing that number.

  • L0rdMathias@sh.itjust.works · ↑13 ↓1 · 24 days ago

    median prompt size

    Someone didn’t pass statistics, but did pass their marketing data presentation classes.

    Wake me up when they release useful data.

  • Rhaedas@fedia.io · ↑11 ↓1 · 24 days ago

    Now do the training runs, since it’s obvious they are never going to settle on a final model as they pursue the Grail of AGI. I could do the exact same comparison with my local computer and claim that running a prompt only uses X watt-hours because the GPU heats up for a few seconds and is done. But if I were to do some fine-tuning or other training, that fan would stay on for hours. A lot different.

  • salty_chief@lemmy.world · ↑5 ↓22 · 24 days ago

    So, as thought, virtually no impact. AI is here and not leaving. It will probably outlast humans on Earth.