• vala@lemmy.dbzer0.com · 24 hours ago

    It’s so important to differentiate between commercial LLMs and AI as a general concept.

  • vane@lemmy.world · 1 day ago

    From this page, it turns out that every prompt is one glass of water. Is there any chance we run out of water at this point?

    • Kissaki@feddit.org · 19 hours ago

      There have been reports of AI data centers further draining water reserves in areas where water is neither abundant nor recovering fast enough. That has not only environmental but also social and human consequences for the people in those areas.

    • boaratio@lemmy.world · 21 hours ago

      I appreciate you sharing sources for that. I know almonds use a lot of water. But one of the things you mentioned is food, and the other is a liar.

      • jsomae@lemmy.ml · 21 hours ago

        That’s very pragmatic, but you can also flip this around: almonds are a luxury compared to other, more practical foods, whereas LLMs can help a coder net an income if used properly. I don’t think you can justify almonds if you’re going to claim AI usage is unethical on purely environmental grounds.

        Anyway, check out the third link for more info on the total water usage of data centers; it doesn’t really add up to much compared to much larger consumers like golf courses. I don’t get why anyone would use water usage as a reason to agitate against AI, given that there are so many worse problems AI is causing.

  • Log in | Sign up@lemmy.world · 1 day ago

    Yeah, AI is shit and a massive waste of energy, but it’s NOTHING compared to the energy usage of the airline industry.

      • Log in | Sign up@lemmy.world · 1 day ago

        I checked. The IEA says airlines generate about a gigaton of CO2 a year, and that figure is still growing after the COVID dip, which is perhaps where your infographic’s authors got their screwy figures, which are, like I suggested, off by an order of magnitude.

      • Log in | Sign up@lemmy.world · 1 day ago

        Picked at random, it also claims this:

        Why does nighttime AI use burn dirtier energy?
        Fossil fuel dominance: Coal and gas supply up to 90% of overnight electricity.
        Solar drop-off: Solar disappears after sunset, while wind delivers only ~30% capacity at night.
        Peak carbon hours: Between 2–4 AM, grid intensity rises to 450–650 gCO₂/kWh, compared to 200–300 gCO₂/kWh in the afternoon.

        This is complete bullshit in the UK, where energy is greenest in the small hours of the night, when demand is low and the wind turbines are still turning. The least green and most expensive period is late afternoon and evening, when energy usage spikes.

        Let me reiterate. AI is crap. AI is a massive waste of energy, but your website’s calculations are off by an order of magnitude when it comes to comparing the airline industry pushing tons of metal fast and hard into and through the sky with AI pushing a bunch of electrons through a bunch of transistors. Seriously, way off.

  • NewNewAugustEast@lemmy.zip · 1 day ago

    Well, what you said is not true, but since you are so interested in this, why limit it to AI? Just quit using computers altogether.

      • NewNewAugustEast@lemmy.zip · 1 day ago

        They said that AI is polluting worse than global air travel. They are mixing up pollution and energy use. Going by pollution, global air travel creates about 80 million tons of CO2 a month, while all AI in use creates about 15 million tons a month. Global air travel is far more polluting.

        As an aside, and this is crazy: there is a reference in the article OP posted to a paper suggesting that humans are far worse than AI for CO2 creation, depending on the task. Which I found surprising.

        So I read the published paper in the journal Nature and:

        Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts. Emissions analyses do not account for social impacts such as professional displacement, legality, and rebound effects. In addition, AI is not a substitute for all human tasks. Nevertheless, at present, the use of AI holds the potential to carry out several major activities at much lower emission levels than can humans.

        Ok I honestly did not see that coming.

      • NewNewAugustEast@lemmy.zip · 1 day ago

        Thanks for clarifying. You made up statistics; your post is nonsense.

        And responding without any consideration of the fact that our constant reliance on computers in general, AI or not, uses a HUGE amount of energy indicates that you simply want to tilt at windmills and not have a conversation. Well played.

        HurrDeeeDurrrr indeed. Next time let the grown-ups talk.

          • NewNewAugustEast@lemmy.zip · 1 day ago

            Argh! I didn’t mean to delete what I wrote; I was just trying to update it.

            You confused energy use with pollution.

            And what I wrote before was:

            I basically said that I was serious: if people cared, they would stop using computers. But I am not going to stop, you are not going to stop, so data centers are going to grow no matter what we do, and computing is going to increase energy consumption. We need to (as even the article you posted says in its links) improve efficiency, get better hardware, use lower-cost training models, use energy recovery, and stop using lossy evaporative cooling.

  • boovard@lemmy.world · 1 day ago

    I barely ever used it, for just that reason and because the algorithms are getting worse by the day. But now my work is forcing us to use it. To increase productivity, you see…

  • surph_ninja@lemmy.world · 2 days ago

    A lot of the studies they list are already years out of date and irrelevant. The models are much more efficient now, and it’s mainly the Musk-owned AI data centers that are heavy polluters. Most of the pollution from the majority of data centers is not from AI but from other uses.

    The old room-sized ENIAC computers used 150-200 kW of power and couldn’t do even a fraction of what your smartphone can do. The anti-AI people are taking advantage of most people’s ignorance, intentionally using outdated studies and implying that the power usage will continue to grow, when in fact it has already shrunk dramatically.

    • Reygle@lemmy.worldOP · 1 day ago

      A phone can’t do anything on its own. It sends and receives, and the data center does the work. Surely everyone understands this.

      Modern AI data centers have already shot right past 200 terawatt-hours and are on track to double again in the next two years.

      People can’t be this blind.

      • AeonFelis@lemmy.world · 1 day ago

        A phone can do a lot, much, much more than an ENIAC-era supercomputer (I think you have to get pretty close to the end of the previous century to find a supercomputer more powerful than a modern smartphone).

        What a phone can’t do is run an LLM. Even powerful gaming PCs struggle with that: they can only run the less powerful models, and queries that would feel instant on service-based LLMs take minutes - or at least tens of seconds - on a single consumer GPU. Phones certainly can’t handle that, but that doesn’t mean they “can’t do anything”.

      • surph_ninja@lemmy.world · 1 day ago

        LoL. Guess I can just get rid of my phone’s processor then, huh?

        And again, you link an image from an outdated study, because the new data shows usage declining and that wouldn’t help your fear mongering.

  • melsaskca@lemmy.ca · 2 days ago

    I did some research, and according to some AIs this is true. According to some other AIs this is false.

    • MangoCats@feddit.it · 1 day ago

      The statement strikes me as overblown, extreme position-staking.

      I use AI in my work, not every day, not even every week, but once in a while I’ll run 20-30 queries in a multi-hour session. At the estimated 2 Wh per query, that puts my long day of AI code work at 60 Wh.

      By comparison, driving an electric car consumes approximately 250 Wh per mile. So… my evil day spent coding with AI has burned as much energy as a 1/4 mile of driving a relatively efficient car, something that happens every 15 seconds while cruising down the highway…

      In other words, my conscience is clear about my personal AI energy usage, and my $20/month subscription fee would seem to amply pay for all the power consumed and then some.

      Now, if you want to talk about the massive data mining operations taking place at global multinational corporations, especially those trawling the internet to build population profiles for their own advantage and profit… that’s a very different scale than one person tapping away at a keyboard. Do they scale up to the same energy usage as the 12 million gallons of jet fuel burned hourly by the air travel (and cargo) industries? Probably not yet.

      At 9.6 kWh of energy in a gallon of jet fuel, jet fuel consumption alone works out to over 115 gigawatts of continuous power, 24/7/365.
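
      For anyone who wants to check that arithmetic, here is a minimal back-of-envelope sketch using only the figures above (the 2 Wh/query and 250 Wh/mile numbers are the estimates I quoted, not measurements):

      ```python
      # Back-of-envelope check of the figures in this comment.
      queries = 30            # a long AI coding session
      wh_per_query = 2        # estimated energy per query
      session_wh = queries * wh_per_query                # 60 Wh

      ev_wh_per_mile = 250    # approximate EV consumption
      print(f"Session: {session_wh} Wh ≈ {session_wh / ev_wh_per_mile:.2f} miles of EV driving")

      gallons_per_hour = 12e6  # jet fuel burned hourly by air travel/cargo (figure above)
      kwh_per_gallon = 9.6     # energy content of a gallon of jet fuel
      avg_gw = gallons_per_hour * kwh_per_gallon / 1e6   # GWh per hour == average GW
      print(f"Jet fuel: ~{avg_gw:.0f} GW of continuous power")
      ```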

        • MangoCats@feddit.it · 11 hours ago

          I hope your recycling is net carbon-neutral as well. Example: how much CO2 is released by a recycling program that sends big diesel trucks all over the city to collect recyclables (including cardboard), sorts that cardboard at a facility, ships a small fraction of it to a pulp recycling facility, and makes recycled cardboard from the post-consumer pulp? Consider the alternative: torching the cardboard at the endpoint of use - direct conversion to CO2 without the additional steps.

          Don’t forget: cardboard made new from pulpwood also contributes to (temporary) carbon capture by growing the pulpwood trees, which also provide groundwater recharge and wildlife habitat on the pulpwood tree farms - instead of the pavement, concrete, steel, electricity and fuel consumption of the recycling process.

  • melfie@lemy.lol · 2 days ago

    I have started using Copilot more lately, but I’ve also switched from plastic straws to paper, so I’m good, right?

  • Wildmimic@anarchist.nexus · 2 days ago

    OP, this statement is bullshit. You can do about 5 million requests for ONE flight.

    I’m gonna quote my old post:

    I had a discussion about generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation came out to the yearly usage of ChatGPT being approximately 0.0017% of our CO2 reduction during the COVID lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production) and meat.

    The exact energy costs are not published, but 3 Wh per request for ChatGPT-4 is the upper limit from what we know (and that’s in line with the approximate power consumption of my graphics card when running an LLM). Since Google uses it for every search, they have probably optimized for their use case, and some sources cite 0.3 Wh per request for chatbots - it depends on what model you use. The training is a one-time cost, and for ChatGPT-4 it raises the maximum cost per request to 4 Wh. That’s nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - setting that against the massive usage makes clear that cutting back here achieves nothing for anyone interested in reducing climate impact, unless you also want to start scolding everyone who runs their microwave 10 seconds too long.

    Even compared to other online activities that use data centers, ChatGPT’s power usage is small change. If you use ChatGPT instead of watching Netflix, you actually save energy!

    Water is a similar story, although the positioning of data centers in the US sucks. The used water doesn’t disappear though - it’s mostly returned to the rivers or evaporated. Total water usage in the US is 58,000,000,000,000 gallons (220 trillion liters) per year. A ChatGPT request uses between 10 and 25 ml of water for cooling. A hamburger takes about 600 gallons of water. 2 trillion liters are lost to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
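
    To put those per-request figures in perspective, here is a minimal back-of-envelope sketch using only the numbers above (the microwave wattage is my own round assumption):

    ```python
    # Per-request energy and water, using the upper-bound figures quoted above.
    wh_per_request = 4         # ChatGPT-4 upper bound, including amortized training
    microwave_watts = 1000     # assumed typical microwave
    ml_water_per_request = 25  # upper end of the cooling-water estimate

    # One request is roughly this many seconds of running a microwave:
    print(f"{wh_per_request} Wh ≈ {wh_per_request / microwave_watts * 3600:.0f} s of microwave time")

    # Requests equivalent to the water footprint of one hamburger (~600 gallons):
    hamburger_ml = 600 * 3785  # ml per US gallon
    print(f"One hamburger ≈ {hamburger_ml / ml_water_per_request:,.0f} requests' worth of cooling water")
    ```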

    Read up here!

    • brucethemoose@lemmy.world · 1 day ago

      If you want to look at it another way: assume every single square inch of silicon from TSMC is Nvidia server accelerators/AMD EPYCs, every single one running AI at full tilt 24/7/365…

      Added up, it’s not that much power, or water.

      That’s unrealistic, of course, but that’s literally the physical cap of what humanity can produce at the moment.

    • Reygle@lemmy.worldOP · 1 day ago

      If you only include chatbots, your numbers look good. Sadly, reality isn’t limited to “chat bots”.

        • Reygle@lemmy.worldOP · 1 day ago

          Image/video generation and analysis (them scrubbing the entire public internet) consume far, far more than someone asking an AI “grok is this true”.

          • lets_get_off_lemmy@reddthat.com · 1 day ago

            Do you have a source for this claim? I see this report by Google and MIT Tech Review that says image/video generation does use a lot of energy compared to text generation.

            Taking the data from those articles, we get this table:

            | AI Activity | Source | Energy Use (per prompt) | Everyday Comparison |
            | --- | --- | --- | --- |
            | Median Gemini Text Prompt | Google Report | 0.24 Wh | Less energy than watching a 100W TV for 9 seconds. |
            | High-Quality AI Image | MIT Article | ~1.22 Wh | Running a standard microwave for about 4 seconds. |
            | Complex AI Text Query | MIT Article | ~1.86 Wh | Roughly equivalent to charging a pair of wireless earbuds for 2-3 minutes. |
            | Single AI Video (5-sec) | MIT Article | ~944 Wh (0.94 kWh) | Nearly the same energy as running a full, energy-efficient dishwasher cycle. |
            | “Daily AI Habit” | MIT Article | ~2,900 Wh (2.9 kWh) | A bit more than an average US refrigerator consumes in a full 24-hour period. |
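
            For anyone who wants to sanity-check the “Everyday Comparison” column, a minimal sketch (the appliance wattages are my own round assumptions, not figures from either article):

            ```python
            # Rough check of the comparisons above; 100 W TV and 1100 W microwave are assumptions.
            tv_watts, microwave_watts = 100, 1100

            print(f"0.24 Wh text prompt  ≈ {0.24 / tv_watts * 3600:.0f} s of TV")                # ~9 s
            print(f"1.22 Wh image prompt ≈ {1.22 / microwave_watts * 3600:.0f} s of microwave")  # ~4 s
            print(f"944 Wh video prompt  ≈ {944 / 1000:.2f} kWh, about one efficient dishwasher cycle")
            ```
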
            • MangoCats@feddit.it · 1 day ago

              Another way of looking at this: a “Daily AI Habit” in your table is about the same as driving a Tesla 10 miles, or a standard gas car about 3 miles.

              Edit: 4 AI videos, or a detour to take the scenic route home from work… about the same impact.
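
              Roughly how that works out, as a sketch (the 250 Wh/mile EV figure is from my comment above; the gas-car numbers are round assumptions):

              ```python
              # Convert a 2.9 kWh "Daily AI Habit" into miles driven.
              daily_habit_kwh = 2.9           # MIT figure from the table above
              ev_kwh_per_mile = 0.25          # ~250 Wh/mile for an efficient EV (assumption)
              gasoline_kwh_per_gallon = 33.7  # energy content of a gallon of gasoline
              gas_car_mpg = 30                # assumed average gas car

              print(f"EV:      ~{daily_habit_kwh / ev_kwh_per_mile:.0f} miles")                        # ~12 miles
              print(f"Gas car: ~{daily_habit_kwh / gasoline_kwh_per_gallon * gas_car_mpg:.1f} miles")  # ~2.6 miles
              ```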

              • lets_get_off_lemmy@reddthat.com · 1 day ago

                I like that as well, thank you! Yeah, the “Daily AI Habit” in the MIT article was described as…

                Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise.

                Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram.

                You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.

                As a daily AI user, I almost never use image or video generation; it is basically all text (mostly in the form of code), so I don’t think this daily habit fits most people who use it every day, but that was their metric.

                The MIT article also mentions that we shouldn’t try to reverse-engineer energy usage numbers and that we should instead encourage companies to release data, because the estimates are invariably going to be off. Google’s technical report affirms this: it shows that non-production estimates of AI energy usage are overestimates, because of the economies of scale that a production system is able to achieve.

                Edit: more context: my daily AI usage, on the extremely, extremely high end, is let’s say 1,000 median text prompts from a production-level AI provider (code editor, chat window, document editing). That’s equivalent to watching TV for 36 minutes. The average daily TV consumption in the US is around 3 hours.
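
                Plugging the per-prompt figures from the table above into that fundraiser scenario roughly reproduces the 2.9 kWh total; a minimal sketch (the 15/10/3 counts come straight from the MIT description):

                ```python
                # Reconstruct the MIT "Daily AI Habit" total from the per-prompt figures above.
                wh_text = 1.86    # complex text query (MIT)
                wh_image = 1.22   # high-quality image (MIT)
                wh_video = 944    # five-second video (MIT)

                total_wh = 15 * wh_text + 10 * wh_image + 3 * wh_video
                print(f"{total_wh:.0f} Wh ≈ {total_wh / 1000:.1f} kWh")  # ~2872 Wh ≈ 2.9 kWh
                ```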

      • brucethemoose@lemmy.world · 1 day ago

        I’m not sure what you’re referencing. Imagegen models are not much different, especially now that they’re moving to transformers/MoE. Video-gen models are chunky indeed, but they’re more rarely used and usually have much smaller parameter counts.

        Basically anything else in machine learning is at least an order of magnitude less energy.

  • ayyy@sh.itjust.works · 2 days ago

    Your article doesn’t even claim that. Do you have any idea just how carbon-intensive a flight is?

    • interdimensionalmeme@lemmy.ml · 2 days ago

      I imagine people making that claim accept air travel as useful and “AI” - really, all data centers - as not useful. I’ve had people tell me, oh, air travel is more efficient per mile than road travel. But this ignores that people wouldn’t travel thousands of miles if it were not as easy as booking a flight.

  • AndiHutch@lemmy.zip · 2 days ago

    It also pollutes the minds of ignorant people with misinformation. Not that that is anything new, but I do think objective truth is very important in a democratic society. It reminds me of that video that used to go around showing some 20 different Sinclair Broadcasting ‘local’ news anchors all repeating the same words verbatim. It ended with ‘This is extremely dangerous to our democracy.’ With AI being added to all the search engines, it is really easy to look something up and unknowingly get bombarded with false info pulled out of the dregs of the internet. 90% of people don’t verify the answer to see if it is based in reality.