Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually.
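
As a rough sanity check on that comparison, the sketch below works backwards from the article’s own figures; the per-home heating energy is an illustrative assumption, not a figure from the article.

```python
# Back-of-envelope check of the Despacito comparison above.
# The 5bn views and 40,000 homes come from the text; the per-home
# heating energy is an assumption for illustration only.
VIEWS = 5_000_000_000
HOMES = 40_000
KWH_HEATING_PER_HOME_PER_YEAR = 12_000  # assumed; varies widely by region

total_kwh = HOMES * KWH_HEATING_PER_HOME_PER_YEAR   # ~480m kWh
wh_per_view = total_kwh / VIEWS * 1_000              # ~96 Wh per view

print(f"Implied total energy: {total_kwh / 1e6:,.0f} GWh per year")
print(f"Implied energy per view: {wh_per_view:.0f} Wh")
```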

Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities.

Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.

Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. Extracting these minerals often involves significant water usage and can lead to pollution, undermining water security; it is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal, limiting our dependence on fossil fuels, can compromise another: ensuring everyone has a safe and accessible water supply.

Moreover, when significant energy resources are allocated to tech-related endeavours, the result can be energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.

In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.

  • dustyData@lemmy.world · 6 months ago

    AI is on a completely different level of energy consumption. Consider that Sam Altman, of OpenAI, is investing in nuclear power plants to directly feed their next iterations of AI models. That’s a whole-ass nuclear reactor to feed one AI model, because the amount of energy we currently generate falls several orders of magnitude short of what they want. We are struggling to feed these monsters; it is nothing like how supercomputers tax the grid.

    • 𝚜𝚑𝚊𝚍𝚎𝚊𝚛𝚐@lemmy.world · 6 months ago

      Supercomputers were feared to be untenable resource consumers then, too.

      Utilizing nuclear to feed AI may be the responsible and sustainable option, but there’s a lot of FUD surrounding all of these things.

      One thing is certain: Humans (and now AI) will continue to advance technology, regardless of consequence.

      • dustyData@lemmy.world · 6 months ago

        Would you kindly find a source for that? Supercomputers run discrete analyses or processes, then halt. The big problem with these LLMs is that they run as online services that have to be on all the time to chat with millions of users. The fact that they’re never turned off is the marked difference. As far as I recall, supercomputers have always been about power efficiency, and I don’t ever recall anyone suggesting we plug one into a nuclear reactor just to run it. Power consumption has never been the most important concern even for exaflops supercomputers.

        Another factor is that there aren’t that many supercomputers in the world, just a few thousand of them. Meanwhile it takes about that same number of servers, which are less energy efficient and run 24/7 all year, just to keep one LLM service up and available to the public with five nines of availability. That alone outstrips even the most power-hungry supercomputers in the world. (A rough back-of-envelope comparison follows at the end of this thread.)

        • 𝚜𝚑𝚊𝚍𝚎𝚊𝚛𝚐@lemmy.world · 6 months ago

          > Would you kindly find a source for that?

          I can personally speak from the 80s, so that’s not exactly a golden age of reliable information. There was concern about the scale of infinite growth and the power requirements of a perpetual 24/7 full-load timeshare, voiced by people who were almost certainly not qualified to talk about the subject.

          I was never concerned enough to look into it, but I sure remember the FUD: “They are going to grow to the size of countries!” - “They are going to drink our oceans dry!” … Like I said, unqualified people.

          > Another factor is that there aren’t that many supercomputers in the world, just a few thousand of them.

          They never took off the way people feared. We don’t even concern ourselves with their existence.

          Edit: grammar

          • dustyData@lemmy.world · 6 months ago

            For what it’s worth, this time around it isn’t unqualified people. There are strong, scientifically studied concerns, not about infinite growth of LLMs, but that their current numbers are already too power hungry. And the plans that are actually in the engineering pipeline are too much as well; not wild speculation, but funded development that is already under way.

            • 𝚜𝚑𝚊𝚍𝚎𝚊𝚛𝚐@lemmy.world · 6 months ago

              I am concerned about the energy abuse of LLMs, but it gets worse. AGI is right around the corner, and I fear that the law of diminishing returns may not apply because of the advantages it will bring. We’re in need of new, sustainable energy like nuclear now, because it will not stop.
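
To put illustrative numbers on the supercomputer-versus-LLM-fleet comparison raised earlier in the thread, here is a minimal back-of-envelope sketch; the fleet size, per-server power draw, supercomputer power and utilisation are all assumptions, not figures from the discussion.

```python
# Rough comparison of an always-on LLM serving fleet with a single large
# supercomputer. Every number here is an illustrative assumption.
HOURS_PER_YEAR = 8_760

# Assumed LLM serving fleet: a few thousand GPU servers running 24/7.
FLEET_SERVERS = 3_000
KW_PER_SERVER = 10               # assumed draw of one multi-GPU inference server

# Assumed supercomputer: a ~20 MW machine running batch jobs most of the year.
SUPERCOMPUTER_MW = 20
SUPERCOMPUTER_UTILISATION = 0.7  # assumed fraction of the year under full load

fleet_gwh = FLEET_SERVERS * KW_PER_SERVER * HOURS_PER_YEAR / 1e6
super_gwh = SUPERCOMPUTER_MW * 1_000 * HOURS_PER_YEAR * SUPERCOMPUTER_UTILISATION / 1e6

print(f"Assumed LLM serving fleet: {fleet_gwh:.0f} GWh/year")
print(f"Assumed supercomputer:     {super_gwh:.0f} GWh/year")
```

Under these assumptions the always-on fleet draws roughly twice the annual energy of the single supercomputer, which is the shape of the argument made above, though the real figures depend entirely on fleet size and hardware.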