- cross-posted to:
- technology@lemmy.ml
Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse emissions than commercial flights. In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually.
Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained ChatGPT-3 at Microsoft’s data facilities.
Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.
Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal of limiting our dependence on fossil fuels can compromise another goal, of ensuring everyone has a safe and accessible water supply.
Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.
In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.
New technologies will sometimes need more energy. That’s hardly news. If we continue to switch to renewables, the impact will also be small. AI isn’t even listed as its own point, heck, it is not even listed in most energy budgets, yet it sounds like there will be no energy left for the rest, which is laughable, since it likely uses around 1% of the energy (it’s estimated at around 2% for IT in general)
This article may as well be trying to argue that we’re wasting resources by using “cloud gaming” or even by gaming on your own PC.
deleted by creator
Gaming actually provides a real benefit for people, and resources spent on it mostly linearly provide that benefit (yes some people are addicted or etc, but people need enriching activities and gaming can be such an activity in moderation).
AI doesn’t provide much benefit yet, outside of very narrow uses, and its usefulness is mostly predicated on its continued growth of ability. The problem is that pretrained transformers have stopped seeing linear growth with injection of resources, so either the people in charge admit it’s all a sham, or they push non-linear amounts of resources at it hoping to fake growing ability long enough to achieve a new actual breakthrough.
Do you never play games with bots? Those are AI.
I’m going to assume that when you say “AI” you’re referring to LLMs like ChatGPT. Otherwise I can easily point to tons of benefits that AI models provide to a wide variety of industries (and that are already in use today).
Even then, if we restrict your statement to LLMs, who are you to say that I can’t use an LLM as a dungeon master for a quick round of DnD? That has about as much purpose as gaming does, therefore it’s providing a real benefit for people in that aspect.
Beyond gaming, LLMs can also be used for brainstorming ideas, summarizing documents, and even for help with generating code in every programming language. There are very real benefits here and they are already being used in this way.
And as far as resources are concerned, there are newer models being released all the time that are better and more efficient than the last. Most recently we had Llama 3 released (just last month), so I’m not sure how you’re jumping to conclusions that we’ve hit some sort of limit in terms of efficiency with resources required to run these models (and that’s also ignoring the advances being made at a hardware level).
Because of Llama 3, we’re essentially able to have something like our own personal GLaDOS right now: https://www.reddit.com/r/LocalLLaMA/comments/1csnexs/local_glados_now_running_on_windows_11_rtx_2060/
It isn’t resource efficient, simple as that. Machine learning isn’t something new; it has indeed been used for decades in one form or another. But here is the thing: when you train a model to do one task well, you can approximate the learning time and the quality of its data analysis, say, automating the process of setting the price you charge for your hotel apartments to maximize sales and profits. When you don’t even know what it can do, and you don’t even use a bit of its potential, when your learning material is whatever you dared to scrape and resources aren’t a question, well, you dance and jump over the fire in the bank’s vault. LLMs of the ChatGPT variety don’t have a purpose or a problem to solve; we come up with those after the fact, and although it’s thrilling to explore what else they can do, it’s a giant waste*. Remember blockchain and how everyone was trying to put it somewhere? LLMs are the same. There are niche uses that would evolve or stay as they are, completely out of the picture, while hyped-up examples will grow old and die off unless they find their place. And, currently, there’s no application in which I can bet my life on an LLM’s output. Cheers to you if you’ve found where to put it to work, as I haven’t, and I’ve grown irritated over seeing this buzzword everywhere.
* What I find the most annoying about them is that they are natural monopolies, given the resources you need to train them to the Bard/Bing level. If they get inserted into every field within a decade, it means the LLM providers would have power over everything. The Russian Kandinsky AI stopped showing Putin and the war in a bad light, for example; OpenAI’s chatbot may soon stop drawing Sam Altman getting pegged by a shy time-traveler Mikuru Asahina; and what if there were other non-obvious cases where the provider of a service just decides to exclude X from the output, like flags or mentions of Palestine or Israel? If you aren’t big enough to train a model for your own needs yourself, you come under their reign.
That is a good argument, they are natural monopolies due to the resources they need to be competitive.
Now do we apply this elsewhere in life? Is anyone calling for Boeing to be broken up or Microsoft to be broken up or Amazon to be broken up or Facebook?
We are missing big time on breaking them into pieces, yes. No argument. There’s something wrong if we didn’t start that process a long time ago.
Ok, first off, I’m a big fan of learning new expressions where they come from and what they mean (how they came about, etc). Could you please explain this one?:
well, you dance and jump over the fire in the bank’s vault.
And back to the original topic:
It isn’t resource efficient, simple as that.
It’s not that simple at all and it all depends on your use case for whatever model you’re talking about:
For example I could spend hours working in Photoshop to create some image that I can use as my Avatar on a website. Or I can take a few minutes generating a bunch of images through Stable Diffusion and then pick out one I like. Not only have I saved time in this task, but I have used less electricity.
In another example I could spend time/electricity to watch a Video over and over again trying to translate what someone said from one language to another, or I could use Whisper to quickly translate and transcribe what was said in a matter of seconds.
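For concreteness, that Whisper step is only a few lines. A minimal sketch using the open-source openai-whisper package; the model size and file name are just placeholders:

```python
import whisper  # pip install openai-whisper

# "base" is a small, fast model; "medium"/"large" trade speed for accuracy.
model = whisper.load_model("base")

# task="translate" asks Whisper to translate the speech into English while
# transcribing; "interview.mp4" is just an example file name.
result = model.transcribe("interview.mp4", task="translate")
print(result["text"])
```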
On the other hand, there are absolutely use cases where using some ML model is incredibly wasteful. Take, for example, a rain sensor on your car. Now, you could set up some AI model with a camera and computer vision to detect when to turn on your windshield wipers. But why do that when you could use a little sensor that shoots a small laser at the window and, when it detects a difference in the energy that’s normally reflected back, activates the windshield wipers? The dedicated sensor with a low-power laser will use far less energy and be way more efficient for this use case.
Cheers on you if you found where to put it to work as I haven’t and grown irritated over seeing this buzzword everywhere.
Makes sense, so many companies are jumping on this as a buzzword when they really need to stop and think if it’s necessary to implement in the first place. Personally, I have found them great as an assistant for programming code as well as brainstorming ideas or at least for helping to point me in a good direction when I am looking into something new. I treat them as if someone was trying to remember something off the top of their head. Anything coming from an LLM should be double checked and verified before committing to it.
And I absolutely agree with your final paragraph, that’s why I typically use my own local models running on my own hardware for coding/image generation/translation/transcription/etc. There are a lot of open source models out there that anyone can retrain for more specific tasks. And we need to be careful because these larger corporations are trying to stifle that kind of competition with their lobbying efforts.
Otherwise I can easily point to tons of benefits that AI models provide to a wide variety of industries
Go ahead and point. I’m going to assume when you say “AI” that you mean almost anything except actual intelligence.
I think you’re confusing “AI” with “AGI”.
“AI” doesn’t mean what it used to and if you use it today it encompasses a very wide range of tech including machine learning models:
Speech to text (STT), text to speech (TTS), Generative AI for text (LLMs), images (Midjourney/Stable Diffusion), audio (Suno). Upscaling, Computer Vision (object detection, etc).
But since you’re looking for AGI there’s nothing specific to really point at since this doesn’t exist.
Edit: typo
Speech to text (STT), text to speech (TTS), Generative AI for text (LLMs), images (Midjourney/Stable Diffusion), audio (Suno). Upscaling, Computer Vision (object detection, etc).
Yes, this is exactly what I meant. Anything except actual intelligence. Do bosses from video games count?
I think it’s smart to shift the conversation away from AI to ML, but that’s part of my point. There is a huge gulf between ML and AGI that AI purports to fill but it doesn’t. AI is precisely that hype.
If “AI doesn’t mean what it used to”, what does it mean now? What are the scientific criteria for this classification? Or is it just a profitable buzzword that can be attached to almost anything?
But since you’re looking for AGI there’s nothing specific to really point at since this doesn’t exist.
Yes, it doesn’t exist.
I mean, that’s kind of the whole point of why I was trying to nail down what the other user meant when they said “AI doesn’t provide much benefit yet”.
The definition of “AI” today is way too broad for anyone to make statements like that now.
And to make sure I understand your question, are you asking me to provide you with the definition of “AI”? Or are you asking for the definition of “AGI”?
Do bosses from video games count?
Count under the broad definition of “AI”? Yes, when we talk about bosses from video games we talk about “AI” for NPCs. And no, this should not be lumped in with any machine learning models unless the game devs created a model for controlling that NPC’s behaviour.
In either case our current NPC AI logic should not be classified as AGI by any means (which should be implied since this does not exist as far as we know).
You read too many headlines and not enough papers. There is a massive list of advancements that AI has brought about. Hell, there is even a massive list of advancements that you personally benefit from daily. You might not realize it, but you are constantly benefiting from super-efficient methods of matrix multiplication that AI has discovered. You benefit from drugs that have been discovered by AI. Guess what has made Google the top search engine for 20 years? AI efficiency gains. The list goes on and on…
People in this thread think AI is just the funny screenshot they saw on social media and concluded that they are smart and AI is dumb.
Absolutely. I am surprised, I would expect more from people who would end up at a site like this.
AI doesn’t provide much benefit yet
Lol
I don’t understand how you can argue that gaming provides a real benefit, but AI doesn’t.
If gaming’s benefit is entertainment, why not acknowledge that AI can be used for the same purpose?
There are other benefits as well – LLMs can be useful study tools, and can help with some aspects of coding (e.g., boilerplate/template code, troubleshooting, etc).
If you don’t know what they can be used for, that doesn’t mean they don’t have a use.
LLMs help with coding? In any meaningful way? That’s a great giveaway that you’ve never actually produced and released any real software.
FWIW I do that all the time, it’s helpful for me too.
I gave up on ChatGPT for help with coding.
But a local model that’s been fine-tuned for coding? Perfection.
It’s not that you use the LLM to do everything, but it’s excellent for pseudo code. You can quickly get a useful response back about most of the same questions you would search for on stack overflow (but tailored to your own code). It’s also useful for issues when you’re delving into a newer programming language and trying to port over some code, or trying to look at different ways of achieving the same result.
It’s just another tool in your belt, nothing that we should rely on to do everything.
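To give an idea of what I mean by a local setup, here’s a minimal sketch using llama-cpp-python; the model path is hypothetical, point it at whatever code-tuned GGUF file you’ve actually downloaded:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Hypothetical path; any locally downloaded, code-tuned GGUF model works the same way.
llm = Llama(model_path="./models/codellama-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

prompt = "Write a Python function that returns True if a string is a palindrome."
out = llm(prompt, max_tokens=256, temperature=0.2)

# llama-cpp-python returns an OpenAI-style completion dict.
print(out["choices"][0]["text"])
```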
If gaming’s benefit is entertainment, why not acknowledge that AI can be used for the same purpose?
Ah yes the multi-billion dollar industry of people reading garbage summaries. Endless entertainment.
Ah yes the multi-billion dollar industry of people reading garbage summaries. Endless entertainment.
See, I’m not even sure if you’re criticizing LLMs or modern journalism…lmao
Unfortunately, they seem to be one and the same these days.
Yeah, it is a bit weak on the arguments, as it doesn’t seem to talk about trade-offs?
The forefront of technology overutilizes resources?
Always has been.
AI is on a completely different level of energy consumption. Consider that Sam Altman, of OpenAI, is investing in nuclear power plants to directly feed their next iterations of AI models. That’s a whole-ass nuclear reactor to feed one AI model. Because the amount of energy we currently generate falls several orders of magnitude short of what they want. We are struggling to feed these monsters; it is nothing like how supercomputers tax the grid.
Supercomputers were feared to be untenable resource consumers then, too.
Utilizing nuclear to feed AI may be the responsible and sustainable option, but there’s a lot of FUD surrounding all of these things.
One thing is certain: Humans (and now AI) will continue to advance technology, regardless of consequence.
Would you kindly find a source for that? Supercomputers run discrete analyses or processes and then halt. The big problem with these LLMs is that they run as online services that have to be on all the time to chat with millions of users online. The fact they’re never turned off is the marked difference. As far as I recall, supercomputers have always been about power efficiency, and I don’t ever recall anyone suggesting plugging one into a nuclear reactor just to run it. Power consumption has never been the most important concern even for exaflops supercomputers.
Another factor is that there aren’t that many supercomputers in the world, a few thousand of them. Meanwhile it takes that same number of servers, which are less energy efficient and run 24/7 all year, to keep an LLM service up and available to the public with five nines. That alone outstrips even the most power-hungry supercomputers in the world.
Would you kindly find a source for that?
I can personally speak from the 80s, so that’s not exactly a golden age of reliable information. There was concern about scale of infinite growth and power requirements in a perpetual 24/7 full-load timeshare by people that were almost certainly not qualified to talk about the subject.
I was never concerned enough to look into it, but I sure remember the FUD: “They are going to grow to the size of countries!” - “They are going to drink our oceans dry!” … Like I said, unqualified people.
Another factor is that there aren’t that many supercomputers in the world, a handful of thousand of them.
They never took off like the concerned feared. We don’t even concern ourselves with their existence.
Edit: grammar
For what it’s worth, this time around it isn’t unqualified people. There are strong, scientifically studied concerns: not that LLMs will grow infinitely, but that their current numbers are already too power hungry. And the plans actually in the engineering pipeline are too much as well; that’s not wild speculation, but funded development already on the way.
I am concerned about the energy abuse of LLMs, but it gets worse. AGI is right around the corner, and I fear that the law of diminishing returns may not apply due to the advantages it will bring. We’re in need of new, sustainable energy like nuclear now, because it will not stop.
The difference is that supercomputers by and large actually help humanity. They do things like help predict severe weather, help us understand mathematical problems, understand physics, develop new drug treatments, etc.
They are also primarily owned and funded by universities, scientific institutions, and public funding.
The modern push for ubiquitous corpo cloud platforms, SaaS, and AI training has resulted in massive pollution and environmental damage. For what? Mostly to generate massive profits for a small number of mega-corps, high level shareholders and ultra wealthy individuals, devalue and layoff workers, collect insane amounts of data to aid in mass surveillance and targeted advertising, and enshitify as much of the modern web as possible.
All AI research should be open source, federated, and accountable to the public. It should also be handled mostly by educational institutions, not for-profit companies. There should be no part of it that is allowed to be closed source or proprietary. No government should honor any copyright claims or cyber law protecting companies’ rights to not have their software hacked, decompiled, and code spread across the web for all to see and use as they see fit.
While I absolutely agree with everything you’ve stated, I’m not taking a moral position here. I’m just positing that the same arguments of concern have been on the table since the establishment of massive computational power regardless of how, or by whom, it was to be utilized.
Not unlike the species of its creators, go figure.
ITT hella denialism.
Nothing like the good old magical-thinking-from-guys-who-love-logic.
Believing oneself to be the rational one in life continues to sadly be the origin of so many blind spots in people’s thinking.
It is a little scary. Machine learning / LLMs consume insane amounts of power, and it’s happening right in front of everyone’s eyes.
I was shocked a few months ago to learn that the Internet, including infrastructure and end-user devices, already consumed 30% of world energy production in 2018. We are not only digging our grave, but doing it ever faster.
Now look into animal farming!
Seriously, though, our population growth rates are unsustainable, and we’d really better get on board with nuclear power soon.
I already looked into it; I choose to be vegetarian.
Nuclear power plants are a patch to the bigger issue, the idea of infinite progress. We need to reduce consumption.
The Sam Altman fans also say that AI would solve climate change in a jiffy. Problem is, we already have all the tech we need to solve it. We lack the political will to do it. AI might be able to improve our tech further, but if we lack the political will now, then AI’s suggestions aren’t going to fix it. Not unless we’re willing to subsume our governmental structures to AI. Frankly, I do not trust Sam Altman or any other techbro to create an AI that I would want to be governed by.
What we end up with is that while AI might improve things, it almost certainly isn’t worth the energy being dumped into it.
You know I have never once heard anyone saying what you are saying that they are. I personally think it would be better for us to address bad arguments that are being made instead of ones we wish existed solely so we can argue with them.
You mean the ones Sam Altman is actually saying?
Claim:
"The Sam Altman fans also say that AI would solve climate change in a jiffy. "
What he said:
"If we spend 1% of the world’s electricity training powerful AI, and that AI does figure out how to get (to carbon goals) that would be a massive win, (especially) if that 1% lets people live their lives better.”
Were you just assuming I would take you at your word?
Check my edit in the post above, made over an hour before you posted this.
Actually made after I posted that. Why do you keep lying? It’s messed up. This is low stakes internet comments.
And no he didn’t say what you swore he said.
Frankly, I do not trust Sam Altman or any other techbro to create an AI that I would want to be governed by.
“Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.”
~ Frank Herbert, Dune
Thing is, I could maybe be convinced that a sufficiently advanced AI would run society in a more egalitarian and equitable way than any existing government. It’s not going to come from techbros, though. They will 100% make an AI that favors techbros.
I agree that these arguments are stupid, but is anyone actually saying we should do those things?
No one is.
Yes, Sam Altman himself.
Seems he didn’t say what you said he did. Why did you lie?
Why do you keep embarrassing yourself?
Poisoning the well. You can admit your lies, btw.
What is this even? Batteries for UPS in a datacenter wouldn’t be a patch on even a few days of production of EVs, water isn’t being shipped from “drier parts of the world” to cool datacenters, and even if it were, it’s not gone forever once it’s used to cool server rooms.
Absolutely, AI and crypto are a blight on the energy usage of the world and that needs to be addressed, but things like above just detract from the real problem.
The water is because datacenters have been switching to evaporative cooling to save energy. It does save energy, but at the cost of water. It doesn’t go away forever, but a lot of it does end up raining down on the ocean, and we can’t use it again without desalination and using even more energy.
That may all be true, but the amount of water used by these data centers is miniscule, and it seems odd to focus on it. The article cites Microsoft using 700,000 liters for ChatGPT. In comparison, a single fracking well in the same state might use 350,000,000 liters, and this water is much more contaminated. There are so many other, more substantive, issues with LLMs, why even bring water use up?
Edit: If evaporative cooling uses less energy it might even be reducing total industrial water use, considering just how much water is used in the energy industry.
a lot of it does end up raining down on the ocean, and we can’t use it again without desalination
Where do you think rain comes from? Why do hurricanes form over the ocean?
Dude, please. If things just worked out like that, we wouldn’t have water issues piling up with the rest of our climate catastrophe.
No no they’ve got a point. Everyone knows that the invisible hand of the free market and the invisible hand of the replenishing water table just reach out, shake hands, and agree to work it all out.
Rainforests. Like the Amazon that is being deforested obscenely in some areas
Love how we went from “AI needs to be controlled so it doesn’t turn everything into paperclips” to “QUICK, WE NEED TO TURN THE PLANET INTO PAPERCLIPS TO GET THIS AI TO WORK!!”
AI companies*
We all know this, and we all know the “ai” they have right now is anything but that.
But these companies are making billions from this gold rush hype, so they could give two shits about the planet
Oh no let’s build more gigantic server farms about it
There are layers of wrong and stupid to this article.
Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse emissions than commercial flights.
“The cloud” accounts for something like 80% of the internet across the entire planet. I’d be curious what 80% of transportation infrastructure would end up being in comparison… no takers? We’re only comparing to (some) flights instead of, I dunno, the vast bulk of our fossil fuel powered transport infra?
In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually.
Oh no, the most popular song in the world used the same amount of energy as 40k homes in the US. The US probably has something in the range of a hundred million homes. The efficiency of computing equipment increases by a sizable percentage every single year, with the odds being good the same data could be served at 1/20th the cost today. So why aren’t we talking about, say, heat pumps for those homes? You know, since they’re still using the same amount of energy they did in 2018?
…about 700,000 litres of water could have been used to cool the machines that trained ChatGPT-3… Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues…
What is this idiocy? You realize that a chip fab uses something to the tune of ten million gallons of water per day, right? Ten million. Per day. I’m not even looking at other industrial processes, which are almost undoubtedly worse (and recycle their water less than fabs) - but if you’re going to whine about the environmental impact of tech, maybe have a look at the manufacturing side of it.
Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals are also often linked to human rights violations and poor labour standards.
Man, we’re really grasping at straws here. More complaining about water usage, pollution, water security, labor standards, human rights violations… wait, were we talking about the costs of data centers or capitalism in general? Because I’m pretty sure these issues are endemic, across every industry, every country, maybe even our entire economic system. Something like a data center, which uses expensive equipment, likely has a lower impact on every single one of these measures than… I dunno… clothes? food? energy production? transport? Honestly guys, I’m struggling to think of an industry that has lower impact, help me out (genuine farm-to-table restaurants, maybe).
There are things to complain about in computing. Crypto is (at least for the time being) a Ponzi scheme built on wasting energy, social media has negative developmental/social effects, etc. But the environmental impact of stuff like data centers… it’s just not a useful discussion, and it feels like a distraction from the real issues on this front.
In fact I’d go further and say it’s actively damaging to publish attack pieces like these. The last few years I didn’t drive to the DMV to turn in my paperwork, I did it over the internet. I don’t drive to work because I’m fully remote since the pandemic, cutting my gas/car usage by easily 90%. I don’t drive to Blockbuster to pick out videos the way I remember growing up. The sheer amount of physical stuff we used to do to transmit information has been and is gradually all being transitioned to the internet - and this is a good thing. The future doesn’t have to be all bad, folks.
Thank you. The 700000 litres in particular pissed me off… that’s a 9 meter cube. Whoopdie doo
For comparison, a single hydraulically fractured oil well uses over 100 times as much water.
C’mon, outside of ol’ Bitcoin, my freedom-of-money networks are a drop in the bucket.
The reason the article compares to commercial flights is that your everyday reader knows planes’ emissions are large. It’s a reference point so people can weigh the ecological tradeoff.
“I can emit this much by either (1) operating the global airline network, or (2) training LLMs.” It’s a good way to visualize what sort of cost LLMs have without just citing tons-of-CO2/yr.
Downplaying that by insisting we look at the transportation industry as a whole doesn’t strike you as… a little silly? We know transport is expensive; It is moving tons of mass over hundreds of miles. The fact computer systems even get close is an indication of the sheer scale of energy being poured into them.
and recycle their water less than fabs
Which is actually a very good idea economics-wise but fabs didn’t care much for the longest time because while crucial it’s still a minor part of their operating infrastructure. They had bigger fish to fry.
The thing is if you clean a wafer with ultrapure water, the resulting waste water might have some nasty stuff in it… but tap water has more stuff in it, just not as nasty. They generally need to process the waste water to be environmentally safe, anyway, doesn’t take much to feed it back into the cycle and turn it into ultrapure, again.
Side note in case you’re wondering what it’s like to drink that kind of water: It’s basically a novel way to burn your tongue. The osmotic pressure due to the lack of minerals will burst cell membranes, but you’re not a microorganism so you’ll most likely be fine, and the load on your overall mineral stores is only marginally higher than when drinking ordinary water; we get the vast majority of our minerals from food.
But the environmental impact of stuff like data centers… it’s just not a useful discussion,
I’d say it is but more along the lines of feeding waste heat into district heating. Someone can shower with those CPU cycles.
Good assessment, thanks
with the odds being good the same data could be served at 1/20th the cost today
Gotta nitpick you there. According to Moore’s law (really more of a rule of thumb), the price of the silicon used to serve those videos should be about 1/16 of what it was in 2018. I’m not aware of any corresponding law that describes trends in energy consumption. It’s getting better for sure, but I’d be shocked if there was a 20x improvement in 6 years.
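Spelling that back-of-envelope out (just the doubling rule of thumb, counted from 2018):

```python
# Doubling every ~2 years vs. the more aggressive ~18-month reading,
# over the 6 years since the 2018 Despacito figure.
years = 6
print(2 ** (years / 2))    # 8.0  -> silicon cost per unit of compute ~1/8
print(2 ** (years / 1.5))  # 16.0 -> ~1/16 with 18-month doubling
# Either way, a 20x drop in energy per view in 6 years is a stretch.
```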
Goddamn what a beautiful comment, brings a tear to my eye
Wow, AI really will kill us, just not in the way anyone imagined
deleted by creator
AI Training is a flexible energy consumer, meaning it can be switched on and off at will, so that it can take advantage of excess solar power during the daylight, providing extra income to solar panel parks. The important thing to do is to install solar panels, and then AI training isn’t an environmental problem anymore.
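The scheduling side of that is trivial. A toy sketch, where solar_surplus_kw() and train_one_batch() are hypothetical stand-ins for a real grid/solar-park signal and a real checkpointable training loop:

```python
import random
import time

TRAINING_DRAW_KW = 500  # hypothetical power draw of the training cluster

def solar_surplus_kw() -> float:
    # Hypothetical stand-in for a live signal from the solar park or grid operator.
    return random.uniform(0, 1000)

def train_one_batch() -> None:
    # Hypothetical stand-in for one checkpointable training step.
    print("training one batch on surplus solar")

for _ in range(10):  # one scheduling decision per interval
    if solar_surplus_kw() >= TRAINING_DRAW_KW:
        train_one_batch()  # soak up the surplus
    else:
        print("paused: not enough surplus, waiting for sun or cheaper power")
        time.sleep(1)      # in reality you'd wait much longer between checks
```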
We already have a more elegant solution than training AI when solar arrays produce more electricity than the grid needs - batteries. It strikes me as a better option to save the energy for later use than to burn it off to train AI.
It looks like you and the commenter you replied to are talking about two different problems. You’re talking about what to do about excess solar energy, they are talking about how to power AI training in an environmentally-friendly way.
I would say that both are interesting proposals to look at. Of course, doing the math and crafting the best approach is work and takes time, and I can’t give many details in a lemmy comment.
Or dams
Heck yeah! Love me some pumped hydro
Yes it does, and wait until you hear about literally every other industry.
Difference is that AI is absolutely pointless lmao
This is the same excuse crypto bros make. Though that makes sense, because the Venn diagram of AI evangelists who blow up like the Hindenburg the moment you levy any critique against AI and its usage is basically a circle with the crypto bros who assure us that any day now it will stop being treated like penny stocks and actually be useful “because they just like the tech.”
You are on Lemmy, a decentralized and open platform. Cryptos are to money what Lemmy is to its centralized and proprietary counterparts.
Cryptocurrencies have no real world applications. AI does.
I agree it does. But that has nothing to do with how energy intensive it currently is. You can see in my other comment that I am an advocate for it in my own work - it has great uses in some industries.
We have to be critical of the resources it takes and the ways it is deployed. It’s the only way to improve it. Yet AI evangelists act like it’s already perfect and anybody who dares question the church of LLM is declared a Luddite.
AI evangelists act like it’s already perfect and anybody who dares question the church of LLM is declared a Luddite.
I don’t think that’s the case, though. The only people actively “evangelizing” LLMs are either companies looking for investors or “influencers” looking for attention by tapping on people’s insecurities.
Most people just either find it useful for some use cases or just hate it.
You’re doing it right now. You’re criticizing that user for saying it’s okay to talk about AI’s failures. You’re the example, evangelizing and shilling. My advice: STFU.
You’re doing it right now. You’re criticizing that user for saying it’s okay to talk about AI’s failures. You’re the example, evangelizing and shilling. My advice: STFU.
It seems like you missed the memo on reading comprehension. I literally quoted the exact part I’m criticizing, which clearly isn’t what you claimed.
And being overly emotional and telling people to STFU online? That’s a masterclass in civility right there.
Ohmahgosh you’re so right, I see it now, you telling them they were wrong to criticize AI was in fact the correct take all along. You’ve shown me the way, All Hail AI. ALL HAIL AI.
What a fucking shill.
That’s wrong, I buy drugs online with cryptocurrencies all the time to this day and have done it long before the normies showed up and turned it into a mostly financial scam.
Evading the man and LEOs when the law ain’t right is my god-given right and I’m thankful to be born in the age of onions and crypto.
Crypto is basically cash for online transactions. Pretty niche, but cool and definitely in demand for some situations.
Just like in the real world, you’re shit outta luck if you lose your wallet. Or if you give someone money but they laugh in your face, you can either cut your losses or try your luck in a fist fight. It’s the same with crypto.
With banks you have a separate authority that can handle all these cases, which is desirable in 99% of all transactions.
Unfortunately it’s volatile af, and the most popular cryptocurrency (Bitcoin) has untenable transaction costs and transaction limitations (10 transactions per second, globally - what a stupid design decision)
Crypto bros hate you lol I think you use it the only way it’s actually useful. Drugs and tax evasion/hiding money.
If you could hold your breath long enough to get out of your first-world bubble, you would be able to see that Bitcoin is massively popular amongst people who need ways to escape their collapsing fiat currencies. It is hilarious how spoiled people who happen to be born in countries where everything is taken care of for them are too thick and compassionless to even consider that other people have actual problems.
Is that why you use it? Because of your collapsing fiat currency?
I’m lucky enough to be from a country with a relatively stable fiat currency, although it is unclear how much longer that will be the case. In order to protect the value I’ve gained from my work, I do hold some of it in Bitcoin. I also use it to support charitable efforts in less fortunate countries. It is an excellent way to transfer value to exactly who I want to transfer it to without giving massive fees to banks and other companies that facilitate the transfer of funds.
A big thing to remember is that whenever you hold any country’s currency, you are basically giving them a blank check on your energy. You are telling them that they can have as much of the value you have saved as they want. When they print more money, they are taking that value directly from you. It is one thing to pay taxes on income, property, and goods purchased and sold, but on top of that, they have the ability to extract extra value from you just by running their printers. The more you believe that a government represents you and has your best wishes at heart, the more you should be holding their currency.
Good, I hate cryptobros and aibros and artbros and luddites and industrialists and environmentalists, but I love communal living, hate cities, love AI (and AI art), love art (and craft of said art), love nature & the environment and animals, hate vegans, and love science and industry etc.
At this point I have such an ultra-niche hyper-specific take on this (and almost everything) that I feel completely out of touch with most people which seem at first glance to navigate mostly by vibes and emotions of how they feel about a vague aesthetic sense of modernity that day.
Yeah! Accelerating societal collapse!
As far as I know there would be, it’s just that nobody is using them that way
Such as?
What about this?
I’ve used it to improve selected paragraphs of my writing, provide code snippets and find an old comic based on a crude description of a friend.
I feel like these interactions were valuable to me and only one (code snippets) could have been easily replaced with existing tools.
As a professional editor (video/audio) AI has drastically altered my work in amazing, productive ways.
I’m still a critic of course. For some industries it’s clearly a solution in search of a problem so they can hype investment. A tool being useful doesn’t mean I’m unable to critique it!
“aI AnD cRyPtO aRe ThE sAmE bRo”
You know that your take, that they both must suck in the exact same ways just because tech bros get hyped about them, is literally just as shallow, surface-level, and uninformed as most tech bros?
Like yeah man, tech hype cycles suck. But you know what else was once a tech hype cycle? Computers, the internet, smartphones. Sometimes they are legitimate, sometimes not.
AI is solving an entirely new class of problem that computers have been literally unable to solve for their entire existence. Crypto was solving the problem of making a database without a single admin. One of those is a lot more important and foundational than the other.
On top of that, crypto algorithms are fundamentally based on “proof of work”, i.e. literally wasting more energy than other miners in the network is a fundamental part of how their algorithm functions. Meaning that with crypto there is basically no value prop to society and it inherently tries to waste energy, neither is the case for AI.
Plus guess how much energy everyone streaming 4K video would take if we were all doing it on CPUs and unoptimized GPUs?
Orders of magnitude more power than every AI model put together.
But guess what? Instead we invented 4K decoding chips that are optimized to render 4K signals at the hardware level so that they don’t use much power, and now every $30 fire stick can decode a 4K signal on a 5V USB power supply.
That’s also where we’re at with the first Neural Processing Units only just hitting the market now.
I did not say AI and crypto are the same. I said the advocates are the same.
I mined from 2012 to 2015. Then I wised up. Currently I use AI almost every day in my work. I was using it in my production tools before anyone outside of academic circles knew what an LLM was. Hell, I barely understood what I was using at first. I am squarely in favor of AI tools. The inability of tech bros to handle the slightest critique is incredibly familiar and frustrating. They spike the conversation at every turn and immediately attack people’s intelligence.
Sure, uninformed tech hypebois suck in the same way, but the arguments around crypto and AI, especially around energy usage, are fundamentally not the same.
You keep responding to a thing I didn’t say.
But I will say now that both are very energy intensive/resource draining and the refusal to have a serious conversation about it - as spearheaded by tech bros/AI evangelists like I’ve described - is incredibly frustrating and makes people like me look down on the entire endeavor as a result.
Intellectually I know there are “responsible” developers and tools being made. But the loudest and most funded are not those people. And we need to consider the impact they have. This includes resource usage.
Edit: this just appeared on my feed https://www.ft.com/content/ddaac44b-e245-4c8a-bf68-c773cc8f4e63
Someone posted a shitty article about AI and power usage, someone pointed out that literally every industry uses a ton of power but AI gets clicks, you said AI and Crypto bros are the same.
If you don’t mean to imply that the counter arguments around AI and Crypto in terms of energy use are the same then write better given the context of the conversation.
And posting another shitty article that just talks about power usage going up across literally all types of industry, including just normal data centers and manufacturing plants, and then vaguely talking about chatGPT’s power usage compared to Google search to try and make it sound like those things are connected, is not having a serious discussion about it.
It’s skimming a clickbait headline of a clickbait article and regurgitating the implication in it like it’s a fact.
👍
To be fair, crypto will never stand a chance against fiat as a means for payments because governments ensure that it’s complicated to tax. However, the underlying blockchain technology remains very interesting to me as a means of getting around middlemen companies.
Cryptos have drastically reduced their energy consumption through technological improvements.
That’s why nobody complains about crypto energy consumption anymore. It’s just bitcoin.
But these LLMs just need more and more with no end in sight.
Funny how 99.99% of cryptos shrivel up and die while bitcoin continues to serve people all over the world and is constantly becoming more and more popular. Maybe if you lived with, or even gave a shit about, people in below average wealth countries you would understand why Bitcoin is so useful to them.
Yeah “just” Bitcoin which comprises ~50% of the market cap for all of crypto.
Go on benefiting from the people who actually do stuff while simultaneously whining about it. You’ve been using AI for 20 years, you’re just too thick to know about it. There are millions of people in 2nd and 3rd world countries who have had their lives massively improved thanks to Bitcoin, you’re just too spoiled and naive to give a shit about them. Climb down off your soap box and go read something beyond the headline.
I mined crypto for 4 years and have used AI professionally on a daily basis for years but you seem to have a knack for making a lot of assumptions about me so I don’t know why you would stop now.
I am allowed to critique things I use/participate in.
That other poster is using a disingenuous debate tactic called “whataboutism”. Basically shifting the focus from what’s being criticised (AI resource consumption) to something else (other industries).
Your comparison with evangelists is spot on. In my teen years I used to debate with creationists quite a bit; they were always
- oversimplifying complex matters
- showing blatant lack of reading comprehension, and distorting/lying what others say
- vomiting certainty on things that they assumed, and re-eating their own vomit
- showing complete inability to take context into account when interpreting what others say
- chain-gunning fallacies
- “I’m not religious, but…”
always to back up something as idiotic as “the world is 6kyo! Evolution is a lie!”.
Does it ring any bells for people who argue with AI evangelists? For me, all of them do.
(Sorry bolexforsoup for the tone - it is not geared towards you.)
I know what a whataboutism is haha but yes it’s good to see someone else recognize it for what it is.
Guys guys! There’s room for all of us to eat our fair share of natural resources and doom the planet together!
But no, AI bad AI bad AI bad AI bad lalalaa I can’t hear you AI bad /s
Dumb.
Seems like you’re hearing it perfectly, but not listening.
“The world is complicated and scary! I don’t understand it so it must be bad! M-muh planet farting cows evil industry fuck the disabled/sick/queer!” - What luddites actually believe.
Anprims/eco-fashes begone. If the planet was destroyed for the betterment of conditions for the proletariat today and future alike there’d be literally no issue, it’s just some rock lol, AI is far more important. Also brutalism and soviet blocs are the best architectural styles, everything else is bourgeois cringe.
So… Absolutely need to be aware of the impact of what we do in the tech sphere, but there’s a few things in the article that give me pause:
Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained ChatGPT-3 at Microsoft’s data facilities.
- “Could”. More likely it was closed loop.
- Water isn’t single use, so even if true how does this big number matter.
What matters is the electrical energy converted to heat. How much was it and where did that heat go?
Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.
Can you say non sequitur ?
The outdated network holding back housing is that it doesn’t go to the right places with the capacity needed for the houses. Not that OpenAIUK is consuming so much that there’s no power left. To use a simile, there’s plenty of water but the pipes aren’t in place.
This article is well intentioned FUD, but FUD none the less.
700,000 litres also sounds like much more than 700 m³. The average German citizen consumes 129 litres per day, or roughly 47 m³ annually. That’s the annual water consumption of about 15 people, less than most apartment blocks use.
Energy consumption might be a real problem, but I don’t see how water consumption is that big of a problem or priority here.
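Putting the two figures already cited in this thread side by side:

```python
training_water_l = 700_000     # litres reportedly used to train GPT-3
per_person_per_day_l = 129     # average German consumption cited above

per_person_per_year_m3 = per_person_per_day_l * 365 / 1000
print(per_person_per_year_m3)                              # ~47 m³ per person per year
print(training_water_l / 1000 / per_person_per_year_m3)    # ~15 "person-years" of water
```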
It’s usually not the water itself but the energy used to “systemize” water from out-of-system sources
Pumping, pressurization, filtering, purifying all take additional energy.
The average German citizen consumed 129 litres per day
That seems like a lot. Where are you getting that number?
The EPA states that each American uses an average of 82 gallons or 310.4 litres a day (study from 2015). Source: https://www.epa.gov/watersense/statistics-and-facts
I would assume that includes stuff like toilets, baths, showers, dishes and hand washing etc. as fresh-water uses. Either that or Germans are the ultimate hydrohomies.
A quick search says 3.7L is the recommended intake for men, and 2.7L for women. Forget AI, Germans appear to be the real resource guzzlers!
Here “consume” means far more than just “drank”. If you take a shower at home, you are consuming water. Wash your car? Consume water. Water your garden? Consume water.
Aha! That makes a lot more sense with that framing.
EDIT: In 2019 in Canada the daily residential average was 215L per day. 129L seems like a dream in contrast.
I imagine the number goes up considerably when you account for showering, washing clothes and dishes, and water used while cooking. It would go up even more if you account for the water used to produce the food consumed by the individual.
Liters are a great unit for making small things seem large. I’ve seen articles breathlessly talking about how “almost 2000 liters of oil was spilled!” When 2000 liters could fit in the back of a pickup truck.
Water “consumption” is also a pretty easy to abuse term since water isn’t really consumed, it can be recycled endlessly. Whether some particular water use is problematic depends very much on the local demands on the water system, and that can be accounted for quite simply by market means - charge data centers money for their water usage and they’ll naturally move to where there’s plenty of cheap water.
Liters are a great unit for making small things seem large. I’ve seen articles breathlessly talking about how “almost 2000 liters of oil was spilled!” When 2000 liters could fit in the back of a pickup truck.
That just means you have no intuitive sense of how large a litre is. If they’d written it as “2000 quarts” (which is close enough to being the same volume at that level of rounding) would it have painted a clearer picture in your head?
Oil is different because 1 ppm can ruin a whole litre or something in that direction.
Assuming that’s true, most of the oil tends to clump together. 2000L doesn’t just perfectly disperse out across billions of litres of water.
“Could”. More likely it was closed loop.
As I understand it, this is an estimate, thus the word “could”. It has nothing to do with using closed-loop or open-loop water cooling.
Water isn’t single use, so even if true how does this big number matter.
The point they are trying to make is that fresh water is not a limitless resource and increasing usage has various impacts, for example on market prices.
The outdated network holding back housing is that it doesn’t go to the right places with the capacity needed for the houses. Not that OpenAIUK is consuming so much that there’s no power left. To use a simily, there’s plenty of water but the pipes aren’t in place.
The point being made is that resources are allocated to increase network capacity for hyped tech and not for current, more pressing needs.
Is there a reason it needs to be fresh water? Is sea water less effective?
corrosion
Oh makes sense.
A lot of industry does use grey water or untreated water for cooling, as it’s substantially cheaper to filter it and add chemicals to it yourself. What’s even cheaper is to have a cooling tower and reuse your water; at the volumes used on an industrial scale it’s really expensive to just dump it down the drain (which you also get charged for). When I worked as a maintenance engineer I recall saving something like $1M CAD minimum a year by changing the fill level in our cooling tower, as it would otherwise drop to a level where it’d trigger city-water backups to top up the levels to avoid running dry, and that was a single processing line.
“Could”. More likely it was closed loop.
Nope. Here’s how data centres use water.
It boils down to two things - cooling and humidification. Humidification is clearly not a closed loop, so I’ll focus on the cooling:
- cold water runs through tubes, chilling the air inside the data centre
- the water is now hot
- hot water is exposed to outside air, some evaporates, the leftover is colder and reused.
Since some evaporates you’ll need to put more water into the system. And there’s an additional problem: salts don’t evaporate, they concentrate over time, precipitate, and clog your pipes. Since you don’t want this you’ll eventually need to flush it all out. And it also means that you can’t simply use seawater for that, it needs to be freshwater.
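For a rough sense of the water-versus-energy trade-off, here’s a back-of-envelope estimate that assumes all of the rejected heat goes into evaporating water (an upper bound; real towers also shed some heat as sensible heat). The only physical constant used is the latent heat of vaporisation of water, roughly 2.26 MJ/kg:

```python
LATENT_HEAT_MJ_PER_KG = 2.26   # energy needed to evaporate 1 kg (~1 litre) of water
MJ_PER_MWH = 3600              # 1 MWh = 3.6 GJ = 3600 MJ

# Litres evaporated per MWh of heat rejected purely by evaporation.
litres_per_mwh = MJ_PER_MWH / LATENT_HEAT_MJ_PER_KG
print(round(litres_per_mwh))   # ~1593 litres per MWh, as an upper bound
```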
Water isn’t single use, so even if true how does this big number matter.
Freshwater renews at a limited rate.
What matters is the electrical energy converted to heat. How much was it and where did that heat go?
Mostly to the air, by promoting the evaporation of the water.
Can you say non sequitur ?
More like non sequere than non sequitur. Read the whole paragraph:
Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects. This will only get worse as households move away from using fossil fuels and rely more on electricity, putting even more pressure on the National Grid. In Bicester, for instance, plans to build 7,000 new homes were paused because the electricity network didn’t have enough capacity.
The author is highlighting that electrical security is already bad for you Brits, for structural reasons; it’ll probably get worse due to increased household consumption; and with big tech consuming it, it’ll get even worse.
Data center cooling towers can be closed- or open-loop, and even operate in a hybrid mode depending on demand and air temps/humidity. Problem is, the places where open-loop evaporative cooling works best are arid, low-humidity regions where water is a scarce resource to start.
On the other hand, several of the FAANGS are building datacenters right now in my area, where we’re in the watershed of the largest river in the country, it’s regularly humid and rainy, any water used in a given process is either treated and released back into the river, or fairly quickly condenses back out of the atmosphere in the form of rain somewhere a few hundred miles further east (where it will eventually collect back into the same river). The only way that water is “wasted” in this environment has to do with the resources used to treat and distribute it. However, because it’s often hot and humid around here, open loop cooling isn’t as effective, and it’s more common to see closed-loop systems.
Bottom line, though, I think the siting of water-intensive industries in water-poor parts of the country is a governmental failure, first and foremost. States like Arizona in particular have a long history of planning as though they aren’t in a dry desert that has to share its only renewable water resource with two other states, and offering utility incentives to potential employers that treat that resource as if it’s infinite. A government that was focused on the long-term viability of the state as a place to live rather than on short-term wins that politicians can campaign on wouldn’t be making those concessions.
They can be closed-loop as in your region but they usually aren’t - besides the problem that you mentioned, a closed loop increases electricity consumption (as you’ll need a heat pump instead), and electricity consumption is also a concern. Not for the environmental impact (corporations DGAF), but price.