- cross-posted to:
- technology@lemmy.ml
Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse emissions than commercial flights. In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually.
Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities.
Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.
Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal of limiting our dependence on fossil fuels can compromise another goal, of ensuring everyone has a safe and accessible water supply.
Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.
In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.
What is this even? Batteries for UPS in a datacenter wouldn’t be a patch on even a few days of production of EVs, water isn’t being shipped from “drier parts of the world” to cool datacenters, and even if it were, it’s not gone forever once it’s used to cool server rooms.
Absolutely, AI and crypto are a blight on the energy usage of the world and that needs to be addressed, but things like above just detract from the real problem.
The water is because datacenters have been switching to evaporative cooling to save energy. It does save energy, but at the cost of water. It doesn’t go away forever, but a lot of it does end up raining down on the ocean, and we can’t use it again without desalination and using even more energy.
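For a sense of scale, a rough back-of-the-envelope in Python. The ~1.8 L/kWh water figure is a commonly cited ballpark for evaporative cooling and the 10 MW facility size is made up; both are illustrative assumptions, not measurements:

```python
# Rough estimate of annual water evaporated by a datacenter on evaporative cooling.
# Assumptions (illustrative only): ~1.8 L of water evaporated per kWh of IT load,
# and a hypothetical 10 MW facility running flat out all year.
LITRES_PER_KWH = 1.8
FACILITY_MW = 10
HOURS_PER_YEAR = 8760

kwh_per_year = FACILITY_MW * 1000 * HOURS_PER_YEAR
litres_per_year = kwh_per_year * LITRES_PER_KWH
print(f"~{litres_per_year / 1e6:.0f} million litres evaporated per year")
```

Even under these rough assumptions, that’s on the order of 150 million litres per facility per year lofted into the atmosphere.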
a lot of it does end up raining down on the ocean, and we can’t use it again without desalination
Where do you think rain comes from? Why do hurricanes form over the ocean?
Rainforests. Like the Amazon that is being deforested obscenely in some areas
Dude, please. If things just worked out like that, we wouldn’t have water issues piling up with the rest of our climate catastrophe.
No no they’ve got a point. Everyone knows that the invisible hand of the free market and the invisible hand of the replenishing water table just reach out, shake hands, and agree to work it all out.
That may all be true, but the amount of water used by these data centers is minuscule, and it seems odd to focus on it. The article cites Microsoft using 700,000 liters for ChatGPT. In comparison, a single fracking well in the same state might use 350,000,000 liters, and this water is much more contaminated. There are so many other, more substantive issues with LLMs; why even bring water use up?
Edit: If evaporative cooling uses less energy it might even be reducing total industrial water use, considering just how much water is used in the energy industry.
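Making that comparison explicit, using the two figures cited above:

```python
# Water figures as cited in this thread (litres).
CHATGPT_TRAINING_WATER = 700_000      # cooling during model training, per the article
FRACKING_WELL_WATER = 350_000_000     # a single fracking well in the same state

ratio = FRACKING_WELL_WATER / CHATGPT_TRAINING_WATER
print(f"One fracking well ~= {ratio:.0f}x the water used to train the model")
```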
if it’s not crypto miners with GPUs it’s AI, these narratives never really connect well with reality. /u/0ptimal wrote a great comment on this post: https://alexandrite.app/lemmy.world/comment/10355707
To no surprise, the other comments are full of laypeople who have never studied the field yet feel they understand it well enough to preach, to others who also know nothing about the subject, about just how useless and terrible it is.
This is what pisses me off so much about the climate crisis. People tell me not to use my car, but then Microsoft just randomly blows out 30% more CO2 for AI.
Cars collectively emit far more carbon than ChatGPT, and ChatGPT is only going to get more optimized from here.
Ultimately the answer should be in a heavy carbon tax, rather than having a divine ruler try and pick and choose where it’s worth it to spend carbon.
Part of why right-wing politics are becoming so popular again is that so many politicians shove the financial responsibility for cutting carbon onto the normal population. My point is that it feels useless to cut my own emissions as long as massive corporations can just randomly emit way more without consequence. Also, Microsoft uses electricity for more than just ChatGPT.
Look up how much pollution is made from the massive shipping boats when they get into international waters and start burning bilge oil.
I have no doubt about that
You know that Microsoft doesn’t just sit there and burn electricity for fun right?
Microsoft data centers are doing what consumers ask them to do. They are burning electricity at the request of users, no different than your personal PC.
Actually, the main difference is that the computers in their data centers are far more energy efficient than your PC.
I am SHOCKED
So then you realize that it’s not Microsoft burning that electricity, but individual consumers?
I’d still blame microsoft for shoving AI down peoples throats. Search something on bing (or google for that matter) and you get an AI response, even if you don’t want it. It’s the choice of these corporations.
You’re really tying yourself in knots trying to blame the big bad corpos and no one else.
Yes they are shoving it in people’s faces, and when the average person uses their default browser with a default search engine and searches on Bing and it uses AI in addition to a search index they are to blame, but every single user who intentionally seeks out ChatGPT or Copilot is also to blame.
It’s a new technology, people are going to use it and burn energy with it and then eventually we will make a more efficient version of it as it matures, similar to everything else, including traditional search.
We need better carbon (and equivalents) accounting, and knowledge of equivalents.
E.g. Turning 60 people vegetarian = having 1 baby.
The current metric is equivalent tons of CO2, and I think we actually do have numbers for that on vegetarian vs omnivorous vs heavy meat diets.
A bit harder to quantify for a human life though, certainly. We are able to at least convert methane emissions to a CO2 equivalent
We recommend four widely applicable high-impact (i.e. low emissions) actions with the potential to contribute to systemic change and substantially reduce annual personal emissions: having one fewer child (an average for developed countries of 58.6 tonnes CO2-equivalent (tCO2e) emission reductions per year), living car-free (2.4 tCO2e saved per year), avoiding airplane travel (1.6 tCO2e saved per roundtrip transatlantic flight) and eating a plant-based diet (0.8 tCO2e saved per year). These actions have much greater potential to reduce emissions than commonly promoted strategies like comprehensive recycling (four times less effective than a plant-based diet) or changing household lightbulbs (eight times less).
https://iopscience.iop.org/article/10.1088/1748-9326/aa7541/pdf
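Plugging the quoted figures into a quick sketch; note that by these numbers the child-to-vegetarian conversion comes out nearer 73 than the 60 used above:

```python
# Annual tCO2e savings per action, from the Wynes & Nicholas figures quoted above.
savings = {
    "one fewer child": 58.6,
    "living car-free": 2.4,
    "skipping one transatlantic round trip": 1.6,
    "plant-based diet": 0.8,
}
# Derived from the stated ratios: recycling is "four times less effective"
# than a plant-based diet, and lightbulb swaps "eight times less".
savings["comprehensive recycling"] = savings["plant-based diet"] / 4
savings["changing lightbulbs"] = savings["plant-based diet"] / 8

vegetarians_per_child = savings["one fewer child"] / savings["plant-based diet"]
print(f"1 fewer child ~= {vegetarians_per_child:.0f} people going plant-based")
```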
1 vegetarian baby or?
I think this implies that a vegetarian baby is only 1/60 less polluting than an omnivore baby.
All that for glorified autocomplete
But now I know that I can jump off the Golden Gate bridge to cure my depression.
The golden gate bridge is so far away from me. I don’t know what to do to cure depression. :(
I plan on building my own Golden Gate Bridge (to scale) and then jump off!
That is an absurd reduction of reality, a blatant illustration of Dunning-Kruger in relation to LLMs.
LLMs are just predictive text but bigger
Or is it? Dun-dun-DUNNN!
I mean, that’s also all you or I am.
Speak for yourself, loser. Repeating shit you heard an influencer say on Twitch is cringe.
Explain to me how we’re not or kindly go outside and play hide and go fuck yourself.
I’m calling it now: you’ll end up poor and unhappy
That’s cool, free will doesn’t exist, whats going to happen is going to happen. I’ve accepted that, so I might die poor, but you’re the only one here with a chance of dying truly unhappy.
You know what’s funny? What negative prompts you’d have to give an LLM to get it to respond the way you do.
You were more entertaining when you just repeated dumb lines from your favourite influencers
You are insulting someone simply because they didn’t go along with your strawman? Intelligence is in short supply these days.
The ugly truth behind journalists: broke English majors are guzzling resources at planet-eating rates
By the age of 21, most journalists have produced 336 metric tons of CO2 and 20,000 lbs of waste
deleted by creator
You know what’s ironic? We’re all communicating on a decentralized network which is inefficient when compared to a centralized network.
I’m sure we could nitpick and argue over what’s the most efficient solution for every little thing, but at the end of the day we need to see if the pros outweigh the cons.
I highly doubt the “people” downvoting the nerds here understand what a decentralised network is, I bet some of them think Lemmy is just an app owned by a megacorp somewhere. How it works must be like magic to the unwashed .world masses.
It’s going to stop when the price of energy reflects its external cost. Externalities are very well understood by economists, so big oil has convinced us to go after consumers instead.
We need a Green New Deal, not a villain of the week.
Never. Cope and seethe luddite. Btw AI plagiarizes less than humans. Back to Reddit, now!
deleted by creator
Bruh you’re projecting harder than an IMAX cinema
Crypto and proof of work algorithms inherently waste energy.
AI using a lot of energy is like 4k video using a lot of energy, yeah, it does right now, but that’s because we’re not running it on dedicated hardware specifically designed for it.
If we decoded 4k videos using software at the rate we watch 4k videos, we’d already have melted both ice caps.
AI bad though!
Me: ChatGPT, can you create a system that’s capable of powering your systems in a environmentally sustainable way?
ChatGPT: THERE IS INSUFFICIENT DATA FOR A MEANINGFUL ANSWER.
I mean ChatGPT can’t do it but humans can and are… Why do you think Microsoft / Apple / Google are all introducing NPU / AI coprocessing chips?
The new ARM powered surface laptops that consume like 30W of power are more capable of running an AI model than my gaming PC from 2 years ago that consumes ~300W of power.
I’m referencing the short story, The Last Question by Isaac Asimov.
https://users.ece.cmu.edu/~gamvrosi/thelastq.html
Why do you think Microsoft / Apple / Google are all introducing NPU / AI coprocessing chips?
Because they’re all part of a technocult trying to make a digital god.
Pass a carbon tax. Oh wait that would be too easy.
GTFO with your time-tested solution to negative externalities.
Why don’t you just hand over all your income to the government just to be sure you won’t engage in any unnecessary activity.
It seems the people who are the most staunch defenders of capitalism and free markets are the most resistant to the capitalist and free market solution.
Clean air (or rather, air with normal levels of carbon) belongs to the public, and anyone who wants to take it away should pay the public.
Sigh. You can hold any opinion you want about the ideal society. This is a good idea for the society we have now. If we all die it’s not going to matter if Adam Smith or Karl Marx was correct.
Adam Smith would go absolutely ballistic if he were to see our current system. Not at all his vision.
I’m pretty sure he was agreeing with you…?
I think that some are allergic to any slightest notion of capitalism being good
Carbon taxes doesn’t make capitalism good, it’s still like, the cause of the problem in the first place
hey i think you attracted some of those people you mentioned :)
Sometimes I just want to see online world burn
Now do I want to engage em or not? Probably not I guess, it would be tiring
On top of that, if you refuse to defend your vague statements implying it would be a waste of your time and beneath you, you end up being always right!
Which may be because recent history has proven beyond doubt that capitalism without regulation is catastrophic, and capitalists will always push the boundaries and try to get rid of regulation, so it is always catastrophic, with temporary periods where it looks good on the surface.
AI (and cryptocurrencies) use massive amounts of energy, and the only value they produce is wealth. We don’t get correct, reliable, and efficient results with AI, and with cryptocurrencies we don’t get a really useful currency but a speculative asset. We are speeding into a climate disaster out of pure greed.
This is absolutely false. GitHub Copilot (and its competitors) alone are already actively helping and assisting virtually every software developer around the world, and highly structured coding languages are just the lowest-hanging fruit.
Yes we are heading to a climate disaster because of greed, but that has nothing to do with AI.
I don’t want to doxx myself or blow my own horn. The programming I do, and many developers do, is not something ChatGPT or Bing AI or whatever it is called can do.
At best, it is a glorified search engine that can find code snippets and read (but not understand) documentation. It saves you some time, but it can’t think, and it can’t solve a problem it hasn’t seen before, something programmers often have to do.
But after you’ve written the code, don’t you find that the LLM is great at documentation?
Dude, if you’ve never used copilot then shut up and don’t say anything.
Don’t pretend like you write code that doesn’t benefit from AI assisted autocomplete. Literally all code does. Just capitalization and autocompleting variable names with correct grammar is handy, let alone literally any time there’s boiler plate or repetition.
Lmao, the idea that you having an NDA means you work on super elite code that doesn’t benefit from Copilot is hilarious. I’ve worked on apps used by hundreds of millions of people and on backend systems powering Fortune 10 manufacturers, and my roommate is doing his PhD on advanced biological modelling and data analysis; Copilot is useful when working on all of them.
Just capitalization and autocompleting variable names with correct grammar is handy
We have had IDEs for decades
Oh do tell us again how you haven’t used Copilot without saying the words ‘I haven’t used Copilot’. Stack Overflow’s professional developer survey found that 70% of devs are using AI assistants; do you think none of them have heard of an IDE or IntelliSense before?
what are the competitors to github’s copilot? I tried it for personal and really like it but can’t use it for work due to IP leak risks.
I’m hoping there is a self hosted option for it.
there are lots actually: https://bito.ai/blog/free-github-copilot-alternatives-for-vs-code/
that article is clearly biased towards Bito, but it’s a decent list nonetheless
here is a better one: https://stackoverflow.blog/2024/05/29/developers-get-by-with-a-little-help-from-ai-stack-overflow-knows-code-assistant-pulse-survey-results/
I think you are vastly overestimating how many developers are using GitHub Copilot.
It’s also a laundering scheme to make free software proprietary.
That’s bs now and will only become more so with time.
This was posted two days ago: https://stackoverflow.blog/2024/05/29/developers-get-by-with-a-little-help-from-ai-stack-overflow-knows-code-assistant-pulse-survey-results/
We found that most of those using code assistant tools report that these assistants are satisfying and easy to use and a majority (but not all) are on teams where half or more of their coworkers are using them, too. These tools may not always be answering queries accurately or solving contextual or overly specific problems, but for those that are adopting these tools into their workflow, code assistants offer a way to increase the quality of time spent working.
The majority of respondents (76%) let us know they are using or are planning to use AI code assistants. Some roles use these tools more than others amongst professional developers: Academic researchers (87%), AI developers (76%), frontend developers (75%), mobile developers (60%), and data scientists (67%) currently use code assistants the most. Other roles indicated they are using code assistants (or planning to) much less than average: data/business analysts (29%), desktop developers (39%), data engineers (39%), and embedded developers (42%). The nature of these tools lend themselves to work well when trained well; a tool such as GitHub Copilot that is trained on publicly available code most likely will be good at JavaScript for frontend developers and not so good with enterprise and proprietary code scenarios that business analysts and desktop developers face regularly.
But go ahead, speak for the whole goddamn industry, we’re totally not using AI or AI code-assist!!!
Sorry, I’m not seeing how your source is helping your argument.
The line I’m responding to is
“This is absolutely false. GitHub Copilot (and its competitors) alone are already actively helping and assisting virtually every software developer around the world.”
While your source says: "The majority of respondents (76%) let us know they are using or are planning to use AI code assistants. "
An unscientific survey (i.e. not random), which itself says that 76% of respondents are using OR ARE PLANNING TO USE them (i.e., not using them yet), does not equal virtually every developer.
Also, wasn’t Stack Overflow recently getting bad press for selling content to AI companies? Something that pissed off large parts of the developer community? Something that would make developers who are unhappy with AI not take the survey?
Anyway, have a great day, and enjoy your AI assistant.
Nothing at all?
I love all the people telling you you’re wrong, few if any are actually developers themselves.
For those that aren’t: https://stackoverflow.blog/2024/05/29/developers-get-by-with-a-little-help-from-ai-stack-overflow-knows-code-assistant-pulse-survey-results/
We get it, you don’t know what you’re talking about.
God, I love it when laypeople feel they understand the entire field they have never studied and are oh-so-confident to preach to others who also know nothing about the subject.
But it’s okay, because now we can get wrong answers faster than ever, and we’ve taken human creativity and joy out of art.
I know this is probably way off topic, but it made me think of Friendship is Optimal, especially the ending.
I think we’ll improve this a lot. Now it’s a race to be first, later it will be a race to be profitable and keep costs low.
Plus the sun outputs a lot more energy than earth can ever consume so we just need to get better at collecting it without creating waste on the side.
We’re already going to have to deploy wind and solar at a breakneck pace to solve global warming. Why do we need a technology that would force us to install even more?
We all know this, and we all know the “ai” they have right now is anything but that.
But these companies are making billions from this gold-rush hype, so they couldn’t give two shits about the planet.
This article may as well be trying to argue that we’re wasting resources by using “cloud gaming” or even by gaming on your own PC.
deleted by creator
Yeah, it is a bit weak on the arguments, as it doesn’t seem to talk about trade-offs.
Gaming actually provides a real benefit for people, and resources spent on it mostly linearly provide that benefit (yes, some people are addicted, etc., but people need enriching activities, and gaming can be such an activity in moderation).
AI doesn’t provide much benefit yet, outside of very narrow uses, and its usefulness is mostly predicated on its continued growth in ability. The problem is that pretrained transformers have stopped seeing linear growth from injections of resources, so either the people in charge admit it’s all a sham, or they push nonlinear amounts of resources at it, hoping to fake growing ability long enough to achieve a new actual breakthrough.
I’m going to assume that when you say “AI” you’re referring to LLMs like chatGPT. Otherwise I can easily point to tons of benefits that AI models provide to a wide variety of industries (and that are already in use today).
Even then, if we restrict your statement to LLMs, who are you to say that I can’t use an LLM as a dungeon master for a quick round of DnD? That has about as much purpose as gaming does, therefore it’s providing a real benefit for people in that aspect.
Beyond gaming, LLMs can also be used for brainstorming ideas, summarizing documents, and even for help with generating code in every programming language. There are very real benefits here and they are already being used in this way.
And as far as resources are concerned, there are newer models being released all the time that are better and more efficient than the last. Most recently we had Llama 3 released (just last month), so I’m not sure how you’re jumping to conclusions that we’ve hit some sort of limit in terms of efficiency with resources required to run these models (and that’s also ignoring the advances being made at a hardware level).
Because of Llama 3, we’re essentially able to have something like our own personal GLaDOS right now: https://www.reddit.com/r/LocalLLaMA/comments/1csnexs/local_glados_now_running_on_windows_11_rtx_2060/
It isn’t resource efficient, simple as that. Machine learning isn’t new; it has been used in one form or another for decades. But here is the thing: when you train a model to do one task well, you can estimate the training time and the quality of its analysis, say, automating the prices you charge for your hotel apartments to maximize sales and profits. When you don’t even know what the model can do, when you don’t use more than a sliver of its potential, and when your training material is whatever you dared to scrape and resources aren’t a question, well, you dance and jump over the fire in the bank’s vault. LLMs of the ChatGPT variety don’t have a purpose or a problem to solve; we come up with those after the fact, and although it’s thrilling to explore what else they can do, it’s a giant waste*. Remember blockchain and how everyone was trying to put it somewhere? LLMs are the same. There are niche uses that will evolve or stay as they are, completely out of the picture, while the hyped-up examples grow old and die off unless they find their place. And, currently, there’s no application in which I can bet my life on an LLM’s output. Cheers on you if you found where to put it to work as I haven’t and grown irritated over seeing this buzzword everywhere.
* What I find most annoying about them is that they are natural monopolies, given the resources you need to train them to the Bard/Bing level. If they get inserted into every field within a decade, the LLM providers will have power over everything. Russia’s Kandinsky AI stopped showing Putin and the war in a bad light, for example; OpenAI’s chatbot may soon stop drawing Sam Altman getting pegged by a shy time-traveller Mikuru Asahina; and what about other, less obvious cases where the provider of a service just decides to exclude X from the output, like flags or mentions of Palestine or Israel? If you aren’t big enough to train a model for your own needs, you come under their reign.
That is a good argument, they are natural monopolies due to the resources they need to be competitive.
Now do we apply this elsewhere in life? Is anyone calling for Boeing to be broken up or Microsoft to be broken up or Amazon to be broken up or Facebook?
We are missing out big time by not breaking them into pieces, yes. No argument. Something is wrong if we didn’t start that process a long time ago.
OK, first off, I’m a big fan of learning new expressions, where they come from and what they mean (how they came about, etc.). Could you please explain this one?:
well, you dance and jump over the fire in the bank’s vault.
And back to the original topic:
It isn’t resource efficient, simple as that.
It’s not that simple at all and it all depends on your use case for whatever model you’re talking about:
For example I could spend hours working in Photoshop to create some image that I can use as my Avatar on a website. Or I can take a few minutes generating a bunch of images through Stable Diffusion and then pick out one I like. Not only have I saved time in this task, but I have used less electricity.
In another example I could spend time/electricity to watch a Video over and over again trying to translate what someone said from one language to another, or I could use Whisper to quickly translate and transcribe what was said in a matter of seconds.
On the other hand, there are absolutely use cases where using some ML model is incredibly wasteful. Take, for example, a rain sensor on your car. You could set up some AI model with a camera and computer vision to detect when to turn on your windshield wipers. But why do that when you could use a little sensor that shoots a small laser at the window and, when it detects a difference in the energy that’s normally reflected back, activates the wipers? The dedicated sensor with a low-power laser will use far less energy and be way more efficient for this use case.
Cheers on you if you found where to put it to work as I haven’t and grown irritated over seeing this buzzword everywhere.
Makes sense, so many companies are jumping on this as a buzzword when they really need to stop and think if it’s necessary to implement in the first place. Personally, I have found them great as an assistant for programming code as well as brainstorming ideas or at least for helping to point me in a good direction when I am looking into something new. I treat them as if someone was trying to remember something off the top of their head. Anything coming from an LLM should be double checked and verified before committing to it.
And I absolutely agree with your final paragraph, that’s why I typically use my own local models running on my own hardware for coding/image generation/translation/transcription/etc. There are a lot of open source models out there that anyone can retrain for more specific tasks. And we need to be careful because these larger corporations are trying to stifle that kind of competition with their lobbying efforts.
Otherwise I can easily point to tons of benefits that AI models provide to a wide variety of industries
Go ahead and point. I’m going to assume when you say “AI” that you mean almost anything except actual intelligence.
I think you’re confusing “AI” with “AGI”.
“AI” doesn’t mean what it used to and if you use it today it encompasses a very wide range of tech including machine learning models:
Speech to text (STT), text to speech (TTS), Generative AI for text (LLMs), images (Midjourney/Stable Diffusion), audio (Suno). Upscaling, Computer Vision (object detection, etc).
But since you’re looking for AGI there’s nothing specific to really point at since this doesn’t exist.
Edit: typo
Speech to text (STT), text to speech (TTS), Generative AI for text (LLMs), images (Midjourney/Stable Diffusion), audio (Suno). Upscaling, Computer Vision (object detection, etc).
Yes, this is exactly what I meant. Anything except actual intelligence. Do bosses from video games count?
I think it’s smart to shift the conversation away from AI to ML, but that’s part of my point. There is a huge gulf between ML and AGI that AI purports to fill but it doesn’t. AI is precisely that hype.
If “AI doesn’t mean what it used to”, what does it mean now? What are the scientific criteria for this classification? Or is it just a profitable buzzword that can be attached to almost anything?
But since you’re looking for AGI there’s nothing specific to really point at since this doesn’t exist.
Yes, it doesn’t exist.
I mean, that’s kind of the whole point of why I was trying to nail down what the other user meant when they said “AI doesn’t provide much benefit yet”.
The definition of “AI” today is way too broad for anyone to make statements like that now.
And to make sure I understand your question, are you asking me to provide you with the definition of “AI”? Or are you asking for the definition of “AGI”?
Do bosses from video games count?
Count under the broad definition of “AI”? Yes, when we talk about bosses from video games we talk about “AI” for NPCs. And no, this should not be lumped in with any machine learning models unless the game devs created a model for controlling that NPCs behaviour.
In either case our current NPC AI logic should not be classified as AGI by any means (which should be implied since this does not exist as far as we know).
You read too many headlines and not enough papers. There is a massive list of advancements that AI has brought about. Hell, there is even a massive list of advancements that you personally benefit from daily. You might not realize it, but you are constantly benefiting from super-efficient methods of matrix multiplication that AI has discovered. You benefit from drugs that were discovered by AI. Guess what has made Google the top search engine for 20 years? AI efficiency gains. The list goes on and on…
People in this thread think AI is just the funny screenshot they saw on social media and concluded that they are smart and AI is dumb.
Absolutely. I am surprised, I would expect more from people who would end up at a site like this.
AI doesn’t provide much benefit yet
Lol
I don’t understand how you can argue that gaming provides a real benefit, but AI doesn’t.
If gaming’s benefit is entertainment, why not acknowledge that AI can be used for the same purpose?
There are other benefits as well – LLMs can be useful study tools, and can help with some aspects of coding (e.g., boilerplate/template code, troubleshooting, etc).
If you don’t know what they can be used for, that doesn’t mean they don’t have a use.
LLMs help with coding? In any meaningful way? That’s a great giveaway that you’ve never actually produced and released any real software.
FWIW I do that all the time, it’s helpful for me too.
I gave up on ChatGPT for help with coding.
But a local model that’s been fine-tuned for coding? Perfection.
It’s not that you use the LLM to do everything, but it’s excellent for pseudo code. You can quickly get a useful response back about most of the same questions you would search for on stack overflow (but tailored to your own code). It’s also useful for issues when you’re delving into a newer programming language and trying to port over some code, or trying to look at different ways of achieving the same result.
It’s just another tool in your belt, nothing that we should rely on to do everything.
If gaming’s benefit is entertainment, why not acknowledge that AI can be used for the same purpose?
Ah yes the multi-billion dollar industry of people reading garbage summaries. Endless entertainment.
Ah yes the multi-billion dollar industry of people reading garbage summaries. Endless entertainment.
See, I’m not even sure if you’re criticizing LLMs or modern journalism…lmao
Unfortunately, they seem to be one and the same these days.
Do you never play games with bots? Those are AI.
This isn’t a good situation, but I also don’t like the idea that people should be banned from using energy how they want to. One could also make the case that video games or vibrators are not “valuable” uses of energy, but if the user paid for it, they should be allowed to use it.
Instead of moralizing we should enact a tax on carbon (like we have in Canada) equal to the amount of money it would take to remove that carbon. AI and crypto (& xboxes, vibrators, etc) would still exist, but only at levels where they are profitable in this environment.
If someone wants to use a vibrator that consumes an entire city’s worth of yearly energy consumption each day, then I’d say they shouldn’t be allowed to do that. Making excessive energy consumption prohibitively expensive goes some way towards discouraging this, at least.
If I understand you right, you’re talking about carbon offsets. And investigation after investigation finds that the field is permeated with shady practices, with far fewer emissions actually offset than claimed.
So we absolutely should pay special attention to industries that are hogging a lot of energy. Xboxes and especially vibrators use way less energy than data centers - though again, moving gaming onto PCs and developing better dumb gaming terminals to use that computing power while playing with controllers in the living room would be an absolute win for the environment.
Nope, a carbon tax is different from carbon offsets. A carbon tax is intended to put an immediate financial burden onto energy producers and/or consumers, commensurate with the environmental impact of the power production and/or consumption.
From a corporation’s perspective, it makes no sense to worry about the potential economic impact of pollution that may not be felt for decades. By adding a carbon tax, those potential impacts are realised immediately. Generally, the cost of these taxes will be passed on to the consumer, changing usage patterns as a potential direct benefit, but making it a politically unattractive solution due to the immediate cost-of-living impact. This killed the idea in Australia, where we still argue to this day about whether it should be reinstated. It also, theoretically, has a kind of anti-subsidy effect: by making it more expensive to “do the wrong thing”, you make it more financially viable to build a business around “doing the right thing”.
All in theory. I don’t know what studies are out there on the efficacy of a carbon tax as a strategy. In the Australian context, I think we should bring it back. But while I understand why the idea exists and the logic behind why it should work, I don’t know how it plays out in practice.
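The mechanism described above boils down to simple arithmetic: add the tax to each generation option’s cost in proportion to its emissions, and the relative prices flip. A toy sketch (all numbers made up for illustration, not real market data or a real tax rate):

```python
# Hypothetical carbon tax rate, in dollars per tonne of CO2 emitted.
CARBON_TAX = 80.0

def cost_per_mwh(base_cost, tonnes_co2_per_mwh, tax=CARBON_TAX):
    """Effective generation cost once the carbon tax is priced in."""
    return base_cost + tonnes_co2_per_mwh * tax

# Illustrative figures: coal is cheap to run but dirty;
# wind has a higher base cost but no emissions at the point of generation.
coal = cost_per_mwh(base_cost=40.0, tonnes_co2_per_mwh=1.0)  # 40 + 1.0 * 80 = 120
wind = cost_per_mwh(base_cost=50.0, tonnes_co2_per_mwh=0.0)  # 50 + 0.0 * 80 = 50

# With the tax priced in, the cleaner source becomes the cheaper option -
# the "anti-subsidy" effect described above.
assert wind < coal
```

This is the whole point of the policy: the future environmental cost shows up on today’s invoice, and energy-hungry uses (AI, crypto, or anything else) only persist where they remain profitable at the corrected price.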
So we absolutely should pay special attention to industries that are hogging a lot of energy. Xboxes and especially vibrators use way less energy than data centers - though again, moving gaming onto PCs and developing better dumb gaming terminals to use that computing power while playing with controllers in the living room would be an absolute win for the environment.
Bruh, this is flat out a lie.
No, Xboxes do not use less power in your house than they do in a data center. Servers and data center computers (including the Xboxes powering xCloud) are typically more power-efficient when running in optimized and monitored data centers, where they are liquid-cooled with heat pumps, than when sitting in your dusty-ass house, relying on a fan and your house’s AC to cool them.
The power consumption of video games - if you add up every console while playing, every server running multiplayer and updates, and every dev machine crunching away - represents a massive amount of economically unproductive energy.
The person above is right. If you want to address the climate crisis, slap a carbon tax on pollution; don’t artificially pick and choose what you think is worthwhile based on your gut.
Environment doesn’t stop at electricity costs, it’s also about manufacturing.
A simple terminal is more efficient to produce and has way longer lifespan, removing the need to update it for many, many years.
And then you can tie it either to your existing PC (which you need anyway) or cloud (which is used by other players when you’re not playing, again reducing the need for components).
That’s what I meant there. Generally, from an energy standpoint, gaming can absolutely be made more energy-efficient if hardware makers made it a priority. You can build a gaming machine that needs 15W or 1500W, depending on how you set it up.
Yes and manufacturing an Xbox for every single household, boxing it and shipping it to them, and then having it sit unused for 90% of the time, has a much bigger carbon cost than manufacturing a fraction of the number of Xboxes, shipping them all in bulk to the same data center, and then having them run almost 24/7 and be shared amongst everyone.
And the same thing about optimizing gaming hardware is true for AI. The new NPUs in the Surface laptops can run AI models on 30W of power that my 300W GPU from two years ago cannot.
I feel like we went onto two very different planes here.
Sure, data centers are more efficient than a decentralized system, but the question is: at what point does the limitless hogging of power and resources stop making sense?
Sure, a lot of computing power goes into, say, console gaming, but that’s not what I originally talked about. I talked about data centers training AI models and requiring ever more power and hardware as compared to what we expend on gaming, first of all.
And while in gaming the requirements are more or less shaped by the improvements to the hardware, for AI training this isn’t enough, so the growth is horizontal, with more and more computing power and electricity spent.
And besides, we should ideally curb the consumption of both industries anyway.
Sure, a lot of computing power goes into, say, console gaming, but that’s not what I originally talked about. I talked about data centers training AI models and requiring ever more power and hardware as compared to what we expend on gaming, first of all.
But they don’t. Right now the GPUs powering every console, gaming PC, developer PC, graphic artist, Twitch streamer, YouTube recap, etc. consume far, far more power than LLM training.
And LLM training is still largely being done on GPUs which aren’t designed for it, as opposed to NPUs that can do so more efficiently at the chip level.
I understand the idea that AI training will always inherently consume more power because you can always train a model on bigger or more data, or train more parameters, but most uses of AI are not training - they’re just users running an existing trained model. Google’s base search infrastructure also took a lot more carbon to build initially than is accounted for when they calculate the carbon cost of an individual search.