“There’s no way to get there without a breakthrough,” OpenAI CEO Sam Altman said, arguing that AI will soon need even more energy.
Finally, some of the sci-fi bullshit he's spewing might actually lead to a good outcome
Nvidia Execs: Did you say the price of GPUs should go up?
Get rid of bitcoin and you solve the energy problem.
Can we get rid of AI too?
I would also like this
The human brain uses about 20W. Maybe AI needs to be more efficient instead?
deleted by creator
So AI can’t exist without stealing people’s content and it can’t exist without using too much energy. Why does it exist then?
Because it’s a miracle technology. Both of those things are also engineering problems - ones that have been massively mitigated already. You can run models almost as good as gpt3.5 on a phone, and individuals are pushing the limits on how efficiently we can train every week
It’s not just making a chatbot or a new tool for art - it’s also protein folding, coming up with unexpected materials, and being another pair of eyes that help a person do anything.
They literally promise the fountain of youth, autonomous robots, better materials, better batteries, better everything. It’s a path for our species to break our limits, and become more.
The downside is we don’t know how to handle it. We’re making a mess of it, but it’s not like we could stop… The AI alignment problem is dwarfed by the corporation alignment problem
deleted by creator
There’s no way these chatbots are capable of evolving into Ultron. That’s like saying a toaster is capable of nuclear fusion.
I think we’ve got a bit before we have to worry about another major jump in AI, and way longer before an Ultron. The ones we have now are effectively parsers for Google or other existing data. I personally still don’t see how we can get away with calling that AI.
Any AI that actually creates something ‘new’ that I’ve seen still requires a tremendous amount of oversight, tweaking and guidance to produce useful results. To me, they still feel like very fancy search engines.
🙄 iTS nOt stEAliNg, iTS coPYiNg
By your definition everything is stealing content. Nearly everything in human history is derivative of others work.
The models are getting smaller and more efficient very fast if you look back just a year. I bet we’ll be running small LLMs locally on our phones (I don’t really believe in the other form factors yet) sooner than we think. I’d say before 2030.
I can already locally host a pretty decent AI chatbot on my old M1 MacBook (Llama v2 7B) that writes at the same speed I can read; it’s probably already possible on top-of-the-line phones.
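A quick back-of-envelope check on why that’s plausible. The bytes-per-parameter figures are standard weight sizes for fp16 and 4-bit quantization, not measurements of any specific build, and KV cache / activations are ignored:

```python
# Rough estimate: can a 7B-parameter model's weights fit in a laptop or
# phone's RAM? Illustrative arithmetic only, not a measurement.

def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB (ignores KV cache and activations)."""
    return n_params * bytes_per_param / 2**30

params_7b = 7e9

fp16 = model_memory_gb(params_7b, 2.0)  # 16-bit weights
q4 = model_memory_gb(params_7b, 0.5)    # 4-bit quantized, as llama.cpp does

print(f"fp16:  {fp16:.1f} GiB")  # ~13 GiB - too big for an 8 GiB machine
print(f"4-bit: {q4:.1f} GiB")    # ~3.3 GiB - fits on a base M1 or flagship phone
```

Which is roughly why a quantized 7B model runs on a base M1 but the unquantized version does not.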
Lol, “old M1 laptop” 3 to 4 years is not old, damn!
(I’m running a MacBookPro5,3 (mid 2009) on Arch, lol)
But nice to hear that the M1 (and thus theoretically even the iPad, if you’re not talking about the M1 Pro / M1 Max) can already run Llama v2 7B.
Have you tried mistralAI already? It should be a bit more powerful and a bit more efficient, IIRC. And it is Apache 2.0 licensed.
3 to 4 years is not old
Huh, nice. I got the macbook air secondhand so I thought it was older. Thanks for the suggestion, I’ll try mistralAI next, perhaps on my phone as a test.
Optimizing power consumption? Why?!
Unity developers be like.
This dude Al is the new Florida Man. Wonder if it’s the same Al from Married with Children.
Some of the smartest people on the planet are working to make this profitable. It’s fucking hard.
You are dense and haven’t taken even a look at simple shit like Hugging Face. Power consumption is about the biggest topic you’ll find with anyone in the know.
In fairness, the computing world has seen unfathomable efficiency gains, pushed even further by the sudden adoption of ARM. We are doing our damnedest to make computers faster and more efficient, and we’re doing a really good job of it, but energy production hasn’t seen anywhere near those gains over the same period. The sudden widespread adoption of AI, a very power-hungry tool (because it’s basically emulating a brain in a computer), has caused a spike in the energy needed by computers that are already getting more efficient as fast as we can manage, while energy production isn’t keeping up at the same rate of innovation.
It’s not so much the hardware as it is the software and utilisation, and by software I don’t necessarily mean any specific algorithm, because I know they give much thought to optimisation strategies when it comes to implementation and design of machine learning architectures. What I mean by software is the full stack considered as a whole, and by utilisation I mean the way services advertise and make use of ill-suited architectures.
The full stack consists of general purpose computing devices with an unreasonable number of layers of abstraction between the hardware and the languages used in implementations of machine learning. A lot of this stuff is written in Python! While algorithmic complexity is naturally a major factor, how it is compiled and executed matters a lot, too.
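That overhead is easy to demonstrate even without leaving Python: the same reduction once as an interpreted bytecode loop and once delegated to a compiled builtin. A toy illustration of the abstraction-layer point, not a rigorous benchmark:

```python
import time

data = list(range(1_000_000))

# Interpreted: every addition goes through the bytecode loop and object model.
t0 = time.perf_counter()
total_loop = 0
for x in data:
    total_loop += x
t_loop = time.perf_counter() - t0

# Delegated: the same reduction runs in compiled C inside the interpreter.
t0 = time.perf_counter()
total_builtin = sum(data)
t_builtin = time.perf_counter() - t0

assert total_loop == total_builtin
print(f"loop: {t_loop:.4f}s, builtin: {t_builtin:.4f}s")
```

The builtin is typically several times faster for identical results; purpose-built hardware takes the same idea much further.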
Once AI implementations stabilise, the theoretically most energy efficient way to run it would be on custom hardware made to only run that code, and that code would be written in the lowest possible level of abstraction. The closer we get to the metal (or the closer the metal gets to our program), the more efficient we can make it go. I don’t think we take bespoke hardware seriously enough; we’re stuck in this mindset of everything being general-purpose.
As for utilisation: LLMs are not fit for, or even capable of, dealing with logical problems or anything involving reasoning based on knowledge; they can’t even reliably regurgitate knowledge. Yet, as far as I can tell, this constitutes a significant portion of their current use.
If the usage of LLMs was reserved for solving linguistic problems, then we wouldn’t be wasting so much energy generating text and expecting it to contain wisdom. A language model should serve as a surface layer – an interface – on top of bespoke tools, including other domain-specific types of models. I know we’re seeing this idea being iterated on, but I don’t see this being pushed nearly enough.[1]
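A minimal sketch of that interface-layer idea: a stand-in intent matcher (a real system would use the language model here) dispatching to bespoke domain tools. Both tools, the tiny knowledge base, and all names are made up for illustration:

```python
import ast
import operator

def solve_arithmetic(expr: str) -> str:
    # Bespoke tool: exact arithmetic, something LLMs are unreliable at.
    ops = {ast.Add: operator.add, ast.Sub: operator.sub,
           ast.Mult: operator.mul, ast.Div: operator.truediv}
    def ev(node):
        if isinstance(node, ast.BinOp):
            return ops[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("unsupported expression")
    return str(ev(ast.parse(expr, mode="eval").body))

def lookup_fact(topic: str) -> str:
    # Bespoke tool: a curated knowledge base instead of model "memory".
    kb = {"boiling point of water": "100 °C at sea level"}
    return kb.get(topic, "not in knowledge base")

def route(query: str) -> str:
    """Stand-in for the language-model front end: classify, then dispatch."""
    if any(c in query for c in "+-*/") and any(c.isdigit() for c in query):
        return solve_arithmetic(query)
    return lookup_fact(query)

print(route("12*12+1"))                 # exact arithmetic: 145
print(route("boiling point of water"))  # retrieved, not generated
```

The point being: the language layer only parses intent, and the answers come from components that actually know things.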
When it comes to image generation models, I think it’s wrong to focus on generating derivative art/remixes of existing works instead of on tools to help artists express themselves. All these image generation sites we have now consume so much power just so that artistically wanting people can generate 20 versions (give or take an order of magnitude) of the same generic thing. I would like to see AI technology made specifically for integration into professional workflows and tools, enabling creative people to enhance and iterate on their work through specific instructions.[2] The AI we have now are made for people who can’t tell (or don’t care about) the difference between remixing and creating and just want to tell the computer to make something nice so they can use it to sell their products.
The end result in all these cases is that fewer people can live off of being creative and/or knowledgeable while energy consumption spikes as computers generate shitty substitutes. After all, capitalism is all about efficient allocation of resources. Just so happens that quality (of life; art; anything) is inefficient and exploiting the planet is cheap.
For example, why does OpenAI gate external tool integration behind a payment plan while offering simple text generation for free? That just encourages people to rely on text generation for all kinds of tasks it’s not suitable for. Other examples include companies offering AI “assistants” or even AI “teachers”(!), all of which are incapable of even remembering the topic being discussed 2 minutes into a conversation. ↩︎
I get incredibly frustrated when I try to use image generation tools because I go into it with a vision, but since the models are incapable of creating anything new based on actual concepts I only ever end up with something incredibly artistically compromised and derivative. I can generate hundreds of images based on various contortions of the same prompt, reference image, masking, etc and still not get what I want. THAT is inefficient use of resources, and it’s all because the tools are just not made to help me do art. ↩︎
dude think about this stuff before you open the floodgates bro
That requires someone in business to think beyond the next quarter’s profits.
Stop mining bitcoin.
It’s called nuclear energy. It was discovered in 1932 and properly harnessed in the 1950s/1960s with an effective reactor design that consumes both radioactive material and waste (CANDU), and the newest CANDU reactors are some of the safest and most efficient energy generators in the world.
Pretending like there needs to be a larger investment into something like cold fusion in order to run these computers is incredibly dishonest or presenting a clear hole in education coverage. (The DoE should still work on researching cold fusion, but not because of this.)
Yeah, nuclear has been available and in use over the period of the sharpest increase in co2 emissions. It’s not responsible for it, but it’s not the answer. The average person can’t harness nuclear energy. But all the renewable energies in the world can fit on a small house: wind, solar, hydro. Why bring radioactive materials into this?
We have a system to distribute electricity
But why continue to rely on a system of profit that is being run like a mob, being split into distinct territories where “free market capitalism” can’t even allow us to not get gouged by profit seekers? Why not generate our own power? Why not 100% renewables? Like I said, why bring radioactive materials into this? For that matter, why bring capitalism into it?
My comment was referring to when you mentioned the average person not being able to harness nuclear energy as an argument against it.
I’m 100% for broad solar adoption and even laws forcing new homes to be built with it. The other renewables you mention aren’t harnessable by the average person either, sadly.
I think nuclear is an important tool for running clean societies. Industries need a lot of power, and I can also see mini reactors being bought by small towns for their citizens. It has its uses where renewables aren’t feasible, but the best options are solar or wind farms, and hydro, for sure.
Microsoft is actually looking at dedicated SMRs to run AI server farms, but could we fucking not?
I love nuclear but China is building them as fast as they can and they’re still being massively outpaced by their own solar installations. If we hadn’t shut down most of the research and construction in the 80’s it would have been great, but it’s not going to be a solution to the huge power requirement growth from EVs and shit like AI in the “short” term of 1-20 years.
Solar alone can’t meet humanity’s energy needs without breakthroughs in energy storage.
Most energy we use the grid for is generated on demand. That means only a few moments ago, the electricity powering your computer was just a lump of coal in a furnace.
If we don’t have the means to store enough energy to meet demands when the sun isn’t out or wind isn’t blowing, then we need more sources of energy than just sun and wind.
There is a lot of misinformation being perpetuated by the solar industry to fool people like you into thinking all investments should be directed to it over other options.
Please educate yourself before parroting industry talking points that only exist to take people for a ride.
There is growing scientific consensus that 100% renewables is the most cost effective option.
Grid storage doesn’t have the same weight limitations that EVs do, which opens up a lot more paths. Flow batteries, for one, might be all we need. They’re already gearing those up for mass production, so we don’t need any further breakthroughs (though they’re always nice if they come).
Getting to 95% is surprisingly easy; there are non-linear factors at work to getting that last 5%, but you wouldn’t need to use other sources very much at all. The wind often blows when the sun doesn’t shine. We have tons of historical weather data about how these two combine in a given region, which means we can calculate the maximum expected lull between the two. Double that amount and put in enough storage to cover it. This basic plan was simulated in Australia, and it gets there for an affordable cost.
Then we can worry about that last 5%.
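The sizing rule described above can be sketched in a few lines. The hourly generation/demand numbers here are invented for illustration, and the doubling is the safety margin the comment proposes:

```python
# Size storage from the worst contiguous renewable shortfall ("lull"),
# then double it as a safety margin. Toy data, illustrative only.

def storage_needed_mwh(generation, demand, safety_factor=2.0):
    """Worst contiguous shortfall run (MWh), scaled by a safety margin."""
    worst = current = 0.0
    for gen, dem in zip(generation, demand):
        shortfall = max(0.0, dem - gen)
        current = current + shortfall if shortfall > 0 else 0.0
        worst = max(worst, current)
    return worst * safety_factor

# Four toy hours: surplus, then a two-hour lull (30 + 50 MWh short), then surplus.
gen = [100, 60, 40, 100]
dem = [80, 90, 90, 80]
print(storage_needed_mwh(gen, dem))  # worst lull 80 MWh, doubled -> 160.0
```

With decades of real historical weather data in place of the toy arrays, the same calculation gives the regional storage target the comment describes.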
Nuclear advocates have been using the same talking points since the 90s, and have missed how the economics have been swept out from underneath them.
Supplying energy isn’t only doing what’s “cost effective.” It’s about meeting demand.
This is why when suppliers have difficulty meeting demand, prices go up.
If we only did what was the cheapest instead of what was required to meet demand, then our demands wouldn’t be met and we would be without energy during those times.
If only we could convert empty hype into energy.
Well, we can; we had a “jumpstyle” wave going on in the Netherlands a couple of years ago. No clue if it ever got off the ground anywhere else, seeing as it was a techno thing or something.
It’s like crypto but sliiiightly better
How about an efficiency breakthrough instead? Our brains just need a meal and can recognize a face without looking at billions of others first.
I mean, we can only do that because our system was trained over hundreds of thousands, even millions of years to be able to recognise others of the same species
But the training has already been done, no?
Almost all of our training was done without requiring burning fossil fuels. So maybe ole Sammy can put the brakes on his shit until it’s as fuel efficient as a human brain.
Food production and transport is famously a zero emission industry.
We’ve been around for hundreds of thousands of years as Homo sapiens. Food production and transport emissions were practically zero until the last 100 years. So, yes, that’s right.
We still need to look at quite a few. And the other billions have been pre-programmed by a couple of billion years of evolution.
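For scale, the brain-versus-model comparison in this sub-thread can be put in rough numbers. The ~1,287 MWh figure for training GPT-3 is a commonly cited outside estimate, and the 20 W / 20-year values are round assumptions, so treat this as order-of-magnitude only:

```python
# Order-of-magnitude comparison: a human brain's lifetime "training" energy
# vs. one large model training run. All inputs are rough assumptions.

HOURS_PER_YEAR = 24 * 365

brain_watts = 20   # often-quoted resting power draw of a human brain
years = 20         # rough "training period" of a human to adulthood
brain_mwh = brain_watts * HOURS_PER_YEAR * years / 1e6  # Wh -> MWh

gpt3_training_mwh = 1287  # widely cited estimate, not a measured value

print(f"20 years of a brain:     ~{brain_mwh:.1f} MWh")
print(f"one GPT-3 training run: ~{gpt3_training_mwh} MWh")
print(f"ratio: ~{gpt3_training_mwh / brain_mwh:.0f}x")
```

Even granting the evolutionary "pre-training" argument, the per-individual gap is a few hundredfold, which is the efficiency headroom the comment is pointing at.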
Pocket nuke plants… have to be the stopgap between here and fusion. Are there still people working on those car-sized nuke plants for a more distributed system?
Exactly. This is why the AI hype train is overblown. Stop shoving “AI” everywhere when they know it’ll cost a lot in electricity.
The real path forwards with AI will be specialized super advanced models costing hundreds per run (business use case) and/or locally run AI using NPUs, especially the latter.
deleted by creator
Yes we will build a massive nursing home and use the old people as batteries.