Is there any way to make it use less as it gets more advanced, or will there be huge power plants just dedicated to AI all over the world soon?
It’s kinda interesting how the most power-consuming uses of graphics chips — crypto and AI/ML — have nothing to do with graphics.
(Except for AI-generated graphics, I suppose.)