
masterspace,

Sure, a lot of computing power goes into, say, console gaming, but that's not what I originally talked about. I talked about data centers training AI models and requiring ever more power and hardware compared to what we expend on gaming, first of all.

But they don't. Right now the GPUs powering every console, gaming PC, developer PC, graphic artist's workstation, Twitch streamer, YouTube recap, etc. consume far, far more power than LLM training.
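To make that scale comparison concrete, here's a back-of-envelope sketch. Every number in it is a hypothetical, order-of-magnitude placeholder I'm assuming for illustration, not a sourced measurement:

```python
# Back-of-envelope: aggregate consumer-GPU energy per year vs. one large
# LLM training run. All figures below are assumed placeholders.

gaming_gpus = 200e6    # assumed active consumer GPUs worldwide
avg_power_w = 150      # assumed average draw per GPU while in use, watts
hours_per_day = 2      # assumed average daily usage

# Annual consumer-GPU energy, converted from watt-hours to gigawatt-hours
gaming_gwh = gaming_gpus * avg_power_w * hours_per_day * 365 / 1e9

training_gwh = 50      # assumed energy for one frontier-model training run, GWh

print(f"consumer GPUs: ~{gaming_gwh:,.0f} GWh/yr")   # ~21,900 GWh/yr
print(f"one training run: ~{training_gwh} GWh")
print(f"ratio: ~{gaming_gwh / training_gwh:,.0f}x")  # ~438x
```

Even if the assumed inputs are off by a factor of a few in either direction, the aggregate consumer fleet comes out orders of magnitude above a single training run, which is the shape of the argument above.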

And LLM training is still largely being done on GPUs which aren't designed for it, as opposed to NPUs that can do so more efficiently at the chip level.

I understand the idea that AI training will always inherently consume power, because you can always train a model on bigger or more data, or train more parameters. But most uses of AI are not training; they're just users querying an existing trained model. Google's base search infrastructure also took a lot more carbon to build initially than is accounted for when they calculate the carbon cost of an individual search.
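The amortization point works the same way for a model: a fixed training cost spread over enough queries shrinks toward the marginal inference cost per query. A quick sketch, with all numbers assumed purely for illustration:

```python
# Amortizing a fixed training energy cost over inference queries.
# Both constants are assumed placeholders, not measured figures.

TRAINING_WH = 50e9   # assumed training cost: 50 GWh, expressed in Wh
PER_QUERY_WH = 0.3   # assumed marginal energy per inference query, Wh

def amortized_wh_per_query(total_queries: float) -> float:
    """Marginal inference energy plus this query's share of training cost."""
    return PER_QUERY_WH + TRAINING_WH / total_queries

for n in (1e9, 1e10, 1e11):
    print(f"{n:.0e} queries -> {amortized_wh_per_query(n):.2f} Wh/query")
# 1e9 queries  -> 50.30 Wh/query
# 1e10 queries -> 5.30 Wh/query
# 1e11 queries -> 0.80 Wh/query
```

Past a few tens of billions of queries, the training term is a rounding error next to the per-query inference cost, which is the same accounting quirk as attributing Google's infrastructure build-out to individual searches.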
