
snekerpimp:

Picked up an AMD Instinct MI25 to try and do just that. Can get Easy Diffusion working after some cussing and voodoo, but cannot get ROCm to run any LLM of any kind. Feels like a waste of video RAM.

Also have a Tesla P4 that runs most text-to-image models rather well, but I’ve been unsuccessful with any LLM there too; even oobabooga can’t seem to run on it.

I’ve given up because the software stack keeps advancing and leaving my hardware behind. I don’t have $3000 for an A100 or $1300 for an MI100, sooo… until the models can run on older/less powerful hardware, I’m probably sitting out of this game, even though I’d love to be elbow deep in it.
