
Greg (@Greg@lemmy.ca):

That's 128GB of RAM, and the GPU has 24GB of VRAM. Ollama has gotten pretty smart about resource allocation: smaller models fit solely in VRAM, but I can still run larger models by spilling over into system RAM.
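The split described above comes down to simple arithmetic: model weights take roughly (parameters × bits-per-weight ÷ 8) bytes, and whatever doesn't fit in VRAM gets offloaded to system RAM. A rough sketch of that sizing logic, with a hypothetical `placement` helper (the numbers ignore KV-cache and runtime overhead, so real headroom is smaller):

```python
def model_footprint_gb(params_billion, bits_per_weight):
    """Approximate weight size in GB: params * (bits / 8) bytes."""
    return params_billion * bits_per_weight / 8

def placement(params_billion, bits_per_weight, vram_gb=24, ram_gb=128):
    """Hypothetical helper deciding where weights land, mirroring the
    VRAM-first, spill-to-RAM behavior described in the comment."""
    size = model_footprint_gb(params_billion, bits_per_weight)
    if size <= vram_gb:
        return "GPU"            # whole model fits in VRAM
    elif size <= vram_gb + ram_gb:
        return "GPU+CPU"        # layers split between VRAM and system RAM
    return "too large"

print(placement(8, 4))    # 8B model at 4-bit quantization: ~4GB, fits in VRAM
print(placement(70, 8))   # 70B at 8-bit: ~70GB, spills into system RAM
```

With 24GB of VRAM plus 128GB of RAM, even ~70B-class quantized models are runnable, just slower once layers live on the CPU side.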
