rs137, 4 months ago Llama 2 70B with 8-bit quantization takes around 80GB of VRAM, if I remember correctly. I tested it a while ago.
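That figure lines up with a simple back-of-envelope estimate: weight memory is roughly parameter count times bytes per weight, plus some overhead for activations and the KV cache. A minimal sketch (the helper name and the ~15% overhead factor are illustrative assumptions, not from any library):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 0.15) -> float:
    """Rough VRAM estimate: weights plus a fractional overhead
    for activations / KV cache. Hypothetical helper for illustration."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# Llama 2 70B at 8 bits/weight: ~70 GB of weights,
# roughly 80 GB once overhead is included.
print(f"{estimate_vram_gb(70, 8):.0f} GB")
```

At 4-bit quantization the same arithmetic lands near 40 GB, which is why 70B models become feasible on dual-24GB-GPU setups only at lower bit widths.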