Avatar_of_Self ,

It really depends on what model you want to run and how large it is. You can run pretty much any model if you have enough disk space, but a GPU with enough VRAM is preferred if you want ChatGPT-like response speed. Running on an older CPU with system RAM will be noticeably slower, especially with larger models that have more parameters to work through.
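As a rough rule of thumb (my own back-of-envelope estimate, not from any particular tool's docs), the weights alone take about parameter count × bits per weight ÷ 8 bytes of RAM or VRAM, with runtime overhead like the KV cache coming on top:

```python
def approx_weights_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Rough size of the model weights alone, in GB.

    Ignores runtime overhead (KV cache, activations), which adds more
    memory on top of this figure.
    """
    bytes_per_weight = bits_per_weight / 8
    return n_params_billion * 1e9 * bytes_per_weight / 1e9

# A 7B model quantized to 4 bits needs roughly 3.5 GB for weights alone,
# while the same model at full 16-bit precision needs about 14 GB.
print(approx_weights_gb(7, 4))   # 3.5
print(approx_weights_gb(7, 16))  # 14.0
```

This is why quantized models are the usual choice on consumer hardware: dropping from 16-bit to 4-bit weights cuts memory use by about 4x, at some cost in response quality.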

There are some pretty lightweight models out there, but the responses will be more bare-bones and will probably seem 'less informed'.

Give GPT4All a try for your first time. It makes installation, configuration, and usage point-and-click and fairly straightforward. For its featured models, it shows a short summary and the recommended VRAM, and there are many, many other models available from inside the UI.
