
Fingerthief OP (@Fingerthief@infosec.pub)

Local models are indeed already supported! In fact, any API (local or otherwise) that uses the OpenAI response format (the de facto standard) will work.
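For reference, here's a minimal sketch of a request in that format; the endpoint URL, API key, and model id below are placeholders, not values from this app:

```typescript
// Minimal sketch of an OpenAI-format chat-completions request.
// Any server that speaks this shape should work; the base URL,
// key, and model id here are placeholder assumptions.
async function chat(baseUrl: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Many local servers ignore the key but still expect the header.
      "Authorization": "Bearer sk-placeholder",
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  // Standard OpenAI response shape: choices[0].message.content
  return data.choices[0].message.content;
}
```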

So you can use something like LM Studio to host a model locally and connect to it via the local API it spins up.
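As a concrete example (assuming LM Studio's default local server port of 1234, which is configurable in the app), pointing the helper above at the local endpoint is just a base-URL change:

```typescript
// LM Studio's built-in server exposes an OpenAI-compatible API,
// by default at http://localhost:1234/v1.
// The model id is a placeholder; use whatever model you loaded.
const reply = await chat(
  "http://localhost:1234/v1",
  "local-model",
  "Summarize WebGPU in one sentence."
);
console.log(reply);
```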

If you want to get crazy... fully local in-browser models are also supported, currently in Chrome and Edge. The app downloads the selected model in full and runs it on your GPU via the browser's WebGPU API so you can chat with it. It's more experimental and takes real hardware power, since you're hosting the entire model in your browser itself. As seen below.

https://infosec.pub/pictrs/image/c55a4dd0-8f77-43d4-9486-52a038abc0e6.png
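For anyone curious why this mode is Chrome/Edge-only for now, here's a hedged sketch of the WebGPU feature check a page can run before trying to download a model; `loadBrowserModel` is a hypothetical placeholder, not this app's actual function:

```typescript
// Feature-detect WebGPU before downloading a model to run fully in the
// browser; navigator.gpu is the standard entry point (Chrome/Edge today).
// In TypeScript, install @webgpu/types for proper typings; the cast below
// keeps this sketch self-contained.
async function canRunBrowserModel(): Promise<boolean> {
  const gpu = (navigator as any).gpu;
  if (!gpu) return false;                     // browser lacks WebGPU
  const adapter = await gpu.requestAdapter(); // may be null on weak hardware
  return adapter !== null;
}

if (await canRunBrowserModel()) {
  // loadBrowserModel() is a hypothetical placeholder for whatever runtime
  // downloads the selected model and runs it on the GPU via WebGPU.
  // await loadBrowserModel("selected-model");
} else {
  console.warn("WebGPU unavailable; use a remote or local API instead.");
}
```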
