
abhibeckert ,

Future systems could for example start asking questions more often

Current systems already do that, but they're expensive, and it might be cheaper to have a human do it. Prompt engineering is very much a thing if you're working with high-performance, low-memory language models.
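
As a rough illustration of that kind of prompt engineering (just a sketch, not anything from a specific product), here's how you might nudge a small local model to ask a clarifying question instead of guessing. The model file name is a placeholder, and the llama-cpp-python usage is an assumption for the example:

```python
# Sketch: steer a small on-device model toward asking clarifying questions.
# "small-model.gguf" is a hypothetical local model file.
from llama_cpp import Llama

llm = Llama(model_path="small-model.gguf", n_ctx=2048)

messages = [
    {"role": "system",
     "content": ("You are an assistant running on a low-memory device. "
                 "If the user's request is ambiguous, ask one short "
                 "clarifying question before answering.")},
    {"role": "user", "content": "Set a reminder for the meeting."},
]

# With an ambiguous request like this, the system prompt pushes the model
# to respond with something like "Which meeting, and at what time?"
result = llm.create_chat_completion(messages=messages, max_tokens=64)
print(result["choices"][0]["message"]["content"])
```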

We're a long way from having smartphones with a couple of terabytes of RAM and a few thousand GPU cores... but our phones can run basic models, and they already do. Some phones use a small LLM for keyboard autocorrect, for example.
