
yoshi:

We use words to describe our thoughts and understanding. LLMs order words by following algorithms that predict what the user wants to hear. They don't understand the meaning or implications of the words they return.
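As a toy illustration of what "ordering words by prediction" means (this is a made-up count-based sketch, not how any real LLM is built internally; real models use neural networks over tokens, but the same point applies): the table below only records which word tends to follow which, so it can string together plausible-looking sentences about apples without any notion of what an apple is. The names `corpus`, `following`, and `next_word` are purely illustrative.

```python
# Toy next-word predictor: pure statistics over a tiny corpus, zero understanding.
from collections import Counter, defaultdict

corpus = "people eat apples . apples are fruit . people like fruit .".split()

# Count how often each word follows each other word (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(word):
    # Pick the most frequent follower; there is no meaning involved, only counts.
    return following[word].most_common(1)[0][0]

word = "people"
for _ in range(5):
    print(word, end=" ")
    word = next_word(word)
print()
```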

It can tell you the definition of an apple, how many people eat apples, or whatever apple data it was trained on, but it has no thoughts of its own about apples.

That's the point OOP was making. People confuse ordering words with understanding. It has no understanding of anything. It's a large language model; it's not capable of independent thought.
