
Ephera,

The problem is that the training data is biased and these AIs pick up on biases extremely well and reinforce them.

For example, people of color tend to post fewer pictures of themselves on the internet, mostly because remaining anonymous is preferable to experiencing racism.
So, if you've then got a journalistic picture, like from the food banks mentioned in the article, it will contain relatively many people of color compared to what the AI has seen in the rest of its training data.
As a result, the model will learn that one of the defining features of what a food bank looks like is that there are people of color there.

To try to combat these biases, the band-aid fix is to prefix your query with instructions to generate diverse pictures. As in, literally prefix: the service simply puts words in your mouth before passing the prompt to the model (which is industry standard).
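The mechanism is roughly this simple. Here's a minimal sketch; the prefix wording and function names are invented for illustration, not any vendor's actual text:

```python
# Illustrative sketch of server-side prompt prefixing.
# The prefix text below is made up; real services use their own (hidden) wording.
DIVERSITY_PREFIX = "Depict people of diverse ethnicities and genders. "

def build_prompt(user_prompt: str) -> str:
    # The service silently prepends its instructions to whatever the user typed,
    # and the image model only ever sees the combined string.
    return DIVERSITY_PREFIX + user_prompt

combined = build_prompt("a photo of volunteers at a food bank")
print(combined)
```

The user never sees the combined prompt, which is why the results can feel inexplicably skewed: the model is faithfully following instructions the user didn't write.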
