
7heo,

I would have my employer sign a legal waiver stating that, from the moment I use the technology, none of the recordings or transcriptions of me can be used to incriminate me in case of alleged malpractice.

In fact, since both the recordings and the transcriptions are generated (or can be generated) in a way that sounds very assertive while also containing incredibly wild mistakes, in a potentially life-and-death situation they would have to legally recognise that this could nullify my work, and take the entire legal responsibility for it.

As you can see in the recent example involving Air Canada, a policy was invented out of thin air, and that policy is now costing the company. In the case of a doctor, if the wrong sedative or medication were administered, or the wrong diagnosis were communicated to the patient, the consequences could be serious.

All of it sounding like you (using your phrasings, etc.) and being extremely assertive.

A human doing that job would know not to deviate from the recording. An AI? "Antihistaminic" and "anti-asthmatic" aren't too far apart, and that is just one example off the top of my head.
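To give a rough sense of how close those two terms are, here is a minimal sketch (my own illustration, not anything from the thread) that compares them with a character-level similarity ratio from Python's standard library; treat it as a crude stand-in for the acoustic similarity a speech-to-text model would actually be dealing with.

```python
# Minimal sketch: character-level similarity as a crude proxy for how easily
# a transcription model could confuse two similar-looking medical terms.
# The terms are the ones from the comment above; using difflib's
# SequenceMatcher is just an assumption made for illustration.
from difflib import SequenceMatcher

term_a = "antihistaminic"
term_b = "anti asthmatic"

ratio = SequenceMatcher(None, term_a, term_b).ratio()
print(f"similarity({term_a!r}, {term_b!r}) = {ratio:.2f}")
```

Even this naive string comparison shows substantial overlap (shared "anti" prefix, shared "ic" ending) between two terms with very different clinical meanings, and real speech recognition operates on audio, where they are arguably even easier to confuse.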
