• Xulai@mander.xyz
    11 days ago

    As someone who works on integrating AI: it’s failing badly.

    At best, it’s good for transcription, at least until it hallucinates and adds things to your medical record that don’t exist. Which it does. And when providers don’t check for errors, which few do regularly, congrats: you now have a medical record of whatever it hallucinated today.

    And they are no better than answering machines for customer service. Sure, they can answer basic questions, but so can the automated phone systems.

    They can’t consistently do anything more complex without making errors, and most people are frankly too dumb or lazy to properly verify outputs. And that’s why this bubble is so huge.

    It is going to pop, messily.

    • Laser@feddit.org
      11 days ago

      and most people are frankly too dumb or lazy to properly verify outputs.

      This is my main argument. I need to check the output for correctness anyway, so I might as well just do the work myself in the first place.