Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn’t ready to take on the role of the physician.”

“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”

  • rumba@lemmy.zip
    9 hours ago

    1/2: You still haven’t accounted for bias.

    Apparently, reading comprehension isn’t your strong point. I’ll just block you now, no need to thank me.

    • XLE@piefed.socialOP
      8 hours ago

      Ironic. If only you had read a couple more sentences, you could have proven the naysayers wrong and unleashed a never-before-seen unbiased AI on the world.