this post was submitted on 26 Aug 2023
397 points (85.6% liked)

Technology
ChatGPT generates cancer treatment plans that are full of errors — Study finds that ChatGPT provided false information when asked to design cancer treatment plans. Researchers at Brigham and Women's Hospital found that cancer treatment plans generated by OpenAI's revolutionary chatbot were full of errors.

[–] Pyr_Pressure@lemmy.ca 35 points 1 year ago (1 children)

ChatGPT is a language model / chatbot, not a doctor. Has anyone claimed that it's a doctor?

[–] Agent641@lemmy.world 7 points 1 year ago (1 children)

ChatGPT fails at basic math, and lies about the existence of technical documentation.

I mostly use it for recipe inspiration and discussing books I've read recently. Just banter, you know? Nothing mission-critical.

[–] IDontHavePantsOn@lemm.ee 4 points 1 year ago (1 children)

Just a couple of days ago it repeatedly told me it was possible to re-tile a broken part of my shower without cutting tiles, but none of the math added up (an 18.5"H x 21.5"W area): "Place a 9" tile vertically. Place another 9" tile vertically on top on the same side. Place another 9" tile on top vertically to cover the remainder of the area."

I told ChatGPT it was wrong, which it admitted, and it spit out another wrong answer. I tried specifying a few more times before I started a new chat and dumbed it down to a simple math problem. The first part of the chat said it was possible, laid out the steps, and then said it wasn't possible in the last sentence.
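The tile arithmetic above is easy to check directly (a quick sketch using the dimensions from the comment; the variable names are just for illustration):

```python
# ChatGPT's suggestion: stack three 9" tiles vertically in an 18.5"-tall area.
tile_height = 9.0      # height of one tile, in inches
area_height = 18.5     # height of the shower area, in inches

stack_height = 3 * tile_height  # total height of three stacked tiles

print(stack_height)                  # 27.0
print(stack_height <= area_height)   # False: 27" can't fit in 18.5"
```

Even two stacked tiles (18") leave a 0.5" gap, so no whole-tile vertical layout covers the area exactly, which is why every answer it gave failed.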

I surely wouldn't trust ChatGPT to advise on my healthcare, but after seeing it spit out very wrong answers to a basic math question, I'm just wondering why anyone would have it advise anyone's healthcare.

[–] Brandon658@lemmy.world 1 points 1 year ago (1 children)

People want to trust it as a source of quick knowledge. It is easier to be told that 9 goes into 81 a total of 8 times, trusting that the computer is always right because it has access to everything, than to work out that the answer given was wrong and is actually 9.

Think of WebMD. People love to self-diagnose despite it commonly being known as bad practice. But they do so because it's less effort, faster, and cheaper than making an appointment, driving to an office, and speaking with a doctor who runs a few tests and gets back to you in a week saying they aren't sure and need to do that process all over again.

[–] raptir@lemm.ee 1 points 1 year ago

The healthcare issue is that I'm usually checking WebMD to see if what I'm experiencing is an actual issue that I need to go to the doctor for since it's so expensive to go to a doctor.