this post was submitted on 26 Aug 2023
397 points (85.6% liked)

Technology


ChatGPT generates cancer treatment plans that are full of errors — Study finds that ChatGPT provided false information when asked to design cancer treatment plans::Researchers at Brigham and Women's Hospital found that cancer treatment plans generated by OpenAI's revolutionary chatbot were full of errors.

(page 2) 50 comments
[–] MrSlicer@lemmy.world 6 points 1 year ago

So does my dog; this isn't news.

[–] alienanimals@lemmy.world 5 points 1 year ago

Clickbait written by someone who doesn't understand the technology. I guess they give out journalism degrees to anyone who can write a top-10 BuzzFeed article.

[–] shotgun_crab@lemmy.world 4 points 1 year ago

Why would you even consider using ChatGPT for this?

[–] Sanctus@lemmy.world 2 points 1 year ago

I thought it released in 2021. Maybe it was on the cusp. I was basically using it to find what I couldn't seem to find in the docs. It's definitely replaced my rubber duck, but I still have to double-check it after my Unity experience.
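The double-checking habit described above can be made mechanical: before trusting a generated snippet, run it against cases you already know the answer to. A minimal sketch in Python (the `generated_clamp` function is a hypothetical stand-in for whatever the model produced; all names here are illustrative):

```python
# Hypothetical stand-in for a function suggested by ChatGPT.
def generated_clamp(value, low, high):
    return max(low, min(value, high))

# Cases whose answers you already know, used to sanity-check the suggestion.
known_cases = [
    ((5, 0, 10), 5),    # in range: unchanged
    ((-3, 0, 10), 0),   # below range: clamped up
    ((42, 0, 10), 10),  # above range: clamped down
]

def double_check(fn, cases):
    """Return the cases where fn disagrees with the expected answer."""
    return [(args, expected, fn(*args))
            for args, expected in cases
            if fn(*args) != expected]

failures = double_check(generated_clamp, known_cases)
print("all good" if not failures else f"mismatches: {failures}")  # prints "all good"
```

It's crude, but it turns "I have to double-check it" from a vague feeling into a two-minute habit.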

[–] drekly@lemmy.world 2 points 1 year ago

It speeds things up for people who know what they're talking about. The doctor asking for the plan could probably argue a few of the errors and GPT will say "oh you're right, I'll change that to something better" and then it's good to go.

Yes, you can't just rely on it to be right all the time, but you can often use it to find the right answer through a short conversation, which is quicker than doing it alone.

I recently won a client with GPT's help in my industry.

I personally think I'm very knowledgeable in what I do, but to save time I asked what I should be looking out for, and it gave me a long list of areas to consider in a proposal. That list alone was a great starting block. Some of it wasn't relevant to me or the client and had to be ignored, but the majority was solid and started me out an hour ahead, essentially tackling the planning stage for me.

To someone outside of my industry, if they used that list verbatim, they would have brought up a lot of irrelevant information and covered topics that would make no sense.

I feel it's a tool or partner rather than a replacement for experts. It helps me get to where I need to go quicker, and it's fantastic at brainstorming ideas or potential issues in plans. It takes some of the pressure off as I get things done.
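The "argue back and let it correct itself" workflow described in this comment can be sketched as a small review loop. The real model call is replaced here by a canned stub (`ask_model` is hypothetical; a production version would call an actual chat API and keep the same message history):

```python
def ask_model(messages):
    """Stub standing in for a chat-completion API call.
    Returns a canned draft first, then a 'corrected' draft after pushback."""
    pushed_back = any(m["role"] == "user" and "that's wrong" in m["content"]
                      for m in messages[1:])
    return "corrected draft" if pushed_back else "first draft (with an error)"

def review_loop(prompt, reviewer):
    """Ask, let a human expert flag problems, then ask again with the objection."""
    messages = [{"role": "user", "content": prompt}]
    draft = ask_model(messages)
    objection = reviewer(draft)  # the expert spots the error, or returns None
    if objection:
        messages += [{"role": "assistant", "content": draft},
                     {"role": "user", "content": objection}]
        draft = ask_model(messages)
    return draft

result = review_loop(
    "draft a proposal outline",
    reviewer=lambda d: "that's wrong, fix section 2" if "error" in d else None,
)
print(result)  # prints "corrected draft"
```

The point of the sketch: the loop only converges on something usable because the `reviewer` is an expert who can spot the error, which is exactly the commenter's "tool or partner, not a replacement" argument.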

[–] GBU_28@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

No one is building a document-traversal LLM in the healthcare space with off-the-shelf tools.
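For context, "off-the-shelf" here usually means a generic retrieval layer bolted onto a general-purpose model. Even a toy version of the retrieval half, naive keyword overlap, illustrative only and nothing like what a clinical system would require, fits in a few lines of pure Python:

```python
def tokenize(text):
    """Lowercased bag of words; real systems use embeddings and proper ranking."""
    return set(text.lower().split())

def retrieve(query, documents):
    """Return the document with the largest word overlap with the query."""
    q = tokenize(query)
    return max(documents, key=lambda doc: len(q & tokenize(doc)))

docs = [
    "dosage guidelines for common analgesics",
    "scheduling follow-up appointments",
    "radiation therapy planning overview",
]
print(retrieve("radiation planning", docs))  # prints "radiation therapy planning overview"
```

The gap between this baseline and a system safe enough for treatment planning is the commenter's whole point.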

[–] quadropiss@lemmy.world 2 points 1 year ago

😱😱😱😱😱😱😱😱😱 /j
