this post was submitted on 04 Jul 2023
54 points (95.0% liked)

ChatGPT


Unofficial ChatGPT community to discuss anything ChatGPT

[–] memmytesting1@lemm.ee 7 points 1 year ago (1 children)

The worst part is the way it states these things as fact instead of hedging with something like “so you should be…”. That definitely tricks people. Obviously this one is easy to spot, but it does the same thing whether it's right or wrong, and if you have no idea about a topic you'll just believe what you're reading is correct.

I only ever use it for quick things that I have a pretty strong grasp on but still need a little push over the hill to solve a particular problem. Almost always I get a response that's only half correct at best, but there's usually something in the response I can pull out and use to solve the issue I'm having.

[–] flimsyberry@lemmy.world 1 point 1 year ago

I think that's a pretty good approach. It would benefit everyone to see ChatGPT more as an approximation/guessing machine. It often hits the mark, or at least gets really close, and its bigger hallucinations are frequently hilarious.