this post was submitted on 12 Jan 2025
1167 points (98.1% liked)

memes
you are viewing a single comment's thread
[–] Ilovethebomb@lemm.ee 49 points 6 days ago (37 children)

Surely they've thought about this, right?

[–] Tyfud@lemmy.world 17 points 6 days ago (3 children)

It's fake. LLMs don't execute commands on the host machine. They generate text as a response, but they never have access to, or the ability to execute, arbitrary code in their environment.

[–] Ziglin@lemmy.world 5 points 6 days ago (1 children)

Some are allowed to, by (I assume) generating some prefix that tells the environment to run the following statement. ChatGPT seems to have something similar, but I haven't tested it, and I doubt it runs terminal commands or has root access. I assume it's a funny coincidence that the error popped up then, or it was indeed faked for some reason.
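[Editor's note: the mechanism described above — the model emitting a special prefix that the environment intercepts and executes — can be sketched roughly as below. The `EXEC:` prefix and the tool names are hypothetical, not any real product's protocol; real systems use structured tool-call messages and sandboxing, but the dispatch idea is the same: the model only generates text, and the surrounding environment decides whether to run anything.]

```python
# Hypothetical sketch: an environment that scans model output for a
# tool-use prefix and dispatches to a whitelist, instead of a shell.
# "EXEC:" and the tool names are assumptions for illustration only.

ALLOWED_TOOLS = {
    "add": lambda a, b: str(int(a) + int(b)),
    "upper": lambda s: s.upper(),
}

def handle_model_output(text: str) -> str:
    """Pass ordinary text through; run 'EXEC: tool arg...' via the whitelist."""
    if not text.startswith("EXEC:"):
        return text  # plain generated text: nothing is ever executed
    name, *args = text[len("EXEC:"):].split()
    tool = ALLOWED_TOOLS.get(name)
    if tool is None:
        # e.g. the model "asking" for rm -rf is simply refused
        return f"refused: unknown tool {name!r}"
    return tool(*args)
```

Under this design the model has no root access and no terminal: it can only request one of the whitelisted operations, which is consistent with the comment's doubt that anything like `rm -rf` could actually run.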

[–] Venator 2 points 2 days ago

faked for some reason.

Comedy: the reason is comedy.
