this post was submitted on 20 Feb 2025
TechTakes

1638 readers
51 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] swlabr@awful.systems 19 points 1 day ago* (last edited 1 day ago) (1 children)

Why? Per the poll: “a lack of reliability.” The things being sold as “agents” don’t … work.

Vendors insist that the users are just holding the agents wrong. Per Bret Taylor of Sierra (and OpenAI):

Accept that it is imperfect. Rather than say, “Will AI do something wrong”, say, “When it does something wrong, what are the operational mitigations that we’ve put in place to deal with it?”
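
Taken at face value, Taylor's "operational mitigations" advice boils down to wrapping the unreliable agent in retry, validation, and fallback logic. A minimal sketch of that pattern (every name here is hypothetical and the failure model is made up; this is not any vendor's actual API):

```python
import random

def unreliable_agent(task: str) -> str:
    # Stand-in for an LLM "agent" call; fails nondeterministically,
    # which is the whole problem being papered over.
    if random.random() < 0.3:
        return "garbage"
    return f"done: {task}"

def is_valid(result: str) -> bool:
    # Application-specific output check; here, just reject known-bad output.
    return result.startswith("done:")

def mitigated_call(task: str, retries: int = 3) -> str:
    # The "operational mitigation": retry a few times, validate each
    # result, and fall back to a human queue when the agent keeps failing.
    for _ in range(retries):
        result = unreliable_agent(task)
        if is_valid(result):
            return result
    return f"escalated to human: {task}"
```

Note that the fallback path still ends at a human, which somewhat undercuts the sales pitch.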

I think this illustrates the state of the LLM market pretty well: not just at the shallow level of the base incentives of the parties involved, but at a deeper level, showing the general lack of humanity and the tolerance for dogshit that the AI companies are trying to brainwash everyone into accepting.

[–] andrew_bidlaw@sh.itjust.works 14 points 1 day ago

It's not unlike medical supplements before regulation separated them from real drugs and required labels marking them as mere supplements rather than panaceas.

ATTENTION: LLM is not a real worker. Consult a competent manager before firing everyone around you.

Snake oil wouldn't be a miscomparison on any point.