this post was submitted on 06 Aug 2023
266 points (92.4% liked)

Technology

Yes! This is a brilliant explanation of why language use is not the same as intelligence, and why LLMs like ChatGPT are not intelligent. At all.

[–] Peanutbjelly@sopuli.xyz 10 points 1 year ago (2 children)

This whole thread is absurd.

Chatgpt has a form of intelligence depending on your definition of intelligence. It may also be considered conscious in a very alien and undeveloped way. It is definitely not sentient.

Kind of like having the stochastic word-generating part of a brain and nothing else.
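To make the "stochastic word generating" idea concrete: at each step an LLM produces a probability distribution over possible next tokens and samples from it. Here is a deliberately tiny sketch of that sampling loop using a hypothetical hand-written bigram table (real models condition on the whole context with a transformer, not a lookup table, and the table below is made up for illustration):

```python
import random

# Hypothetical toy transition table: for each word, a list of
# (next_word, probability) pairs. Not from any real model.
transitions = {
    "the":  [("cat", 0.5), ("dog", 0.3), ("idea", 0.2)],
    "cat":  [("sat", 0.6), ("ran", 0.4)],
    "dog":  [("ran", 0.7), ("sat", 0.3)],
    "idea": [("sat", 0.5), ("ran", 0.5)],
    "sat":  [("down", 1.0)],
    "ran":  [("away", 1.0)],
}

def generate(start, max_words=5, seed=0):
    """Repeatedly sample the next word from the distribution
    conditioned on the previous word."""
    rng = random.Random(seed)
    words = [start]
    while words[-1] in transitions and len(words) < max_words:
        choices, weights = zip(*transitions[words[-1]])
        words.append(rng.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))
```

The point of the analogy: this loop produces fluent-looking output without any model of the world behind it; scaling the conditioning up to a transformer changes how good the distribution is, not the basic sample-the-next-word mechanism.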

You can still shape it into something capable of intelligent and directed activity.

People are really bad at accepting the level of nuance necessary for this topic.

It is useful and fantastic for what it already is. People are just really bad at understanding what it is.

[–] FaceDeer@kbin.social 9 points 1 year ago (3 children)

A lot of people are deeply invested in the notion that human intelligence is unique and special and impossible to replicate. Either their personal sense of worth is bound up in that notion (see for example many of the artists who get very angry when people call AI generated images "art") or it's simply a threat to their jobs and economic wellbeing. The result is a powerful need to convince themselves that there's a special something that's missing from ChatGPT and its ilk that will "never" be replicated by machines.

It's true that ChatGPT isn't intelligent in the same way that human brains are intelligent. But it is intelligent, in ways that are useful. And "never" is a bad bet to make for the rest of those capabilities.

[–] kaffiene@lemmy.world 1 point 1 year ago (1 children)

ChatGPT is not intelligent. Not in the sense in which we use that word anywhere else, including for the animal kingdom. The transformer is an extraordinarily clever and sophisticated algorithm, though.

[–] FaceDeer@kbin.social 5 points 1 year ago

As I said:

It's true that ChatGPT isn't intelligent in the same way that human brains are intelligent.

There isn't just one kind of intelligence.

[–] rob64@startrek.website 1 point 1 year ago* (last edited 1 year ago)

My sense in reading the article was not that the author thinks artificial general intelligence is impossible, but that we're a lot farther away from it than recent events might lead you to believe. The whole article is about the human tendency to conflate language ability and intelligence, and the author is making the argument both that natural language does not imply understanding of meaning and that those financially invested in current "AI" benefit from the popular assumption that it does. The appearance or perception of intelligence increases the market value of AIs, even if what they're doing is more analogous to the actions of a very sophisticated parrot.

Edit: all of which is to say, I don't think the article is asserting that true AI is impossible, just that there's a lot more to it than smooth language usage. I don't think she'd say never, but probably that there's a lot more to figure out—a good deal more than some seem to think—before we get Skynet.