this post was submitted on 03 Jan 2025
713 points (98.8% liked)


Soon people discovered that Meta’s ghoulish posters had been among us for months, even years. There’s Liv, a “Proud black queer momma of 2 & truth-teller,” according to its Instagram profile. Add to that Brian, “everybody’s grandpa;” Jade, “your girl for all things hip-hop;” and Carter, a “relationship coach.” I’m sure there are more yet to be discovered.

All four of these posters have pages on both Facebook and Instagram with mirrored content and all four have post histories that go back to September 26, 2023. The accounts have the blue verified check marks and a label indicating that they’re an AI “managed by Meta.” Users can block them on Facebook, but not on Instagram. Users can also message them across all of Meta’s platforms, including WhatsApp.

[–] lurch@sh.itjust.works 13 points 5 days ago (3 children)

I don't like Zuck, but he did not misunderstand humans. He's just two steps ahead of you. Kids go on https://c.ai right now and playfully chat with AIs posing as various pop-culture characters or professions. While this generation grows up, Zuck will have a portfolio of made-up characters ready and waiting for them on FB, where humans can live their lives without ever befriending another real human, all while sharing everything with Meta to be monetized.

[–] azertyfun@sh.itjust.works 17 points 4 days ago* (last edited 4 days ago) (1 children)

Any source on a significant number of children wasting time talking to AIs, or just anecdotes and a bad case of "youth these days"?

The whole concept smells like fringe NEET 4chan-adjacent behavior. LLMs aren't capable of maintaining an even remotely convincing simulacrum of human connection, and anyone who would project companionship onto these soulless computer programs obviously has preexisting and severe mental issues (relying on AIs to fill a void in human connection is certainly unhealthy, but it's a symptom, not the root cause).

The potential market for these AIs will never be any bigger than the market for anime waifu body pillows, because it's the same audience, different decade. Literally everyone else thinks AI girlfriends and body-pillow waifus are weird as all hell, and that's not going to change, because neurotypical people want and need human connection and can tell the difference between a rock with googly eyes and a friend.

Also arguably a rock with googly eyes has more charm and personality than Zuck's horror show.

[–] cyd@lemmy.world 2 points 4 days ago

> LLMs aren't capable of maintaining an even remotely convincing simulacrum of human connection

Eh, maybe, maybe not. 99% of the human-written stuff in IM chats, or posted to social media, is superficial fluff that a fine-tuned LLM should have no problem imitating. It's still relatively easy to recognize AI model outputs in their default settings, because of their characteristic earnest/helpful tone and writing style, but that's quite easily adjustable.
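To illustrate how shallow that "easy to recognize" signal is, here's a toy Python sketch of a tone heuristic. The phrase list and threshold are purely illustrative assumptions, not a real detector; the point is that it keys on surface wording, which is exactly what fine-tuning or a system prompt can scrub away.

```python
# Toy heuristic: flag text that leans on stock "assistant-ese" phrasing.
# The phrase list and threshold below are illustrative assumptions only.
STOCK_PHRASES = [
    "as an ai",
    "i'd be happy to",
    "it's important to note",
    "in conclusion",
    "feel free to",
]

def assistant_tone_score(text: str) -> int:
    """Count how many stock assistant phrases appear in the text."""
    lowered = text.lower()
    return sum(phrase in lowered for phrase in STOCK_PHRASES)

def looks_like_default_llm(text: str, threshold: int = 2) -> bool:
    """True if the text matches at least `threshold` stock phrases."""
    return assistant_tone_score(text) >= threshold
```

A detector like this catches a chatbot in its default register but says nothing about one tuned to write like a shitposter, which is the commenter's point.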

One example worth considering: people are already using fine-tuned LLMs to copilot tabletop RPGs, with decent success. In that setting you don't need fine literature, just "good enough" prose. And that already far exceeds the average quality you see on social media.

[–] SkyNTP@lemmy.ml 11 points 5 days ago

Tobacco company selling cigarettes to kids. More at 11.