this post was submitted on 30 Jul 2024
959 points (97.9% liked)

Technology

59731 readers
3386 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS
 

If you've watched any Olympics coverage this week, you've likely been confronted with an ad for Google's Gemini AI called "Dear Sydney." In it, a proud father seeks help writing a letter on behalf of his daughter, who is an aspiring runner and superfan of world-record-holding hurdler Sydney McLaughlin-Levrone.

"I'm pretty good with words, but this has to be just right," the father intones before asking Gemini to "Help my daughter write a letter telling Sydney how inspiring she is..." Gemini dutifully responds with a draft letter in which the LLM tells the runner, on behalf of the daughter, that she wants to be "just like you."

I think the most offensive thing about the ad is what it implies about the kinds of human tasks Google sees AI replacing. Rather than using LLMs to automate tedious busywork or difficult research questions, "Dear Sydney" presents a world where Gemini can help us offload a heartwarming shared moment of connection with our children.

Inserting Gemini into a child's heartfelt request for parental help makes it seem like the parent in question is offloading their responsibilities to a computer in the coldest, most sterile way possible. More than that, it comes across as an attempt to avoid an opportunity to bond with a child over a shared interest in a creative way.

[–] Khanzarate@lemmy.world 96 points 4 months ago (3 children)

The thing they're trying to market to is that a lot of people genuinely don't know what to say at certain times. Instead of replacing an emotional activity, it's meant to be used when you literally can't do it yourself but need to.

Obviously that's not the way it should go, but it is an actual problem they're trying to speak to. I had a friend feel really down in high school because his parents didn't attend an award ceremony, and I couldn't help because I just didn't know what to say. AI could've hypothetically given me a rough draft or inspiration. Obviously I wouldn't have just texted what the AI said, but it could've gotten me past the part I was stuck on.

In my experience, AI is shit at that anyway. Nine times out of ten, when I ask it anything even remotely deep, it just restates the problem, like "I'm sorry to hear your parents couldn't make it." AI can't really solve the problem Google wants it to, and I'm honestly glad it can't.

[–] Serinus@lemmy.world 64 points 4 months ago (1 children)

They're trying to market emotion because emotion sells.

It's also exactly what AI should be kept away from.

[–] nilloc@discuss.tchncs.de 8 points 4 months ago

But AI also lies and hallucinates, so you can't market it for writing work documents. That could get people fired.

Really though, I wonder if the marketing was already outsourced to the LLM?

Sadly, after working in advertising for over 10 years, I know how dumb art directors can be about messaging like this. It's why I got out.

[–] assassin_aragorn@lemmy.world 35 points 4 months ago

A lot of the time, when you don't know what to say, it's not because you can't find the right words; the right words simply don't exist. There's nothing that captures your sorrow for the person.

Funnily enough, the right thing to say is that you don't know what to say, and to just offer to be there for them.

[–] mozz@mbin.grits.dev 8 points 4 months ago

Yeah. If it had any empathy this would be a good task and a genuinely helpful thing. As it is, it’s going to produce nothing but pain and confusion and false hope if turned loose on this task.