this post was submitted on 28 Nov 2024
166 points (85.5% liked)


People in 2024 aren't just swiping right and left on online dating apps — some are crafting their perfect AI match and entering relationships with chatbots.

Eric Schmidt, Google's former CEO, recently shared his concerns about young men creating AI romantic partners and said he believes that AI dating will actually increase loneliness.

"This is a good example of an unexpected problem of existing technology," Schmidt said in a conversation about AI dangers and regulation on "The Prof G Show" with Scott Galloway released Sunday.

Schmidt said an emotionally and physically "perfect" AI girlfriend could create a scenario in which a younger man becomes obsessed and allows the AI to take over his thinking.

"That kind of obsession is possible," Schmidt said in the interview. "Especially for people who are not fully formed."

[–] FourPacketsOfPeanuts@lemmy.world 2 points 6 days ago (2 children)

I think the part that feels 'sad' to you is what's going to change socially over the next 50 years. I think it's going to become extremely normal to at least have a "mental health AI friend" who knows you really well and keeps you going through the day: someone to talk to, someone who's always there, someone who's the first to detect that you may be in danger. Overall I think society is going to receive that as a good thing. And it will, I think, be normal because it will be so believable and so useful, and for a large number of people it will keep them well and feeling good about themselves. In that context, some of those attachments turning romantic, or people just being sexually into whatever that assistant can say or do, will be increasingly normal.

It will also feel really good; let's not forget that. We're really only at the very start of what immersive VR is going to be. Once AI becomes not a little better but 50-100 years of innovation better, I don't think we can overstate how much it's going to feel like you're actually interacting with [insert fantasy here]. Once tactile feedback sees similar improvements, we're about 75% of the way to what people would use an actual holodeck for anyway.

I can't see how that doesn't have a dramatic effect on how people view human-human romantic relationships. The proportion of people who can have a believable experience of their absolute sexual fantasy is only going to grow over time. With how ubiquitous that will be, I can't see how people in most relationships won't know they're "second best". I think that has a profound effect on how people form attachments to one another. Once "having a real girlfriend" is seen as the secondary way to get your sexual needs met, that will have a terminal effect on how many young men even want to be in relationships, let alone stay around to be a father.

[–] bane_killgrind@slrpnk.net 4 points 5 days ago (1 children)

keeps you going through the day, is someone to talk to, someone who’s always there, someone who’s the first to detect that you may be in danger

That's you. You need the skills and attention to reassure yourself and introspect.

You just gave me a new reason to dislike LLMs, they allow people to refrain from maturing.

Yes, true. But people need friends, and sometimes an assistant too. That's the kind of role I imagine it slotting into. But I agree that in many cases it could get overly parental and actually hinder people from being independent. I don't think it would be an LLM, though; they're wholly unsuited to the task. Some as-yet-unreleased model of AI, probably...

[–] Petter1@lemm.ee 2 points 6 days ago* (last edited 6 days ago) (1 children)

Yea, I guess Apple is aiming somewhere in this direction by giving new AI Siri all the context information to make you feel that it cares about you 🤔

Microsoft’s Recall as well

[–] FourPacketsOfPeanuts@lemmy.world 1 points 6 days ago (1 children)

At some point they're going to have to depart from the "single name" branding, because the selling point is going to be how unique the AI is for you, how different it is from all the others. I wonder how they'll handle branding at that point? Maybe if you create "Marla", Apple are still going to say it's a type of "Siri" if that's the base technology. But I think we may start seeing weirder, more organic things like "your AI Marla - a daughter of Siri - is ready"

[–] Petter1@lemm.ee 1 points 6 days ago* (last edited 6 days ago)

😂I totally see them saying that at a keynote