this post was submitted on 21 Oct 2024
525 points (98.0% liked)

Facepalm

2657 readers

founded 1 year ago
[–] GreenKnight23@lemmy.world 8 points 1 month ago (1 children)

or...they're both assholes and she's a gaslighting psychopath. just going off what evidence is at my disposal.

at this point if you're with a partner that refuses to acknowledge your needs in the relationship there's literally no reason to remain in the relationship.

[–] BilboBargains@lemmy.world -1 points 1 month ago (1 children)

Like her need for him to answer reasonable questions? Why does the origin of the question pose a threat and why doesn't he give examples? He's like the rando poster who says 'hey guys I forgot the passcode to my iPhone, got a workaround for that?' okay buddy, so you stole a phone then.

[–] GreenKnight23@lemmy.world 1 points 1 month ago (1 children)

if they were reasonable questions then she wouldn't need AI to ask them.

she's using AI to analyze her perception of the argument and then attacking him based on a flawed analysis.

he's not sharing enough info to determine why they have so many arguments or what they're about.

they're both being shitty to each other and they both need to acknowledge the relationship is failing due to the individual flaws they have as people.

in a relationship differences can be strengths, similarities can be weaknesses, and personality flaws can be dangerous. it all depends on how those in the relationship deal with their differences, similarities, and flaws.

these two obviously can't deal.

[–] BilboBargains@lemmy.world -1 points 1 month ago (1 children)

Are you saying it would be preferable if she was given the same advice from a human or read it in a book? This guy cannot defend his point of view because it's probably not particularly defensible, the robot is immaterial.

[–] GreenKnight23@lemmy.world 2 points 1 month ago (1 children)

> Are you saying it would be preferable if she was given the same advice from a human or read it in a book?

I'll spell it out for you. Y E S

I'm not going to argue the finer points of how an LLM has literally no concept of human relationships, or how LLMs give the least effective advice on record.

if you trust an LLM to give anything other than half-baked garbage, I genuinely feel sad for any of your current and future partners.

> This guy cannot defend his point of view because it's probably not particularly defensible, the robot is immaterial.

when you have a disagreement in a long-term intimate relationship, it's not about who's right or wrong. it's about what you and your partner can agree and disagree on while still respecting each other.

I've been married for almost 10 years, been together for over 20, and we don't agree on everything. I still respect my partner's opinion and trust their judgment with my life.

every good relationship is based on trust and respect. both are concepts foreign to LLMs, but not impossible for a real person to comprehend. this is why getting a second opinion from a 3rd party is so effective. even if it's advice from a book, the idea comes from a separate person.

a good marriage counselor will not choose sides, they aren't there to judge. a counselor's first responsibility is to build a bridge of trust with both members of the relationship to open dialogue between the two as a conduit. they do this by asking questions like, "how did that make you feel?" and "tell me more about why you said that to them."

the goal is open dialogue, and what she is doing by using ChatGPT is removing her voice from the relationship. she's sitting back and forcing the guy to have relationship-building discussions with an LLM. now stop and think about how fucked up that is.

in their relationship he is expressing what he needs from her: "I want you to stop using ChatGPT and just talk to me." she refuses and ignores what he needs. in this scenario we don't know what she needs, because he didn't communicate that. the only thing we can assume from her actions is that she needs to be "right." and what did we learn about relationships and being "right"? it's counterproductive to the goals of a healthy relationship.

my point is, they're both flawed and they're both failing to communicate. neither is right, and introducing LLMs into a broken relationship is not the answer.

[–] BilboBargains@lemmy.world -1 points 1 month ago

Okay so you don't trust the robot to give relationship advice, even if that advice is identical to what humans say. The trouble is we never really know where ideas come from. They percolate up into consciousness, unbidden. Did I speak to a robot earlier? Are you speaking to a robot right now? Who knows. All I know is that when someone I love and respect asks me to explain myself I feel that I should do that no matter what.