this post was submitted on 30 Sep 2023
556 points (97.8% liked)

World News

39142 readers
2741 users here now

A community for discussing events around the World

Rules:

Similarly, if you see posts along these lines, do not engage. Report them, block them, and live a happier life than they do. We see too many slapfights that boil down to "Mom! He's bugging me!" and "I'm not touching you!" Going forward, slapfights will result in removed comments and temp bans to cool off.

We ask that users report any comment or post that violates the rules, and that they use critical thinking when reading, posting, or commenting. Users who post off-topic spam, advocate violence, have multiple comments or posts removed, weaponize reports, or violate the code of conduct will be banned.

All posts and comments will be reviewed on a case-by-case basis. This means that some content that violates the rules may be allowed, while other content that does not violate the rules may be removed. The moderators retain the right to remove any content and ban users.


Lemmy World Partners

News !news@lemmy.world

Politics !politics@lemmy.world

World Politics !globalpolitics@lemmy.world


Recommendations

For Firefox users, there is a media bias / propaganda / fact-check plugin:

https://addons.mozilla.org/en-US/firefox/addon/media-bias-fact-check/

founded 1 year ago
top 50 comments
[–] Pooptimist@lemmy.world 104 points 1 year ago (107 children)

Hear me out on this one:

If we take it as given that pedophilia is a disorder and ultimately a sickness, wouldn't it be better for these people to get their fix from AI-created media than from the real thing?

IMO no harm was done to any kid in the creation of this, and it would be better to give these people the fix they need, or at least desperately desire, this way before they advance to more desperate and harmful measures.

[–] DrPop@lemmy.one 78 points 1 year ago

You have a point, but in at least one of these cases the images used were of girls around him, and he even tried extorting one of them. Issues like this should be handled on a case-by-case basis.

[–] hoshikarakitaridia@sh.itjust.works 52 points 1 year ago (8 children)

Some of the comments here are so stupid: "either they unpedophile themselves or we just kill them for their thoughts"

Ok, so let me think this through. Sexual preferences of any kind are pretty normal, and they don't go away. Actually, if you try to ignore them, they become stronger. Also, being a pedophile is not currently a crime; it's the acting on it that is. So what happens right now is that people bottle it up until it gets too much, and then they act on it in gruesome ways, because "if I go to prison I might as well make sure it was worth it". Kids get hurt.

"But we could make thinking about it illegal!" No, we can't. Say that's a law: what now? If you don't like someone, they're a "pedophile". Yay, more false imprisonment. And what happens to real pedophiles? They start committing more acts, because there's punishment even for restraint. And the truth is, a lot of ppl have pedophilic tendencies. You will not catch all of them. Things will just get worse.

So why AI? Well, as the commenter above me already said, if there's no victim, there's no problem. While that doesn't make extortion legal (obv. that's a different law), this could help ppl with those urges have more restraint. We could even limit it to specific sites and make it non-shareable. We'd have more control over it.

I know ppl still want the easy solution, which evidently doesn't work, but IMO this is a perfect solution.

[–] HawlSera@lemm.ee 32 points 1 year ago (10 children)

That's basically how I feel. I'd much rather these kinds of people jack it to drawings and AI-generated images if the alternative is that they're going to go after real children.

[–] ICastFist@programming.dev 66 points 1 year ago (10 children)

One thing I have to ask those who say pedos should seek psychological/psychiatric treatment: do you even know a professional who won't immediately call the cops if you say "I have sexual desires for kids"?

I wholly agree that this is something that should receive some form of treatment, but first those afflicted would have to know that they won't be judged, labeled, and exposed when they seek it.

[–] NAXLAB@lemmy.world 33 points 1 year ago (8 children)

Things that happen inside your head = not illegal.

Things that happen outside your head = potentially illegal.

[–] phoenixz@lemmy.ca 57 points 1 year ago (41 children)

I'm very conflicted on this one.

Child porn is one of those things that won't go away if you prohibit it, like alcohol. It'll just go underground and cause harm to real children.

AI child pornography images, as disturbing as they might be, would serve a "need", if you will, while not actually harming children. Since child pornography doesn't appear to be one of those "try it and you'll get addicted" things, I'm genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

[–] clausetrophobic@sh.itjust.works 35 points 1 year ago (3 children)

Normalisation in culture has effects on how people behave in the real world. Look at Japan's sexualization of women and minors, and the huge problems it has with sexual assault. It's not about whether or not real children are getting hurt; it's about whether it's morally right or wrong. And as a society, we've decided that CP is very wrong as a moral concept.

[–] PhlubbaDubba@lemm.ee 25 points 1 year ago (3 children)

Here's the thing, though: being too paranoid about normalization also makes the problem worse, because the truth is that these are people with severe mental problems who, in most cases, likely want to seek professional help.

The problem is that the subject is SO taboo that even a lot of mental health professionals will chase them off like rabid animals, when the solution is developing an understanding that can lead to a clinical treatment plan for these cases.

Doing that will also help the CSAM problem, since getting people out of the alleyways and into professional help will shrink the market significantly, both immediately and over time, reducing the amount of content that gets made and, as a result, the number of children victimized to make it.

The key point remains: we have to stop treating these people like inhuman monsters who deserve death and far worse whenever they're found. They're sick souls who need robust mental health care and thought-management strategies.

[–] Surreal@programming.dev 50 points 1 year ago (6 children)

If the man did not distribute the pictures, how did the government find out? Did a cloud service rat him out? Or spyware?

[–] sudo22@lemmy.world 52 points 1 year ago (3 children)

My guess would be he wasn't self-hosting the AI model, so the requests were going through a website.

[–] papertowels@lemmy.one 36 points 1 year ago* (last edited 1 year ago) (8 children)

So this does bring up an interesting point that I haven't thought about - is it the depiction that matters, or is it the actual potential for victims that matters?

Consider the Catholic schoolgirl trope: if someone of legal age is depicted as being much younger, should that be treated the same way as this case? This case argues that the depiction is what matters, rather than who is actually harmed.

[–] ilmagico@lemmy.world 21 points 1 year ago (6 children)

Every country has different rules, according to Wikipedia.

Personally, I feel that if making completely fictitious depictions of child porn, where no one is harmed (think AI-generated, or consenting adults depicting minors), were legal, it might actually prevent the real, harmful ones from being made, thus preventing harm.

[–] JokeDeity@lemm.ee 31 points 1 year ago (1 children)

Considering every other aspect of this is being argued in this thread to exhaustion, I just want to say it's wild that they caught him, since it says he didn't distribute it.

[–] GBU_28@lemm.ee 25 points 1 year ago (1 children)

He probably just used a cloud service that wasn't private.

[–] uis@lemmy.world 22 points 1 year ago* (last edited 1 year ago) (5 children)

The AI was harmed. We need to protect the AI.
