BreakDecks@lemmy.ml 5 points 1 year ago

In America at least, people often confuse child pornography laws with obscenity laws, and end up missing the point. Obscenity laws are a violation of free speech, but that's not what a CSAM ban is about. It's about criminalizing the abuse of children as thoroughly as possible. Being in porn requires consent, and children can't consent, so even distributing or merely possessing this content violates a child's rights.

Which is why the courts have thrown out lolicon bans on First Amendment grounds every time one has been attempted. Simulated CSAM lacks a child whose rights could be violated, and it generally meets the definition of art, which is protected expression no matter how offensive.

It's a sensitive subject that most people don't see nuance in. It's hard to admit that pedophilia by itself isn't a criminal act; it only becomes one when an actual child is made a victim, or when a conspiracy to victimize children is uncovered.

With that said, we don't have much of a description of the South Korean man's offenses, and iirc South Korea has laws similar to the US on this matter. It's very possible that he was modifying real pictures of children with inpainting, or training models on pictures of a real child to generate fake porn of that child. Either would introduce a real child as victim, so that's my theory of what this guy was doing. Probably on a public image-generation service that flagged his uploads.