this post was submitted on 24 Jul 2023
193 points (79.3% liked)

Technology


Not a good look for Mastodon - what can be done to automate the removal of CSAM?

[–] mindbleach@lemmy.world 28 points 1 year ago (4 children)

4.1 Illustrated and Computer-Generated CSAM

Stopped reading.

Child abuse laws "exclude anime" for the same reason animal cruelty laws "exclude lettuce." Drawings are not children.

Drawings are not real.

Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn't count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson's rights, because he doesn't fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.

This cannot be a controversial statement. Anyone who can't distinguish fiction from real life has brain problems.

You can't rape someone in MS Paint. Songs about murder don't leave a body. If you write about robbing Fort Knox, the gold is still there. We're not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.

If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.

[–] wmassingham@lemmy.world 5 points 1 year ago (1 child)

You should keep reading then, because they cover that later.

[–] mindbleach@lemmy.world 3 points 1 year ago

What does that even mean?

There's nothing to "cover." They're talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.

No shit they are also discussing actual CSAM alongside... drawings. That is the problem. That's what they did wrong.

[–] DrQuint@lemmy.world 5 points 1 year ago (1 child)

Oh, wait, the Japanese in the other comment, now I get it. This conversation is about AI loli porn.

Pfft, of course that's why no one is saying what they actually mean: the stance suddenly becomes much harder to take, since hatred of loli porn is not universal.

I mean, I think it's disgusting, but I don't think it should be illegal. I feel the same way about cigarettes, 2 girls 1 cup, and profane language. It's absolutely not for me, but that shouldn't make it illegal.

As long as there's no victim, knock yourself out with whatever disgusting, weird stuff you're into.

[–] markpaskal@lemmy.ca 1 point 1 year ago (1 child)

Oh no, what you describe is definitely illegal here in Canada. CSAM includes depictions here. Child sex dolls are illegal. And it should be that way because that stuff is disgusting.

[–] balls_expert@lemmy.blahaj.zone 1 point 1 year ago* (last edited 1 year ago) (53 children)

Okay, thanks for the clarification

Everyone except you still very much includes drawn and AI-generated pornographic depictions of children in the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I'm not sure your point changed anything.

[–] priapus@sh.itjust.works 6 points 1 year ago (14 children)

They are not saying it shouldn't be defederated; they are saying that reporting it to the authorities is pointless and that treating it as CSAM is harmful.
