this post was submitted on 28 Aug 2023
1748 points (98.0% liked)

Lemmy.World Announcements

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do, because since we changed our registration policy the offenders just post from another instance.

We keep working on a solution; we have a few things in the works, but they won't help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn't his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It's been a long day and having to deal with this kind of stuff got some of us a bit salty to say the least. Remember we also had to deal with people posting scat not too long ago so this isn't the first time we felt helpless. Anyway, I hope we can announce something more positive soon.

you are viewing a single comment's thread
[–] krayj@sh.itjust.works 67 points 1 year ago* (last edited 1 year ago) (3 children)

How does closing lemmyshitpost do anything to solve the issue? Isn't it a foregone conclusion that the offenders would just start targeting other communities, or was there something unique about lemmyshitpost that made it more susceptible?

[–] Cabrio@lemmy.world 38 points 1 year ago (1 children)

It stops their instance from hosting CSAM and removes the legal liability of dealing with something they don't have the capacity to handle at this point in time.

How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?

[–] krayj@sh.itjust.works 25 points 1 year ago* (last edited 1 year ago) (2 children)

How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?

But that's not what happened. They didn't take the server offline. They banned a community. If some remote person had access to my pc and they were loading it up with child porn, I would not expect that deleting the folder would fix the problem. So I don't understand what your analogy is trying to accomplish because it's faulty.

Also, I think you are confusing my question as some kind of disapproval. It isn't. If closing a community solves the problem then I fully support the admin team actions.

I'm just questioning whether that really solves the problem or not. It was a community created on Lemmy.world, not some other instance. So if the perpetrators were capable of posting to it, they are capable of posting to any community on lemmy.world. You get that, yeah?

My question is just a request for clarification. How does shutting down 1 community stop the perpetrators from posting the same stuff to other communities?

[–] Ghostalmedia@lemmy.world 18 points 1 year ago

Fact of the matter is that these mods are not lawyers, and even if they were not liable, they would not have the means to fight this in court if someone falsely, or legitimately, claimed they were liable. They're hobbyists with day jobs.

I also mod a few large communities here, and if I’m ever in that boat, I would also jump. I have other shit to do, and I don’t have the time or energy to fight trolls like that.

If this was Reddit, I’d let all the paid admins, legal, PR, SysOps, engineers and UX folks figure it out. But this isn’t Reddit. It’s all on the hobbyist mods to figure it out. Many are not going to have the energy to put up with it.

[–] Cabrio@lemmy.world 9 points 1 year ago (1 children)

It's not meant to solve the problem, it's meant to limit liability.

[–] krayj@sh.itjust.works 5 points 1 year ago (1 children)

How does it limit liability when they could continue posting that content to any/every other community on lemmy.world?

[–] Cabrio@lemmy.world 8 points 1 year ago* (last edited 1 year ago) (1 children)

But it does remove the immediate issue of CSAM coming from shitpost, so World isn't hosting that content.

[–] Double_A@discuss.tchncs.de 4 points 1 year ago (1 children)

Shitpost is not the only community on World, FFS!

[–] stealthnerd@lemmy.world 3 points 1 year ago

They're taking a whack-a-mole approach for sure, but it's either that or shut the whole instance down. I imagine their hope is that either the bad guys give up/lose interest or that it buys them some time.

Either way, it shows they are taking action which ultimately should help limit their liability.

[–] Whitehat93875@lemmy.world 23 points 1 year ago

They also changed account sign-ups to be application-only, so people can't create accounts without being approved.

[–] Ghostalmedia@lemmy.world 18 points 1 year ago (1 children)

It doesn’t solve the bigger moderation problem, but it solves the immediate issue for the mods who don’t want to go to jail for modding a community hosting CSAM.

[–] krayj@sh.itjust.works 8 points 1 year ago (2 children)

Doesn't that send a clear message to the perpetrators that they can get any community shut down and killed just by posting CSAM to it? What makes you or anyone else think that, upon seeing that lemmyshitpost is gone, the perpetrators will all just quit? Was lemmyshitpost the only community they were able to post in?

[–] Ghostalmedia@lemmy.world 16 points 1 year ago (1 children)

Yup. The perpetrators win.

If you were in their shoes, would you want to risk going to jail for kiddy porn, risk having your name associated with CSAM online, or drain your personal savings account to fight these folks?

These mods are not protected by a well funded private legal team. This isn’t Reddit.

[–] krayj@sh.itjust.works 3 points 1 year ago (2 children)

You don't have to explain how liability works. I get it. What I don't get is how removing that specific community is going to limit their liability when the perpetrators will just target a different community.

[–] Whitehat93875@lemmy.world 11 points 1 year ago

Sign-ups are now manual approval applications, so there's no more automated account creation for them. If they have existing accounts and target another community, that community will be closed and those accounts banned, but there won't be a stream of new accounts, because every account going forward has to be manually approved.

[–] ttmrichter@lemmy.world 2 points 1 year ago

One of the ways you avoid liability is you show that you're actively taking measures to prevent illegal content.

[–] MsPenguinette@lemmy.world 2 points 1 year ago (1 children)

The perps are taking a big risk as well. Finding and uploading CSAM means being in possession of it. So we can at least take solace in knowing it's not a tool that just anyone will use to take down a community.

Uploading it to websites counts as distribution. The authorities will actually care about this. It's not just some small thing that is technically a crime; it's big-time crime being used for something petty.

So while the perps might win in the short term, they are risking their lives using this tactic. I'm not terribly worried about it becoming a common tactic.

If anything, if I were the one doing this, I'd be worried that I might be pissing off the wrong group of people. If they keep at it and become a bigger problem, everyone is going to be looking for them. And then that person is going to big boy prison.

[–] krayj@sh.itjust.works 1 points 1 year ago

That is a great point. I don't know whether the admin team is proactively reporting that activity to law enforcement, but I hope they are.