this post was submitted on 19 Sep 2023
72 points (88.3% liked)

Lemmy


Everything about Lemmy; bugs, gripes, praises, and advocacy.

For discussion about the lemmy.ml instance, go to !meta@lemmy.ml.

There's another round of CSAM attacks and it's really disturbing to see those images. They weren't taken down immediately either, and there was even a disgusting shithead in the comments who thought it was funny?? the fuck

It's gone now but it was up for like an hour?? This really ruined my day and now I'm figuring out how to download Tetris. It's really sickening.

[–] Ghostalmedia@lemmy.world 45 points 1 year ago (4 children)

Real talk, Lemmy needs some of the basic ass moderation tools that Reddit had, so mods can be alerted to reports and can recommend that an admin ban an account or domain.

Sure, there are ways that we can scan uploads with AI and do a bunch of other complex magic, but we need the basics first.
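For illustration, a minimal sketch of what "scanning uploads" could look like: checking an uploaded file's hash against a blocklist before it's stored. Everything here is hypothetical — Lemmy has no such hook today, and real CSAM detection uses perceptual hashes (e.g. PhotoDNA) from organizations like NCMEC rather than plain cryptographic hashes, which only catch exact byte-for-byte copies.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad files.
# (The sample entry below is just the well-known digest of an
# empty byte string, used so the sketch is testable.)
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked_upload(data: bytes) -> bool:
    """Return True if the upload's exact bytes match a blocklisted hash."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

An instance would call this in the upload handler and reject the request before the file ever touches disk; the more complex "AI magic" the comment mentions would slot into the same gate.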

[–] hitagi@ani.social 32 points 1 year ago (1 children)

One tool that I liked from Reddit was manually approving posts from accounts under a certain age or karma threshold. I hope we can get tools like that one day.
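The Reddit-style gate described above is simple to express. This is a sketch only — Lemmy has no such setting, and the thresholds below (7 days, 50 karma) are made-up values an instance admin might configure.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical per-instance thresholds.
MIN_ACCOUNT_AGE = timedelta(days=7)
MIN_KARMA = 50

@dataclass
class Account:
    created_at: datetime
    karma: int

def needs_manual_approval(account: Account, now: datetime) -> bool:
    """Route posts from new or low-karma accounts to a mod queue."""
    too_new = now - account.created_at < MIN_ACCOUNT_AGE
    low_karma = account.karma < MIN_KARMA
    return too_new or low_karma
```

Posts that trip either condition would sit in a queue until a mod approves them, instead of going live immediately.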

[–] lemann@lemmy.one 4 points 1 year ago* (last edited 1 year ago)

There is already the ability to restrict by karma with Lemmy bots, but IMO this would just encourage karma farming, which is why nobody has done it yet

I like the sound of the former approach - it seems like a more effective solution, and it's similar to what Discourse does (manual approval of posts from new accounts, with an accompanying trust level). In a Lemmy implementation it could possibly be managed or set by each instance

Edit: clarification
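The trust-level idea could be sketched like this. The ladder mirrors Discourse's levels 0-4, but the names and the per-instance cutoff are purely illustrative — nothing like this exists in Lemmy.

```python
from enum import IntEnum

class TrustLevel(IntEnum):
    # Mirrors Discourse's 0-4 trust ladder; names are illustrative.
    NEW = 0
    BASIC = 1
    MEMBER = 2
    REGULAR = 3
    LEADER = 4

# Hypothetical per-instance policy: the minimum level whose posts
# skip the manual-approval queue, settable by each instance's admins.
INSTANCE_AUTO_APPROVE_LEVEL = TrustLevel.BASIC

def post_requires_review(level: TrustLevel) -> bool:
    """Posts from accounts below the instance's cutoff go to the mod queue."""
    return level < INSTANCE_AUTO_APPROVE_LEVEL
```

Each instance raising or lowering `INSTANCE_AUTO_APPROVE_LEVEL` is the "managed or set by each instance" part of the suggestion.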

[–] Sunroc@lemmy.world 8 points 1 year ago (1 children)

Lemmy will need a trust and safety team, but those are expensive, and it would be an operational challenge for every instance to have experienced people. It would probably work best if there were a T&S collective that instances could elect to use as a resource.

[–] Ghostalmedia@lemmy.world 5 points 1 year ago

But before we can even get to that, we need those basic mod tools. A volunteer TS team would need that to be effective.

Can’t address a serious report if you don’t know it exists, and if you aren’t empowered to report bad actors to admins to get them banned from an instance.

[–] Corgana@startrek.website 6 points 1 year ago

Better tools will open the door for instance admins who don't come from a network admin/developer background to responsibly host their communities, too.

For the Lemmyverse to truly thrive, Admins should be relatively free to focus their time on the social elements of running an instance, which is a wholly different skillset than systems administration. Right now, in order to be an effective Admin you need a heaping helping of both (unless, of course, you're interested in running an unmoderated instance).

[–] CluckN@lemmy.world 4 points 1 year ago

Even with fantastic moderation tools, if one malicious user can take down an entire Lemmy instance, then all is for naught.