And, if you are comfortable with sharing, what are some of the worst things that you’ve seen?
Isn’t this on the same level (hilarious and obvious fetish request) as asking a soldier how many people they’ve killed?
Personally, I’m asking it because I’m considering running an instance of my own, and I want to know what to expect. This is an issue that concerns me.
The worst has probably been the CSAM, along with detailed instructions on how to kill yourself and posts expressing suicidal ideation.
In the end, to me, it’s just more content to remove, just like spam.
I moderate !traditional_art@lemmy.world and so far I have not encountered a single disturbing post that had to be removed. I imagine in non-political areas of lemmy, people are more chill.
Apparently some mods' definition of disturbing is different from the rest of ours. One of them thinks talking about jury nullification is the utmost horror.
There was a while after the Reddit exodus where apparently lots of CSAM was being posted. I never saw it personally, but apparently it was a big issue.
What is CSAM?
CSAM = Child Sexual Abuse Material. It’s CP.
Ahhhh, kinda wish I hadn’t asked… But thanks for answering.
As far as I’ve seen, the mods and admins have been doing a stellar job of removing it from the platform.
I moderate a few smaller but active ones. I don’t see a whole lot of bad stuff apart from the occasional garbage opinion such as [insert least favorite ideology here].
This might be because the places I mod are pretty chill in nature. I’m sure if the topic were something more volatile and spicy, such as politics, it could be a different story.
I run a community that has like 8 posts, most by me, so never. I need to post again cause that Marvel game is dying.
If you feel left out I have a FunkoPop and a mason jar that I can post there.
Woah what the hell? I’ve seen like 5 comments mentioning CSAM. I’m gonna need someone to elaborate on what the fuck was going on
The Reddit API exodus in 2023 brought hundreds of new instances and tens of thousands of new users to Lemmy. Unfortunately, many of those instances had open registrations, Lemmy’s moderation tools were very basic (they still are), and there were not enough mods/admins to provide 24-hour coverage on the larger instances. A handful of trolls took advantage of the situation by posting grotesque stuff for lulz: CSAM, scat porn, racist memes, etc. Sometimes it would stay up for hours until someone with the right permissions noticed and took it down.
The situation is better now, though there is still plenty of room for improvement.
Not an admin or moderator, but dear god, it must be mentally and emotionally difficult for any moderator who has to see truly disturbing stuff. They deserve a salute and don’t get nearly enough credit.
When I volunteered as a new mod starting last year, I saw some horrific CSAM posts that I can’t forget. I just hope those who reported them have received some mental health support.
To cope, I hike to my local park, take edibles, and try to do anything to keep myself distracted, but it’s hard to forget. Even as I sit here freezing in my chair, the ticking of my clock is a deafening reminder of the garbage I’ve seen. It doesn’t help that I’m an empath (I feel the pain of others), so I just think about why these things happen and try to promote systems that prevent past mistakes from reoccurring.
A couple of systems I’m aiming to improve are public education and mental health support. The better people know how to protect themselves and those closest to them, the more likely atrocities like these can be prevented and stopped. That’s the hope, anyway. No guarantee it’ll work.