Mastodon, a decentralized alternative to Twitter, has a serious child sexual abuse material (CSAM) problem, according to researchers from Stanford University. In just two days, the researchers found 112 instances of known CSAM across roughly 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags, as well as links pointing to off-site CSAM trading and grooming of minors. One Mastodon server was even taken down for a period of time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

[–] jordanlund@lemmy.one 47 points 1 year ago (2 children)

"massive child abuse material problem"

"112 instances of known CSAM across 325,000 posts"

While any instance is unacceptable, does 112/325,000 constitute a "massive problem"?

0.0000034462% of posts are unacceptable! Massive problem!

[–] crystal@feddit.de 37 points 1 year ago

You moved the decimal point in the wrong direction. It's 0.034462%.
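
For reference, a quick sanity check of the arithmetic, assuming the 112/325,000 figures quoted in the parent comment:

```python
# Recompute the proportion cited above: 112 known CSAM instances
# out of ~325,000 posts analyzed over the two-day study window.
known_csam = 112
total_posts = 325_000

fraction = known_csam / total_posts
print(f"fraction:   {fraction:.8f}")        # 0.00034462
print(f"percentage: {fraction * 100:.6f}%") # 0.034462%
```

So the corrected figure, 0.034462%, is about one in every 2,900 posts, roughly 10,000 times larger than the 0.0000034462% stated above.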

[–] ParsnipWitch@feddit.de 9 points 1 year ago

That's just the material they knew was CSAM from previous investigations.

There were also 713 uses of the top 20 CSAM-related hashtags across the Fediverse on posts that contained media, as well as 1,217 text-only posts that pointed to “off-site CSAM trading or grooming of minors.” The study notes that the open posting of CSAM is “disturbingly prevalent.”