this post was submitted on 17 Dec 2024
102 points (88.6% liked)
Fedigrow
689 readers
231 users here now
To discuss how to grow and manage communities / magazines on Lemmy, Mbin, Piefed and Sublinks
founded 7 months ago
you are viewing a single comment's thread
view the rest of the comments
It definitely could go either way.
The toxicity needs to be discussed in order to deal with it, but what is the real benefit of doing that at the per-user level? To build a cross-instance blacklist? The affected users would just create an alt, and besides, what is "toxic" to some ("I want women to not be treated as people") is, someFUCKINGhow, the epitome of grace and class to others.
A complicating factor is that moderator reports currently aren't federated across instances at all, and that won't be added until at least 0.20, as Nutomic noted on the Lemmy Roadmap. Not that this should either hinder or accelerate the need for such a community; it just seems tangentially related?
I keep coming back to the analogy of porn: should it not exist at all (no - I mean it should not be entirely banned, since studies show that banning it at least correlates with, if not actually contributes to, real-world physical violence), or can it simply be labeled properly? The problem is that while the Fediverse does an excellent job of labeling NSFW content (PieFed even adds a separate "gore" category on top of NSFW), it fails miserably at labeling most other things - e.g. you cannot criticize Russia, China, or North Korea on the infamous "community of privacy and FOSS enthusiasts, run by Lemmy’s developers", even though nothing in that wording would imply such a restriction.
Making porn "opt-in" makes it safe to browse the Fediverse even at work, without fear of becoming part of the company's "cost savings plan" (at least not for that reason, assuming they even need a reason at all). Failing to label toxic users as toxic, on the other hand, lets them blend in amongst all the other users, with no distinction offered beyond a binary allow or deny, at which point moderation takes real effort to perform. Unless we try other ways?
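To make the "label instead of block" idea concrete, here's a minimal sketch of what viewer-controlled filtering on labels could look like, in the same spirit as the existing NSFW opt-in. Everything in it (the `Post` shape, the `ViewerPrefs` type, the `visible_to` function, the `"toxic"` label) is hypothetical illustration, not an existing Lemmy or PieFed API:

```python
# Hypothetical sketch: filtering by labels instead of a binary allow/deny.
# None of these names exist in Lemmy/PieFed; they only illustrate the idea
# that "toxic" could be an opt-in label like NSFW, rather than a hard block.

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    body: str
    labels: set[str] = field(default_factory=set)  # e.g. {"nsfw", "gore", "toxic"}

@dataclass
class ViewerPrefs:
    opted_in: set[str] = field(default_factory=set)  # labels the viewer chooses to see

def visible_to(post: Post, prefs: ViewerPrefs) -> bool:
    """A post is shown only if the viewer has opted in to every label on it."""
    return post.labels <= prefs.opted_in

feed = [
    Post("alice", "a perfectly normal post"),
    Post("bob", "something flagged by mods", labels={"toxic"}),
    Post("carol", "adult content", labels={"nsfw"}),
]

viewer = ViewerPrefs(opted_in={"nsfw"})  # sees NSFW, but not "toxic"-labeled posts
for post in feed:
    if visible_to(post, viewer):
        print(post.author, "-", post.body)
```

The point is just that labels are additive and viewer-controlled: the content stays federated, and the filtering effort moves from moderators to a per-viewer preference, which is how the NSFW opt-in already works today.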