Beehaw* defederated us? (sh.itjust.works)
submitted 15 Jun 2023 (last edited) by can@sh.itjust.works to c/main@sh.itjust.works
 
lvl@kbin.social 16 points 1 year ago

Well, look at the bright side: the evolution of decentralized federation now hinges on the moderation question. I wouldn't be surprised if someone takes federation to the next level and builds a moderation tool that works out of the box for the fediverse at the protocol level (e.g. ActivityPub).

If and when that happens, federation has a much better chance of replacing today's centralized social networks.

EnglishMobster@kbin.social 17 points 1 year ago* (last edited 1 year ago)

I've been kicking around the idea of making a Lemmy fork that has Tildes' ideas about moderation baked in. (I would fork Kbin, but I don't know PHP.) To quote Tildes' own description of its trust system:

> In my experience, it's always been the best approach to select new moderators from the people known as active, high-quality members of the community. My goal with the trust system on Tildes is to turn this process of discovering the best members and granting them more influence into a natural, automatic one.
>
> ...
>
> Trusting someone is a gradual process that comes from seeing how they behave over time. This can be reflected in the site's mechanics—for example, if a user consistently reports posts correctly for breaking the rules, eventually it should be safe to just trust that user's reports without preemptive review. Other users that aren't as consistent can be given less weight—perhaps it takes three reports from lower-trust users to trigger an action, but only one report from a very high-trust user.
>
> This approach can be applied to other, individual mechanics as well. For example, a user could gain (or lose) access to particular abilities depending on whether they use them responsibly. If done carefully, this could even apply to voting—just as you'd value the recommendation of a trusted friend more than one from a random stranger, we should be able to give more weight to the votes of users that consistently vote for high-quality posts.
>
> ...
>
> Another important factor will be having trust decay if the user stops participating in a community for a long period of time. Communities are always evolving, and if a user has been absent for months or years, it's very likely that they no longer have a solid understanding of the community's current norms. Perhaps users that previously had a high level of trust should be able to build it back up more quickly, but they shouldn't indefinitely retain it when they stop being involved.
>
> Between these two factors, we should be able to ensure that communities end up being managed by members that actively contribute to them, not just people that want to be a moderator for its own sake.
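To make that concrete, here's a rough sketch of how the weighted-report and trust-decay mechanics could look in a Lemmy-style Rust codebase. All the names, constants, and formulas below are mine for illustration; they aren't anything Tildes or Lemmy actually ships:

```rust
// Illustrative sketch only: field names, constants, and formulas are
// invented here, not taken from Tildes or Lemmy.

/// Days of inactivity before trust starts to decay.
const DECAY_GRACE_DAYS: f64 = 90.0;
/// Weighted report mass needed before a post is hidden pending review.
const REPORT_THRESHOLD: f64 = 3.0;

struct Member {
    /// 0.0 = unknown newcomer; grows with a record of accurate reports.
    trust: f64,
    /// Days since this member last participated in the community.
    days_inactive: f64,
}

impl Member {
    /// Trust decays once a member is absent past the grace period, so
    /// dormant accounts slowly lose their extra influence.
    fn effective_trust(&self) -> f64 {
        if self.days_inactive <= DECAY_GRACE_DAYS {
            self.trust
        } else {
            let overdue_days = self.days_inactive - DECAY_GRACE_DAYS;
            (self.trust - overdue_days / 30.0).max(0.0)
        }
    }

    /// A newcomer's report counts as 1.0; a high-trust member's report
    /// counts for up to 3.0, so one trusted report can match three
    /// reports from strangers. The same weight could apply to votes.
    fn report_weight(&self) -> f64 {
        1.0 + self.effective_trust().min(2.0)
    }
}

/// Act on a post without preemptive moderator review once the weighted
/// reports against it cross the threshold.
fn should_auto_action(reporters: &[Member]) -> bool {
    let total: f64 = reporters.iter().map(Member::report_weight).sum();
    total >= REPORT_THRESHOLD
}

fn main() {
    let newcomer = Member { trust: 0.0, days_inactive: 1.0 };
    let regular = Member { trust: 2.5, days_inactive: 5.0 };
    // One trusted regular clears the bar; one newcomer alone does not.
    assert!(should_auto_action(&[regular]));
    assert!(!should_auto_action(&[newcomer]));
}
```

A real implementation would also need a way to earn trust in the first place, e.g. by comparing a user's past reports against eventual moderator decisions, but the overall shape is roughly this.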

Combine that with things like AutoModerator (the person behind Tildes is the one who built AutoMod on Reddit), and it seems like a reasonable way for a platform to promote good content and cut down on bad behavior.

You'd still have to deal with per-community "power users" who accumulate a lot of influence, but the alternative is unelected mods who can be just as bad.

I don't know if I'm ever going to get around to making that fork. But I think Tildes' approach to moderation is novel and fresh, and I quite like it.

joan@kbin.social 4 points 1 year ago

That's also how it works on Stack Overflow and HN: the more karma (or reputation) you have, the more moderation tools you get access to.
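As a rough illustration of that pattern, here's a tiny sketch; the ladder loosely follows Stack Overflow's published privilege levels, but the exact numbers and names are just for this example:

```rust
/// Reputation-gated privileges, loosely modeled on Stack Overflow's
/// published levels; the ladder itself is illustrative, not exact.
fn privileges(reputation: u32) -> Vec<&'static str> {
    const LADDER: [(u32, &str); 4] = [
        (15, "flag posts"),
        (2_000, "edit others' posts"),
        (3_000, "cast close/reopen votes"),
        (10_000, "access moderation tools"),
    ];
    LADDER
        .iter()
        .filter(|(min_rep, _)| reputation >= *min_rep)
        .map(|(_, privilege)| *privilege)
        .collect()
}

fn main() {
    // A mid-reputation user can flag and edit, but not close or moderate.
    assert_eq!(privileges(2_500), ["flag posts", "edit others' posts"]);
}
```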