this post was submitted on 31 Jul 2023
147 points (85.5% liked)


There's a new disinformation community suddenly popping up in my feed, but obviously the reports are being "resolved" by the mods of that community. They suggested that I block their community, but I won't, because that is how you get a cesspit of an instance. How do we report disinformation communities straight to the admins?

Edit: the admins did remove the community in question, so I'm going to take that as the official stance on disinformation communities and assume that any community (right wing, left wing, or other) that is intentionally spreading disinformation will be removed. That makes me feel much better about the situation, since this type of thing is pretty much guaranteed to pop up again.

[–] lagomorphlecture@lemm.ee 27 points 1 year ago (1 children)

We're talking about a community that is dedicated to posting misinformation and, apparently, trolling. It is very common for that kind of content and community to be explicitly forbidden in general-purpose online communities, because it isn't general-purpose content. This isn't a matter of things people simply don't want to see. It's content that has proven to be problematic for any community it infiltrates and generally results in a decline in quality and decorum.

[–] shootwhatsmyname@lemm.ee 8 points 1 year ago (1 children)

I agree. The unfortunate reality is that the line between misinformation and opinion is drawn very differently from person to person, especially when it comes to politics. It’s easy to moderate and remove illegal content based on the local laws of the country an instance resides in, but trying to moderate content from a single U.S. political party raises more questions and will take more volunteer manpower from the admins. We would need to define as a community:

  1. What are the criteria for a misinformation or trolling community?
  2. Will we defederate from entire instances if they meet the criteria for misinformation/trolling?
  3. Will we regulate all types of communities (like technology, hobby, humor, culture, news, and war-related communities), or only politically driven ones?

I still don’t think this is the right move. I joined the Fediverse because of the ability to post and consume content without any person or entity manually or automatically determining what I can and cannot see. I specifically chose this instance because of its relaxed policy on defederation. I value being able to see all content and be aware of everyone’s voice, even if it is blatantly false information or offensive.

[–] dmention7@lemm.ee 12 points 1 year ago (2 children)

I don't think anyone is advocating for defederation, just for upholding some baseline standards of discourse in communities directly hosted by this instance. If it were just a normal right-wing sub, I'd agree with you, but defending a blatant troll/disinformation sub is getting into "paradox of tolerance" territory for me personally.

Hell, the snowflakes banned me for making a single post warning another user not to feed the trolls. 😂

I have zero problem with staying federated with instances I vehemently disagree with. But I also have little desire to stay on one that "Free Speech"es itself into becoming a safe space for trolls and disinformation peddlers.

[–] Grangle1@lemm.ee 6 points 1 year ago (1 children)

The one problem, specifically with this type of conversation, is that anyone even in the center is not welcome, because the echo chamber is so strong that anything even remotely centrist is instantly labeled "misinformation". Who decides what the difference between "opinion I disagree with" and "misinformation" is? Far too often it's left to a person or group, be it on the left or the right, that holds that anything they or the most vocal political users disagree with is "dangerous misinformation". And I tend to notice that unless it's a specific right-wing instance like explodingheads, anything that's not on the far left is either downvoted to oblivion or outright removed, and anyone who posts or says anything positive about it is effectively driven out, including people who argue such things in good faith. That tends to lead to the creation of instances like explodingheads and the attitudes of the people who reside there.

[–] dmention7@lemm.ee 1 points 1 year ago (1 children)

That is a fair point, but I submit it's kind of tangential, or maybe orthogonal, to the core topic. The problem of people not being able to discuss controversial topics maturely is not improved by hosting clearly bad-faith conversation. That just poisons the well and makes it even harder to hold the good-faith conversations.

You don't wring your hands about throwing out the baby with the bathwater when you're faced with a bucket of sewage.

[–] Grangle1@lemm.ee 4 points 1 year ago

But that's how what I mentioned happens: the vocal users decide that anything that disagrees with them is that "sewage" or "poison", even if it is legitimate, and then you end up with that echo chamber situation. I would think that proper moderation of political communities would ensure that polite, good-faith argument, regardless of the political leaning of the view, would be allowed, but that's often not how it happens, because moderators and vocal users define what a good-faith argument is mostly by whether it agrees with their own view.

[–] shootwhatsmyname@lemm.ee 6 points 1 year ago* (last edited 1 year ago) (1 children)

Hell no, I’m not advocating for defederation or defending those communities in any way. Step back and look at the bigger picture with me: I think there are potential problems with moderating based on vague, non-concrete criteria, and I’m trying to further the discussion so we can define them better together.

If we’re going to remove the communities OP is referring to, for example, we need to define (1) what qualifies as misinformation and trolling, and (2) what content/communities/users we’re proposing to remove in the future.

If we use dictionary definitions…

misinformation: false or inaccurate information, especially that which is deliberately intended to deceive

troll: a person who makes a deliberately offensive or provocative online post

…then admins will have the new responsibility of (1) deciding whether content is true or false, (2) determining the intent of the content creator, and (3) deciding what is offensive or provocative.

Are we going to remove content if it offends someone? Will admins be deleting content based on the assumed intentions of the creator?

That’s not the instance I signed up for, and it also goes against basic human rights. I can see it being highly problematic for moderators and admins in the long run unless we move away from being a “general purpose” instance.

[–] lagomorphlecture@lemm.ee -1 points 1 year ago (1 children)

Hello, OP here and I just circled back to this. I want to clarify that I personally used the word disinformation and never used the word misinformation. These are different.

Misinformation is false or inaccurate information—getting the facts wrong. Disinformation is false information which is deliberately intended to mislead—intentionally misstating the facts.

I think the majority of people are OK with interacting with people who are misinformed. We've all been misinformed on various topics at different times in our lives. The community in question, however, and communities like it, exist for the sole purpose of spreading lies to cause harm to individuals or society. I probably should not have used the words "right wing" in my post, but the majority of communities spreading harmful disinformation right now tend to be right wing. The admins removed the community, and I take that as a sign that they will remove other disinformation communities as well, including but not limited to right-wing ones.

The community in question had links to known disinformation sites. I and others reported those posts, and I included links to sources identifying those websites as disinformation sites. The moderator of that community "resolved" the reports without removing the content, which I have since confirmed removes the reports from the admin queue. Herein lies a serious problem: a bad actor can conceivably post a bunch of intentionally misleading information and links, then clear the reports to hide them from the admins, and there is no clear way to report a community or mod who is acting in bad faith. I was also unable to find anything in the instance guidelines that specifically outlined a stance on disinformation, which was concerning, especially given that the mod was "resolving" reports in that way.

I'm not really looking for a lot of further discussion. I think the difference between disinformation and misinformation is important, and it seems to have initially been misunderstood by some people. I would hope the admins make an official clarification on the issue, but since they removed the community, I think the position is clear. Any further discussion about whether anyone wants to be on an instance that hosts disinformation communities should be directed to the admins.

Again, this isn't about misinformation, as we have all experienced "being wrong". Anyone who has been misinformed and is interacting with the wider community in good faith will generally listen when someone gives them valid information to the contrary. People spreading disinformation are acting in bad faith and will continue intentionally spreading known lies. This distinction is important to my original post, and I just wanted to clarify it for anyone who happens in here.

[–] shootwhatsmyname@lemm.ee 2 points 1 year ago

A few things wrong with what you said:

  1. You did use the word “misinformation” in the comment I was replying to: “We're talking about a community that is dedicated to posting misinformation and apparently trolling.”

  2. The definition of misinformation I used in my previous comment includes “deliberate deception” and matches your definition of disinformation, so if I’m not mistaken we are actually on the same page there and my points are still relevant to the discussion.

I totally agree there needs to be a good way to report communities to admins. I also think vote manipulation, making multiple accounts, brigading, automated posting, and other ways of manipulating the system to push an opinion should be prohibited.

What I don’t agree with is removing communities for “disinformation.” What’s happening is:

  1. Those community mods banned you because you posted “disinformation”
  2. You want the community removed because they are posting “disinformation”

I think there’s an inherent flaw with our definitions of truth here. If you say one thing is truth, and some community mods say another thing is truth, how do we decide which voice is silenced on lemm.ee?

This might be a hard one to digest, but please genuinely consider it: I think we are mis-labeling opinion and calling it truth without realizing it. Calling your own opinion absolute truth is a very dangerous game to play when you are making decisions for other people (read history to learn more). And, like I initially pointed out, moderating an entire instance based on opinion doesn’t seem to line up with a “general purpose” instance like lemm.ee in my opinion. Do you think that’s plausible?

I know you said you don’t want to discuss further, but it’s hard to learn from you if you don’t converse or answer questions. If you have specific thoughts or disagreements with any of the points I’ve made, I’d appreciate hearing them.