this post was submitted on 15 Jul 2023
2466 points (97.9% liked)

nuff said

[–] PoliticalAgitator@lemmy.world 10 points 1 year ago* (last edited 1 year ago)

The biggest obstacle to spreading far-right propaganda has always been finding a platform.

Before the internet, when neonazis tried to shove racist leaflets into people's pockets at punk gigs, they'd be immediately run out of the venue, despite "angry, disaffected, young people" being exactly the kind of vulnerability they were looking for.

When the internet did come along, initially things weren't much better. Sure, there were sites like Stormfront, but nobody went there. So instead they'd "raid" other forums to spread their shitty views, getting instantly banned because they hadn't figured out how to be a Nazi with plausible deniability yet.

When they finally nailed that, it was a big moment for them.

Historically, mainstream media also never gave a fuck what the opinions of Nazis were. But the moment they rebranded to "alt-right", the psychopathic, for-profit, neoliberal media companies saw a way to make some quick cash without having to openly admit they were functioning as a mouthpiece for people with swastika tattoos.

From there, the "mask on, hide your powerlevel" strategy was codified. 4chan and far-right Discord servers openly strategized about how to do it best, such as presenting their dogshit opinions as popular, moderate beliefs and blaming progressives for their asshole personalities.

By the time Charlottesville's swastika-waving parade and domestic-terrorism-finale happened, it was too late. Key figures in the far-right funnel had settled into social media like bedbugs at a two-star hotel.

Whenever a platform tried to get rid of them, they'd slip away through cracks in the walls. They would get banned and create new accounts that were slightly toned down, searching for that sweet spot of "as far-right as we can get away with". They'd move to another major platform (or somewhere else on the same platform), because there was no coordinated effort to remove them for good.

But despite the slow, uncoordinated response from social media sites, it was starting to work, especially on Twitter. By the time you'd hidden how far-right you were, you could no longer spread your message. Nobody was fooled by the dog whistles, fake engagement and flowery misrepresentations of "freedom of speech" any more.

Initially, they tried their own mask-off Twitter with Truth Social (which conspicuously isn't being sued by Musk for being a Twitter clone). But the numbers were dogshit. It had a fraction of the traffic and everybody there was already far-right. You could keep them frothy, but you couldn't breed more of them.

So Musk bought Twitter. Ideally, he wanted to just hand one of the big three socials back to right-wing reactionaries and extremists, but he also has no problem just killing the platform.

The only thing that mattered was that the deplatforming stopped before people realised that it works and makes sites 1000x better.