this post was submitted on 05 Feb 2024
253 points (95.3% liked)

politics

19120 readers
2679 users here now

Welcome to the discussion of US Politics!

Rules:

  1. Post only links to articles; the title must fairly describe the link's contents. If your title differs from the site's, it should only be to add context or be more descriptive. Do not post entire articles in the body or in the comments.

Links must be to the original source, not an aggregator like Google Amp, MSN, or Yahoo.

  2. Articles must be relevant to politics. Links must be to quality and original content. Articles should be worth reading. Clickbait, stub articles, and rehosted or stolen content are not allowed. Check your source for Reliability and Bias here.
  3. Be civil; no violations of the TOS. It's OK to say the subject of an article is behaving like a (pejorative, pejorative). It's NOT OK to say another USER is (pejorative). Strong language is fine, just not directed at other members. Engage in good faith and with respect! Accusing another user of being a bot or paid actor is also uncivil. Trolling is uncivil and is grounds for removal and/or a community ban.
  4. No memes, trolling, or low-effort comments. Reposts, misinformation, off-topic posts, trolling, and offensive content are not allowed. Similarly, if you see posts along these lines, do not engage. Report them, block them, and live a happier life than they do. We see too many slapfights that boil down to "Mom! He's bugging me!" and "I'm not touching you!" Going forward, slapfights will result in removed comments and temp bans to cool off.
  5. Vote based on comment quality, not agreement. This community aims to foster discussion; please reward people for putting effort into articulating their viewpoint, even if you disagree with it.
  6. No hate speech, slurs, celebrating death, advocating violence, or abusive language. This will result in a ban. Usernames containing racist or otherwise inappropriate slurs will be banned without warning.

We ask that users report any comment or post that violates the rules and use critical thinking when reading, posting, or commenting. Users who post off-topic spam, advocate violence, have multiple comments or posts removed, weaponize reports, or violate the code of conduct will be banned.

All posts and comments will be reviewed on a case-by-case basis. This means that some content that violates the rules may be allowed, while other content that does not violate the rules may be removed. The moderators retain the right to remove any content and ban users.

That's all the rules!

Civic Links

Register To Vote

Citizenship Resource Center

Congressional Awards Program

Federal Government Agencies

Library of Congress Legislative Resources

The White House

U.S. House of Representatives

U.S. Senate

Partnered Communities:

News

World News

Business News

Political Discussion

Ask Politics

Military News

Global Politics

Moderate Politics

Progressive Politics

UK Politics

Canadian Politics

Australian Politics

New Zealand Politics

founded 1 year ago
all 34 comments
[–] mofongo@lemm.ee 26 points 9 months ago

A day later, Biden announced, “If you harm an American, we will respond”, and dropped missiles on more than 80 targets across Syria and Iraq. Sure bro, just so long as the Americans aren’t teenagers with smart phones.

Who wrote this??? Sure bro, that sounds like some social media comment. And why use a retaliation bombing as a positive example??? That's just absurd. Bombings are never good and can always wound or kill innocent people. I get the message that social media has negative effects on teenagers, but by god the writing style is just braindead.

[–] bomberesque1@lemm.ee 22 points 9 months ago

That's definitely his wanking face

[–] _sideffect@lemmy.world 13 points 9 months ago (1 children)

No one with money cares, and no one who wants to be the guy with money cares either

It's us, the ones without the money, who care. And we CAN make a change, but we don't know how.

Not yet.

[–] FigMcLargeHuge@sh.itjust.works 13 points 9 months ago (2 children)

And we CAN make a change but we don’t know how.

Start with "stop using their software." If they have no data to sell, their company is worthless, but the people you claim care so much won't stop. I have heard so many excuses as to why people "need" social media, it's maddening, and yet somehow I manage to survive without sites like Facebook, etc. I am only on Lemmy as an anonymous contributor, and if I feel like things are headed towards where they are with other social media, I will drop it like a hot potato. Delete your accounts, and stop giving them data.

[–] jherazob@kbin.social 8 points 9 months ago

The problem of course is that the vast mass of consumers won't do this, it's only us weirdos

[–] GluWu@lemm.ee 5 points 9 months ago (1 children)

Reducing the supply increases the value. Create data, just make it worthless. You'll be doing your part. If enough people piss in the pool, it will eventually reach a concentration that nobody wants to touch.

[–] FigMcLargeHuge@sh.itjust.works 2 points 9 months ago

That would work if you had control over what data they collected. At this point they are collecting data that you don't get to see, and you have no say in what they do with it. I realize people signed up for this, but I think they have also been duped or at least denied the ability to even understand what gets collected. You can't pee into a pool you aren't allowed to even get near.

[–] TheBat@lemmy.world 9 points 9 months ago (1 children)
[–] THEDAEMON@lemmy.ml 5 points 9 months ago* (last edited 9 months ago)

Yep, I wish tech giants would show some standards. But then again, they wouldn't be where they are now.

[–] RealFknNito@lemmy.world 8 points 9 months ago (1 children)

"I'm not responsible for the wellbeing of my children, big tech is!" This kind of shit is how we get forced into having to use government IDs to use the internet. Some states already do it for porn. The danger isn't big tech, it's harassment that should be taken care of offline.

You want to regulate them go for it but what if they go for the internet as a whole and not just social media? Then what?

[–] theangryseal@lemmy.world 0 points 9 months ago (1 children)

OH, SO, stalker calls me and tells me to end myself you mean I can’t get paid by Verizon? Bullshit! I should be protected. The KIIIIIDDDS should be protected!

Someone uses a bulletin board at a post office to post a picture of my kid with captions on it that cause my kid to feel depressed, I WANT LAWS! The person who is in charge of the post office? Straight to jail! Manufacturer of the bulletin board? YUP, prison! Company who made the paper? Everyone who works there should be locked up!

Now, back to reality.

Wherever humans can be social, there will be bad humans using that to hurt other humans. You’re right. It should start with the parents. We didn’t ban kids from using telephones and television. Some parents did. That’s their business.

Facebook should do the best they can to enforce their policies about minors being on the platform, but billions of people use Facebook. Billions. They can’t possibly be responsible for all of it.

I don’t like Facebook as a company, at all. Still though, we should do better to handle our homes and stop counting on outsiders to do it all for us.

[–] RealFknNito@lemmy.world 6 points 9 months ago (1 children)

I don't like Facebook, Twitter, or any of those sites either, nor do I use them, but what I can't get over is that people demand action and I haven't seen any suggestions. Just a demand for change. The only thing that comes to mind is censoring messages/tweets, but wow, that would be a great way to kill the site.

Adults want a free, open experience for themselves but also want a safe, enclosed space for kids. In the same spot. So how do you differentiate between an adult and a kid reliably without an ID?

You can't. And it's why there will never be a solution. Kids will lie to get the adult freedom and then suffer the consequences be it mental or otherwise.

[–] VieuxQueb@lemmy.ca 3 points 9 months ago (1 children)

Letting your kid on the Internet without supervision is akin to giving them unlimited blank plane tickets. Yes, they could experience some very enriching events, but they could, and most likely will, be left hurt and traumatized in an unfamiliar place.

[–] RealFknNito@lemmy.world 2 points 9 months ago (1 children)

But that kind of argument can go back to every generation for every sufficient advancement of media. "Without supervision your kids could watch something traumatizing on TV" i.e. horror movies, and I'm sure the same extends to the radio and even books. The world can be traumatizing but it isn't the world's responsibility to have kid-safe barriers on everything just in case.

[–] VieuxQueb@lemmy.ca 3 points 9 months ago

That's my point. It's your kids not mine. Not my job to care for them irl and not my job to care for them on the Internet.

[–] Semi-Hemi-Demigod@kbin.social 7 points 9 months ago

If Zuck could monetize dead kids maybe he'd care

[–] SpicyLizards@reddthat.com 5 points 9 months ago (1 children)
[–] moitoi@lemmy.dbzer0.com 3 points 9 months ago (2 children)
[–] THEDAEMON@lemmy.ml 1 points 9 months ago

Reptilopology.

[–] SpicyLizards@reddthat.com 0 points 9 months ago

Edna Krabappoly

[–] autotldr@lemmings.world 2 points 9 months ago

This is the best summary I could come up with:


Last week’s grilling of Mark Zuckerberg and his fellow Silicon Valley Übermenschen was a classic of the genre: front pages, headlines, and a genuinely stand-out moment of awkwardness in which he was forced to face victims for the first time ever and apologise: stricken parents holding the photographs of their dead children lost to cyberbullying and sexual exploitation on his platform.

A coroner in Britain found that 14-year-old Molly Jane Russell, “died from an act of self-harm while suffering from depression and the negative effects of online content” – which included Instagram videos depicting suicide.

Yet Silicon Valley’s latest extremely disruptive technology, generative AI, was released into the wild last year without even the most basic federally mandated product testing.

Last week, deep fake porn images of the most famous female star on the planet, Taylor Swift, flooded social media platforms, which had no legal obligation to take them down – and hence many of them didn’t.

Could there be any possible downside to releasing this untested new technology – one that enables the creation of mass disinformation at scale for no cost – at the exact moment in which more people will go to the polls than at any time in history?

To understand America’s end-of-empire waning dominance in the world, its broken legislature and its capture by corporate interests, the symbolism of a senator forcing Zuckerberg to apologise to bereaved parents while Congress – that big white building stormed by insurrectionists who found each other on social media platforms – does absolutely nothing to curb his company’s singular power is as good as any place to start.


The original article contains 1,183 words, the summary contains 264 words. Saved 78%. I'm a bot and I'm open source!

[–] snownyte@kbin.social 0 points 9 months ago

This whole article is just golden with contradictions.

[–] Halosheep@lemm.ee 0 points 9 months ago (2 children)

This is silly. It's not any one company's job to parent your children.

It wouldn't hurt for Facebook to provide the tools to better handle the situation, but it's not like Zuck or anyone at Facebook directly participated in any of this.

[–] JackiesFridge@lemmy.world 6 points 9 months ago

It's not their job, but they can do it with very little effort and they don't. Like it or not, children are on Facebook, and if Meta can slap you within 3 seconds of posting a nipple, they can remove content that actually IS harmful.

[–] SpaceCowboy@lemmy.ca 6 points 9 months ago

The Facebook whistleblower said that there was indication that the algorithm was promoting eating disorders to teenage girls. When it was reported to the execs, the reaction of the execs was like "yeah but what kind of ad numbers are we getting on that content?" and decided not to change anything.

Sure I agree people shouldn't let social media algorithms raise their children. But that doesn't mean social media companies should be given carte blanche to behave like psychopaths. They can and should adjust their algorithms when harmful content is being promoted even while parents should be doing more to monitor their children's activity online. We can do both!

But I think they should probably change the CDA so social media companies are liable for the content their algorithms promote. It's actually a removal of some regulation, that's what the silicon valley tech bros want, right? Less regulation?

[–] a22546889@lemmynsfw.com -4 points 9 months ago

No one cares...