this post was submitted on 10 Jul 2023
Technology
Which is why I find the whole banning TikTok concept absurd.
It’s picking one easy scapegoat company to rally around, while completely ignoring the thousands upon thousands of other applications that collect data on us.
It’s not security, it’s security theater. It’s lazy and designed to distract us, to keep us from asking questions about any company’s practices that might hurt someone politically or financially.
We don’t need to ban TikTok. We need to ban TikTok and the thousands of others like it. We need to have real conversations and put forward real solutions on privacy, globally. It won’t happen, though, because it would cost somebody money.
I've come to the conclusion that it's the algorithms that have become evil. There was a thread where someone was asking for help stopping YouTube from radicalizing their mother with the videos it kept suggesting to her.
I use apps like newpipe and freetube to get away from these personalized feeds, since there is still good content on YouTube. The problem is that so many sites try to keep you there as long as possible and then start feeding you content that can warp people. But the algorithm doesn't understand that impact; all it sees is a 0 or a 1, user stays or user leaves.
Algorithms can't "become evil" any more than your toaster can. They're directed and programmed by people who know exactly what they're intending to achieve.
I'm saying that algorithms, despite having no intent to be evil, have caused harm because they don't account for the context of a recommendation. Someone can start out searching for health information, then go down a rabbit hole of pseudo-health advice, then flat earth, and so on. Not because the algorithm wants to turn people a certain way, but because it just recommends videos that users who liked similar videos found interesting. It only works in broad categories.
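To illustrate the point, here is a minimal sketch of that kind of context-free, co-watch-based recommendation. Everything here is hypothetical (the video labels, the data, the scoring); real recommenders are far more sophisticated, but the failure mode is the same: the system only sees audience overlap, not content.

```python
from collections import defaultdict

# Toy watch histories: each user is just a set of videos they engaged with.
# The labels are hypothetical; a real system sees only opaque video IDs.
histories = [
    {"home workouts", "nutrition basics"},
    {"nutrition basics", "detox cleanses"},
    {"nutrition basics", "detox cleanses"},   # fringe content shares an audience
    {"detox cleanses", "vaccine skepticism"},
    {"vaccine skepticism", "flat earth proofs"},
]

# Count how often two videos are watched by the same user.
co_watch = defaultdict(int)
for history in histories:
    for a in history:
        for b in history:
            if a != b:
                co_watch[(a, b)] += 1

def recommend(video, exclude):
    """Return the most co-watched neighbor of `video`, skipping seen items."""
    candidates = [(n, c) for (v, n), c in co_watch.items()
                  if v == video and n not in exclude]
    return max(candidates, key=lambda x: x[1])[0] if candidates else None

# Follow the chain of "users like you also watched" from an innocuous start.
seen = {"nutrition basics"}
current = "nutrition basics"
chain = [current]
while (nxt := recommend(current, seen)) is not None:
    chain.append(nxt)
    seen.add(nxt)
    current = nxt

print(" -> ".join(chain))
# -> nutrition basics -> detox cleanses -> vaccine skepticism -> flat earth proofs
```

Each individual hop is a reasonable "similar audience" suggestion; it's the chain of hops, with no notion of what the videos actually say, that produces the rabbit hole.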
I wasn't implying algorithms are sentient. At least not yet, until AI integration happens.
If you can be radicalized by videos from YouTube, it isn't the algorithm, it's you.
It's not a company, it's the CCP. There is a massive difference, both in what the organizations can access and in the warrant requirements at the governmental level. I'm getting really tired of having to explain the difference in privacy rights between governments and private institutions. It's just like freedom of speech or religion: it has everything to do with private vs. public institutions.
You don't think Meta, Google etc are passing data to the American government?
It's not about the CCP. It's because kids watch TikTok and then don't like conservatives.