this post was submitted on 18 Sep 2023
-22 points (31.0% liked)
Asklemmy
Maybe I lack foresight, but what can they do... guess which university I study at, or what meds I take? The government could learn much more if it committed itself.
Someone mentioned insurance, which would probably be a fair point, but I live in a country where we don't have it.
Although my anecdote ended with additional data collection, the scary part is the manipulation of action. You might think that, for example, they see you browsing a Pokémon website and therefore show you more Pokémon ads, something that could be mutually beneficial. What you should be worried about is that, based on your browsing behaviour, they figure out how to manipulate your political action, or figure out your state of mental well-being and manipulate that. It is especially horrifying when this is algorithm-driven instead of being pushed by humans. One could imagine, and I want to preface this by saying I'm not aware of this ever having happened, a machine learning algorithm relating signs of some mental illnesses to an uptick in firearm sales, and then increasing advertising of firearms to those people. You could imagine this driving up something like the suicide rate.
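To make the mechanism concrete: here is a purely hypothetical toy sketch (all names and numbers made up, not based on any real ad system) of how a naive targeter could end up concentrating ads on a vulnerable segment. It only tracks which user segment "converted" in the past and boosts that segment, with no notion of *why* the correlation exists:

```python
# Purely hypothetical toy: a naive ad targeter that boosts whatever
# segment historically "converted", with no notion of why.
from collections import defaultdict

class NaiveTargeter:
    def __init__(self):
        self.shown = defaultdict(int)      # impressions per segment
        self.converted = defaultdict(int)  # conversions per segment

    def record(self, segment, converted):
        # Log one ad impression and whether it led to a purchase.
        self.shown[segment] += 1
        self.converted[segment] += int(converted)

    def ad_weight(self, segment):
        # Segments with a higher observed conversion rate get more ads;
        # unseen segments get a neutral default weight.
        shown = self.shown[segment]
        return self.converted[segment] / shown if shown else 0.5

t = NaiveTargeter()
# Synthetic history: suppose the "late_night" browsing segment happens
# to convert more often on firearm ads (an invented correlation).
history = [("daytime", False)] * 9 + [("daytime", True)] \
        + [("late_night", False)] * 5 + [("late_night", True)] * 5
for seg, conv in history:
    t.record(seg, conv)

print(t.ad_weight("daytime"))     # 0.1
print(t.ad_weight("late_night"))  # 0.5 -- this segment now sees more ads
```

The point of the sketch is that nothing in the loop asks whether the correlation is benign; the optimizer just amplifies it.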
There are many other outlets for propaganda, and their effectiveness is as hard to measure as the effectiveness of ads. Figuring out how to manipulate someone is arguably impossible in practice: how would you train an AI when you can't tell whether your actions led to success? Mental conditions are themselves poorly understood and poorly defined, and we only have superficial web browsing data at our disposal.