[–] Zombo@partizle.com 1 points 1 year ago* (last edited 1 year ago) (2 children)

Yes, there is potential for a slippery slope. And any filtering technology could be used for nefarious purposes. But this strikes me as pretty far from that slope, and the purpose is clearly a good one. Remember, you can always just turn it off.

[–] dragonfornicator@partizle.com 1 points 1 year ago (1 children)

You can only turn it off until you can't. The road to hell is paved with good intentions.

[–] Zombo@partizle.com 1 points 1 year ago

That's kind of the risk with any technology. And I admit, it is the most likely way we lose control: someone will ask, "why does Apple let you turn off the child porn filter?" and the answers may not be enough for lawmakers or an angry mob.

The same could be said of a great many tools that filter bad content, from spam filtering to DDoS mitigation. Should a technology be kept from consumers based on a hypothetical? That's just as bad.

If a technology exists to filter content I don't want to see, who are you to tell me Apple shouldn't sell me a device with that technology I want?