this post was submitted on 13 Sep 2023
1075 points (97.9% liked)
Technology
you are viewing a single comment's thread
You're good, but no, I haven't worked in IT. I've job-hopped in manufacturing most of my life. I just went to high school in the early 2000s, and in my experience those particular things were ubiquitous enough to be common knowledge. I fully understand that there are people out there who have no idea how to operate a computer, and it also makes sense to me why an IT person would see the most numerous and most extreme examples of this. But I think that's precisely why you have a bias in the other direction: everybody who has to come to you is likely an idiot, but that doesn't mean everybody who isn't an IT professional is also an idiot.
I agree, that's a decent point, but I have a counterpoint. I think sheer numbers alone, especially in the context of computers, would give more accurate results, even if they could be somewhat biased; a larger sample size is more likely to give an accurate picture of what's going on. I also think if you compare an IT person to a non-IT person, the IT person is going to correctly identify whether Firefox is a search engine or a browser 10 times out of 10, whereas with a non-IT person those numbers could be anywhere except 10/10. Most likely, anyway. lol