this post was submitted on 17 Aug 2023
562 points (95.8% liked)

Ask Lemmy

A Fediverse community for open-ended, thought provoking questions

My wife and I started talking about this after she had to help an old lady at the DMV figure out how to use her iPhone to scan a QR code. We're in our early 40s.

[–] nednobbins@lemm.ee 18 points 1 year ago (1 children)

My wife and I regularly joke that one day we'll harass our kids to help us with our neural interfaces, but I don't think that sort of thing will happen any time soon.

When I was a kid in the '80s, a lot of people could already afford computers. They weren't so cheap that everyone had them, but they were within reach for a fair number of people who really wanted one. A C64 cost $595 at launch; that's under $2,000 in today's dollars.

The biggest barrier to computers was that they weren't "user friendly". If you wanted to play a simple video game, you needed to know some basic command-line instructions. Setting up my first mouse on my 8086 involved installing a driver and editing config.sys and autoexec.bat. You couldn't really do anything with a computer in those days unless you were willing to nerd out.
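For anyone who never had the pleasure, it went roughly like this (from memory, so the file names and paths are illustrative; the details depended on which mouse you bought):

DEVICE=C:\MOUSE\MOUSE.SYS      <- a line added to CONFIG.SYS so the driver loaded at boot
C:\MOUSE\MOUSE.COM             <- or you ran the driver from AUTOEXEC.BAT instead

Then you rebooted and hoped.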

At the same time, nerding out on a computer could easily get you deep into its guts in a functional way. I learned that the only way I could play video games at night was to open up the computer and disconnect the speaker wire so it wouldn't alert my parents. I also learned that I could "hack" Bard's Tale by opening the main file with DEBUG and editing it so the store would sell an infinite number of "Crystal Swords".
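The whole session looked something like this (the file name and addresses here are placeholders; I remember the shape of it, not the exact offsets):

C:\BARD> DEBUG GAMEDATA.BIN
-s 0100 ffff "CRYSTAL SWORD"   <- search the loaded file for the item name
-e 2f10                        <- edit the bytes around the match
-w                             <- write the modified file back to disk
-q

Crude, but it worked, and it taught me more about how the game stored its data than any manual could have.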

Today there are more cellular subscriptions than there are people on Earth. Kids walk around with supercomputers in their pockets. But those devices have become so "user friendly" that you barely even need to be literate to operate one. That's generally a good thing, but it removes an incentive to figure out how the stuff works. Most people only bother if they're having trouble getting it to work in the first place.

At the same time, it's gotten much harder to make changes to your computer. The first Apple shipped as a bare circuit board; you still had to supply your own case, power supply, and keyboard. On a modern one you can't even remove the battery without jumping through a lot of hoops. If you edit one of your games' files, you're more likely to trigger piracy or cheat protection than to actually change anything.

There are still large communities of computer nerds, but your average person today basically treats computers like magic boxes.

I'd expect that kind of gap in other areas. Take 3D printing as an example. You can get a printer now for a few hundred bucks. They're already used in industry, but at this point they're still very fiddly. The people who have them at home are comfortable doing stuff like troubleshooting, flashing firmware, wading through bad documentation, and even printing custom upgrades for their printer.

[–] drphungky@lemmy.world 3 points 1 year ago (1 children)

My wife and I regularly joke that one day we’ll harass our kids to help us with our neural interfaces

This but unironically.

Seriously, there will be new interfaces in the next 20 years. People always underestimate tech change and growth, and we can already see VR and AR in their infancy. Do you remember what it was like watching your parents or grandparents hunt-and-peck type, or struggle to double-click something, or double-click things that don't need to be double-clicked? Did you struggle to help them Google their problem (back when Google was useful)?

There will always be Luddites and people who don't adopt new tech, but even among those who do, many are slower or just have a less intuitive understanding of newer tech. This will happen to us. Even simple motion controls in AR will likely be hard for people to pick up and build muscle memory for. Neural interfaces will likely require you to "think a certain way" to interact with things effectively, and I don't doubt many of us will be bad at it. Most likely of all, we'll be bad at something we don't even predict, and many people won't care that much.

I'd argue that in some cases it actually starts with "what's the use?", which tons of millennials have already decided about Twitter, Instagram, TikTok, etc. Even if you recognize the value, you're comfortable and happy not using the new thing, and that's a double whammy when combined with the effort it takes to learn new tech.

[–] nednobbins@lemm.ee 1 points 1 year ago

I think you're sort of right, but it will depend heavily on how radical a shift the new technology is. For this kind of divide to exist, the technology needs a steep learning curve, and people are only willing to put up with a steep learning curve if there's a significant advantage. That means manufacturers can only successfully market "difficult" technologies when they provide a big advantage.

I'm not aware of any old people having difficulty transitioning from quills to fountain pens to ballpoint pens. They all did basically the same thing, and you only had to make minor adjustments. Nobody bothered learning how to use the Writer, since it didn't actually let you do anything better. People were willing to climb the significant learning curve of the typewriter because, once they did, they could write significantly faster.

Computers and cell phones are a whole different way of interacting with people and information than "hardcopy" was. You didn't just swap in objects that did the same thing with a different approach; it wasn't even just a slightly different way of doing the same thing. Those technologies let us interact with the world in a totally new way. It was worth learning a bunch of weird computer stuff that older generations had never heard of, because we could do things they never dreamed of. (For example, I used to get rushed off the phone with my grandmother to save on long-distance bills; now I don't think about long-distance costs at all, apart from latency.)

I'm sure that sort of thing will happen again, but it would require a far more disruptive technology than AR. AR is a small iteration that we've already been primed for: when the first Terminator came out, nobody was confused when the movie switched to "terminator vision" and you saw the AR overlay. That's why I joke about neural interfaces. In theory, they could give a person significantly higher throughput to and from their computer, along with all kinds of other potential benefits. It would be worth putting up with steep learning curves, unintuitive interfaces, and lots of troubleshooting if it meant you could suddenly "read" at 10,000 words a minute or control complex robots. Not everyone would go through that effort, and it would create the kinds of divides that we saw with computers.

When I look at current technologies as an old(ish) person, it's a very different view than my parents and grandparents had. They didn't understand the new technologies. I have no trouble understanding them; I just think a lot of them are a waste of my time (unlike screwing around on Lemmy, which is totally productive /s).