[–] Voroxpete@sh.itjust.works 6 points 21 hours ago (2 children)

I think these are actually valid examples, albeit ones that come with a really big caveat: you're using AI in place of a skill that you really should be learning for yourself. As an autistic IT person, I get the struggle of communicating with non-technical and neurotypical people, especially clients, with whom you have to be extra careful. But the reality is, you can't always do all your communication by email. If you always rely on the AI to correct your tone or simplify your language, you're choosing not to build an essential skill that is every bit as important to doing your job well as it is to know how to correctly configure an ACL on a Cisco managed switch.
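(For anyone who doesn't touch networking gear: "configure an ACL on a Cisco managed switch" boils down to a handful of IOS-style config lines like the sketch below. This is purely a hypothetical illustration, not something from the thread; the ACL name, subnet, and VLAN are invented.)

```
! Hypothetical example (names/addresses invented): an ACL that lets the
! management subnet make SSH connections and logs/drops everything else
ip access-list extended MGMT-ONLY
 permit tcp 10.0.10.0 0.0.0.255 any eq 22
 deny   ip any any log
!
! Apply it to traffic arriving from VLAN 10
interface Vlan10
 ip access-group MGMT-ONLY in
```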

That said, I can also see how relying on the AI at first can be a helpful learning tool as you build those skills. There's certainly an argument that by using these tools, but paying attention to their output, you build those skills for yourself. Learning by example works. Used in that way, I think there's potentially real value there.

Which is kind of the broader story with Gen AI overall. It's not that it can never be useful; it's that, at best, it can only ever aspire to "useful." No one has yet demonstrated any ability to make AI "essential," and the idea that we should be investing hundreds of billions of dollars into a technology that is, on its best days, mildly useful, is sheer fucking lunacy.

[–] msage@programming.dev 4 points 17 hours ago (2 children)
[–] CarnivorousCouch@lemmy.world 2 points 7 hours ago

This was an interesting read, thanks for sharing.

[–] Voroxpete@sh.itjust.works 2 points 16 hours ago* (last edited 28 minutes ago)

Noted, I'll be giving that a proper read after work. Thank you.

Edit to add: Yeah, that pretty much mirrors my own experience of using AI as a coding aid. Even when I was learning a new language, I found that my comprehension of the material very quickly outstripped whatever ChatGPT could provide. I'd much rather understand what I'm building because I built it myself. A lot of the time, when you use a solution someone else provided, you don't find out until much later how badly that solution held you back, because it wasn't actually the best way to tackle the problem.

[–] spankmonkey@lemmy.world 4 points 21 hours ago

> If you always rely on the AI to correct your tone or simplify your language, you’re choosing not to build an essential skill that is every bit as important to doing your job well as it is to know how to correctly configure an ACL on a Cisco managed switch.

This is such a good example of how AI/LLMs/whatever are being used as a crutch that is far more impactful than using a spellchecker. A spellchecker catches typos or helps with unfamiliar words, but it doesn't replace the underlying skill of communicating with your audience.