[–] vulgarcynic@sh.itjust.works 29 points 2 days ago (2 children)

If you look at the enterprise pricing and options for Copilot and Security Copilot, they're building a pretty obvious business model around automating everything from basic end-user tasks to tier 1 incident response.

I'm not saying it will work, especially as someone who works in IR, but all the big players are pushing for security automation. All it's going to take is one high-profile incident to get CSOs and the like to jump in with both hands full of "AI" purchase orders.

The shittiest part is that this is only going to eliminate more entry-level secops jobs, the jobs that are generally a great place to start in the industry.

[–] bizarroland@fedia.io 2 points 1 day ago

This is the real concern I have.

It's already hard enough to keep a job in IT that doesn't drive you absolutely crazy, between dealing with people who still can't use a computer, people who make a hundred times what you do treating every single blink on their screen as a tier-one crisis, and still having to do the actual work, which is building up and maintaining the systems those people use.

If they also reduce the need to hire PFYs (pimply-faced youths), so the old blood never gets refreshed with new blood, it's going to tank the entire sector.

[–] tdawg@lemmy.world 15 points 2 days ago (1 children)

It's also going to create more headaches for the people left to fix things.

"Shut up and use the AI we are paying a fortune for!"

Proceeds to figure everything out themselves and work themselves to death