this post was submitted on 05 Feb 2024
133 points (98.5% liked)
Technology
I'm highly doubtful that scammers could get enough real video of multiple employees at the same company to train an AI to pull this off convincingly. Celebrities, yes. Regular people, no.
However, Occam's Razor tells me this employee knows exactly where that money went and plans to quietly slip away to a tropical island to retire, after getting fired for being "gullible."
There are lots of stupid people.
You need lots and lots of real video of a person to train an AI to make fake videos of that person. So, unless the CFO and the other allegedly faked employees are all YouTubers, there's very good reason to consider more plausible explanations.
To your point, you are correct: there are lots of stupid people. This includes people who will blindly believe that AI can just magically do anything and not even consider simpler explanations for things like this.
I think it was just last year that there was a story about some school official claiming to have been duped into paying scammers millions from the school's funds, only to later be caught making the whole thing up in an attempt to steal the money. (Maybe somebody remembers enough to find a link.) So it's not remotely far-fetched to think that's what could be happening here.
Why even use AI? Mocap + features reconstructed from a photo + your own acting under that mask + one spam call to steal a voice sample + a pitch shifter to slightly modify your voice. We have fucking Hololive, where vtubers are anime girls streaming in real time. If it means at least $1 mil, you can find the right people to do a good deepfake on a budget. And they probably won't use an LLM if they're worth hiring, since it's unreliable and, as you said, needs training data, processing power, time, etc. You'd be surprised how many things people do as shitposts in AE and Blender. Making Biden say the N-word to Obama is easier than you think when you know who to ask.
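The pitch-shifter step really is the cheapest part of that pipeline. A minimal sketch in plain numpy (my own toy example, not any real tool: naive resampling, which also changes the clip's duration; actual voice changers use a phase vocoder to keep timing intact, and the 440 Hz test tone and two-semitone shift are arbitrary):

```python
import numpy as np

def pitch_shift(signal, semitones):
    """Naive pitch shift by resampling: reading samples 'faster'
    raises the pitch but also shortens the clip."""
    factor = 2 ** (semitones / 12)          # equal-tempered semitone ratio
    idx = np.arange(0, len(signal), factor)  # fractional read positions
    return np.interp(idx, np.arange(len(signal)), signal)

def peak_hz(signal, sr):
    """Dominant frequency of a signal via the FFT magnitude peak."""
    spec = np.abs(np.fft.rfft(signal))
    return np.fft.rfftfreq(len(signal), 1 / sr)[spec.argmax()]

sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)   # one second of A4
shifted = pitch_shift(tone, 2)       # up two whole semitones

print(round(peak_hz(tone, sr)), round(peak_hz(shifted, sr)))  # ~440 ~494
```

A few lines of interpolation already move a voice out of easy recognition; libraries like librosa or a real phase vocoder do it without the chipmunk speed-up.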
It's not an LLM problem; it's their stupid security and verification processes, plus this person being a gullible idiot.