As someone with degrees and decades of experience, I urge you not to use it for that. It's a cleverly disguised randomness machine; it will give you incorrect information that is indistinguishable from truth, because "truth" is never the criterion it can use, but being convincing is. It will seed those untruths into you, and unlearning the bad practices you picked up at the beginning might take years and cost you a career. And since you're just starting out, you have no way to tell bullshit from truth as long as the final result seems to work, and that's the worst way to hide the bullshit from you.
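To make the "randomness machine" point concrete, here's a minimal sketch of the sampling step an LLM performs for every token it emits: it scores the candidate continuations and draws one at random in proportion to how plausible each looks. The function name and the toy scores are hypothetical, purely for illustration, but notice that nothing in this step checks whether the output is true.

```python
import math
import random

def sample_next_token(scores, temperature=1.0):
    """Pick the next token by sampling from a softmax distribution.

    This mirrors the core inference loop of an LLM: every candidate
    token gets a plausibility score, and one is drawn at random in
    proportion to those scores. "Sounds right" is the only criterion.
    """
    # Softmax with temperature: higher temperature -> more randomness.
    scaled = [s / temperature for s in scores.values()]
    max_s = max(scaled)
    exps = [math.exp(s - max_s) for s in scaled]  # subtract max for numerical stability
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token according to its probability of sounding plausible.
    return random.choices(list(scores.keys()), weights=probs, k=1)[0]

# Hypothetical scores for continuing "The capital of Australia is ...".
# A common wrong answer can easily outscore the correct one.
toy_scores = {"Sydney": 3.1, "Canberra": 2.8, "Melbourne": 1.5}
print(sample_next_token(toy_scores, temperature=0.8))
```

Run that a few times and you'll get different answers, some of them wrong, all of them delivered with the same confidence.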
The field is already very accessible to everyone who wants to learn it; the amount of guides, examples, teaching courses, and very useful YouTube videos with thick Indian accents is already enormous, and most of them at least try to self-correct, while an LLM actively doesn't; in fact, it tries to do the opposite.
Best case scenario, you're learning inefficiently; worst case scenario, you aren't learning at all.
Thank you, I will take this into consideration. It sure is tempting to use LLMs, but I will always trust experts in the field over them.
Yeah, the scary thing about LLMs is that by their very nature they sound convincing, and it's very easy to fall into a trap: we as humans are hardwired to mistake the ability to talk smoothly for intelligence, so when computers started speaking in complete sentences and holding the immediate context of a conversation, we immediately thought we had a thinking machine and started believing it.
The worst thing is, there are legit uses for all the machine learning stuff, and LLMs in particular, so we can't just throw it all out the window; we will have to collectively adapt to this very convincing randomness machine that is now just here all the time.