this post was submitted on 10 Apr 2024
Programmer Humor
LLMs are in a position to make boring NPCs much better.
Once they can be run locally at a good speed it'll be a game changer.
I reckon we'll start getting AI cards for computers soon.
We already do! And on the cheap! I have a Coral TPU running presence detection on some security cameras. I'm pretty sure they can run LLMs too, but I haven't looked into it.
GPT4All runs rather well on a 2060, and I'd imagine it runs a lot better on newer hardware.
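For anyone curious, GPT4All ships Python bindings, so a local NPC-dialogue loop is only a few lines. Here's a rough sketch — the NPC name, persona text, and model filename are just placeholder assumptions, and the actual `GPT4All` call is commented out because it downloads model weights (several GB) on first use:

```python
# Sketch of a local-LLM NPC using the gpt4all Python bindings.
# The prompt helper is plain Python; the model call is shown but
# commented out since it fetches weights on first run.

def npc_prompt(name: str, persona: str, player_line: str) -> str:
    """Build a simple in-character prompt for an NPC reply."""
    return (
        f"You are {name}, {persona}. "
        f"Stay in character and answer in one or two sentences.\n"
        f"Player: {player_line}\n"
        f"{name}:"
    )

prompt = npc_prompt(
    "Mira",                                    # hypothetical NPC name
    "a grumpy blacksmith in a fantasy village",  # hypothetical persona
    "Can you repair my sword?",
)

# from gpt4all import GPT4All
# model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")  # example model file
# reply = model.generate(prompt, max_tokens=64)
# print(reply)
print(prompt)
```

The point is that the game only has to build a short persona prompt per NPC; the same local model can serve every character, so there's no per-NPC cost beyond the prompt text.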