this post was submitted on 30 Oct 2023
525 points (95.5% liked)

Technology

59568 readers
4285 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
 

AI one-percenters seizing power forever is the real doomsday scenario, warns AI godfather: The real risk of AI isn't that it'll kill you. It's that a small group of billionaires will control the tech forever.

[–] SupraMario@lemmy.world 19 points 1 year ago (3 children)

Uhh what? You can totally run LLMs locally.

[–] MooseBoys@lemmy.world 10 points 1 year ago

Inference, yes. Training, no. Derived models don’t count.
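The inference/training split above mostly comes down to memory. A rough back-of-envelope sketch, using assumed rule-of-thumb numbers (not measurements), of what a 7B-parameter model needs in each case:

```python
# Back-of-envelope memory estimates for a 7B-parameter model.
# The multipliers are common rules of thumb, not exact figures.

def inference_memory_gb(params_billion: float, bytes_per_param: float = 2.0) -> float:
    """Inference needs roughly just the weights: ~2 bytes/param at fp16."""
    return params_billion * bytes_per_param

def training_memory_gb(params_billion: float) -> float:
    """Full training with an Adam-style optimizer needs weights, gradients,
    optimizer states, and fp32 master weights: roughly ~16 bytes/param."""
    return params_billion * 16.0

print(inference_memory_gb(7))  # 14.0 GB: within reach of one consumer GPU
print(training_memory_gb(7))   # 112.0 GB: multi-GPU cluster territory
```

That order-of-magnitude gap (before even counting the compute for trillions of training tokens) is why hobbyists can run models locally but can't train them from scratch.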

[–] Jeremyward@lemmy.world 7 points 1 year ago (1 children)

I have Llama 2 running on localhost; you need a fairly powerful GPU, but it can totally be done.

[–] SailorMoss@sh.itjust.works 4 points 1 year ago

I’ve run one of the smaller models on my i7-3770 with no GPU acceleration. It is painfully slow but not unusably slow.
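"Painfully slow but not unusably slow" matches a common estimate for CPU-only generation: it's memory-bandwidth bound, so each token costs roughly one full pass over the weights from RAM. A sketch, with assumed (hypothetical, order-of-magnitude) numbers for an i7-3770-class machine:

```python
def est_tokens_per_sec(model_weights_gb: float, mem_bandwidth_gb_s: float) -> float:
    """Bandwidth-bound upper estimate: generating one token streams
    the entire weight set from RAM once."""
    return mem_bandwidth_gb_s / model_weights_gb

# Assumed figures: dual-channel DDR3-1600 gives ~25 GB/s theoretical
# bandwidth; a 7B model quantized to 4 bits is ~3.5 GB of weights.
print(round(est_tokens_per_sec(3.5, 25.0), 1))  # ~7 tokens/s at best
```

Real throughput lands below this ceiling, which is slow next to a GPU but fine for short interactive prompts.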

[–] jcdenton@lemy.lol -4 points 1 year ago

To get the same level of quality as something like ChatGPT?