[–] lloram239@feddit.de 0 points 1 year ago (1 children)

I think we might end up with the Microsoft/Apple/Google situation all over again. While it's easy to build an AI, having to jump between AIs for each and every task is no fun. I think the one that wins the golden goose is the one that manages to build a complete OS with AI at its core, i.e. instead of a Unix shell, you just have a ChatGPT-like thing sitting there that can interact with all your data and other software in a safe and reliable manner. Basically the computer from Star Trek, where you just tell it what you want and it figures out how to get it.
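(Editor's note: a minimal sketch of the "AI instead of a shell" idea described above. It loops on natural-language requests and routes them through a small set of vetted tools, which is one possible way to get the "safe and reliable" part. `ask_llm` is a hypothetical placeholder, not any particular vendor's API.)

```python
from pathlib import Path

def ask_llm(prompt: str) -> dict:
    """Hypothetical LLM call; expected to return {"tool": name, "args": {...}}."""
    raise NotImplementedError("plug in an LLM of your choice here")

# The AI never touches the system directly; it can only pick from approved tools.
TOOLS = {
    "list_files": lambda path=".": sorted(p.name for p in Path(path).iterdir()),
    "read_file": lambda path: Path(path).read_text(),
}

def ai_shell() -> None:
    while True:
        request = input("you> ")
        if request in {"exit", "quit"}:
            break
        plan = ask_llm(f"Choose a tool and arguments for: {request!r}")
        tool = TOOLS.get(plan.get("tool"))
        if tool is None:
            print("refused: not an approved action")
            continue
        print(tool(**plan.get("args", {})))
```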

The fact that others can spin up their own LLM won't help here, as whoever gets to be the default AI that pops up when you switch on your computer will be the one that has the control and can reap the benefits.

[–] nossaquesapao@lemmy.eco.br 1 points 1 year ago (1 children)

While a full AI computer can sound cool, it would behave in a non-reproducible and error-prone way.

[–] lloram239@feddit.de 1 points 1 year ago (1 children)

Yes, but whoever overcomes those problems will be the next Microsoft/Apple/Google (or get rich by getting bought by one of them). I think a large paradigm shift in how we do computing is unavoidable; LLMs are way too powerful to be left as mere chatbots.

[–] nossaquesapao@lemmy.eco.br 1 points 1 year ago (1 children)

Do you think these problems are solvable, and not inherent characteristics? I don't know; I expect to see computers with high-performance AI modules, but not a full AI computer.

[–] lloram239@feddit.de 1 points 1 year ago* (last edited 1 year ago)

Just have the LLM output verifiable scripts instead of manipulating the data directly itself, and keep the data under version control so the AI can undo changes. All pretty doable, though maybe tricky to retrofit into old apps.
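(Editor's note: a minimal sketch of that flow, assuming a hypothetical `generate_script` LLM call and a data directory that is already a git repository. The model proposes a script, a human or automated check approves it, and every run becomes a revertable commit.)

```python
import subprocess
from pathlib import Path

DATA_DIR = Path("data")  # assumed to already be a git repository

def generate_script(task: str) -> str:
    """Hypothetical LLM call that returns a shell script for the given task."""
    raise NotImplementedError("plug in an LLM of your choice here")

def run_task(task: str) -> None:
    script = generate_script(task)
    print(script)  # the script is inspectable before anything is touched
    if input("Run this script? [y/N] ").strip().lower() != "y":
        return
    subprocess.run(["bash", "-c", script], cwd=DATA_DIR, check=True)
    # Commit the result so every change the AI makes is a revertable snapshot.
    subprocess.run(["git", "-C", str(DATA_DIR), "add", "-A"], check=True)
    subprocess.run(["git", "-C", str(DATA_DIR), "commit", "-m", f"AI: {task}"], check=True)

def undo_last_change() -> None:
    # Roll back whatever the last AI-generated script did to the data.
    subprocess.run(["git", "-C", str(DATA_DIR), "revert", "--no-edit", "HEAD"], check=True)
```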