this post was submitted on 19 Nov 2023

Apple


A place for Apple news, rumors, and discussions.

[–] QVRedit@alien.top 1 points 1 year ago (2 children)

An obvious possibility is that they want to beef up their neural engine to better handle AI workloads. Are they going to add 'on-device' LLM runtime support?

[–] MasterofOreos@alien.top 1 points 11 months ago (1 children)

LLM, emphasis on the Large. A standard LLM would take up so much storage space that the user wouldn't have much left, to say nothing of the continuous processing power needed.

Siri + a local "small" language model. It could have full access to on-device data, while more generalized queries are handled remotely.
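The hybrid setup described above could be sketched roughly like this. This is purely illustrative, assuming a keyword heuristic for routing; the function names and the routing rule are hypothetical, not any real Apple API:

```python
# Hypothetical sketch of the hybrid idea: a small on-device model handles
# queries that touch personal data, while general-knowledge queries go to
# a remote service. The keyword heuristic is a stand-in for a real classifier.

def needs_local_context(query: str) -> bool:
    # Naive heuristic: route on-device when the query references
    # personal data (contacts, photos, messages, calendar, ...).
    personal_keywords = {"my", "contacts", "photos", "messages", "calendar"}
    return any(word in query.lower().split() for word in personal_keywords)

def route(query: str) -> str:
    if needs_local_context(query):
        return "on-device"   # small local model with access to user data
    return "remote"          # larger server-side model for general queries

print(route("show my photos from last week"))   # on-device
print(route("what is the capital of France"))   # remote
```

A real system would presumably use a learned router rather than keywords, but the split is the same: privacy-sensitive context stays local, broad world knowledge comes from the server.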

[–] QVRedit@alien.top 1 points 11 months ago

That does seem far more likely, I would agree.