this post was submitted on 19 Nov 2023
Apple
A place for Apple news, rumors, and discussions.
founded 1 year ago
An obvious question is whether they want to beef up their Neural Engine to better handle AI workloads, and whether they are going to add on-device LLM runtime support.
LLM, emphasis on the Large. A standard LLM would take up so much storage space that the user wouldn't have much left, to say nothing of the continuous processing power needed.
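To put rough numbers on that storage claim, here's a quick back-of-the-envelope calculation. The parameter counts and precisions are illustrative, not tied to any actual Apple model:

```python
# Rough storage arithmetic for LLM weights (illustrative only).
def model_size_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate on-disk size of model weights in GB."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A 7B-parameter model at different precisions:
print(model_size_gb(7, 2))    # fp16 -> 14.0 GB
print(model_size_gb(7, 1))    # int8 -> 7.0 GB
print(model_size_gb(7, 0.5))  # 4-bit quantized -> 3.5 GB
```

Even aggressively quantized, a mid-sized model eats a meaningful chunk of a base-storage phone, which is why a smaller on-device model makes more sense.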
Siri + a local "small" language model. It could have full access to on-device data, while more generalized queries are handled remotely.
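That split could look something like a simple router: personal-data queries stay on device, general-knowledge queries go to a server. This is purely a hypothetical sketch; all names and the keyword heuristic are invented for illustration:

```python
# Hypothetical hybrid-assistant router (illustrative only).
# Queries touching personal on-device data go to a local small model;
# everything else is sent to a remote, larger model.
ON_DEVICE_TOPICS = {"contacts", "photos", "messages", "calendar"}

def route_query(query: str) -> str:
    """Return 'local' for personal-data queries, 'remote' otherwise."""
    words = set(query.lower().split())
    if words & ON_DEVICE_TOPICS:
        return "local"   # private data never leaves the device
    return "remote"      # general knowledge handled server-side

print(route_query("show my photos from June"))  # local
print(route_query("who won the world cup"))     # remote
```

A real implementation would presumably classify intent with a model rather than keywords, but the privacy-preserving split is the same idea.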
That does seem far more likely, I would agree.