I don't really look at it as a symptom of a lack of graphics throughput, but more as a benefit of eye tracking, which could also improve immersion for other people by portraying your facial expressions more realistically, or something to that effect. You could also use it as a kind of peripheral for games or software, and Apple currently uses it as a pointer, so it's not totally useless. But I also can't imagine most developers being imaginative enough to make good use of it, when we can't even come up with good uses for basic shit like haptic feedback.
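To make the "peripheral" idea a bit more concrete, here's a minimal Python sketch of dwell-based gaze selection, the kind of thing a gaze-as-pointer input scheme tends to do. Everything here is made up for illustration (the gaze stream format, the dwell time, the target radius); it's not any real headset SDK.

```python
DWELL_SECONDS = 0.6      # how long the gaze must rest on a target to "click" (assumed value)
TARGET_RADIUS = 0.05     # normalized radius around a UI element (assumed value)

def within(gaze, target):
    """True if a normalized (x, y) gaze point falls inside the target circle."""
    gx, gy = gaze
    tx, ty = target
    return (gx - tx) ** 2 + (gy - ty) ** 2 <= TARGET_RADIUS ** 2

def dwell_select(gaze_stream, target):
    """Return True once the gaze has rested on `target` for DWELL_SECONDS.

    `gaze_stream` is assumed to yield (timestamp_seconds, (x, y)) samples.
    """
    dwell_start = None
    for timestamp, gaze in gaze_stream:
        if within(gaze, target):
            if dwell_start is None:
                dwell_start = timestamp          # gaze just arrived on the target
            elif timestamp - dwell_start >= DWELL_SECONDS:
                return True                      # held long enough: treat as a click
        else:
            dwell_start = None                   # gaze left the target; reset the timer
    return False
```

The dwell timer is basically a stand-in for whatever confirmation gesture the platform uses (Apple pairs gaze with a finger pinch instead of a dwell, for instance).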
Perhaps it breaks even by saving money they otherwise would've spent on rendering hardware, but I dunno if that's the case, since the eye-tracking cameras have to be pretty low latency, and you still have to dedicate hardware resources to the eye tracking and the foveated rendering in order to get it to look good. Weight savings, then? I just don't really know. I guess we'll see if it gets more industry adoption.
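For a rough picture of the trade-off, here's a Python sketch of the bookkeeping foveated rendering implies: a resolution falloff around the gaze point, plus a check that the gaze sample isn't too stale. The degree thresholds and the eye-movement speed are illustrative assumptions, and real systems do this per tile on the GPU rather than in code like this.

```python
FOVEA_DEG = 5.0        # full resolution inside this eccentricity (assumed value)
PERIPHERY_DEG = 30.0   # beyond this, drop to the coarsest level (assumed value)

def resolution_scale(eccentricity_deg):
    """Resolution multiplier for a screen region, given its angular distance
    from the gaze point: 1.0 = full resolution, 0.25 = quarter resolution."""
    if eccentricity_deg <= FOVEA_DEG:
        return 1.0
    if eccentricity_deg >= PERIPHERY_DEG:
        return 0.25
    # linear falloff between the fovea and the periphery
    t = (eccentricity_deg - FOVEA_DEG) / (PERIPHERY_DEG - FOVEA_DEG)
    return 1.0 - 0.75 * t

def gaze_sample_usable(gaze_age_ms, eye_speed_deg_per_s=300.0):
    """Rough latency check: if the sample is old enough that the eye could
    have moved outside the foveal region, the foveation map is already wrong.
    The 300 deg/s figure is an assumption, not a measured number."""
    return eye_speed_deg_per_s * (gaze_age_ms / 1000.0) < FOVEA_DEG
```

The second function is the point of the latency complaint: the pixel savings only hold if the gaze data arrives fast enough that the full-resolution region still sits where the eye actually is.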