Tuesday, 19 April, 2022 UTC


Summary

Picking up where we left off last week in examining the drivers and dynamics of the AR glasses market, we turn to the biggest question on everyone’s mind: what will Apple do? That includes the why, the what, and the when, and AR proponents wait eagerly as clues continue to trickle out of Cupertino.
Among the potential product and business models for AR glasses, Apple could end up setting the standard with its rumored market entry. Indeed, many AR proponents are banking on this outcome, as it could accelerate the AR market through Apple’s signature “halo effect.”
But what is Apple’s approach? As we’ve examined, tech giants often invest in emerging tech for one common reason: to future-proof their core business. In Apple’s case, AR glasses could propel its core hardware business in the face of a maturing smartphone market.
“Apple Glass” could accomplish this by both boosting and succeeding the aging iPhone. The former happens because the glasses would rely on the iPhone for local compute and connectivity. The phone gains importance — and user incentive to upgrade — if it powers your smart glasses.
An iPhone succession plan is meanwhile accomplished through a suite of wearables that augments the iThings at the center of our computing lives. That could mean line-of-sight graphics that accompany spatial audio from your AirPods and biometrics from your Apple Watch.
This theory fits Apple’s signature multi-device ecosystem approach, which emphasizes that the whole is greater than the sum of its parts, so you should own several devices (sound familiar?). In this way, AR glasses would be a key puzzle piece in Apple’s much-vaunted wearables road map.

Redefining AR

After covering the why of Apple’s AR glasses, the remaining question is the what: what will they look like, and what will be the primary feature set? We don’t know for sure, but many clues point to the likelihood that Apple will eschew common connotations of AR experiences.
In other words, Apple likely won’t launch AR glasses — at least in version 1 — that employ “heavy AR.” This is world-immersive AR that has spatial and semantic understanding of its surroundings. It’s all about graphics that populate your field of view in dimensionally-accurate ways.
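For a sense of what that spatial and semantic understanding involves in practice, here is a minimal Swift sketch using Apple’s existing ARKit APIs. It illustrates the general “heavy AR” technique — building a world model of planes and classified meshes — and is not tied to Apple’s unannounced glasses; the class name is hypothetical.

    import ARKit

    /// Hypothetical sketch of the "heavy AR" baseline: an ARKit session that
    /// builds spatial (plane/mesh) and semantic (classification) understanding
    /// of the surroundings, which is what lets graphics sit in the world in
    /// dimensionally accurate ways.
    final class HeavyARSession: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            let config = ARWorldTrackingConfiguration()
            config.planeDetection = [.horizontal, .vertical]
            // Scene reconstruction adds a classified mesh on LiDAR-equipped devices.
            if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
                config.sceneReconstruction = .meshWithClassification
            }
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            // Each detected anchor is one piece of the device's model of the room.
            for anchor in anchors where anchor is ARPlaneAnchor {
                print("Detected a real-world surface at \(anchor.transform)")
            }
        }
    }

Sustaining that kind of world model is exactly what drives the bulk and heat described next.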
Achieving those functions entails design tradeoffs such as bulk and heat, which would deviate from Apple’s style and design DNA. So on the sliding scale between sleek glasses that power “lite AR” and bulky hardware that powers “heavy AR,” Apple will lean towards the former.
The first clue supporting this theory is the state of the underlying technology: it’s not yet at the point where sleekness and graphical intensity are possible in the same device. The second is Apple’s sheer scale and the resulting fiduciary pressure to pursue massive markets.
Given that reality, “lite” AR glasses have a much larger addressable market than heavy AR hardware, as the latter appeals to a subset of technophiles. Apple’s mass-market requirements could lead it to something that is more along the lines of corrective eyewear or sunglasses.
In other words, eyeglasses and sunglasses are much larger markets than AR glasses. So Apple could enter the $200 billion corrective eyewear market with a better mousetrap. AR features could include line-of-sight notifications and elegant integrations with other Apple wearables.

Current Connotations

Apple could also broaden the concept of “augmentation” beyond current connotations. So instead of cartoon monsters, digital “layers” will be things that generally help people see better — either in a corrective sense or with digital filters that “brighten” your day in various ways.
Other clues signal practical mass-market functions, such as integration with Apple’s “Project Gobi.” This involves retail point-of-sale codes that unlock product promotions or Apple Pay. It not only has mass-market applicability but could align well with post-Covid “touchless” retail.
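To make the code idea concrete, here is a minimal Swift sketch of how a glasses companion app might detect a retail code in a camera frame and hand its payload to a promotion or payment handler. It uses Apple’s existing Vision framework for barcode detection; since Gobi’s rumored code format isn’t public, a standard QR code stands in for it, and the handler callback is purely hypothetical.

    import Vision
    import CoreVideo

    /// Hypothetical sketch: detect a retail code in a camera frame and pass its
    /// payload (e.g. a promotion or Apple Pay deep link) to a handler.
    /// A standard QR code stands in for Apple's unannounced Gobi format.
    func scanRetailCode(in pixelBuffer: CVPixelBuffer,
                        onCode: @escaping (String) -> Void) {
        let request = VNDetectBarcodesRequest { request, error in
            guard error == nil,
                  let observations = request.results as? [VNBarcodeObservation] else { return }
            // Forward the first readable payload string to the caller.
            if let payload = observations.compactMap({ $0.payloadStringValue }).first {
                onCode(payload)
            }
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }

The point of the sketch is that the detection side is already commodity technology on iPhone; the differentiation would come from what the payload unlocks across Apple’s services.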
There could also be spatial audio integration with AirPods (remember the ecosystem approach). This could mean an audible “notification layer” that joins its visual counterpart, with use cases such as identifying people or translating foreign speech in real time — a potential killer app.
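As a rough illustration of what an audible notification layer could look like under the hood, here is a short Swift sketch that positions a sound cue in 3D space around the listener using AVFoundation’s existing spatial-mixing nodes. Nothing here is an Apple Glass API; the class name, file URL, and positions are assumptions, and binaural rendering over AirPods is approximated with the HRTF rendering algorithm.

    import AVFoundation

    /// Hypothetical sketch: play a short audio cue that appears to come from a
    /// point in space around the listener, the way a glasses-plus-AirPods
    /// "notification layer" might draw your attention to the left or right.
    final class SpatialCuePlayer {
        private let engine = AVAudioEngine()
        private let environment = AVAudioEnvironmentNode()
        private let player = AVAudioPlayerNode()

        init() throws {
            engine.attach(environment)
            engine.attach(player)

            // Spatialization needs a mono source feeding the environment node.
            let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
            engine.connect(player, to: environment, format: mono)
            engine.connect(environment, to: engine.mainMixerNode, format: nil)

            environment.listenerPosition = AVAudioMake3DPoint(0, 0, 0)
            player.renderingAlgorithm = .HRTF   // binaural rendering for headphones
            try engine.start()
        }

        /// Play `fileURL` so it seems to originate at `position` (metres, listener-relative),
        /// e.g. AVAudioMake3DPoint(-1.5, 0, -1) for a cue off to the front-left.
        func playCue(fileURL: URL, at position: AVAudio3DPoint) throws {
            let file = try AVAudioFile(forReading: fileURL)
            player.position = position
            player.scheduleFile(file, at: nil)
            player.play()
        }
    }

In other words, the audio plumbing for directional cues already exists on Apple’s platforms; glasses would mainly add the visual counterpart and the context for when to trigger them.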
All of the above could represent Apple’s first step into sensory augmentation. Like the original iPhone’s long evolutionary path to the pocket supercomputer we know today, “Apple Glass” will grow from simple augmentation to — eventually — full-blown AR. But that could be a gradual process.