Like TrueDepth, it would enable new kinds of applications and more precise AR.
The publication’s source says that Apple engineers have been working on the rear-facing 3D camera for two years, and it is currently planned for inclusion in at least one model later this year. However, the timing could still change.
Apple has made much more progress, though, in developing APIs and tools that third-party developers and its own software teams can use to create new experiences and functionality.
Apple introduced the similar front-facing TrueDepth array in the iPhone X in 2017. Its key feature is Face ID, which scans the user’s face to authenticate them before unlocking access to personalized files and services on the device. It’s also used for Animojis and Memojis, the animated avatars some people use in the Messages app.
However, much of the tech’s potential remains untapped. Apple provided developers with the tools they needed to use the TrueDepth feature in their apps, and some applications have appeared, but they have mostly been gimmicks and games like Nathan Gitter’s Rainbrow.
Apple may hope that adding these sensors to the rear of the device will inspire new, more useful applications. And then there’s the obvious augmented reality connection.
Cupertino introduced ARKit, a set of tools and features available to developers for making augmented reality apps on iOS, in 2017. We wrote a detailed explainer about it in 2019 after ARKit 2 was introduced. The latest version is ARKit 3, introduced alongside iOS 13 and iPadOS in 2019. ARKit 3 added people occlusion, built-in motion-capture tools, multiple face tracking, and the ability for multiple users to work simultaneously in the same AR environment, among other things.
Apple CEO Tim Cook has previously said he believes augmented reality will at some point be a watershed moment akin to the introduction of the App Store, and there has been plenty of evidence that the company has been working internally on an AR glasses product. Earlier this week, 9to5Mac claimed to have uncovered evidence of a new AR app in leaked iOS code, as well as indications that a rear ToF sensor is coming to new iPad Pro models. The app would allow users to get heads-up information on products and other objects in the spaces around them and would be used at both Apple Stores and Starbucks locations.
Up to this point, AR apps on the iPhone have relied on depth differentials between the multiple traditional cameras on the back of newer iPhones, but that approach is not as precise as what this new sensor array would allow. Quoted in the Fast Company article, Lumentum VP Andre Wong says that AR apps haven’t taken off in a huge way in part because of the lack of this depth-sensing capability and what that means for the quality of the experiences:
When you use AR apps without depth information, it’s a bit glitchy and not as powerful as it ultimately could be … Now that ARKit and (Google’s) ARCore have both been out for some time now, you’ll see new AR apps coming out that are more accurate in the way they place objects within a space.
Introducing these more precise tools will almost certainly improve AR on iPhones, but it’s probably not enough to open the watershed Cook has predicted. There’s still something awkward about using AR experiences on a phone, and the success of the rumored glasses product might be needed to start a developer gold rush.
In the meantime, though, Apple seems to be remaining focused on AR, building out both hardware and software features that allow developers to experiment and bring new ideas to the App Store. As we wrote in our ARKit 2 explainer, it’s a long game: a vast, refined array of APIs and tools would be necessary to facilitate rapid adoption of AR on the glasses product by third-party developers. Building all that now on the admittedly flawed-for-this-purpose iPhone platform would mean Apple could hit the ground running if and when its glasses finally come to market.