Apple’s annual WWDC keynotes are like an Easter egg hunt for tech fans: big hardware reveals don’t always arrive, but the software announcements hide traces of evidence that, collectively, reveal a lot. And so it was at WWDC 2020.
At first glance, this year’s show was about Apple Silicon, iOS 14 and Craig Federighi’s perpetually stunning hair. But connect the dots and you can clearly see the exciting silhouette of the Apple Glasses.
Naturally, the ever-secretive Apple barely mentioned augmented reality (AR) explicitly when talking about iOS 14 and iPadOS 14. And as impressive as Apple CEO Tim Cook’s Space Gray glasses were, they weren’t smart (as far as we could tell).
But put some of the slightly baffling individual announcements together and a picture of the Apple Glasses starts to emerge: a spatial audio update for the AirPods Pro, location-based AR tools for developers, ‘App Clips’ that conveniently serve up small pop-up slices of apps, ‘hand pose’ detection in Apple’s Vision framework, and even new 3D icons that look tailor-made for AR.
There’s no question about it: the AR chess pieces are moving into place across the Apple ecosystem with iOS 14. And, like Clark Kent, all Apple needs now is a pair of smart glasses to complete the look.
Speaking of Superman, perhaps the most overlooked and impressive demo at WWDC was one that soared above a digital San Francisco in an ARKit 4 preview. ARKit is Apple’s suite of software tools for AR app developers which, as Apple claims, “powers the world’s largest AR platform, iOS.”
You may not think of iOS as an AR platform because, well, the technology is still in its infancy. But one particular ARKit 4 demo, which showed off the kit’s new ‘location anchors’, revealed just how fast that’s about to change with iOS 14 and iPadOS 14. These ‘location anchors’ let apps pin AR creations (statues, game characters, giant signs) to very specific locations in the real world. In other words, Apple’s AR is heading outdoors.
This means that everyone in those places, some of whom will soon be wearing Apple Glasses, can walk around the same virtual creation and experience it in the same way. Which is a big deal. Pokemon Go aside, AR has mostly been trapped indoors, rearranging virtual IKEA furniture. And while virtual home shopping will certainly get big, AR’s move outdoors with iOS 14 is a huge leap that paves the way for the Apple Glasses.
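For developers, pinning content to one of these real-world spots comes down to a few lines of ARKit 4. Here’s a minimal sketch, assuming a device and city where geo-tracking is supported; the coordinates and altitude (roughly San Francisco’s Ferry Building) are purely illustrative:

```swift
import ARKit
import CoreLocation

// First check that geo-tracking (ARKit 4's location anchors) is
// supported on this device and available at the user's location.
ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else { return }

    // Run an AR session with geo-tracking enabled.
    let session = ARSession()
    session.run(ARGeoTrackingConfiguration())

    // Pin an anchor to a real-world latitude/longitude; ARKit fuses
    // GPS with Apple Maps data to refine the placement.
    let coordinate = CLLocationCoordinate2D(latitude: 37.7956,
                                            longitude: -122.3936)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 11.0)
    session.add(anchor: geoAnchor)
}
```

In a real app you’d attach a 3D model to that anchor via RealityKit or SceneKit; the point is that the anchor itself is just a coordinate, which every nearby device resolves to the same physical spot.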
However, perhaps the most exciting thing about ‘location anchors’ is the technology behind them. On iOS 14 and iPadOS 14 devices, ARKit 4 can combine your geographic coordinates with high-resolution map data from Apple Maps.
According to Apple ARKit engineer Quinton Petty, this process, which Apple calls ‘visual localization’, means you’ll be able to “precisely locate your device relative to the surrounding environment with greater precision than could ever be done with just GPS”. That’s crucial for a good outdoor AR experience, not to mention other smartphone apps. It’s also where Apple’s approach diverges from rivals like Google and Niantic, the maker of Pokemon Go.
While Niantic recently began collecting 3D visual data from its players, which raised privacy concerns, Apple said at WWDC that its location-based AR uses advanced machine learning techniques that run “directly on your device” and that “there is no cloud processing, and no image is returned to Apple.” That slots neatly into Apple’s wider privacy theme, like an AirPod clicking into its charging case.
Ever wondered why Apple persists with Apple Maps? Rather than just another way to help you get to the grocery store, it’s the foundation of the AR layer Apple is building on top of the real world (even if those new cycling directions in iOS 14 do look incredibly useful).
Naturally, there’s still plenty of digital surveying to be done. Right now, those ‘location anchors’ are only available in the San Francisco Bay Area, New York, Los Angeles, Chicago and Miami, with “more cities arriving during the summer.” That’s because much of the location accuracy appears to be based on data from Look Around, Apple Maps’ equivalent of Google’s Street View.
It’s going to take a while to go global, but iOS 14 is a big step towards the Apple Glasses (expected to arrive in March 2021 or 2022) and an outdoor AR experience that will see your phone’s apps and games jump out into the light of the real world.
The missing pieces
While ‘location anchors’ were the most explicit nod to Apple’s AR plans at WWDC 2020, there were plenty of subtler hints too.
The AirPods Pro are getting a new spatial audio feature, for example, that will bring 3D sound to your favorite true wireless earbuds. Which sounds a little puzzling, unless you watch a lot of Dolby Atmos movies with your AirPods. But the real benefit could eventually come with AR, as your phone delivers subtle audio cues for Maps directions or works with the Apple Glasses for a truly immersive AR experience.
Similarly, iOS 14’s new ‘App Clips’ feature, which lets you use small slices of apps without downloading the full thing, will have some immediate benefits, like quickly paying for a smart scooter (above). But the ultimate goal feels more like helping you launch AR experiences by scanning real-world objects.
There were countless other hints at WWDC 2020, too: impressive ‘hand pose’ recognition for gesture controls in Apple’s Vision framework, new ‘scene geometry’ in ARKit 4 that lets a lidar sensor automatically classify different objects and materials, and, as AR developer Lucas Rizzotto noted on Twitter, a new 3D design language that looks tailor-made for augmented reality and the Apple Glasses.
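That ‘hand pose’ detection lives in the Vision framework’s new VNDetectHumanHandPoseRequest. A minimal sketch of how a gesture control might start (the function name and confidence threshold are our own; the camera frame would come from ARKit or AVFoundation in a real app):

```swift
import Vision

// Find the user's index fingertip in a single camera frame using
// iOS 14's new hand-pose detection.
func indexFingertip(in pixelBuffer: CVPixelBuffer) throws -> CGPoint? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1  // track one hand for simplicity

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .up)
    try handler.perform([request])

    guard let observation = request.results?.first else { return nil }

    // Vision reports labelled joints; .indexTip is the index fingertip.
    let tip = try observation.recognizedPoint(.indexTip)

    // Location is in normalized image coordinates (0–1 on each axis).
    return tip.confidence > 0.5 ? tip.location : nil
}
```

Run per frame, a point like this could drive pinch or point gestures, which is exactly the kind of controller-free input a pair of smart glasses would need.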
Considering that Apple barely mentioned AR at WWDC 2020, it was an impressively loud statement for such a ‘silent’ show. Who knows, by the time WWDC 2021 arrives, Tim Cook could be wearing considerably smarter glasses.