Apple announced three major AR tools at WWDC 2019: ARKit 3, RealityKit, and Reality Composer.
ARKit is the foundation for AR on iOS. Announced back at WWDC 2017, ARKit lets developers build apps that track a device's motion and estimate surfaces and ambient lighting. At the time, it basically made "Pokémon Go" look better.
ARKit 3, announced at WWDC 2019, is focused on how people actually interact with AR.
The new framework has motion-capture technology so developers can integrate people's movements into their apps.
It also has "People Occlusion," so people can walk around virtual content in a realistic fashion.
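For developers, People Occlusion is an opt-in setting on an AR session. The sketch below is a minimal, hedged Swift example based on ARKit 3's documented `frameSemantics` API; the `arView` that hosts the session is assumed to be set up elsewhere, and the feature requires a recent device (A12 chip or later).

```swift
import ARKit

// A minimal sketch of enabling ARKit 3's People Occlusion.
let configuration = ARWorldTrackingConfiguration()

// Check support first — older devices can't segment people out of the camera feed.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // With depth, virtual content is hidden behind people at the right distance,
    // so a person can walk in front of a 3D object realistically.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// Run the session on an existing AR view (assumed to exist elsewhere):
// arView.session.run(configuration)
```

The same pattern applies to the motion-capture feature, which uses a separate `ARBodyTrackingConfiguration` to deliver a tracked skeleton of a person in view.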
RealityKit is a new developer toolset that offers photo-realistic rendering, environment mapping, and realistic effects like animation, 3D audio, and motion blur.
Reality Composer is a new app that Apple built for iOS, Mac, and the new iPadOS. It lets developers build and prototype their AR experiences even if they've never built a 3D app before, thanks to simple drag-and-drop tools.
Each of these tools — ARKit 3, RealityKit, and Reality Composer — is readying developers for a huge AR wave. It's as if Apple knows AR is going to be very popular soon.
ARKit 3 is Apple's foundation for convincing AR experiences. If smart glasses are going to succeed, people really need to be blown away by what they're seeing inside the glasses. Moving in front of or behind a 3D object might not sound complicated, but it's a big step towards making AR feel truly immersive.
RealityKit and Reality Composer, two brand-new tools announced at WWDC, are going to make it easier for more developers to build AR applications.
When people buy new hardware, they want things to do with it — that's where applications come in. If Apple is going to release its first smart glasses in 2020, developers need these tools as soon as possible. Hopefully, this leads to a greater quantity, and quality, of AR apps by this time next year.
If Apple launches smart glasses at all, of course, it would create its own applications for the device. Apple has spent much of 2019 re-investing in its own apps and services, so it would be no surprise to see many of the new apps showcased at WWDC become available on smart glasses.
Apple Maps, in particular, feels like it was built for a pair of smart glasses.
Apple overhauled its Maps app in iOS 13, which arrives later this year. It's now way more detailed, and it offers its own version of Google's Street View, called "Look Around."
The new Apple Maps will also have "better pedestrian data" and "more precise addresses." Paired with the new Look Around feature, we could see navigation being fun and intuitive with a pair of smart glasses. (Imagine "Crazy Taxi," where you can see virtual arrows pointing where you need to go.)
We're not sure which other iOS apps Apple would port to a pair of smart glasses, but in a 2017 report, Bloomberg said Apple engineers were "prototyping a range of applications, from mapping and texting to more advanced features including virtual meeting rooms and 360-degree video playback."
Apple's glasses will run on what the company is internally calling "rOS," or "reality operating system," according to Bloomberg.
In March, reliable Apple analyst Ming-Chi Kuo said Apple's smart glasses would be mass produced "in the middle of next year," and marketed as an iPhone accessory, as the glasses would leverage the iPhone's computing and networking to retain a lightweight form.