Here it is: ARKit 4, the latest version of Apple's augmented reality framework, has just been announced at WWDC 2020.
Let's see what's new in ARKit 4 and for augmented reality app development on iOS. If you want to dig deeper, I've linked everything directly to Apple's documentation.
Location anchors
ARKit location anchors allow you to place virtual content in relation to anchors in the real world, such as points of interest in a city, taking the AR experience into the outdoors.
By setting geographic coordinates (latitude, longitude, and altitude) and leveraging data from Apple Maps, you can create AR experiences tied to a specific world location. Apple calls this process "visual localization": essentially, locating your device in relation to its surroundings with a higher degree of accuracy.
All iPhone and iPad models with at least an A12 Bionic chip and GPS are supported.
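A minimal sketch of placing a location anchor with ARGeoAnchor; the coordinates and altitude below are placeholders, not from the original article:

```swift
import ARKit
import CoreLocation

func placeGeoAnchor(in session: ARSession) {
    // Geo tracking requires a supported device (A12+ with GPS) and a mapped region.
    guard ARGeoTrackingConfiguration.isSupported else { return }

    // Hypothetical point of interest: latitude/longitude/altitude are illustrative.
    let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3937)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 11.0)

    // Run a geo-tracking session and register the anchor; virtual content
    // attached to this anchor will appear at that real-world location.
    session.run(ARGeoTrackingConfiguration())
    session.add(anchor: geoAnchor)
}
```

In a real app you would also call `ARGeoTrackingConfiguration.checkAvailability(at:completionHandler:)` first, since visual localization only works where Apple Maps has the required imagery.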
Depth API
The new ARKit depth API, coming in iOS 14, provides access to valuable environment depth data, powering a better scene understanding and overall enabling superior occlusion handling.
It relies on the LiDAR scanner introduced in the latest iPad Pro.
The new Depth API works together with the scene geometry API (released with ARKit 3.5), which builds a 3D mesh of readings of the environment. Each reading comes with a confidence value. Combined, these readings provide detailed depth information, improving scene understanding and virtual object occlusion.
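A short sketch of opting into scene depth and reading the per-pixel depth and confidence buffers from a frame (the function names wrapping the calls are my own):

```swift
import ARKit

// Enable scene depth if the device's LiDAR scanner supports it.
func runWithDepth(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    session.run(configuration)
}

// Per frame: the depth map holds distances in meters; the confidence map
// rates each pixel's reading as .low, .medium, or .high.
func readDepth(from frame: ARFrame) {
    guard let sceneDepth = frame.sceneDepth else { return }
    let depthMap: CVPixelBuffer = sceneDepth.depthMap
    let confidenceMap: CVPixelBuffer? = sceneDepth.confidenceMap
    _ = (depthMap, confidenceMap)
}
```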
Improved Object Placement
The LiDAR scanner brings further improvements to AR development. In ARKit 3, Apple introduced the Raycasting API, which makes it easy to place virtual objects on real-world surfaces by finding 3D positions on them. Now, thanks to the LiDAR sensor, this process is quicker and more accurate than before.
Raycasting leverages scene depth to understand the environment and attach virtual objects to a real-world plane, such as a couch, a table, or the floor. With the LiDAR sensor, you no longer need to wait for plane scanning before spawning virtual content; placement happens instantly.
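A sketch of placing content from a screen tap using the Raycasting API (the handler and anchor name are illustrative, assuming an ARSCNView-based setup):

```swift
import ARKit

// Hypothetical tap handler: raycast from the touch location and anchor
// virtual content where the ray hits a real-world surface.
func placeObject(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    // Target estimated planes so placement works before full plane detection.
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first else { return }

    // Anchor virtual content at the hit's real-world transform.
    let anchor = ARAnchor(name: "placedObject", transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
}
```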
This, combined with the improvements to object occlusion, makes virtual objects appear and behave more realistically.
My thoughts
It looks like we will get the LiDAR scanner in the iPhone this year. I hope it brings more and more apps to the AR world.
If you follow me on Twitter, you probably know that I write AR apps too, and one of them is even my engineering project 😊