Just as predicted (though with less fanfare than usual), Apple has added a depth sensor to the rear camera of the latest edition of the iPad Pro.
What is surprising is the type of sensor, and that plot twist raises the intrigue around Apple's development of augmented reality smartglasses.
On Wednesday, Apple unveiled the 2020 iPad Pro, which runs on the A12Z Bionic chip, and includes an Ultra Wide camera and Liquid Retina display along with the usual array of cameras, sensors, and speakers. But the headlining feature is the inclusion of a light detection and ranging (LiDAR) scanner as a depth sensor, and its potential to facilitate next-level augmented reality experiences for mobile apps.
Like the depth sensors in the HoloLens 2 or Magic Leap 1, the LiDAR scanner in the iPad Pro functions as a time-of-flight sensor, measuring the time it takes emitted light to bounce off surfaces in the camera's view and return. It's the same tech used in some robots and self-driving cars.
According to a statement from Apple, “The LiDAR scanner measures the distance to surrounding objects up to 5 meters away, works both indoors and outdoors, and operates at the photon level at nano-second speeds.”
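Time-of-flight sensing comes down to simple arithmetic: a pulse of light travels to a surface and back, and distance is half the round trip at the speed of light. A minimal sketch of that math (the function names are illustrative, not drawn from Apple's documentation) shows why Apple's "nano-second speeds" claim checks out at the scanner's quoted 5-meter range:

```python
# Sketch of the arithmetic behind time-of-flight depth sensing.
# A light pulse travels to a surface and back; distance is half
# the round trip. Function names here are illustrative only.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a surface given the pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

def round_trip_time(distance_meters: float) -> float:
    """Round-trip time for a pulse to reach a surface and return."""
    return 2 * distance_meters / SPEED_OF_LIGHT

# At the scanner's quoted 5 m maximum range, the round trip takes
# roughly 33 nanoseconds -- hence measurements at "nano-second speeds".
print(round_trip_time(5.0) * 1e9)  # ~33.4 ns
```

The takeaway: resolving depth differences of a few centimeters means timing light pulses to fractions of a nanosecond, which is why the sensor has to operate "at the photon level."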
Along with adding the new depth sensor, Apple has added new computer vision capabilities in iPadOS 13.4 that converge the depth data with image data from its cameras and motion data from its sensor array to construct spatial maps of an environment.
This results in near-instantaneous surface detection and content placement for ARKit apps, as well as better motion capture and people occlusion. In addition, the latest version of ARKit will add the Scene Geometry API for developers to take advantage of the new hardware.
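The core step in turning raw depth readings into a spatial map is back-projecting each depth pixel into a 3D point using the camera's geometry. A hedged sketch using the standard pinhole camera model (the intrinsics `fx`, `fy`, `cx`, `cy` are placeholder values, not the iPad Pro's actual calibration, and this is the textbook technique rather than Apple's proprietary pipeline):

```python
# Sketch: back-projecting a depth map into a camera-space point cloud,
# the basic building block of spatial mapping from a depth sensor.
# Intrinsics below are placeholder values, not the iPad's calibration.

def backproject(depth, fx, fy, cx, cy):
    """Convert a 2D depth map (rows of distances in meters) into 3D
    points using the pinhole camera model: x = (u - cx) * z / fx."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # skip invalid or out-of-range readings
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# Tiny 2x2 depth map with every surface 2 m away:
pts = backproject([[2.0, 2.0], [2.0, 2.0]], fx=500, fy=500, cx=0.5, cy=0.5)
```

In practice, ARKit fuses clouds like this across frames with camera imagery and motion data, then surfaces the result to developers as mesh geometry through the Scene Geometry API.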
Apple’s Measure app serves as an example of how the new iPad’s LiDAR scanner improves AR apps on the iPad Pro. The scanner enables the app to place vertical and edge guides or determine a person’s height automatically. The updated app also has a new Ruler View for more precise measurements and a save function for maintaining lists of measurements and the corresponding screenshots.
On the home design front, Ikea Place, the patron saint of furniture-placement ARKit apps, will gain a new Studio Mode later this year that will build room sets for users' homes and suggest products that fit in with the user's current decor.
Also, computer-aided drafting (CAD) app Shapr 3D will be able to render 2D floor plans and 3D models of rooms with the LiDAR scanner and display new designs, created with those scans, in the camera view of the room itself.
The new iPad Pro is available to order now, starting at $799 for the 11-inch model with 128 GB of storage and Wi-Fi only, and running up to $1,649 for the 12.9-inch version with 1 TB of storage and cellular connectivity. The new tablet is compatible with the new Magic Keyboard, with trackpad, backlit keys, and floating design, available starting in May for $299.
In its usual hyperbolic fashion, Apple says the addition of the LiDAR scanner to the sum of its usual parts, including its processor, dual cameras, sensor array, speakers, and its large and vivid display, “extends the lead of iPad Pro as the world’s best device for augmented reality.” And, if the speed of the AR experiences in the promotional video for the product is anywhere close to how it performs in real life, then that statement may not just be marketing hype.
But perhaps the most exciting thing about the new LiDAR scanner is what it signals about products yet to come from Apple, as it possibly foreshadows how the company will implement a depth sensor in its smartglasses.
Just as ARKit serves as the software foundation for Apple's future AR wearables, the hardware deployed in its iPhones and iPads, such as the U1 chip in the iPhone 11 Pro, lays the groundwork on the hardware front. And, if the LiDAR scanner is indicative of Apple's smartglasses plans, it may soon blow everyone else out of the water.