As brands and content makers create more augmented reality experiences, the demand for tools to create 3D content grows in kind.
And Apple may have just leveled the playing field for 3D content creation.
On Monday at WWDC 2021, Apple introduced Object Capture, a photogrammetry tool built on the Swift programming language that arrives in macOS Monterey via RealityKit 2, the next version of Apple’s AR engine.
Object Capture stitches together a series of photographs to create a 3D model of the subject. Users capture photos in sequence with an iPhone, iPad, or other camera, then import the images into RealityKit 2 to generate the 3D model. They can also preview the model via AR Quick Look to confirm its accuracy.
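Based on Apple's WWDC materials, the workflow above centers on RealityKit 2's PhotogrammetrySession API: point a session at a folder of images, request an output model, and listen for results. The sketch below is a minimal illustration of that flow; the input and output paths are placeholders, and it assumes a Mac running macOS Monterey with RealityKit 2 available.

```swift
import Foundation
import RealityKit

// Folder of captured photos and a destination for the generated model
// (placeholder paths for illustration).
let imagesFolder = URL(fileURLWithPath: "Captures/Sneaker", isDirectory: true)
let outputURL = URL(fileURLWithPath: "Models/Sneaker.usdz")

// A session ingests the image folder; the default configuration lets
// RealityKit choose its own reconstruction settings.
let session = try PhotogrammetrySession(
    input: imagesFolder,
    configuration: PhotogrammetrySession.Configuration()
)

// Request a .usdz model at medium detail; other detail levels
// include .preview, .reduced, .full, and .raw.
try session.process(requests: [
    .modelFile(url: outputURL, detail: .medium)
])

// Progress and results arrive on an asynchronous output stream.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url.path)")
        case .processingComplete:
            print("All requests finished.")
        default:
            break
        }
    }
}
```

The resulting .usdz file is the same format AR Quick Look previews, which is how the captured model can be checked before it's dropped into an AR scene.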
Content generated via Object Capture can then be used in AR experiences created with Reality Composer or Xcode, as well as in third-party platforms like Unity MARS and Maxon’s Cinema 4D. It’s unclear whether Object Capture requires the LiDAR scanner found on recent iPhone and iPad models.
Arts and crafts marketplace Etsy and furniture retailer Wayfair are among the early adopters of the technology. The latter will use Object Capture to expand the range of products customers can preview via ARKit in its mobile app.
Along with Object Capture, Apple is adding a new set of APIs via RealityKit 2 for “more realistic and complex AR experiences with greater visual, audio, and animation control, including custom render passes and dynamic shaders.”
As important as ARKit has been in giving developers the ability to integrate AR into mobile apps, creating the 3D content underlying those experiences is another set of tasks altogether.
Giving everyone with a Mac or MacBook and an iPhone or iPad the ability to create their own 3D objects without third-party software or hardware could spur a huge leap forward in the proliferation of AR content.