See what lidar can do on iPhone 12 with this 3D scanning app

The Canvas app scans homes in 3D using the iPhone 12 Pro's lidar. Expect much more like this.

Occipital

The iPhone 12 Pro's depth-scanning lidar sensor seems ready to open up a lot of possibilities for 3D scanning apps on phones. A new one designed for home scanning, called Canvas, uses the lidar for added precision and detail. But the app will also work on non-Pro iPhones going back as far as the iPhone 8.

The approach Canvas takes indicates how lidar might work in iPhone 12 Pro apps: adding more precision and detail to processes that are already possible through other methods on non-lidar-equipped phones and tablets.

Read more: The lidar technology in the iPhone 12 does more than just enhance photos. Check out this cool party trick

Canvas, created by Boulder-based company Occipital, originally launched for the iPad Pro earlier this year to take advantage of its lidar scanning. When I saw a demo of its capabilities back then, I took it as a sign of how Apple's depth-sensing technology could be applied to measurement and home improvement apps. The updated app takes scans that are clearer and sharper.

Since lidar-equipped iPhones debuted, a handful of streamlined apps have emerged that offer 3D object scanning, larger-scale spatial photography (called photogrammetry), and augmented reality that can combine meshed maps of spaces with virtual objects. But the sample scan from Occipital's Canvas app on the iPhone 12 Pro, embedded below, looks sharper than the 3D scanning apps I've played with so far.

Apple's iOS 14 gives developers more raw access to the iPhone's lidar data, according to Occipital product vice presidents Alex Schiff and Anton Yakubenko. That has allowed Occipital to build its own algorithms to make the best use of Apple's lidar depth map. It could also allow Occipital to apply depth-map data to future improvements to its app for non-lidar-equipped phones.
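
For developers curious what that raw access looks like, here's a minimal Swift sketch (not Occipital's code) using ARKit's iOS 14 sceneDepth option, which hands each camera frame's lidar depth map and confidence map to the app:

```swift
import ARKit
import UIKit

// A minimal sketch (not Occipital's code) of the iOS 14 API that exposes
// raw lidar depth: ARKit's sceneDepth frame semantics.
class DepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        let config = ARWorldTrackingConfiguration()
        // sceneDepth is only supported on lidar-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.run(config)
    }

    // Every frame carries a per-pixel depth map (in meters) plus a confidence map.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = sceneDepth.depthMap
        let confidenceMap: CVPixelBuffer? = sceneDepth.confidenceMap
        _ = (depthMap, confidenceMap)
    }
}
```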

It's possible to scan 3D space without a dedicated depth-mapping lidar or time-of-flight sensor, and companies like 6d.ai (acquired by Niantic) have already done so. But Schiff and Yakubenko say lidar still offers a faster and more accurate upgrade over that technology. The iPhone 12 Pro version of Canvas takes more detailed scans than the first version on the iPad Pro earlier this year, mainly because of iOS 14's deeper access to lidar information, according to Occipital. The latest lidar-enabled version is accurate to within 1%, compared with 5% for non-lidar scans (literally making the iPhone 12 Pro a pro upgrade for those who may need the boost).

Yakubenko says that, based on Occipital's earlier measurements, Apple's iPad Pro lidar captures 574 depth points per frame in a scan, but iOS 14 lets developers access depth maps of up to 256 x 192 points. Those maps gain extra detail from AI and camera data.
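
As a rough illustration (again assuming ARKit's sceneDepth API rather than anything Occipital-specific), an app can read that per-frame resolution straight off the depth map's pixel buffer:

```swift
import ARKit

// Rough illustration: reading the depth map's resolution from an ARFrame.
// The article cites 256 x 192 points on lidar devices; treat that as the
// expected value, not a guarantee for every device or OS version.
func logDepthResolution(of frame: ARFrame) {
    guard let depthMap = frame.sceneDepth?.depthMap else { return }
    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    print("Depth map resolution: \(width) x \(height) points")
}
```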

Canvas room scans can be converted into workable CAD models, a process that takes about 48 hours, but Occipital is also working to convert scans more instantly and to add semantic data (such as recognizing doors, windows and other room details) with AI.
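
ARKit itself already exposes a related building block for that kind of labeling: scene reconstruction with per-face mesh classification, which can tag mesh faces as doors, windows, walls and so on. Here's a short sketch of that feature, not Occipital's actual pipeline:

```swift
import ARKit

// Not Occipital's pipeline -- just a sketch of ARKit's own semantic labeling:
// scene reconstruction with per-face mesh classification on lidar devices.
func startClassifiedScan(with session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }
    session.run(config)
}

// ARKit then delivers ARMeshAnchors whose geometry carries labels such as
// .door, .window, .wall and .floor for each face of the reconstructed mesh.
func labeledFaceCount(in meshAnchor: ARMeshAnchor) -> Int {
    meshAnchor.geometry.classification?.count ?? 0
}
```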

As more 3D scans and 3D data start to live on iPhones and iPads, common formats for sharing and editing the files will make sense, too. While iOS 14 uses the USDZ file format for 3D files, Occipital has its own format for its more detailed scans, and it can output to .rvt, .ifc, .dwg, .skp and .plan formats when scans are converted to CAD models. At some point, 3D scans could become as commonplace as PDFs. We're not there yet, but we may need to get there soon.
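
For a sense of how far along iOS already is on the sharing side, a scan saved as USDZ can be previewed with Apple's AR Quick Look; the bundled file name below is hypothetical:

```swift
import QuickLook
import UIKit

// Sketch of previewing a 3D scan saved as USDZ with AR Quick Look.
// "room-scan.usdz" is a hypothetical bundled file, not an Occipital export.
class ScanPreviewController: UIViewController, QLPreviewControllerDataSource {
    let scanURL = Bundle.main.url(forResource: "room-scan", withExtension: "usdz")!

    func showPreview() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        scanURL as NSURL  // NSURL conforms to QLPreviewItem
    }
}
```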
