See what iPhone 12 Pro’s new lidar feature can do with this 3D-scanning app
The iPhone 12 Pro's depth-scanning lidar sensor seems poised to open up a lot of possibilities for 3D-scanning apps on phones. Canvas, an app for home scanning, uses lidar for added accuracy and detail, but it also works on non-Pro iPhones going as far back as the iPhone 8.
Canvas's approach shows how lidar is likely to play out in iPhone 12 Pro apps: it can make processes that are already possible on phones and tablets without lidar hardware more precise and detailed.
Canvas was developed by Boulder-based company Occipital, and it was originally introduced for the iPad Pro earlier this year to take advantage of lidar scanning. Back then, a demo of its capabilities struck me as a sign of how Apple's depth-sensing technology could be applied to home improvement and measurement apps. The updated app promises scans that are clearer and more detailed.
Since lidar-equipped iPhones debuted, a handful of optimized apps have emerged that offer 3D scanning of objects, large-scale capture of spaces (photogrammetry) and augmented reality that combines meshed maps of rooms with virtual objects. But the sample Canvas scan from Occipital embedded below, captured on an iPhone 12 Pro, looks sharper than the 3D-scanning apps I've played with so far.
According to Occipital vice presidents Alex Schiff and Anton Yakubenko, iOS 14 gives developers better access to the iPhone's lidar data. That enabled Occipital to build its own algorithms to make the most of Apple's lidar depth map, and it could also let Occipital apply that depth data to future improvements in its app for phones without lidar hardware.
Depth-sensing 3D scans of spaces are possible without dedicated lidar or time-of-flight sensors, and companies such as 6D.ai (acquired by Niantic) have already pulled it off. But Schiff and Yakubenko say lidar offers a faster, more accurate upgrade to that technology. The iPhone 12 version of Canvas performs more detailed scans than the first version on the iPad Pro earlier this year, largely thanks to iOS 14's deeper access to lidar information, according to Occipital. The latest lidar-enabled version is accurate to within about 1%, while non-lidar scanning is accurate to within about 5% (which literally makes the iPhone 12 Pro a pro upgrade for those who need the boost).
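To put those error bounds in perspective, here's a quick back-of-the-envelope sketch. The 4-meter wall is an illustrative assumption, not a figure from Occipital; only the 1% and 5% error rates come from the company.

```python
# Rough illustration of what 1% vs. 5% measurement error means at room scale.
# The 4 m wall length is a hypothetical example, not a figure from Occipital.

def worst_case_error_cm(true_length_m: float, error_rate: float) -> float:
    """Worst-case absolute error, in centimeters, for a given error rate."""
    return true_length_m * error_rate * 100

wall_m = 4.0  # hypothetical wall length
lidar_err = worst_case_error_cm(wall_m, 0.01)   # lidar scan: ~1% error
camera_err = worst_case_error_cm(wall_m, 0.05)  # camera-only scan: ~5% error

print(f"Lidar scan:  up to {lidar_err:.0f} cm off on a {wall_m:.0f} m wall")
print(f"Camera-only: up to {camera_err:.0f} cm off on a {wall_m:.0f} m wall")
```

In other words, on a typical wall the camera-only error could swallow the width of a door frame, while the lidar error is closer to the thickness of a baseboard.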
According to Yakubenko, Occipital's earlier measurements put Apple's iPad Pro lidar at 574 depth points per frame, but in iOS 14 developers can get depth maps of up to 256x192 points. Those points are then upscaled with AI and camera data to produce greater detail.
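The jump in raw depth data is easy to quantify. The per-frame figures come from the paragraph above; treating the 256x192 depth map as a dense grid of samples is my assumption:

```python
# Compare the per-frame depth samples Occipital measured on the iPad Pro
# against the 256 x 192 depth map iOS 14 exposes to developers.

sparse_points = 574            # Occipital's earlier per-frame measurement
depth_map_points = 256 * 192   # iOS 14 depth map, assumed to be a dense grid

print(f"iOS 14 depth map: {depth_map_points} points per frame")
print(f"Roughly {depth_map_points / sparse_points:.0f}x more depth samples")
```

That works out to 49,152 points per frame, roughly an 86-fold increase over the sparse measurement, which is consistent with the sharper scans Occipital describes.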
Canvas room scans can be converted into working CAD models, a process that takes around 48 hours. But Occipital is also working on converting scans instantly, and on adding semantic data (such as recognizing doors, windows and other room details) with AI.
As more and more 3D scans and 3D data get saved on iPhones and iPads, it makes sense to have common formats for sharing and editing those files. While iOS 14 uses the USDZ format for 3D files, Occipital has its own format for its more in-depth scans, which can be exported to .rvt, .ifc, .dwg, .skp and .plan formats when converted to CAD models. At some point, 3D scans may become as standardized as PDFs. We're not quite there yet, but we may need to get there soon.