The new tool allows you to create 3D models from ordinary pictures of real-life objects.
As Apple explains, Object Capture is a simple and powerful API on macOS Monterey that helps developers create high-quality, photo-realistic 3D models of real-world objects in minutes. All you need to do is take pictures using your iPhone, iPad, or DSLR, and Object Capture will turn them into 3D models optimized for AR. You can then view them in AR Quick Look or add them to AR scenes in Reality Composer or Xcode.
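For developers curious what this looks like in code, the workflow above maps onto RealityKit's PhotogrammetrySession API on macOS Monterey: point a session at a folder of photos and request a model file. The sketch below is illustrative only; the folder and output paths are placeholder assumptions, and it must run on macOS 12 or later with real captured images.

```swift
import RealityKit

// Placeholder paths, assumed for illustration: a folder of photos
// captured with an iPhone, iPad, or DSLR, and a USDZ output file.
let inputFolder = URL(fileURLWithPath: "/path/to/captured-photos", isDirectory: true)
let outputFile = URL(fileURLWithPath: "/path/to/model.usdz")

// Create a photogrammetry session over the image folder.
let session = try PhotogrammetrySession(input: inputFolder)

// Listen for progress and completion on the session's async output stream.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Processing: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Model written to \(outputFile.path)")
        case .requestError(_, let error):
            print("Request failed: \(error)")
        default:
            break
        }
    }
}

// Ask for a reduced-detail model, a level suited to AR Quick Look.
try session.process(requests: [
    .modelFile(url: outputFile, detail: .reduced)
])
```

The detail level (.preview, .reduced, .medium, .full, .raw) trades file size against fidelity; reduced-detail output is what the article's "optimized for AR" models correspond to.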
This brand-new feature is built into the iOS version of the Unity AR Companion App. Unity says the AR Companion App will be released later this fall, with full documentation to follow soon.
When you open the app, you will see an interactive UI and will need to place a guide object over the object you would like to capture. As soon as you line up the guide, you can start taking pictures. Shots from several different angles are needed to build a 3D model. For each photo, the app places a "pin" on the guide shell to indicate that the angle has been successfully captured. A red pin means the shot is blurry or otherwise unusable and should be retaken.
Once all the pins are green, you can process the photos, generate the 3D model, and put it to use.
"This project is exciting to us for a number of reasons, first and foremost because of how deeply it aligns with our mission to continue to democratize content creation. Apple's announcement of this functionality means that this capability is now much more accessible to a wide range of creators, and we're looking forward to seeing indie game developers, mid-sized studios, students, and more now start to use real-world object capture in their process," comments Unity in their announcement.