Search by Shape in the New Substance 3D Modeler Public Beta Release

Adobe continues to add experimental AI features to its software: the latest Substance 3D Modeler 1.8.5 public beta release features Search by Shape and Smart Upres.

Adobe has recently begun introducing new and experimental features across the Substance 3D ecosystem, starting with Substance 3D Sampler and Substance 3D Stager, aiming to bring greater creative freedom and faster workflows to industrial designers, 3D artists, and other professionals.

The latest Substance 3D Modeler 1.8.5 release in the Public Beta program is no exception, introducing the new Search by Shape and Smart Upres features.

Asset library content can work as a good starting point or reference, or be useful as background content to help flesh out your scene. With Search by Shape, you can roughly block out objects, and then search the Substance 3D Assets library for parts with a similar shape, using keywords and checkboxes to refine the search if necessary.

Image Credits: Adobe, Substance 3D Modeler 1.8.5

With Smart Upres, the Increase Resolution algorithm has also been updated to better preserve sharp edges and fine details.


As for fixes, this release addresses a bug that occurred when switching between Elastic and Inelastic warp with different settings.


Substance 3D Modeler is available in Public Beta via Creative Cloud with an Adobe ID. Through the Public Beta program, you can try out experimental features and builds of Substance 3D Modeler before they reach full release.

Check the Substance 3D Modeler 1.8.5 release notes here and join our 80 Level Talent platform and our Telegram channel, and follow us on Instagram, Twitter, and LinkedIn, where we share breakdowns, the latest news, awesome artworks, and more.
