Unity 2017.3 Is Here

2017 is almost over, and Unity has released 2017.3, introducing several new features and improvements for both artists and developers.

Here’s a little overview of the key features:

Panoramic 360/180 video

We are particularly excited to bring you improvements to panoramic 360/180 and 2D/3D video workflows. You can now easily bring various styles of 2D or 3D video into Unity and play them back on the Skybox to create 360-video experiences targeting standalone, mobile and XR runtimes.

Particle System improvements

Improvements include new Unlit and Surface particle shaders and ribbonized particle trails. These allow particles to be connected based on their age. Since each point on these ribbonized trails is represented by a particle, they can be animated, for example, by using them in conjunction with the Noise Module.
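
As a minimal sketch, here is what that setup can look like from script, assuming a ParticleSystem component on the same GameObject; the module and enum names follow the 2017.x scripting API:

```csharp
using UnityEngine;

// Minimal sketch: configure ribbonized trails plus the Noise module from script.
[RequireComponent(typeof(ParticleSystem))]
public class RibbonTrailSetup : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        // Connect particles into a ribbon, ordered by particle age.
        var trails = ps.trails;
        trails.enabled = true;
        trails.mode = ParticleSystemTrailMode.Ribbon;

        // Animate the ribbon points by perturbing the particles with noise.
        var noise = ps.noise;
        noise.enabled = true;
        noise.strength = 0.5f;
        noise.frequency = 1.0f;
    }
}
```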

Script compilation – User-defined managed assemblies

You will be able to define your own managed assemblies based on scripts inside a folder. By splitting your project’s scripts into multiple assemblies, script compilation times in the editor can be greatly reduced for large projects. 
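
For illustration, an Assembly Definition file is a small JSON file saved with the .asmdef extension inside the folder whose scripts it should own; the assembly names below are placeholders:

```json
{
    "name": "MyGame.Gameplay",
    "references": ["MyGame.Core"],
    "includePlatforms": [],
    "excludePlatforms": []
}
```

Scripts in that folder then compile into their own managed assembly, so editing them only triggers recompilation of that assembly and the assemblies that reference it.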

Managed Memory Profiler support

You can now take advantage of Mono/.NET 2.0 support for the APIs required to take managed memory snapshots. This makes it possible to take memory snapshots directly inside the editor.
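
As a rough sketch, a snapshot can be requested through the experimental UnityEditor.MemoryProfiler API as it existed in the 2017.x editor:

```csharp
using UnityEditor;
using UnityEditor.MemoryProfiler;
using UnityEngine;

// Minimal editor-only sketch: request a managed memory snapshot and log a
// summary when it arrives via the OnSnapshotReceived event.
public static class SnapshotExample
{
    [MenuItem("Tools/Take Memory Snapshot")]
    static void TakeSnapshot()
    {
        MemorySnapshot.OnSnapshotReceived += OnSnapshotReceived;
        MemorySnapshot.RequestNewSnapshot();
    }

    static void OnSnapshotReceived(PackedMemorySnapshot snapshot)
    {
        MemorySnapshot.OnSnapshotReceived -= OnSnapshotReceived;
        Debug.Log("Native objects captured: " + snapshot.nativeObjects.Length);
    }
}
```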

The updated Crunch Library

The Crunch Library can now compress DXT textures up to 2.5 times faster, while providing about 10% better compression ratio. But more importantly, the updated library is now capable of compressing ETC_RGB4 and ETC2_RGBA8 textures, which makes it possible to use Crunch compression on iOS and Android devices.
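
A minimal editor-script sketch for enabling Crunch compression on a texture might look like this; the asset path is a placeholder:

```csharp
using UnityEditor;

// Minimal sketch: turn on Crunch compression for one texture via its importer.
// With 2017.3's crunched ETC/ETC2 support, this is now useful on mobile too.
public static class CrunchExample
{
    [MenuItem("Tools/Crunch Example Texture")]
    static void CrunchTexture()
    {
        const string path = "Assets/Textures/Example.png"; // hypothetical asset
        var importer = (TextureImporter)AssetImporter.GetAtPath(path);
        importer.crunchedCompression = true;
        importer.compressionQuality = 50; // lower = smaller files, lower quality
        importer.SaveAndReimport();
    }
}
```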

Graphics improvements

There is now support for HDR compressed lightmaps (BC6H) on PC, Xbox One and PlayStation 4. We also made a number of GPU instancing improvements, and we’re adding Dynamic Resolution as an engine feature debuting on the Xbox One platform with other platforms to follow later.

Lighting improvements

We are introducing Lighting modes for the Progressive Lightmapper (Baked Indirect, Shadowmask and Subtractive), LOD support with realtime probes providing a more intuitive workflow, and HDR encoding support for baked lightmaps for higher visual quality.

VR device info

To help you optimize VR experiences, you can now capture VR-device refresh rate, dimensions, aspect ratio, HMD-tracking and controller-tracking as part of device info and device status events.
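
Some of the same values can also be read at runtime through the UnityEngine.XR APIs (introduced in 2017.2); a minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch: log basic VR device info exposed by the XR APIs.
public class VRDeviceInfo : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Model: " + XRDevice.model);
        Debug.Log("Refresh rate: " + XRDevice.refreshRate + " Hz");
        Debug.Log("Eye texture: " + XRSettings.eyeTextureWidth + "x" + XRSettings.eyeTextureHeight);
    }
}
```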

Physics

Improvements include cloth self-collision and inter-collision technology and improved constraint painting.

Animation

We are introducing Playable scheduling, which allows you to prefetch data before it is actually played. The first implementation affects AudioClipPlayables, but in the future, scheduling will be used by other assets: audio, video and timeline. We added support for integer and enum animation parameters defined within an Animator Controller. This enables you to access and assign values from scripts, giving you even better control over the flow of the animation state machine. We are also introducing a new “2D” mode button in the animation Preview window. Finally, it is now possible to zoom, frame and autofit in the Animator window!
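
For example, driving one of the new enum-backed integer parameters from script might look like this; the Animator parameter name "WeaponId" is hypothetical:

```csharp
using UnityEngine;

// Minimal sketch: set an integer Animator Controller parameter from an enum.
public enum Weapon { Sword = 0, Bow = 1, Staff = 2 }

[RequireComponent(typeof(Animator))]
public class WeaponAnimator : MonoBehaviour
{
    Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    public void Equip(Weapon weapon)
    {
        // Enum values cast cleanly to the int parameter in the state machine.
        animator.SetInteger("WeaponId", (int)weapon);
    }
}
```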

There are also some very cool engine and graphics improvements that can really help you in production.

Progressive Lightmapper

Lighting modes support

In Unity, it is possible to control lighting precomputation and composition in order to achieve a given effect by assigning one of several modes to a Light (Realtime, Mixed and Baked). Using Mixed mode significantly reduces the realtime shadow distance, increasing performance. Higher visual fidelity can also be achieved, as far-distance shadows are supported along with realtime specular highlights.

In Unity 2017.3, you can do the same thing with the Progressive Lightmapper by choosing from among the following Lighting modes:

In Baked Indirect mode, Mixed lights behave like realtime dynamic lights with additional indirect lighting sampled from baked lightmaps and light probes. Effects like fog can be used past the realtime shadow distance, where shadowing would otherwise be missing.

In Shadowmask mode, Mixed lights are realtime, and shadows cast from static objects are baked into a shadowmask texture and into light probes. This allows you to render shadows in the distance while drastically reducing the number of rendered shadow casters, based on the Shadowmask Mode set in the Quality settings.

In Subtractive mode, direct lighting is baked into the lightmaps, and static objects will not have specular or glossy highlights from Mixed lights. Dynamic objects are lit in realtime and receive precomputed shadows from static objects via light probes. The main directional light allows dynamic objects to cast a subtractive realtime shadow on static objects.

To try out the lighting modes in the Progressive Lightmapper, make sure that you have a Mixed Light in your scene and then, in the Lighting Window, select a Lighting Mode.
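
As a minimal editor-only sketch, the Mixed part of that setup can also be scripted; Light.lightmapBakeType is an editor-only property, and the scene-wide Lighting Mode itself is still picked in the Lighting Window as described above:

```csharp
using UnityEditor;
using UnityEngine;

// Minimal sketch: mark the currently selected light as Mixed so it
// participates in the chosen Lighting Mode.
public static class MixedLightSetup
{
    [MenuItem("Tools/Make Selected Light Mixed")]
    static void MakeMixed()
    {
        var light = Selection.activeGameObject.GetComponent<Light>();
        light.lightmapBakeType = LightmapBakeType.Mixed;
        EditorUtility.SetDirty(light);
    }
}
```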

Light LODs with real-time probes and baked lightmaps

We’ve added the ability to generate lighting for level of detail (LOD) objects with realtime light probes in addition to baked lightmaps, offering a more intuitive workflow for authoring lighting. LOD lets you use lower-complexity meshes when the camera is far away and higher-complexity meshes when it is closer, reducing the rendering cost of faraway objects. When you use Unity’s LOD system in a scene with baked lighting and Realtime GI, the system lights the most detailed model in the LOD Group as if it were a regular static model. It uses lightmaps for the direct and indirect lighting, and separate lightmaps for Realtime GI.

However, for lower LODs in an LOD Group, you can only combine baked lightmaps with Realtime GI from Light Probes or Light Probe Proxy Volumes, which you must place around the LOD Group.

To allow the baking system to produce real-time or baked lightmaps in 2017.3, simply check that Lightmap Static is enabled in the Renderer component of the relevant GameObject.
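
A minimal editor sketch for enabling that flag from script, using GameObjectUtility and StaticEditorFlags:

```csharp
using UnityEditor;
using UnityEngine;

// Minimal sketch: add the Lightmap Static flag to the selected GameObject
// so the baking system will produce lightmaps for it.
public static class LightmapStaticExample
{
    [MenuItem("Tools/Mark Selection Lightmap Static")]
    static void MarkLightmapStatic()
    {
        GameObject go = Selection.activeGameObject;
        var flags = GameObjectUtility.GetStaticEditorFlags(go);
        GameObjectUtility.SetStaticEditorFlags(go, flags | StaticEditorFlags.LightmapStatic);
    }
}
```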

This animation shows how realtime ambient color affects the Realtime GI used by lower-level LODs.

HDR support in the lightmap pipeline

We’ve added support for HDR compressed lightmaps (BC6H) on PC, Xbox One and PlayStation 4 for achieving even better quality visuals. The advantage of using High Quality lightmaps is that they don’t encode lightmap values with RGBM, but use a 16-bit floating point value instead. As a result, the supported range goes from 0 to 65504. The BC6H format is also superior to the DXT5 + RGBM combination, as it doesn’t produce any of the banding artifacts associated with RGBM encoding or the color artifacts coming from DXT compression. Shaders that need to sample HDR lightmaps are a few ALU instructions shorter because there is no need to decode the sampled values, and the BC6H format has the same GPU memory requirements as DXT5.

HDR lightmaps are enabled simply by setting the Lightmap Encoding option in the Player Settings to High Quality.

Choosing High Quality will enable HDR lightmap support, whereas Normal Quality will switch to RGBM encoding.

When lightmap Compression is enabled in the Lighting Window, the lightmaps will be compressed using the BC6H compression format. 

GPU instancing improvements

GPU Instancing was introduced in Unity 5.6 to reduce the number of draw calls per scene by rendering multiple copies of the same Mesh simultaneously, thus significantly improving rendering performance.

There are a number of improvements to GPU instancing. Per-instance properties are now packed into a structure data type, and the instancing constant buffer contains only one array of such structures.

For most platforms, the instancing array sizes are now calculated automatically (the maximum allowed constant buffer size divided by the size of the above-mentioned structure type) and no longer need to be specified by the user. Instancing batch sizes are therefore improved on OpenGL and Metal, and instancing shader variants compile much faster and potentially cost less in terms of CPU-GPU data bandwidth.
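
To make the feature concrete, here is a minimal sketch of instanced drawing with a per-instance color passed through a MaterialPropertyBlock; it assumes a material whose shader has instancing enabled and declares _Color as a per-instance property:

```csharp
using UnityEngine;

// Minimal sketch: draw 100 copies of one mesh in a single instanced draw call.
public class InstancingExample : MonoBehaviour
{
    public Mesh mesh;
    public Material material; // must have "Enable GPU Instancing" ticked

    Matrix4x4[] matrices = new Matrix4x4[100];
    Vector4[] colors = new Vector4[100];
    MaterialPropertyBlock props;

    void Start()
    {
        props = new MaterialPropertyBlock();
        for (int i = 0; i < 100; i++)
        {
            matrices[i] = Matrix4x4.Translate(Random.insideUnitSphere * 10f);
            colors[i] = new Vector4(Random.value, Random.value, Random.value, 1f);
        }
        // Per-instance data travels in the property block, not the material.
        props.SetVectorArray("_Color", colors);
    }

    void Update()
    {
        // One draw call covers all 100 instances.
        Graphics.DrawMeshInstanced(mesh, 0, material, matrices, matrices.Length, props);
    }
}
```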

360 Video Player

The new video player, introduced earlier this year, made it possible to use 360 videos and make them truly interactive by adding CG objects, ambisonic audio, visual effects, and more.

In 2017.3, you can now bring a 2D or 3D 360-video into Unity and play it back on the Skybox to create standalone 360-video experiences targeting VR platforms.

Unity offers built-in support for both 180 and 360-degree videos in either an equirectangular layout (longitude and latitude) or a cubemap layout (6 frames).

Equirectangular 2D videos should have an aspect ratio of exactly 2:1 for 360-degree content, or 1:1 for 180-degree content.

Equirectangular 2D video

Cubemap 2D videos should have an aspect ratio of 1:6, 3:4, 4:3, or 6:1, depending on the face layout:

Cubemap 2D video

To use the panoramic video features in the Unity Editor, you must have access to panoramic video clips, or know how to author them. Check our documentation pages for a step-by-step guide on how to display any panoramic video in the Unity Editor.
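As a rough sketch, playback onto the Skybox can be wired up with a VideoPlayer rendering into a RenderTexture that feeds a material using the new Skybox/Panoramic shader; the clip, resolution and material here are placeholders:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch: stream a panoramic clip into a RenderTexture and use it
// as the scene skybox for a 360-video experience.
[RequireComponent(typeof(VideoPlayer))]
public class PanoramicVideoSkybox : MonoBehaviour
{
    public VideoClip clip;            // your 360/180 video clip
    public Material skyboxMaterial;   // uses the Skybox/Panoramic shader

    void Start()
    {
        var rt = new RenderTexture(4096, 2048, 0); // match the clip resolution
        var player = GetComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = rt;
        player.isLooping = true;
        player.Play();

        // Feed the video frames to the skybox.
        skyboxMaterial.SetTexture("_MainTex", rt);
        RenderSettings.skybox = skyboxMaterial;
    }
}
```
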

Keep in mind that many desktop hardware video decoders are limited to 4K resolutions and mobile hardware video decoders are often limited to 2K or less, which affects the resolution of playback in real-time on those platforms.

ARCore SDK Preview 2 for Unity

Technical improvements introduced with the new developer preview include a C API for the Android NDK, functionality to pause and resume AR sessions (allowing users to pause and continue tracking after the app is resumed), improved runtime efficiency, and an improved trackable and anchor interface across anchors, plane finding, and point clouds. Learn more about ARCore Developer Preview.

Vuforia 7 support

With Unity 2017.3 support for Vuforia 7, you can build cross-platform AR apps. Vuforia 7 introduces Model Targets, a new way to place digital content on specific objects using pre-existing 3D models. Also new is Vuforia Ground Plane, a capability that allows you to place digital content on a horizontal surface such as a floor or table. Ground Plane will support an expanding range of iOS and Android devices, taking advantage of platform enablers such as ARKit where available. Learn more about Vuforia in Unity 2017.

Ground Plane (available for free) places digital content on floors or tables. Model Targets recognize a new class of objects based on their geometry, used for example for placing AR content on top of industrial equipment, vehicles or home appliances.

OctaneRender for Unity now available from Otoy

We are excited to welcome OTOY’s path-traced GPU-accelerated render engine to the Unity Editor. OctaneRender for Unity is available for free, or at $20 or $60 for packages that unlock more GPUs and OctaneRender plugins for leading 3D authoring tools. Born on GPUs, OctaneRender is an unbiased render engine, tracing each ray of light in a scene with physics-grade precision to deliver unrivaled photorealism in CG and VFX. OctaneRender for Unity works with Unity 2017.1 and above. Learn more about OctaneRender for Unity.

A Made with Unity scene rendered with OctaneRender for Unity, available on the Asset Store for free.

Check out the full description at Unity’s official blog.
