
Unreal Engine 4.19 Released!

The new version brings improvements to rendering, physics, landscape terrain, and many more systems.

Unreal Engine 4.19 is finally here! The new version brings improvements to rendering, physics, landscape terrain, and many more systems so that you can build worlds that run faster than ever before. V.4.19 introduces the new Live Link plugin which seamlessly blends workflows between external content creation tools and Unreal Engine so you can see updates as you make changes to source content. With the continued improvements to Sequencer, you can be the director with even more control of your scenes in real time.

Check out an overview of the major features introduced in the newest version below:

New: Temporal Upsampling

We have added a new upscaling method called Temporal Upsample that performs both the temporal accumulation of the lower-resolution frame and the primary spatial upscale in a single pass, reducing output blur.

1 – Temporal Upsample Enabled; 2 – Temporal Upsample Disabled

In previous versions, Screen Percentage enabled you to render the 3D scene at a lower resolution and spatially upscale it to the output resolution before the UI was drawn on top. This was a very flexible way to hit a GPU budget on less powerful hardware, but it required a tradeoff between output sharpness and GPU performance.

To replace that single screen percentage, we now offer two separate screen percentages used for upscaling:

  • A primary screen percentage that, by default, uses the spatial upscale pass as before;
  • A secondary screen percentage that is a static, spatial-only upscale at the very end of post-processing, before the UI draws.

The temporal upscaler that happens in the Temporal Anti-Aliasing (TAA) pass enables consistent geometry sharpness across varying primary screen percentages, from 50% up to 200%. In effect, even though the screen percentage may be lowered, with TAAU enabled, geometry in the background that would usually become muddy or blend together can now maintain its detail and complexity, like the fence and telephone pole in the example above.
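If you want to drive these settings from game code rather than the editor, a minimal sketch along the following lines may help; it assumes the console variable names r.ScreenPercentage and r.TemporalAA.Upsampling, so verify them against your engine version:

// Minimal sketch (console variable names are assumptions): render the scene at a
// reduced primary screen percentage and let the TAA pass perform the upscale.
#include "HAL/IConsoleManager.h"

void ConfigureTemporalUpsample()
{
    // Render the 3D scene at 70% of the output resolution.
    if (IConsoleVariable* ScreenPercentage =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        ScreenPercentage->Set(70.0f);
    }

    // Have the TAA pass perform the primary upscale (TAAU) so geometry keeps its
    // sharpness instead of going through the purely spatial upscaler.
    if (IConsoleVariable* TemporalUpsampling =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.TemporalAA.Upsampling")))
    {
        TemporalUpsampling->Set(1);
    }
}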

For additional information, see the Screen Percentage with Temporal Upsample page.

New: Dynamic Resolution

We now have support for Dynamic Resolution, which adjusts the resolution as needed to achieve a desired framerate, for games on PlayStation 4 and Xbox One! This works by using a heuristic to set the primary screen percentage based on the previous frames' GPU workload.
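As a rough sketch of enabling this from code, the following assumes the r.DynamicRes.* console variables (operation mode, screen percentage bounds, and frame time budget); treat the names and values as illustrative rather than definitive:

// Minimal sketch (console variable names are assumptions): force dynamic resolution
// on and constrain how far the heuristic may drop the primary screen percentage
// while targeting a 33.3 ms (30 fps) GPU budget.
#include "HAL/IConsoleManager.h"

static void SetDynamicResVariable(const TCHAR* Name, float Value)
{
    if (IConsoleVariable* Variable = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        Variable->Set(Value);
    }
}

void EnableDynamicResolution()
{
    SetDynamicResVariable(TEXT("r.DynamicRes.OperationMode"), 2.0f);        // 2 = force enabled
    SetDynamicResVariable(TEXT("r.DynamicRes.MinScreenPercentage"), 50.0f); // lowest resolution allowed
    SetDynamicResVariable(TEXT("r.DynamicRes.MaxScreenPercentage"), 100.0f);
    SetDynamicResVariable(TEXT("r.DynamicRes.FrameTimeBudget"), 33.3f);     // target GPU time in ms
}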

Support for additional platforms will be coming in a future release. For additional information, see the Dynamic Resolution page.

New: Unified Unreal AR Framework

With this release of Unreal Engine, the Unified Unreal Augmented Reality Framework provides a rich, unified framework for building Augmented Reality (AR) apps for both Apple and Google handheld platforms, so developers can target both platforms through a single code path. The Unified Unreal AR Framework includes functions supporting Alignment, Light Estimation, Pinning, Session State, Trace Results, and Tracking.
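As an illustration of that single code path, a hit test against tracked geometry might look roughly like the sketch below; the UARBlueprintLibrary functions and enum values are assumptions drawn from the engine's AugmentedReality module and may not match the 4.19 API exactly:

// Hedged sketch: hit-test a screen position against tracked AR geometry and return
// the world transform of the first hit. Function names are assumptions.
#include "ARBlueprintLibrary.h"

bool TryGetARHitTransform(const FVector2D& ScreenPosition, FTransform& OutTransform)
{
    // Only query while the device is actually tracking.
    if (UARBlueprintLibrary::GetTrackingQuality() == EARTrackingQuality::NotTracking)
    {
        return false;
    }

    // Trace against feature points, the ground plane, and plane extents.
    const TArray<FARTraceResult> Hits = UARBlueprintLibrary::LineTraceTrackedObjects(
        ScreenPosition,
        /*bTestFeaturePoints*/ true,
        /*bTestGroundPlane*/ true,
        /*bTestPlaneExtents*/ true,
        /*bTestPlaneBoundaryPolygon*/ false);

    if (Hits.Num() == 0)
    {
        return false;
    }

    OutTransform = Hits[0].GetLocalToWorldTransform();
    return true;
}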

Courtesy of Theia

New: Unified Unreal AR Framework Project Template

Also new is the Blueprint template HandheldAR, which provides a complete example project demonstrating the new functionality.

New: Physical Light Units

New to Unreal Engine 4.19, all light units are now defined using physically based units. Some light units were already well defined, but others used undefined, engine-specific units. The unit selection of a light is done through a drop-down menu (where applicable). For compatibility reasons, the default light units are kept compatible with previous versions of the engine. The new light unit property can be edited per light, changing how the engine interprets the “Intensity” property when doing lighting-related computations.
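For example, switching a point light to photometric units from C++ might look like the sketch below; the IntensityUnits property and ELightUnits enum reflect our reading of the new API, so verify which class they live on in your engine version:

// Hedged sketch: interpret a point light's Intensity as lumens instead of the
// legacy unitless value. Property and enum names are assumptions.
#include "Components/PointLightComponent.h"

void MakeLightPhysical(UPointLightComponent* Light)
{
    if (Light == nullptr)
    {
        return;
    }

    Light->IntensityUnits = ELightUnits::Lumens; // photometric units
    Light->SetIntensity(1700.0f);                // roughly a 100 W incandescent bulb
}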

Project courtesy of Litrix

New: Live Link Plugin Improvements

The Maya Live Link Plugin is now available and can be used to establish a connection between Maya and UE4, enabling you to preview changes made in Maya in real time inside the UE4 Editor.

To set it up, copy the binary files associated with your version of Maya to your Maya plug-ins folder, then enable the plugin through Maya's Plug-in Manager.

You can find pre-built binaries of the Maya Live Link Plugin inside your Engine installation folder in the following location: Engine\Extras\MayaLiveLink\LiveLinkMaya.zip. Inside the zip file are binaries for Maya 2016, 2017, and 2018 for Windows. If you require binaries for other versions, the source code for the Plugin can be found in the Engine\Source\Programs\MayaLiveLinkPlugin folder and can be used to build them.

The Maya Live Link UI window can be opened through the MEL console with the command MayaLiveLinkUI. At the top right is a display that shows whether this instance of Maya is connected to an Unreal client. Below Unreal Engine Live Link is a list of all the subjects currently being streamed (in the image above, only one is being streamed), and the lower window provides controls for adding and removing subjects from streaming.

MotionBuilder Live Link Plugin

The MotionBuilder Plugin offers the same functionality as the Maya Plugin and shows up in the Editor as a connection in a similar way. It also has a custom UI for managing streaming:

Objects can be selected from the current scene and added to the streamed list (as shown above). From there, their names can be set in the Subject Name column and their Stream Type (Camera, Skeleton, etc.) can be set. Streaming on the subject can also be enabled and disabled from here.

Stream Active Camera as Editor Active Camera

Controlling the active camera within Maya will now manipulate the Editor’s active camera. We’ve also reworked the Editor update hook so that streaming is more robust and facial rigs/leaf bones now correctly update.

Added Virtual Subjects to Live Link

Virtual subjects are a way to combine multiple subjects coming into Live Link into one subject that can then be used by the Editor. Virtual Subjects are created within the client and contain the bones of multiple real subjects, all tied to a common root.

Below is the virtual subject being applied to a skeletal mesh in the editor.

Here we have created a new virtual subject (using Add > Add Virtual Subject) and set it to read subjects from the Maya Live Link source. The virtual subject is set to consist of the three Item subjects being sent from Maya.

Live Link Plugin Development

The purpose of Live Link is to provide a common interface for streaming and consuming any kind of animation data from sources outside of UE4 (for example, DCC tools and Motion Capture servers). It is designed to be extensible via Unreal Plugins, allowing third parties to develop new features with no need to make and then maintain Engine changes.

There are two paths for integrating in Live Link:

  1. Building an Unreal Engine Plugin that exposes a new Source to Live Link. This is the recommended approach for anyone that already has their own streaming protocol.
  2. Integrating a Message Bus end-point in third party software to allow it to act as a data transmitter for the built-in Message Bus Source. This is the approach we have taken for our Maya and MotionBuilder Plugins.

For an overview of general Plugin development, see the Plugins documentation pages.
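For path (1), a custom source boils down to implementing the ILiveLinkSource interface from the LiveLinkInterface module and pushing subject data through the client it receives. The sketch below is a minimal, hypothetical example; the interface shown reflects our understanding of this release and exact signatures may differ:

// Hedged sketch of a custom Live Link source for an existing streaming protocol.
#include "CoreMinimal.h"
#include "ILiveLinkSource.h"
#include "ILiveLinkClient.h"

class FMyProtocolLiveLinkSource : public ILiveLinkSource
{
public:
    virtual void ReceiveClient(ILiveLinkClient* InClient, FGuid InSourceGuid) override
    {
        // Cache the client and our identifier; subject data is then pushed to the
        // client from wherever the existing streaming protocol receives it.
        Client = InClient;
        SourceGuid = InSourceGuid;
    }

    virtual bool IsSourceStillValid() override { return Client != nullptr; }

    virtual bool RequestSourceShutdown() override
    {
        // Nothing asynchronous to wait for in this sketch.
        Client = nullptr;
        return true;
    }

    virtual FText GetSourceType() const override { return NSLOCTEXT("MyProtocol", "SourceType", "My Streaming Protocol"); }
    virtual FText GetSourceMachineName() const override { return NSLOCTEXT("MyProtocol", "SourceMachine", "localhost"); }
    virtual FText GetSourceStatus() const override { return NSLOCTEXT("MyProtocol", "SourceStatus", "Active"); }

private:
    ILiveLinkClient* Client = nullptr;
    FGuid SourceGuid;
};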

Live Link Motion Controller Support

Live Link can now be used with Motion Controllers. The motion source property of a Motion Controller can be set to a subject within Live Link. When set in this way, the position of the Motion Controller Component is governed by the first transform of the subject.

The Motion Controller integration can also access custom parameters on the Live Link subject. These are passed via the curve support built into Live Link subjects.

To access the values, it is necessary to derive a new Blueprint from MotionControllerComponent and override the OnMotionControllerUpdated function. During OnMotionControllerUpdated, it is valid to call GetParameterValue on the Motion Controller.

Here is an example of a possible way to drive a Light Component's Intensity from an Intensity parameter in Live Link:
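Since the post sets this example up as a Blueprint, here is a rough C++ equivalent as a hedged sketch; the component class and light reference are hypothetical, and it assumes the HeadMountedDisplay module's MotionControllerComponent:

// Hedged sketch: a MotionControllerComponent subclass that reads an "Intensity"
// parameter from the Live Link subject and applies it to a point light.
// The class and property names here are illustrative, not part of the engine.
#include "CoreMinimal.h"
#include "MotionControllerComponent.h"
#include "Components/PointLightComponent.h"
#include "LiveLinkDrivenLightController.generated.h"

UCLASS(ClassGroup = (LiveLink), meta = (BlueprintSpawnableComponent))
class ULiveLinkDrivenLightController : public UMotionControllerComponent
{
    GENERATED_BODY()

public:
    // The light whose Intensity is driven by the Live Link parameter.
    UPROPERTY(EditAnywhere, Category = "Live Link")
    UPointLightComponent* DrivenLight = nullptr;

protected:
    virtual void OnMotionControllerUpdated() override
    {
        Super::OnMotionControllerUpdated();

        // GetParameterValue is only valid to call during OnMotionControllerUpdated.
        bool bValueFound = false;
        const float Intensity = GetParameterValue(TEXT("Intensity"), bValueFound);
        if (bValueFound && DrivenLight != nullptr)
        {
            DrivenLight->SetIntensity(Intensity);
        }
    }
};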

Miscellaneous Improvements

  • Added ability for Live Link Sources to define their own custom settings – this was requested by partners, such as IKinema, who are building Live Link support.
  • Live Link Sources pushing skeletons to Live Link Client can now pass source GUID as well – Message bus source now pushes GUID when sending skeleton.
  • Added virtual Initialization function and Update DeltaTime parameter to Live Link Retargeter API

New: Sequencer Improvements

We continue to make improvements to the functionality and workflow of Sequencer to make it more powerful and increase efficiency.

Copy/Paste/Duplicate Object Tracks

You can now copy/paste/duplicate object tracks and their child tracks from the right-click context menu. You can copy a spawnable to the same Level Sequence or to another Sequence, or copy a possessable from one Level Sequence to another, and the object will be bound to the same object in the other Level Sequence. You can now also copy from one UMG animation to another.

Level Sequence Dynamic Transform Origin

You can now offset the Actors controlled by Sequence Transform tracks with a global offset at runtime. This also allows you to reuse a Level Sequence in different coordinate spaces. To use this feature, inside the Details panel for your Level Sequence, enable Override Instance Data then assign a Transform Origin Actor.

The Transform Origin section specifies a transform that is added to all absolute transform sections inside the Sequence (scale is ignored). For the best results, keyframe your Actor's starting transform at 0,0,0 inside your Level Sequence and let the Transform Origin Actor you define drive the position from which to start in the world (depicted below).

Whenever we press a key (see above), the cube moves and our Actor continues to walk along a path, starting from the location of the Transform Origin Actor.

Sequencer Anim BP Weight Control

Sequencer weight blending now works with Animation Blueprints. You can use the same slot for the animation, and control weight by curve. Assign the same Slot Name under Properties for each animation to blend. Then keyframe the Weight values you desire. Inside your Anim BP, use the Slot Node. The AnimBP will use whatever Weight values you provided when blending.

Improved Sequencer Editor Performance

  • Compile On The Fly Logic – Sequencer is now able to compile partially or completely out-of-date evaluation templates from the source data as it needs. This affords much more efficient compilation when working within Sequencer in the editor.
  • Details Panel Updates – The details panel now defers updates while scrubbing/playing in the Sequencer Editor.

Minor Improvements

  • New .ini setting for the default behavior of “When Finished”.
    • For level sequences, it continues to default to RestoreState. For UMG, it’s now set to Keep State.
      [/Script/LevelSequence.LevelSequence]
      DefaultCompletionMode=RestoreState
  • Always create a camera cut track when adding a camera so that there's less confusion as to why a camera doesn't take control when Sequencer is activated.
  • Rotations will no longer flip when autokeying/adding a new key.
  • Option to show selected nodes in the tree view only. This can be useful when working with a lot of actors in your level sequence to limit scrolling up and down the view.
  • Option to bake transform track to keys at frame intervals.
  • Option to create a camera anim from a transform track.
  • Ability to bind tracks (Path, Camera Cut, or Attach) to spawnables in subsequences.
  • Can now drag sections up a row, instead of only down.

New: Landscape Rendering Optimization

The Landscape level of detail (LOD) system has been changed so that instead of being distance-based it now uses screen size to determine detail for a component, similar to how the Static Mesh LOD system works. The LOD distribution now gives more coherent sizes on distant triangles based on their screen size, which can maintain detail that was not possible before.


(Left, old method; Right, new method)

You can visualize the Landscape LODs by going to Viewport > Visualizers > LOD.

The following settings are available to control where LODs transition based on screen size. Additionally, material tessellation can now be controlled, which gives an added performance benefit when using a Directional Light with dynamic shadowing enabled.

New: Proxy LOD System (Experimental)

The new Proxy LOD system is an experimental Plugin for producing low poly LOD with baked materials for multiple meshes. The new system is used by HLOD and is a replacement for Simplygon.

Note: Currently, only Windows builds are supported.

To enable the tool, look for the "Hierarchical LOD Mesh Simplification" option under Editor in the Project Settings dialog. In the Hierarchical LOD Mesh Reduction Plugin setting, you should be able to select the new module "ProxyLODMeshReduction". After the prompted editor restart, the Plugin will replace the third-party Simplygon tool for static mesh merging LODs. This new Plugin is accessed in two ways: the HLOD Outliner and the Merge Actors dialog.


Source Geometry: 11 geometry pieces, 38,360 tris, and 27,194 verts | Result Geometry Proxy: 1 geometry piece, 8,095 tris, and 5,663 verts

New: Material Parameters Editing and Saving

You can now save parameter values to a new Child Instance or Sibling Instance in the Material Editor and the Material Instance Editor!

  • Save to Sibling Instance enables you to save the current parameters and overrides to a new Material Instance with the same Parent Material.
  • Save to Child Instance enables you to save the current parameters and overrides to a new Material Instance with the current instance as the Parent Material.

The Material Editor now features a Parameter Defaults panel. Here, you’ll have access to the default values set in your Material Graph for any parameter. You can easily change any of your parameter default values here.

New: Material Layering (Experimental)

Material Layering enables you to combine your Materials in a stack, using the new Material Layer and Material Layer Blend assets! This enables you to build the correct Material Graph without building sections of nodes by hand. This functionality is similar to Material Functions, but supports the creation of child instances.

 

Material Layering is experimental and can be enabled in the Project Settings > Rendering > Experimental section by setting Support Material Layers to true.

For additional information, see our post on the forums to get started and leave us feedback!

Epic Games 

You can find the full breakdown of the newest version here.

Comments (2)

  • jaime (6 years ago): couldn't find the livelink plugin in the path they said

  • Juris Perkons (6 years ago): When I did read that there are new features in Landscapes, I was hoping that looping landscapes (levels) will be that. Oh, well. Maybe next time.
