First Look at Blender’s Eevee
5 June, 2017
News

Grant Wilk, the host of the educational YouTube channel Remington Graphics, uploaded a new video that gives a first look at Blender’s realtime rendering engine, Eevee. The engine will be released as part of Blender 2.8, but you can already try it for yourself.

So, what is Eevee? Let’s take a closer look at the engine. First of all, it follows the industry-wide PBR trend, supporting high-end graphics coupled with a responsive realtime viewport.

Scene Lights

It supports all the realistic Blender light types (i.e., all but Hemi).

Unlike the PBR branch, the developers will make sure that adding or removing lights from the scene doesn’t slow things down. For the tech-savvy: Eevee will use a UBO (Uniform Buffer Object) holding the scene light data to avoid re-compiling the shaders.
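
The idea behind the UBO approach can be sketched in plain Python: all light data lives in one tightly packed buffer that is simply re-uploaded when lights change, so the shader itself never needs recompiling. The field layout below is illustrative only, not Eevee’s actual struct.

```python
import struct

# Illustrative std140-style layout: each light packs into two vec4s
# (position + type in .w, color + intensity in .w).
LIGHT_STRIDE = 32  # 2 vec4s * 4 floats * 4 bytes... wait: 2 * 4 * 4 = 32 bytes

def pack_lights(lights):
    """Pack a list of lights into one bytes blob suitable for a UBO.

    Adding or removing a light only changes this buffer,
    never the compiled shader.
    """
    blob = b""
    for l in lights:
        blob += struct.pack("4f", *l["position"], l["type"])
        blob += struct.pack("4f", *l["color"], l["intensity"])
    return blob

lights = [
    {"position": (0.0, 2.0, 1.0), "type": 0.0,   # e.g. a point light
     "color": (1.0, 1.0, 1.0), "intensity": 100.0},
]
buf = pack_lights(lights)
assert len(buf) == LIGHT_STRIDE * len(lights)
```

The shader would then index into this buffer by light count, instead of being regenerated per light.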

Next, Eevee will support specularity in the shaders and expand the light support to include area lights. The implementation implies expanding the GGX shader to account for the UBO data.

Regular Materials

Uber Shaders

Eevee will implement the concept of Uber shaders, following the Unreal Engine 4 PBR materials. Since Eevee’s goal is not feature parity with UE4, don’t expect to see all the UE4 uber shaders here (car coating, human skin, …).

An Uber shader is mainly an output node. For this to work effectively, the developers also need to implement the PyNode Shader system, so that each (Python) node can carry its own GLSL shader for the engine to use.
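
A minimal sketch of what such a PyNode might look like; the class and attribute names here are hypothetical, not Blender API:

```python
class PyShaderNode:
    """Hypothetical node: each node owns the GLSL snippet the engine uses."""

    def __init__(self, name, glsl):
        self.name = name
        self.glsl = glsl
        self.inputs = {}

    def code(self):
        return self.glsl

# An "uber" output node assembled from smaller nodes; the engine would
# concatenate each node's GLSL into the final fragment shader.
diffuse = PyShaderNode(
    "DiffuseBSDF",
    "vec3 diffuse_bsdf(vec3 albedo, vec3 n, vec3 l) {"
    " return albedo * max(dot(n, l), 0.0); }",
)
output = PyShaderNode("EeveeOutput", "/* combines connected closures */")
output.inputs["Surface"] = diffuse

shader_source = "\n".join(n.code() for n in (diffuse, output))
```

The point is that the GLSL lives with the node definition, so the engine can assemble a shader from whatever node graph the material uses.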

UI/UX solution for multi-engine material outputs

Multiple engines, one material: what to do?

A material that was set up for Cycles and doesn’t yet have an Eevee PBR node should still work in Eevee, even if it looks slightly different. So although the developers want to support Eevee’s own output nodes, they plan a fallback solution where other engines’ nodes are supported (assuming those nodes follow the PyNode Shader system mentioned above).
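
One way such a fallback could be resolved, sketched with hypothetical engine names and node records (the article doesn’t specify the actual lookup order):

```python
def resolve_output(material_nodes, engine, fallback_order=("EEVEE", "CYCLES")):
    """Pick the output node for `engine`, falling back to another engine's
    output node when the material has no native one. Illustrative only.
    """
    by_engine = {n["engine"]: n for n in material_nodes if n["is_output"]}
    for candidate in (engine, *fallback_order):
        if candidate in by_engine:
            return by_engine[candidate]
    raise LookupError("material has no usable output node")

# A material set up for Cycles only still resolves when rendered in Eevee:
nodes = [{"engine": "CYCLES", "is_output": True, "name": "Material Output"}]
node = resolve_output(nodes, "EEVEE")
assert node["engine"] == "CYCLES"
```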

Advanced Materials

More advanced techniques will be supported later, such as:

  • SSS
  • Clear Coat
  • Volumetric

Image Based Lighting

Eevee will support pre-rendered HDRIs, followed by in-scene, on-demand generated probes. This lets scene objects influence each other (reflections, diffuse light bounces, …).

Spherical harmonics (i.e., diffuse only) can be stored in the .blend file for quick loading while probes are generated.

A time cache should also be considered, for responsiveness.

Glossy rough shaders

Agent 327 Barbershop project by the Blender Institute, rendered in Cycles; a reference for the glossy texture on the wood floor


Eevee can’t support glossy reflections with roughness without prefiltering (i.e., blurring) the probes. Otherwise you get terrible performance (see the PBR branch) and a very noisy result.
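
Prefiltering typically stores progressively blurrier copies of a probe in its mip chain, so a rough reflection becomes a single fetch from the right mip instead of many noisy samples. A sketch of a common roughness-to-mip mapping follows; the exact curve Eevee uses isn’t stated in the article.

```python
def roughness_to_mip(roughness, mip_count):
    """Map a [0, 1] roughness to a mip level of a prefiltered probe.

    Mip 0 holds the sharp (mirror) reflection; the last mip is the
    blurriest. A perceptual square-root curve is a common choice.
    """
    roughness = min(max(roughness, 0.0), 1.0)
    return roughness ** 0.5 * (mip_count - 1)

assert roughness_to_mip(0.0, 6) == 0.0        # mirror: sharpest mip
assert roughness_to_mip(1.0, 6) == 5.0        # fully rough: blurriest mip
assert 0.0 < roughness_to_mip(0.25, 6) < 5.0  # in-between roughness
```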

Diffuse approximation

Visual representations of the first few real spherical harmonics (image from Wikipedia)

There are multiple ways to represent the irradiance of the scene, such as cubemaps and spherical harmonics.

The most accurate way is to use a cubemap to store the result of the diffuse shader. However, this is slow, since it requires computing the diffuse contribution for every texel.

A known compromise is to store low-frequency lighting information in a set of coefficients (also known as spherical harmonics). Although this is faster (and easy to work with), it fails in corner cases (when lights end up cancelling each other out).

Eevee will support spherical harmonics, leaving cubemaps solely for baked specular results.
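
The “set of coefficients” idea can be shown concretely: project incoming light onto the first spherical-harmonics bands (here just L0 and L1, i.e. four coefficients), then evaluate irradiance for any normal with a dot product. A minimal sketch, not Eevee’s implementation:

```python
import math

# First two SH bands: 1 constant + 3 linear basis functions.
C0 = 0.5 * math.sqrt(1.0 / math.pi)   # Y_0^0
C1 = 0.5 * math.sqrt(3.0 / math.pi)   # Y_1^{-1}, Y_1^0, Y_1^1

def sh_basis(d):
    x, y, z = d
    return (C0, C1 * y, C1 * z, C1 * x)

def project(samples):
    """Project (direction, radiance) samples onto 4 SH coefficients."""
    coeffs = [0.0] * 4
    for d, radiance in samples:
        for i, b in enumerate(sh_basis(d)):
            coeffs[i] += radiance * b
    # Monte Carlo weight over the sphere: 4*pi / N samples.
    w = 4.0 * math.pi / len(samples)
    return [c * w for c in coeffs]

def evaluate(coeffs, normal):
    """Approximate lighting arriving along `normal`."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(normal)))

# A single light from +Z: the +Z normal receives the most energy.
# (Two equal lights from +Z and -Z would cancel in the linear band --
# the corner case mentioned above.)
samples = [((0.0, 0.0, 1.0), 1.0)]
coeffs = project(samples)
assert evaluate(coeffs, (0.0, 0.0, 1.0)) > evaluate(coeffs, (0.0, 0.0, -1.0))
```

Four (or nine, with the quadratic band) floats per probe is what makes SH cheap to store in the .blend and cheap to evaluate per pixel.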

Probe Objects

Like force fields, probes should be empty objects with their own drawing code.

Environment map array

Reference image from Unreal Engine 4, room design by NasteX 

Large objects (such as floors) may need multiple probes to render the environment correctly. In Unreal Engine, an environment map array handles this on supported hardware. Such arrays are not part of OpenGL 3.3 core, but Eevee can still support them (via an ARB extension) on modern graphics cards.
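
The array approach boils down to binding every probe’s cubemap at once and letting each fragment pick and blend the nearest probes. The selection logic can be sketched as below; the distance-based falloff is a common, simple choice, not Eevee’s exact rule:

```python
def probe_weights(point, probes):
    """Blend weights for the probes influencing `point`.

    Each probe contributes inversely to its distance, normalized so
    the weights sum to 1. Illustrative only.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    raw = [(p["index"], 1.0 / max(dist(point, p["center"]), 1e-6))
           for p in probes]
    total = sum(w for _, w in raw)
    return {i: w / total for i, w in raw}

# A long floor covered by two probes: a point near probe 0 is
# dominated by it but still blends in a little of probe 1.
probes = [{"index": 0, "center": (-4.0, 0.0, 0.0)},
          {"index": 1, "center": (4.0, 0.0, 0.0)}]
w = probe_weights((-3.0, 0.0, 0.0), probes)
assert w[0] > w[1]
assert abs(sum(w.values()) - 1.0) < 1e-9
```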

Post Process Effects

For the SIGGRAPH deadline, the developers will add the following effects:

  • Motion Blur
  • Bloom
  • Tone Map
  • Depth of Field
  • Ground Truth Ambient Occlusion

Other effects that the developers would like to implement eventually:

  • Temporal Anti-Aliasing (to fix the noisy fireflies you get from glossiness)
  • Screen Space Reflections (more accurate reflections that help ground objects in the scene)
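
Of the effects above, tone mapping is simple enough to sketch: it compresses HDR radiance into displayable [0, 1] values. The article doesn’t say which operator Eevee uses; the classic Reinhard operator below is just a common example:

```python
def reinhard(hdr, exposure=1.0):
    """Reinhard tone map: x / (1 + x), after an exposure multiply.

    Maps any non-negative HDR value into [0, 1) smoothly,
    instead of hard-clipping bright pixels.
    """
    x = hdr * exposure
    return x / (1.0 + x)

assert reinhard(0.0) == 0.0    # black stays black
assert reinhard(1000.0) < 1.0  # huge radiance never clips past 1
assert reinhard(0.5) < 0.5     # midtones are gently compressed
```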

The developers aim to ship these features by SIGGRAPH 2017, with a more polished, usable version by the Blender Conference.

Follow this link to try the new engine.

Source: Blender Blog