New Standards of Real-time Digital Rendering from SIGGRAPH
3 August, 2017
News

MEETMIKE, a VR experience being shown this week at the SIGGRAPH 2017 conference, sets a new bar for real-time digital human rendering. The project features a digital version of VFX reporter Mike Seymour, driven and rendered in real time by the real Mr. Seymour. Seymour plays host, interviewing industry veterans and researchers inside VR during the conference. You just have to see this thing!

The result is a model rendered at 90 FPS in VR using UE4. The project is all about details: amazing eyebrows and eyelashes, pores of the skin, and a mind-blowing facial model.
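
As a quick back-of-the-envelope check (my own arithmetic, not from the article), 90 FPS leaves only about 11 milliseconds per displayed frame, which is why the rendering budget quoted below is so tight:

    # Rough frame-budget arithmetic for 90 FPS stereo VR (illustrative only).
    TARGET_FPS = 90
    frame_budget_ms = 1000.0 / TARGET_FPS   # ~11.1 ms per displayed frame
    render_budget_ms = 9.0                  # approximate stereo render time cited by fxguide
    headroom_ms = frame_budget_ms - render_budget_ms

    print(f"Total frame budget: {frame_budget_ms:.1f} ms")        # 11.1 ms
    print(f"Headroom for everything else: {headroom_ms:.1f} ms")  # ~2.1 ms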

To get things going, Seymour puts on a Technoprops stereo camera rig that watches his face as it moves. The images are tracked and solved with tools by Cubic Motion, and that data then drives a facial rig from 3Lateral.

Here are some technical details from fxguide:

  • MEETMIKE has about 440,000 triangles being rendered in real time, which means each stereo VR frame has to be rendered in about 9 milliseconds; roughly 75% of those triangles are used for the hair.
  • Mike’s face rig uses about 80 joints, mostly for the movement of the hair and facial hair.
  • For the face mesh itself, only about 10 joints are used, for the jaw, eyes, and tongue, in order to add more arc to the motion.
  • These work in combination with around 750 blendshapes in the final version of the head mesh (see the sketch after this list).
  • The system uses complex traditional software design and three deep learning AI engines.
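
To give a sense of what those blendshapes do, here is a minimal, generic sketch of blendshape deformation, assuming a simple linear combination of per-vertex offsets; the actual 3Lateral rig in MEETMIKE is far more sophisticated and also layers joint-driven skinning on top:

    import numpy as np

    # Minimal blendshape sketch (generic illustration, not the MEETMIKE rig).
    # Each shape stores per-vertex offsets from the neutral face; the solved
    # facial animation supplies one weight per shape every frame.
    def apply_blendshapes(neutral_verts, shape_deltas, weights):
        # neutral_verts: (V, 3) neutral face vertex positions
        # shape_deltas:  (S, V, 3) per-shape vertex offsets
        # weights:       (S,) solved weights, typically in [0, 1]
        return neutral_verts + np.tensordot(weights, shape_deltas, axes=1)

Scaling this idea up to roughly 750 shapes, combined with the 10 facial joints mentioned above, gives a feel for how much data the rig is blending every frame.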

For more details on the project, you can read the paper Meet Mike: Epic Avatars by Seymour and Epic Games researchers Chris Evans and Kim Libreri.

Source: roadtovr.com
