New Standards of Real-time Digital Rendering from SIGGRAPH
3 August, 2017
News

MEETMIKE, a VR experience shown this week at the SIGGRAPH 2017 conference, sets a new bar for real-time digital human rendering. The project features a digital version of VFX reporter Mike Seymour, driven and rendered in real time by the real Mr. Seymour. Seymour plays host, interviewing industry veterans and researchers inside VR during the conference. You just have to see this thing!

The result is a model rendered at 90 FPS in VR using UE4. The presentation is all about details: amazing eyebrows and eyelashes, pores of the skin, and a mind-blowing facial model.
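As a quick back-of-the-envelope check on that 90 FPS figure (illustrative arithmetic only, not from the article), the renderer gets roughly 11 milliseconds to finish each stereo frame:

```python
# Frame-time budget for 90 FPS VR (illustrative arithmetic).
TARGET_FPS = 90  # common VR headset refresh rate
budget_ms = 1000 / TARGET_FPS  # milliseconds available per stereo frame
print(f"{budget_ms:.1f} ms per frame")  # prints "11.1 ms per frame"
```

Finishing under that budget, every frame, is what makes the quality of the result so striking.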

To get things going, Seymour puts on a Technoprops stereo camera rig that watches his face as it moves. The images are tracked and solved with tools by Cubic Motion, and the solved data then drives a facial rig from 3Lateral.

Here are some technical details from fxguide:

  • MEETMIKE renders about 440,000 triangles in real time, producing a stereo VR frame roughly every 9 milliseconds; about 75% of those triangles are used for the hair.
  • Mike’s face rig uses about 80 joints, mostly for the movement of the hair and facial hair.
  • For the face mesh itself, only about 10 joints are used (for the jaw, eyes, and tongue), mainly to add arcing motion.
  • These are in combination with around 750 blendshapes in the final version of the head mesh.
  • The system uses complex traditional software design and three deep learning AI engines.
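The last bullets describe a classic hybrid rig: mostly-linear blendshapes for skin deformation, plus a small number of joints for motion that should follow an arc. A minimal sketch of the blendshape half, in plain Python (the mesh and shape names are hypothetical, not MEETMIKE's actual data; real engines evaluate this on the GPU):

```python
# Minimal linear blendshape mixer (illustrative; not MEETMIKE's implementation).
# A posed vertex is the neutral vertex plus the weighted sum of shape deltas.

base_mesh = [(0.0, 0.0, 0.0)] * 4  # neutral pose: 4 vertices (real faces use far more)

# Each blendshape stores a per-vertex (dx, dy, dz) offset from the neutral pose.
# MEETMIKE reportedly combines around 750 of these; we sketch just two.
blendshapes = {
    "jaw_open":   [(0.0, -0.5, 0.0), (0.0, -0.3, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)],
    "smile_left": [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.2, 0.1, 0.0), (0.0, 0.0, 0.0)],
}

def deform(weights):
    """Return the posed mesh: neutral positions plus weighted shape deltas."""
    posed = []
    for i, (x, y, z) in enumerate(base_mesh):
        for name, w in weights.items():
            dx, dy, dz = blendshapes[name][i]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        posed.append((x, y, z))
    return posed

posed = deform({"jaw_open": 0.8, "smile_left": 0.5})
```

Because each shape contributes a straight-line offset, purely blendshape-driven motion is linear; that is why rigs like this one layer a few joints on top for the jaw, eyes, and tongue, whose motion naturally follows an arc.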

For more details on the project, you can read the paper Meet Mike: Epic Avatars by Seymour and Epic Games researchers Chris Evans and Kim Libreri.

Source: roadtovr.com
