Kristin Farrensteiner from Germany discussed the production of her latest project, which features some amazing technical detail and great animation. It’s a nice little introduction to creating great animation with UE4, nVidia Hairworks and Apex Cloth!
My name is Kristin Farrensteiner. I am 22 years old and have been studying “Media Design” at the HS Hanover in Germany since 2012. I started 3D modeling in late 2013 and worked on an unreleased adventure game at Daedalic Entertainment for a few months. I’m currently looking for jobs and opportunities to find my way into the industry after college.
I have always been inspired by interactive movies such as Beyond: Two Souls and Life is Strange, and by the cinematography of the Final Fantasy and Metal Gear Solid series. The realistic characters especially fascinated me, along with their creation using modern scanning techniques and motion capture, as seen in Metal Gear Solid V – The Phantom Pain or, more recently, Hellblade.
In my free time I really enjoy creating and concepting characters, giving myself tasks to keep myself motivated and learning. A great source of inspiration for me – when it comes to characters – is Rafael Grassetti.
I always wanted to combine my passion for characters and game development and found that chance within the practical part of my bachelor thesis, which I dedicated to interactive movies.
My project, called “Desolated System”, is characterized by gloomy scenarios and cinematic elements: In the role of a girl awakening from a coma, you find yourself in a hospital bed, disoriented. The hospital room is destroyed. Broken lamps, bursting sparks, red warning lights and evidence of a gunfight accompany you on your path down the hallways of the clinic, on your escape from the hospital. I like to call the game an interactive movie, which is my term for a combination of cinematics with quick-time events and playable third-person exploration parts. The player has the ability to experience the story at their own pace and make decisions along the way, driving the plot’s outcome. In this first part of the game, I introduce the player to the main character Evie and confront them with the desolate reality they step into.
This project started out on the drawing board about half a year ago. I usually start a project by collecting tons of reference material, drawing character sketches and storyboarding the shots. After planning I started working on the character and environment modeling. I worked alone for the most part but received some great support in the last month and got great feedback along the way to keep improving my designs. I then went into exploring the possibilities of the Unreal Engine 4. I had limited experience in UE4 and quickly fell in love with the new Blueprint scripting, the material editor and Matinee for the camera set-up. Real-time rendering really helped me develop the visual style quicker, because I saved a lot of time not having to render with Arnold or Vray. The time I saved, I could put into researching some interesting real-time tech like nVidia Hairworks and Apex Cloth. In the later stages of the project I had great support with animations, the soundtrack, additional assets and scripting to put everything in before the deadline.
The theme of the main character, Evie, was to create a superhuman girl who is imprisoned as a hospital patient, kept for experimentation – until something bad happens… The player is presented with no information about her past or the world around her, and uncovers the story and her fate together with her. I chose a young female protagonist to emphasize her vulnerable exterior and create a stark contrast to her hidden powers.
My initial inspirations included Ellie from The Last of Us and concept art from Ignacio Fernández Ríos’ “Maschinen Project”.
The white hair and pale skin were meant to make her look like a “lab rat”, and the prosthetic arm was meant to introduce an unsettling, alien element for Evie to get used to. In developing the arm I tried to make sure it does not look like a weapon and has non-aggressive, feminine shapes. Here I looked towards Big Boss from Metal Gear Solid 5 and Terminator for reference.
I usually gather lots of references and mood boards before starting a character sculpt – a mix of anatomy references, especially faces, even pore and wrinkle shots, clothing reference, and some character screenshots from Final Fantasy 15, MGS 5 and Beyond: Two Souls.
After that point I sketch out design ideas on paper to see what shapes are working, with emphasis on readability and the character traits I want to communicate. I stepped back from my original designs in favor of a simple patient gown. This also helps to focus more on the character’s emotions and special features like the arm.
As a student I only had limited resources when it comes to budget or time, so I focussed on optimising my workflow towards using tools that make a realistic look easy to achieve. I would really like to work with scan data and motion tracking, but for this project I needed to come up with techniques that get similar results without the required technology.
For character sculpts I start with a base mesh that I reuse for almost every character, which I then model to match 3D scan references from Anatomy360. This workflow allowed me to create multiple characters for the game in addition to Evie.
A character that appears in cinematics really only shines when you have facial detail, so I focussed a lot of effort here. I experiment with different lighting and camera angles, take screenshots and paint over them. Then I refine the facial features step by step, get feedback, and keep up this process until the anatomy and the emotion feel just right.
My face detailing process involves using photo textures of pores and wrinkles in zBrush. I project both the zDepth and the Color channel at once to get a perfect match between color and shape. It’s very important to adjust the textures in Photoshop beforehand, canceling out highlights and shadows to get an albedo texture like you would get from scan data.
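The “cancel out highlights and shadows” step can be sketched in code. A common approach (this is my illustration, not the exact Photoshop recipe from the project) is to estimate the low-frequency lighting with a blur and divide it out, leaving the flat, albedo-like detail. Here on a grayscale image stored as a list of rows of 0..255 values; the blur radius is an arbitrary choice:

```python
def box_blur(img, radius):
    """Simple box blur used as a low-frequency lighting estimate."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def delight(img, radius=2):
    """Divide out the blurred lighting, rescale to the image's mean brightness."""
    blurred = box_blur(img, radius)
    mean = sum(sum(row) for row in img) / (len(img) * len(img[0]))
    return [
        [min(255.0, px / max(blurred[y][x], 1e-6) * mean)
         for x, px in enumerate(row)]
        for y, row in enumerate(img)
    ]
```

On a texture with a smooth lighting gradient baked in, the output is noticeably flatter, which is exactly what you want from an albedo map.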
After I finish this step, I move on to clothing. I have become accustomed to using Marvelous Designer to create a quick base mesh for most of the textiles. I then re-import the mesh into zBrush to smooth or add wrinkles and to optimize it for later simulation and animation, for instance by creating more space between skin and cloth. I don’t use external optimization tools afterwards; instead I use the Quad Draw tool in Maya to retopologize the mesh so I can rig it properly.
For Evie’s prosthetic arm and for other hard surface pieces, I blocked out shapes in Maya and refined them afterwards. The design is heavy on longitudinal cylinders and hydraulic cables mimicking the flow of muscles and bones. I also make sure to use some 45 degree angles that are good for a mechanical, futuristic appeal.
For non-organic texturing, Substance Painter usually works best for me. I’ve been using it since beta, and the “smart materials” are a great way to start off your textures, using curvature and world normal data from your model to create a base texture. I then go in and create color variations or additional normal detail like seams and wear. I created a neat export preset for Painter that packs metallic, roughness and opacity directly into one texture for use in Unreal Engine 4 PBR shaders, which again saves a lot of time.
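The idea behind that export preset is simple channel packing: three grayscale maps end up in the R, G and B channels of one texture, so the shader samples a single map. A minimal sketch on list-of-rows images (the channel order here is my assumption, not necessarily the preset’s):

```python
def pack_mro(metallic, roughness, opacity):
    """Pack three same-sized grayscale maps (0..255) into one RGB image."""
    return [
        list(zip(m_row, r_row, o_row))  # (R, G, B) tuple per pixel
        for m_row, r_row, o_row in zip(metallic, roughness, opacity)
    ]

def unpack_mro(packed):
    """Recover the three grayscale maps, e.g. what the shader reads back."""
    metallic = [[px[0] for px in row] for row in packed]
    roughness = [[px[1] for px in row] for row in packed]
    opacity = [[px[2] for px in row] for row in packed]
    return metallic, roughness, opacity
```

The round trip is lossless, which is why packing costs nothing in quality as long as the maps are grayscale.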
For the realistic-looking skin material in UE4, I used the Unreal SSS shader model. I tweaked the effect using a thickness and subdermal map and added fresnel highlighting. I set up one master material and then used several instances with modified parameters like roughness and normal multiply, so I could dial in the right amount of specularity and detail for different lighting conditions. I advise everyone to use material instances to get real-time control over shader parameters. In one scene I even make Evie’s eyes and facial veins glow, and this is all done via blueprint and a float parameter in the shader.
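The master/instance relationship boils down to defaults plus per-instance overrides, with some parameters (like the glow float) also written at runtime. A tiny data-model sketch of that idea, with invented parameter names, not Unreal’s API:

```python
# Master material: the shared defaults every skin instance inherits.
MASTER_SKIN = {"roughness": 0.45, "normal_multiply": 1.0, "glow": 0.0}

def make_instance(master, **overrides):
    """An 'instance': a copy of the master with a few parameters overridden."""
    inst = dict(master)
    inst.update(overrides)
    return inst

def set_scalar_parameter(instance, name, value):
    """Runtime tweak, like a blueprint driving the glow float each frame."""
    instance[name] = value
```

The key property, mirrored below, is that editing an instance never touches the master, so one shader graph serves every lighting condition.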
nVidia Hairworks and Apex Cloth
I have worked with nCloth and nHair in Maya in the past, but for this project I wanted to expand my knowledge of real-time techniques. Apex Cloth lets you set up cloth weights per vertex in Maya for simulation in UE4. By setting the max distance to 120 instead of 1, I was able to create smooth transitions between simulated and non-simulated parts of Evie’s patient gown. This was most important for the parts where you are running around the hallway or getting up after lying on the ground. The PhysX Apex simulation in Unreal can work with wind sources, so I set one up to add some dynamics to the scene.
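Apex-style “max distance” weighting is easy to show in isolation: each vertex carries a painted value saying how far the simulation may move it away from its skinned position, with 0 pinning it in place. A geometric sketch of that clamp (the numbers are illustrative, not project values):

```python
def clamp_to_max_distance(skinned, simulated, max_distance):
    """Pull a simulated vertex back so it stays within max_distance of its
    skinned position. Positions are (x, y, z) tuples."""
    offset = [s - k for s, k in zip(simulated, skinned)]
    length = sum(c * c for c in offset) ** 0.5
    if length <= max_distance or length == 0.0:
        return tuple(simulated)
    scale = max_distance / length  # shrink the offset onto the allowed sphere
    return tuple(k + c * scale for k, c in zip(skinned, offset))
```

Painting a gradient of max-distance values across the gown is what gives the smooth blend between fully simulated and fully skinned regions.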
The only hassle I encountered was the collision system, where you set up a collision rig of spheres and integrate it into the character rig via hierarchy parenting. Tweaking the collision and the clothing asset’s physics properties in Unreal took some time but got me satisfying results.
For the hair I went through a couple of workflows and animation methods. At first I tried Steve Holmes’ workflow for creating the hair and rigging simulated nHair strands with IK splines on bones, blending them with FK and parent constraints for more predictable results. The results can be seen here:
It looked acceptable for my first real-time hair system, but I was encouraged to try something more dynamic, which led me to Hairworks. To get Hairworks to really “work” I updated to the latest Unreal version and had to compile the plugin. I hope for a better solution in the future to make it easier for everyone to use. If you’re still interested, make sure you have an nVidia account and a GitHub account and link them together to access the tutorials and the source code on GitHub.
For building the actual hair I created some CV curves in Maya as guide curves, representing the shape the hair should maintain. Each curve’s origin point is connected to one vertex of the groom mesh, so that all the vertices of the groom mesh end up connected to curves. Setting up the hair in Unreal is pretty fun! It almost reminds me of Maya’s nHair. Within the engine, you can adjust clumping, strand width, volume and even waviness – all in real-time! But since the character has just awoken from a coma, I couldn’t go too crazy with the hairstyle (maybe next time).
All in all, Hairworks and Apex create great results, but it takes a while to get them right. The pros are that you can configure both in the engine in real-time. The cons are that they sometimes crashed Maya on export, and the integration took a lot of research and experimentation. Also, we had to fall back to the version without Hairworks for ATI graphics cards.
I was very much inspired by Metal Gear Solid’s cinematography which puts the viewer very close to the action and focusses a lot on the emotions of the characters. I took interest in their tracked “handheld” camera feel and tried to reproduce it within my game.
Again, I didn’t have a professional studio or tracking hardware, so I resorted to improvising my own tracking methods. I used a video camera and my living room as the stage and recorded some camera shots.
I took these shots into Adobe After Effects to create a 3D camera animation from the footage that could be exported as an .fbx to Maya. I tweaked the camera animation in Maya (because it was quite jittery) and then attached a skeletal mesh to it to export that to Unreal, giving me a bone position to attach the cinematic camera to.
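Taking the jitter out of a tracked camera usually means low-pass filtering the per-frame channel values, for example with a centered moving average. A sketch of that step on a single animation channel (window size is an arbitrary choice; in Maya this would be done with curve smoothing tools rather than a script):

```python
def smooth_channel(values, window=3):
    """Centered moving average over per-frame values; window should be odd.
    Edge frames average over the frames that exist."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out
```

Applied to a noisy translation channel, the frame-to-frame deltas shrink while the overall motion is preserved, which is the effect you want before handing the curve to the engine.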
This workflow may be overly complex for Maya; I was aware there is a script for Max that lets you export camera animations directly. I will definitely look into improving this step of the workflow in the future, maybe with the new Sequencer tool that Epic Games added as an experimental feature – right after I was done setting up my Matinees… I would have liked to work with Sequencer earlier, as I am sure it would have reduced the amount of work for retiming and rearranging shots. I really love this kind of cinematography, but I had to decide against switching in the end, because time was running out.
What we did in the end to create a dynamic handheld feel was layering a second Matinee on top of the existing shots, recreating the movement of the tracked camera data we had. This one only changes the rotation and FOV of the camera slightly, to give some variation. For each shot I was able to dial in the amount of this effect from 0.0 to 1.0, giving me total control over the effect in real-time.
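The per-shot “amount” dial is effectively a linear blend between the clean camera and the camera with the handheld offsets applied. A sketch for rotation plus FOV (the parameter names are mine, not from the project):

```python
def apply_handheld(base_rotation, base_fov, noise_rotation, noise_fov, amount):
    """Blend a handheld noise layer onto a clean camera.
    amount = 0.0 -> clean shot, amount = 1.0 -> full handheld layer."""
    amount = max(0.0, min(1.0, amount))  # clamp to the 0.0..1.0 dial range
    rotation = tuple(b + n * amount
                     for b, n in zip(base_rotation, noise_rotation))
    fov = base_fov + noise_fov * amount
    return rotation, fov
```

Because the noise layer is additive, zeroing the dial recovers the original shot exactly, which makes it safe to tune per shot.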
What was also essential for the look I wanted to achieve were the post-processing settings within the camera setup. You can achieve quite realistic results just by adding some color correction and adjusting the field of view. I aimed for a color palette with a lot of blueish greens, with red as a signal color. I used Unreal’s color grading for this.
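Color correction of this kind is often expressed per channel as a lift/gamma/gain operator – lift raises the blacks, gain scales the whites, gamma bends the midtones. This operator choice is my illustration; the article doesn’t specify Unreal’s exact grading math:

```python
def lift_gamma_gain(value, lift, gamma, gain):
    """Classic grading operator on a single 0..1 channel value."""
    v = max(0.0, min(1.0, value))
    graded = gain * (v + lift * (1.0 - v))  # lift blacks, then scale
    graded = max(0.0, min(1.0, graded))
    return graded ** (1.0 / gamma)          # bend the midtones
```

Pushing the palette toward blueish greens would then mean slightly different lift/gain values for the green and blue channels than for red.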
Conclusion & Recommendations
I enjoyed the rapid experimentation in Unreal, seeing the results of lighting, camera shots and rendering in real-time. It gives your workflow a much more interactive feeling, great for iterating on your shader settings and lighting. And again, it did not require offline rendering that could easily take a week or two for some projects.
For the future, I’m looking forward to recreating some shots with the new Sequencer and improving the lighting in some areas. I advise you to stay focussed on improving your workflow at every turn of your project and to experiment with new tools and plugins from time to time. It really pays off in quality and speed of production, and you will learn to solve your specific tasks with more fun and better results! I learned a lot from other great artists along the way, and I’m really glad to be part of a growing community that shares its creative ideas!