Marcel Lenoir talked about the way his tiny team managed to build a Myst-like environment, Buccaneers Falls, in Unreal Engine 4.
My name’s Marcel Lenoir and I’m 23. I’ve always been stunned by the visuals in video games, and by the feeling you get when you’re in front of a well-crafted world. So 10 years ago I decided I would work in the video game industry no matter what.
Today I’m an environment art student in my final year at Objectif3D (a CGI/game art and design school in Montpellier, France).
This year, during projects, the class structure is set up to simulate a professional environment. We have short deadlines, and retakes from our Instructor and Lead Artist for the project, Thomas Keslacy.
Buccaneers Falls Environment
The assignment for this exercise was to create a cinematic in Unreal Engine 4 in 6 weeks that:
- Is around 1 minute long
- Contains a landscape authored in World Machine
- Respects real-time constraints (reasonable polycount and texture sizes)
- Runs at a minimum of about 50 FPS (on a GTX 1060)
- Shows a reproduction of 3 professional artworks:
Environment : Buccaneer Headquarters by David Alvarez
Character : Survivor 2 by Darren Bartley
Vehicle : Hover Bike by Shem Dawson
Our team was the following :
Benjamin Piquemal: Character Artist / Animator
Guillaume Antoine: Environment Artist
William Dupré: Environment Artist
Marcel Lenoir: Vehicle / Environment Artist
The first thing I did was set up a pre-production document that summarized a lot of things: pre-production workflows, file-sharing systems, nomenclature, the note of intent, the synopsis, the storyboard, etc.
The second thing was contacting David Alvarez to get as many resources and as much information on his work as possible. It turned out very well: he kindly provided us with a detailed production document and high-resolution images of his work.
During pre-production we roughly represented what we wanted for the surroundings:
Then William authored the main shapes directly in Unreal Engine with the Landscape tool, a bit like a blockout, exported the heightmap to World Machine to get those hard edges and more precise erosion, then brought the heightmap back into Unreal.
One important thing is that we knew exactly what part of the world the camera would be looking at. So we used the rocks to recreate the hill around the cave, but everything else is just the landscape itself, textured with an artificial boulder noise that blends well with the actual rocks.
As for the Thai-looking islands, they are actually just one rock scaled in various ways.
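Reusing one mesh with varied per-instance transforms is a cheap way to fake variety. Here is a minimal Python sketch of that idea; all names and ranges are illustrative, not taken from the project:

```python
import random

def scatter_island_instances(count, seed=42):
    """Generate placement transforms that all reuse a single rock mesh.

    Random yaw plus independent per-axis scale makes one sculpt read
    as many distinct islands. All ranges here are illustrative.
    """
    rng = random.Random(seed)
    transforms = []
    for _ in range(count):
        transforms.append({
            "location": (rng.uniform(-5000.0, 5000.0),
                         rng.uniform(-5000.0, 5000.0),
                         0.0),
            "yaw_degrees": rng.uniform(0.0, 360.0),
            # Non-uniform scale hides the silhouette repetition.
            "scale": (rng.uniform(0.5, 3.0),
                      rng.uniform(0.5, 3.0),
                      rng.uniform(0.5, 3.0)),
        })
    return transforms

islands = scatter_island_instances(8)
```

Seeding the generator keeps the layout reproducible between runs, which matters when you are iterating on a fixed camera path.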
The rocks were essential in this project, but we didn’t have much time to sculpt a large quantity of them. So they needed to be very versatile, giving a different look from every angle.
Guillaume sculpted and baked the arch creating the cave entrance, and the 3 main rocks to break the silhouette and add variety to the scene. Here are some instances of these rocks and the arch:
He also created three tileable textures in ZBrush that we used with vertex painting on the rocks and for the landscape, as well as some small rocks and props to add variety.
The shader was pretty complex but it allowed us to do a lot of things with the rocks:
Using vertex painting we could paint a classic rock surface, a sharper rock surface, or a wet rock surface, and add some red lichen on top of any of them. We could also dynamically generate a kind of deposit on top of the rocks using the WorldAlignedBlend node.
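That blend logic can be approximated in plain code. The following is a hedged sketch of the concept, not the actual shader graph: each vertex-color channel lerps in one surface layer, and an upward-facing weight (roughly what WorldAlignedBlend derives from the world normal's Z component) overlays the deposit on top.

```python
def lerp(a, b, t):
    """Linear interpolation, the basic building block of layer blending."""
    return a + (b - a) * t

def world_aligned_weight(world_normal_z, sharpness=10.0, bias=0.0):
    """Rough stand-in for UE4's WorldAlignedBlend: a 0..1 weight that is
    high on upward-facing surfaces, independent of the mesh's rotation."""
    return max(0.0, min(1.0, (world_normal_z + bias) * sharpness))

def shade_rock(base, sharp, wet, lichen, deposit, vertex_color, normal_z):
    """Blend layers by the painted vertex colors (R=sharp, G=wet,
    B=lichen), then overlay the world-aligned deposit. Scalars stand
    in for full texture samples."""
    r, g, b = vertex_color
    c = lerp(base, sharp, r)
    c = lerp(c, wet, g)
    c = lerp(c, lichen, b)
    return lerp(c, deposit, world_aligned_weight(normal_z))
```

Because the weight comes from the world-space normal rather than UVs, the same deposit appears on the tops of rocks however they are rotated or scaled, which is exactly why it suits heavily reused meshes.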
Also, for optimization, the stairs are made with one step repeated several times.
During pre-production, we set up a list of every asset distinguishable in the artwork; those were the priority.
We extrapolated the additional assets and set up a Trello board with the corresponding deadlines.
The asset pipeline was classic: modeling was done in 3ds Max, and sometimes in ZBrush for the high-poly details.
We used xNormal to bake the normal maps and occlusion (plus cavity if needed), and then Photoshop to texture everything.
Depending on their distance to the camera, the assets were more or less heavy in terms of tris and texture sizes.
Because of the artwork, we had to create a lot of props in a very limited time, so to speed up the texturing process I set up an ID-based Master Material:
Basically, you just had to model your prop, unwrap it, and paint up to 3 masks (red, green, or blue) in Photoshop. Each mask can then be hooked to any material you want in Unreal, with access to several settings like material rotation, scale, or translation for a specific mask, inputs for mesh AO/cavity maps and their influence, a pixel-ratio checker, etc.
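As a rough illustration of the ID-mask idea (a sketch of the concept, not the actual material graph): each channel of the painted mask acts as a lerp weight that composites one material layer over the base.

```python
def lerp(a, b, t):
    """Linear interpolation between two values."""
    return a + (b - a) * t

def apply_id_masks(base, layers, mask_rgb):
    """Composite up to three material layers over a base using the
    painted R, G, B mask channels as blend weights (0..1).
    'base' and 'layers' are scalar stand-ins for texture samples."""
    color = base
    for layer, weight in zip(layers, mask_rgb):
        color = lerp(color, layer, weight)
    return color

# A pixel fully painted red takes layer 0; an unpainted pixel stays base.
painted = apply_id_masks(0.0, [1.0, 2.0, 3.0], (1.0, 0.0, 0.0))
```

The win is that every prop shares one shader and one grayscale-mask workflow, so "texturing" a new prop mostly means painting three masks instead of authoring a full texture set.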
The destroyed moon was created before the skydome; in some shots the sky occupied a lot of screen space, so I felt it needed something to tickle the eye. I chose the moon from the movie Oblivion as a reference.
I used the APEX PhysX Lab to fracture a sphere, then randomly scattered the fragments with a line of MAXScript. The texture is just a white noise for the surface and an emissive lava core in the middle.
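The MAXScript itself isn't shown, but the scattering idea is simple. A hypothetical Python equivalent (names and ranges assumed, not from the project) pushes each fragment outward from the sphere's center by a random amount:

```python
import math
import random

def scatter_fragments(centers, max_push=50.0, seed=7):
    """Offset each fracture fragment along the vector from the origin
    (the sphere's center) by a random distance, giving the moon its
    exploded, drifting-debris look."""
    rng = random.Random(seed)
    scattered = []
    for x, y, z in centers:
        length = math.sqrt(x * x + y * y + z * z) or 1.0
        # Unit direction away from the center, scaled by a random push.
        push = rng.uniform(0.0, max_push)
        scattered.append((x + x / length * push,
                          y + y / length * push,
                          z + z / length * push))
    return scattered
```

Since every offset is along the fragment's own radial direction, the shell breaks apart evenly instead of shearing to one side.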
I also did a lighting test before getting it into the project:
Setting up a custom skydome in Unreal Engine is a bit tricky if you don’t want to sacrifice a week re-creating a whole system yourself. So I chose to spend one day understanding exactly how Epic built their default sky sphere. It’s a combination of an Actor Blueprint and a Master Material that reads the directional light in the scene (if any) to get the sun orientation.
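The sun-orientation step boils down to reading the directional light's rotation and converting it to a world-space vector; here is a minimal sketch of that conversion (the exact node setup in Epic's Blueprint differs):

```python
import math

def light_forward_vector(pitch_deg, yaw_deg):
    """World-space direction a directional light shines along, from its
    pitch and yaw (roll doesn't matter for a sun disc). A sky material
    can draw the sun sprite on the opposite side of this vector."""
    p = math.radians(pitch_deg)
    y = math.radians(yaw_deg)
    return (math.cos(p) * math.cos(y),
            math.cos(p) * math.sin(y),
            math.sin(p))

# A light pitched to -90 degrees shines straight down: sun at the zenith.
down = light_forward_vector(-90.0, 0.0)
```

Driving the sun position from the light this way keeps the visible sun disc and the actual shadow direction in sync when you rotate the light.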
After that it was just a matter of finding a 360° sunset photo, tweaking it a bit in Photoshop, and adding it to the in-place sky sphere system.
I chose this picture because it had those violet/dark pink colors we find in the artwork’s background, and the sun height matched almost perfectly with our in-game directional sunlight.
The light orbs were probably the most costly asset in terms of performance. William modeled and textured the ropes, and I set up the Blueprint, composed of the following:
Adding a bit of bloom (not too much) makes them really stand out and gives them a warm feeling.
Since the movable lights in the orbs do not cast any indirect lighting, other light sources were needed to light the world.
I used several static point lights and spotlights as fill lights to create that “smooth” lighting on the main scene, along with some rim lights highlighting certain silhouettes to catch the viewer’s eye.
The main scene lighting:
And a demo of the blueprint:
In my opinion, having moving details in an environment gives it a lot more credibility, the purpose being to make the player forget that this environment was created only for him. The world must feel like an independent living entity.
In terms of composition, movement can be used to balance an image.
Take this shot for example (and let’s ignore the middle):
On the left third you have a cliff filled with intricate static forms grabbing the viewer’s attention. On the right there are only the birds, but they are moving and contrast strongly with the sky, two ingredients adding some “attention weight” to counter that left cliff and balance the image.
So that’s why I think this composition works better in motion:
The fact is that in a video game you can look everywhere for as long as you want. So, to me, here lies the crucial importance of having a lot of little things to discover, a lot of details to keep the player immersed into the world.
Relying on Photoshop
Only Photoshop was allowed for texturing in this project. And it was simply a matter of mastering the basics.
If you learn directly on Substance or Quixel tools, you might never understand a lot of the fundamental aspects of texturing.
And if you’re hired by a company not using these tools, you have to be able to achieve the same results with Photoshop.
It was kind of a pain sometimes, but I learned a lot about time-saving workflows in Photoshop, like using Layer Comps or recorded Actions.
And now that we use Substance, I feel a lot more confident because I understand exactly what’s going on under the hood.
The Biggest Lessons
This project reinforced my belief that, as a video game student, you should never hesitate to contact professionals if you need to. If they are bothered they just won’t reply, but most of the time they’re nice and willing to help, and most importantly they are a great source of experience.
I also learned that, when taking shots of your finished work, you should always think about the composition and quality of these images. Because, even if it’s not the main “product”, it’s still the first thing people are going to see.
With this kind of environment, you have to keep thinking about optimization, and it can be hard. But in the end, when you’ve got your real time environment finished, you can do whatever you want with it. For example we were able to try it in VR later and it was very satisfying.
It was easier for me to speak about the things I did, but this project really was a team effort. From the character/vehicle/camera animation to the creation of the intricate structures in the main base, a lot more work went into this project than depicted above.
When working in a team, you have to be able to achieve more than “just your part”, help each other when needed, and share knowledge, so the team is stronger than just the sum of its members’ capacities.