Bringing a complex scene into UE4

Sebastian Giannoulatos explained how he ported a huge 3D scene into Unreal Engine 4 and talked about some of the challenges he faced.

Sebastian Giannoulatos talked about his crazy visualisation of Nanyang ADM (a university building in Singapore) and how he managed to bring it successfully from Mental Ray to Unreal Engine 4. It’s a very impressive project, and the lighting was achieved with a little help from Koola.

Building the Nanyang Environment

The Nanyang ADM building is actually a very old project of mine. I did it for Mental Ray as part of my graduation project in college. Seeing how powerful this new generation of game engines is, I wanted to see if such a complex scene was possible to do in real time. The answer is yes, and a very strong no, hahaha.


The porting and set-up in Unreal took a little over a month, working on it in my spare time.

When I decided to port it over to Unreal I thought I could just re-export the models and that would be the end of it.

I hadn’t accounted for the fact that I hadn’t done any UVs for most of it, or for how inefficient I was at modeling as a student, so most of the building had to be redone from scratch. Luckily, back in 2009 I had discovered how truly small our world really is. Since there aren’t many high-resolution photos of the building on the internet, I contacted a student who went there over Facebook. He put me in touch with a photography student, and she went out and shot tons of high-resolution pictures. I am very grateful to them; without their help I couldn’t have done it from halfway around the world.


Challenges

The biggest problem with this project was that it’s very hard to work with modular pieces, and as a result 90% of the models are unique. Sure, there are only 5 or 6 different kinds of windows, but point-snapping and rotating hundreds of them manually would have created more problems than it solved. So I created all the window fronts in Maya as a single model, curved them using deformers, and then exported each wing as a unique model. The same goes for the rest of the scene: all models share the same origin, so I just have to drag them into the world to assemble the building.

Lighting and Material Production

The lighting is very straightforward. It’s a single directional light with the ‘Atmosphere Sun Light’ setting activated and a Skylight with its ‘Lower Hemisphere Is Black’ setting turned off. That was important to get the foliage to look right.
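
To make the setup concrete, here is a minimal C++ sketch of how that two-light rig maps onto engine components. The author configured this in the editor, so the class below is only an illustration under that assumption; the property names are the UE4 code equivalents of the settings he mentions.

```cpp
// Minimal UE4 actor recreating the described light rig in code:
// a directional light used as the atmosphere sun, plus a sky light
// that is not black below the horizon (important for the foliage).
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/DirectionalLightComponent.h"
#include "Components/SkyLightComponent.h"
#include "DaylightRig.generated.h"

UCLASS()
class ADaylightRig : public AActor
{
    GENERATED_BODY()

public:
    ADaylightRig()
    {
        // Directional light with 'Atmosphere Sun Light' enabled so it drives
        // the sky's sun disc and atmospheric color.
        Sun = CreateDefaultSubobject<UDirectionalLightComponent>(TEXT("Sun"));
        Sun->bUsedAsAtmosphereSunLight = true;
        RootComponent = Sun;

        // Sky light with 'Lower Hemisphere Is Black' turned off, so lighting
        // from below the horizon still contributes to the two-sided foliage.
        Sky = CreateDefaultSubobject<USkyLightComponent>(TEXT("Sky"));
        Sky->bLowerHemisphereIsBlack = false;
        Sky->SetupAttachment(RootComponent);
    }

    UPROPERTY(VisibleAnywhere)
    UDirectionalLightComponent* Sun;

    UPROPERTY(VisibleAnywhere)
    USkyLightComponent* Sky;
};
```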


Epic’s documentation helped a great deal with figuring out how to best implement the Two-Sided Foliage shading model.

When building lighting, I used modified Lightmass.ini settings that were described on the Unreal Engine forums.

All credit goes to Koola for those settings; on the flip side, the scene took 19 hours to bake lighting.

The glass was by far the most challenging part of this project. The problem is that even though screen space reflections are fantastic, they only reflect what’s in view of the camera, hence the name. I needed to be able to see the opposite wing of the building reflected in the glass even when it was out of view. So I had to resort to a very old-school way of doing things – cubemaps. The resulting reflections are relatively inaccurate because they can only be drawn from the point of view of the capturing camera, and games and visualizations used to be able to render only a small handful of them. But modern hardware and software optimisations allowed me to place one on every other pane without noticeable performance loss, and so I was able to capture fairly accurate reflections.

In the end I had to place 295 scene capture cameras, export their captures to textures, and apply them to 295 glass materials as cubemaps. While I cringe at how inefficient that is, it still runs at over 60 fps on a GTX 770.
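
A rough sketch of that capture-and-assign step is below, assuming a master glass material with a cubemap texture parameter (here called “ReflectionCube”, a made-up name). The author exported his captures to static cubemap textures; this version does the equivalent at load time with render targets, which is one way to wire the same idea up in code.

```cpp
// Sketch: pair each glass pane with a nearby cube capture and feed the
// captured cubemap into that pane's material. "ReflectionCube" is an
// assumed texture parameter name on the glass master material.
#include "Components/SceneCaptureComponentCube.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/TextureRenderTargetCube.h"
#include "Engine/World.h"
#include "Materials/MaterialInstanceDynamic.h"

void AssignPaneReflections(const TArray<UStaticMeshComponent*>& GlassPanes, UWorld* World)
{
    for (UStaticMeshComponent* Pane : GlassPanes)
    {
        // Cube capture placed at the pane, so the reflection is drawn from
        // roughly the right point of view.
        USceneCaptureComponentCube* Capture =
            NewObject<USceneCaptureComponentCube>(Pane->GetOwner());
        Capture->bCaptureEveryFrame = false;           // one-off capture, no per-frame cost
        Capture->RegisterComponentWithWorld(World);
        Capture->SetWorldLocation(Pane->GetComponentLocation());

        // Render target that receives the captured cubemap.
        UTextureRenderTargetCube* Target = NewObject<UTextureRenderTargetCube>(Capture);
        Target->Init(256, PF_FloatRGBA);               // resolution/format are illustrative
        Capture->TextureTarget = Target;
        Capture->CaptureScene();

        // Per-pane material instance: the shader stays identical and only
        // the cubemap parameter differs between the glass materials.
        UMaterialInstanceDynamic* Glass =
            UMaterialInstanceDynamic::Create(Pane->GetMaterial(0), Pane);
        Glass->SetTextureParameterValue(TEXT("ReflectionCube"), Target);
        Pane->SetMaterial(0, Glass);
    }
}
```

Keeping every pane on the same master material and varying only the cubemap is also what makes the instanced-material optimization mentioned below possible.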


The old way is still the best way when it’s paired with modern advances like glossy screen-space reflections and optimized through instanced materials that cut down on complex calculations and reliance on system memory.

It’s not perfect, of course, but it’s close enough. That’s why I say that while modern PBR engines are amazing at what they do, often producing results that mirror offline renders, a scenario such as this is ill-suited for a real-time visualization.

Sebastian Giannoulatos, 3D artist
