During SIGGRAPH 2015, Andrew Schneider, Principal FX Artist at Guerrilla Games, gave a long and very interesting talk about the creation of real-time volumetric cloudscapes in Horizon: Zero Dawn. The title was presented at E3 2015 for PS4 and immediately captured the attention of the press and gamers. The presentation Andrew used during the talk is readily available for download!
Guerrilla Games is well known for the Killzone series. FPS games usually restrict the player to a predefined track, which means developers could simply place clouds using billboards and highly detailed sky domes. This technique is still widely used in most 3D games. It lets artists build an "art-directed sky" with lots of detail: all you need is Photoshop and some artistic talent. However, the studio's new game Horizon: Zero Dawn demanded a different approach.
Horizon is an open-world game. You can explore the land and climb mountains. The developers simulated the spinning of the earth by modeling a time-of-day cycle. The weather also changes, so naturally the sky has to change dramatically. The sky also takes up a large part of the screen, so Guerrilla had to make it aesthetically pleasing. Here's what they did.
Artists explored different ways to create and light individual cloud assets. Here are some quotes from the presentation:
Our earliest successful modeling approach was to use a custom fluid solver to grow clouds. The results were nice, but this was hard for artists to control if they had not had any fluid simulation experience. Guerrilla is a game studio after all. We ended up modeling clouds from simple shapes, voxelizing them, and then running them through our fluid solver until we got a cloud-like shape. Then, we developed a lighting model that we used to pre-compute primary and secondary scattering. The result you see here is computed on the CPU in Houdini in 10 seconds.
We explored 3 ways to get these cloud assets into the game. First, we tried to treat our clouds as part of the landscape, literally modeling them as polygons from our fluid simulations and baking the lighting data using spherical harmonics. This only worked for the thick clouds and not wispy ones.
So, we thought we should try to enhance the billboard approach to support multiple orientations and times of day. We succeeded, but we found that we couldn't easily reproduce inter-cloud shadowing. We tried rendering all of our voxel clouds as one cloud set to produce skydomes that could also blend into the atmosphere over depth. It sort of worked. At this point we took a step back to evaluate what didn't work. None of the solutions made the clouds evolve over time. There was not a good way to make clouds pass overhead, and there was high memory usage and overdraw for all methods. Maybe a traditional asset-based approach was not the way to go.
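The pre-computed scattering mentioned in these quotes boils down, at its simplest, to Beer–Lambert attenuation: light entering a voxel is dimmed by the optical depth of the cloud material between it and the sun. The sketch below is a generic illustration of that idea in Python, not Guerrilla's actual precompute; the density column, step size, and extinction coefficient are all invented for the example.

```python
import math

def transmittance(densities, step=1.0, sigma=0.8):
    """Beer-Lambert attenuation of light through a run of voxels.

    densities: cloud density sampled at each voxel from the light
               toward the shading point (illustrative values).
    step:      world-space distance covered by one voxel.
    sigma:     extinction coefficient (an assumed constant here).
    """
    optical_depth = sum(d * step for d in densities)
    return math.exp(-sigma * optical_depth)

# A hypothetical column of five voxel densities between the sun
# and the point being lit:
column = [0.0, 0.2, 0.6, 0.9, 0.4]
print(round(transmittance(column), 4))  # → 0.1864
```

An empty column returns 1.0 (no attenuation), and denser or longer columns drive the result toward 0, which is why thick clouds self-shadow so strongly.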
The company decided to go with voxel clouds, although it was not such a popular idea among programmers. They went into Houdini and generated tiling 3D textures out of the simulated cloud shapes. Using Houdini's GL extensions, the team built a prototype GL shader to develop a cloud system and lighting model.
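To give a feel for what "tiling 3D texture" means in practice: a noise texture tiles seamlessly when its lattice lookups wrap around with a modulo, so sampling one period past the edge lands back at the start. The following is a generic value-noise sketch, not Guerrilla's Houdini setup; the resolution, seed, and trilinear interpolation scheme are assumptions for illustration.

```python
import random

def make_lattice(res=8, seed=0):
    """A res x res x res grid of random values (stand-in for baked noise)."""
    rng = random.Random(seed)
    return [[[rng.random() for _ in range(res)]
             for _ in range(res)] for _ in range(res)]

def lerp(a, b, t):
    return a + (b - a) * t

def sample(lat, x, y, z):
    """Trilinearly sample the lattice; modulo wrap makes it tile on all axes."""
    res = len(lat)
    xi, yi, zi = int(x) % res, int(y) % res, int(z) % res
    tx, ty, tz = x % 1.0, y % 1.0, z % 1.0

    def v(dx, dy, dz):
        return lat[(xi + dx) % res][(yi + dy) % res][(zi + dz) % res]

    # interpolate the eight surrounding lattice values
    x00 = lerp(v(0, 0, 0), v(1, 0, 0), tx)
    x10 = lerp(v(0, 1, 0), v(1, 1, 0), tx)
    x01 = lerp(v(0, 0, 1), v(1, 0, 1), tx)
    x11 = lerp(v(0, 1, 1), v(1, 1, 1), tx)
    y0 = lerp(x00, x10, ty)
    y1 = lerp(x01, x11, ty)
    return lerp(y0, y1, tz)

lat = make_lattice(res=8, seed=0)
# sampling one full period (8 units) apart gives the same value: it tiles
a = sample(lat, 0.3, 0.4, 0.5)
b = sample(lat, 8.3, 0.4, 0.5)
```

In a shipping renderer this lattice would live in a GPU 3D texture with a repeat wrap mode, so the shader gets the same tiling behavior for free.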
Finally, the team got close to mimicking the reference, but it all fell apart when they put the clouds in motion. It was also painfully slow for a game, so the team went another way. They aimed to develop good noises at lower resolutions that had the characteristics they liked, and then found a way to blend between them based on a set of rules. It's pretty complicated stuff, but the results they got are absolutely amazing. There's more about the production process in the presentation. The developers studied the clouds, analyzed their nature, and built an incredible tool to create dynamic and visually stunning clouds.
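The article doesn't spell out what those blending rules look like, but a common pattern in volumetric cloud shaders is to re-range ("remap") a low-frequency base noise against a coverage value to carve out the overall cloud shape, then erode the result with a higher-frequency detail noise. The sketch below is a hypothetical illustration of that pattern; the function names, thresholds, and the specific erosion rule are invented for the example, not taken from Guerrilla's shader.

```python
def remap(value, old_min, old_max, new_min, new_max):
    """Re-range value from [old_min, old_max] into [new_min, new_max]."""
    t = (value - old_min) / (old_max - old_min)
    return new_min + t * (new_max - new_min)

def clamp01(x):
    return max(0.0, min(1.0, x))

def cloud_density(base, detail, coverage):
    """Assumed rule set: coverage carves the base shape, detail erodes edges.

    base:     low-frequency noise sample in [0, 1]
    detail:   high-frequency noise sample in [0, 1]
    coverage: fraction of sky the clouds should occupy (illustrative)
    """
    # only base noise above the coverage threshold survives as cloud
    shape = clamp01(remap(base, 1.0 - coverage, 1.0, 0.0, 1.0))
    # detail noise eats away at the soft edges of the shape
    eroded = clamp01(remap(shape, detail * 0.5, 1.0, 0.0, 1.0))
    return eroded

# a strong base sample survives erosion; a weak one is carved away entirely
dense = cloud_density(base=0.9, detail=0.2, coverage=0.5)
empty = cloud_density(base=0.3, detail=0.5, coverage=0.5)
```

Because every input is just a noise sample, animating the clouds reduces to scrolling the sample coordinates over time, which is far cheaper than re-simulating or re-baking assets.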