4 Technologies To Change Art Production
6 April, 2017
Report
During GDC 2017 at the Art Direction Bootcamp, Andrew Maximov, Lead Technical Artist at Naughty Dog, gave a talk on the future of art production for video games. It was a prognosis, detailing four of the top technological advancements that are going to drastically change our approach to game development.

These changes cause a lot of fear in the artistic community. I know this because I interact with that community every day. The introduction of 3D scanning, simulation, and procedural generation is changing the way we approach game development. These new elements may remove the need for parts of the production pipeline that have been standard practice for ages. Megascans and SpeedTree are already taking work away from foliage artists. Environment artists can now scan entire buildings in a day. And developers used Houdini to build an entire town in Ghost Recon: Wildlands. Technology is changing the way we live. It's changing the way we work. And if you really want to freak out about it, read Nick Bostrom's book Superintelligence. But we're not about to dive into deep philosophical discussions; we'll leave that to Elon Musk.

Instead, we'll discuss Andrew Maximov's informative talk, adding our own commentary along the way and naming a handful of companies that are already influencing the way production works today. In doing so, we hope to ease the community's fear of the future and show that there is a light at the end of the tunnel for those working in the video game industry.

Optimization Automation

Optimization is a common struggle for game developers. Game artists have been grappling with technical restrictions for ages. Back in the NES days, color itself was a technical resource: it had to be carefully managed because older hardware could only display a handful of colors on screen at one time. If you want to see what a game artist's tools looked like back then, take a look at the Sega Digitizer System.
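To get a feel for that constraint, here is a minimal sketch (assuming Pillow is installed and a hypothetical local texture.png exists) that quantizes a modern full-color texture down to a four-color palette, roughly the per-sprite budget an NES-era artist had to manage by hand:

```python
from PIL import Image  # pip install Pillow

# Load a modern, full-color texture (hypothetical local file).
img = Image.open("texture.png").convert("RGB")

# Reduce it to a 4-color palette, roughly the per-sprite budget
# an NES-era artist had to plan around by hand.
quantized = img.quantize(colors=4)

# Inspect the 4 RGB triples the quantizer chose.
palette = quantized.getpalette()[: 4 * 3]
print([tuple(palette[i:i + 3]) for i in range(0, len(palette), 3)])

quantized.convert("RGB").save("texture_4color.png")
```

Today the quantizer picks those colors in milliseconds; back then, choosing and reusing them was an artistic discipline in its own right.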

Plenty of compromises had to be made back when these technical restrictions were prevalent throughout the industry. Today, color is no longer a technical issue. But this raises a question: what other aspects of our game production pipeline will be optimized away in the future?

Many aspects of the game production pipeline look to be on their way out: manual low-to-high-poly workflows, UV unwrapping, LODs, and collision meshes. In the future, games will simply display anything and everything developers want to put on screen.

Many of those tasks are already being automated today. Developers are automating level-of-detail (LOD) generation and improving automatic UV unwrapping, and the more this happens, the faster those chunks of the pipeline will become obsolete. Frankly, we believe this is a beneficial trend, because these processes carry very little artistic value.
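As an illustration, here is a minimal sketch of automated LOD generation using Blender's built-in Python API (bpy); the object name and decimation ratios are our own assumptions for illustration, not values from Maximov's talk:

```python
# Run inside Blender's Python console or as a Blender script.
import bpy

source = bpy.data.objects["Hero_Mesh"]  # hypothetical source asset
lod_ratios = [0.5, 0.25, 0.1]           # triangle budget per LOD level

for level, ratio in enumerate(lod_ratios, start=1):
    # Duplicate the source mesh so the original stays untouched.
    lod = source.copy()
    lod.data = source.data.copy()
    lod.name = f"{source.name}_LOD{level}"
    bpy.context.collection.objects.link(lod)

    # A Decimate modifier performs the polygon reduction that used
    # to be a manual retopology pass.
    mod = lod.modifiers.new(name="AutoLOD", type='DECIMATE')
    mod.ratio = ratio
```

A tools programmer can wrap a script like this into a one-click exporter, which is exactly the kind of busywork the talk predicts will disappear from artists' plates.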

If you’re interested in learning more about the ways technology changes the game production pipeline, feel free to listen to lectures by Michael Pavlovich and Martin Thorzen. These technical artists can teach you plenty about the way tools make game production easier.

Capturing Reality

Capturing reality is nothing new in the world of video games (remember the original Prince of Persia?) but has become a bit controversial within the industry.

Back in 1986, Jordan Mechner, the creator of Prince of Persia, and his brother went outside to snag something other than fresh air. Mechner filmed his brother running around a parking lot with a video camera, then rotoscoped the footage frame by frame to paint what he had captured into the game.

Thus, the concept behind all those up-and-coming scanning techniques is something the industry has been familiar with for quite some time.

Max Payne (2001) used facial scanning techniques and Sam Lake's face as the model for the titular character, with amazing results. Today, these scans can be applied to a character's entire body; that's how Norman Reedus and Guillermo del Toro ended up in Death Stranding!

DICE was one of the first major companies to regularly use photogrammetry on large-scale productions, cutting down development time and the overall cost of game asset creation. Battlefield 1 and Star Wars Battlefront were produced mainly with photo-scanning techniques, a process the company covers extensively on its official blog. Kenneth Brown and Andrew Hamilton's GDC 2016 talk also highlights the influence and importance of photogrammetry: by turning to it, they managed to cut the production time of Star Wars Battlefront in half (and even more with automation!).
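For a sense of where a photogrammetry pipeline starts, here is a minimal sketch (assuming OpenCV and two hypothetical overlapping photos of the same object) of the feature-matching step that photo-scanning tools perform before reconstructing any geometry:

```python
# pip install opencv-python
import cv2

# Two overlapping photos of the same object (placeholder filenames).
img_a = cv2.imread("scan_photo_01.jpg", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("scan_photo_02.jpg", cv2.IMREAD_GRAYSCALE)

# Detect scale-invariant keypoints and descriptors in each photo.
sift = cv2.SIFT_create()
kp_a, des_a = sift.detectAndCompute(img_a, None)
kp_b, des_b = sift.detectAndCompute(img_b, None)

# Match descriptors, keeping only unambiguous matches (Lowe's ratio test).
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des_a, des_b, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

print(f"{len(good)} reliable point correspondences found")
# A structure-from-motion solver would triangulate correspondences like
# these across dozens of photos into the dense point cloud that an
# artist then cleans up into a game-ready asset.
```

Commercial tools chain this step with camera-pose estimation, dense reconstruction, and texture baking, which is why a scanned building can replace days of hand modeling.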