Unity’s opening keynote at GDC 2018 was all about great things for 3D artists. The company talked about next-level rendering and advanced materials, discussed the upcoming GPU-based progressive lightmapper (which is a huge thing), revealed an art challenge by Will Wright, and more. Missed the event? Check out the recording of the keynote and study the recap from the Unity team below.
Here is a small piece about rendering and ML to give you an idea about the future of Unity:
Next Level Rendering
Silvia Rasheva, Producer of Unity’s Demo team, joined us on stage to introduce the team behind Unity’s flagship demos such as The Blacksmith (2015), Adam (2016), Neon (2017), and Book of the Dead (2018). The Demo team drives advanced usage of the Unity engine and works closely with Unity’s R&D team to push the envelope of what is possible to achieve with our technology.
Natalya Tatarchuk, Director of Graphics, and Lucas Meijer, Technical Director, followed Silvia on stage to demo Unity’s new Scriptable Render Pipeline (SRP), which is configurable, lean, and user-centric. It comes with two options: the High Definition Render Pipeline (HD RP), which prioritizes stunning, high-fidelity visuals with performant results on GPU-compute-capable consoles and PC hardware, and the Lightweight Render Pipeline (LW RP), which is optimized for high performance in demanding applications such as XR and on platforms such as mobile.
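The HD RP vs. LW RP split above is essentially a fidelity/performance tradeoff chosen per target platform. As a rough, conceptual illustration (this is not Unity's API, where the active pipeline is assigned as a render pipeline asset in the project's Graphics settings), the decision could be sketched like this:

```python
# Conceptual sketch (not Unity's API): pick a render pipeline name based
# on hypothetical device capabilities, mirroring the HD RP / LW RP split.

def choose_pipeline(supports_gpu_compute: bool, is_mobile_or_xr: bool) -> str:
    """Favor performance on constrained hardware, fidelity elsewhere."""
    if is_mobile_or_xr:
        return "LW RP"   # constrained or latency-sensitive: favor performance
    if supports_gpu_compute:
        return "HD RP"   # compute-capable console/PC GPU: favor fidelity
    return "LW RP"       # safe default for everything else

print(choose_pipeline(supports_gpu_compute=True, is_mobile_or_xr=False))  # HD RP
```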
Natalya and Lucas dove into how the small team behind Book of the Dead used the HD RP to achieve a high-quality production.
The demo takes advantage of photogrammetry assets – some created by the Unity Demo team and some coming from Quixel. We are proud to announce that the Quixel Megascans Library is coming to the Unity Asset Store.
2018.1 will also introduce Templates, which are project starters with default settings already tuned, including a sample scene. Here are some samples of the HD RP and LW RP templates:
To address the community's incredulity at the quality demonstrated by Book of the Dead, the project was shown running on a PlayStation 4 Pro.
Natalya and Lucas also gave us a sneak peek at Unity’s upcoming GPU-based progressive lightmapper, which gives artists instant feedback while tuning lights and bakes at roughly 10x the speed.
Next, Mike Wuetherick, Producer on the Made with Unity Team, and Adam Myhill, Head of Cinematics, took us through the improved artist creation workflow. With Unity 2018.1 we’re adding Cinemachine Storyboard, a new feature designed to aid you when roughing out your shots with Cinemachine and Timeline. It lets you quickly establish the feeling you’re going for with grey-box levels and scenarios. In conjunction with our announcement earlier this month of bringing ProBuilder into Unity, the worldbuilding process is faster than ever.
Here is a quick summary of the improved artist workflow we showed:
Next, Danny Lange, our VP of AI and Machine Learning, took to the stage to highlight our commitment to democratize machine learning. We’re committed to lowering the barriers to entry so that you can make machine learning an integral part of your game development. You no longer need to program every solution, every NPC, and every permutation of how a person may interact with your game – you can focus on making systems learn.
Danny introduced our latest release, ML-Agents 0.3, which brings many new features, including Imitation Learning. Imitation Learning lets your system learn from real people playing your games, and can be trained to adjust to your players. The agent, in this case, the NPC being trained, does not play perfectly, like a robot, but rather imperfectly, like a player – and all of this training happens in real-time.
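The core idea of Imitation Learning, learning a policy from recorded human play rather than hand-programming it, can be sketched in a few lines. This is a conceptual behavioral-cloning illustration, not the ML-Agents 0.3 API: the "NPC" simply returns the recorded action whose observation is most similar to the current one.

```python
# Conceptual sketch of imitation learning (behavioral cloning), NOT the
# ML-Agents 0.3 API: an NPC imitates recorded (observation, action) pairs
# from a human player by looking up the most similar past observation.

class ImitationPolicy:
    def __init__(self):
        self.demos = []  # list of (observation, action) pairs

    def record(self, observation, action):
        """Store one step of human play as a demonstration."""
        self.demos.append((observation, action))

    def act(self, observation):
        """Imitate: return the action of the closest recorded observation."""
        if not self.demos:
            return None
        def dist(demo):
            obs, _ = demo
            return sum((a - b) ** 2 for a, b in zip(obs, observation))
        _, action = min(self.demos, key=dist)
        return action

policy = ImitationPolicy()
policy.record((0.0, 1.0), "jump")   # human jumped near a ledge
policy.record((5.0, 0.0), "run")    # human ran on flat ground
print(policy.act((0.2, 0.9)))       # closest to the ledge demo -> "jump"
```

Because the policy copies human demonstrations rather than optimizing a reward, its behavior is imperfect in the same ways a player's is, which is exactly the quality the keynote highlighted.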
Machine Learning insights allow us to build tools that optimize your games for retention and engagement. App performance is critical to these factors – more than 50% of 1-star reviews in the Google Play store mention performance, making it one of the most important problems we can solve. We want your games to be accessible to all devices without you having to sacrifice graphics or effects – which is why we built LiveTune. LiveTune tailors your game for every device in real-time. It adjusts assets, effects and rendering on each phone model, thus providing the best possible experience for any player on any device.
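The behavior described for LiveTune, tuning quality per device so performance stays acceptable, amounts to a feedback loop on measured frame time. As a hedged, conceptual sketch (not Unity's implementation; the tier names and thresholds here are invented for illustration):

```python
# Conceptual sketch in the spirit of LiveTune (NOT Unity's implementation):
# step the quality tier down when frame time is over budget, and up when
# there is clear headroom. Tier names and thresholds are hypothetical.

QUALITY_TIERS = ["low", "medium", "high", "ultra"]

def adjust_quality(current_tier: str, frame_time_ms: float,
                   target_ms: float = 16.7) -> str:
    """Adjust the quality tier based on one frame-time sample."""
    i = QUALITY_TIERS.index(current_tier)
    if frame_time_ms > target_ms * 1.1 and i > 0:
        return QUALITY_TIERS[i - 1]   # over budget: drop a tier
    if frame_time_ms < target_ms * 0.8 and i < len(QUALITY_TIERS) - 1:
        return QUALITY_TIERS[i + 1]   # lots of headroom: raise a tier
    return current_tier               # within budget: keep settings

print(adjust_quality("high", 25.0))  # over budget -> "medium"
print(adjust_quality("high", 10.0))  # headroom -> "ultra"
```

A real system would smooth over many frames and adjust individual assets and effects rather than a single global tier, but the feedback principle is the same.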
We want to help you reach not just cohorts, but individual players, with the content and in the context that is most relevant for them. The first step we’re taking towards this is IAP Promo. IAP Promo surfaces the best possible in-app promotion to each player based on their game behavior and likelihood to engage. For more information on IAP Promo, and to get started with it, take a peek at this blog.
Danny closed by recognizing that each person has a combination of hardware, software, skills, and interests that create millions of options. We want to give you the tools that make your game accessible to everyone and deeply engaging to every person who plays.
Make sure to study the full recap here.