
Making Retrofuturistic Animation in 3ds Max, Maya & Marmoset Toolbag

Shada Harb talked about the work process behind the 2Q84:LoneStar project, explained its idea, and shared how the character was animated.

Introduction

My name is Shada Harb. I am a 3D Artist from Beirut, Lebanon. I received my Bachelor of Arts in 3D Game Art & Animation from SAE Institute Berlin in 2021. Upon graduating, I left my work position at SAE Institute as Game Art Department Assistant and moved to Brooklyn, New York in August 2021.

Hailing from Beirut, where an economic crisis and corruption are ravaging the country, I have taken an unconventional trajectory in my career and have yet to officially work in the 3D art field. Instead, I have decided to take some time to explore a new life and find myself in my craft while enjoying various jobs such as bartending and now working as an optician at a French eyewear brand, Anne & Valentin.

The 2Q84:LoneStar Project

The animation was created as the final project for my bachelor's degree. The core aim of the project was to hone previously attained skills, such as Environment Art and Clothing Creation, while simultaneously adapting to and learning the skills required for the creation of a 3D animation – the latter being an art form I had not explored.

The animation concept is of a non-binary character stuck in a cycle of repetition. It touches on the allure of playing a video game and how that can be a form of escapism while also twisting that concept into something dark and absurd. It is both a relatable and terrifying concept, one that many “gamers” have potentially experienced: the relief that comes from escaping into an immersive game or the realms of virtual reality, and the threat that such an escape can also pose. The decision to have a non-binary character was rather natural as gender and sex do not impact the situation posed in the concept.

The time setting of the scene is as if the 1980s were occurring in the future and in a parallel world. Most elements are from the 80s, while the core machine elements are futuristic, serving as a contrast. I was definitely inspired by the aesthetics of the 80s and the technological possibilities that a futuristic/cyberpunk world could bring.

Without a doubt, most aspects of the story itself were inspired by my own attempts to explore the state of mind I was in during that time and the struggle I was experiencing, partially from not knowing how to be in touch with the reality of the state of things: being separated from home and family while my country was destroyed without my being able to say goodbye.

Initially, I planned to be responsible for the entire animation creation from scratch, not including the audio and sound effects of course. However, seeing as I had limited time for completion and a massive load of tasks including ones I had never attempted before, I enlisted the help of Anastasiia Holumbovska – SAE Institute Berlin’s head instructor of Game Art at that time – as she had experience with 2D Art, rigging and animating, motion capture and Maya. 

I was responsible for the overall creative/art direction of the animation, the storyboard and storytelling, the environment art and props, character design, clothing creation, camera setup, rendering, credits scene, and editing. 

The tools I used throughout the project were 3ds Max, Maya, Marmoset Toolbag, Substance 3D Painter and Designer, Marvelous Designer, ZBrush, Premiere Pro, and After Effects.

Environment & Assets

All main assets were created manually. The room started as a blockout scene in 3ds Max before being transferred to Maya for scene implementation. As a 3D modeler, I am most efficient in 3ds Max. Every main asset was created separately and taken through the entire prop art pipeline individually in order to ensure that they were created to perfection.

I prioritized quality over quantity, as I could have modeled an endless number of assets to add to the scene with no guarantee that there would be enough time to texture every one of them. Some minor assets, such as trash elements, were downloaded for free as ready-made assets. Others, such as the trash bags and the table lamp, were created by fellow 3D Artists at SAE Institute Berlin.

My primary “hero assets” consisted of the room structure itself, the television, the game console, the radio, the machines, the headset, and the character's clothing. Each of these was created entirely from scratch down to the details, including the “LoneStar” logos. 

TV 3D Model – 3ds Max

(NES) Console 3D Model – 3ds Max

(NES) Controller 3D Model – 3ds Max

Machine 3D Model – 3ds Max

Headset 3D Model – 3ds Max

I began with low poly modeling in 3ds Max followed by a high poly model. These were then baked in Marmoset Toolbag and taken to Substance 3D Painter for texturing. Some materials – such as the painted walls or specific logos – were created in Substance 3D Designer and then imported into Painter.

TV Texturing Render Test – Substance 3D Painter

TV Texture Maps

(NES) Console Texturing Render Test – Marmoset Toolbag

Console Texture Maps

Radio Texturing Render Test – Substance 3D Painter

Headset Texturing Render Test – Substance 3D Painter

Machine Texturing Render Test – Substance 3D Painter

Characters, Clothes & Animations

The character was created from a base mesh that Anastasiia had used in a prior project. It was then re-sculpted/adapted by Anastasiia and detailed according to the image I had of what the character would look like. Luckily for us, that mesh already had a retopologized version and we could play with the subdivision levels in ZBrush easily.

The jacket and shorts were created entirely in Marvelous Designer. It’s absolutely one of my favorite programs and one of my favorite things to do. It was definitely complex as I had a very detailed design in mind for that kind of jacket, and trying to have all these layers for the cuffs and zippers while also maintaining the desirable folds was challenging.

After sewing all my patterns and being satisfied with the overall structure, I went over the clothes in ZBrush and detailed them further. Some folds were refined and exaggerated and others were added. The zipper and stitches were also done in ZBrush with the help of ready-made brushes. The socks were then sculpted in ZBrush by extracting a mesh from the feet and then molding and refining that to have the look I was going for. Afterwards, decimated models of the clothes were imported into Maya so I could retopologize them manually. 

Clothing Creation – Marvelous Designer

Socks 3D Sculpt – ZBrush

Socks Retopology – Maya

Character/Clothes Render Test – Maya

Character/Clothes Render Test – Maya

Character/Clothes Texturing Render Test – Maya

When it came to the character animations, the process was definitely improvised. The original plan was to do motion capture, and we had a mo-cap suit at SAE that we planned to utilize. We captured the mo-cap with me as the actor, as I had a particular idea for the character's movements. Unfortunately, we realized afterwards that the suit was faulty and every recording was basically unusable. That is what led to the need for rigging and animating the character from scratch, and thankfully, Anastasiia had some prior experience with that.

We ended up filming my movements to the song that was used in the video, Sunshower Vox, so she had a very specific blueprint for the animation. These little struggles led to chunks of the animation not being quite as polished as I would have liked, but it was a big opportunity to learn how to let go and improvise instead.

The TV screen texture animations were fairly simple as I had most of the content downloaded from different sources. The gameplay video was extracted from the game 198X by Hi-Bit Studios. The game over screen effect was created using After Effects. Then, the entire sequence was exported as PNGs to be loaded onto the TV screen material in Maya as an image sequence. Other minor animations within the scene, such as the radio, were also easy thanks to Maya’s intuitive keyframing. The credits scene was created entirely in Marmoset Toolbag with the help of its effortless keyframing controls for asset visibility and emissive map intensity.
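For anyone curious about the mechanics behind that setup, here is a minimal Python (maya.cmds) sketch of how a PNG sequence can be hooked up to a screen material in Maya – the node, shader, and file names below are placeholders, not the ones used in the actual scene:

import maya.cmds as cmds

# File texture node that reads the exported frames, e.g. tv_screen.0001.png
# (file path and node names here are illustrative placeholders)
file_node = cmds.shadingNode('file', asTexture=True, name='tvScreenSequence')
cmds.setAttr(file_node + '.fileTextureName', 'sourceimages/tv_screen.0001.png', type='string')

# "Use Image Sequence": Maya swaps the frame number in the file name per frame
cmds.setAttr(file_node + '.useFrameExtension', 1)

# Drive the sequence from the timeline, as the Attribute Editor checkbox would
cmds.expression(s='{0}.frameExtension = frame;'.format(file_node))

# Feed the animated texture into the TV screen shader (shader name assumed)
cmds.connectAttr(file_node + '.outColor', 'tvScreenShader.color', force=True)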

TV Screen Animation Setup – Maya

Radio Key Frame Animation – Maya

Credits Scene Animation – Marmoset Toolbag

The Camera Work

Roughly half of the camera shots were determined early on during the pre-production phase. I made a storyboard blueprint that I could follow throughout the production. Since 2D art is not my area of expertise, I have never been able to easily draw out the images in my head. Instead, I created the 3D blockout of the room in 3ds Max with a dummy in place of the character.

I placed cameras throughout the scene and captured some shots, which I then brought into Photoshop and drew over in a very scrappy manner. Many of these shots and angles ended up being used when actually filming and rendering the animation, such as the shots of the radio and the tapping of buttons on the controller.

3D Blockout – 3ds Max

Storyboard Part 1 – Photoshop

Of course, after the scene was entirely implemented in Maya, I had newfound creative freedom to go off the blueprint and experiment with different shots. I did my research on different shot types and how they are used to convey a story. I knew I needed some establishing shots where one could try to understand the setting of the scene and the state of the room (why is there a dead plant? How much time has passed with this character stuck in their headset?).

I considered how I could show that there is an absurd loop happening here without being forceful about it – a loop that was definitely inspired by Samuel Beckett’s “Play”. That is how I came to create those simple beginning and end shots of the radio and the loading screen. A Dutch angle was also used on the character when their eye twitches after being electrocuted, to emphasize the strange and disorienting absurdity of that moment.

The entirety of the camera work was both exciting and challenging, as I knew I had a limited set of shots I could do and thus needed to convey the story as efficiently as possible to avoid an insane render time. A lot of shots had to be redone after rendering because I wasn’t content with the overall look or purpose of the shot, so there was definitely some back-and-forth tweaking all around. But at some point, I had to call it and be content with what I had in order to give the much-needed time for all the camera shots to render in high quality in Arnold for Maya.

Conclusion

Without including the pre-production phase, the project took about 6 months to finish: from September 2020 to March 2021. This was done as I was writing my BA thesis and working at SAE Institute. The pre-production phase alone took about a month of me doing my research on methods and techniques as well as gathering a large number of references for every aspect of the project. And to be honest, the project needed even more time, but I had to make do with the deadline I had.

Apart from the big inconvenience of having unusable motion capture data, one of the challenges was having an extremely heavy scene to work with on mediocre computers/laptops. For example, I wanted the wires of the machine and the headset to have a very fluid movement that followed the character naturally. We attempted to animate the wires with nCloth in Maya, but Maya would consistently crash or slow down whenever they would be tweaked. So there were some chunks of the wire animation that had to be done manually as well.

Wires nCloth – Maya

Setting up the lighting was also slightly challenging. Personally, I did not find it as intuitive as setting up lights in Unreal Engine or Marmoset Toolbag. The two main sources of light were to be the window and the TV. Originally, I wanted a pink light from the TV and a cold blue in the atmosphere. However, while attempting to execute this, I realized it was not working as I had imagined. It required a lot of refining and trial and error to get results I could work with.

The rendering phase was one of the most intense. I had around twenty-six camera shots, each ranging around 200 frames, give or take, and one frame alone took roughly thirty minutes to render. Thus, I set up a render farm on the computers at SAE. However, this did not go as smoothly as I hoped. For some reason, some of the computers would shut down mid-render and I would have to set them up again. Some renders had inexplicable errors, such as rendering only portions of a frame or the TV screen suddenly turning white. Therefore, the rendering phase took far longer than anticipated and called for its own improvisation.
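As an illustration of how such per-shot batch renders can be scripted, here is a rough Python sketch that loops over the shots and calls Maya's command-line renderer with Arnold – the scene path, camera names, and frame ranges are made-up examples rather than the project's actual values, and it assumes the Render executable is on the system path:

import subprocess

SCENE = 'scenes/2Q84_loneStar.mb'   # hypothetical scene path
OUTPUT_ROOT = 'renders'

# (camera, start frame, end frame) per shot – example values only
shots = [
    ('shot01_radioCam', 1, 180),
    ('shot02_tvCam', 1, 220),
    ('shot03_dutchAngleCam', 1, 150),
]

for cam, start, end in shots:
    # Maya's "Render" executable: -r selects the renderer (Arnold here),
    # -s/-e set the frame range, -cam picks the camera, -rd the output folder.
    cmd = [
        'Render', '-r', 'arnold',
        '-s', str(start), '-e', str(end),
        '-cam', cam,
        '-rd', '{0}/{1}'.format(OUTPUT_ROOT, cam),
        SCENE,
    ]
    print('Rendering ' + cam)
    subprocess.run(cmd, check=True)  # stop on a failed render so it can be restarted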

Render Error – Arnold for Maya

Personally, the challenges experienced while creating the animation almost made me feel one with the character: in constant pursuit of something which, with every mistake, causes immense deterioration. That deterioration refers to the project itself: all throughout the production, every misstep or error encountered made me feel as if the vision I had in my mind would not be fully accomplished. However, given that I prefer imagining an ambitious project and doing my best at it, I learned how to let go and accept that my vision is tied to the best I can give.

Overall, it was an intense learning experience. Storytelling has always been a passion of mine, regardless of the art medium (a big joy in my life has been my previous experience of acting in theatre). Storytelling through cameras was a first, and it was definitely illuminating as it allowed me to explore hidden areas within the craft. Even better, I experienced the thrilling taste of directing a personal production while working with other incredible artists – such as Rafael Nicolau, who created all the sound effects – which was by far the most rewarding experience.

The project also revealed the natural inclination I have for creating 3D prop art while reaffirming my desire to create profound and innovative video games and tell stories that do not shy away from the deepest (and sometimes darkest) parts of ourselves and of the world we live in.

To me, ultimately, 2Q84:LoneStar was about diving into the internal world of the character and posing questions such as: what if someone's subjective survival method is to escape from reality? What if the reward system within that escape includes punishment? – a nod to the judgment we impose on ourselves when we attempt self-care in any shape or form. To what extent do we have free will? What if we had no choice in being stuck? What would that make of us? The LoneStar machine was the catalyst for these questions.

Shada Harb, 3D Artist

Interview conducted by Arti Burton
