Alex Dracott and his colleagues talked about the production of the epic Cyber Bow for Oculus: modeling, texturing, VFX, arrow effect assembly, timing, and more!
Cyber Bow: Idea*
*For the design and idea behind the Bow, the answer comes from the project’s designer Richard Weschler.
The Cyber Bow was created for the Core 2.0 update on the Oculus platform. Released in beta at the end of last year, this update overhauled the user's virtual reality home, allowing users to customize their virtual home space with themed furniture, wallpapers, interactive props, and skyboxes. The goal of this update was to give users a sense of self-expression as well as deliver on the fantasy of having your very own virtual home space.
The Cyber Bow is a part of our Sci-Fi collection of themed objects and is one of our highlight objects for users to earn. We already had a more traditional bow and arrows under a different theme so we thought it would be fun to really run with the idea of how a futuristic bow and arrow could look and function.
Modeling & Texturing*
*For the modeling and texturing of the Bow I’ll leave the answers to Jacob Stone, a super talented hard surface artist I worked with at the time.
The Cyber Bow was actually the third bow done on the Home 2.0 project. Initially, I created a much more ornamental bow (the Lion's Bow). As I was working on the Lion's Bow, I quickly realized that it looked overly ornate for something that was supposed to be a common item for players to enjoy. Thus the Commoner's Bow was created (a much simpler version). After both of those bows were made, and given that we already knew we were going to have a set of sci-fi-themed items, I wanted a sci-fi bow to go along with them.
Lion’s and Commoner’s Bows:
After receiving a concept from one of our concept artists (Gabo Garza), which used the dimensions of the other two bows as a template, I was able to jump on it. I primarily use Maya as my modeling package, and as soon as I had the major forms modeled, I quickly got it into Unreal to start checking the proportions and general feel of the bow. The next step was getting the concept artist to check it out in VR and help me better understand some of the shapes that weren't working well yet. I ended up having some liberty with the shapes toward the final stages of modeling the high poly, doing things like adding construction lines, screws, knobs to turn, buttons, and so on. Once the high poly was done, I built out the low poly, and given the nature of the project, I couldn't go overboard on the tri-count, so we tried to keep it around 10k triangles.
As for texturing, Substance Painter was my number one go-to program on Home 2.0. After getting my bakes looking right, it was time to start looking at material breakups and construction materials. I've found that normal maps don't work as well in VR, so geometry and materials are your best friends when it comes to believability. Where the other two bows went with three main materials, this one had paint, coated metal, bare metal, and a small amount of rubber on the handle grip, all of which allow a good range of roughness and metallic values. Originally I wanted the bow to be a bit cleaner-looking, but after working out the textures I realized I wanted it to look like it had been heavily used and had taken a beating. Using smart materials as a starting point and then hand-editing the layers with masks, I was able to achieve this look by chipping away at the paint and coated metal to expose the bare metal.
The arrow effect itself has two parts. Let's call the main effect that happens as you draw the bow the pull effect, and the effect that plays as you fire it, the release. For the radial elements in the pull effect, everything you see is actually one single static mesh. The radial elements (the rings, etc.) were all modeled in Maya. That mesh was then imported into the blueprint that made up the arrow itself. This was done so that whatever effects you had while pulling back the string could travel with the arrow as it flew, as opposed to just disappearing with the release effect. It gave you a bit of agency over what your cyber arrow looked like!
The choice to make it a mesh was made for a few different reasons. The first and most important was that the effect was made for VR. Flat cards and large sprites just don't hold up. This is compounded by the fact that the bow is held in both hands, which makes it incredibly easy to manipulate, so whatever I made needed to hold up from any angle you could use it at (aiming down the sight or just playing around with it in front of you). The second reason was to give me absolute control over the visuals without needing to create a bunch of custom particle systems, which would increase the object's draw calls. That's extra important in VR, again because of performance.
When it came to preparing for the development of the effect, I remember having a conversation with Richard where we came up with the idea of one-to-one mapping the pull of the bow to a value in a blueprint. The second we had that hooked up, I tested it with color and knew the tactile feedback of having control over something would be really impactful. That, plus being a long-time fan of the UI elements in the recent Iron Man movies, definitely sent me down the direction that led to the final effect!
The behavior seen in the pull effect comes from two parts in a single shader, with the vertex animations handled procedurally in Unreal. The first part is the ambient behavior (the elements rotating) and the second is an animation whose timeline is driven by a direct 0-to-1 mapping of the bowstring pull. With the bow at rest that value is 0, and at full pull it goes to 1. That value then gets piped directly into the shader to drive that part of the animation.
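As an illustrative sketch (the function and parameter names here are my assumptions, not actual project code), the blueprint-side mapping boils down to normalizing the draw distance and clamping it:

```python
# Sketch of the blueprint-side pull mapping (names are illustrative
# assumptions, not actual Oculus Home code).

def pull_param(draw_distance, max_draw):
    """Map the bowstring's draw distance onto the 0-1 value fed to the shader."""
    t = draw_distance / max_draw
    return max(0.0, min(1.0, t))  # 0 = bow at rest, 1 = full pull
```

That single scalar is then set as a material parameter every frame, so the shader animation tracks the player's hand exactly.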
I'll cover the rotation at the end, but to discuss the pull effect, it's worth looking at the behavior as separate elements, each of which can be examined individually. Each element has a separate vertex color applied, which is used differently in the animation calculation. I'll explain in more detail, but generally speaking, here are the vertex weightings applied to the mesh:
- Red Channel – Timing Offset Weighting
- Green Channel – Rotation Speed
- Blue Channel – Slide vs Scale animation decider
- Alpha Channel – Slide vs Scale multiplier
Starting with the timing: each "piece" of the mesh activates at a different time. This was done with an offset driven by the weighting in the red vertex color channel. A value of 0 would start a piece early, and 1 would start it at the end. This can be visualized below. It's worth mentioning that this activation state also drove the opacity of each piece fading in. Two for one!
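A minimal sketch of how such a red-channel stagger can work (the window width and exact remapping are my assumptions, not the project's numbers):

```python
def piece_activation(t, red, window=0.3):
    """Per-piece activation from the global pull timeline t (0-1).

    red = 0 pieces start first, red = 1 pieces start last; each piece
    animates over `window` of the timeline. The same value doubles as
    the piece's fade-in opacity.
    """
    start = red * (1.0 - window)       # stagger start times across the timeline
    local = (t - start) / window       # this piece's own 0-1 progress
    return max(0.0, min(1.0, local))
```

In the material, this is just a remap-and-saturate on the pull parameter, with the red channel supplying the per-vertex offset.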
The next element of the effect comes from the pieces that slide in or scale into their final position. This was actually an A or B situation which I used the blue vertex channel for. If the value was greater than .5, it slid in. If it was less than .5, then it scaled in. Finally, I scaled how much it slid in or scaled in by a multiplier for variation purposes (alpha vertex color channel).
The scaling behavior was done by using the timing value to lerp between a starting and ending scale. The scales were driven by multiplying the original position along the YZ vector of the arrow in local space (assuming X was local forward). The sliding similarly lerped between a starting and ending position by moving the verts along an offset in the arrow's forward direction. Since the object was dynamically moved in the environment, the orientation of the mesh was passed forward dynamically from the blueprint (though now I know this could be done by transforming vectors from world to local space in the shader).
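Here's a rough sketch of that per-vertex logic in the arrow's local space (assuming X is forward; the constants and helper names are illustrative assumptions, not project values):

```python
SLIDE_DIST = 1.0    # how far "slide" pieces start behind their rest position
START_SCALE = 2.0   # starting radial scale for "scale" pieces

def lerp(a, b, t):
    return a + (b - a) * t

def animate_vertex(pos, t, blue, alpha):
    """Move one vertex toward its authored rest position `pos`.

    t: this piece's activation progress (0-1).
    blue > 0.5 -> slide in along the arrow's forward (X) axis;
    blue <= 0.5 -> scale in radially (local YZ).
    alpha multiplies the starting offset/scale for variation.
    """
    x, y, z = pos
    if blue > 0.5:
        offset = lerp(SLIDE_DIST * alpha, 0.0, t)   # slides to zero offset
        return (x - offset, y, z)
    s = lerp(START_SCALE * alpha, 1.0, t)           # shrinks to rest scale
    return (x, y * s, z * s)
```

In the material, the same math lives in the world position offset, with the blue/alpha values read straight from the vertex color node.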
Finally, the last part: the ambient rotation was weighted by the green vertex channel. 0–0.5 set the speed in one rotational direction, while 0.51–1 set the speed in the opposite direction. The actual rotation was done with the "Rotate About Axis" node, with the arrow's forward vector used as the rotation axis. Add all of that up and you get the final effect!
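A sketch of how that green-channel encoding can be expressed (the remap and speed constant are my assumptions; in the material this is handled by the "Rotate About Axis" node):

```python
import math

def rotation_angle(green, time, max_speed=2.0):
    """Green 0-0.5 spins one way, 0.5-1 the other; 0.5 sits still."""
    signed_speed = (green - 0.5) * 2.0 * max_speed   # remap to [-max, +max]
    return signed_speed * time                       # radians at this moment

def rotate_about_x(pos, angle):
    """Plain rotation about the local X (forward) axis."""
    x, y, z = pos
    c, s = math.cos(angle), math.sin(angle)
    return (x, y * c - z * s, y * s + z * c)
```

Packing a signed speed into a single 0-1 channel like this is a common trick, since vertex colors can't store negative values directly.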
The textures for the VFX in this piece can be broken into a few categories. The main aspects are the texture of the pull effect and all the rings. Since the effect is mesh-based, I was actually able to use a pretty simple trim strip to get the effects I needed. The base texture itself was generated quickly in Substance Designer.
The shader (besides the complicated work described above) is a pretty standard emissive material that uses the texture as a mask. Because the behavior of the pull effect is so complicated and already has so much motion, I didn't want to overdo it with extra noise from shader effects.
The ghosting left behind by the release effect when the arrow fires is just small GPU sprites scaled quite a bit along one axis. Similarly, the trail for the arrow is just a few extra sparks. Finally, the impact effect is a final burst of small sparks partnered with a big flash mesh. I say flash mesh because rather than using a sprite, we layered a few sprites on top of each other in a model to create a bit of fake volumetrics. That mesh actually got re-used for many other effects in the project (see Bow Vid 02 above).
The other element left over after the main flash from the arrow impact is similarly a mesh. We couldn't afford proper decals on the project, and since the "Cyber" bow is digitally themed, I thought it would be fun to create a glitchy-looking effect for the impact that could slowly fade out. The sizes of the flash and glitch effects are actually linked to how strong the arrow pull is.
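That pull-strength link can be as simple as a lerp on the impact meshes' scale (a sketch with assumed min/max values, not the project's tuning):

```python
def impact_scale(pull, min_scale=0.5, max_scale=2.0):
    """Scale the flash/glitch meshes by how hard the arrow was drawn (0-1)."""
    pull = max(0.0, min(1.0, pull))
    return min_scale + (max_scale - min_scale) * pull
```

The same 0-1 pull value that drives the shader animation is simply carried along with the arrow and applied to the spawned impact meshes.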
The timing for this effect is probably the easiest thing to talk about! It's almost 100% player-driven. The release happens when the player lets go of the arrow, and they drive how fast all the elements lock into place. If you pull the bow back quickly, you get a crazier effect. If you go slowly, you can appreciate the visuals and enjoy a larger build-up. The one-to-one mapping of the input to the effect animation creates a pretty fun response. It's something I am definitely looking forward to exploring more in VR.
The speed of the arrow's flight was tuned by Richard, though the lingering effect from the release was timed by me. The most complicated aspect of that release element is that it lingers in world space, so if you move the bow after firing, the effect stays behind (which felt weird when we tested it). Generally, for timing purposes, I focus a lot on iteration and intent. I decide what feeling I want to achieve (satisfying buildup, large impact, ambient peacefulness, etc.) and then iterate on that feeling a ton. We definitely don't shy away from acting out actions in real life, making loud noises to demonstrate timing to each other, or sometimes even recording behaviors and examining them later to see what makes them work. It may sound a bit silly, but it is invaluable when it comes to communicating ideas, timing, or the feel of certain effects.
Advice for Learners
When it comes to recommended reading, I don’t have anything specific but I would recommend two things.
First, there are definitely active communities that are great to join and be a part of. RealTimeVFX, for example, is a great first place to learn more about VFX and see what the industry is doing. The second is the massive amount of resources already available for free. Epic's content drops are full of fun, complex materials, particle systems, and blueprints that are packed with new techniques to learn. I still find new stuff in there, and I've been tearing them apart for years now!