Gerald Orban and Adam Myhill of Blackbird Interactive (BBI) spoke at Unite 2015 Boston. They talked about how they used procedural camera systems to compose effective shots of the dynamic vehicle behaviors in the RTS Homeworld: Shipbreakers.
“By only improving the camera in the game, the entire game will leap forward in the player’s perception of quality.” – Gerald Orban
Introducing the Speakers
Gerald Orban – Lead programmer at BBI; he has spent 5 years there and has been there since its inception. At BBI he developed the graphics, the physics system and code, and the core engine for Shipbreakers.
Adam Myhill – Director of Photography/backup Art Director at BBI (when Rob Cunningham isn’t in the office). He has 15 years of industry experience as Director of Photography, Art Director, and CG Supervisor; he also designed the procedural camera system in Frostbite.
The problems with cameras in games:
- Cameras are often overlooked, and camera work is often assigned as a task to people who aren’t passionate about it.
- It’s not a simple problem; in fact, it’s quite the contrary.
- Cameras are often fixed-purpose, monolithic chunks of code – hardcoded with a couple of exposed variables, so you can’t really deal with the relationships between things.
- The workflows are extremely painful and the tools are very frustrating.
- Once you’re done, the cameras don’t feel right – they feel robotic, with brittle motion that isn’t compelling.
- Proceduralism is not common with cutscenes – an animator may change something and your cameras will break.
The idea behind procedural cameras is that you’re shooting blind: you don’t know where the actors will be, because the ‘virtual actors’ don’t exist at edit/design time. The shots still have to look good, with specific framing.
Gerald and Adam created this tool for Unity at BBI to solve these problems and challenges – the virtual actors don’t exist until the player presses Play. They wanted the system to be robust enough to survive behavioral and tuning changes, and to let actors and the environment change without having to reauthor cameras.
All of these capabilities are built on the virtual camera rig.
Virtual Camera Rig
The virtual camera rig is a couple of Unity GameObjects, with the root object being the dolly – what the virtual camera moves around translationally. It’s quite flexible: you can have any number of objects between the dolly and the child camera itself to add extra behaviors through scripting or other animation-driven variables.
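The dolly-to-camera chain above can be sketched as a chain of transform nodes. This is a hypothetical minimal sketch, not BBI’s actual code – node names and the position-only math are assumptions for illustration:

```python
# Hypothetical sketch (not BBI's code): a virtual camera rig as a chain of
# nodes. The root "dolly" holds the translational position; any number of
# intermediate nodes between it and the camera contribute extra offsets
# (scripted behaviors, animation-driven variables, shake, ...).

class RigNode:
    def __init__(self, local_offset=(0.0, 0.0, 0.0), parent=None):
        self.local_offset = local_offset
        self.parent = parent

    def world_position(self):
        """Final position = this node's offset composed with all ancestors."""
        x, y, z = self.local_offset
        if self.parent is not None:
            px, py, pz = self.parent.world_position()
            x, y, z = x + px, y + py, z + pz
        return (x, y, z)

dolly = RigNode((10.0, 0.0, 5.0))            # root: translational motion
shoulder = RigNode((0.0, 1.5, 0.0), dolly)   # intermediate behavior node
camera = RigNode((0.25, 0.0, -0.5), shoulder)
```

Because each behavior lives in its own node, you can add or remove behaviors without touching the rest of the rig.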
Composer
- Flattens the 3D world into 2D ‘screen space’ framing, which allows you to compose things on screen.
- Tuneable easing, gives camera weight.
- Dead zone ignores subject motion.
- Hard boundaries always keep the shot, tracking the subject.
- It’s not as easy as it looks. They scrapped and rewrote three or four iterations because of the vector math and the need to keep precision across distances – it couldn’t break down when the subject is 10 km away and you’re filming it with a 1-degree lens.
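The dead zone, easing, and hard boundary described above can be sketched for a single screen axis. This is a hypothetical 1D sketch, not BBI’s implementation – the parameter names and values are assumptions:

```python
# Hypothetical 1D sketch of the screen-space framing logic: inside the
# dead zone the camera ignores subject motion; between the dead zone and
# the hard boundary it eases toward the subject (tuneable weight); past
# the hard boundary it clamps so the shot is never lost.

def compose_axis(cam, subject, dead=0.05, hard=0.25, ease=0.1):
    """Return the new camera framing value for one screen axis.
    Values are in normalized screen units; `ease` is the fraction of
    the framing error removed this step."""
    error = subject - cam
    if abs(error) <= dead:            # dead zone: ignore subject motion
        return cam
    if abs(error) >= hard:            # hard boundary: always keep the shot
        return subject - (hard if error > 0 else -hard)
    return cam + ease * error         # soft zone: tuneable easing / weight
```

Calling this once per frame per axis gives the camera weight: small subject jitters are ignored, drift is smoothed, and fast subjects can never escape the frame.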
They are targeting the baserunner with the green dot. The red areas are where the subject can never go, and the blue areas are where it tries to keep the shot, based on the delay amount you set. This runs in real time, in preview, without the game running.
These settings are saved at the per shot level.
An important thing to note: when you’re tracking something that moves quickly, you don’t want to hard-pin your camera to it – you want to decouple them. It’s just like a smart camera operator in-game, trying to get the best shot out of what’s happening. It’s also rare that you’d want your target (green) at the center of the object; in this case you’d use the offset controls, local to the vehicle’s space, and target the window.
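The local-space offset idea can be sketched as rotating the offset by the vehicle’s heading before translating – a minimal 2D (x, z plane) sketch with made-up names, not the tool’s actual API:

```python
import math

# Hypothetical sketch: the tracking target is not the vehicle's center
# but an offset expressed in the vehicle's local space (e.g. its window),
# so the target stays on the window as the vehicle turns.

def world_target(vehicle_pos, vehicle_yaw_deg, local_offset):
    """Rotate a local (x, z) offset by the vehicle's yaw, then translate
    by the vehicle's world position."""
    yaw = math.radians(vehicle_yaw_deg)
    ox, oz = local_offset
    wx = vehicle_pos[0] + ox * math.cos(yaw) - oz * math.sin(yaw)
    wz = vehicle_pos[1] + ox * math.sin(yaw) + oz * math.cos(yaw)
    return (wx, wz)
```

A world-space offset would drift off the window the moment the vehicle turned; a local-space one rotates with it.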
The cool thing is, on your timeline you can set up a bunch of different shots, pick your subjects and where you want to shoot them and compose them, and you just let it go to do what it does. The outcome is different every time but it’s in the spirit of what you would want. Sometimes the differences can be quite entertaining. You do it all visually.
Transposer
It’s the way the camera body mounts to a moving object, like a GoPro or a car mount: it lets you stick your camera to something while giving you offset and relationship controls. So you can put it onto a character or a vehicle, and you can animate it to do animated dolly shots. It also has tuneable tracking and dampening for each axis.
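Per-axis dampening can be sketched as each axis easing toward its mount point by its own fraction per step. A minimal sketch under assumed parameter names, not the actual Transposer code:

```python
# Hypothetical sketch of per-axis tracking with dampening: each camera
# axis moves a per-axis fraction of the way toward the mount point, so
# e.g. height (y) can lag much more than horizontal position (x, z).

def follow(cam_pos, mount_pos, damping=(0.5, 0.1, 0.5)):
    """Ease each axis toward the mount; damping[i] is the fraction of the
    remaining distance closed on axis i this step."""
    return tuple(c + d * (m - c)
                 for c, m, d in zip(cam_pos, mount_pos, damping))
```

Calling this every frame gives an exponential ease-in on each axis independently, which is what makes the mount feel weighty rather than rigid.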
Noise
- Procedural positional and rotational noise.
- You can have as many layers as you want per channel, and you get a lot of interesting mixes by combining low, mid and high frequencies together.
To get some good camera shakes, Adam pulled a clip out of the movie Ronin and put it into SynthEyes, a camera-extraction package used for match moving. The extracted rotation curves show the frequency content of the real camera motion – an interesting technique if you’re trying to emulate noise.
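The layered-noise idea above can be sketched as a sum of sine layers per channel. The frequencies, amplitudes, and phases below are made-up illustration values, not ones extracted from Ronin:

```python
import math

# Hypothetical sketch of layered noise for one channel: a low-frequency
# layer for sway, a mid layer for wobble, and a high layer for jitter,
# each with its own phase so channels don't move in lockstep.

def shake(t, layers=((0.5, 1.0, 0.0),    # (frequency_hz, amplitude, phase)
                     (3.0, 0.3, 1.3),
                     (15.0, 0.05, 2.6))):
    """Sum every (frequency, amplitude, phase) layer at time t."""
    return sum(a * math.sin(2.0 * math.pi * f * t + p)
               for f, a, p in layers)
```

Running one such stack per position and rotation channel, with different phases, is one way to get the rich mixes the talk describes; real handheld footage would use richer noise than pure sines.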
Composer + Transposer + Noise
These three combined are extremely powerful.
- A rich toolbox to empower your creative talents
- Mount camera to object A and track target B
- Free your engineers to make other cool stuff
Unity is very interesting when it comes to order of operations – it picks and chooses when scripts run. That’s why they came up with a master script that owns when and how things get applied, based on which modules the user chooses to apply from the module set.
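The master-script idea can be sketched as one owner applying the chosen modules in an explicit, fixed order each frame. A hypothetical sketch – the module names and ordering are assumptions, not BBI’s code:

```python
# Hypothetical sketch of a "master script": instead of relying on the
# engine's script-update order, one owner applies each enabled camera
# module to the camera state in an explicit order, every frame.

def apply_modules(state, modules):
    """Thread the camera state through each module, in the given order."""
    for module in modules:
        state = module(state)
    return state

# Order matters: mount the rig first, then frame the subject, then add
# shake on top of the final framing.
pipeline = [
    lambda s: s + ["transposer"],  # position the rig on its mount
    lambda s: s + ["composer"],    # frame the subject in screen space
    lambda s: s + ["noise"],       # add handheld shake last
]
```

Here `state` is just a list recording application order, but the same shape works with a real camera-state struct; the point is that ordering lives in one place the user controls.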