Jonathan Cooper shared some behind-the-scenes secrets of the animation production for Mass Effect.
Jonathan Cooper is a game animator with an impeccable record. Throughout his career he has worked on Tom & Jerry, Jade Empire, Deus Ex: Human Revolution, Assassin’s Creed III, Uncharted 4 and Uncharted: The Lost Legacy. This week he went on Twitter to talk about his work on Mass Effect, which recently celebrated its 10-year anniversary. Here are the 10 interesting facts he shared with the community.
Gone With The Wind with Mass Effect
Mass Effect comprised my first ever mocap shoots, back in 2005, flying down here to LA to Giant Studios in the Culver Studio lot. We shot the initial gameplay actions on the same sound-stage where Gone With The Wind was recorded – perhaps inspiring the romances?
During the climactic scene of Anderson punching diplomat Udina, cinematic lead Shane Welbourn (suited up as Anderson) accidentally clocked the Udina actor on the jaw. I had to work with the poor chap shortly afterwards and he was less than impressed.
The very first breakthrough in the facial animation/customization system, spearheaded by former Olympian Ben Hindle, was when we were able to create then-US president George W. Bush.
Learning Dialogue from ‘Extras’
The close over-the-shoulder camera style I used for conversations in Mass Effect was inspired primarily by Ricky Gervais’s Extras. This is not a joke – that entire series was built on awkward, close conversations.
Long Lenses Win
It was a fight to get the design team to stick with long lenses, as they were accustomed to framing characters against interesting backgrounds rather than close-ups on the face alone. They were eventually convinced by this first proof-of-concept: E3 2006 demo
30% Procedural Conversations
Procedural generation of conversations provided a solid baseline for every conversation, including facial animation, body animation and camerawork, which cinematic designers then only needed to improve. Roughly 70% were touched up by hand, leaving about 30% procedural only, like this one.
Battle for Interactivity
BioWare traditionally used black bars to denote interactive conversations, but we won a heated argument to remove them from Mass Effect to blur the line between interactive dialogue and cutscenes. This allowed us to seamlessly weave interactivity into dramatic cinematic scenes.
Fighting The Dead Stare
Every gameplay animation in Mass Effect had the eyes constrained forward by default to overcome the dead stare commonly found in games. This detail gave the conversation system access to all 1,200 gameplay animations, with additional eye motion applied on top per line as required.
Puppetshop, the animation/rigging tool developed by Kees Rijnen @lumonix and used throughout the entire trilogy, was stealthily made available to the public for free after the first game. (Sadly it’s no longer supported or available anywhere – but you could have made all the aliens you want).
In perhaps a first – Mass Effect had fully localized lip-sync in all languages, possible only because the systemic lip-sync was entirely procedurally generated. (I only learned this after my later Quebec colleagues enthused about the French version).
You can find the original thread here.