$16 for a *very* non-performant material? If this is intended for high-detail scenes rather than gameplay, one would generally just use a flipbook animation or a looping HD video texture (both of which are higher quality and available for free all over). I love options, but c'mon, that's pretty steep. $5, maybe. And you can loop in materials using custom HLSL nodes. There are also better ways of doing this all around; somewhere on the forums, Ryan Brucks (of Epic fame) himself touched on this. I've personally been working on a cool water material (not "material blueprint", thankyouverymuch) and utility functions, and I'm close to the quality achieved here, sitting at ~180 instructions with everything "turned on". The kicker? It's purely procedural: no textures needed. So this is cool, no doubt about that. In my humble opinion, though, it's not "good". It doesn't run fast, and it's more complicated than it needs to be.
Lee is right - you can use a gradient when you vertex paint in your chosen 3D modelling platform (I've done it in Max), so that the wind effect shifts from nothing to maximum along the length of the leaf/branch/whatever.
I'm fairly certain you can vertex paint the bottoms of the foliage and control the movement using vertex colors along with the wind node. I did this in an earlier project and was able to create a scene with grass that moved less and less toward the base until it was stationary. I created the grass and painted the vertices black to red (bottom to top) in Maya.
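The bottom-to-top falloff described in these comments can be sketched as a tiny helper that maps each vertex's height to the red channel of its vertex color. This is an illustrative sketch only (function name, Y-up axis, and the plain-tuple data layout are my own assumptions); in practice you would paint this by hand in Maya/Max or generate it through your DCC's scripting API:

```python
def height_gradient_colors(vertices):
    """Map each vertex's height (Y) to a red value: black (0) at the
    bottom of the mesh, full red (1) at the top. The wind material
    then scales displacement by this red channel."""
    ys = [v[1] for v in vertices]
    lo, hi = min(ys), max(ys)
    span = (hi - lo) or 1.0  # avoid division by zero for flat meshes
    return [((y - lo) / span, 0.0, 0.0) for y in ys]  # (R, G, B)

# A three-vertex blade of grass: base, middle, tip.
blade = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 2.0, 0.0)]
colors = height_gradient_colors(blade)
```

The base vertex gets red = 0 (no wind displacement) and the tip gets red = 1 (full displacement), matching the black-to-red paint described above.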
- Gaze Tracking
- Head Pose Tracking
- Realtime Tracking
- Offline Refinement
Excellent tracking is the core of faceshift. We can track with high accuracy because we learn your personalized avatar. The avatar is created from a few training expressions, and the resulting tracking is very stable, accurate, and expressive. We track 48 blendshape parameters, which allows us to capture even subtle emotions.
We also track the eye gaze and head pose, bringing the characters alive. And we do all of this in realtime, giving you the possibility to adapt your acting on the fly. For the final result, we offer an offline refinement and editing stage which further increases the tracking accuracy.
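The blendshape parameters mentioned above drive a standard linear blendshape model: each output vertex is the neutral (rest-pose) position plus a weighted sum of per-shape offsets. A minimal pure-Python sketch of that idea follows; the function name and data layout are my own for illustration, not faceshift's API:

```python
def apply_blendshapes(neutral, deltas, weights):
    """Linear blendshape model.

    neutral: list of (x, y, z) rest-pose vertex positions
    deltas:  one list of per-vertex (dx, dy, dz) offsets per blendshape
    weights: one tracked weight per blendshape, typically in [0, 1]

    Each output vertex = neutral vertex + sum_i(weights[i] * deltas[i]).
    """
    out = []
    for vi, (x, y, z) in enumerate(neutral):
        for w, shape in zip(weights, deltas):
            dx, dy, dz = shape[vi]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        out.append((x, y, z))
    return out

# Toy example: a two-vertex "mesh" with two blendshapes.
neutral = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
deltas = [
    [(1.0, 0.0, 0.0), (0.0, 0.0, 0.0)],  # shape 0 moves vertex 0 in +x
    [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)],  # shape 1 moves vertex 1 in +y
]
posed = apply_blendshapes(neutral, deltas, [0.5, 1.0])
```

With 48 such weights updated per frame, even small expression changes blend smoothly into the mesh.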
- File Based
- Animation Export (FBX)
- Blendshape Curves Export
- Virtual Marker Export
- Plugin Based
- Motion Builder
- TCP/IP streaming via open format
Excellent tracking is useless if it is not easy to get the results onto your own rigs, in your own workflow. There are four ways of using faceshift:
– export the animation on one of our predefined rigs or personalized avatars as an FBX file, which can be imported into every major animation package.
– work directly within your animation package with our Maya or MotionBuilder plugins.
– integrate with existing marker-based pipelines. No more need to place the markers accurately on the actor: you define them once and export as BVH or C3D for every take.
– and for the real geeks, we offer a TCP/IP streaming protocol which gives you access to all the tracking data in real time, allowing you to come up with your very own workflows and applications.
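As a sketch of what consuming such a stream could look like, here is a small frame parser over raw bytes. The frame layout below (a little-endian uint32 count followed by that many float32 weights) is a placeholder of my own invention, not faceshift's actual wire format; it only illustrates the general pattern of decoding tracking data received over a TCP socket:

```python
import struct

def parse_frame(payload: bytes):
    """Decode one hypothetical tracking frame.

    Assumed layout (illustrative only):
      uint32 count          - number of blendshape weights
      float32 * count       - the weights themselves
    All fields little-endian.
    """
    (count,) = struct.unpack_from("<I", payload, 0)
    return list(struct.unpack_from(f"<{count}f", payload, 4))

# In a real client, `payload` would come from a socket's recv() loop;
# here we fabricate one frame with three weights.
frame = struct.pack("<I3f", 3, 0.0, 0.5, 1.0)
weights = parse_frame(frame)
```

The real protocol would also carry head pose and gaze data; the point is simply that a few lines of socket-plus-struct code are enough to feed the stream into a custom application.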
- Custom Avatar Creation
- Fully Rigged
- Gaze Tracking
We did not set out to make avatar creation software, but the result is pretty cool. As a byproduct of using faceshift, you get a fully rigged, expressive avatar which you can export for use in your own applications, and which you can drive with faceshift.
And people are coming up with very creative uses for this, from sliced wooden head sculptures to surreal fashion photography.
- Runs with
- PrimeSense Carmine 1.09*
- Asus Xtion Live Pro
- Microsoft Kinect
- Runs on
- Linux [not yet released]
We support all major operating systems and the affordable RGBD cameras from PrimeSense, Asus, and Microsoft. And we made our software efficient enough that you can run faceshift and your animation software simultaneously, even on a current laptop.
* We sell sensors comparable to the PrimeSense Carmine 1.09 on our webstore. These are the best sensors commercially available.