The Red9 ProPack SkinTools are designed to manage skinning data, allowing users to quickly save, load and debug skin data for their characters. They also allow you to remap joints, quickly moving skinWeight info between joints in the cluster. In production we routinely pass r9Skn files around when iterating on characters, and it's now such a crucial part of our internal workflows that it made sense to expose it to ProPack users!
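The actual r9Skn format and SkinTools API aren't shown here, but as a rough sketch of the joint-remap idea: skin data can be thought of as per-vertex weights keyed by joint name, and a remap simply re-keys (and merges) those weights. All names below are illustrative, not the Red9 implementation.

```python
# Hypothetical sketch of remapping skinWeights between joints:
# weights are stored per vertex, keyed by joint name, and a remap
# re-keys those entries, merging weights that land on the same joint.

def remap_skin_weights(weights, joint_map):
    """weights:   {vertex_id: {joint_name: weight}}
    joint_map: {old_joint: new_joint}; joints not listed keep their name."""
    remapped = {}
    for vtx, joint_weights in weights.items():
        new_weights = {}
        for joint, w in joint_weights.items():
            target = joint_map.get(joint, joint)
            # merge weights when two source joints map to one target joint
            new_weights[target] = new_weights.get(target, 0.0) + w
        remapped[vtx] = new_weights
    return remapped

weights = {0: {"spine_01": 0.5, "spine_02": 0.5}}
print(remap_skin_weights(weights, {"spine_02": "spine_01"}))  # {0: {'spine_01': 1.0}}
```

Merging rather than overwriting keeps the per-vertex weights normalised when several source joints collapse onto one target.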
In this second video in the series we show you just how easy it is to map animation data over to the Red9 Puppet Rig, en masse, using the Red9 Browser and the “Pro:To_BND” functionality. This hopefully gives you some insight into the power of the Browser when used with the built-in Extensions Manager to add custom functionality, allowing TDs to expose functions directly to the Animators.
We go through the AnimBinder setup itself, a file that accompanies all Red9 Puppet deliveries. This is ideal for layering fast modifications on top of any initial character remapping, something only exposed to clients running the Puppet. The Pro Binder setup also natively supports HIK under the hood, allowing you to remap data from any FBX source, including files from MotionBuilder.
Finally we show some of the tricks unique to ProPack to deal with that initial stage of animation clean-up, filtering and re-directing.
With these tools in place you never need to worry about those unforeseen issues when working with MoCap data.
Contact us for more information about either the PuppetRig, ProPack or our Facial rigging services
This new series of videos is designed to give you a wider overview of the Red9 ProPack and what it’s like to work with us on a professional level to help streamline and speed up your workflows.
This first video goes through our Red9 PuppetRig solution, the new DagMenu system and the dynamic CharacterPicker. Crucially, it also shows how we use an abstract layer to communicate with all the tools and the API, meaning that none of our systems are tied directly to the rig solution itself.
Whilst we’d love everybody to use the Puppet Rig, we’re more than aware that this is impractical for many studios who have invested time in their own rigging systems. That abstract layer is a neat way to simply add a layer of MetaNodes to fool our systems into thinking they’re dealing with a native Puppet Rig.
Now available in Red9 ProPack!
This is a huge new feature for any animators dealing with dense baked data. The Red9 StudioPack already has an interactive curve filter for re-sampling animation curves, but soon the ProPack will be getting an all-new Butterworth algorithm, one of the main filter methods from MotionBuilder for cleaning up noisy data.
The Butterworth filter is derived from an audio filter, making it superb at taking noise out of animation data and re-sampling it; we’ve taken this method and wrapped it into a neat, interactive tool inside Maya.
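The ProPack implementation itself isn't shown here, but the core idea can be sketched with a single second-order (biquad) low-pass Butterworth pass over per-frame values baked from a curve. The coefficients below follow the standard audio-EQ "cookbook" form; the function name, cutoff and sample rate are purely illustrative.

```python
import math

def butterworth_lowpass(samples, cutoff_hz, sample_rate):
    """One 2nd-order low-pass Butterworth pass over a list of floats,
    e.g. per-frame values baked from an animation curve."""
    omega = 2.0 * math.pi * cutoff_hz / sample_rate
    cos_w, sin_w = math.cos(omega), math.sin(omega)
    q = 1.0 / math.sqrt(2.0)              # Butterworth Q
    alpha = sin_w / (2.0 * q)
    a0 = 1.0 + alpha
    # normalised feed-forward (b) and feed-back (a) coefficients
    b0 = (1.0 - cos_w) / 2.0 / a0
    b1 = (1.0 - cos_w) / a0
    b2 = b0
    a1 = (-2.0 * cos_w) / a0
    a2 = (1.0 - alpha) / a0
    out = []
    x1 = x2 = y1 = y2 = 0.0
    for x in samples:
        # direct-form I difference equation
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        out.append(y)
        x2, x1, y2, y1 = x1, x, y1, y
    return out

# jittery "mocap" channel: a held pose value plus per-frame noise
noisy = [1.0 + 0.25 * (-1) ** i for i in range(100)]
smooth = butterworth_lowpass(noisy, cutoff_hz=2.0, sample_rate=24.0)
```

The low cutoff passes the held pose through unchanged (unity DC gain) while the per-frame jitter is heavily attenuated; a real tool would also handle the start-up transient and run the pass forwards and backwards to cancel phase shift.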
This will save anybody dealing with MoCap or facial data a massive amount of production time!!
This second video is the result of requests from a number of clients who are dealing with budget MoCap setups and need solutions to help clean up some of the low-frequency noise inherent in the data: a perfect case for the Butterworth algorithm. In this case the data comes from a Perception Neuron system.
ProPack just got more powerful with the addition of a crucial new system to check the status of assets before they hit production. We’ve been using this internally for a while now to check our rigs before we ship them out to clients, and we thought we’d expand it to all ProPack users… 😉
Read more here: ProPack Health Manager
Great news for everybody running our Red9 StudioPack from the Autodesk Exchange website: this morning a fresh build went live which is Windows, macOS and Linux compatible. This is a big update for the Exchange site codebase, as the last version was nearly a year old. There are lots of bug fixes, a few new toys for you animators to play with and deeper integration for the ProPack, including full exposure in the API Docs so you can see just how far the ProPack codebase has grown.
There’s also a new CodebaseTracker.xml which goes through all the major changes over the last 6 months or so.
As usual get in touch and let us know what you think!
Client : Sony Interactive Entertainment London Studio
Project : PlayStation VR Worlds: The London Heist – Red9 Facial Systems
“The London Heist” was part of PlayStation VR Worlds, released in October 2016 along with the new PlayStation VR headset. We were tasked with the facial rig systems for the 2 key characters, Mickie and Frank, and this was one of the first projects to involve the Red9 facial system, a task made even tougher by exploring the new medium of VR. It was a huge privilege to be involved in such a key development in VR tech and a massive buzz to see your work develop in the virtual world. Huge thanks to SIE London Studio for letting us be part of this key release for them.
We wanted to share this huge update to the Animation Re-Direction system in the Red9 ProPack. This new setup has been driven largely by the needs of one of our clients dealing with mass MoCap and needing to straighten out data shot in a tight volume, a really tough thing to do without a tool like this.
The new setup not only keeps all the power of the previous versions, allowing us to direct and shape animations via curves, but can now also un-wrap animation data.
As usual if anybody has any questions or is interested in talking about this or our ProPack pipeline drop us an email!
Client : Mi, Manchester
Project : Delta Safety Video Game
The project was for the latest in-flight video from Delta Airlines, which is due to go onto all of their planes very shortly; the project also includes an HTML5 game! We were approached by Mi in Manchester to build 6 facial rigs from scan data: 2 main hero rigs, the flight attendant and captain, plus 4 secondary rigs for the Chelsea footballers in the sequence. The flight attendant and the captain were scanned in LA, whilst we directed the facial scan sessions in London for the Chelsea players.
From the scan data we provided the facial models and the performance rig setup under huge time constraints but it was a blast!
Thanks to Mi and to Delta for their support!
This one is a huge deal for anybody doing performance capture and managing cutscenes. We embed timecode into the Maya rig, allowing us to reload and rebuild scenes from multiple r9Anim files by referencing a timecode pointer. This even allows us to sync animation data based on the internal timecode of a piece of audio.
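The r9Anim internals aren't shown here, but the timecode-pointer idea boils down to simple frame arithmetic: convert each clip's embedded timecode and the scene's timecode origin to absolute frames, and the difference tells you where the clip lands. A minimal sketch, assuming non-drop-frame "HH:MM:SS:FF" timecode (the function names are illustrative):

```python
def timecode_to_frames(tc, fps):
    """'HH:MM:SS:FF' -> absolute frame count (non-drop-frame timecode)."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def clip_start_frame(clip_tc, scene_origin_tc, fps=30):
    """Frame at which a clip should be placed relative to the scene's
    timecode origin, so multiple clips rebuild in sync."""
    return timecode_to_frames(clip_tc, fps) - timecode_to_frames(scene_origin_tc, fps)

# a clip stamped 2s 15f after the scene origin lands at frame 75 @ 30fps
print(clip_start_frame("01:00:02:15", "01:00:00:00", fps=30))  # 75
```

The same arithmetic works for audio: stamp the audio's internal timecode, convert it to frames, and offset the animation data against it.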