This week I finally got around to working on our motion capture pipeline for animation.
We use a Noitom Perception Neuron 18 kit to capture our motion. It is sensor-based and takes about 10 minutes to suit up and start shooting. The capture and data export process was a breeze, but getting the data onto a rig in Blender was a bit trickier to figure out.
In the end I exported the motion as a 3ds Max BVH from Axis and then imported the data using the Blender add-on MakeWalk. For our rigs we use a modified Rigify rig with a deform skeleton customized for Unity. Rigify is a nice auto-rig with FK/IK matching built in, and I much prefer it to the heavier BlenRig 5. Because the skeleton is modified, it took a bit of Python scripting to get the data importing correctly, but as you can see in the video below, I got it working.
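The exact script I used isn't shown here, but one common approach when a rig's bone names don't match the mocap skeleton is to rename the joints in the BVH file itself before import, so the retargeter can find them. Here's a minimal sketch of that idea; the entries in BONE_MAP are hypothetical placeholders, not our rig's actual bone names:

```python
# Sketch: rename joints in a BVH HIERARCHY section so they match a
# modified Rigify deform skeleton before importing with MakeWalk.
# BONE_MAP is a hypothetical example mapping -- substitute the bone
# names from your own mocap export and rig.

BONE_MAP = {
    "Hips": "DEF-spine",
    "LeftUpLeg": "DEF-thigh.L",
    "RightUpLeg": "DEF-thigh.R",
}

def remap_bvh(bvh_text: str, bone_map: dict) -> str:
    """Rewrite ROOT/JOINT names in a BVH file's hierarchy section."""
    out = []
    for line in bvh_text.splitlines():
        stripped = line.strip()
        # Joint definitions look like "ROOT Hips" or "JOINT LeftUpLeg"
        if stripped.startswith(("ROOT ", "JOINT ")):
            keyword, name = stripped.split(None, 1)
            line = line.replace(name, bone_map.get(name, name), 1)
        out.append(line)
    return "\n".join(out)
```

You'd run this over the exported .bvh, save the result, and import that instead. The same renaming could also be done after import with bpy on the loaded armature, but patching the text file keeps it independent of Blender.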
The only real drawback to this process is that MakeWalk does not solve finger data, so it is lost on import. But overall, not bad as a base. Next, I'll be looking into Blender's curve filters and animation layers to clean up the data.