At this point, we don't have the ability to refine performances by editing data manually. We recognize that more control is needed and are currently investigating several ways to improve the workflow. Please feel free to offer suggestions in this area. Part of the reason for going public now is to allow users to help guide the development.
If an animation could be frozen into keyframe data after interpreting the motion capture, that would be extremely useful.
Here's how I'd hoped to see this program go: you perform your lines for the application, and it interprets your movement as data for the character; you tweak the sensitivity and do a few takes until you have a pretty solid starting point.
Then, instead of exporting a PNG sequence, the movements would be frozen into keyframe data and opened in After Effects. Any further tweaks to the animation would have to be done in AE, but it would provide a great jumping-off point. Sort of like how you paint with the Roto Brush but freeze it before you start adjusting the matte.
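To make the "freeze into keyframes" idea concrete, here is a minimal sketch (the function name `bake_keyframes` and the tolerance parameter are my own invention, not anything from the application) of reducing dense per-frame capture samples to sparse keyframes: a sample is kept only when a straight-line interpolation between its neighbours would drift past a tolerance, which is roughly what a keyframe-baking export would need to do.

```python
# Hypothetical sketch: reduce dense motion-capture samples to sparse
# keyframes. A sample becomes a keyframe when linearly interpolating
# between the last kept key and the next raw sample would miss it by
# more than `tolerance`.

def bake_keyframes(samples, tolerance=0.5):
    """samples: list of (time, value) pairs sorted by time."""
    if len(samples) <= 2:
        return list(samples)
    keys = [samples[0]]
    for i in range(1, len(samples) - 1):
        t0, v0 = keys[-1]          # last keyframe we committed to
        t1, v1 = samples[i + 1]    # next raw sample
        t, v = samples[i]
        # Value a straight line between those two points would predict here.
        predicted = v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        if abs(v - predicted) > tolerance:
            keys.append(samples[i])
    keys.append(samples[-1])
    return keys
```

A flat or linear performance collapses to just its endpoints, while sudden gestures keep their keys, so the result is editable in AE without thousands of redundant keyframes.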
Motion capture is great for big movements, but ultimately the character will need to be composited anyway if you're building a complex scene. Keyframed character data opened in After Effects (possibly with a puppet rig in place, or a skeleton in the character to assist further movement?) would be a great way to keep going.
I saw a video that mentioned placing tracking points on the puppet and then adding limbs and such afterward. The example it gave was a dot where the arm would join the body: AE tracked that point once the performance had been recorded, and the arm was then attached to it.
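A rough sketch of that attach-a-limb-to-a-tracked-point idea, under my own assumptions (the `attach_limb` function, the fixed offset, and deriving rotation from the anchor's direction of travel are all hypothetical, not how the video necessarily did it):

```python
import math

# Hypothetical sketch: pin a limb layer to a tracked attachment point.
# `track` is a per-frame list of (x, y) positions for the dot where the
# arm joins the body. The arm is drawn at that point plus a fixed offset,
# rotated to follow the anchor's direction of travel between frames.

def attach_limb(track, offset=(10.0, 0.0)):
    frames = []
    for i, (x, y) in enumerate(track):
        if i == 0:
            angle = 0.0
        else:
            px, py = track[i - 1]
            # Angle of motion since the previous frame, in degrees.
            angle = math.degrees(math.atan2(y - py, x - px))
        frames.append({"pos": (x + offset[0], y + offset[1]),
                       "rotation": angle})
    return frames
```

The point is that once the anchor dot is tracked, every downstream limb transform can be computed per frame from that one path.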
I would like to be able to record something, have that recording become static, then record again while manipulating the previous recording's result. Ideally I would use this with key triggers, and to fix small errors. I imagine it implemented along the lines of Illustrator's Image Trace vertices.
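The layered-recording idea could be sketched like this (a minimal illustration under my own assumptions; the `overdub` function and the trigger-frame set are hypothetical names, not anything the application exposes): the first take is frozen, and a second pass only overwrites the frames where a trigger key was held, so small errors can be patched without redoing the whole performance.

```python
# Hypothetical sketch: layered takes. The base take stays frozen; the
# patch take replaces it only on frames where the trigger key was held.

def overdub(base_take, patch_take, trigger_frames):
    """base_take, patch_take: per-frame value lists of equal length.
    trigger_frames: set of frame indices where the trigger was held."""
    return [
        patch if i in trigger_frames else base
        for i, (base, patch) in enumerate(zip(base_take, patch_take))
    ]
```

Each additional pass treats the previous merged result as the new base, so corrections accumulate non-destructively.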