For camera tracking, only the face currently works. We have experimented with tracking the rest of the body, and it just doesn't work as well with current webcam tech. Ideally, in the future we could support depth-sensing cameras (like the Microsoft Kinect) to allow for this sort of thing.
We suggest adding draggable handles to any limb to make it move. Sticks & Dragging (Adobe Character Animator Tutorial) - YouTube
Thanks a lot, that was helpful.
Hope you can figure out a way to do that soon.
Microsoft Kinect - yes, that would be most helpful! I'd use it for poor man's 3D mocap. We create videos and apps with youth leading the content creation, so full-body motion capture would be very useful in Character Animator (or in After Effects, mixing Character Animator face animation with a rigged AE body).
I would also love to see kinect v2 support.
I was wondering if there have been any updates on support for body motion tracking. Several systems, such as OptiTrack (along with other third-party systems), can capture bright moving dots by using an IR light source with reflective markers (Figure A).
Another option would be tracking a very high-contrast point via screen capture, from a software window on the computer that displays a dot corresponding to each marker (Figure B).
A third option would be a secondary camera with a direct IR feed, which would make the brightly reflected markers the focus of the image (Figure C).
Since Character Animator deals with 2D space, tracking the relationships between markers from a single camera seems feasible. If a point disappears (due to an inaccurate track or rotation of the body), Character Animator could simply freeze the limb's motion at the moment it lost the point. This would all be done with a SECONDARY camera, not the one tracking facial movements.
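To illustrate, the freeze-on-lost-point behavior could be sketched roughly like this. This is a minimal pure-Python illustration, not Character Animator's actual API; the `MarkerTracker` class, the nearest-neighbor matching, and the `max_jump` threshold are all assumptions, and a real IR pipeline would do blob detection on the camera frame first.

```python
# Sketch of single-marker tracking with freeze-on-loss.
# Assumes each frame arrives as a list of detected bright-spot (x, y)
# positions; real IR tracking would extract these via blob detection.

class MarkerTracker:
    def __init__(self, max_jump=50.0):
        # max_jump: how far (in pixels) a marker may move between frames
        # before we treat the match as lost (threshold is an assumption).
        self.max_jump = max_jump
        self.position = None   # last confirmed marker position
        self.frozen = False    # True once the point is lost

    def update(self, detections):
        """Match the nearest detection; hold the last position if lost."""
        if self.position is None:
            if detections:
                self.position = detections[0]
            return self.position
        if not detections:
            self.frozen = True            # point disappeared: hold still
            return self.position
        nearest = min(detections, key=lambda p: self._dist(p, self.position))
        if self._dist(nearest, self.position) > self.max_jump:
            self.frozen = True            # implausible jump: hold still
        else:
            self.position = nearest
            self.frozen = False
        return self.position

    @staticmethod
    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```

For example, after `update([(100.0, 100.0)])` and `update([(105.0, 102.0)])` the marker follows the dot; a subsequent `update([])` returns the last good position with `frozen` set, which is exactly the "stop the limb where the point was lost" behavior described above.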
Aside from reducing tracking errors, the use of physical markers and IR capture has proven itself in the industry for years. Many people have become attached to markers and prefer them for tracking gross body motion.