I've added this as a Suggestion, but I'd like to know what others think.
Other face-animation apps tend to use selectable, draggable tracking dots to define facial features, i.e. you click on the corners of the mouth and the sides of the eyes. These points are then used for subsequent animation.
Ch's Face Tracking is very good and a big plus over other programs, but I think it could be a whole lot better if the tracking dots could be selected, moved, and then have their positions confirmed against both the puppet and your own face in the Camera view.
The tracking dots seem to be grouped (eyes, nose, mouth, head, etc.). If the dots could be individually selected and confirmed in position as described above, both the quality of the tracking and its usefulness for triggering animations could improve significantly.
I have two suggestions:
1. Use the Auto Face Tracking to detect the facial features, then give the user the ability to adjust the tracking dots as described above.
2. Remove the automatic detection of facial features and have users define them manually first, as many other programs do. Then let Ch animate using the user-set tracking dots.
Interesting idea - thanks for the suggestion!
Yes, why is nobody else pointing this out? It seems like a massive and obvious oversight.
The face tracking in CH is so great! In real time, it seems to give a rock-solid track of the eyelids and mouth. If I want to do the same in AE, I have to run a very slow calculation that is highly error-prone. And if I want to drive visemes in CH, I have to use audio, which is really inconsistent and error-prone as well.
It feels like we are wasting all that great tracking data in CH.
I'm on board.
And you could allocate dots to other things too: non-human puppet features, or even tracking limbs with your hands...