Does anyone have a suggestion on how I can confine the eyes and mouth to a specific boundary within the face?
When I move to an extreme position, the mouth goes out of bounds. Actually, I just had another suggestion: if I created multiple angles for the head, like Wendigo, could the head movement trigger those angles? So if I turned my head slightly to the right, it would trigger the 3/4 angle head, and if I turned completely to the side, it would trigger the profile head?
It would also be cool to control the percentage of how much you have to turn your head before each different head is triggered. I think there may be a workaround in the meantime, but it would be an awesome feature.
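To make the request concrete, here is a minimal sketch of the threshold logic being asked for: map a tracked head yaw angle to one of several head views, with an adjustable minimum angle per view. This is purely illustrative; Character Animator doesn't expose a scripting API for this, and the view names and degree thresholds below are assumptions, not anything from the app.

```python
# Hypothetical illustration of the requested feature: choose which head
# artwork to trigger based on how far the head has turned. Yaw is in
# degrees, with 0 = facing the camera and 90 = full profile.
# The view names and threshold values are made up for this example.

def pick_head_view(yaw_degrees, thresholds=None):
    """Return the head view to trigger for a given yaw angle.

    thresholds maps each view name to the minimum absolute yaw (degrees)
    needed to trigger it; tweaking these numbers is the adjustable
    "percentage of head turn" control described above.
    """
    thresholds = thresholds or {"front": 0, "three_quarter": 25, "profile": 65}
    yaw = abs(yaw_degrees)  # treat left and right turns the same
    # Pick the view with the highest threshold that the current yaw clears.
    return max(
        (name for name, t in thresholds.items() if yaw >= t),
        key=lambda name: thresholds[name],
    )
```

For example, with the default thresholds a 10-degree turn keeps the front head, a 30-degree turn swaps to the 3/4 head, and anything past 65 degrees swaps to the profile head.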
Could the Werewolf.ai file (available as part of the Character Animator Examples archive) be a good example to follow? As you tilt your head, the head background moves with it.
Regarding triggering different views based on pivoting your head: that's a good request. It's been discussed internally, but we don't have anything ready to support it at this time. Of course, the more you turn away from the camera, the less we can track eye and mouth movement.
Regarding the chin question... if the chin were on the same warpable skin as the neck, that might work... if I'm understanding your question correctly.