Tracking Dots That You Can Select, Move And Match Puppet to Face

Guest, Nov 29, 2016

I've added this as a Suggestion, but I would like to know what others think.

Other face-animation apps tend to use selectable and draggable tracking dots. These are used to define the facial features, i.e. you click on the sides of the mouth and the sides of the eyes, and those points are then used for the animation that follows.

Ch's Face Tracking is very good and a big plus over other programs, but I think it could be a whole lot better if the tracking dots could be selected, moved, and then have their positions confirmed against the Puppet and your own face in the Camera view.

The Tracking Dots seem to be grouped (eye, nose, mouth, head, etc.). If the Tracking Dots could be selected individually and their positions confirmed as described above, then both the quality of the tracking and the use of that improved tracking to trigger animations could be significantly better.

I have two suggestions:

1. Use the Auto Face Tracking to detect the facial features, then give the user the ability to adjust the Tracking Dots as described above (see the rough sketch after these suggestions).

2. Remove the Auto Face Tracking detection and have the user define the facial features first, like many other programs do, then let Ch animate using those user-set Tracking Dots.
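
To make suggestion 1 more concrete, here is a rough Python sketch of how adjustable Tracking Dots could be modelled (purely hypothetical; none of these names come from Ch itself): each dot keeps its auto-detected position plus a user offset, grouped by facial feature, so dragging a dot never throws away the automatic track.

    # Hypothetical sketch, not Character Animator's actual data model.
    from dataclasses import dataclass, field

    @dataclass
    class TrackingDot:
        name: str                  # e.g. "mouth_left_corner" (made-up name)
        auto_x: float              # position from automatic face tracking
        auto_y: float
        offset_x: float = 0.0      # user correction applied on top
        offset_y: float = 0.0

        @property
        def position(self) -> tuple[float, float]:
            # Final position = auto-detected position + manual adjustment
            return (self.auto_x + self.offset_x, self.auto_y + self.offset_y)

    @dataclass
    class FeatureGroup:
        feature: str               # "eye", "nose", "mouth", "head", ...
        dots: list[TrackingDot] = field(default_factory=list)

        def nudge(self, dot_name: str, dx: float, dy: float) -> None:
            # Simulates the user dragging one dot to better match their face
            for dot in self.dots:
                if dot.name == dot_name:
                    dot.offset_x += dx
                    dot.offset_y += dy

    # Example: auto tracking put the mouth corner slightly too high,
    # so the user drags it down before confirming the rig.
    mouth = FeatureGroup("mouth", [TrackingDot("mouth_left_corner", 120.0, 340.0)])
    mouth.nudge("mouth_left_corner", 0.0, 6.5)
    print(mouth.dots[0].position)  # (120.0, 346.5)

Keeping the automatic position and the manual correction separate would mean the user's adjustments survive when the tracker updates its detection each frame.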

Adobe Employee, Nov 29, 2016

Interesting idea - thanks for the suggestion!

New Here, Oct 01, 2018

Yes, why is nobody else pointing this out? It seems like a massive and obvious oversight.

The face tracking in CH is so great! In real time, it seems to give a rock-solid track of the eyelids and mouth. If I want to do the same in AE, I have to run a very slow calculation that is highly error-prone. And if I want to drive visemes in CH, I have to use audio, which is really inconsistent and error-prone as well.

It feels like we are wasting all that great tracking data in CH.

Enthusiast, Oct 02, 2018

I'm on board.

And you could allocate dots for other things too: non-human puppet features, or even tracking limbs with your hands...
