You can scale the rotation information using a simple multiplier expression like
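A hedged sketch of such a multiplier expression (the layer name "track null" and the multiplier value are placeholders, not from this thread; in AE the single expression line would sit on the Rotation property, and `thisComp` is mocked here only so the idea runs outside AE):

```javascript
// Mock of the AE expression environment (illustration only):
const thisComp = {
  layer: (name) => ({ transform: { rotation: 10 } }) // tracked rotation, in degrees
};

// The actual expression idea: multiply the tracked rotation to exaggerate it.
const multiplier = 1.5; // placeholder value; tune to taste
const rotation = thisComp.layer("track null").transform.rotation * multiplier;

console.log(rotation); // 15
```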
I recommend doing this with a mocha track and stabilized precomps.
Here is a tutorial showing some more advanced modifications of the face, but you can replace the eyes in the same way:
Since the tutorial above is a bit older, also consider this tutorial, which quickly shows how to create a stabilized precomp with the current version of MochaImport+:
And this tutorial shows that this even works well if the face is turning a bit (in this tutorial I attach a fake wound to the head):
Finally, if you hesitate to switch to mocha and want to do everything inside After Effects, you can also use the mask tracker in combination with MaskTracker+ to create stabilized precomps:
Whoa! You have just blown my mind!! Never heard of mocha....
Looks like my day is going to be invested in watching tutorials.
I'll report back with my results.
Thank you so much!
You can also use the normal trackers in AE; you just need to track scale and rotation using the eyes as your track points. The easiest way is to stabilize the shot for scale and rotation while tracking the eyes. The eyes should then not change position or spacing in the shot as the actor moves toward the camera. You add your artwork over the eyes as separate layers. The next step is to remove the stabilization on the shot: add a null, then add expressions to the null that set the null's position to the stabilized layer's (your footage's) anchor point, set the null's scale from the stabilized layer's scale property so that the change in scale is reversed, and add an expression that reverses the rotation. The last step is to make the null the parent of the stabilized layer and of your added artwork and effects layers. If you name your original footage "stabilized" you can use this animation preset: Dropbox - destabilize Rotation Scale.ffx
This is what the Scale expression looks like on the null layer:
x = value[0];
y = value[1];
tx = thisComp.layer("stabilized").transform.scale[0];
ty = thisComp.layer("stabilized").transform.scale[1];
nx = x/tx*x;
ny = y/ty*y;
[nx, ny]
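The rotation-reversing expression mentioned above isn't pasted in the thread; here is a hedged sketch of what it could look like, together with the scale inversion, with the AE expression environment mocked so the snippet runs outside AE (the rotation and scale values are made up for illustration; "stabilized" is the layer name used above):

```javascript
// Mock of the AE expression environment (illustration only):
const thisComp = {
  layer: (name) => ({
    // Made-up values that the stabilization could have produced:
    transform: { rotation: 12, scale: [80, 80] }
  })
};

// Rotation expression on the null: simply negate the stabilized layer's rotation.
const rotation = -thisComp.layer("stabilized").transform.rotation;

// Scale expression on the null, matching the nx/ny lines above:
// value defaults to [100, 100], so x/tx*x = 100*100/tx inverts the stabilized scale.
const value = [100, 100];
const tx = thisComp.layer("stabilized").transform.scale[0];
const ty = thisComp.layer("stabilized").transform.scale[1];
const scale = [value[0] / tx * value[0], value[1] / ty * value[1]];

console.log(rotation);        // rotation: -12
console.log(scale);           // scale: [125, 125]
```

With the stabilized layer scaled to 80%, the null gets 125%, so parenting the footage and artwork to the null cancels the stabilization exactly.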
Yes, in principle, you can do it with the point tracker, but since it is much less robust than mocha, I really recommend learning mocha.
And since mocha is bundled with AE, there is really no reason not to use it, unless you have a super easy track for which it is not worth the extra effort.
But as soon as you have even a little bit of perspective motion, or when the point tracker has issues tracking the eyes (say, when the eye blinks or the pupil is moving), I would definitely try mocha.
To get started with mocha in general, see this tutorial:
So, when I choose Track in mocha... I'm getting yellow text which says "frame not rendered".
Any ideas why it is doing that?
Probably the format of the video file is not supported by mocha.
I recommend converting the video clip to a sequence of PNG, TIFF, or JPEG images.
See "What to Do if Mocha Does Not Support Your Footage" at
Hmmm! They are regular .mov files straight from my Canon camera.
.mov is listed in the supported file types.
By the way, I have found a temporary workaround while I try to get mocha going, which I think does the trick.
If I create a 2-point motion tracker for each eye, switching the primary and secondary track points around, and then create 2 separate null objects, anything on the left side of his face gets parented to Null 1 and anything on the right to Null 2. This seems to allow the objects to get farther apart as the face gets closer to the camera. However, I'm not sure of the results in terms of keeping the pupils steady, because the character I'm editing at the moment actually has no pupils... but I suspect that if I can harmonize the tracking and scale coordinates, it should work.
Any ideas for easy ways to harmonize the trackers? Currently I'm having to track from scratch with each one, which is kind of problematic and tedious.
I have tried duplicating the trackers, then switching the track points around by copying and pasting the keyframes from track point 2 in the first tracker to track point 1 in the second, and so on... but it seems to come out with completely random results.
The problem with .mov is that it can contain all kinds of codecs, and maybe mocha does not support your particular one.
Converting to a JPEG sequence costs a little extra disk space and time, but it will definitely work and save you a lot of time that you would otherwise waste fixing an inaccurate track (mocha is really way more accurate).
Which program do I do this conversion with?
You can do it inside of After Effects.
Add your clip to a new composition and add it to the render queue.
Then choose the output module "TIFF sequence with Alpha" and render it out.
To save some space on your harddrive, you can also change the output module settings by setting the Format to "JPEG sequence" and the channels to "RGB" instead of "RGB + Alpha".
If you render this, you will get a folder with lots of TIFF or JPEG images (one for each frame). But you can import it into AE and mocha as if it were a video file by just selecting the first image. When you import it into AE, just make sure that in the file import dialog the checkbox "TIFF image sequence" or "JPEG image sequence" is checked after you select the first TIFF/JPEG image.