I'm assuming you mean Track Camera. After Effects does not have a 3D tracker.
Are you new to AE? What kind of shot is it? What is the format? AVCHD or MP4 footage from amateur cameras takes a lot of horsepower to decode and run through AE for things like Camera Tracking. Some shots simply will not track. If there is not enough fixed geometry in a shot, the camera solution may either fail or, in some cases, cause a crash.
A 25 second shot is an eternity in Camera Tracking terms. Without a bunch more information about your shot and what is going on when you are running the track, it's hard to advise anything other than "make sure AE and your drivers are up to date and then get back to us."
When you are doing effects you should only work on the actual frames that you are going to use in the production. For example, I recently had a walk-and-talk shot where the talent walked and talked to the camera for about a minute and a half. There were about 8 places where the actor walked past some windows where we had to replace some of the objects in the windows. The rest of the shot was just fine. I broke each of those 8 sections down into separate comps. The longest was about 120 frames. I camera tracked each cut separately and then inserted my 3D elements. The shots were rendered and then cut into the original footage in Premiere Pro. Tracking the entire shot would not only have been a huge waste of time, but solving the camera for that much footage would be incredibly difficult.
It usually shows the "working in background" message, then crashes on "solving camera." Anyone have any ideas what the issue may be?
Without seeing your footage, nobody can tell you much, but I tend to concur with Rick - it's probably too long to begin with, and on top of that it may be shot unsuitably, resulting in illogical spatial relations and in turn the impossibility of getting a proper solve (the crash is a bug, though). I also have that tingle in my left toe that tells me "game captured footage", which would then be another cause for concern...
The sole purpose of the recording was to make myself familiar with the camera tracking process. It is just a .MOV video of me walking through my living room and into my computer room. 60fps 1080p video.
You cannot solve walking across multiple rooms. A certain number of markers has to be in view for a minimum duration, and you need proper parallax. You cannot defeat the math, especially since AE's tracker doesn't allow manual corrections and calibration with persistent markers. You need to shoot something else and try again.
First, there is probably no reason at all to shoot at 60 fps unless you are planning to slow things down. You've just upped your calculation and rendering time by more than 100% and did not gain anything.
Second, if you were walking around with the camera, then you may have gotten some footage that camera tracking can use, but if you set up the camera and walked around in front of it, then you need 3D object tracking, which is completely different from camera tracking, and AE does not do that.
Third, if you used a consumer camera then your footage is probably highly compressed and that alone may be giving you problems.
If your AE is up to date, you have no conflicting software (like some virus scanners), your system's drivers are all compatible and up to date, and you still can't get camera tracking to work, then try some other shots. Adobe Stock is a good place to download videos to test. For example, this shot will not camera track because there is no real fixed geometry in the shot, just a bunch of people moving:
This one will camera track but the lens distortion at the edges means that you'll have problems if your inserted element moves to the edges of the frame. The distortion can be corrected enough to get a good track all the way to the edges but you'll have to be careful when picking the targets to make sure they are in the same plane.
And this one of a guy walking down the street may be successfully tracked as long as the tracker picks up enough detail and you eliminate the tracking that attaches to the guy with the phone:
I would explore Adobe Stock and download a few sample videos to track so you can learn how tracking, camera tracking, and stabilization work. It's a lot easier to do than just running around the house with your smart phone. Once you see how tracking works you can design shots that will work for your productions.
The only reason I was using the 60fps shot was so that it stayed consistent with the rest of the video. Can you have parts of your video be 24fps then the rest be 60?
Yes, but if you're playing the video back at 60, it will look different in that section and the audience might not know why, but it'll feel weird. If you're playing the video back at 24, it won't make any difference at all.
There is a huge misunderstanding about frame rates and perception. You should study up. 60p gives you more options when it comes to movement of high-contrast detail, but the reduced motion blur of 60p makes other shots look awfully rough, because our eyes don't run at 60p and we are used to the amount of motion blur you get at about 24 to 30 fps.
Then there is the misunderstanding about refresh rates (most displays run at 60 Hz or 50 Hz) versus frame rates. There is also the misunderstanding about playback via streaming services. Most, YouTube included, will slow 60p material down to 30 fps due to bandwidth limitations. You can be pretty sure that your 1080p material will be viewed by most folks at 30 fps, or 29.97, the standard for playback in every country with 60 Hz current, or at 23.976, or sometimes 24 or 25 in PAL countries (50 Hz electricity); only the occasional viewer will actually get 60 progressive frames per second. It's a bandwidth issue, and the folks who stream video are a lot more sensitive to bandwidth than they are to your 60 fps footage.
Then there are broadcast misunderstandings. Currently, all cable and broadcast companies in the US limit their HD content to 60i, which is the same as 29.97 fps interlaced. There are occasional exceptions for things like special sports broadcasts, but 99% or more of all broadcast video is 29.97 or 25 (PAL) interlaced, no matter how the program was originally produced.
When you talk about frame rates, the 60 can be misleading. 59.94 fields per second is the standard for 60i, and it is the same field rate and frame rate as 29.97 fps NTSC broadcast. If you send 30p original footage to a broadcaster, with almost no exceptions the material will be interlaced and sent out to the public at 59.94 fields per second. Many broadcasters will also downsample the 1080 frame size to 720 because bandwidth costs money. If you send 24p material to a broadcaster, in most cases the footage will be converted to 29.97 interlaced, commonly called 60i, with 3:2 pulldown added. Every motion picture that you see on TV that was not specifically made for television has this 3:2 pulldown interlaced format. Every one of them.
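The field arithmetic behind 3:2 pulldown can be sketched in a few lines (the function name and whole-frame field representation are my own for illustration; a real telecine alternates upper and lower fields rather than repeating whole frames):

```python
# Sketch of 3:2 pulldown: 4 film frames (24p) become 10 video fields,
# i.e. 5 interlaced frames, which is how 24p maps onto ~30i broadcast
# (the extra 0.1% slowdown to 23.976/29.97 is ignored here).

def pulldown_32(frames):
    """Expand progressive frames into fields using the repeating
    2:3:2:3 field cadence, commonly called 3:2 pulldown."""
    cadence = [2, 3, 2, 3]  # fields contributed by each film frame
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 4])
    return fields

fields = pulldown_32(["A", "B", "C", "D"])
print(fields)            # 10 fields from 4 film frames
print(len(fields) / 2)   # -> 5.0 interlaced video frames
```

Every second of 24p film (24 frames) comes out as 60 fields, which is why pulldown material plays at the normal speed on a 60i broadcast chain.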
The only practical reasons to shoot 60p are to get more data to track (which requires more resources), to get more motion samples for retiming, or to slow the footage down. In most cases there is no advantage.
The other thing to think about is that if you have low light and you have your phone, consumer camera, or even semi-pro camera set to shoot 60p, the shutter will probably be slowed to longer than 1/60 of a second (1/30, say) to give you a decent exposure, and consecutive frames will come out identical. In many low-light conditions you just end up with pairs of identical frames taking up space, using up the data rate, and slowing things down.
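A rough sketch of that trade-off (the function is hypothetical and simplified; real cameras vary in how they handle long exposures at high frame rates):

```python
# If low light forces the exposure time past the frame interval,
# a 60p camera can only capture distinct images at the shutter rate;
# the remaining frames are repeats. Illustrative arithmetic only.

def unique_images_per_second(fps, shutter_seconds):
    """Distinct images per second when each exposure lasts
    `shutter_seconds` (e.g. 1/30 for a 1/30 s shutter)."""
    return min(fps, round(1 / shutter_seconds))

print(unique_images_per_second(60, 1 / 60))  # -> 60: every frame unique
print(unique_images_per_second(60, 1 / 30))  # -> 30: identical pairs
```

At a 1/30 s shutter, a 60p stream carries 30 distinct images per second padded out to 60 frames, which is exactly the pairs-of-identical-frames situation described above.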
Some shots that I do that are going to involve a lot of visual effects and compositing are shot at 60 or even 120 fps to give me more options, but I always drop that HFR footage into a standard-frame-rate sequence or timeline to render the production master. I will usually add typical 24 or 30 fps motion blur to the shot as a last step to keep things looking consistent.
I hope this helps. Anyone that does video, even for a hobby, should take the time to learn about and experiment with frame rates. In my work I usually shoot 29.97 fps progressive. Even when I am using a film camera I run the film at 29.97fps unless the only place the film is going to be seen is a theater because I think that it just looks better. The choice is up to you but just remember, even if you do have true 60fps camera original, given the same data rate (bandwidth) you'll have half the image information that you would have in a 30fps frame. That's just how it works.
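That last point is simple arithmetic, sketched below (the 20 Mbps figure is just an example, not any service's actual bitrate):

```python
# At a fixed delivery bitrate, doubling the frame rate halves the
# bits available to encode each frame. Numbers are illustrative.

def bits_per_frame(bitrate_mbps, fps):
    return bitrate_mbps * 1_000_000 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {bits_per_frame(20, fps):,.0f} bits per frame")
```

Real codecs share bits unevenly across frames, but the budget per frame still shrinks as the frame rate goes up for the same total bandwidth.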
I shoot and edit a lot of footage with tracking. Shooting, even just setting up the shots, to say nothing of the processing, involves much more than can be conveyed in a one-page or one-paragraph answer. It's art and science, and it took me a long time to get to the point where I knew enough to make it work nearly every time. Don't expect to master it in an afternoon.
Do you still have any questions? Please mark the best answer here as the correct one if it helped you. If not, let us know what more we can assist you with.