Because AE is a compositing app and Premiere is a non-linear editor (NLE). The playback engines are completely different, as is the way pixels, layers, motion, and just about everything else is handled. Until the basic architecture of AE is revamped, you're stuck learning to edit to audio the way it's been done for about 80 years: setting markers, reading the waveform, holding down the Ctrl/Cmd key while scrubbing to find syllables or beats, or any of a half dozen other techniques for syncing cuts to audio.
Can someone explain to me exactly why Premiere can play the audio along with the video and AE still can't?
I like Rick's longer answer:
Sony Vegas, Premiere Pro, Final Cut, and Avid are all NLEs (non-linear editors), and they are specifically designed to play back a video stream. With any of them, if you stack enough layers or effects on the video, they have to render a new video stream, performing pixel-based calculations for every pixel in the stack. This rendering, especially for HD sources or complex plug-ins, can take quite a bit of time.
After Effects, Flame, Fusion, and Shake are all pixel-based image-processing applications that act very much like Photoshop. They calculate the value of every pixel in every frame, come up with new pixels, and then play those pixels back as a video stream. More importantly, AE and the other pixel-based compositing apps always work internally with completely uncompressed pixel data. NLEs rely on codecs and, in some cases, hardware to play back the video. It's an entirely different way of working with moving images.
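To make "pixel-based calculations for every pixel" concrete, here's a minimal sketch of the classic "over" operation that any compositor evaluates per pixel, per channel, per layer, per frame. This is an illustration of the general technique, not After Effects' actual implementation; the function name and value ranges are my own choices.

```python
def over(fg, fg_alpha, bg):
    """Composite a foreground channel value over a background value.

    fg, bg    -- straight (non-premultiplied) channel values in 0.0-1.0
    fg_alpha  -- foreground coverage in 0.0-1.0

    A compositor runs this kind of math for every channel of every
    pixel in every layer of the stack, for every frame -- which is
    why stacking layers multiplies the render cost.
    """
    return fg * fg_alpha + bg * (1.0 - fg_alpha)

# A 50%-opaque white pixel over black lands halfway between the two.
halfway = over(1.0, 0.5, 0.0)
```

For a 1080p frame that is roughly 2 million pixels times 4 channels of this arithmetic per layer, which is why each added layer or effect visibly lengthens the render.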
In After Effects you enable a preview by loading a batch of frames into RAM; only then is the video stream played back. You start the process with the 0 key on the numeric keypad, not with the spacebar as in nearly every NLE ever created. The length of the preview depends entirely on how much free RAM you have available, and it takes time to generate those new pixels. The more layers, effects, and calculations that need to be performed, the longer the RAM preview takes to process. There's currently no way around this rendering time.

A modern NLE will handle an amazing number of video streams simultaneously, but as soon as you exceed the capability of the system, you're stuck with a render. Given the same number of calculations, most NLEs actually take a little longer than After Effects to produce the same kind of effects. OpenGL and other GPU acceleration help many NLEs achieve higher performance, but that has yet to be implemented in a pixel-based compositing app. The sad truth of the matter is that if you want to do compositing in any of the available compositing apps, you have to wait for renders. They are getting better: memory management and efficiency are improving, and GPU-accelerated effects are being added, but for now, that's about as good as it gets.
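Because the preview holds uncompressed frames, a back-of-the-envelope calculation shows why its length depends entirely on free RAM. This is a rough sketch with assumed, illustrative numbers (4 channels at 16 bits per channel), not a formula taken from Adobe's documentation.

```python
def preview_seconds(free_ram_gb, width, height,
                    channels=4, bytes_per_channel=2, fps=24):
    """Estimate how many seconds of RAM preview fit in free memory.

    Assumes uncompressed frames: width * height pixels, each with
    `channels` channels at `bytes_per_channel` bytes (the defaults
    model RGBA at 16 bits per channel). Illustrative only.
    """
    frame_bytes = width * height * channels * bytes_per_channel
    frames = (free_ram_gb * 1024**3) // frame_bytes
    return frames / fps

# With these assumptions, 8 GB of free RAM holds only about 20-odd
# seconds of 1080p preview -- a single 1080p frame is ~16 MB.
secs = preview_seconds(8, 1920, 1080)
```

Compare that to an NLE playing a compressed stream, where a whole second of HD footage can be smaller than one of these uncompressed frames.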
I hope this helps. As long as you use After Effects to create shots and don't try to make it do the work of an NLE, you should be fine. Movies come from NLEs; amazing shots come from AE.
- Rick Gerard