The reason H.264 was deprecated in AE is that AE isn't very good at creating H.264 files compared to the Adobe Media Encoder. There are rather technical reasons for that, but they're not important right now. What is important is coming up with a proper workflow for you.
From the sound of things, it would be better for you to render in AE. However, you don't want to do H.264 out of AE.
So here's my suggestion (and this is the workflow I tend to follow). Render out of AE into a production codec. This can result in a large file if you go right into an uncompressed AVI (which isn't a big deal), but I tend to use CineForm, which is free from GoPro (another good choice is DNxHD, which is free from Avid). However, if you don't feel like installing third-party codecs, a QuickTime with the PNG codec is smaller, and QuickTime with the Photo JPEG codec is even smaller. (While not technically lossless, at high quality settings the Photo JPEG codec is visually lossless.)
Then you can use the file you just rendered with the intermediate codec in the Adobe Media Encoder to make your final deliverable.
Not only does this workflow allow you to utilize AE's superior rendering and AME's superior compression, but it also lets you experiment with various compression settings on your deliverable (multipass encoding, various bit rates, etc.) to get the highest quality video WITHOUT having to re-render your AE comp over and over - you're just working from a single video file! It's a much faster way to get the best product.
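AE and AME are GUI applications, but the "render once, encode many" pattern described above is easier to see in script form. Here's a minimal sketch using ffmpeg-style command lines — the file names, codec choices (DNxHD as the intermediate), and bitrate values are my own illustrative assumptions, not anything from AE or AME itself. The script only builds and prints the commands; it doesn't run them:

```python
# Sketch of the "render once, encode many" workflow, expressed as
# ffmpeg command lines. All file names and bitrates are hypothetical.

def intermediate_cmd(source, mezzanine):
    # One slow, high-quality render into a production codec
    # (DNxHD here, PCM audio) -- this is the expensive step you
    # only want to do once.
    return ["ffmpeg", "-i", source,
            "-c:v", "dnxhd", "-b:v", "185M",
            "-c:a", "pcm_s16le", mezzanine]

def delivery_cmd(mezzanine, bitrate, out):
    # Fast H.264 encodes from the mezzanine file. You can re-run
    # this with different settings without re-rendering the comp.
    return ["ffmpeg", "-i", mezzanine,
            "-c:v", "libx264", "-b:v", bitrate, out]

if __name__ == "__main__":
    # The one expensive render:
    print(" ".join(intermediate_cmd("comp_render.avi", "mezzanine.mov")))
    # Cheap experiments against that single mezzanine file:
    for rate in ["4M", "8M", "12M"]:
        print(" ".join(delivery_cmd("mezzanine.mov", rate, f"final_{rate}.mp4")))
```

The point of the sketch is the shape of the loop: the comp is rendered exactly once, and every compression experiment reads from that one intermediate file.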
That's the entire reason for the thread: to find a solution to a poor workflow. But that poor workflow is exactly what you're recommending. I don't understand this. Your recommendation is precisely the inefficient workflow I'm trying to fix; it's how I have to do it now. Render lossless, then when it's done, render the render to compress it? Doesn't that sound a bit silly?
Why can't AME render projects that use plugins like Element 3D? Anything I render that puts my CUDA cores to use can't be done with AME, so what's the point of even having higher-end video cards if they can't be used to render? Doesn't this entire conversation sound counter-intuitive? I'm trying to wrap my head around a reasonable explanation. GPU rendering is what the video card is for.
So I shouldn't use the codec built into AE that accomplishes what I need; instead I should spend another 20% of my resources rendering a video twice, when simply enabling the deprecated codec resolves the issue. You said there are technical reasons that somehow aren't important? Well, I beg to differ. I'd like to know why Adobe recommends against the obvious solution and instead points me to a workaround that is inefficient and adds time and resources.
I'm not trying to be snotty; I'm trying to understand why the tools do not work as intended and why I'm forced to duplicate effort.
And FYI, I'm not interested in 'experimenting.' I produce 6 to 8 videos per week, and while most are relatively short, my goal is to optimize my workflow, not use workarounds to accomplish what I would expect to work out of the box.
According to this solution, I've got a breakdown and failure with BOTH After Effects and AME: don't use the industry-standard render option, and don't use AME as I'd expect to, because it doesn't work right either?
Okay, then provide details about:
Your OS: precise version number right down to the last decimal point
Your AE version: the same precision as above applies
Your nifty card, and the details about it; I'm talkin' driver version numbers and the like
Otherwise, this discussion isn't going anywhere.
What difference do my machine specs make? I'm asking why the software does not work as it should. How are my computer specs relevant in the slightest when I'm asking why After Effects recommends against the industry-standard codec, while the supposed solution — rendering in AME — does not work either? And it's not just me who can't render projects that use Element 3D; I've not seen anyone who can do it. It appears to me to be a failure of the software. My 'nifty' card makes no difference; it's on the supported video card list. The reason this conversation "isn't going anywhere," as you put it, is that nobody has answered the question of why the software doesn't work.

And I did list my video card, and my CPU, RAM, and hard drives are all plenty powerful. Just to appease you: I'm using a 2600 CPU, 32GB of RAM, two Vertex 4 SSDs in RAID 0 at 900 MB/s read/write for the system, and a Vertex 4 SSD for project files and storage at 450 MB/s read/write. The software is CC, the latest version pushed from Creative Cloud, on Windows 7 64-bit. None of which is relevant to the question of why AE recommends against the industry-standard codec and why the "best practice" is rendering projects twice, because neither AE nor AME works correctly to do it.
Do you not agree that duplicating effort and adding more processes is not a viable, reasonable answer?
I'm trying to understand WHY it doesn't work right. If it's a fault of the software, OK, I can accept that; I just want to understand why it exists.
I must be really stupid here. It sounds like you are asking how to set up the Render Queue or AME to use a specific render setting automatically. This is as simple as can be in the Output Module and is a basic skill everyone should have. Just open the Output Module settings window and make your 'special' setting the default. I do this all the time and change it based on client requirements. It takes about 15 seconds. Here's what the panel looks like:
I've chosen a DI (Digital Intermediate) JPEG 2000 with no audio as the default in this example because that is what one of my clients wants for the projects I have been working on this week. Next week I'll probably switch it to a Blackmagic Design 10-bit output because that's what the next client in the queue wants.
The procedure is almost identical in the Adobe Media Encoder.
The presets in AME are "industry standard" and produce excellent results. Lossless (the default for the Output Module) is also an industry-standard production codec. You can create any setup you want and then define it as the default. I'm not seeing the problem, unless you've tried this and it does not work.