I'm looking for some information regarding the final bitrate of alpha-channel-encoded video. We are producing transparent flvs. When I target a number, say 200kbps, in Adobe Media Encoder, the final encoded flv sometimes has a bitrate as high as 350kbps or 400kbps. And that's just the video.
When I encode without the alpha channel, the final flv video bitrate is basically exactly what I targeted.
I'm using GSpot and MediaInfo to look at the final flvs.
Why doesn't Media Encoder produce an flv at the target bitrate? Why the higher bitrate? Is there a way to set the final target for an flv? Is there better software?
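If you want a sanity check independent of GSpot or MediaInfo, the overall (container-level) bitrate can be derived directly from file size and duration. A minimal Python sketch, assuming you know the clip's duration (the function name and arguments here are my own, not from any tool):

```python
import os

def overall_bitrate_kbps(path, duration_seconds):
    """Overall container bitrate: total bits divided by duration.

    Note this includes audio and container overhead, so it will read
    slightly higher than the video-only bitrate that a tool like
    MediaInfo reports for the video stream alone.
    """
    size_bits = os.path.getsize(path) * 8
    return size_bits / duration_seconds / 1000.0
```

For example, a 10-second flv that is 250,000 bytes on disk works out to 200 kbps overall.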
Probably because you're not actually encoding "nothing" when creating transparency. An alpha channel is an additional 8-bit grayscale channel, and it probably can't be, or isn't, compressed as much as the RGB channels, or else the transparency would not be clean. The only way to decrease the file size is to lower the total bitrate, which will in turn have an impact on the quality of the "visible" video.
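As a rough back-of-the-envelope illustration of the point above (raw data only; actual codec behaviour varies, and the encoded overhead can be much larger if the alpha plane is compressed less aggressively than the colour planes): adding an 8-bit alpha plane to 8-bit-per-channel RGB increases the uncompressed data per frame by a third. The frame size here is an arbitrary example:

```python
def raw_frame_bytes(width, height, channels, bits_per_channel=8):
    """Uncompressed size of one video frame in bytes."""
    return width * height * channels * bits_per_channel // 8

# Hypothetical 640x480 frame: RGB vs RGB plus an 8-bit alpha plane.
rgb = raw_frame_bytes(640, 480, 3)   # 921600 bytes
rgba = raw_frame_bytes(640, 480, 4)  # 1228800 bytes
print(rgba / rgb)  # 1.333... -> the alpha plane adds a third more raw data
```

So even before compression the encoder has a third more data to fit into the same target, and if the alpha is kept nearly lossless to preserve clean edges, the encoded bitrate overshoot can easily exceed that.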
Thank you. That makes sense. I just wonder why Media Encoder isn't encoding the final flv to the target. What is it targeting? The final video output is sometimes double the video bitrate when I compare an flv with and without the alpha channel. Is this normal?