AGREED! Please help us!! I'm trying to figure out this 'bitrate' stuff, myself.
According to the manual...
"Bitrate Specifies the number of megabits per second of playback for the encoded file. (This setting is available only if
you select CBR as the Bitrate Encoding option.)"
Why does changing this setting so dramatically reduce file size?
And the preview isn't affected?
And why is this setting available when I have VBR selected? (contrary to the above manual quote)
I want to say that bitrate only affects the way data is released over an internet connection, but.... idk
Every video file is in fact composed of a series of separate still images played back at very high speeds (usually 24fps for film, and 30fps for American TV). Since we are dealing with digital files and not film, the video stream is encoded as data. You can't have an unlimited amount of data to describe each video stream, or you would run out of storage space, so we need some method of allocating the amount of data used to display a video file. This is where bitrate comes in. All digital video files have a bitrate setting.
Bitrate (usually measured in KBPS) tells you the amount of data that is being used to display each second of video. Video files with higher bit rates have more data to describe them and are therefore higher quality. But since more data is being allocated per second of video the files are larger as a result.
Some common video types and bitrates:
DV25(DvCam, MiniDV, etc.): 25,000 KBPS
DVD: 5,000 KBPS
VCD (MPEG-1, approximately VHS tape quality): 1167 KBPS
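Since bitrate is just data per second, you can estimate the size of an encoded file directly from it. Here's a quick sketch (the function name and the 60-second clip are just my example, not anything from Media Encoder):

```python
def file_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate encoded file size in megabytes.

    bitrate (kilobits/s) * duration (s) = total kilobits;
    divide by 8 to get kilobytes, then by 1000 for megabytes.
    """
    return bitrate_kbps * duration_s / 8 / 1000

# A 60-second clip at DVD bitrate (5,000 KBPS):
print(file_size_mb(5000, 60))  # 37.5 MB
```

This is also why lowering the bitrate setting shrinks the file so dramatically: file size scales linearly with bitrate, regardless of what the preview shows.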
When encoding or transcoding files, there may be two choices for how it is done (some formats don't allow for both): CBR (Constant Bit Rate) and VBR (Variable Bit Rate).
CBR (Constant Bit Rate): you set a single target bit rate and the compression program (Media Encoder in this case) compresses your video to match that target. Data that will not fit inside this bitrate target is discarded; this is why video loses quality when it is compressed.
VBR (Variable Bit Rate): is a more flexible approach. It lets you set a target bitrate, a minimum value, a maximum value, and an amount of deviation from the target. The compression program then tries to compress your video while staying above your minimum and below your maximum. The basic idea is that not every second of video requires the same amount of data to display, and therefore each second should be managed differently.
VBR also usually requires two passes to work. On the first pass the program analyzes the video file, and on the second it uses the data gathered during that analysis to compress the video.
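To make the two-pass idea concrete, here's a toy sketch of how an allocator might hand out bits after the analysis pass. This is purely illustrative, not how VP6 or any real encoder works internally; the complexity scores and function name are invented for the example:

```python
def allocate_vbr(complexity, target, minimum, maximum):
    """Toy two-pass VBR allocation (illustrative only).

    complexity: per-second 'difficulty' scores from an analysis pass.
    Bits are handed out in proportion to complexity, then clamped
    to the min/max limits, so the average stays near the target.
    """
    avg = sum(complexity) / len(complexity)
    rates = []
    for c in complexity:
        # busier seconds (pans, zooms) get proportionally more bits...
        r = target * c / avg
        # ...but never outside the configured bounds
        rates.append(max(minimum, min(maximum, r)))
    return rates

# A quiet talking-head second (0.5) vs a fast pan (2.0):
print(allocate_vbr([0.5, 1.0, 2.0, 0.5], target=750, minimum=400, maximum=1200))
# → [400, 750.0, 1200, 400]
```

Notice how the pan would want 1500 but gets clamped to the 1200 maximum, and the quiet seconds get floored at the minimum.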
haha! Hot dangid, you are the bomb! That was fabulous
I know about VBR encoding. Still, the "Bitrate Variability" setting remains a mystery to me, as the "deviation from the target" is already explicitly set by the minimum and maximum bitrates. If it sets the deviation it claims to set, it would either override the minimum/maximum settings (which would make no sense, as it would make those settings obsolete) or be overridden by them (which would make Bitrate Variability itself obsolete).
So why would I need that setting? Does it do anything else?
Also, I haven't encountered this setting for any other codecs (or on any other encoders). VBR usually only gives minimum bitrate, maximum bitrate (both in kb/s usually btw, not in percent) and the target bitrate (some even don't ask that!)
I hope someone can enlighten me.
Here is a little more background info concerning selecting a bitrate for video delivered over the Web:
Video bit rate
One of the principles of goal setting is to "Begin with the end in mind". In this case it'll be very hard to give good recommendations because the end is not defined. So I'll just make a few assumptions and you can correct me as needed.
First, I'll assume that since you are converting to Flash, you want to deliver this video over the Internet. If that's true, then we'll have to make some assumptions on the Internet connection download speeds of your potential viewers. Let's just say that most have at least a 1.5Mb connection or faster.
OK, that would mean that a video bitrate of half that should usually provide a video download that is not interrupted by buffering (most of the time anyway). So assuming a video bitrate of 750kbps, what would the optimum display dimensions be?
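That half-the-connection rule of thumb is simple enough to write down (the function name is just mine for illustration):

```python
def safe_bitrate_kbps(connection_mbps: float) -> float:
    """Rule of thumb from above: target roughly half the viewer's
    connection speed to leave headroom and avoid buffering pauses."""
    return connection_mbps * 1000 / 2

# Viewers on a 1.5Mb connection:
print(safe_bitrate_kbps(1.5))  # 750.0 kbps
```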
Before we decide, here's a little info about bitrate. For highest quality playback, the video bitrate is tied directly to the display dimensions. That is, the larger the display, the more incoming data is required to properly display the video. Think of bitrate in terms of a can of paint. If you have 1 quart of paint, you might be able to do a very nice job on a 32 X 24 foot area. But if you try to stretch that same amount of paint out over a 64 X 48 foot area, the coverage will not be nearly as good and you get poor results.
In the same way, a video displayed at 640 X 480 pixels will require 4 times the bitrate as a video displayed at 320 X 240 pixels to produce the same quality. So for example a video with a bitrate of 100kbps, displayed at 160 X 120 will produce the same quality results as a video with a bitrate of 1600kbps if displayed at 640 X 480.
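That proportionality is easy to apply to any target size. A quick sketch, assuming (as described above) that the bitrate needed for equal quality grows with the pixel count:

```python
def scaled_bitrate(ref_kbps, ref_w, ref_h, new_w, new_h):
    """Scale a known-good bitrate to new display dimensions,
    keeping bitrate-per-pixel (and so perceived quality) constant."""
    return ref_kbps * (new_w * new_h) / (ref_w * ref_h)

# 100 kbps looks acceptable at 160 x 120; what does 640 x 480 need?
print(scaled_bitrate(100, 160, 120, 640, 480))  # 1600.0 kbps
```

Four times the width and height means sixteen times the pixels, hence the jump from 100 to 1600 kbps in the example above.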
So to boil it all down, video bitrates of 750kbps, even up to 1000kbps, can usually be delivered over the Internet on most high-speed connections. Higher bit rates may work for really fast connections but will cause problems for viewers with slower connections. Video display size has a direct bearing on the final quality. In the 750 to 1000kbps range, display width should be kept around 450 or 500 pixels max (and whatever height the aspect ratio calls for). Yes, it can be displayed larger, but the quality will suffer.
Sounds like your audio settings are fine, especially for Internet delivery.
As for framerate, maintain the original raw video framerate for best results. So if the video was shot at 24fps, leave it.
As for video converters, do you have the Flash 8 Video Converter? It works just fine for video to be delivered over the Internet. Remember, you are taking a Cadillac version of video (h.264 HD) and stuffing it into a Chevy body to get it to work over the Internet.
Best of luck!
Another Ninja? I knew it... Adobe is run by little dangerous men in black pyjamas.....*pullshurikenfromshoulder* Ouch!
As for the explanation: Thanks, that sums it up nicely. Still, the meaning of the setting "Bitrate Variability" is unclear to me.
Think of variable bitrate as stepping down on the gas when driving up a hill.
Certain parts of a video will require more simultaneous pixel changes, a zoom or pan for example, compared to shooting a man sitting behind a desk just talking. A constant bitrate would work fine for that situation, but that same bitrate could be overpowered during a pan, when every pixel on the screen must change. Wouldn't it be nice if you could ... step on the gas and get a little more power, or in this case a larger data stream, to cover those times when so many pixels are all changing at the same time.
The quality of a constant bitrate video could deteriorate at times if not enough data (the bitrate) was available to record the action.
And here's another view on it:
I know all this. Having encoded videos using various encoders and output-formats for the past ten years I really ought to....
What I'm unclear about is the specific setting "Bitrate Variability" in the VBR-settings of Adobe Media Encoder when using On2-VP6 and two VBR-passes to encode an FLV. I have never encountered this setting anywhere else, and I'm totally in the dark what it really does. As I said, the variability of the bitrate when using VBR is usually determined using the minimum- and maximum bitrates (which are present as usual in this case). Hence the ADDITIONAL setting called "Bitrate Variability" does not make sense.