Curious...just came over recently from FCP and I don't remember Compressor giving me an option to adjust the MINIMUM bit rate when exporting HD projects for DVD. I figured I'd just leave it where the preset has it (at 2.8 Mbps minimum)...BUT...then I hovered over it and saw an interesting popup explaining what it does, which kind of confuses me. It says:
Higher values set a higher minimum quality, but reduce quality of more difficult scenes.
The first part makes sense so I figured I'd raise it...but then the second part makes me think I should keep it low. Sort of confuses me. Any thoughts??? FYI: My projects are Weddings with a fair amount of action, etc.
Normally these are my settings, unless I can't fit the project onto the disc, in which case I adjust:
For DVD:
CBR at 7.5 Mbps, or for VBR:
Target Bit Rate: about 6.8 or 7 Mbps
Max Bit Rate: usually 8 Mbps
For Blu-ray:
Target Bit Rate: 25 Mbps
Max Bit Rate: 30 Mbps
Unless someone tells me I should raise these settings for better-quality output (if the project size allows, of course), these are what I've been using to get maximum quality out of my videos without jeopardizing playback by exceeding the bit-rate max for each medium (which I think I read is 10 Mbps for DVD and 40 or 50 Mbps for Blu-ray). I was just thrown by the minimum bit rate description above.
Thank you in advance for your help!
Some encoders offer the MIN bitrate setting for VBR, others don't. Even in Encore, DVD offers it and Blu-ray does not. Think of Variable Bit Rate encoding this way: if you have a very long video, the bitrate has to be low to fit the project on disc, so quality will suffer. But VBR can allocate the bitrate on an "as-needed" basis by first examining the video and determining which scenes need a higher bitrate (action) and which scenes might look fine at a much lower bitrate (static scenes, less detail, etc.), then encoding based on that analysis.
With an example of 3-5-7 VBR settings, a simple white on black title screen is going to be encoded at the minimum "3", a high-energy dance segment will be encoded at "7", and overall, the "average" will be "5". Because of this distribution of quality where it is needed most, the video should look better than if it had been encoded at CBR 5 for the whole thing.
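That allocation idea can be sketched in a few lines of Python. This is a toy model of my own, not what any real encoder does internally: first-pass "complexity" scores scale each scene's rate up or down from the average, clamped to the MIN and MAX of the 3-5-7 example.

```python
# Toy two-pass VBR allocation (illustrative only, not a real encoder):
# scene complexity scores from a first pass scale the average rate per
# scene, clamped to the MIN/MAX of a 3-5-7 setup.

MIN_RATE, AVG_RATE, MAX_RATE = 3.0, 5.0, 7.0  # Mbps

def allocate(complexities):
    """Per-scene Mbps for equal-length scenes, scaled by relative complexity."""
    mean_c = sum(complexities) / len(complexities)
    return [min(MAX_RATE, max(MIN_RATE, AVG_RATE * c / mean_c))
            for c in complexities]

# title card, talking heads, dance floor
print(allocate([0.6, 1.0, 1.4]))  # -> [3.0, 5.0, 7.0], averaging 5
```

The simple title card lands at the minimum, the dance segment at the maximum, and the overall average stays at 5 so the disc budget still holds.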
My thoughts and experience are this - for shorter videos, why use VBR at all? If you have a video of an hour or less, for instance, you have the opportunity to encode the WHOLE THING at 8, which is tops! Why use VBR, which would then encode some scenes at a lower quality? Just use CBR 8 and get the best quality for all of it.
With VBR 2-pass, you are likely doubling up the encoding times without any benefit in many cases.
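The disc math backs this up. A quick back-of-envelope check (the audio rate here is my assumption of a typical AC-3 track, not from the thread):

```python
# Back-of-envelope: does an hour at CBR 8 Mbps fit a 4.7 GB DVD-5?

def size_gb(bitrate_mbps, minutes):
    """Stream size in decimal GB: megabits / 8 bits-per-byte / 1000."""
    return bitrate_mbps * minutes * 60 / 8 / 1000

video = size_gb(8.0, 60)     # 3.6 GB of video
audio = size_gb(0.448, 60)   # ~0.2 GB of AC-3 audio at a typical 448 kbps
print(round(video + audio, 2), "GB used of 4.7 GB")  # -> 3.8 GB used of 4.7 GB
```

So an hour-long title at CBR 8 fits a single-layer disc with room to spare, and VBR would only be taking quality away from some scenes for no reason.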
About Blu-ray, I believe the MAX allowable is 40 Mbps, but to me, H.264 at 20 looks very good, so 30 might be a good MAX. Why push things and risk playback issues? There is a point of diminishing returns - the video can only look "so good," and there comes a point where any difference from a higher bitrate cannot be discerned by eye.
Just my opinion
Safe Harbor Computers
The whole notion of a minimum bitrate is crazy unless you have specific broadcast requirements that require you to pad out video to keep the connection alive.
If the encoder can express the image with zero loss in less than the minimum bitrate why would you pad it with zeros to get the bitrate high enough to meet the min?
The idea of a nominal/average bitrate is easy to understand.
The Maximum bitrate is often misunderstood though.
On some encoders the max rate sets the wiggle room (max - average) that can be used if there is a burst of required information.
On other encoders it sets the maximum rate at which the video buffer is allowed to fill as per the specifications. eg Blu-ray is capped at 40Mbps. The Video encode itself MAY EXCEED THAT 40Mbps for a split second but will only LOAD into the video buffer at up to 40Mbps.
For example, if you set a constant bitrate of 20 Mbps and no maximum rate, then when the video first starts loading from the Blu-ray disc into the video buffer, it loads faster than the Blu-ray maximum of 40 Mbps. Thus you MUST have a maximum rate defined (for Blu-ray compliance) even though it's a constant bitrate. Some encoders do this for you, but some leave it up to the user to get right (and thus should provide both a max slider and a constant slider).
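A simplified model of that buffer behavior (numbers and simplifications are mine): the disc fills the player's buffer at up to the spec cap, while playback only drains it at the constant encode rate, which is exactly why the cap has to be declared even for a CBR encode.

```python
# Simplified Blu-ray buffer model: fill at the capped transfer rate,
# drain at the constant encode rate. The gap between the two is how the
# buffer builds up at the start of playback.

FILL_CAP = 40.0  # Mbps, Blu-ray's maximum rate into the video buffer
DRAIN = 20.0     # Mbps, a CBR 20 encode being played back

def buffer_megabits(seconds):
    """Buffer occupancy after `seconds` of filling at the cap while draining."""
    return (FILL_CAP - DRAIN) * seconds

print(buffer_megabits(2))  # -> 40.0 megabits buffered after two seconds
```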
Jeff - absolutely agree. It's very rare that you need to use all 40 Mbps available to you. 20 and 30 can look great (depending on the detail in the scene and how much motion there is).
VBR 2-pass does have value if you're trying to get down to lower rates like 10 Mbps to fit a long title on a disc. If you don't, action shots with scene cuts will leave artifacts as they are bit-starved.
I'm actually a big fan of Constant _Quality_ (CQ) vs CBR or VBR. You then know what quality you're going to get on every frame and from experience will know how big it will come out to be. CQ is also considerably faster to render because you have no rate control computations to do. x264pro has a CQ option for this very reason.
hope that helps.
This all does make sense and I appreciate your responses. However, I still don't totally understand my main question: why does it say what it does about minimum bit rate (see below), and what should I do? I don't want to reduce the quality of more difficult scenes, but I also don't want my minimum quality to be so low that the "simple" scenes look bad. Yes, CBR is great, and I use it most of the time...but sometimes I have to squeeze a bit more onto a DVD and use VBR to get a little extra out of my video at a smaller size. In that case, what should I do about the minimum bit rate? Leave it at 2.8 and risk a lower minimum quality (if that even affects the final video), or raise it up to 4 or 5 or whatever, but risk apparently reducing the quality of more difficult scenes? Why would it even do that? Doesn't make sense.
Again, it says:
Higher values set a higher minimum quality, but reduce quality of more difficult scenes.
Seems like a catch-22 to me.
When you first called attention to this tooltip, I thought it was erroneous. But the deal is that if you increase the minimum bitrate (without also increasing the max), then the easier frames/scenes soak up more of the bandwidth, leaving fewer bits to spend on the difficult content. It becomes clearer if you take it to the extreme, setting Minimum to 39Mbps, Target to 40, and Max to 41--you've effectively set it up so VBR is forced to act like CBR.
Given the infinite range of content that you could have within a sequence, I don't think there's a one-size-fits-all answer to your question as to what you should set it to. Try starting with a system preset that's in the right ballpark and review the results. If the quality of the easy scenes is not adequate, bump up the minimum bitrate. If you can't find the right balance of quality in the easy and hard scenes, then you probably need to increase the target bitrate, and with it the min and max.
Mark wrote (correctly) - "Given the infinite range of content that you could have within a sequence, I don't think there's a one-size-fits-all answer to your question as to what you should set it to. Try starting with a system preset that's in the right ballpark and review the results. If the quality of the easy scenes is not adequate, bump up the minimum bitrate. If you can't find the right balance of quality in the easy and hard scenes, then you probably need to increase the target bitrate, and with it the min and max."
This is why an Encoding Engineer is a Profession unto itself. It takes years of experience to optimize a sequence to fit inside the delivery parameters and at the same time look the best it can in that given constraint. We haven't even started talking about modifying the settings over time. - Currently the whole sequence is treated the same way.
At least you can drop multiple encodes onto the same timeline inside Encore to be able to break up a title into scenes with different encode settings. For example, title sequences and credits need different settings vs the movie itself. It's amateurish to do the whole movie at one setting.
About the warning message - Higher values set a higher minimum quality, but reduce quality of more difficult scenes.
This makes sense if you consider it this way - there is a finite amount of space available on the disc. Using CBR, we could dial in the data rate to fill the disc, with all scenes of the video being encoded at the same data rate regardless of the complexity of the scene. If there are a lot of scenes that could get away with a lot lower compression rate and still look good, that is wasting space that could potentially be used to encode some scenes at a much higher rate if they need it.
With VBR, we take the CBR rate from the previous example and make that the AVERAGE, because we know that rate will fill the disc. The more scenes that can be encoded at something below the AVG, the more "bandwidth" is "left over" for the complex scenes to be encoded higher than the AVG. If the MIN is 4, but you have scenes that would look just fine at 3, then you are unnecessarily using up space that could otherwise have gone to other scenes "more worthy of the bandwidth," which would be encoded above the AVG setting.
If VBR is set to 3-6-8, and you have several scenes that will actually look good at the 3 setting, that would provide more leftover space than a 4 MIN setting, therefore allowing for more scenes to push closer to the full 8 setting if they need it, because the space wasn't already used up encoding stuff at 4 that could've been encoded at 3.
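That accounting can be made concrete with a toy example (the durations and rates here are mine): the total bit budget is fixed by the AVG, so every extra bit a higher MIN spends on easy scenes comes straight out of the headroom for the hard ones.

```python
# Toy budget model for a 3-6-8 style VBR encode: total bits are fixed by
# the AVG, so raising the MIN on easy scenes shrinks what is left for the
# difficult scenes (whose rate is still capped by the MAX, 8 here).

AVG = 6.0  # Mbps; total budget = AVG * total duration

def hard_scene_rate(easy_min, hard_min, min_rate):
    """Average Mbps left for hard scenes if easy scenes use only min_rate."""
    total = AVG * (easy_min + hard_min)      # whole-disc bit budget
    leftover = total - min_rate * easy_min   # after paying for easy scenes
    return leftover / hard_min

print(hard_scene_rate(30, 30, 3.0))  # MIN 3 -> hard scenes can average 9.0
print(hard_scene_rate(30, 30, 4.0))  # MIN 4 -> hard scenes drop to 8.0
```

With a MIN of 3, a half-easy/half-hard title leaves the hard scenes a full extra Mbps of headroom compared to a MIN of 4, which is exactly the trade the tooltip is warning about.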
OK, done rambling
The problem with that logic is that VBR normally only adjusts the rate across the surrounding 20 frames (i.e. ±10).
It's not as if it will give you low bitrates for the head shots and high bitrates for the car crash. They will both be at the average rate, but the car crash will be able to spread its bits across ±10 frames a little better.
VBR only matters when you do a scene cut (large delta change in static detail) or scenes with a lot of motion.
In general CBR or CQ will serve you just fine. CQ with Max Rate (like in x264pro) gives you the best of both worlds because you get the quality you want but the datarate is clamped to the Max Rate so that you don't go over the bit budget.
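A sketch of the CQ-with-cap idea (my illustration, not x264pro's actual rate control): each frame takes whatever rate its quality target demands, clamped to the max so the bit budget is never exceeded.

```python
# Constant Quality with a max-rate clamp (illustrative): frames get the
# bits their quality target asks for, unless that exceeds the cap.

MAX_RATE = 30.0  # Mbps ceiling to stay inside the bit budget

def cq_with_cap(demanded):
    """Clamp the per-second rates a quality target demands to MAX_RATE."""
    return [min(r, MAX_RATE) for r in demanded]

# easy scene, explosion, normal scene
print(cq_with_cap([12.0, 45.0, 18.0]))  # -> [12.0, 30.0, 18.0]
```

Only the burst that would blow the budget gets touched; everything else keeps exactly the quality you asked for.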
Ah, I've got to assume then that the Minimum is setting how much can be stolen from a frame to help before and after frames that need more bits. So it's biasing the rate control algorithm.
I think it's a useless value since it messes with the rate control in an unknown way.