I just switched over to AME 2018 and tried a test encode. Simple compression-only H.264 transcode of a 24m clip, 4k input to 4k output. AME defaulted to the adaptive high bitrate preset. Encode time was 54m vs. 18m, and file size was 13.4GB vs. 1.9GB.
Forgive the noob question, but what goodness does adaptive high bitrate offer to make it worth these costs?
Does anyone from Adobe read these forum posts? The favor of a reply would be greatly appreciated.
I would like to know as well.. Adobe!
I believe you are comparing the adaptive high bitrate preset with the regular (non-adaptive) high bitrate preset, correct? The non-adaptive high bitrate preset uses a fixed bitrate of 10Mbps (12Mbps Max) regardless of your target output. This may be enough for HD output, but it may not be enough for 4k output, and image quality may suffer.
The new adaptive preset adjusts its bitrate according to the target output's frame size and frame rate. For example, for 4K 25fps output, the bitrate is set to 63.3Mbps (79.2Mbps Max). The higher the output frame size and frame rate, the higher the bitrate the adaptive presets use. This is why you got the bigger file size, which in turn results in a slower encode. But it should give you much higher image quality, suitable for 4k output. If this is too big for your material (for example, your video contains a lot of still images), or you don't mind somewhat lower image quality, you can try the adaptive medium bitrate or low bitrate preset.
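As a rough sanity check on the numbers in this thread, you can estimate file size as average bitrate times duration. This is only a ballpark sketch (VBR output fluctuates, and the 63.3Mbps figure assumes 4K 25fps as in the example above), but it lines up reasonably with the sizes reported for the 24-minute clip:

```python
def estimated_size_gb(bitrate_mbps: float, duration_s: float) -> float:
    """Estimated file size in GB from average bitrate (Mbps) and duration (s)."""
    # megabits -> megabytes (/8), megabytes -> gigabytes (/1000)
    return bitrate_mbps * duration_s / 8 / 1000

duration = 24 * 60  # the 24-minute clip from the original post, in seconds

# Non-adaptive High Bitrate preset: fixed 10Mbps
fixed = estimated_size_gb(10, duration)       # ~1.8 GB (thread reports 1.9GB)

# Adaptive high bitrate preset at 4K 25fps: 63.3Mbps target
adaptive = estimated_size_gb(63.3, duration)  # ~11.4 GB (thread reports 13.4GB)

print(f"Fixed 10Mbps:      ~{fixed:.1f} GB")
print(f"Adaptive 63.3Mbps: ~{adaptive:.1f} GB")
```

The adaptive estimate comes in under the observed 13.4GB because the encoder can burst toward the 79.2Mbps maximum on complex footage, but the ratio between the two presets explains the size difference you saw.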
You can also manually adjust the bitrate if you need to. And we still provide the regular "Match Source - High bitrate" and "Match Source - Medium bitrate" presets if you prefer those.
Hope this helps!
Great info there... So how does that method of export measure up against, say, the standard High Quality HD preset for 1080p?