I'm trying to make sense of the different results I got from exporting the same HD sequence to the same target bitrate (25 Mbps) but with different settings. There is a world of difference between these results and I cannot understand why. The only settings that changed were the bitrate mode (VBR 1-pass, VBR 2-pass, or CBR) and the encoding method (software or hardware).
Here are the average bitrates it generated:
- VBR 1 pass (hardware): 8.2 Mbps
- VBR 1 pass (software): 22 Mbps
- CBR (hardware): 8.5 Mbps
- VBR 2 pass (software): 22 Mbps
Considering I used a target bitrate of 25 Mbps (max bitrate of 30 Mbps for VBR 2-pass), the software-encoded bitrates make sense to me; this was expected. The hardware-encoded bitrates (CBR and VBR 1-pass), however, don't.
Visually, all these exported videos look identical when compared at 200% in Premiere, meaning 25 Mbps was complete overkill for these jobs, but I'd still like to understand what is happening. What can explain such a huge gap between the target bitrate and the end result during hardware encoding? Is it ignoring the user-set target bitrate and computing its own recipe?
My graphics card is a Quadro K4200 if that matters.
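For anyone who wants to double-check numbers like these, the average bitrate can be recovered from file size and duration alone. A minimal Python sketch; the durations and sizes below are illustrative placeholders, not measurements from the actual exports:

```python
# Sanity check: average bitrate = bits / seconds.
# All concrete values here are illustrative placeholders.

def avg_bitrate_mbps(file_size_bytes: float, duration_s: float) -> float:
    """Average bitrate in Mbps, from file size in bytes and duration in seconds."""
    return file_size_bytes * 8 / duration_s / 1e6

# A 10-second clip at a true 25 Mbps average would weigh about 31 MB:
expected_bytes = 25e6 * 10 / 8
print(expected_bytes)                          # 31250000.0 bytes
print(avg_bitrate_mbps(expected_bytes, 10))    # 25.0

# The 8.2 Mbps hardware result over the same 10 s implies only ~10 MB:
print(8.2e6 * 10 / 8)                          # 10250000.0 bytes
```

Comparing the file size Media Encoder actually wrote against the size the target bitrate implies is a quick way to see how far off the rate control landed.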
Can you please share your system info? Like OS and version, CPU, RAM, GPU vRAM, Pr version? Thanks. Also a screen grab of your issue will be helpful.
FAQ: What information should I provide when asking a question on this forum?
Here are the specs of that computer:
Windows 10 Enterprise (1803)
Media Encoder 22.1.1 (Build 25)
Premiere 22.1.2 (Build 1)
16 GB RAM
Intel i7-6700 3.4 GHz CPU
Quadro K4200 GPU
As for a screen grab, I'm not sure it adds much to the discussion. The issue is the actual exported bitrate vs. the target bitrate set in the export settings.
These exports were all done in Media Encoder with a target bitrate of 25 Mbps. Two were hardware encoded, the other two software.
The resulting files should always be close to the Target Bitrate without exceeding the Maximum Bitrate.
With Performance set to Hardware instead of Software, the encode times should be faster while the results are similar if not identical.
The difference between Variable Bitrate and Constant Bitrate is temporal, so spatial differences may not be obvious to the naked eye.
For delivery of a first-generation HD H.264 file to social media like YouTube and Vimeo, 1-pass VBR at 16 Mbps should be fine. For UHD, that comes up to 2-pass VBR at 40 Mbps.
For source footage in an edit, 35 Mbps is the professional floor.
Warren, that's what's puzzling. How do you explain the differences I got in file size between software and hardware encoding? So far, I can't; I don't understand what is happening. These exports have a lot of static graphical elements, so there's that, but why am I getting an exported bitrate so far off the target when I hardware encode with a Quadro K4200? I'm going to test this again on a different system later on.
This is only anecdotal, because I can't speak to your specific question with any technical confidence; I honestly don't know. It does seem like the hardware encoder is being a lot more aggressive with macroblocking and temporal compression on the unchanging or little-changing parts of your sequence. My anecdote is that I personally only use hardware encoding for review-quality exports. Particularly for VFX-related work (not to mention the number of support-related issues), I just find that hardware encoding is a lot more error-prone than software, both in terms of actual render-stopping errors and render glitches you find in QC.
Phillip, after doing some more testing, it seems that the much lower file size is indeed due to aggressive compression of the static elements in the sequence. Interestingly enough, this only happens with the Quadro. I tested the same configuration (same sequence, 25 Mbps VBR 1-pass, hardware vs software encoding) on a different system (Windows 10 Pro 21H2, i9-10900K, 64 GB RAM, RTX 3080 GPU) and was not able to reproduce this odd behavior: the exported bitrates were more or less the same whether hardware or software encoded.
I then tried a different sequence on the first system with the Quadro, one with plenty of movement and no static graphical elements. There was still a difference between hardware (21 Mbps) and software (25 Mbps) encoding, but this time it was much less pronounced.
This leads me to think there is some technological difference between the Quadro and GeForce cards that could explain such heavy bitrate modulation when encoding with the Quadro. Either that, or Media Encoder interacts differently with these two graphics cards.
I'd be very curious to hear from other Quadro owners if they experience similar behavior when hardware encoding in Media Encoder.
Thanks for the offer @Warren Heaton, but this will not be possible; company policy about data confidentiality doesn't permit it. However, if you feel so inclined, you could use this simple setup to try to reproduce the issue, as this is mostly what I tested on this afternoon, and I was consistently able to get a bitrate three times lower with hardware encoding.
- A 10-second HD 29.97 sequence
- A still PNG of some simple graphic for the whole duration, covering 100% of the frame (here's one: )
- Any talking-head footage scaled down to occupy around 20% of the frame, picture-in-picture style (see attached graphic). Mine was HD H.264.
Then export with VBR 1-pass at 25 Mbps (max 30 Mbps) and Maximum Render Quality, first in software and then in hardware, without changing any other setting. There was audio too, but I don't think it matters.
I just tried it again and got a bitrate 3× lower when hardware encoding.
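For anyone who wants to run a similar software-vs-hardware comparison outside Media Encoder, ffmpeg exposes both an x264 (software) and an NVENC (hardware) H.264 path with the same target/max bitrate settings. A sketch that only builds the two command lines; the file names are placeholders, and the hardware run assumes an ffmpeg build with NVENC support:

```python
# Build ffmpeg command lines for a software vs. hardware H.264 encode
# at the same target/max bitrate used in the Media Encoder tests.
# File names are placeholders; flags are standard ffmpeg rate-control options.

def encode_cmd(codec: str, src: str, dst: str) -> list[str]:
    return [
        "ffmpeg", "-i", src,
        "-c:v", codec,       # "libx264" = software, "h264_nvenc" = NVENC hardware
        "-b:v", "25M",       # target bitrate
        "-maxrate", "30M",   # max bitrate, matching the VBR test
        "-bufsize", "60M",   # rate-control buffer (required for -maxrate to bind)
        "-c:a", "copy",      # leave the audio stream untouched
        dst,
    ]

software = encode_cmd("libx264", "test_sequence.mp4", "out_sw.mp4")
hardware = encode_cmd("h264_nvenc", "test_sequence.mp4", "out_hw.mp4")
print(" ".join(software))
print(" ".join(hardware))
```

Running both and comparing the resulting file sizes would show whether the bitrate gap is specific to Media Encoder's use of the GPU or reproducible with NVENC in general.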
Awesome. Well I appreciate that you took the time to do some testing! Informative for all of us!