Confused about export bitrates (software vs hardware encoding)
I'm trying to make sense of the different results I got from exporting the same HD sequence to the same target bitrate (25 Mbps) but with different settings. There is a world of difference between these results and I cannot understand why. The only settings that changed were the rate-control mode (VBR 1-pass, VBR 2-pass, or CBR) and software vs. hardware encoding.
Here are the average bitrates it generated:
- VBR 1-pass (hardware): 8.2 Mbps
- VBR 1-pass (software): 22 Mbps
- CBR (hardware): 8.5 Mbps
- VBR 2-pass (software): 22 Mbps
Considering I used a target bitrate of 25 Mbps (max bitrate of 30 Mbps for VBR 2-pass), the software-encoded bitrates make sense to me; that's what I expected. The hardware-encoded bitrates (CBR and VBR 1-pass), however, don't.
Visually, all these exports look identical when compared at 200% in Premiere, which suggests 25 Mbps was complete overkill for this footage, but I'd still like to understand what is happening. What can explain such a huge gap between the target bitrate and the end result with hardware encoding? Is the encoder ignoring the user-set target bitrate and computing its own recipe?
My graphics card is a Quadro K4200, if that matters.
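
In case anyone wants to double-check numbers like these independently of what the export dialog or a media-info panel reports, here's a rough sketch of how I'd compute the average bitrate from the exported files themselves. It assumes ffprobe is installed and on the PATH, and the file names are just placeholders, not my actual exports:

```python
# Rough average-bitrate check: container size (bits) / duration (s).
# Assumes ffprobe is on PATH; file names below are placeholders.
import json
import os
import subprocess

def avg_bitrate_mbps(path: str) -> float:
    """Average bitrate in Mbps, computed from file size and ffprobe-reported duration."""
    probe = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    duration = float(json.loads(probe.stdout)["format"]["duration"])
    size_bits = os.path.getsize(path) * 8
    return size_bits / duration / 1_000_000

for f in ["export_hw_cbr.mp4", "export_sw_vbr2.mp4"]:  # placeholder names
    print(f, round(avg_bitrate_mbps(f), 1), "Mbps")
```

This measures the whole container (audio and muxing overhead included), so it will read slightly higher than the video stream's bitrate alone, but nowhere near enough to explain a 25 Mbps target coming out at 8 Mbps.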
