I have played around with different hardware decoding and encoding settings...

LEGEND, Dec 05, 2024


...on my main Intel i9-14900K system.

 

The specs are:

  • Intel i9-14900K
  • 64 GB DDR5-6400 RAM
  • GeForce RTX 4070 Ti
  • 1 TB Samsung 980 PRO SSD (C: drive)
  • 2 TB Samsung 980 PRO (D: drive)
  • Windows 11 Pro 24H2
  • Premiere Pro 25.1

The integrated Intel UHD Graphics 770 is enabled on this system.

 

I turned hardware decoding and hardware encoding on and off, and I also set the hardware decoder to Intel (Quick Sync) only or Nvidia (NVDEC) only. Hardware encoding is limited to Nvidia (NVENC) only or software only (Intel Quick Sync hardware encoding is not possible with a discrete non-Intel GPU installed).
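For anyone who wants to try a similar comparison outside Premiere Pro, here is a rough sketch (mine, not part of the PugetBench runs above) that times the analogous decoder/encoder combinations with ffmpeg. The clip name and encoder presets are placeholders, and ffmpeg timings will not map one-to-one onto PugetBench scores, but the relative ordering can be a useful sanity check on your own hardware.

# Rough sketch (not part of the PugetBench runs above): timing the analogous
# decode/encode combinations outside Premiere Pro with ffmpeg.
# The source clip name is a hypothetical placeholder.
import subprocess
import time

SOURCE = "test_clip_4k_h264.mp4"  # placeholder 4K H.264 source clip

# (label, decoder flags, encoder flags) -- ffmpeg analogues of the
# Intel (Quick Sync) / Nvidia (NVDEC) decode and NVENC / software encode options
COMBOS = [
    ("sw decode    + sw encode",    [],                   ["-c:v", "libx264", "-preset", "medium"]),
    ("QSV decode   + sw encode",    ["-hwaccel", "qsv"],  ["-c:v", "libx264", "-preset", "medium"]),
    ("NVDEC decode + sw encode",    ["-hwaccel", "cuda"], ["-c:v", "libx264", "-preset", "medium"]),
    ("sw decode    + NVENC encode", [],                   ["-c:v", "h264_nvenc", "-preset", "p5"]),
    ("QSV decode   + NVENC encode", ["-hwaccel", "qsv"],  ["-c:v", "h264_nvenc", "-preset", "p5"]),
    ("NVDEC decode + NVENC encode", ["-hwaccel", "cuda"], ["-c:v", "h264_nvenc", "-preset", "p5"]),
]

for label, dec_flags, enc_flags in COMBOS:
    # hwaccel flags must come before -i; output goes to the null muxer so
    # only decode + encode speed is measured, not disk writes
    cmd = ["ffmpeg", "-y", *dec_flags, "-i", SOURCE, *enc_flags, "-f", "null", "-"]
    start = time.perf_counter()
    subprocess.run(cmd, check=True, capture_output=True)
    print(f"{label:28s} {time.perf_counter() - start:6.1f} s")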

 

The overall results, in order from slowest to fastest (based on my PugetBench for Premiere Pro scores using the Standard preset):

  • Nvidia hardware decoding enabled, hardware encoding disabled
  • Nvidia hardware decoding enabled, hardware encoding enabled
  • Both hardware decoding and hardware encoding disabled
  • Intel hardware decoding enabled, hardware encoding disabled
  • Hardware decoding disabled, hardware encoding enabled
  • Intel hardware decoding enabled, hardware encoding enabled

The result with Premiere Pro set to the default decoder and encoder settings was equal to that achieved with the Nvidia decoder and hardware encoding enabled, because the default decoder setting, which is supposed to utilize both the Intel and Nvidia decoders, was broken in Premiere Pro 25.1.

 

Please note that my findings are for my particular system only. If your system has a weaker CPU, then hardware decoding will always beat software-only decoding. It just so happens that my i9-14900K is more than powerful enough to handle software decoding.

 

If you have the chance to play around with the decoding and encoding settings, please share your findings in this discussion.

 

Randall

Community Beginner, Dec 10, 2024


So if you have an i9-14900K, the graphics card is not that important for rendering?

LEGEND, Dec 10, 2024


The GPU is still important in rendering. In all of my tests the discrete GPU was utilized for rendering wherever it could be, and all of my testing had the GPU set to hardware encode H.264 and HEVC by default. The lower score for decoding (called "Processing" in PugetBench) with Nvidia hardware decoding enabled versus software-only decoding is likely due to the inherent bottlenecks of the PCIe interface (as with any other full-duplex interface): simultaneous transfers in both directions through a full 16-lane PCIe bus introduce some latency, particularly when each of the simultaneous transfers uses the entire lane width of the interface.
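For a rough sense of scale (assumed numbers, not measurements from my system), the back-of-envelope math looks something like this: decoded UHD frames coming back from NVDEC to system memory already add a few GB/s of traffic on the same x16 link that source media, GPU-effect, and export transfers are using at the same time.

# Back-of-envelope estimate (assumed numbers, not measurements) of the PCIe
# traffic added when NVDEC-decoded frames are copied back to system memory.
PCIE4_X16_PER_DIR_GBPS = 31.5   # approx. PCIe 4.0 x16 bandwidth in one direction

width, height = 3840, 2160      # UHD frame size
bytes_per_pixel = 3             # 10-bit 4:2:0 stored as P010 (~3 bytes per pixel)
fps = 60
streams = 2                     # e.g. two clips decoding at once

decoded_gbps = width * height * bytes_per_pixel * fps * streams / 1e9
print(f"Decoded-frame traffic: ~{decoded_gbps:.1f} GB/s "
      f"(~{decoded_gbps / PCIE4_X16_PER_DIR_GBPS:.0%} of one PCIe 4.0 x16 direction)")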
