I'm looking at upgrade options for my edit workstation. With Premiere set to hardware encoding, I know it uses my GPU card, but does Premiere also use the CPU's on-chip GPU (if present) as well as the GPU card?
The reason I ask: is there any advantage to getting a CPU with on-chip graphics, or is it a waste of money if Premiere only uses the separate GPU card for encoding acceleration?
It depends on which hardware acceleration you're referring to. There are two completely different forms.
The first is GPU-accelerated effects, such as color corrections, Warp Stabilizer, and some other effects. That one is (nearly) totally dependent on the discrete GPU card.
The other, completely different form is H.264/H.265 "long-GOP" decoding/encoding, which depends mostly on hardware associated with the CPU. Some Intel CPUs have the needed hardware, some don't, and I don't think any AMD CPUs do.
This is apparently also affected, on some systems, by the accompanying on-board GPU chip.
So ... is it H.264/5 long-GOP you're concerned with, or color corrections/Warp and other GPU-accelerated effects?
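To make the split above concrete, here is my own toy sketch of which silicon each workload leans on. The function name and return strings are purely illustrative, not Adobe's API, and later replies in this thread refine the codec side (NVENC on NVIDIA cards can also encode):

```python
# My own toy summary of the routing described above; names and strings
# are illustrative, not Adobe's code. Later replies note that NVENC on
# NVIDIA cards also handles the H.264/HEVC codec side.

def acceleration_path(task, cpu_has_quick_sync, has_discrete_gpu):
    """Return which hardware a given Premiere workload leans on."""
    if task == "gpu_effects":        # Lumetri color, Warp Stabilizer, ...
        return "discrete GPU" if has_discrete_gpu else "software on CPU"
    if task == "long_gop_codec":     # H.264/H.265 long-GOP decode/encode
        if cpu_has_quick_sync:
            return "on-CPU media engine (Quick Sync)"
        return "software on CPU"
    raise ValueError(f"unknown task: {task}")

print(acceleration_path("gpu_effects", cpu_has_quick_sync=False, has_discrete_gpu=True))
print(acceleration_path("long_gop_codec", cpu_has_quick_sync=True, has_discrete_gpu=True))
```

The point of the sketch is simply that the two acceleration paths are independent: a fast discrete GPU helps effects regardless of your CPU, and Quick Sync helps the long-GOP codec path regardless of your graphics card.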
Thanks for the great reply. I'm more concerned with long-GOP H.264 export times. These seemed to improve a lot when I upgraded my video card to a GTX 1650. OK, it's not top-end gear, but I am on a limited budget. That seems to suggest the H.264 export was being handled by the GTX 1650 rather than the integrated graphics.
You may have had GPU effects that were boosted by that card. I don't know of Nvidia cards having H.264 encoding capabilities, but @RjL190365 would.
Your GTX 1650 has an NVENC encoder for both H.264 and HEVC encoding, and in your case only the GTX 1650 is used for that encoding job. Remember, on a system with both a discrete GPU installed and the integrated on-CPU graphics enabled, Adobe supports both decoders simultaneously, but only the discrete GPU for encoding.
Ah OK, thanks for clarifying. So it's still worth spending the extra on a Quick Sync version of a CPU?
Especially if the footage that you're working with had been encoded with H.264 (8-bit 4:2:0) or HEVC (with newer Intel 11th-Gen or newer CPUs, 8-bit or 10-bit 4:2:0 or 4:2:2) to begin with.
Thanks RJL. Yes, almost all my source material is H.264 (8-bit 4:2:0), so I'll let the iGPU handle the decoding.
Actually, Adobe by default uses both the integrated GPU and the discrete GPU simultaneously for decoding (or playback) of H.264/HEVC material. But when you encode to either H.264 or HEVC (which occurs during export), then only the discrete GPU is used. Adobe does not currently let you use the integrated GPU for encoding if a discrete GPU is present in your PC.
Any idea on the thinking behind that? It seems logical to use all the hardware you can when rendering, to minimise export time.
I would imagine it has to do with the encoding process. You're suggesting it get divided up, a few bits done here, a few bits done there? I'm not sure that's workable.
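To illustrate the asymmetry (this is my own toy sketch, not anything from Adobe's actual pipeline): each closed GOP in a long-GOP stream starts from a keyframe and can be decoded on its own, so decode work splits cleanly across two decoders. Single-pass encoding, by contrast, carries rate-control state from one frame to the next, which makes producing one output stream naturally sequential:

```python
# Toy illustration (not Adobe's pipeline) of why decode parallelises
# more easily than encode for long-GOP H.264/HEVC.

def split_gops_for_decode(gops, n_devices=2):
    """Closed GOPs are independently decodable, so round-robin them
    across decoders; frame order is restored after decode."""
    return [gops[i::n_devices] for i in range(n_devices)]

def encode_sequential(frames, budget_per_frame=100):
    """Toy single-pass rate control: each frame's bit allowance depends
    on the surplus or deficit carried over from earlier frames, so the
    loop cannot be split mid-stream without breaking that chain."""
    bitstream, carry = [], 0
    for complexity in frames:
        allowance = budget_per_frame + carry
        spent = min(complexity, allowance)
        carry = allowance - spent        # state handed to the next frame
        bitstream.append(spent)
    return bitstream

print(split_gops_for_decode(["GOP0", "GOP1", "GOP2", "GOP3", "GOP4"]))
print(encode_sequential([120, 60, 150, 90]))
```

In the toy encoder, whatever budget one frame leaves unused is handed to the next, so you can't simply give half the frames to a second encoder; schemes like two-pass or per-segment encoding work around this, but at some cost in complexity or quality.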
My thinking is that dividing it up seems to work for decoding, so why not for encoding?