I've read so many threads about this issue, tried various things, and I still can't get Premiere to use my GPU.
Drivers are updated and I've even completely disabled the onboard graphics card, but it still won't pick up on the 1660.
Most people misunderstand Premiere's use of the GPU. If you have Mercury Playback Engine GPU Acceleration set in the Project Settings panel (Metal for AMD/Apple, CUDA for Nvidia) ... then Premiere will use the GPU for the specific tasks Premiere hands to the GPU, and only those. Period. So far that is mainly frame resizing and color processing.
Look in the Effects panel ... click the GPU-accelerated effects icon (the little Lego-block badge) at the top so it turns blue.
It will now show all effects that use the GPU.
Outside of those, it will not currently use the GPU.
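One way to sanity-check this yourself is to watch the card's load directly while scrubbing or exporting. This is just a quick monitoring sketch, assuming an Nvidia card with the standard driver installed (which ships the nvidia-smi tool) and a shell to run it in:

```shell
# Print current GPU load and memory use once; add "-l 1" to the
# nvidia-smi call to re-sample every second during playback/export.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv
else
  echo "nvidia-smi not on PATH - reinstall/repair the NVIDIA driver"
fi
```

If utilization sits at 0% even with a GPU-accelerated effect on the clip and Mercury set to CUDA, the renderer setting or the driver is the thing to chase, not the effect.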
And if you're commenting on the 'software only' phrase in the Export dialog box during H.264 exports, that currently refers only to Intel's Quick Sync hardware encoding, which requires an Intel CPU with Quick Sync "inside" and enabled in the BIOS.
Two different things.
Yeah, this is why I need the GPU. In fact, in trying to get it to use my 1660, I disabled the onboard graphics. Now when I apply Lumetri effects and Warp Stabilizer, Premiere just grinds to a halt! I get that only certain processes utilize the GPU, but I use those processes, and Premiere's failure to access my GPU is making it unstable.
Some CPUs are designed to rely on the onboard graphics chip for part of their computations. For those CPUs, disabling the onboard chip causes problems, and that could be part of your issue. I would suggest re-enabling the onboard chip and seeing if it works better.
Note, depending on the relative 'speed' of your CPU and GPU, you may see varying levels of GPU activity. If they are pretty balanced, and you have say Warp and Lumetri on a clip, you can often nearly peg both CPU and GPU. If your GPU is potent and fast but paired with a mid/low-level CPU, or a system with say only 16 GB of RAM, then ... your CPU won't be able to send data to the GPU fast enough to keep it busy at any one moment. So the CPU may be working hard while the GPU is coasting.
Conversely, if your CPU/RAM is good and GPU weak, then the CPU will coast while the GPU is working.
If you don't have enough RAM for the CPU to run flat-out, the CPU will coast no matter what is going on.
And all of the above only applies when a decent load of GPU-accelerated effects is actually being processed at that moment.
Thanks, yeah, maybe more of my setup would be useful:
i7-3770K @ 3.5 GHz
running off an SSD
GTX 1660 Ti
When rendering, it maxes out the CPU, but the GPU isn't used at all, and it uses about half my RAM. Clearly it needs the GPU for something, though, because disabling the onboard graphics slowed everything down to the point that it won't even load my windows when I've got Lumetri applied. I just wish the GPU did SOMETHING for the effects it is supposed to help with.
That is an ancient CPU by today's standards ... released in 2012! And worse, it's only four cores with a smallish cache, so even with a decently fast clock speed, it's ... not going to put up impressive numbers.
And I'm wondering if it's old enough that it can't properly utilize that 1660. Your GPU is a ton 'hotter' than your CPU.
Yep, that's definitely true. But why would my onboard graphics do more than my dedicated card? And the CPU does better on its own when set to Mercury software-only playback. The CPU is definitely my weak link, but that doesn't explain the complete lack of use of the GPU. Something isn't working right.
I'm having the same issue. I have an i7-3930K (6 cores, 12 threads) and a GTX 1660 Super. When I use Warp Stabilizer, it does not use the GPU at all. It will use 1 CPU core per clip being stabilized, but it doesn't touch the GPU.
Neil, are you implying that if I upgrade my mobo and chip to something more recent, it will then utilize the GPU?
I have the same issue. Here are my system specs:
i7-10720H CPU @ 2.60 GHz
The person with the best help on these tech things is @RjL190365, and hopefully he'll respond. Generally, yes, the newer CPUs often have features Premiere can use to push pixels across both the CPU and GPU, especially noted now with H.264 and, say, RED media.
Now, understand, part of the Warp process is almost completely CPU-based, while the other part can really hammer the GPU. So ... for part of the time Warp is running, it will barely use the GPU at all. Then GPU use will spike.
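That two-phase pattern is easy to see if you log GPU load over time while Warp runs. A minimal sketch, again assuming an Nvidia card with nvidia-smi available from the driver install: during the analyze pass you'd expect near-zero GPU with one busy CPU core, then utilization should spike.

```shell
# Sample GPU utilization a few times, one second apart; bump the
# sample count to cover a full Warp analyze + stabilize cycle.
for _ in 1 2 3; do
  if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=timestamp,utilization.gpu --format=csv,noheader
  else
    echo "nvidia-smi not found"
  fi
  sleep 1
done
```

Redirect the output to a file (append `>> gpu_load.log` after `done`) if you want a record to compare against what the Premiere progress bar was doing at each moment.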