Participating Frequently
December 21, 2022
Question

Premiere Pro not using GPU even when selected as renderer?

  • December 21, 2022
  • 3 replies
  • 18145 views

I am totally confused. Why is my computer showing 100% CPU utilization when playing clips or rendering and 0% on the GPU?


I followed the instructions to enable GPU rendering as they are written here: https://helpx.adobe.com/x-productkb/multi/gpu-acceleration-and-hardware-encoding.html#:~:text=How%20to%20enable%20Mercury%20Playback,OpenCL%2FCUDA%2FMetal).

This changes absolutely nothing for me. I really have a hard time believing an overclocked i7-6700k is causing a bottleneck. Why is this happening?

My relevant specs:
i7-6700k at 4.6GHz
Nvidia 980Ti
32GB DDR4 RAM
m.2 SSD
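
For anyone debugging this kind of issue, a good first step is confirming exactly what the footage is, since (as the replies below note) hardware decode support depends on the codec, bit depth, and chroma subsampling. A minimal sketch, assuming ffmpeg's `ffprobe` is installed (the function names and file path are illustrative, not part of any Adobe tooling):

```python
import json
import subprocess

def probe_video(path):
    """Return (codec_name, pix_fmt) of a file's first video stream via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_probe_json(out)

def parse_probe_json(raw):
    """Extract (codec_name, pix_fmt) from ffprobe's JSON output text."""
    stream = json.loads(raw)["streams"][0]
    return stream["codec_name"], stream["pix_fmt"]
```

A `pix_fmt` of `yuv420p` means 8-bit 4:2:0; `yuv420p10le` means 10-bit 4:2:0; `yuv422p10le` means 10-bit 4:2:2.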

3 replies

New Participant
July 25, 2024

Hey, I have a Core i5-12400F and an RTX 3060 Ti, but my Premiere Pro is only using the CPU for playback and rendering. What should I do? Help me.

R Neil Haugen
Braniac
July 25, 2024

That's a rather limited CPU: only 6 cores (small these days), and as an F-series part it has no iGPU, so no Quick Sync H.264 hardware decoding either.

 

I don't know what use you're expecting from the GPU in normal playback, rendering, and export processes. It isn't simply an additional chip ... it has specific functions and uses, such as any color/tonal changes, major resizing of images, that sort of thing.

 

Hence it's heavily used for Warp Stabilizer, Lumetri, speed ramps involving Optical Flow, and some other effects.

 

If your sequence uses those, the GPU will get used as the CPU gets to those tasks and sends them TO the GPU.

 

If your sequence doesn't have those, it won't have anything to send to the GPU to begin with.

New Participant
August 14, 2025

Then how do I fix this? Same issue, btw. I'm using a Ryzen 7 5800H and an RTX 3070 mobile GPU (laptop user, btw). How do I fix the laggy preview in Premiere Pro?

Braniac
December 21, 2022

Is your H.264 4K footage 10-bit and/or 4:2:2? If so, then no GPU on a Windows PC currently supports hardware decoding of that material at all. Only 8-bit 4:2:0 footage is currently supported for H.264 hardware decoding.

 

Also, are you applying any GPU-accelerated effects on your timeline? If not, then only the CPU is used for playback, period (outside of the hardware decoding I mentioned above).

 

Rendering, decoding, and encoding are three completely separate processes. Somehow, you got two of those three mixed up.

JonesVid
Braniac
January 24, 2023

Just spotted this.

I've upgraded my camera, and its lowest 4K video setting is 10-bit 4:2:0, in H.264 or H.265.

My GPU is Nvidia 2070 Super (latest drivers) and I'm running an i9 9900K with all cores locked at 5GHz.

The integrated GPU on the 9900K is enabled as well.

Both the discrete GPU and the integrated GPU are enabled in Preferences for hardware encoding/decoding.

All has worked well with my previous H.264 footage, but I just noticed you said only 8-bit 4:2:0 H.264 is supported?

 

Is this the case for H.265 as well?

Any possibility the H.264 situation will change?

 

Braniac
January 24, 2023

Nope. With both your CPU and GPU, only 8-bit 4:2:0 is supported for hardware decoding of H.264. This is a hardware limitation of both components.

 

And even with HEVC, only 8- or 10-bit 4:2:0 is supported in Premiere Pro for hardware decoding with both your CPU and GPU. You'll need an 11th-Gen Intel CPU or newer just to support hardware 4:2:2 decoding for HEVC (with the integrated graphics enabled and using Intel Quick Sync for hardware decoding).

 

As for Nvidia, that company has never supported hardware decoding of 4:2:2 material at all. The newer Nvidia GPUs do support hardware decoding of 4:4:4 HEVC material, but Adobe's implementation does not currently support that.

 

In fact, all Windows GPUs ever manufactured support only 8-bit 4:2:0 for H.264 hardware decoding. You would need a Mac with Apple Silicon (an M1 or M2 chip) to hardware-decode 10-bit and/or 4:2:2 H.264 footage.
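
The support matrix described above can be sketched as a small lookup. This is only a hedged summary of what this thread claims for Windows hardware decoding, accurate as of the posts' dates; the function name is illustrative and the pixel-format strings follow ffprobe's naming:

```python
# Hardware-decode eligibility on a Windows PC, per the rules in this thread.
# yuv420p = 8-bit 4:2:0, yuv420p10le = 10-bit 4:2:0, yuv422p10le = 10-bit 4:2:2.
H264_SUPPORTED = {"yuv420p"}                      # 8-bit 4:2:0 only
HEVC_SUPPORTED = {"yuv420p", "yuv420p10le"}       # 8- or 10-bit 4:2:0
HEVC_422 = {"yuv422p", "yuv422p10le"}             # needs 11th-gen+ Intel iGPU

def can_hw_decode(codec, pix_fmt, intel_gen=None):
    """True if a clip qualifies for hardware decoding under this thread's rules.

    intel_gen: generation of an Intel CPU with its iGPU enabled, or None.
    """
    if codec == "h264":
        return pix_fmt in H264_SUPPORTED
    if codec == "hevc":
        if pix_fmt in HEVC_SUPPORTED:
            return True
        # 4:2:2 HEVC requires 11th-gen-or-newer Intel Quick Sync
        return pix_fmt in HEVC_422 and (intel_gen or 0) >= 11
    return False
```

By these rules, JonesVid's 10-bit 4:2:0 H.264 falls back to the CPU, while the same clip recorded as HEVC would qualify for hardware decoding.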

Participating Frequently
December 21, 2022

I am editing H.264 4K footage at 1/4 playback quality. But this happens with 1080p footage too.