
Premiere Pro not using GPU even when selected as renderer?

New Here, Dec 21, 2022


I am totally confused. Why is my computer showing 100% CPU utilization when playing clips or rendering and 0% on the GPU?


I followed the instructions to enable GPU rendering as written here: https://helpx.adobe.com/x-productkb/multi/gpu-acceleration-and-hardware-encoding.html

This changed absolutely nothing for me. I have a hard time believing an overclocked i7-6700K is the bottleneck. Why is this happening?

My relevant specs:
i7-6700k at 4.6GHz
Nvidia 980Ti
32GB DDR4 RAM
m.2 SSD
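
One thing worth checking before blaming the hardware: Windows Task Manager's default "3D" graph does not show the video-decode engine, so a busy decoder can look like an idle GPU. Here is a minimal sketch to watch overall GPU and decoder utilization during playback; it assumes an Nvidia card with nvidia-smi on the PATH, and nothing about Premiere itself.

import subprocess, time

def poll(seconds=10):
    for _ in range(seconds):
        # -q -d UTILIZATION prints GPU, memory, encoder and decoder usage
        out = subprocess.run(
            ["nvidia-smi", "-q", "-d", "UTILIZATION"],
            capture_output=True, text=True, check=True,
        ).stdout
        wanted = [line.strip() for line in out.splitlines()
                  if line.strip().startswith(("Gpu", "Decoder"))]
        print(" | ".join(wanted))
        time.sleep(1)

poll()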

TOPICS: Error or problem, Freeze or hang, Hardware or GPU, Performance

Views: 9.2K

New Here, Dec 21, 2022


I am editing H.264 4K footage at 1/4 playback quality, but this happens with 1080p too.

LEGEND, Dec 21, 2022


Is your H.264 4K footage 10-bit and/or 4:2:2? If so, no GPU on a Windows PC currently supports hardware decoding of that material; for H.264, only 8-bit 4:2:0 footage is supported.
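
If you're not sure what your footage is, ffprobe (part of FFmpeg) will tell you. A minimal sketch, assuming ffprobe is on the PATH; "clip.mp4" is a placeholder path:

import json, subprocess

def pixel_format(path):
    # Ask ffprobe for the first video stream's codec and pixel format.
    # yuv420p means 8-bit 4:2:0; yuv420p10le / yuv422p10le mean 10-bit
    # 4:2:0 / 4:2:2, which no Windows GPU hardware-decodes for H.264.
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    return stream["codec_name"], stream["pix_fmt"]

print(pixel_format("clip.mp4"))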

 

Also, are you applying any GPU-accelerated effects on your timeline? If not, then only the CPU is used for playback, period (outside of the hardware decoding mentioned above).

 

Rendering, decoding, and encoding are all completely separate processes. Somehow, you got two of those three mixed up.

New Here, Dec 21, 2022


I am editing Mavic Air 2 footage; I believe it is 8-bit 4:2:0.

If my CPU is at 100% usage and the GPU barely hits 15%, does that mean it's a CPU bottleneck, or is some other setting wrong? Hardware? Software? I'm lost, and I thought I was good at troubleshooting lol

LEGEND, Dec 21, 2022


The GPU is not simply an extra, added CPU tool. It's a separate component, and within each video post app the devs use it for very specific things. (And yes, differently app to app.)

In Premiere, the GPU is used primarily for resizing, for creating frames from blending modes, and for color ... there is a list of "GPU Accelerated Effects" ... search for it.

If you are doing something involving resizing (including Warp) or color (Lumetri), then the GPU will get called on by the CPU to help process that clip.

If not, it may not be doing much.

 

And RJL has the hard data on the hardware issues including H.264/5 encodes.

 

Neil

New Here, Dec 21, 2022


I understand the CPU and GPU are different units with different functions.

I am telling Premiere to use my GPU for rendering and playback; why is that not happening?

LEGEND, Dec 21, 2022


Did you read my post?

 

Unless those specific tasks are needed for the frame being processed, the GPU will not see much use. So ... do you have Warp, Lumetri, or resizing applied on all frames of your sequence?

 

Neil

Explorer, Dec 28, 2022


I am having massive issues since the latest update. I use Digital Anarchy's Beauty Box and never had a problem with it; playback was always smooth. Since the 2023 update I can't even use it, and it is supposed to be a GPU-accelerated effect. I downloaded an older version of Premiere and, for some reason, still have the same issue. I never had this problem until the 2023 update. I am running a 3090 Ti. I don't think the dude responding to you really understands your question and is just talking to talk.

Advocate, Dec 28, 2022


@jasonc64408805 If a simple version downgrade does not work, that suggests you also have to downgrade something else, like the GPU driver. Oh, and clear the media cache (actually, do that every time after any driver or software update).
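
The safest way to clear the cache is the Delete button under Preferences > Media Cache inside Premiere. If you'd rather script it, here is a sketch for the default Windows location; it assumes the cache has not been moved in Preferences and that Premiere is closed first:

import shutil
from pathlib import Path

# Default Premiere media cache folders on Windows (an assumption; check
# Preferences > Media Cache if yours was relocated).
common = Path.home() / "AppData" / "Roaming" / "Adobe" / "Common"

for name in ("Media Cache Files", "Media Cache"):
    target = common / name
    if target.exists():
        shutil.rmtree(target)  # removes the cached .cfa/.pek files
        target.mkdir()         # leave an empty folder for Premiere to refill
        print("cleared", target)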

LEGEND, Dec 28, 2022


Your situation is different ... check with Digital Anarchy to see whether their plugin is working with Pr2023 or not; they're pretty good at helping out with these things.

I've not run Beauty Box recently; perhaps I'll get time to test it today on my rig.

There are a number of issues with 2023, especially on Macs, so we all have to be careful not to conflate things. Which can itself be confusing.

Neil

Enthusiast, Jan 24, 2023


Just spotted this.

I have upgraded my camera, and its lowest 4K video setting is 10-bit 4:2:0, in H.264 or H.265.

My GPU is an Nvidia 2070 Super (latest drivers) and I'm running an i9 9900K with all cores locked at 5GHz. The integrated GPU on the 9900K is enabled as well, and both GPUs are enabled in Preferences for encode/decode.

All has worked well with my previous H.264 footage, but I just noticed you said only 8-bit 4:2:0 H.264 is supported?

Is this the case for H.265 as well? Any possibility the H.264 situation will change?

 

LEGEND, Jan 24, 2023


Nope. With both your CPU and GPU, only 8-bit 4:2:0 is supported for hardware decoding of H.264. This is a hardware limitation of both components.

And even with HEVC, only 8- or 10-bit 4:2:0 is supported in Premiere Pro for hardware decoding with both your CPU and GPU. You'll need an 11th-gen Intel CPU or newer just to get hardware 4:2:2 decoding for HEVC (with the integrated graphics enabled, using Intel Quick Sync for hardware decoding).

As for Nvidia, that company has never supported hardware decoding of 4:2:2 material at all. The newer Nvidia GPUs support hardware decoding of 4:4:4 HEVC material, which Adobe's implementation does not currently support.

In fact, every Windows GPU ever manufactured supports only 8-bit 4:2:0 for H.264 hardware decoding. You would need a Mac with Apple silicon (an M1 or M2 CPU) to hardware decode 10-bit and/or 4:2:2 H.264 footage.
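
Condensed into one place, the decode rules described in this thread (as of early 2023; this is only a summary of the claims above, not a current or exhaustive support matrix) look like this:

def hw_decode_supported(codec, bit_depth, chroma, intel_gen=9):
    # Windows-side hardware decode rules as stated in this thread.
    if codec == "h264":
        # All Windows GPUs: 8-bit 4:2:0 only.
        return bit_depth == 8 and chroma == "4:2:0"
    if codec == "hevc":
        if chroma == "4:2:0":
            return bit_depth in (8, 10)
        if chroma == "4:2:2":
            # Needs an 11th-gen or newer Intel iGPU via Quick Sync.
            return intel_gen >= 11
    return False

# The 10-bit 4:2:0 HEVC clips asked about above:
print(hw_decode_supported("hevc", 10, "4:2:0"))  # True
# The same camera's 10-bit H.264:
print(hw_decode_supported("h264", 10, "4:2:0"))  # False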

Enthusiast, Jan 25, 2023


Many thanks for your response on this question.

In fact, the camera (Lumix S5 MkII) supports numerous formats, and I always use the MOV format. You can use the MP4 format and select 8-bit 4:2:0, but it's hardly worth it, as 10-bit 4:2:0 H.265 is the normal choice for me now.

As regards graphics hardware on the CPU, I have an upgrade of my PC planned for later in the year, when funds permit, to an i9 13900K, so that should help.

I don't use many GPU-accelerated effects at present, but I am sure that will change over time.

New Here, Jul 24, 2024


Hey, I have a Core i5 12400F and an RTX 3060 Ti, but my Premiere Pro only uses the CPU for playback and rendering. What should I do? Please help.

LEGEND, Jul 25, 2024


That's a rather limited CPU: only 6 cores (very small these days), with apparently no iGPU and no Quick Sync H.264 hardware either.

I don't know what use you're expecting for the GPU in normal playback, rendering, and export processes. It isn't simply an additional chip ... it has specific functions and uses, such as any color/tonal changes, major resizing of images, that sort of thing.

Hence it is heavily used for Warp, Lumetri, speed ramps involving optical flow, and some other effects.

If your sequence uses those, the GPU will get used as the CPU gets to those tasks and sends them TO the GPU.

If your sequence doesn't have those, there's nothing to send to the GPU to begin with.
