NVIDIA GPU not being used when rendering

New Here ,
Nov 10, 2020

Hello

 

I have a Laptop with the following configuration:

Intel i7 10875H

RTX 2070

64GB RAM

 

Latest Premiere update and latest NVIDIA drivers.

I read about the new hardware encoding and decoding in Premiere for faster H.264 exports, so I followed all the configuration steps. But when I start exporting a video, CPU utilization is always at 100% and GPU at most 15%. Shouldn't this feature take some of the work off the CPU and use the GPU more?

I also tested switching from hardware encoding to software encoding: the render time doesn't change and everything stays the same, CPU at 100% and GPU around 5-15%.

 

There are a lot of videos on YouTube showing this new feature, with CPU usage around 50% while exporting with hardware encoding enabled.

 

Is there anyone with the same issue? 

Thanks 🙂

TOPICS
Error or problem, Export, Hardware or GPU, Performance

Adobe Community Professional ,
Nov 10, 2020

Could we have a screenshot of your export settings?

New Here ,
Nov 10, 2020

Sure.

 

Screenshot 2020-11-10 213427.png

LEGEND ,
Nov 10, 2020

You've got your bitrate set way too high for hardware encoding. At that ultra-high bitrate, Premiere Pro's (and MainConcept's) H.264 encoder will silently fall back to software-only encoding, with no indication whatsoever. Hardware encoding is currently limited to a maximum bitrate of about 60 Mbps.

 

That is the behavior if you are using NVENC for hardware encoding. If, on the other hand, you are using Intel QuickSync, a hardware-encode bitrate set too high is automatically clamped to a very low, fail-safe bitrate of only about 16 Mbps.
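The two fallback behaviors described above can be sketched roughly like this. Note that the thresholds are the approximate figures quoted in this reply, not documented Adobe constants, and `effective_encode` is a hypothetical illustration, not any real Premiere API:

```python
# Rough sketch of the silent-fallback behavior described above.
# Thresholds are approximate figures from this thread, not documented
# Adobe constants; whether QuickSync shares the same ~60 Mbps cutoff
# is an assumption here.
NVENC_MAX_MBPS = 60            # above this, NVENC drops to software
QUICKSYNC_FAILSAFE_MBPS = 16   # QuickSync clamps to this instead

def effective_encode(encoder: str, requested_mbps: int) -> tuple[str, int]:
    """Return (encoder actually used, bitrate actually applied)."""
    if encoder == "nvenc" and requested_mbps > NVENC_MAX_MBPS:
        return ("software", requested_mbps)   # silent software fallback
    if encoder == "quicksync" and requested_mbps > NVENC_MAX_MBPS:
        return ("quicksync", QUICKSYNC_FAILSAFE_MBPS)  # clamped bitrate
    return (encoder, requested_mbps)

# An 80 Mbps "hardware" NVENC export quietly runs on the CPU:
print(effective_encode("nvenc", 80))
```

This is why the symptom looks identical in both modes: the hardware toggle is accepted, but the encoder that actually runs is the software one.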

New Here ,
Nov 11, 2020

Hello,

 

thank you for your response,

I tested with a lower bitrate, and CPU usage dropped to around 90% during export, with GPU at about 4%.

Then I tested switching to the software encoder with the same settings to see if anything would change, but everything stayed the same as above: CPU and GPU usage and render time.

Switching from software encoding to hardware encoding makes no difference; with both settings Premiere uses my CPU to render.

 

In the export settings I tested different configurations: the profile and level at the default "Main" and at "High", with CBR, and switching to VBR 1 pass at 20 Mbps.

 

 

 

Screenshot 2020-11-11 112853.png

 

 

Enthusiast ,
Nov 11, 2020

Does your sequence have any effects on it? Some effects only render on the CPU.

To do a quick test, open Media Encoder, drop the video directly into it, and export. Check whether anything changes.

New Here ,
Nov 11, 2020

I had Lumetri and Unsharp Mask applied, and you're right about the effect: I deleted the Unsharp Mask, and now it uses hardware encoding, with less CPU and more GPU in this case.

So hardware encoding only kicks in with specific effects and more simplified timelines.

 

Thank you, Sami.

Enthusiast ,
Nov 11, 2020

You can tell which effects can use the GPU by looking at the little badges next to them in the effects window.

Check this link for more info on understanding the different types of effects.
