
Why doesn't Premiere Pro use the graphics card when exporting video?

New Here ,
May 14, 2022


Hi All!

I have a problem: when I render or export video, Premiere Pro does not use the graphics card.

1.png

I enabled Mercury Playback Engine GPU (OpenGL).

2.png

And I enabled hardware acceleration in the Media preferences.

3.png

And I enabled hardware encoding when exporting the video.

4.png

BUT NOTHING HAPPENS! PLEASE HELP ME.

The exported video has an error at the 10th second.

5.png

My video: My Drive - Google Drive

Premiere Pro

AMD R7 5800H

RX 5500M

Ram 16GB

My English is not very good; sorry, and apologies to all!

TOPICS
Editing, Error or problem, Export, Formats, Hardware or GPU, Performance

Views

265

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting. Learn more
community guidelines
LEGEND ,
May 14, 2022


You've got two completely different hardware things there. The first is GPU use (Mercury Acceleration setting), and the second is H.264/5 encoding which depends on your CPU/motherboard hardware capabilities for long-GOP encoding. Not a GPU thing at all.

 

Premiere's use of GPU is confusing for so many users.

 

It does not routinely use the GPU as an assistant to the CPU. The GPU is used for very specific things, mostly color and major resizing operations. They have a "GPU Accelerated Effects" list that shows what the GPU is used for.

 

And, in the Effects panel, you can see the "lego blocks" listed for each effect ... whether they're GPU-accelerated, 32-bit float, or YUV ... and if you don't see a lego block for one of those, it doesn't have that coding.

 

So while Premiere is processing a clip with heavy Lumetri, Warp, or resizing work, the CPU will call on the GPU as it reaches the parts that need it.

 

Other than that, it doesn't use the GPU that much.

 

As to the "hardware encoding" for long-GOP H.264/5 format/codecs, that is totally dependent on your CPU having the hardware bits for such encoding. Not at all a GPU thing.

 

But it is very confusing, because the phrases "hardware" and "software" get used both for GPU acceleration and for H.264/5 encoding.

 

And ... a note on long-GOP hardware vs software encoding: hardware is faster, but software frequently does a better/prettier job of the encode.
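For readers unfamiliar with the term, "long-GOP" means that only occasional I-frames are stored in full, while the P- and B-frames between them are predicted from their neighbors; that prediction work is what dedicated encoding hardware speeds up. A minimal Python sketch of a GOP frame layout (illustrative only, not Premiere's or any real encoder's actual logic; the function name and parameters are my own):

```python
# Illustrative sketch only: the frame-type layout of one long-GOP group.
# I = intra-coded (stored in full), P = forward-predicted,
# B = bidirectionally predicted from neighboring frames.
def gop_pattern(gop_length: int, b_frames: int) -> str:
    """Return a frame-type sequence (I/P/B) for one GOP."""
    frames = ["I"]  # every GOP opens with a full intra-coded frame
    while len(frames) < gop_length:
        frames.extend("B" * b_frames)  # predicted from frames on both sides
        frames.append("P")             # predicted from earlier frames only
    return "".join(frames[:gop_length])

print(gop_pattern(10, 2))  # → IBBPBBPBBP
```

The same trade-off Neil describes shows up here: hardware encoders churn through this prediction work faster, while software encoders can spend more time per frame and often produce a cleaner result.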

 

Neil

Guide ,
May 15, 2022


The GPU can be used for hardware encoding and decoding of H.264/H.265. Nvidia graphics cards use NVENC, as shown in the video linked below.

https://www.youtube.com/watch?v=1L-erwmRxAU


This video demonstrates how to enable GPU acceleration and GPU encoding and decoding. It also shows how to enable Intel's Quick Sync.
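One quick way to verify that your machine exposes a working hardware encoder at all, independently of Premiere, is to check what an ffmpeg build reports. A hedged sketch: the sample output below is hardcoded for illustration (the exact lines vary by build); on a real system you would run `ffmpeg -encoders` yourself and parse its stdout. The function name and suffix list are my own, not part of any tool:

```python
# Hedged sketch: detect hardware H.264 encoders in `ffmpeg -encoders`
# output. Sample output is hardcoded for illustration; a real check
# would capture the command's stdout (e.g. via subprocess).
SAMPLE_FFMPEG_ENCODERS = """\
 V....D libx264       libx264 H.264 / AVC (software)
 V....D h264_amf      AMD AMF H.264 encoder
 V....D h264_qsv      H.264 (Intel Quick Sync Video acceleration)
"""

def available_hw_encoders(encoders_output: str) -> list[str]:
    """Return encoder names matching known hardware-encoder suffixes."""
    hw_suffixes = ("_amf", "_nvenc", "_qsv")  # AMD, Nvidia, Intel
    names = (line.split()[1] for line in encoders_output.splitlines() if line.split())
    return [n for n in names if n.endswith(hw_suffixes)]

print(available_hw_encoders(SAMPLE_FFMPEG_ENCODERS))  # → ['h264_amf', 'h264_qsv']
```

For the original poster's AMD RX 5500M, the encoder to look for would be the `_amf` family; if it is absent or fails outside Premiere, the problem is with drivers rather than Premiere's settings.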
