Graphics card not used in timeline render

Engaged, Mar 20, 2021

I noticed that when rendering the timeline, the graphics card is not being used. CPU and RAM usage are high, but there is nothing on the graphics card in Task Manager's Performance tab. Is the computer using all its resources?

Mercury Playback Engine is enabled.
https://helpx.adobe.com/x-productkb/multi/gpu-acceleration-and-hardware-encoding.html
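
One way to double-check what Task Manager shows is to sample utilization directly from the NVIDIA driver. A minimal sketch, assuming the `nvidia-smi` tool that ships with the NVIDIA driver is on the PATH (the function and script names here are illustrative, not part of any Adobe tooling):

```python
import subprocess

def parse_gpu_utilization(csv_output: str) -> list[int]:
    """Parse nvidia-smi CSV output (one utilization percentage per line,
    one line per GPU) into a list of integers."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def sample_gpu_utilization() -> list[int]:
    """Query the current utilization of each installed NVIDIA GPU.
    Run this while a render or export is in progress."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_gpu_utilization(out)
```

On a two-GPU system like the one above (RTX 2070 Super plus GTX 1650), `sample_gpu_utilization()` would return one percentage per card; calling it in a loop during an export shows whether either GPU is actually being touched.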

Windows 10 64-bit
Gigabyte Z490 AORUS Ultra Intel
LGA 1200 ATX Motherboard
Intel Core i9-10900K CPU @ 3.70 GHz
RAM 16GB
NVIDIA GeForce RTX 2070 Super
& NVIDIA GeForce GTX 1650


TOPICS
Performance

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting.
Advocate, Mar 20, 2021

Please provide more info:

- Source footage origin and properties (codec, resolution, framerate)

- Same for the timeline, plus what effects are applied and what color the timeline render bar is

- Export settings

- Load numbers for CPU and GPU during render and export

- GPU driver version

LEGEND, Mar 20, 2021

PrPro treats the GPU as a separate resource, not just an addendum to the CPU. Certain tasks are sent to the CPU and certain tasks to the GPU, if one is available. If you're doing color work, Warp Stabilizer, or several things like that, the GPU will be used far more than it is for basic pixel decoding.

 

But even that varies depending on which CPU you have. It used to be much simpler to figure out what was CPU work and what was GPU work. But with the way some CPUs have different hardware encoding capabilities for, say, H.264 than others, the number of options for how things get processed is almost bizarre now.

 

Neil

Adobe Employee, Mar 23, 2021

Sorry, Ronald. We can answer your questions if you give us more info about your media, the effects you used, your sequence settings, etc. The short answer is that not everything uses the GPU. Let us know, OK?

Kevin
