GPU Utilisation

Community Beginner, Mar 09, 2018

Hiya,

I'm trying to figure out what causes the GPU to be used when encoding H.264 sometimes but not always.

I have two systems I use, one is at work with the following basic specs:

  • Dual Xeon E5-2650 V2s (16c 32t total)
  • Quadro K4000
  • 64GB RAM

The other is at home with the following specs:

  • Ryzen 1600 (6c 12t)
  • GTX 970
  • 8GB RAM (yes... I know... it's not an editing system but sometimes I need it to be!)

On the Xeon system, I queue some 4K C300 MK2 footage with Magic Bullet Looks applied to encode to 1080p H.264 and the Quadro will get slammed. 90% usage minimum. CPU will be about 20% or less.

On the Ryzen system I queue some 1080p screen-captured footage to effectively just get recompressed to drop the file size. The GTX 970 will get 0% usage and the CPU will be at 95%+ usage.

Is Magic Bullet causing the GPU to be used? Or is it the fact the footage is being scaled from 4K to 1080p? I need to know for two reasons...

One is that we need a new GPU to replace the Quadro K4000 soon; it just doesn't have the VRAM for playback etc. Two is that my GPU at home isn't being used and I feel like it could be...

Cheers!

1 Correct answer
Adobe Employee, Nov 29, 2018

Hi Edd,

I'm trying to figure out what causes the GPU to be used when encoding H.264 sometimes but not always.

Sure, I'll try.

I have two systems I use, one is at work with the following basic specs:

  • Dual Xeon E5-2650 V2s (16c 32t total)
  • Quadro K4000
  • 64GB RAM

The other is at home with the following specs:

  • Ryzen 1600 (6c 12t)
  • GTX 970
  • 8GB RAM (yes... I know... it's not an editing system but sometimes I need it to be!)

OK, fine.

On the Xeon system, I queue some 4K C300 MK2 footage with Magic Bullet Looks applied to encode to 1080p H.264 and the Quadro will get slammed. 90% usage minimum. CPU will be about 20% or less.

If you are scaling from 4K to 1080p in the export process, the GPU will be utilized for the scaling, per the Mercury Playback Engine's rules for GPU acceleration. Other operations that engage the GPU include frame rate changes, color space conversions, blend modes, and GPU-accelerated effects. Checking briefly, Magic Bullet "Looks" is GPU accelerated when Mercury GPU acceleration is enabled.
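One easy way to confirm which jobs engage the GPU is to watch utilization while the export runs. Here's a minimal Python sketch of such a watcher (just an illustration, not an Adobe tool); it assumes an NVIDIA card with the nvidia-smi utility on your PATH.

```python
import subprocess
import time

# Standard nvidia-smi query flags: report GPU core and memory utilization
# as bare comma-separated numbers, no header.
QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,utilization.memory",
    "--format=csv,noheader,nounits",
]

def watch_gpu(samples=60):
    """Print GPU utilization roughly once per second."""
    for _ in range(samples):
        out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
        # First line = first GPU; a multi-GPU box prints one line per card.
        gpu_pct, mem_pct = (v.strip() for v in out.stdout.splitlines()[0].split(","))
        print(f"GPU core: {gpu_pct:>3}%   GPU memory: {mem_pct:>3}%")
        time.sleep(1)

if __name__ == "__main__":
    watch_gpu()  # start this first, then kick off the export
```

Run it alongside an export: a job with scaling or GPU-accelerated effects should show sustained GPU utilization, while a straight recompress should stay near zero.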

On the Ryzen system I queue some 1080p screen-captured footage to effectively just get recompressed to drop the file size. The GTX 970 will get 0% usage and the CPU will be at 95%+ usage.

The GPU is not used because nothing in the Mercury Playback protocol has been invoked: no scaling, no GPU-accelerated effects. Keep in mind that the actual "encoding" step in the export process is entirely CPU based.

Makes sense?
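If you want to watch that split for yourself, sample the overall CPU load during an export and compare it against the GPU numbers: on a pure recompress like your screen-capture job, you should see the CPU sit near saturation while the GPU stays flat. Here is a minimal sketch of such a sampler in Python, using the third-party psutil package (pip install psutil); it's only an illustration of the measurement, not part of any Adobe tooling.

```python
import psutil  # third-party: pip install psutil

def watch_cpu(samples=60):
    """Print overall CPU utilization once per second."""
    for _ in range(samples):
        # cpu_percent(interval=1.0) blocks for one second and returns the
        # average utilization across all logical cores over that window.
        total = psutil.cpu_percent(interval=1.0)
        print(f"CPU total: {total:5.1f}%")

if __name__ == "__main__":
    watch_cpu()  # run during the export; expect ~95%+ on a CPU-only encode
```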

Is Magic Bullet causing the GPU to be used? Or is it the fact the footage is being scaled from 4K to 1080p? I need to know for two reasons...

Both. As I indicated.

One is that we need a new GPU to replace the Quadro K4000 soon; it just doesn't have the VRAM for playback etc. Two is that my GPU at home isn't being used and I feel like it could be...

Try some tests with the new info you have about why the GPU kicks in and why it does not. It should all make sense now, I hope. Report back with your findings.

Thanks,
Kevin

Community Beginner, Nov 30, 2018

Hi Kevin,

I really appreciate the detailed response. Everything you say makes sense; I hadn't realised that the GPU is only used for certain "additional" operations and that the CPU is otherwise the main workhorse for encoding.

Cheers,

Ed
