GPU Utilisation
Hiya,
I'm trying to figure out what causes the GPU to be used when encoding H.264 sometimes but not always.
I have two systems I use, one is at work with the following basic specs:
- Dual Xeon E5-2650 v2s (16c 32t total)
- Quadro K4000
- 64GB RAM
The other is at home with the following specs:
- Ryzen 1600 (6c 12t)
- GTX 970
- 8GB RAM (yes... I know... it's not an editing system but sometimes I need it to be!)
On the Xeon system, I queue some 4K C300 MK2 footage with Magic Bullet Looks applied to encode to 1080p H.264, and the Quadro gets slammed: 90% usage minimum, while the CPU sits at about 20% or less.
On the Ryzen system, I queue some 1080p screen-captured footage to be recompressed, just to drop the file size. The GTX 970 sits at 0% usage while the CPU is at 95%+.
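One way I've been trying to narrow this down (not sure if it's the right approach): `nvidia-smi dmon -s u` breaks GPU utilisation into separate columns for the CUDA cores (sm) and the dedicated encode/decode blocks (enc/dec). A rough Python sketch that averages those columns, using made-up sample output just for illustration:

```python
# Illustrative sketch: parse `nvidia-smi dmon -s u` style output to see which
# GPU units are busy during an encode. The column layout (sm/mem/enc/dec)
# follows dmon's format; the SAMPLE text below is invented for this example.

SAMPLE = """\
# gpu    sm   mem   enc   dec
# Idx     %     %     %     %
    0    92    41     0     0
    0    88    39     0     0
"""

def parse_dmon(text):
    """Return one dict per sample row, skipping the '#' header lines."""
    rows = []
    for line in text.splitlines():
        if line.startswith("#") or not line.strip():
            continue
        idx, sm, mem, enc, dec = line.split()
        rows.append({"sm": int(sm), "enc": int(enc), "dec": int(dec)})
    return rows

rows = parse_dmon(SAMPLE)
avg_sm = sum(r["sm"] for r in rows) / len(rows)
avg_enc = sum(r["enc"] for r in rows) / len(rows)
print(f"CUDA cores (sm): {avg_sm:.0f}%, dedicated encoder (enc): {avg_enc:.0f}%")
```

If sm is high but enc is zero (as in the sample), the GPU is doing effects/scaling work rather than the H.264 encode itself, which would fit the Magic Bullet / downscale theory.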
Is Magic Bullet causing the GPU to be used? Or is it the 4K-to-1080p downscale? I need to know for two reasons...
One is that we need a new GPU to replace the Quadro K4000 soon; it just doesn't have the VRAM for playback etc. Two is that my GPU at home isn't being used, and I feel like it could be....
Cheers!
