GPU and CPU encoding loads with effects
Alright, so correct me if I'm wrong here, and if I am wrong, I'd appreciate some explanation.
So I have this 20-minute timeline that I'm currently rendering/encoding to H.264 with Media Encoder and CUDA on.
The timeline consists of A LOT of 200fps .R3D clips time remapped so they play back in realtime in a 25fps timeline. The footage varies between 3K and 5K, downscaled to 2K (2048x858 due to the 2.40:1 ratio).
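(For context on the remap, if my math is right: playing 200fps material at real-world speed in a 25fps timeline is an 800% speed change, since 200 / 25 = 8, so roughly every 8th source frame ends up in the output.)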
I have the native Premiere Dust & Scratches effect on almost all clips (dirty lens), plus an adjustment layer with a simple grade/color correction on top of all clips.
So to the actual question:
When I rendered out a preview of the timeline, I got render times down to around 11 minutes, with my CPU load between 70-95% and my GPU load around 30-60%.
Now when I'm rendering with the Dust & Scratches effect on the clips (and yes, I know I'll get longer encoding/render times due to the effects), I see that my render time is around 40 minutes, CPU load is around 95-100%, but my GPU only sits at around 10-15%??
How come adding an effect like Dust & Scratches changes how the work is split between the GPU and CPU? In my head it should only "add more stuff the CPU/GPU has to do"; in other words, I'd expect the GPU to end up working at around 60-80% load.
I have a newly built Threadripper 1920X, 32GB DDR4 3600MHz, SSDs and a 2080 Ti, which is more than capable of rendering heavy stuff. I just don't understand why it's not that efficient once that effect is added.
Best regards
