I have HEVC hardware decoding enabled, and I'm attempting to edit 4K60 H.265/HEVC 10-bit footage. I'm seeing 100% CPU utilization and 0% GPU utilization on my NVIDIA GTX 1080 with the latest Studio driver.
Is hardware decoding not possible for my scenario? I'd love to stop going through the proxy workflow.
Hi Default,
No, you'd need one of them handy dandy shiny new GPUs for that. Here's the NVIDIA matrix that spells out your fate. Sorry about that.
Regards,
Kevin
Hi Kevin,
This chart says my 1080 should support 10-bit H.265/HEVC decoding for 4:2:0 footage, but it doesn't say whether 4:2:2 is supported.
I repacked some of this footage with the H.265/HEVC codec in Adobe Media Encoder (CBR 800 Mbps) to send to a client. When I bring the repacked footage into a timeline, it plays much better and uses up to 70% of my GPU's decoder during full-resolution playback. Is there a reason this repacked footage utilizes my GPU when the original doesn't?

Transcoding to this format takes a long time, but there seems to be minimal information loss. Compared side by side with the same footage transcoded to ProRes 422 HQ, there is noticeable artifacting in the ProRes when grading, and none in the H.265. Without hardware encoding it seems infeasible to repack all my footage to H.265, although the NVIDIA matrix says hardware encoding should be possible.
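For anyone hitting this thread later: the difference between 4:2:0 and 4:2:2 is how much chroma the stream carries (4:2:0 stores 1.5 samples per pixel, 4:2:2 stores 2.0), which is why decoder support can differ between the two. A quick back-of-the-envelope sketch of the uncompressed data rates involved (my own numbers, not from the NVIDIA chart):

```python
# Rough uncompressed data rates for UHD 4K60 10-bit, by chroma subsampling.
# 4:2:0 carries 1.5 samples per pixel (Y at full res, Cb/Cr each at 1/4 res);
# 4:2:2 carries 2.0 samples per pixel (Y at full res, Cb/Cr each at 1/2 res).

def raw_bitrate_gbps(width, height, fps, bit_depth, samples_per_pixel):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

print(raw_bitrate_gbps(3840, 2160, 60, 10, 1.5))  # 4:2:0 -> 7.46496 Gbps
print(raw_bitrate_gbps(3840, 2160, 60, 10, 2.0))  # 4:2:2 -> 9.95328 Gbps
```

If the Media Encoder repack came out as 4:2:0 (likely, since most H.265 export presets are 4:2:0), that would explain the behavior described above: the GPU can decode the 4:2:0 repack while the camera-original 4:2:2 stream falls back to the CPU.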