
Stuttered source playback - is hardware decode on the graphics card supported?

New Here, Jul 06, 2019

Does Premiere use the hardware video decode function of modern video cards?

System: Windows 10, latest.

CUDA: on

Hardware acceleration in settings: on

My graphics card (GTX 970), like any modern card, offers hardware decode for the H.264 format (8-bit only). (The new RTX 2060/2070/2080 are even better, offering video decode for H.264/H.265/VP9 in 8- or 10-bit.)

https://developer.nvidia.com/video-encode-decode-gpu-support-matrix

- VLC player: when I play back an H.264 video with an external program like VLC, Windows Task Manager shows 30-50% activity for video decode. CPU is at 25%. The 4K source plays back smoothly.

- Premiere Pro, same 4K video: in Task Manager there is no video decode activity. I tested 8-bit sources at 1920, 3840, 4096 and 8K, all with the same result. No hardware video decode support. Playback is dropping frames at 100% CPU.
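
As a cross-check that is independent of any particular player, ffmpeg can decode the same clip once in software and once via the card's NVDEC engine while you watch the "dec" column of nvidia-smi. A minimal sketch in Python (the file name is a placeholder, and it assumes an ffmpeg build with CUDA/NVDEC support):

    import subprocess

    SOURCE = "test_4k_h264.mp4"  # placeholder: any local H.264 clip

    # Decode once on the CPU and once with "-hwaccel cuda" (NVDEC),
    # discarding the frames. ffmpeg prints the achieved decode speed;
    # run "nvidia-smi dmon -s u" in another terminal and the "dec"
    # column should only rise during the second pass.
    for hwaccel in ([], ["-hwaccel", "cuda"]):
        cmd = ["ffmpeg", "-hide_banner", *hwaccel,
               "-i", SOURCE, "-f", "null", "-"]
        print("Running:", " ".join(cmd))
        subprocess.run(cmd, check=True)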

Question:
Premiere is NOT using the hardware video decode from my video card. Is this as intended?

Why is PP not using hardware video decode for the source monitor? Playback in the source monitor is of course without effects, and it is currently dropping frames for 4K sources. A faster CPU will help, but CPU utilization will always be a lot lower if the video card's hardware decode is used. That is what it is for. And sources like H.264 and H.265 in 4K are normal now.

[Attachments: 3840 h264 30fps in vlc.jpeg, nvidia video decode matrix.jpeg]

LEGEND, Jul 06, 2019

Premiere Pro can use the QuickSync feature of some Intel processors for hardware decode/encode, but not the GPU.
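
If you want to verify that QuickSync decode actually works on a given machine, the same kind of ffmpeg test applies; a quick sketch, assuming an ffmpeg build with QSV support (the clip name is again a placeholder):

    import subprocess

    # Decode via Intel QuickSync (QSV), discarding the frames; if the
    # build or the iGPU lacks QSV support, ffmpeg exits with an error.
    subprocess.run(["ffmpeg", "-hide_banner", "-hwaccel", "qsv",
                    "-i", "test_4k_h264.mp4", "-f", "null", "-"],
                   check=True)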

New Here, Jul 06, 2019

It seems strange to use the CPU's built-in, relatively slow video engine when there is a fast video card. I suppose a GTX 1060/1070 has more decoding capability than a built-in GPU, and not all CPUs have a built-in GPU.

Maybe this was a choice by the design team years ago, when video cards did not have sophisticated hardware decoding?

LEGEND, Jul 06, 2019

Playback is just one factor of editing. The engineers have to build a system designed to handle a ton of different effects on different media that's grabbed bit by bit from here and there and assembled into "a video" on the fly.

It's far more complex than a simple video player.

And within this NLE are some effects that will rag that card out. Would you rather the card was used for basic playback instead of the heavier effects?

I wouldn't. We need *balanced* rigs designed with the app in mind. And that differs app to app.

Neil

New Here, Jul 12, 2019

Thanks. I understand that other video editing programs (Resolve?) use one or more graphics cards to accelerate not only the effects but also the display output. From what I read on the internet, it is very effective.

I can imagine that such acceleration is also a possibility for Adobe.

LEGEND, Jul 12, 2019

Every app does things differently in building its process: how it scales processing through core counts, how/why/when it uses the GPU, how ... everything ... is split up during processing.

Users think "Oh, they should just code it so it uses cores as you've got them, so a dual Xeon with 20 cores is fully used."

But it ain't that easy. MANY of the calculations need to be processed sequentially. That doesn't work split up across separate cores; many particular processes need to stay on one core. So how much gets processed sequentially versus what can run simultaneously? That's a complex question, and rebuilding the code to change it is difficult. It takes a ton of thought and work.
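
To put a rough number on that: Amdahl's law says the sequential share of the work caps the speedup no matter how many cores you add. A quick sketch (the 90%-parallel figure is just an illustrative assumption):

    def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
        """Best-case speedup when only part of the work can be split up."""
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    # Even with 90% of the work parallelizable, a dual Xeon with
    # 20 cores tops out well short of 20x:
    for cores in (2, 4, 8, 20):
        print(f"{cores:>2} cores: {amdahl_speedup(0.9, cores):.1f}x")
    # ->  2 cores: 1.8x / 4 cores: 3.1x / 8 cores: 4.7x / 20 cores: 6.9x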

Same with GPU use. I work some with mogrts and such, which are, well, best built in Ae. Go to NAB or MAX and listen to the arguments between Ae pros over which effects should or should not be ported to the GPU that aren't already. Some REALLY want effect X via the GPU; another will moan because he is far more worried about something he already processes through the GPU, and adding the other effect to the GPU stack would slow his renders down. So he doesn't want the additional effect ported to the GPU.

Avid, Resolve, Baselight, Premiere, FCPx all use the hardware differently from each other. Some are better at some things, and not so good at others.

Resolve ... starting as a color app, of course ... was built around needing to feed many monitors simultaneously and hash through heavy media. So multiple GPU use was an absolute need for their users.

Until recently, very few editors ran more than two monitors or HAD to run heavy media, as proxy workflows have been common. So in Premiere, the work went into driving the CPU for the various effects and such, integrating use of the GPU as the CPU could get to it.

Well, yea, that's changed, hasn't it? I'm running three monitors, and that's getting pretty common. So this is one area where, quite naturally, because of the different user needs of the two tools, the developers had gone in entirely different directions.

The Premiere coding needs to catch up in some areas, Resolve coding in others. I can assure you from talking with folks in both groups, they are both working furiously at it ... which is why competition is good for us tool users.

Neil

LEGEND, Jul 07, 2019

I would agree. My thinking is that the video chip on a CPU should be fully disabled by the user to prevent issues when using a proper GPU, issues which were present aplenty in PP's past and even today throw up errors for many users.

LEGEND, Jul 06, 2019

Designing the functions of the supported hardware for an NLE is a complicated set of trade-offs, and every design team chooses the ones that best fit the aims they're going for. Premiere was built around the CPU/RAM subsystem as the primary 'base' engine, with the GPU hauled in for specific assigned tasks. Some of the 'reserved' GPU tasks, like Lumetri processing and Warp Stabilizer, can dominate many GPUs' capabilities.

Ergo, the way the engineers designed the system.

There's always room to look at modifications, of course. Feel free to suggest them here, and especially on the bug/features UserVoice page ... and if you do post there, reply back here with a link so others may "upvote" if they choose.

Adobe UserVoice Bug /Feature form: https://adobe-video.uservoice.com/forums/911233-premiere-pro

Neil

New Here, Jul 06, 2019

Thanks. I am not sure if there is some setting to force Premiere to use hardware decoding of source material.

The point is that the very fast hardware decode from a good video card has already been there for many years. It is used in many, even free, video players. 2K, 4K, even 8K is possible on a computer with a relatively slow CPU.

It does not seem too complicated if a free video player can handle it.
