Hello all, I have a quick but important question about Premiere Pro CC 2018. It seems that Premiere Pro is using the integrated graphics on my CPU rather than my installed and working dedicated graphics card to render GPU-accelerated effects and such. I am running a fresh install of Windows 10 with the latest updates. I also have a fully updated Creative Suite. Below I have listed my system specs and screenshots for reference. Notice that under GPU Engine in Task Manager, it lists GPU 0, my integrated graphics. I have also done some research of my own, and I came across adding a "cuda_supported_cards.txt" file to Premiere Pro's installation directory. I have done that, and I have tried typing both "GeForce GTX 1060" and "GeForce GTX 1060 6GB" into the document, but it's the same story in both cases: it uses my integrated graphics. The file currently lists "GeForce GTX 1060".
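One way to rule out a simple name mismatch is to generate cuda_supported_cards.txt directly from what the driver reports. This is only a sketch: it assumes nvidia-smi is on your PATH, the install path below is a guess at a default CC 2018 location, and you would need to run it with administrator rights to write into Program Files.

```python
# Sketch: write cuda_supported_cards.txt using the exact GPU name the
# NVIDIA driver reports, so the entry can't be off by a suffix like "6GB".
# The Premiere directory below is an assumed default -- adjust for your system.
import subprocess
from pathlib import Path

PREMIERE_DIR = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CC 2018")

def detect_gpu_name() -> str:
    """Ask the NVIDIA driver for the card's marketing name via nvidia-smi."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def write_supported_cards(name: str, directory: Path = PREMIERE_DIR) -> Path:
    """Write the card name, one per line, where Premiere looks for it."""
    path = directory / "cuda_supported_cards.txt"
    path.write_text(name + "\n", encoding="ascii")
    return path
```

Calling `write_supported_cards(detect_gpu_name())` then matches the file to the driver's own string, whatever that happens to be on your machine.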
Thanks for any and all help,
7th Gen Intel i7-7700K Processor (No current overclock)
NVIDIA GeForce GTX 1060 6GB
256GB M.2 SSD
2TB 7200 RPM HDD
16GB of 2400MHz DDR4 Memory
Windows 10 Pro 64-bit
Screenshots (Idle, Premiere Pro loaded and open, not rendering):
Screenshots (Premiere running, Rendering previews for a project of mine):
[Moderator note: moved to best forum for technical issues.]
I would go so far as to say that for the people in this forum, it is a problem. It may not be a problem on most desktops/laptops, but still, the people in this forum are here to voice their problem.
So I've learned a couple of things since my last post. I just built a desktop yesterday and was using a laptop before that. On the laptop, AME was using both the IGP and the GPU, but GPU usage was pretty low (5%-10%). I figured it was normal. On my desktop, however, the IGP usage is 100% and the GPU is on average above 50%. The difference here is a 45W TDP with a locked CPU and a power supply inadequate for such demanding power draw, versus an unlocked CPU that can easily draw over 100W with plenty of power supplied to both the CPU and GPU. My laptop was power-limit throttling constantly, and I was often editing and rendering at 1.0 GHz. Awful. Without the throttling, the GPU usage was much better, but there was such a short window before the throttling started. If you're on a laptop or something with similar power constraints, or have a locked CPU, that might be the issue.
I've learned also through testing on my desktop that it is much much better to have the IGP enabled. Renders complete much faster with less heat and editing is also much smoother. Hope this helps.
Great information there ... thanks for posting!
I have a dumb question.
Exactly how do I enable the IGP? How do I get to it, etc.?
I have Premiere 2018, version 12.
I looked at your specs. You should be able to turn on the IGP in your BIOS. I'm not sure what motherboard you have or how the BIOS is laid out for that series, but for mine it was under: Advanced > System Agent (SA) Configuration > Graphics Configuration > iGPU Multi-Monitor > Enabled
It seems that the BIOS automatically turns off the IGP when you install a graphics card. Don't forget to install the driver for it if that's not already done.
So has this embarrassing issue finally been resolved? I'm about to embark on a full-time gig using PP and will invest in a proper AMD GFX card for the task, but I don't wanna waste money on a powerful GFX card if Adobe are still acting like —.
Moderator note: Profanity warning. It is unacceptable to use profanity.
My only gripe is that this seems like it would be fixable, but instead we're just told to disable the iGPU, thus disabling potentially three screens the user could have used if the issue were fixed by Adobe.
I am also frustrated with how little Adobe takes advantage of CUDA in my GTX 1060 6GB graphics card.
I don't have the issue of Adobe using the wrong card; it's just that it hardly uses the GPU at all during playback. Adobe uses 100% CPU during un-rendered playback while my graphics card just sits at under 5% utilization. However, if I open any of my H.264 files from Windows Explorer, the file opens in "Movies & TV" and I see GPU utilization spike up to 50% while the CPU is at 5%, essentially the opposite of what happens during playback in Adobe. I don't get it.
Dell XPS 8900
Intel i7-6700 CPU @ 3.40GHz (6th Gen)
24GB DDR4 Memory at 2133 MHz
GeForce GTX 1060 6GB
Samsung 850 Evo SSD - O/S and Apps
Samsung 860 Evo SSD - Media, project files, media cache, scratch
Seagate Barracuda 1TB 7200RPM - Long-term storage
Adobe's design uses the GPU for certain things on the GPU-accelerated effects list, especially color work, Warp Stabilizer, and major resizing.
GPU Accelerated Effects: https://helpx.adobe.com/premiere-pro/using/effects.html
So if you're not using something from the list at the moment in playback or exporting, the GPU will not be used for much. H.264 decoding/decompressing, in particular, is a CPU-only task.
That's just an explanation of what is. Over time, they've added more effects to that list every release. There's still a lot more GPU use they could make, and for a rig like yours with a GPU way past 'balanced' with that four-core CPU, it would be great if they did use the GPU more.
A 1050/4GB would maybe have been a better 'fit' for your CPU/RAM, as when used with GPU-accelerated effects you'd be running at a much higher percentage of the card's capabilities. To really use that 1060/6GB as-is, you'd probably need a six-core CPU.
Building computers for PrPro/NLE use is a tough thing ... most gaming concepts for building a 'video' rig not only don't apply, they are counter-productive. Fast cores, 3.8 GHz or better and up to around 10 of them, are a starting point. Then go with as close to 10GB of RAM per core as you can ... then match a GPU to that capability. Use SSDs for OS/programs, and maybe an M.2/NVMe drive for projects/cache and even "active in-use" media if you can.
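The RAM rule of thumb above can be turned into a quick calculator. Note that the 10GB-per-core figure is this post's guideline, not an official Adobe spec:

```python
# RAM target from core count, per the rule of thumb in this thread
# (10 GB per core is the poster's guideline, not an Adobe requirement).
def suggested_ram_gb(cores: int, gb_per_core: int = 10) -> int:
    """Suggested RAM for a PrPro build with the given core count."""
    return cores * gb_per_core

for cores in (4, 6, 10):
    print(f"{cores} cores -> ~{suggested_ram_gb(cores)} GB RAM")
```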
All that aside, pushing to get more things on the GPU Accelerated list is something to work for.
There's a bunch of GPU related UserVoice ideas/complaints posted. Couldn't find one specifically for adding to the general workload of the GPU ... so I filed one.
You are most welcome to go up-vote this and any others that tickle your fancy ...
Hi, sorry for my English, and apologies if my question has already been answered above...
I have a desktop without a discrete GPU, and I want to use the integrated GPU in my Intel 8700K.
I have an old version (Premiere Pro CS6). How can I use it, if it can even be used in this old version...
Thank you in advance.
Unfortunately, no Windows version of CS6 supports OpenCL at all, so with just the integrated graphics your system will be "locked" to the software-only mode (CPU-only, with no graphics utilization whatsoever).
Just to make this clear: this is happening on all laptops, all around the world. A lot of my friends, myself included, use different brand-name laptops, and they are all having the same issue. This is obviously an Adobe problem. Sad to say, it will not be fixed until we VOTE for the fix. So...
1. Log on
2. Click Vote (the more votes, the better)
3. Wait for a patch. If you hurry and vote, we can get enough votes in before the next patch. There have to be enough votes for them to consider the problem worth fixing. Good luck everyone.
I know you mean well here, but can we please encourage users to vote on the much more highly upvoted UserVoice idea below? We don't need the ability to select the graphics card; we need Premiere Pro to recognise and utilise the more powerful graphics card automatically.
^ Please upvote this one with all your might ^
For those that need a quick fix:
I've ended up pre-rendering the timeline and selecting 'Use Previews' in the export window. This yielded the best results, with a 100% success rate.
I do hope they fix this issue ASAP though
This is going to seem silly..., BUT I was having the same issue after IT moved my office. After hours of scouring forums looking for a solution, it turned out my monitors were plugged into the motherboard and not the NVIDIA GPU, so Windows assumed I wanted the Intel HD Graphics as the default. I plugged the monitors into the right ports, disabled the Intel HD Graphics through Device Manager for good measure, and everything was good to go.
Tried disabling the Intel HD Graphics in Device Manager. Didn't work for me. Over a month ago I had a support call with Adobe. They did a screen share with me and saw the problem. They agreed that they need to do something in Premiere Pro CC 2019 to fix the problem.
Noticing Dell mentioned a lot here. I have a new Dell XPS 8930 with same problem. Adobe: please fix this issue!
This is a user to user forum. If you wish to communicate with the development team please post on the UserVoice system which goes directly into the engineers system.
And ... please do.
Just FYI, I recently used the Premiere Rush app, and it chose the more powerful graphics card in my notebook during export/rendering (50% use on the powerful graphics card, ~0% on the motherboard graphics, as shown in Task Manager). Rush doesn't seem to have a preference setting for choosing a graphics card, but it chose correctly. It would seem that Adobe CAN do it correctly; they just need to have the Rush engineers teach the Premiere engineers.
For those who are still angry about your GTX card not being used at all, I have one way to use it FULLY during EXPORT ONLY (meaning using the GTX hardware encoder instead of your iGPU's hardware encoder, which I found much, much, much faster).
I have a two-year-old laptop with an i7-6700HQ and a GTX 960M, running the latest PR 2019. Though I still see only minor usage on the GTX GPU during playback, I recently found a way to use the GTX card for exporting, which has surely saved me a hell of a lot of time.
But before I share my solution, I would like to point out something that should be the root of the "preview not using the GTX GPU" issue: hardware decoding. If you have checked "Accelerated H.264 decoding" in PR's settings, you will notice that your iGPU's "Decode" engine is under especially high load in Task Manager. This is because PR uses your iGPU's hardware decoding unit to speed up decoding your source videos, instead of decoding on the CPU. However, the problem is that your GTX GPU can also decode the same video, at a much higher speed than the iGPU, yet it is not used, because Adobe only recognises Intel Quick Sync Video (QSV, the official name for the video encoding/decoding unit in an iGPU). This means that even if you disable your iGPU, Adobe will only fall back to software decoding instead. This applies to AME and AE as well.
HOWEVER, as many of you have mentioned, your GTX GPU is not at 0%, but rather shows very minor usage. You will also observe that its VRAM is actually used. This is because your PR UI, including the preview panel, is RENDERED on your GTX. What this means is that Windows uses the GTX card to render PR's user interface, while the video you are processing in PR is STILL processed/decoded by the Intel iGPU's QSV. This explains why forcing PR to run on the GTX GPU does not really "transfer" the usage from the iGPU to the GTX. Your GTX IS indeed being used, but to DISPLAY PR, not to DECODE and make your previews. The GTX simply copies the decoded video from the iGPU to the display. That's all it does.
That being said, it should now be clearer what the root cause of this issue is. Simply saying "it's not using the GTX GPU" is very vague and, in fact, wrong.
Now that you know how multiple GPUs work together, time to share my solution for encoding on the GTX.
Caution: the GTX GPU's built-in encoder offers much higher speed at a small cost in compression quality, especially for low-bitrate video. I recommend setting a relatively higher bitrate for this.
Below is the link to an encoder plug-in. After installing it, restart PR and click Export. You will find "Voukoder" as an option in the "Format" dropdown list. After selecting it, select "NVIDIA NVENC H264" in the video settings below. This is so far the ONLY thing I've used with PR that can drive my GPU to 100% usage and max VRAM usage.
I did a simple test with a 1080p, 2.30 min video. Using PR's default H.264 encoder took 3 min 29 sec, while using NVENC took only 59 sec.
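For reference, the two times quoted above work out to roughly a 3.5x speedup:

```python
# Quick check of the NVENC vs. software-encode speedup quoted above.
x264_time_s = 3 * 60 + 29   # PR's default H.264 encoder: 3 min 29 sec
nvenc_time_s = 59           # Voukoder NVENC encode: 59 sec
speedup = x264_time_s / nvenc_time_s
print(f"Speedup: {speedup:.1f}x")  # Speedup: 3.5x
```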
Oh, and this plugin is open source, meaning the code is out there for anyone to inspect, so it's not hiding viruses or other malware.
Sure. Happy to see my suggestion helped
By the way, I need to encode the video as high-quality 1080p. But this codec encoded 20 minutes of such video into 80 MB, which confuses me a bit )))
Can you advise where I can read about the codec settings so it will produce better quality?
Due to the way hardware encoding works, at the same bitrate its quality will generally be lower than software encoding. However, this improves as NVIDIA improves the hardware encoding unit in newer GPUs. For example, tests have shown that at the same bitrate, a 1080p video encoded by a GTX 1060 is about the same quality as the x264 fast preset.
Hence, your encoding settings actually depend on how new your GPU is. I have a GTX 960M, I encode 1080p at 12-18 Mb/s, and it looks as fine as x264. My suggestion is to use VBR and set the target bitrate 10%-20% higher than your usual x264 settings.
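To see why that 80 MB file looks low quality, it helps to relate bitrate, duration, and size. A minimal sketch, covering the video stream only and ignoring audio and container overhead:

```python
# Rough bitrate/size arithmetic for a video stream
# (audio and container overhead ignored; 1 MB = 8 Mb here).
def file_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Approximate file size in MB for a given average bitrate."""
    return bitrate_mbps * duration_s / 8

def avg_bitrate_mbps(size_mb: float, duration_s: float) -> float:
    """Infer the average bitrate from a finished file."""
    return size_mb * 8 / duration_s

# 20 minutes compressed into 80 MB works out to ~0.53 Mb/s,
# far below the 12-18 Mb/s range suggested above for 1080p NVENC.
print(round(avg_bitrate_mbps(80, 20 * 60), 2))  # 0.53
```

So the fix is simply a much higher VBR target in the Voukoder/NVENC settings, not a different codec.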