
Hardware question (video Cards)

New Here ,
Sep 23, 2020


I currently have a GeForce GTX 780 Ti 3 GB with 2880 CUDA cores. I've been looking at the GeForce RTX 2070 8 GB and noticed it only has 2304 CUDA cores. I realize the clock speed and the 5 extra GB of faster VRAM have a lot to do with it, but the lower CUDA core count concerns me.

 

Should I see an improvement with my video rendering in Premiere Pro?

 

Rest of my system, which will be upgraded ASAP:

i7-6700K @ 4.5 GHz (soon to be an AMD Ryzen 9 3900X 12-core on an X570 motherboard)

32 GB DDR4

Samsung M.2: OS and apps

Samsung SSD: data

Corsair SSD: scratch disk


LEGEND ,
Sep 23, 2020


You cannot directly compare CUDA core counts across completely different GPU architectures. The GTX 780 Ti uses a much older architecture than the RTX 2070, and a Kepler core is nowhere near as powerful as a Turing core to begin with. Worse, the GTX 780 Ti cannot output 10-bit color via OpenGL (OpenGL output is limited to 8-bit color on all gaming GPUs prior to Pascal). And the two newest versions of Premiere Pro now require full support for CUDA 10.2 or higher in order to run properly. Unfortunately, all Kepler GPUs are deprecated in the latest version of CUDA (CUDA 11); their support stops at CUDA 10.1 or 10.0. That disables all GPU acceleration in the two newest versions of Premiere Pro simply because these GPUs are now too old (and thus cannot use any of the GeForce Studio drivers at all).
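To make the "cores alone don't tell you anything" point concrete, here's a rough back-of-envelope comparison of theoretical FP32 throughput (cores × 2 FLOPs per cycle via fused multiply-add × boost clock), using Nvidia's published reference boost clocks for stock cards. This is only a sketch: real Premiere performance also depends on per-core architecture efficiency, memory bandwidth, and driver support, so the gap in practice is usually bigger than these numbers suggest.

```python
# Theoretical peak FP32 throughput: cores * 2 FLOPs/cycle (FMA) * boost clock.
# Boost clocks are Nvidia's reference values (assumes stock, non-overclocked cards).

def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Theoretical peak FP32 TFLOPS: 2 FLOPs per core per cycle via FMA."""
    return cuda_cores * 2 * boost_ghz / 1000.0

gtx_780_ti = fp32_tflops(2880, 0.928)  # Kepler, 2013, 928 MHz reference boost
rtx_2070 = fp32_tflops(2304, 1.620)    # Turing, 2018, 1620 MHz reference boost

print(f"GTX 780 Ti: {gtx_780_ti:.2f} TFLOPS")  # ~5.35
print(f"RTX 2070:   {rtx_2070:.2f} TFLOPS")    # ~7.46
```

So even on raw paper math, the 2070 comes out roughly 40% ahead despite having fewer cores, before counting Turing's per-core improvements.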

 

And with the release of the new RTX 3000 series GPUs using the new Ampere architecture, expect the next major Nvidia driver branch to completely discontinue support for all GeForce GPUs earlier than the desktop 900 series.

New Here ,
Sep 24, 2020


So, if I'm understanding this correctly, the 2070 is too old to add any benefit in rendering in Premiere Pro. What would you suggest?

LEGEND ,
Sep 24, 2020


No, you've got it backwards. It's your GTX 780 Ti that's too old. That card dates back to late 2013, making it nearly seven years old now. The RTX 2070 is much newer, dating only to the second half of 2018.

 

In addition, your 780 Ti is short on video memory. Adobe now strongly recommends a GPU with 4 GB or more of its own dedicated VRAM to function properly.
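A quick way to see why 3 GB gets tight: count how many uncompressed 4K frames fit in VRAM. The sketch below assumes RGBA with 32-bit float per channel, which GPU effects pipelines commonly use internally; it is an illustration only, since Premiere's actual memory use varies with the footage, effects, and codecs involved.

```python
# Back-of-envelope: how many uncompressed UHD (3840x2160) RGBA frames fit in VRAM?
# Assumes 4 channels at 4 bytes (32-bit float) each -- a common internal GPU format.

def frame_bytes(width: int, height: int, channels: int = 4, bytes_per_channel: int = 4) -> int:
    """Size of one uncompressed frame in bytes."""
    return width * height * channels * bytes_per_channel

def frames_that_fit(vram_gb: float) -> int:
    """Whole UHD frames that fit in the given VRAM (ignoring driver overhead)."""
    return int(vram_gb * 1024**3 // frame_bytes(3840, 2160))

print(frames_that_fit(3))  # GTX 780 Ti, 3 GB -> 24 frames
print(frames_that_fit(8))  # RTX 2070, 8 GB  -> 64 frames
```

And that ignores the working buffers each GPU effect needs on top of the frames themselves, which is why the 4 GB floor exists.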

New Here ,
Sep 24, 2020


Thank you, you've been a great help. I was also considering the 1660 Ti; would that be able to handle 10-bit H.265, or should I stick with the 2070?
