
10-bit colour depth and Nvidia GTX 1080 within Premiere Pro?

Explorer, Mar 22, 2019

I've seen people saying, both on the internet and here on the forum, that it is not possible to send a 10-bit image to a (10-bit) display from Premiere Pro when using an NVIDIA GTX 1080 (in 10-bit mode). So I just wanted to check: is this outdated information, or is it still the case?

Views: 3.7K

1 Correct answer

LEGEND, Mar 22, 2019

Still correct.

But even if it weren't, it's still not the best option.  The idea with external displays is to remove variables like the OS and GPU driver from the signal chain.

You need something like the following for both 10-bit output and proper calibration.

Intensity Pro 4K | Blackmagic Design

LEGEND, Mar 22, 2019

I'd have to check with the engineers. Internally Pr is all built for 32 bit float except when processing certain older effects (which are marked and best avoided).

The scales on the right side of the Lumetri scopes are confusing because they've been "simplified". For instance, people assume Pr must be treating data at data levels rather than video levels because the RGB Parade shows 0-255.

Wrong.

That number setup is there to show what the monitor mapping will be, as this is designed for editors, not colorists.

Switch to, say, Waveform YC (with or without chroma) and suddenly the right-side scale is 16-235.

Oh. So it's working in 8 bit?

Wrong again. Lumetri, like the main color works and processing, is all 32-bit float, and it treats each format as that format's standard specifies. So other than a few image-sequence formats and a 4444 format or two, everything is handled at video/legal levels in whatever bit depth the clip comes into Pr in. A 10-bit clip is viewed as a 10-bit clip, but the Waveform YC scale will still say 16-235 because they just didn't add the other number set to the UI. You have to translate mentally.
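The mental translation described above is just the standard Rec.709 legal-range scaling; here's a quick sketch of the arithmetic (generic video math, not Adobe's actual code):

```python
# Standard Rec.709 "legal/video" range code values per bit depth.
# Full range maps into 16-235 at 8 bits and 64-940 at 10 bits:
# the 10-bit values are simply the 8-bit ones scaled by 4.

def legal_range(bit_depth):
    """Return (black, white) legal-range code values for a bit depth."""
    scale = 1 << (bit_depth - 8)      # 1 for 8-bit, 4 for 10-bit
    return 16 * scale, 235 * scale

print(legal_range(8))   # (16, 235) -- the scale Waveform YC shows
print(legal_range(10))  # (64, 940) -- what a 10-bit clip actually uses
```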

Now that is internal processing and they haven't said what display bit depth it can send to monitors. I'll see if I can find that information.

Neil

LEGEND, Mar 22, 2019

I'd have to check with the engineers.

I don't think that would help.  GTX cards are limited to 8 bit output.

Thinking about it now, though, I don't know if that applies to RTX cards.

Either way, dedicated I/O is still the best option.

Engaged, Mar 22, 2019

This isn't true. The GTX 1080s and 2080s do support 10-bit if the monitor supports it. The 980s may, too.

The option to turn it on is in the Nvidia Control Panel under the Change Resolution tab. Under step 3, change the option to "Use NVIDIA color settings" and you will have the choice, under Output Color Depth, to change it to 10-bit.

Here is a screen grab:

10bit.JPG

Mentor, Mar 22, 2019

Does it really matter ??

If I shoot stuff with my Nikon D800 (HDMI out to an Atomos recorder using DNxHD), I get a 10-bit 4:2:2 space, but I only get the same color I started with (4:2:0). The only advantage is the higher bit rate and the 10-bit depth, so whatever effects and other stuff I add to the timeline is a tiny bit 'better' for the end product (assuming I export something remotely resembling the timeline).
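The 4:2:2 vs 4:2:0 distinction above comes down to how many chroma samples survive per frame; a quick sketch of the standard J:a:b subsampling math (generic video arithmetic, nothing camera-specific):

```python
# Chroma sample counts per plane under common subsampling schemes.
# 4:2:2 halves chroma horizontally; 4:2:0 halves it both ways.

def chroma_samples(width, height, subsampling):
    """Chroma samples per plane for a frame, per J:a:b notation."""
    factors = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}
    h, v = factors[subsampling]
    return (width // h) * (height // v)

w, h = 1920, 1080
print(chroma_samples(w, h, "4:2:2"))  # 1036800 -- half the luma samples
print(chroma_samples(w, h, "4:2:0"))  # 518400  -- a quarter of the luma samples
```

So recording a 4:2:0 source into a 4:2:2 container doesn't create new color detail; the extra chroma samples are just upsampled from what the camera delivered.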

Isn't this sorta stuff kinda like blowing smoke up each other's behinds ???  I mean, do I really need to talk to an engineer ( or the god of color realm in the world of computer RGB, Web, Broadcast, Projection, etc. ? )

Yikes !  No wonder I'm often misinformed and confused !

Engaged, Mar 22, 2019

I think the point was proper monitoring of what you shot (truer colors, no banding). My particular point was that 10-bit is possible with GTX cards, so why not use it if you have it?
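The banding point can be put in numbers: quantizing a smooth ramp at 8 and 10 bits shows how many distinct tonal steps each depth can render (a toy illustration, not Premiere-specific behavior):

```python
# Quantize a smooth 0..1 gradient at a given bit depth and count
# the distinct output levels -- fewer levels means coarser steps
# and more visible banding on a 10-bit-capable display.

def gradient_steps(bit_depth, samples=4096):
    """Quantize a 0..1 ramp and count the distinct levels produced."""
    levels = (1 << bit_depth) - 1
    return len({round(i / (samples - 1) * levels) for i in range(samples)})

print(gradient_steps(8))   # 256 steps across the ramp
print(gradient_steps(10))  # 1024 steps -- four times finer gradation
```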

Mentor, Mar 22, 2019

Eizo primary calibrated to Rec 709 2.2

Resolve 15

Blackmagic DeckLink 4K card with SDI in and out, etc.

SDI to video monitor (ickygamma)

Timeline and export default of Rec 709 2.4

Works fine.

Hobbyist, not professional. Don't know any engineers or smart people.

good to go for vimeo.

Mentor, Mar 22, 2019

Don't know what you're shooting, etc. I got S-Log stuff recently from a DP and it was MP4 4:2:0. He also has problems now and then with banding. He uses FCP and Resolve. I worked on the stuff and did an edit to learn about the source material. It came out nice (Resolve), but it was 4K and I had to optimize the footage for my pig of a computer (the edit one I use) because the pig was gasping for breath after only about 2 minutes of timeline stuff.

Came out nice. But HE (a pro with gazillions of years shooting pro stuff) has banding problems sometimes on his own computer. Go figure.

It is what it is.
