Standardization of Color Temp Adjustments in Adobe CC

Enthusiast, Feb 01, 2025

I recently discovered what appears to be an oversight, or a dumbing down, within the Adobe Creative Cloud apps.

The vast majority of my post-processing is done in LrC. I was not even aware of the differences between LrC, PS, and ACR. I really like that LrC uses the Kelvin scale, which lets the user better understand an image's color temperature. LrC provides a better analogy by relating the image to real-world Kelvin temperatures.

I'm not sure what the scales represent in either PS or ACR. I would like to see these apps mirror LrC, so that regardless of the app you are working in, setting the temperature to ~5600 K means the same thing.

 

 

Idea, No status
TOPICS: Windows
Views: 589

25 Comments
Community Expert, Feb 01, 2025

I don't use LrC, so I can't comment.

 

The Adobe Camera Raw (ACR) plugin's raw processing engine is what powers raw development in LrC, so if they use different scales or terminology in the interface, that must be cosmetic.

 

The Camera Raw Filter (CRF) inside Photoshop is based on ACR; however, it can't process raw data, and the temp/tint, exposure, and other raw-processing controls don't work the same on rendered data as they do on raw camera sensor data.

 

Photoshop is for processed/rendered data, so it doesn't work the same way as ACR. For example, many photographers think that the Exposure adjustment in Photoshop is for photographs in 8 or 16 BPC mode; however, it is designed for HDR 32 BPC imagery.
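To illustrate the 32 BPC point (a minimal sketch of the general principle, not Adobe's actual code): on linear HDR data, one photographic stop is simply a doubling of the linear values, which is why an Exposure control is at home there and behaves differently on gamma-encoded 8/16 BPC data.

```python
# Sketch only - not Adobe's implementation. Assumes a NumPy array of
# linear 32 BPC float RGB values.
import numpy as np

def apply_exposure(linear_rgb: np.ndarray, stops: float) -> np.ndarray:
    """Shift linear HDR pixel values by the given number of photographic stops."""
    return linear_rgb * (2.0 ** stops)  # +1 stop doubles, -1 stop halves
```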

 

Can you post comparative screenshots from LrC, ACR and Photoshop so that the forum can see exactly what features and differences you are comparing?

Enthusiast, Feb 01, 2025

Stephen 

You're missing the point.

What I am asking for should be clear from one word: "standardization."

A suite of apps should have rules that apply to all apps within the suite. If one app uses the measurement scale known in the scientific community for color temperature, Kelvin, why would another refer to it as +/-100? What is that measuring? When setting up studio lighting, color temperature is specified in Kelvin, not as +/-100. ACR should use the same scale for color temperature, as this is the unit of measurement.

Community Expert, Feb 01, 2025


westdr1dw wrote:

I was not even aware of the differences between LrC, PS, and ACR.



Again, can you please post comparative screenshots from LrC, ACR and Photoshop so that the forum can see exactly what features and differences you are comparing?

Community Expert, Feb 02, 2025

@westdr1dw Here is the point:

 

There is nothing in the camera that measures the actual color of the light in the scene. There is no actual Kelvin number, just a compensation to produce a visually neutral result according to certain algorithms in the camera firmware.

 

Same thing in a raw processor. The raw file is just a data dump from the camera sensor, the photons that were recorded. There is no inherent information about the light in the scene.

 

So, the Kelvin number you read in LrC/ACR is the amount of compensation applied to produce a visually "neutral" result. The Kelvin scale is an appropriate parameter to refer to with raw files, because it corresponds to a normal range of light conditions in the photographed scene. But note that the Kelvin scale does not apply to LED or fluorescent.
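To make that concrete, here is a minimal sketch of white balance as pure compensation (my own illustration, not ACR's actual engine; it assumes demosaiced linear RGB and a patch the user has declared neutral):

```python
# Sketch only - not ACR's engine. Gains are chosen so the chosen "neutral"
# patch comes out with R = G = B. A Kelvin readout is just a familiar scale
# mapped onto this compensation; nothing here measures the light in the scene.
import numpy as np

def white_balance_gains(neutral_patch: np.ndarray) -> np.ndarray:
    """Per-channel gains (normalized to green) that neutralize the patch."""
    r, g, b = neutral_patch.reshape(-1, 3).mean(axis=0)
    return np.array([g / r, 1.0, g / b])

def apply_white_balance(linear_rgb: np.ndarray, gains: np.ndarray) -> np.ndarray:
    return linear_rgb * gains  # broadcast the per-channel multipliers
```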

 

In a rendered RGB file, all this is moot and irrelevant. The "temperature" you have set in the raw processor/camera is now baked into the RGB numbers. It's already done. An RGB file just is what it is. Obviously, you can still adjust along the blue/yellow axis, and you can call that "temperature" for convenience and to separate it from the green/magenta axis - but it doesn't make any sense to specify Kelvin numbers. Hence, a -100 to 100 scale is much more logical.
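By contrast, a relative slider on rendered RGB amounts to something like the following sketch (the principle only, with an invented strength factor; the actual curve is certainly more sophisticated):

```python
# Sketch only. The -100..+100 amount is a relative push along the
# blue/yellow axis from wherever the image already is - no absolute
# Kelvin reference exists or is needed. Assumes RGB floats in 0..1.
import numpy as np

def adjust_temperature(rgb: np.ndarray, amount: int) -> np.ndarray:
    scale = (amount / 100.0) * 0.1      # invented strength factor
    out = rgb.copy()
    out[..., 0] *= 1.0 + scale          # warmer: boost red...
    out[..., 2] *= 1.0 - scale          # ...and cut blue (reversed when cooling)
    return np.clip(out, 0.0, 1.0)
```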

Community Expert, Feb 02, 2025

'What I am asking for should be clear from one word: "standardization."'

Just for clarity, Lightroom and ACR work exactly the same and, on raw files, use a Kelvin scale.

However, as D Fosse and Stephen point out, the Camera Raw Filter, or Lightroom/ACR working on an already processed RGB file, uses a +/- numeric scale to further adjust the temp and tint. There is no requirement for an RGB image file to carry the colour temperature in its metadata, so an image-processing application has no way of knowing the starting point of an RGB image, which could have been processed in any application or camera, or even computer rendered, and therefore no way to use a Kelvin scale. Hence the arbitrary +/- scale.
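You can check the metadata point yourself. A quick sketch using Pillow ("photo.jpg" is a placeholder path): the standard EXIF WhiteBalance tag stores only a mode flag (0 = auto, 1 = manual), so there is no Kelvin value for an application to start from.

```python
from PIL import Image
from PIL.ExifTags import TAGS

exif = Image.open("photo.jpg").getexif()
# Camera tags such as WhiteBalance live in the Exif sub-IFD (tag 0x8769):
for tag_id, value in exif.get_ifd(0x8769).items():
    print(TAGS.get(tag_id, hex(tag_id)), value)
# Typically prints e.g. "WhiteBalance 0" - a mode flag, not a temperature.
```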

So there is no discrepancy between applications.

 

Dave

Enthusiast, Feb 02, 2025

Apparently you do not use studio lighting; the color temperature scale is measured in Kelvin. If you call it color temp, then use the appropriate scale, regardless of the file format.

Community Expert, Feb 02, 2025

westdr1dw wrote:

Apparently you do not use studio lighting; the color temperature scale is measured in Kelvin. If you call it color temp, then use the appropriate scale, regardless of the file format.

Which post are you referring to exactly? 

Where would that Color Temperature be documented in a processed file like a psd, tif, jpg etc.? 

 

Please post the requested screenshots. 

Community Expert, Feb 02, 2025

'Apparently you do not use studio lighting'

 

I think you'll find that I do, and have done so since the 1980s.  If you want a Kelvin scale for RGB, perhaps you could suggest how it should be calculated on an unknown, pre-processed file with no such metadata.

Dave

Community Expert, Feb 02, 2025

@westdr1dw 

I use studio lighting for a living. That's my job, as a photographer at an art museum.

 

I am very familiar with the Kelvin scale and what it represents. But it doesn't apply to RGB data, where any compensation for the color of the light has already been done.

 

Take any image you want. What's the starting point on the Kelvin scale? What Kelvin value would you assign? It doesn't make any sense.

 

The ACR filter uses a small subset of the full ACR raw processor, using the same controls and sliders, with the same names. But an RGB file and a raw file are two very different things.

Enthusiast, Feb 06, 2025

What is the baseline/reference point being used in LrC for the Color Temperature? 

https://www.adobe.com/ca/creativecloud/video/discover/color-temperature.html

We just need to apply the same rules between the apps. 

 

Community Expert, Feb 06, 2025

What you are asking for is different to the discussion on how colour temperature can be used.

You want Photoshop to assign a starting point on the Kelvin scale for a pre-processed RGB file. The preprocessing could have warmed, cooled, or tinted the image. So how would you measure that from just the pixel data in the image, which is all it has to go on?
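Even the classic "gray world" estimate, which is about the best software can do from pixel data alone, only returns a relative correction built on an assumption. A sketch (not anything Adobe ships):

```python
# Gray-world illuminant estimation - sketch only. It assumes the scene
# averages to gray and returns gains that enforce that. It cannot tell a
# warmly lit scene from a warmly processed one, so no original Kelvin
# starting point can be recovered from the pixels.
import numpy as np

def gray_world_gains(rgb: np.ndarray) -> np.ndarray:
    means = rgb.reshape(-1, 3).mean(axis=0)  # average R, G, B
    return means.mean() / means              # per-channel gains toward gray
```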

 

Dave

Enthusiast, Feb 06, 2025

There is actually an algorithm that can calculate the differences in colors in an image. However, again, we seem to be talking past each other. The bottom line is in the title, "Standardization": if you call something color temp, then it should be based on the Kelvin scale. When I buy light bulbs, I buy based on the color of the light they emit; the unit of measure is Kelvin. In my early days of photography, understanding and applying the scale to your work was important. However, today we have the flexibility to warm or cool the image in post-processing. I'm not sure where you guys went off course. In any case, I will leave it alone.

Community Expert, Feb 06, 2025

You still seem to assume that something is actually measuring the light in the scene. That assumption is wrong; there is nothing in the camera and nothing in the software that does this.

 

The Kelvin adjustment in the raw processor measures only one thing: how wrong the image looks, and how much it needs to be compensated to look right. This usually - but not always! - correlates to the conditions the image was shot in.

 

If you have a processed RGB image there is nothing that looks wrong. It has already been corrected.

 

A light source can be measured, and a monitor can be measured. Therefore they will have an actual absolute Kelvin temperature. But there is no way to "measure" an image. How would you do that?
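For the record, this is what measuring a light source involves: a colorimeter gives a chromaticity (x, y), and a correlated colour temperature follows from a formula such as McCamy's approximation, sketched below. There is no analogous measurement for an arbitrary RGB image.

```python
def mccamy_cct(x: float, y: float) -> float:
    """Approximate CCT in Kelvin from a measured CIE 1931 xy chromaticity
    (McCamy, 1992). Valid for light sources and monitor white points."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(mccamy_cct(0.3127, 0.3290)))  # D65 white point -> about 6500 K
```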

Community Expert, Feb 07, 2025

...and actually, this whole discussion can be neatly fitted into a nutshell with an example.

 

What is the Kelvin value of this image? What is the position of the slider on the Kelvin scale?

[Attached image: kelvin_3.jpg]

Enthusiast, Feb 07, 2025

You should be able to see what LrC shows as the color temp. If this value is incorrect, then LrC is not measuring the color temp. An Adobe person may have to explain the color temp algorithm.

However, you seem to be missing the point. 

Community Expert, Feb 07, 2025

As explained earlier in this thread, Lightroom will do exactly the same as Photoshop/ACR and give you a -100..0..+100 scale on a preprocessed RGB image. It only gives a Kelvin value on raw images. You seem to have a misconception that Lightroom works differently from ACR in this respect; it does not.

 

Dave

Community Expert, Feb 07, 2025

And also - still - that the Kelvin value is somehow measured and absolute. It is not measured. It's just a compensation towards visually neutral, set in increments that correspond to the Kelvin scale.

 

Have you noticed that the Kelvin value you get in ACR/Lr is not the same as the camera reports? If you tried a third raw processor, you'd get yet a third value. And so on. They will all be different! That's because it doesn't reflect actual Kelvin, it's just a compensation, and since they are all different processing engines, they will compensate by different amounts.

 

Now, I assume it would be possible to build a colorimeter sensor into a camera. But it's not done because it's a lot of extra cost for no benefit. The current system works perfectly well.

 

Also, in Photoshop, consistency and predictability are king. Again, it would be possible to report the ACR/Lr number to Photoshop. But what if it's not a photograph at all, but a digital painting? What if it's a composite from different sources? That happens all the time. And what if the image doesn't come from Lr/ACR, but somewhere else? How about those who use, oh, say, Capture One? Should they then allow the scale to switch?

 

There are a hundred reasons why this can't work in practice, and why the -100 to +100 scale is the only practical option with RGB files.

Enthusiast, Feb 08, 2025

This sounds like an argument to take up with Adobe engineers. If you call it one name in one of the apps, then maintain consistency. Without looking, I think ACR refers to it as color temp; if that's the case, then the unit of measure should be Kelvin. Any idea what the +/-100 value is measuring? There should be standardization across the photography world, especially in Adobe products, where several different formats are handled. This is in no way to say Adobe is wrong.

Enthusiast, Feb 08, 2025

@westdr1dw  Did you import the exact same RAW file into ACR?
Or are you referring to opening a JPEG or similar file in ACR and adjusting +100, -100?

If you imported a RAW file, then the values you mentioned—5600K, 3200K, 6300K, etc.—should appear exactly as expected.

However, if it’s not a RAW file, then of course, those values won’t apply in the same way. (This is likely the same for LrC as well—try importing a JPEG file into LrC and see for yourself.)

It's not a raw file.

It looks like other users have already explained this to you, but you seem to have a major misunderstanding about it.


 

Enthusiast, Feb 17, 2025

I have no misunderstanding. The slider shows temperature. The reference point is a deviation, +/-, from the daylight temperature of 5600 Kelvin. You really need to learn the values you are looking at.

Community Expert, Feb 17, 2025

Adobe could have programmed the reported slider values to display the same for rendered images as for raw sensor images, but they didn't; they used a different scale, and they have left it that way. That would appear to indicate there is a reason for the departure between raw sensor data and rendered data.

Community Expert, Feb 17, 2025

Yes, which is what we've been saying all along. It's not that it isn't technically possible. It's that it doesn't make any sense.

Community Expert, Feb 17, 2025

It makes sense to some: the data is rendered, not raw.

 

Edit: I understand that you're looking for consistency, regardless of the source of the data. I'm personally happy with things being contextual based on the data source.

Enthusiast, Feb 17, 2025

You say that you didn't misunderstand, but the content of your thread seems to have entirely stemmed from misunderstandings. It also appears that many other contributors have kindly and thoroughly explained it to you...

 

Enthusiast, Feb 18, 2025

No, as I have stated all along: "standardization" of terms between the apps. As an example, almost everyone who has used a computer in the past 30+ years has encountered an Adobe Acrobat file. In the early years of the Acrobat file, the user became very familiar with the icon and what to expect when opening the file. However, in the past few years, others have developed apps to create PDF files, and this has caused some unexpected results. PDF files are like a box of chocolates: you don't know what you are going to get.
