Legend
February 1, 2025
Open for Voting

Standardization of Color Temp Adjustments in Adobe CC

  • 25 Replies
  • 1884 Views

I recently discovered what appears to be an oversight, or a dumbing down, within the Adobe Creative Cloud apps.

The vast majority of my post-processing is done in LrC. I wasn't even aware of the differences between LrC, PS, and ACR. I really like that LrC uses the Kelvin scale, which lets the user better understand an image's color temperature by comparing it to real-world Kelvin values.

I'm not sure what the scales represent in either PS or ACR. I would like to see these apps mirror LrC, so that regardless of the app you are working in, setting the temperature to ~5600 K means the same thing.


25 Replies

Legend
February 18, 2025

No, as I have stated all along, I want "standardization" of terms between the apps. As an example, almost everyone who has used a computer in the past 30+ years has encountered an Adobe Acrobat file. In the early years of the Acrobat file, users became very familiar with the icon and knew what to expect when opening the file. In the past few years, however, others have developed apps to create PDF files, and this has caused some unexpected results. PDF files are now like a box of chocolates: you don't know what you are going to get.

Jqqerry
Inspiring
February 18, 2025

You say that you didn't misunderstand, but the content of your thread seems to have stemmed entirely from misunderstandings. It also appears that many other contributors have kindly and thoroughly explained it to you...

Stephen Marsh
Community Expert
February 17, 2025

It makes sense, to some: the data is rendered, not raw.

 

Edit: I understand that you're looking for consistency regardless of the source of the data. I'm personally happy with things being contextual, based on the data source.

D Fosse
Community Expert
February 17, 2025

Yes, which is what we've been saying all along. It's not that it isn't technically possible. It's that it doesn't make any sense.

Stephen Marsh
Community Expert
February 17, 2025

Adobe could have programmed the slider to report the same values for rendered images as for raw sensor images, but they didn't; they used a different scale. They could have kept a single scale, but they chose not to, which would appear to indicate a reason for this departure between raw sensor data and rendered data.

Legend
February 17, 2025

I have no misunderstanding. The slider shows temperature. The reference point is a deviation from the daylight temperature of 5600 K, plus or minus. You really need to learn what values you are looking at.

Jqqerry
Inspiring
February 9, 2025

@westdr1dw  Did you import the exact same RAW file into ACR?
Or are you referring to opening a JPEG or similar file in ACR and adjusting +100, -100?

If you imported a RAW file, then the values you mentioned—5600K, 3200K, 6300K, etc.—should appear exactly as expected.

However, if it’s not a RAW file, then of course, those values won’t apply in the same way. (This is likely the same for LrC as well—try importing a JPEG file into LrC and see for yourself.)

It's not a raw file.

It looks like other users have already explained this to you, but you seem to have a major misunderstanding about it.


Legend
February 9, 2025

This sounds like an argument to take up with the Adobe engineers. If you call it one name in one of the apps, then maintain that consistency. Without checking, I think ACR refers to it as color temp. If that is the case, then the unit of measure should be Kelvin. Any idea what the +/- 100 value is measuring? There should be standardization across the photography world, especially in Adobe products, where they handle several different formats. This is in no way to say Adobe is wrong.

D Fosse
Community Expert
Community Expert
February 7, 2025

And also, still, the assumption that the Kelvin value is somehow measured and absolute. It is not measured. It's just a compensation toward visually neutral, set in increments that correspond to the Kelvin scale.

 

Have you noticed that the Kelvin value you get in ACR/Lr is not the same as the camera reports? If you tried a third raw processor, you'd get yet a third value. And so on. They will all be different! That's because it doesn't reflect actual Kelvin, it's just a compensation, and since they are all different processing engines, they will compensate by different amounts.
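The "compensation, not measurement" point can be illustrated with a simple sketch. The gray-world heuristic below is a well-known white-balance estimate, used here purely for illustration; it is not Adobe's algorithm, and the function names are hypothetical. The key observation is that each engine derives its own per-channel gains to neutralize a cast, so different engines naturally report different numbers for the same file.

```python
import numpy as np

def gray_world_gains(rgb):
    """Estimate per-channel white-balance gains by assuming the scene
    averages to neutral gray (the 'gray world' heuristic).
    rgb: float array of shape (H, W, 3), values in [0, 1]."""
    means = rgb.reshape(-1, 3).mean(axis=0)   # average R, G, B
    # Normalize so green is the reference channel (gain 1.0),
    # as many raw converters conventionally do.
    return means[1] / means

def apply_gains(rgb, gains):
    """Apply the compensation and clip back to the displayable range."""
    return np.clip(rgb * gains, 0.0, 1.0)

# A uniform warm-cast image: red lifted, blue suppressed.
img = np.full((4, 4, 3), [0.6, 0.5, 0.4])
gains = gray_world_gains(img)
balanced = apply_gains(img, gains)
# After compensation, all three channel means are equal (neutral gray).
```

A different estimator (or a different reference channel) would produce different gains, and hence a different reported "temperature," for the very same pixels.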

 

Now, I assume it would be possible to build a colorimeter sensor into a camera. But it's not done because it's a lot of extra cost for no benefit. The current system works perfectly well.

 

Also, in Photoshop, consistency and predictability are king. Again, it would be possible to report the ACR/Lr number to Photoshop. But what if it's not a photograph at all, but a digital painting? What if it's a composite from different sources? That happens all the time. And what if the image doesn't come from Lr/ACR, but somewhere else? How about those who use, say, Capture One? Should they then allow the scale to switch?

 

There are a hundred reasons this can't work in practice, and why the -100 to +100 scale is the only practical option with RGB files.

davescm
Community Expert
February 7, 2025

As explained earlier in this thread, Lightroom will do exactly the same as Photoshop/ACR and give you a -100, 0, +100 scale on a preprocessed RGB image. It only gives a Kelvin value on raw images. You seem to have a misconception that Lightroom works differently from ACR in this respect; it does not.
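Why a relative scale on rendered RGB? A rendered file carries no illuminant metadata to anchor a Kelvin value, so the slider can only bias the channels warmer or cooler relative to where they already are. A minimal illustrative sketch (the function name, the linear shift, and the 0.1 maximum offset are all assumptions for the example, not Adobe's actual tone curve):

```python
import numpy as np

def temp_slider(rgb, amount):
    """Relative warm/cool adjustment for an already-rendered RGB image.
    amount: -100 (cooler, toward blue) .. +100 (warmer, toward red).
    An illustrative linear channel shift, not Adobe's implementation."""
    t = amount / 100.0 * 0.1                  # max +/-0.1 shift, arbitrary
    shifted = rgb + np.array([t, 0.0, -t])    # raise red, lower blue (or vice versa)
    return np.clip(shifted, 0.0, 1.0)

neutral = np.full((2, 2, 3), 0.5)
warmer = temp_slider(neutral, 100)   # red channel up, blue channel down
cooler = temp_slider(neutral, -100)  # the mirror adjustment
```

Notice that 0 simply means "no change from the rendered state," which is exactly what a relative scale expresses; labeling that same midpoint "5600 K" would be meaningless without knowing the original illuminant.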

 

Dave