I found a good explanation, with visual comparisons, in this article:
iMac With Retina 5K Display Review: Do Those Extra Pixels Really Matter?
The visual examples include Lightroom.
It's a confusing topic. Both the "maximum" resolution and the "default" resolution use all 5120 x 2880 pixels. The difference is that at the "default" resolution everything is laid out as if the screen were 2560 x 1440, but drawn with double the detail. This is shown in the macro example further down in the linked article.
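To put rough numbers on it, here's a tiny Swift sketch (plain illustrative arithmetic, with the 2x Retina scale factor assumed) of how the "looks like" size relates to the hardware pixels at the default setting:

```swift
// Illustrative arithmetic only: the "default" setting lays the desktop out in
// 2560 x 1440 points, but each point is backed by a 2 x 2 block of hardware
// pixels, so all 5120 x 2880 panel pixels are still in use.
let looksLike = (width: 2560, height: 1440)   // the "looks like" size in points
let scale = 2                                 // Retina backing scale factor
let hardwarePixels = (width: looksLike.width * scale,
                      height: looksLike.height * scale)
print(hardwarePixels)                         // (width: 5120, height: 2880)
```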
You're actually always running the Retina display at its "maximum resolution", in the sense that every individual hardware pixel is addressed at every setting. The only difference is the pixel density/clarity of the content and the UI. As explained in the article, the UI can be scaled independently of the content: the UI is drawn at the "scaled" size so that you can read it, while the content is rendered at the full hardware pixel resolution. That's why, in the third comparison in the linked article, text appears to be the same size on both displays but is obviously 2x sharper on the Retina display at its "default" resolution.
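If you want to see those two spaces side by side, here's a minimal AppKit sketch; it assumes you run it on a Mac with the Retina panel as the main screen, and at the "default" setting it would report 2560 x 1440 points against a 5120 x 2880 backing:

```swift
import AppKit

// Minimal sketch: NSScreen reports its frame in points (the "looks like" size),
// while convertRectToBacking(_:) gives the same rectangle in backing pixels,
// which is the full 5120 x 2880 grid at the "default" setting.
if let screen = NSScreen.main {
    let points = screen.frame.size
    let pixels = screen.convertRectToBacking(screen.frame).size
    let scale  = screen.backingScaleFactor          // 2.0 on a Retina panel
    print("UI space: \(points.width) x \(points.height) points")
    print("Backing:  \(pixels.width) x \(pixels.height) pixels (scale \(scale))")
}
```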
bcdavis1975 wrote: I would think, in an ideal universe, you'd want to be able to run your display at its maximum resolution but "scale" the OS and application interfaces to the desired relative amount of real estate/readability. If I understand you correctly... that's not what's happening when you adjust Apple's "scaling" options.
Yes, that's exactly what Apple is after, and it is in fact what's happening; the examples in the article show it. Content is always rendered at the full hardware resolution so that no hardware pixels are wasted, while the UI is scaled for readability and is visibly sharper when shown at the same physical size as on a low-resolution display. The misunderstanding seems to come from an assumption that the system's "scaled" resolutions aren't showing detail at the level of the hardware pixels, but they are.
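For the non-default "scaled" settings the same idea holds, at least as I understand macOS's behaviour: the system renders into a buffer at 2x the chosen "looks like" size and then resamples that onto the panel's fixed native grid, so every hardware pixel is still being driven. A rough sketch with purely illustrative numbers:

```swift
// Rough sketch, numbers are illustrative rather than queried from the system:
// a "More Space" style setting renders at 2x its "looks like" size and is then
// resampled onto the panel's fixed 5120 x 2880 hardware grid.
let nativePixels   = (width: 5120.0, height: 2880.0)   // the panel never changes
let looksLike      = (width: 2880.0, height: 1620.0)   // an example scaled setting
let renderedBuffer = (width: looksLike.width * 2,
                      height: looksLike.height * 2)    // 5760 x 3240 virtual canvas
let resampleRatio  = nativePixels.width / renderedBuffer.width
print(renderedBuffer, resampleRatio)                   // (width: 5760.0, height: 3240.0) 0.888...
```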