basics of color correction (tv part)
I found this really good and concise explanation of the basics of color monitoring. It was written by some guy in Los Angeles, who won't mind me sharing this.
I just copy-pasted it into Notepad and won't edit the thing... you'll get the idea the way it is... It's a pretty cool way of explaining things and might help eliminate some confusion, like my own, for example.
=========begin paste=====
The Mini Monitor isn't just a Thunderbolt-to-HDMI adapter, like the adapters you can buy for $7 off Amazon.
It converts a computer data signal to a TV signal. These are fundamentally different types of signals. This is why you can't just plug a TV directly into the HDMI port on your computer. It would show up as a second computer monitor, extending your desktop, displaying a computer signal.
TV video signals can be progressive or interlaced, run at different frame rates, and adhere to standard formats like ATSC, PAL, etc., encoded in Y'CbCr. Computer video signals are different beasts entirely, and even computer monitors that are color-accurate and 10-bit for graphics work are still not equipped to handle interlaced signals or the various frame rates you find in TV signals.
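To make "encoded in Y'CbCr" concrete: broadcast video carries a luma channel (Y') plus two color-difference channels (Cb, Cr) rather than RGB, and it uses a restricted "legal" code-value range rather than full range. A minimal sketch in Python of the Rec. 709 conversion from full-range R'G'B' to 8-bit video-range Y'CbCr (the function name is just for illustration):

```python
def rgb_to_ycbcr709(r, g, b):
    """Convert non-linear full-range R'G'B' (each 0.0-1.0) to
    8-bit video-range Y'CbCr per Rec. 709. Illustrative sketch only."""
    # Rec. 709 luma coefficients
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Color-difference signals, normalized to the range -0.5..+0.5
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    # Scale to 8-bit video ("legal") levels: Y' 16-235, Cb/Cr centered on 128
    return (round(16 + 219 * y),
            round(128 + 224 * cb),
            round(128 + 224 * cr))

# Reference white lands at Y'=235 with neutral chroma, black at Y'=16 --
# not 255/0 as it would be in a full-range computer signal.
print(rgb_to_ycbcr709(1.0, 1.0, 1.0))  # (235, 128, 128)
print(rgb_to_ycbcr709(0.0, 0.0, 0.0))  # (16, 128, 128)
```

Note how white and black don't touch 255 and 0: that headroom and footroom is one of the visible differences between a TV signal and a computer signal.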
The Mini Monitor's purpose is to allow you to view a true TV signal.
Thus BMD would never need to build second-screen, full-screen video playback (as Premiere or FCP offer) into Resolve, at least for color correction, because that could never be a true TV signal. It might, however, be useful in Resolve's growing capacity as an NLE apart from color correction, to be used purely as a non-accurate preview for editorial purposes (which is what Premiere and FCP and other NLEs are doing).
--------------
The issue here is the type of signal. As the broadcast online editor and/or colorist, you are responsible for shaping and delivering the actual deliverable end product: an electrical signal. Not a picture, not an image, not a video. These are higher-order constructs that exist in our minds. The actual thing that we are delivering, whether by file or by tape, is a representation of an electrical signal. A very particular type of signal. You need to monitor the kind of signal you are going to deliver. So you have to monitor your images as a TV signal.
A computer display with an application window full-screened is a computer signal with information about all sorts of stuff. It is not a TV signal.
You need to be able to "see" your signal. This means to view the pictures encoded by that signal, and at the same time measure various other properties of that signal via a set of external scopes. You do this to ensure that your signal is in technical compliance with laws that govern TV signals in your region, and to help make color correction decisions.
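The "technical compliance" part can be made concrete: one thing a scope (or a QC tool) does is flag samples that fall outside the legal video range. A hedged sketch in Python, assuming 10-bit Y' code values where legal black is code 64 and legal white is code 940 (standard SMPTE 10-bit levels); the function name and toy data are my own:

```python
def find_illegal_luma(samples, black=64, white=940):
    """Flag 10-bit Y' code values outside the legal range.
    Returns (index, value) pairs for every out-of-range sample.
    Illustrative sketch: real QC also checks chroma, RGB gamut, etc."""
    return [(i, v) for i, v in enumerate(samples)
            if v < black or v > white]

# A toy run of luma samples: one super-black and one super-white
# excursion get flagged, the legal values pass.
samples = [64, 512, 940, 30, 1000]
print(find_illegal_luma(samples))  # [(3, 30), (4, 1000)]
```

This is essentially what you are eyeballing on a waveform monitor: excursions below black or above white that would put the delivered signal out of spec.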
You use a Mini Monitor to get a TV signal out of your grading app, then monitor that signal on a calibrated TV and a set of external scopes. The output is managed by the Mini Monitor and contains only your TV signal.
When you plug your monitor into the computer, you are seeing a signal generated by the graphics card, containing other information managed by the OS. We don't want to "see" or monitor that. That's not what we are delivering.
Think of it this way: the Mini Monitor turns your computer into a "TV station" of sorts. You use your calibrated TV to "watch" the TV signal "broadcast" by your computer-turned-TV station.
The same metaphor extends to grading movies. In this case, your UltraStudio or other monitoring device feeds a projector. Your computer then becomes a movie theater projector, and you watch the image in a grading theater on a silver screen.
It's all about the signal.
