If you're striving for absolutely precise colors, you probably won't get them with H.264.
For one thing, it's 8-bit color in virtually every common encode (H.264 does have 10-bit profiles, but support for them is spotty). There will be inevitable rounding differences between the colors in a 16- or 32-bit comp and an 8-bit H.264 file.
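To make the 8-bit point concrete, here's a rough sketch of the round trip from a 16-bit component value down to 8 bits and back. The function names and the sample value are mine, just for illustration; this isn't code from any encoder:

```python
def to_8bit(v16):
    """Scale a 16-bit component value (0-65535) down to 8-bit (0-255)."""
    return round(v16 * 255 / 65535)

def to_16bit(v8):
    """Scale an 8-bit value (0-255) back up to 16-bit (0-65535)."""
    return round(v8 * 65535 / 255)

v = 30000                  # some 16-bit red level from a comp
v8 = to_8bit(v)            # quantized to 8 bits: 117
back = to_16bit(v8)        # comes back as 30069, not 30000
```

Most neighboring 16-bit values collapse onto the same 8-bit code, so the original value simply can't be recovered. Your eye won't see a difference of 69 parts in 65535, but a computer comparing pixel values will.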
For another thing, H.264 almost always uses 4:2:0 chroma subsampling (reduced color resolution). You can think of a video picture as being in two layers: a black & white layer and a color layer. The B&W layer is full resolution -- every pixel on screen is represented in B&W. But you can cheat on the color layer. In rare video codecs you'll have 4:4:4 sampling, meaning for each and every B&W pixel there's a corresponding color pixel. It's far more common for codecs to use 4:2:2 sampling -- for every two B&W pixels, there's one color pixel. Amazingly, it looks really good, and you can't tell... and there's a boatload of codecs that are 4:2:2.
Then there's 4:2:0. For every FOUR B&W pixels (a 2x2 block), there's one color pixel. As you can guess, there's a lot of averaging of color values going on. Oh, it's good enough to fool the human eye, but it won't fool a computer. And you won't notice it unless there are hard, well-defined edges with lots of color contrast -- but it's a virtual lock you'll notice it then.
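Here's a toy sketch of that averaging, assuming the simplest possible scheme (mean of each 2x2 block). Real encoders use smarter filters and chroma siting, so treat this as an illustration of the idea, not how H.264 actually does it:

```python
# Toy 4:2:0-style subsampling: keep B&W (luma) at full resolution,
# but replace each 2x2 block of color (chroma) values with their average.
def subsample_420(chroma):
    """chroma: 2D list with even width/height. Returns one value per 2x2 block."""
    h, w = len(chroma), len(chroma[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block_sum = (chroma[y][x] + chroma[y][x + 1] +
                         chroma[y + 1][x] + chroma[y + 1][x + 1])
            row.append(block_sum / 4)
        out.append(row)
    return out

# A hard color edge that cuts through a 2x2 block: the left block mixes
# two different chroma values (100 and 200) into a mushy 150.
edge = [[100, 200, 200, 200],
        [100, 200, 200, 200]]
print(subsample_420(edge))  # [[150.0, 200.0]]
```

That 150 is a color that exists nowhere in the source -- exactly the kind of smearing you see along hard, high-contrast color edges, and exactly why a pixel-for-pixel comparison against the original will fail.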
So, long story short -- H.264 is the wrong codec for precise color reproduction.