Hoping someone can explain the cause here, because it's driving me nuts. I have some static B&W PNG images in a timeline. I'm applying a normal-blended opacity mask as a "crop", since it lets me feather the edges as I overlay the image on a background. Pretty straightforward stuff. But I'm noticing that applying that mask (even without any special blend mode, and without any background element for it to blend with) seems to inherently decrease the embedded PNG's quality.
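For context, a normal-blended opacity mask with feathering is just per-pixel alpha compositing. This is a minimal NumPy sketch of the math (not Premiere's actual pipeline, and the function name is my own), which is useful for seeing that the blend by itself shouldn't alter fully opaque pixels:

```python
import numpy as np

def feathered_composite(fg, bg, mask):
    """Normal-blend fg over bg through a soft (feathered) mask.

    fg, bg: float arrays in [0, 1], shape (H, W, 3)
    mask:   float array in [0, 1], shape (H, W); 1.0 = fully opaque foreground
    """
    a = mask[..., None]              # broadcast alpha over the color channels
    return fg * a + bg * (1.0 - a)   # standard "normal" blend equation

# Toy example: white foreground over black background, partially feathered mask
fg = np.ones((2, 2, 3))
bg = np.zeros((2, 2, 3))
mask = np.array([[1.0, 0.5],
                 [0.5, 0.0]])
out = feathered_composite(fg, bg, mask)
```

Inside the mask (alpha = 1.0) the output should be bit-identical to the source, which is why any visible change there points at the render path rather than the blend math itself.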
Here's the settings I'm using:
Here's the "unmasked" still:
And here's the masked version of the still:
If you open the two images and flip back and forth between them, you'll see that the masked version applies some kind of adjustment that the mask settings don't account for. Since it's a black-and-white comic image with text that's already somewhat upscaled, this is a real problem for me.
So, based on a suggestion elsewhere, I tried changing the project's render engine. I had been using Mercury Playback Engine Software Only. Switching to the CUDA engine seems to improve quality, and the image now renders the same with and without the mask. But that raises the question of why the two render engines behave differently here at all.
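One plausible explanation (an assumption on my part, not something confirmed in this thread) is intermediate precision: the software path may process the blend at 8 bits per channel while the GPU path works in 32-bit float, so rounding at each step can shift pixel values slightly. A toy sketch of how quantizing intermediates degrades an otherwise exact blend:

```python
import numpy as np

def blend_float(fg, bg, alpha):
    # Full-precision blend: exact for these inputs
    return fg * alpha + bg * (1 - alpha)

def blend_8bit(fg, bg, alpha):
    # Simulate an 8-bit pipeline: snap to 0..255 levels after each step
    q = lambda x: np.round(x * 255) / 255
    return q(q(fg * alpha) + q(bg * (1 - alpha)))

fg, bg, alpha = 0.7, 0.2, 0.5
exact = blend_float(fg, bg, alpha)      # 0.45 exactly
rounded = blend_8bit(fg, bg, alpha)     # off by a small quantization error
```

The per-pixel error is tiny, but on hard-edged upscaled line art with text it can read as visible softening or contouring, which would match what you're describing.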
CUDA render, mask off vs mask on:
Can you share your system specs?
How are you checking quality: after export, or after timeline rendering? (And in both Software Only and GPU-accelerated modes?)
If after export, what format are you exporting to? Can you share both your sequence and export settings?
Masks usually require GPU acceleration.