Hey everyone, apologies ahead of time if this has already been touched on before. However, I'm quite confused, as only one of the shots in my video is facing this issue.
Using AE and ME 2023. Specs:
Windows 11
AMD Ryzen 9 5900x
RTX 3070 Ti
128 GB DDR4 RAM
Basically, I'm at the end of my pipeline for this animation I'm making. After rendering, I comped in Nuke and exported every shot as a PNG sequence (no alpha), then plugged it through Topaz Video AI to upscale my animation from 1080p to 4K, still with each shot as a separate PNG sequence. So now I have a collection of 4K PNG sequences that I just want to compile together for my final animation file.
However, for whatever reason, the volumetric lighting in the last shot is quite horrific. Now, I understand this can be an issue when compositing the raw volumetrics onto other render layers, with color management and color spaces and whatnot, but that shouldn't be much of an issue here. I'm aware I'm losing some color information, but that really isn't my concern. I've been encoding the video using Media Encoder with mostly default settings (H.264 format, preset Match Source - High Bitrate, 4K 30fps).
Normally I would do more testing before reaching out, but as I said, I have other scenes with as many or more volumetrics where this isn't an issue. So at this point I'm left scratching my head. Maybe it's something to do with the individual comp I have the image sequence in? That's the only variable I can imagine, although I created the comp from footage, so it should have the same settings as all the rest.
Any tips on what I should try out would be greatly appreciated.
The photo included shows an original frame from the sequence (left) and a screenshot of the same frame after it is compressed (right)
You upscaled your footage, which in itself could have introduced banding. It may not be visible at first and only show up when layering. And your shot looks like there is no real black in there, so it all becomes a murky mess of dark grey producing a haze. Combine that with the limitations of MP4 encoding and the usual 8-bit banding, and I think you have your explanation. Case in point: your volumetrics span more than the available 1024 pixels / 256 levels that would make for a smooth gradient, so there will inevitably be some banding. The usual tricks apply, like adding noise to improve dithering, blurs to blend colors, and all that. Anything more will require seeing the individual passes/layers and the settings with which they are comped together, i.e. screenshots of the timeline and soloed layers.
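To make the dithering trick concrete, here's a minimal numpy sketch (illustrative only, not tied to AE, Nuke, or Topaz) showing how quantizing a subtle gradient straight to 8 bits produces flat bands, and how adding a little noise before rounding scatters the band edges:

```python
import numpy as np

# A subtle dark gradient: a 50-value range spread over 1000 pixels.
gradient = np.linspace(150 / 255, 200 / 255, 1000)   # float values in 0..1

# Straight 8-bit quantization: long flat runs of identical values = visible bands.
banded = np.round(gradient * 255).astype(np.uint8)
steps = np.flatnonzero(np.diff(banded))              # pixel positions where the value jumps
print("distinct levels:", len(np.unique(banded)))    # 51
print("average band width:", np.diff(steps).mean())  # ~20 pixels per band

# Dither: add noise of roughly half a quantization step before rounding.
rng = np.random.default_rng(0)
dithered = np.round(gradient * 255 + rng.uniform(-0.5, 0.5, gradient.shape)).astype(np.uint8)
# Same 8-bit palette, but the step edges are now scattered pixel by pixel,
# so the eye averages them into a smooth gradient instead of hard band lines.
```

The same idea is what a film-grain or noise node does in a comp: it doesn't add levels, it just breaks up the boundaries between them.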
Mylenium
Thank you for your input, Mylenium. Again, I don't believe upscaling the footage caused banding, as I am not layering after upscaling. I will try upscaling at a higher bitrate and see if that helps at all, thank you. I'm not exactly sure what you mean by "there is no real black in there". I suppose I would understand if the banding occurred only over the terrain, which is not black, but the stars in the background are as close as I can get to black; the only thing making the sky not black is the volumetrics. Can you explain a little more? I don't understand what difference having pure black as the background would make. Maybe it would improve quality some, but surely I'm not expected to only use volumetrics in a purely black scene.
I totally understand there will be banding; my problem is that results vary when I encode the video. Let me dive further into that. I've included another screenshot. The one on the far left is a screenshot from the original mp4 I was looking at, but a different shot where this banding is not an issue. The middle screenshot is the same shot we've been looking at, but from a different mp4 I rendered out of AE (same encoder settings, nothing different for this shot; I've been doing updates to other shots, which is why I'm at a different mp4 now and can't use this one). The screenshot all the way to the right is the original shot and mp4 we were looking at in the beginning, for comparison. To recap, my problem is the varied results when using the same sequence and the same encoder settings. Maybe it is more of a bug or error.
Regardless, I will try upscaling again at a higher bitrate, and if that offers no help I will go back to Nuke with this shot and try adding some blurs and grain to the volumetrics to see if that helps. I was hoping not to go back to Nuke, as this shot takes over an hour to render out of there. Again, I don't see any of this being necessary, as I have already produced the desired result; I just want to be able to produce that result consistently.
I did some messing around and now I'm slightly more confused. I upscaled using 16-bit rather than 8-bit, and while that offered some help, it was still giving me noticeable banding. When encoding the video I had been using practically default settings:
Format: H.264, Preset: Match Source - High Bitrate (I've also been checking Maximum Depth and Maximum Render Quality). I tried increasing the target bitrate, using VBR, 2 pass with the minimum at 16 and the maximum at 32, and again it looked slightly better, but not by much. I then switched the preset to High Quality 2160p 4K and it seems to produce the desired results consistently; however, the file size increased tenfold.
I'm confused about what the difference between the "Match Source - High Bitrate" and "High Quality 2160p 4K" presets is in this case. My source was already 4K, so why is there such a difference between the two? I noticed the bitrate settings are much higher for the 4K default versus the Match Source default. Is this the only difference between the two, and is this what's causing my video file to now be a full gigabyte?
Anyone have any tips on how to use this preset with this quality without increasing my file size 10x? It seems this can fix my problem but with the cost of a very large video file which is not really what I was hoping to give to my client. This isn't entirely the fix I was looking for either considering my last reply which shows I was able to produce this result with a much smaller file size yet I can't reproduce that result consistently at that file size.
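For what it's worth, a tenfold jump in file size is consistent with plain bitrate arithmetic: file size is roughly bitrate × duration ÷ 8. A quick sketch with purely illustrative numbers (the real target bitrates are whatever your Export Settings show for each preset):

```python
def file_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Rough video file size: bitrate (megabits/s) * duration (s) / 8 bits per byte."""
    return bitrate_mbps * duration_s / 8.0

# Illustrative numbers only -- read the actual target bitrate off your preset.
print(file_size_mb(10, 60))   # 75.0  -> ~75 MB for one minute at 10 Mbps
print(file_size_mb(100, 60))  # 750.0 -> ~750 MB at 100 Mbps: a 10x bitrate gives a 10x file
```

So if the 4K preset's target bitrate is roughly ten times the Match Source preset's, a roughly ten-times-larger file is exactly what you'd expect; the knob to experiment with is a target bitrate somewhere between the two.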
The screenshots I included are from the original mp4 on the left for comparison, and from the new, much bigger (in terms of file size) mp4 on the right.
If you reduce banding in a 16- or 32-bit project but then render it to an 8-bit codec, the banding will return. You may be able to reduce the perceptible amount of banding by adding some noise or grain to the footage. With very subtle gradients like the ones you have in your screenshots, you can even run the risk of some banding using a 10-bit codec.
Here are the numbers. There are only 256 color values available per channel. If the gradient is mostly blue and goes from a blue value of 200 to a blue value of 150 over 1000 pixels, you will get a hard color value change every 20 pixels. When you start combining the changes in red and green over the same distance, the color changes can line up and give you a distinct color change about every 40 or 50 pixels. When you combine that with the color sampling that H.264 compression uses, averaging the color in 4×4-pixel blocks, you can get some fairly hard lines that show up as banding. Adding noise or grain to the shot hides those edges by breaking up the 4×4-pixel blocks of color with slightly different values.
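That band-width arithmetic, written out as a sketch (same numbers as above, plus the 10-bit case for comparison, since it came up earlier in the thread):

```python
pixels = 1000
levels_8bit = 200 - 150            # 50 usable 8-bit values across the gradient
print(pixels / levels_8bit)        # 20.0 -> a hard step every 20 pixels

levels_10bit = levels_8bit * 4     # the same range holds 4x the values in 10-bit
print(pixels / levels_10bit)       # 5.0  -> steps every 5 pixels, much harder to see
```

That's why a higher-bit-depth codec reduces (but, on very shallow gradients, doesn't always eliminate) visible banding.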
I hope this helps. I almost always add a little film grain in an adjustment layer on top of everything when I am compositing shots. It just works better.
Thank you for your reply Rick, I appreciate you taking the time to explain exactly why banding occurs in the first place. That helps give me a better understanding for future projects.
I'm not sure if this screenshot will mean much to you, but I feel we're probably at the point in this discussion where I should show how I'm layering the volumetric lighting in the first place. This is a screenshot of my comp in Nuke. As you can see, I'm separating each of the individual lights out so I can adjust each light individually. I applied a different amount of blur and animated noise to each light, with the noise generally getting smaller as it goes further back in space. The original intent, however, was to replicate a sort of atmospheric fog that the volumetric light is passing through, so the scale of the noise is quite big. Off screen of the screenshot, I'm then applying all of these layers over the rest of the scene using a screen operation. It probably goes without mentioning, but I'm using the same setup, with slightly different values, on all the other shots with volumetrics.
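For reference, the screen operation mentioned above combines two layers as 1 − (1 − a)(1 − b) per channel. A quick numpy sketch shows why a faint volumetric pass screened over a scene lifts every pixel off true black, which ties into the "no real black" point earlier in the thread:

```python
import numpy as np

def screen(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Screen blend for float images in 0..1: 1 - (1 - a) * (1 - b).
    Screen only ever brightens, so any non-zero haze lifts the blacks."""
    return 1.0 - (1.0 - a) * (1.0 - b)

base = np.array([0.0, 0.1, 0.5])     # background pixels: black, dark, mid
haze = np.array([0.04, 0.04, 0.04])  # faint volumetric pass
print(screen(base, haze))            # -> [0.04, 0.136, 0.52]: nothing stays at true black
```

So the screened haze turns the darkest part of the frame into exactly the kind of shallow near-black gradient that 8-bit H.264 struggles with.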
I'm just confused, as I would have figured that if my bitrate were too low, all of the shots containing the same volumetrics would appear with the same noticeable banding. I suppose maybe this is just a case where one shot needed more noise layering with smaller values. I'm still not quite sure. Thanks again for your time.