Keyframe distance effects don't make sense
I create a lot of timelapse videos, and one issue I run into often is blocky artifacts in clouds and skies. One approach I've tried is changing the keyframe distance, which, if I understand it correctly, sets the number of frames between reference (key) frames; the frames in between are compressed as changes relative to a reference frame rather than independently. In theory, the best video quality should occur at a keyframe distance of 1, since every frame would then be compressed independently and not as a delta from some other frame. That much makes sense to me.
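My mental model of the setting, as a quick sketch (this is the common textbook model, not anything specific to my editing software; `frame_types` is just an illustrative helper name):

```python
# Sketch of how a keyframe distance (GOP size) partitions frames:
# frame 0 of each group is an intra/key (I) frame compressed on its own,
# and the frames in between are predicted (P) frames coded as deltas.

def frame_types(num_frames, keyframe_distance):
    """Return 'I' or 'P' for each frame index under the model above."""
    return ['I' if i % keyframe_distance == 0 else 'P'
            for i in range(num_frames)]

print(frame_types(8, 4))  # ['I', 'P', 'P', 'P', 'I', 'P', 'P', 'P']
print(frame_types(4, 1))  # ['I', 'I', 'I', 'I'] -- every frame independent
```

So with distance 1, every frame should be an independently compressed keyframe, which is why I expected it to give the best quality.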
But my experiments show exactly the opposite: the larger the keyframe distance, the better the image quality. I've linked examples below. These are all H.264 two-pass VBR encodes targeting 140 to 160 Mbps. I used the "High Unrestricted" profile as well as "Main 5.2" and don't see much difference between them; hardware and software encodes also look essentially the same. The default distance of 72 gives the best overall image quality, 10 is close but slightly worse, and 1 is horrible. If keyframe distance works the way I described, that's exactly the opposite of what should happen.
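For reference, here is the back-of-the-envelope bit budget for these encodes. The frame rate is an assumption on my part (the exports don't all state one; 30 fps is used purely for illustration), and 150 Mbps is the midpoint of the 140 to 160 Mbps VBR range. The point is that two-pass VBR fixes the total bit budget, so the keyframe distance only changes how those bits are divided between keyframes and predicted frames:

```python
# Average per-frame bit budget at a fixed target bitrate.
# Two-pass VBR pins the overall bitrate, so this budget is the same
# regardless of keyframe distance; only its allocation changes.

def avg_bits_per_frame(bitrate_bps, fps):
    """Average bits available per frame at a given bitrate and frame rate."""
    return bitrate_bps / fps

# ~150 Mbps (midpoint of the 140-160 Mbps range), assumed 30 fps
budget = avg_bits_per_frame(150_000_000, 30)
print(f"{budget:,.0f} bits (~{budget / 8 / 1024:.0f} KiB) per frame")
```

Whatever the exact frame rate, every encode in my test gets the same total number of bits, so I'd expect the quality ordering to depend only on how efficiently those bits are used.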
So... what am I doing wrong?
Video for distance = 72: https://drive.google.com/file/d/1cmdECFXaDnde_kfiN8kbVwZ_EprWW1ZX/view?usp=sharing
Video for distance = 10: https://drive.google.com/file/d/1hs8LPHAjosRyrku75KqZtxC1C-6MNv41/view?usp=sharing
Video for distance = 1: https://drive.google.com/file/d/16-hURgVLN1kSZIdHUcKNruu8sFuLmrUL/view?usp=sharing
