If you want accurate perspective changes, the camera movement must be the same, or at least close to the same, on the greenscreen stage as in the 3D environment. Perspective is controlled by camera position; framing is controlled by focal length. If you are shooting miniatures, the camera movement must be scaled accordingly.
If the miniature has been built at a 10-to-1 ratio (1 foot on the miniature equals 10 feet in real life), and you move the camera 20 feet on the greenscreen stage, the camera on the miniature should move 2 feet to match the perspective change.
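The scaling rule above is just a division by the miniature ratio. A minimal sketch (the function name and units are my own, not from the post):

```python
def miniature_move(full_scale_move_ft, ratio):
    """Scale a full-size camera move down to a miniature set.

    ratio: how many real-world feet one miniature foot represents
           (10.0 for a 10-to-1 miniature).
    """
    return full_scale_move_ft / ratio

# 20 ft dolly move on the stage, 10-to-1 miniature:
print(miniature_move(20.0, 10.0))  # 2.0 ft on the miniature
```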
The farther the camera is from the subject, the smaller the perspective shift. If your actors close the distance to the camera by only about 10%, it is much easier to drop them into a virtual set with simple motion tracking. The first shot in your sample works fairly well because there is very little movement toward or away from the camera, but as soon as you try to change the angle to the camera, the shot falls apart.
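To see why a small distance change is forgiving: the apparent size of a subject is roughly inversely proportional to its distance from the camera, so closing 10% of the distance changes apparent size by only about 11%. A quick sketch (function name is illustrative, not from the post):

```python
def apparent_scale(start_dist, end_dist):
    # Apparent size on the sensor is inversely proportional to
    # subject distance, so the scale factor is the distance ratio.
    return start_dist / end_dist

# Actor closes 10% of a 20 ft distance (20 ft -> 18 ft):
print(round(apparent_scale(20.0, 18.0), 3))  # 1.111, about an 11% size change
```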
In the second shot, getting on the lift, the camera movement closely matches the virtual camera, so the perspective matches fairly well.
The easiest way to line up the shots would be to shoot the greenscreen footage with enough tracking markers or fixed objects in the scene to get a good camera track, then export the camera tracking data to the 3D app and use it to match the moves. Mocha Pro and Syntheyes have far better tools than the After Effects Camera Tracker for that kind of work.
Without fairly decent camera tracking data in the greenscreen footage, you are stuck trying to match the movement as well as you can by hand. Automating the process requires an accurate track of the camera itself, not of the actor or a moving prop.