Hi All,
I am somewhat new to photo and video and have a question that may be easy for someone to answer.
I typically work with videos shot on my iPhone, which I know are 1920x1080; however, I recently made a video entirely from a photo sequence. The photos are 3024x4032 (the standard iPhone photo size), so when I first rendered the project out it was really blurry and quite massive. I then resized the sequence to 1440x1080, which is the same ratio but far fewer pixels, and scaled the images down to about 50 percent, and that seemed to work.
I am really looking for some clarification as to why an image has so many more pixels than a video, and also, when making the sequence 1440x1080, is the overall quality going down now because it is so much less than the original photos, which are 3024x4032?
I appreciate the help!
Matt
matthewr58346897 wrote
I am really looking for some clarification as to why an image has so many more pixels than a video
Because a still camera only has to capture, process and store one frame, while a video camera has to do that 24, 30, 60 or more times in a single second. Historically, the available hardware was hard pressed to do so with high pixel-count images.
With advances in hardware, we now have video cameras that record (capture, process and store) images at 7680x4320 (8K), so this is becoming more and more moot.
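To put rough numbers on that, here is a quick back-of-the-envelope sketch (plain Python, using the frame sizes discussed in this thread):

photo_px = 3024 * 4032             # one iPhone still, about 12 megapixels
video_px_per_s = 1920 * 1080 * 30  # 1080p video at 30 frames per second

print(f"one still:       {photo_px / 1e6:.1f} megapixels, captured once")
print(f"1080p at 30 fps: {video_px_per_s / 1e6:.1f} megapixels every second")
# one still:       12.2 megapixels, captured once
# 1080p at 30 fps: 62.2 megapixels every second

Even at a "small" frame size, video has to move about five times the pixels of one of your stills every single second, which is why video frame sizes lagged behind still sizes for so long.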
when making the sequence 1440x1080, is the overall quality going down now because it is so much less than the original photos, which are 3024x4032?
There are certainly fewer pixels per image, but it should look quite good in playback. Image quality is subjective, and the device the video is played back on has an impact (an iPhone versus theatrical projection, for example). If it looks good to your eyes, then it is good.
The smaller the image is displayed, the sharper it will look.
MtD
Thank you for the answer, which leads to another question: what happens to all the extra pixels when the image is dropped into a standard video timeline? That is almost half the pixels, isn't it?
Not sure I understand your question. If you are asking how the smaller image is derived from the larger one: if you had a 1000x500px image and resized it to 500x250px, Premiere drops (does not use, ignores) every other pixel in the image to create the smaller one.
The reduction in pixel count to a smaller image (displayed appropriately) is much, much less consequential to the apparent sharpness of an image than going the other way, enlarging an image, where Premiere needs to add pixels and guess exactly what those added pixels should look like.
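As a minimal sketch of that "drop every other pixel" idea (real resampling in an editor uses smarter filtering than this, but the principle is the same):

# A 1000x500 stand-in image: each "pixel" is just its (x, y) coordinate.
image = [[(x, y) for x in range(1000)] for y in range(500)]

# Keep every other row and every other column -> 500x250.
half = [row[::2] for row in image[::2]]

print(len(half[0]), len(half))  # 500 250

Enlarging is the reverse problem: the resampler has to invent pixels that were never captured, and that guesswork is where the softness comes from.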
MtD
Wow, this is so helpful. My final two questions are these:
1. When the 3024x4032 images are dropped into the 1440x1080 timeline, why do they need to be scaled down if the aspect ratio is the same?
2. If I keep the sequence settings at 3024x4032, why is the video so large and so blurry despite the larger number of pixels?
matthewr58346897 wrote
1. When the 3024x4032 images are dropped into the 1440x1080 timeline, why do they need to be scaled down if the aspect ratio is the same?
Premiere assumes you want to place the larger source file into the timeline at its full size, so that you can take advantage of its extra pixels for repositioning or resizing the image. If you don't want that behavior, and you want Premiere to automatically scale the image to fit the frame size of the sequence (determined by the source file's largest dimension), go to Preferences > Media, set the Default Media Scaling to Set to Frame Size, and click OK. The next time you edit a source image into the sequence, it will be set to the frame size of the sequence automatically.
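The fit-inside math Premiere is effectively doing looks like this (a sketch, assuming a hypothetical portrait 1080x1440 sequence; the stills are the ones from this thread):

img_w, img_h = 3024, 4032   # the source stills
seq_w, seq_h = 1080, 1440   # example sequence frame, same 3:4 shape

# Pick the factor that makes the clip just fit inside the frame.
scale = min(seq_w / img_w, seq_h / img_h)
print(f"scale to {scale:.0%} -> {img_w * scale:.0f}x{img_h * scale:.0f}")
# scale to 36% -> 1080x1440

Without Set to Frame Size, Premiere leaves the clip at 100%, so only a 1080x1440 window in the middle of the 3024x4032 still is visible until you scale it down yourself.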
2. If I keep the sequence settings at 3024x4032, why is the video so large and so blurry despite the larger number of pixels?
I don't know. It should not happen. How are you viewing the exported file? If you import that exported file back into Premiere, does it still look incorrect in Premiere?
Post a screenshot of your export settings with the Summary revealed, like this example below:
MtD
Hello
Here are the screenshots. When I render the video out at these settings at the 3024x4032 frame size, the quality is terrible; however, when adjusted to 1440x1080 it becomes much clearer. I am not sure why that is, given that the second option removes a ton of pixels. I did find that scaling the bit rate up helps, but then the file is huge and I can't really share it to my phone or anything.
Thank you for the help.
With H264, the bit rate has to increase if the frame size increases.
Using a bit rate meant for HD at nearly double the resolution results in diminished picture quality.
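A useful sanity check is bits per pixel: hold the bit rate fixed and see how thin it gets spread as the frame grows (a quick sketch; 10 Mbps is just an HD-ish target, like the one mentioned later in this thread):

def bits_per_pixel(bitrate_bps, w, h, fps):
    return bitrate_bps / (w * h * fps)

print(f"1920x1080 at 30 fps: {bits_per_pixel(10e6, 1920, 1080, 30):.3f} bits/pixel")
print(f"3024x4032 at 30 fps: {bits_per_pixel(10e6, 3024, 4032, 30):.3f} bits/pixel")
# 1920x1080 at 30 fps: 0.161 bits/pixel
# 3024x4032 at 30 fps: 0.027 bits/pixel

Same 10 Mbps, but the big frame gets roughly one sixth the bits per pixel, so the encoder has to throw away far more detail. How low you can go before it looks bad depends on the content and the encoder.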
What exactly are some established frame sizes and what is the standard bit rate for HD?
There is no standard bit rate for HD; it all depends on what the end file is used for.
Standard frame sizes: 1920x1080 and 3840x2160.
Matthew:
I initially suspected it was a bit rate issue (or some other settings issue), but I tried re-creating what you're doing with some photos I took of plants over the weekend and came across something that might be causing your loss of picture quality when you export at 100% rather than 50%.
Go to your Sequence and double-check what the Video Previews setting is for Width and Height. If it's 1440x1920, then that is the likely cause.
Here's what I did:
- Take photos at 3024x4032.
- Import photos into Premiere Pro.
- Select all of the photos in the Project panel and drag and drop them onto the New Item icon to create a Sequence at settings as close to the still images as possible.
- Choose Sequence > Sequence Settings to check the Custom settings.
- Under Sequence > Sequence Settings > Video Previews, click "Reset" to change 1440x1920 to 3024x4032.
- Add a fade in at the head, a fade out at the tail, and transitions, along with a push in on each image.
- Choose File > Export > Media..., setting the Format to H264 and the Preset to Match Source - High Bit rate.
The rendered movie looks great as an H264 mp4 with a target bit rate of 10 Mbps and a maximum of 12 Mbps at 3024x4032.
It seems that with this workflow Premiere Pro automatically sets the Video Previews to roughly half the width and height, which makes previews render faster, but it commits you to either changing that setting later or being sure not to use the rendered Video Previews on export.
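For scale, that preview frame holds well under a quarter of the pixels of the full frame, so an export built from those previews would have to upscale heavily, which would match the blur Matthew is seeing (quick arithmetic):

preview = 1440 * 1920
full    = 3024 * 4032
print(f"previews hold {preview / full:.0%} of the full frame's pixels")
# previews hold 23% of the full frame's pixels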
-Warren
Wow, this was extremely helpful. Thank you.
One technical consideration: you're effectively turning still images that have a still-image aspect ratio into video. Rather than maintain the still-image aspect ratio, I'd go with an established video aspect ratio at an established frame size. For example, at 1440x1080 you're using an established aspect ratio but not an established frame size (1440x1080 is actually the non-full-raster size for HDV video that's presented at 16x9).
So I'd go with 1920x1080 (16x9) or 1080x1080 (1x1). Of course, this means repositioning each still image to best fit the chosen aspect ratio, or at least scaling to fit the width while cropping the top and bottom. Your stills are close enough to 3840x2160 that you could opt for that instead of 1920x1080.
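If you do go the scale-to-width-and-crop route, the usable band of each still works out like this (a sketch using the sizes from this thread):

img_w, img_h = 3024, 4032   # portrait still
seq_w, seq_h = 1920, 1080   # 16x9 HD frame

# Fill the frame's width, then crop the excess height top and bottom.
crop_h = img_w * seq_h / seq_w
print(f"keep a {img_w}x{crop_h:.0f} band ({crop_h / img_h:.0%} of the height)")
# keep a 3024x1701 band (42% of the height)

That is a heavy crop, which is why repositioning each still to choose the band you keep is worth the effort.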

