
I'm currently developing a mobile app that will have a library of 2-5 minute videos (approx. 100 in total), and I'm going through the process of determining which versions of the videos to have ready to serve to different mobile devices. In my research, I have noticed that there is a lot of room to play with video settings such as dimensions and bitrate.

As a first test, I am attempting to find the minimum video size I can deliver to an iPhone XS (1125x2436) without any noticeable loss of quality. I started by scaling the video to 1125x2436 and creating versions at 5 different bitrates ranging from 500kbps to 4400kbps. I noticed that at ~1500kbps the video looks great and the size is cut by roughly a third, so that was a good start.
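
For concreteness, this is roughly how renditions like these can be generated (a minimal sketch assuming ffmpeg with libx264; the file name and bitrate ladder are placeholders, not necessarily my actual pipeline):

```python
# Hypothetical sketch, not an exact pipeline: generate test renditions at several
# bitrates with ffmpeg/libx264. Assumes ffmpeg is on PATH and the master file is
# "source.mp4" (placeholder name).
import subprocess

SOURCE = "source.mp4"                 # placeholder for the master file
TARGET_HEIGHT = 2436                  # iPhone XS screen height
BITRATES_KBPS = [500, 1500, 2500, 3500, 4400]

for kbps in BITRATES_KBPS:
    out = f"test_{TARGET_HEIGHT}p_{kbps}k.mp4"
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        # Scale to the target height; -2 preserves the aspect ratio and forces an
        # even width, which libx264 requires.
        "-vf", f"scale=-2:{TARGET_HEIGHT}",
        "-c:v", "libx264",
        "-b:v", f"{kbps}k",           # target average video bitrate
        "-maxrate", f"{kbps}k",       # keep peaks close to the target
        "-bufsize", f"{2 * kbps}k",
        "-c:a", "aac", "-b:a", "128k",
        out,
    ], check=True)
```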

Then, after doing some reading, I saw that in adaptive bitrate scenarios Apple recommends delivering video at both a lower bitrate AND a lower resolution. So in my next test I just cut both in half: scaled to 562x1218 with the bitrate at 750kbps, and that version also looked great on the iPhone. So 1125x2436 at 750kbps looks bad, but 562x1218 at 750kbps looks great on the same device. To some extent this makes sense to me, since you need fewer bits to fill a smaller frame, but what I'm not understanding is how the scaling plays a factor. Shouldn't it essentially pixelate, since the resolution is half the iPhone's dimensions? And at a higher level, is there a somewhat concrete way to figure out the optimal resolution/bitrate balance given the dimensions of a device? We want to support most modern smartphones (iPhone 6 and later, Samsung Galaxy, etc.), so we need to be prepared for a range of dimensions (aspect ratios 9:16 or 6:13).
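
For reference, here is the rough back-of-the-envelope comparison I've been making between the settings above (a hypothetical bits-per-pixel calculation assuming ~30fps; I realize it ignores content complexity):

```python
# Rough bits-per-pixel heuristic for the renditions described above.
# Assumes ~30fps; real codecs spend bits very unevenly, so this is only a sanity check.
def bits_per_pixel(width, height, bitrate_kbps, fps=30):
    return (bitrate_kbps * 1000) / (width * height * fps)

print(bits_per_pixel(1125, 2436, 750))   # ~0.009 bits/pixel -> starved, looks bad
print(bits_per_pixel(562, 1218, 750))    # ~0.037 bits/pixel -> 4x the bits per pixel
print(bits_per_pixel(1125, 2436, 1500))  # ~0.018 bits/pixel -> where it looked great
```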

wachutu

1 Answer


Q: Shouldn't it essentially pixelate because the resolution is 1/2 of the iPhone dimensions?

A: Absolutely not.

Q: Is there a somewhat concrete way to figure out this optimal resolution / bitrate balance given the dimensions of a device?

A: Absolutely not.

This is a major oversimplification, BUT video encoding has little to do with resolution or bitrate. It's all about information density.

Think of it this way: if I wanted to display 12 hours of blue frames on a TV, it would contain very little information (in fact, I just encoded 12 hours of video at infinite resolution in a few bytes in that sentence). But if I wanted every frame to be a little different from the previous frame, I would need to describe/encode the differences between frames.

The larger the differences per frame, the more bits are needed per second. Hence the "optimal" dimensions and bitrate are a moving target and change constantly throughout a video, depending on temporal and spatial information density.
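
You can see the same principle with a toy example (plain zlib rather than a real video codec, but the idea carries over): a stack of identical blue frames compresses to almost nothing, while frames that change everywhere do not.

```python
# Toy illustration of information density using zlib (not a real video codec,
# but the same principle applies).
import os
import zlib

WIDTH, HEIGHT, FRAMES = 320, 240, 60

# 60 identical "blue" frames: very little information to describe.
blue_frame = bytes([0, 0, 255]) * (WIDTH * HEIGHT)
static_video = blue_frame * FRAMES

# 60 frames where every pixel differs from the last: maximum information.
noisy_video = os.urandom(len(static_video))

print(len(static_video))                 # ~13.8 MB of raw pixels in both cases
print(len(zlib.compress(static_video)))  # a few kilobytes
print(len(zlib.compress(noisy_video)))   # roughly the raw size; nothing to exploit
```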

szatmary