I'm currently developing a mobile app that will include a library of 2-5 minute videos (approx. 100 in total), and I'm going through the process of determining which versions of the videos to have ready to serve to different mobile devices. In my research I've noticed that there is a lot of room to play with video settings such as dimensions and bitrate.
As a first test, I am attempting to find the minimum video size I can deliver to an iPhone XS (1125x2436) without any noticeable loss of quality. I started by scaling the video to 1125x2436 and creating versions at 5 different bitrates ranging from 500kbps to 4400kbps. I noticed that at ~1500kbps the video looks great and the file size is cut by roughly a third, so that was a good start.
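For concreteness, the test encodes are generated along these lines (a rough sketch assuming ffmpeg with libx264; the source filename, the intermediate bitrate steps, and the audio settings are placeholders rather than my exact command):

```python
import subprocess

SOURCE = "master.mp4"                           # placeholder for the original high-quality export
WIDTH, HEIGHT = 1125, 2436                      # iPhone XS native resolution
BITRATES_KBPS = [500, 1500, 2500, 3500, 4400]   # 5-step test ladder (middle steps are illustrative)

for kbps in BITRATES_KBPS:
    out = f"test_{WIDTH}x{HEIGHT}_{kbps}k.mp4"
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-vf", f"scale={WIDTH}:{HEIGHT}",       # scale to the target resolution
        "-c:v", "libx264",
        "-b:v", f"{kbps}k",                     # target average video bitrate
        "-maxrate", f"{kbps}k",
        "-bufsize", f"{2 * kbps}k",             # constrain bitrate peaks
        "-c:a", "aac", "-b:a", "128k",
        out,
    ], check=True)
    print(f"wrote {out}")
```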
Then, after doing some reading, I saw that in adaptive bitrate scenarios Apple recommends stepping down bitrate AND resolution together. So in my next test I simply cut both in half: scaled to 562x1218 with the bitrate at 750kbps, and the video also looked great on the iPhone. So 1125x2436 at 750kbps looks bad, but 562x1218 at 750kbps looks great on the same device.

To some extent this makes sense to me, since you need fewer bits to cover fewer pixels (see the rough bits-per-pixel comparison at the end of this post), but what I'm not understanding is how the scaling plays a factor. Shouldn't the half-resolution version essentially pixelate, because its resolution is half the iPhone's dimensions?

At a higher level, is there a somewhat concrete way to figure out this optimal resolution/bitrate balance given the dimensions of a device? We want to support most modern smartphones (iPhone 6 and later, Samsung Galaxy, etc.), so we need to be prepared for a range of dimensions (aspect ratios of 9:16 or 6:13).
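For reference, here is the back-of-envelope comparison behind my "fewer bits for a smaller frame" intuition (a rough sketch; the 30 fps figure is an assumption and may not match the actual footage):

```python
def bits_per_pixel(width: int, height: int, bitrate_kbps: int, fps: float = 30.0) -> float:
    """Average number of bits the encoder can spend per pixel per frame."""
    return (bitrate_kbps * 1000) / (width * height * fps)

# Same 750 kbps budget, two different frame sizes
print(bits_per_pixel(1125, 2436, 750))   # ~0.009 bits/pixel/frame -- looks bad
print(bits_per_pixel(562, 1218, 750))    # ~0.037 bits/pixel/frame -- looks great (4x more)
print(bits_per_pixel(1125, 2436, 1500))  # ~0.018 bits/pixel/frame -- the full-res setting that looked good
```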