I'm building an iOS app that streams videos with a maximum length of 15 seconds. I've read good things about HLS, so I've been transcoding videos into 5-second segments. The appeal is adaptive bitrate: if the first segment takes too long to load, the player can fall back to a lower-quality rendition for the remaining 10 seconds.
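For a 15-second video cut into 5-second segments, the media playlist would end up with just three entries. A sketch of what that playlist might look like (segment filenames and durations here are illustrative, not from my actual pipeline):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:5.0,
seg0.ts
#EXTINF:5.0,
seg1.ts
#EXTINF:5.0,
seg2.ts
#EXT-X-ENDLIST
```

With only three segments, the player gets at most two chances to switch renditions mid-playback, which is worth keeping in mind when weighing how much the adaptive behavior actually buys us.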
However, I'm not sure the added complexity is worth it. One drawback is that we need to transcode additional renditions for the web. Another is that AVPlayer on iOS is essentially a black box: it would be difficult or impossible to build features such as caching segments to disk, or reusing bandwidth measurements across videos. To get those features we would probably have to build our own HLS player from scratch, and that would take a lot of effort.
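To be fair, AVPlayer does expose a few knobs that partially address the bandwidth concern, even if it never exposes its internal measurements. A minimal sketch, assuming a hypothetical stream URL, of capping the bitrate ourselves (e.g. based on our own network estimate) rather than letting the player start from scratch:

```swift
import AVFoundation

// Hypothetical HLS master playlist URL for illustration only.
let url = URL(string: "https://example.com/videos/abc123/master.m3u8")!

let item = AVPlayerItem(url: url)

// Cap the variant AVPlayer may select, e.g. using a bandwidth estimate
// we measured ourselves on a previous video (2 Mbps here is arbitrary).
item.preferredPeakBitRate = 2_000_000

// With 5-second segments, buffering ~one segment ahead limits wasted
// downloads if the user skips away from a short video early.
item.preferredForwardBufferDuration = 5

let player = AVPlayer(playerItem: item)
player.play()
```

This doesn't give us segment-level disk caching or visibility into AVPlayer's own throughput estimates, but it does let us carry our own measurements between videos without writing a player from scratch.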