
For a project I'm working on, I'm trying to stream video to an iPhone through its headphone jack. My estimated bitrate is about 200 kbps (if I'm wrong about this, please ignore that).

I'd like to squeeze as much performance out of this bitrate as possible and sound is not important for me, only video. My understanding is that to stream real-time video I will need to encode it with some codec on the fly and send compressed frames to the iPhone for it to decode and render. Based on my research, it seems that H.265 is one of the most space-efficient codecs available, so I'm considering using that.

Assuming my basic understanding of live streaming is correct, how would I estimate the FPS I could achieve for a given resolution using the H.265 codec?

The best solution I can think of is to take a video file, encode it with H.265, and trim it to 1 minute to see how large the file is. The issue I see with this approach is that I think my calculations would include some overhead from the video container format (AVI, MKV, etc.) and from the audio channels that I don't care about.
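To make that concrete, here is roughly the measurement I have in mind, as a sketch (it assumes ffmpeg built with libx265 is on the PATH; `source.mp4` and the settings are just placeholders):

```python
# Sketch of my measurement idea: encode a 1-minute clip with H.265,
# then derive an average bitrate from the resulting file size.
# Assumes ffmpeg with libx265 is on PATH; "source.mp4" is a placeholder.
import os
import subprocess

SRC = "source.mp4"       # placeholder input video
OUT = "sample_1min.mkv"  # 1-minute test encode (Matroska container)

# Trim to 60 seconds, re-encode the video track with libx265 targeting
# the 200 kbit/s budget, and copy the audio as-is.
subprocess.run(
    ["ffmpeg", "-y", "-i", SRC, "-t", "60",
     "-c:v", "libx265", "-b:v", "200k",
     "-c:a", "copy", OUT],
    check=True,
)

size_bits = os.path.getsize(OUT) * 8
avg_kbps = size_bits / 60 / 1000
# Whatever this figure exceeds 200 kbit/s by is exactly the overhead
# I'm worried about: the Matroska container plus the audio stream.
print(f"average bitrate of {OUT}: {avg_kbps:.1f} kbit/s")
```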

Vivek Seth
    You don't have to create an audio stream and muxing overhead is usually much less than 1%. You could also create a raw H.265 bitstream. – Gyan Jun 19 '16 at 05:55

1 Answer


I'm trying to stream video to an iPhone through its headphone jack.

Good luck with that. The headphone jack is audio-only.

My estimated bitrate is about 200kbps

At what resolution? 320x240?

I'd like to squeeze as much performance out of this bitrate as possible and sound is not important for me, only video.

Then drop the sound streams altogether. Really though, 200 kbit/s isn't enough for video of any reasonable size or quality.

Assuming my basic understanding of live streaming is correct, how would I estimate the FPS I could achieve for a given resolution using the H.265 codec?

Nobody knows, because you've told us almost nothing about what's in this video. The bandwidth required for the video is a product of many factors, such as:

  • Resolution
  • Desired quality
  • Color space
  • Visual complexity of the scene
  • Movement and scene changes
  • Tweaks and encoding parameters (fast start? low latency?)

You're going to have to decide what sort of quality you're willing to accept, and decide subjectively what the balance between that quality and frame rate is. (Remember too that if there isn't much going on, you basically get frames for free since they take very little bandwidth. Experiment.)
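As a rough back-of-the-envelope sketch of that trade-off (the 200 kbit/s budget and the candidate resolutions/frame rates below are just assumptions to swap for your own numbers), you can see how thinly the budget gets spread per frame and per pixel:

```python
# Rough arithmetic only: spread a fixed bit budget across frames and pixels.
# The 200 kbit/s budget and the candidate settings below are assumptions;
# compare the bits-per-pixel figures against your own test encodes rather
# than treating any of them as a quality guarantee.
BUDGET_KBPS = 200

candidates = [
    # (width, height, fps)
    (320, 240, 10),
    (320, 240, 15),
    (480, 360, 10),
    (640, 480, 5),
]

for width, height, fps in candidates:
    bits_per_frame = BUDGET_KBPS * 1000 / fps
    bits_per_pixel = bits_per_frame / (width * height)
    print(f"{width}x{height} @ {fps:2d} fps: "
          f"{bits_per_frame / 1000:6.1f} kbit/frame on average, "
          f"{bits_per_pixel:.4f} bits/pixel")
```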

The best solution I can think of is to take a video file, encode it with H.265, and trim it to 1 minute to see how large the file is.

Take many videos, typical of what you'll be dealing with, and figure it out from there.
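For example, something along these lines (a sketch, assuming ffmpeg with libx265; the clip names, resolution, and frame rates are placeholders) batch-encodes a few representative clips at your 200 kbit/s budget so you can watch them and judge the quality for yourself:

```python
# Sketch: encode several representative clips at a fixed ~200 kbit/s target
# and a few candidate frame rates, then watch the results and judge quality.
# Assumes ffmpeg with libx265 on PATH; clip names and settings are placeholders.
import subprocess

CLIPS = ["talking_head.mp4", "outdoor_pan.mp4", "screen_capture.mp4"]
FRAME_RATES = [10, 15]

for clip in CLIPS:
    for fps in FRAME_RATES:
        out = f"{clip.rsplit('.', 1)[0]}_{fps}fps_200k.mp4"
        subprocess.run(
            ["ffmpeg", "-y", "-i", clip, "-t", "60",
             "-an",                   # no audio, as discussed above
             "-vf", "scale=320:240",  # candidate resolution
             "-r", str(fps),
             "-c:v", "libx265", "-b:v", "200k",
             out],
            check=True,
        )
        print("wrote", out)
```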

The issue I see with this approach is that I think my calculations would include some overhead from the video container format (AVI, MKV, etc.) and from the audio channels that I don't care about.

Your video stream won't have a container at all? Not even TS? You can use FFmpeg to dump the raw stream data for you.
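For instance (a sketch, assuming ffmpeg with libx265; filenames are placeholders), you can write the encoded video as a raw HEVC elementary stream, with no container and no audio, and read the video-only bitrate straight off the file size:

```python
# Sketch: dump a raw H.265/HEVC elementary stream (no container, no audio)
# and compute the video-only bitrate from its size.
# Assumes ffmpeg with libx265 on PATH; "source.mp4" is a placeholder.
import os
import subprocess

SRC = "source.mp4"
OUT = "sample_1min.hevc"
DURATION_S = 60

subprocess.run(
    ["ffmpeg", "-y", "-i", SRC, "-t", str(DURATION_S),
     "-an",                     # drop all audio streams
     "-c:v", "libx265", "-crf", "28",
     "-f", "hevc", OUT],        # raw Annex B HEVC bitstream, no container
    check=True,
)

kbps = os.path.getsize(OUT) * 8 / DURATION_S / 1000
print(f"video-only bitrate: {kbps:.1f} kbit/s")
```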

Brad
  • How would color space affect output video size? All else equal, if I changed the color space could I reduce the bitrate? – Vivek Seth Jun 19 '16 at 13:52
  • @VivekSeth Not by any amount that matters in your case. Changing your color space on the web is only going to make your video incompatible with a bunch of clients. All the things in that list you can easily control, and you latch onto colorspace? You can experiment with this and determine for yourself what works best for you. – Brad Jun 19 '16 at 16:44