
I am looking for a standardized approach to stream JPG images over the network. A C++ programming interface that can be easily integrated into existing software would also be desirable.

I am working on a GPGPU program which processes digitized signals and compresses them to JPG images. The size of the images can be defined by the user; typically the images are 1024 x 2048 or 2048 x 4096 pixels. I have written my "own" protocol, which first sends a header (image size in bytes, width, height, and corresponding channel) and then the JPG data itself via TCP. After that, the receiver sends a confirmation that all data has been received and displayed correctly, so that the next image can be sent. So far so good; unfortunately, my approach reaches just 12 fps, which does not satisfy the project requirements.
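For reference, the header-then-payload scheme described above could look roughly like this (a sketch only; POSIX sockets, the `FrameHeader` layout, and the `send_all` helper are assumptions, not the actual code):

```cpp
#include <cstdint>
#include <cstring>
#include <sys/socket.h>

// Illustrative header layout; the real protocol's field order may differ.
struct FrameHeader {
    uint32_t size_bytes;  // length of the JPG payload in bytes
    uint32_t width;       // image width in pixels
    uint32_t height;      // image height in pixels
    uint32_t channel;     // source channel of the digitizer
};

// Loop until the whole buffer is on the wire: send() may write less
// than requested on a stream socket.
static bool send_all(int fd, const void* buf, size_t len) {
    const char* p = static_cast<const char*>(buf);
    while (len > 0) {
        ssize_t n = send(fd, p, len, 0);
        if (n <= 0) return false;
        p += n;
        len -= static_cast<size_t>(n);
    }
    return true;
}

// Header first, then the compressed image data.
bool send_frame(int fd, const FrameHeader& h, const void* jpg) {
    return send_all(fd, &h, sizeof h) && send_all(fd, jpg, h.size_bytes);
}
```

Note that waiting for a per-frame acknowledgement before the next `send_frame` serializes the link round-trip into every frame, which is the kind of stall the comments below point at.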

I am sure that there are better approaches with higher frame rates. Which approach do streaming services like Netflix and Amazon take for UHD videos? Of course I googled a lot, but I couldn't find any satisfactory results.

One3Three7
  • In general, Netflix and other vendors stream over HTTP, but they buffer the content: for example, they buffer the first 5 minutes of the film, and while you are watching they continue downloading and buffering. – camp0 Nov 13 '19 at 19:22
    Are you looking for an image by image transfer or a video stream? Those are two different things. – Timo Nov 13 '19 at 19:27
  • You state that your current `"approach reaches just 12 fps"`. Have you profiled to find the bottleneck? Is it the CPU usage when generating the jpegs or network bandwidth when transferring those jpegs? Or something else? – G.M. Nov 13 '19 at 19:30
  • I am looking for an image-by-image transfer. Unfortunately it is not possible to pre-buffer some of the images, as they should be displayed in real time. I haven't profiled it exactly, but I'm pretty sure the cause of the bottleneck is the confirmation on the receiver side. If I skip the confirmation, the transmitter sends so fast that the images are no longer displayed correctly at the receiver. However, I would prefer a standardized procedure, since I am sure this problem has been addressed intensively by experts for a long time. – One3Three7 Nov 13 '19 at 19:43
  • How fast are you able to render the JPGs if they are just being read from a file locally on your machine? Videos have their own formats, e.g., mpeg-4. Video formats take advantage of the fact that the next frame may only have minor differences from the previous frame, so the compression can be much higher. – jxh Nov 13 '19 at 19:44
  • @jxh Well, it depends on and varies with the analog input signals and image sizes. Since I use the GPU for processing and compression, 150-250 fps are possible. – One3Three7 Nov 13 '19 at 20:02
  • @One3Three7 why do you even need the confirmation if you just display the data? When you say 12 fps is too low, it sounds like live visualization. This, on the other hand, would lean towards video streaming instead, since it is optimized for fluent picture display on variable bandwidth networks. – Timo Nov 13 '19 at 20:35
  • @Timo Indeed, it is a live visualization. I am not familiar with video or image processing, that is the reason why I asked. So, what would be your approach to transfer these images or video, to allow a real-time visualization? – One3Three7 Nov 13 '19 at 21:04

3 Answers


Is there a standardized method to send JPG images over network with TCP/IP?

There are several internet protocols that are commonly used to transfer files over TCP. Perhaps the most commonly used protocol is HTTP. Another, older one is FTP.

Which approach do streaming services like Netflix and Amzon take for UHD videos?

Firstly, they don't use JPEG at all. They use a video compression codec (such as MPEG) that compresses the data not only spatially, but also temporally (successive frames tend to hold similar data). An example of a protocol they might use to stream the data is DASH, which operates over HTTP.

eerorika

I don't have a specific library in mind that already does these things well, but some items to keep in mind:

  1. Most image / screen-share / video streaming applications use UDP-based protocols such as RTP and RTSP exclusively for the video stream data, in a lossy fashion. They use TCP for control-flow data, like sending key commands or client/server negotiation about what to present, but the streamed data itself is not sent over TCP.
  2. If you are streaming video, see this.
  3. For sending individual images you just need efficient methods to compress, serialize, and deserialize, and you probably want to do so in batches instead of one image at a time. Batch 10 JPGs together, compress them, serialize them, send.

You mentioned fps so it sounds like you are trying to stream video and not just copy over images in fast way. I'm not entirely sure what you are trying to do. Can you elaborate on the digitized signals and why they have to be in jpeg? Can they not be in some other format, later converted to jpeg at the receiving end?

genpfault
  • To clarify, I actually want to send pictures and not videos. The analog signals come from a radar sensor and are sampled by a digitizer (~4x200MB/s). The captured data is transferred by RDMA via PCIe to a GPU. On this GPU a radar algorithm is executed on the data. The processed data (float) is then mapped to pixels in BMP24. I assumed it would be useful to compress it before sending the data over the network. A GPU is suitable for such a compression and I would like to use the advantage of it. Which format would you prefer? Do you think decoding of JPGs is a bottleneck (not on GPU)? – One3Three7 Nov 13 '19 at 20:41
  • @One3Three7: The GPU should be able to encode to video. I had asked how fast you could decode, and you already said you expected more than 100 fps playback, so I eliminated that as a possible bottleneck. – jxh Nov 13 '19 at 20:55
  • @jxh Then there was a misunderstanding on my part; I was referring to processing and encoding the data, not to rendering and decoding. I'll check how long decoding and rendering take. – One3Three7 Nov 13 '19 at 21:14
  • @One3Three7: Note that 1Gb/s as used in the example in my answer is normally able to handle HD quality video at 60 fps. So stream formats offer much better compression, ie, allow more frames per second to be delivered. The only buffering required is to handle occasional network jitter (maybe 10 seconds of buffering if you are particular about not seeing jitter). – jxh Nov 13 '19 at 22:11

This is not a direct answer to your question, but a suggestion that you will probably need to change how you are sending your movie.

Here's a calculation: Suppose you can get 1Gb/s throughput out of your network. If each 2048x4096 file compresses to about 10MB (80Mb), then:

1000000000 ÷ (80 × 1000000) = 12.5

So, you can send about 12 frames a second. This means if you have a continuous stream of JPGs you want to display, if you want faster frame rates, you need a faster network.
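The arithmetic above fits in a one-line helper (the 1 Gb/s link and 80 Mb per frame are the example's assumptions, not measured values):

```cpp
// Maximum sustainable frame rate for a link of the given throughput,
// assuming each compressed frame occupies frame_bits on the wire.
double max_fps(double link_bits_per_sec, double frame_bits) {
    return link_bits_per_sec / frame_bits;
}
// max_fps(1e9, 80e6) -> 12.5 (1 Gb/s link, 80 Mb per frame)
```

This matches the ~12 fps the question reports, which suggests the link itself, not only the per-frame acknowledgement, caps the rate at these image sizes.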

If your stream is a fixed-length movie, then you could buffer the data and start the movie after enough data is buffered to allow playback at the desired frame rate, rather than waiting for the entire movie to download. If you want playback at 24 frames per second, you will need to buffer roughly half of the movie before you begin playback, because playback is nearly twice as fast as your download speed (the required fraction is 1 − 12.5/24 ≈ 48%).

As stated in another answer, you should use a streaming codec so that you can also take advantage of compression between successive frames, rather than just compressing the current frame alone.

To sum up, your playback rate will be limited by the number of frames you can transfer per second if the stream never ends (e.g., a live stream).

If you have a fixed length movie, buffering can be used to hide the throughput and latency bottlenecks.

jxh