
I am extracting frames from a camera and processing each extracted frame. Once the processing is done, I want to stream these frames with H.264 encoding to another system. How can I do that?

Raksha B
  • The question is too broad to answer. You need to provide at least the platform or language you need to use. If you google this you can find a lot of resources, I think. – Lakindu Akash Feb 11 '20 at 11:43
  • Thanks for the reply. I am using the Python language and a Jetson Nano for the processing. I can stream the frames after processing using a socket connection, but I need to encode them in H.264 format. I checked on Google but couldn't find the solution I am looking for. – Raksha B Feb 12 '20 at 03:28

2 Answers


You will generally want to put the H.264 stream into a video container like MP4 or AVI.

For example, the wrapping from raw frames to a streaming protocol for online video might be:

  • raw pixel bitmap
  • raw pixels encoded (e.g. H.264 encoded)
  • encoded video stream packaged into a container along with audio streams, subtitles, etc. (e.g. an MP4 container)
  • container broken into 'chunks' or segments for streaming (on iOS, using the HLS streaming format)

Another common approach is for a camera to stream content to a dedicated streaming server, and for the server to then provide streams to end devices using a streaming protocol like HLS or MPEG-DASH. An example (which, at the time of writing, appears to be kept updated) showing a stream from a camera via RTSP to a server, and then HLS or MPEG-DASH from the server, is here:

If your use case is simple, you will possibly not want to use a segmented ABR streaming protocol like HLS or MPEG-DASH, so you could just stream the MP4 file from a regular HTTP server.

One way to approach this that will allow you to build on others' examples is to use OpenCV in Python - you can see an example of writing video frames to an AVI or MP4 container in this question and its answers: Writing an mp4 video using python opencv
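
As a minimal sketch of that approach (the camera index, frame size, and FourCC here are assumptions you would adapt to your setup):

    import cv2

    # Sketch: grab frames, process them, and write them to an MP4 container.
    # 'mp4v' is widely available; an H.264 FourCC like 'avc1' only works if
    # your OpenCV build's FFmpeg backend includes an H.264 encoder.
    width, height, fps = 640, 480, 30
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = cv2.VideoWriter("output.mp4", fourcc, fps, (width, height))

    cap = cv2.VideoCapture(0)  # camera index 0 - adjust for your device
    try:
        for _ in range(300):  # e.g. roughly 10 seconds at 30 fps
            ok, frame = cap.read()
            if not ok:
                break
            # ... your per-frame processing goes here ...
            frame = cv2.resize(frame, (width, height))  # must match writer size
            writer.write(frame)
    finally:
        cap.release()
        writer.release()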

Once you have your MP4 file created, you can place it in a folder and use a regular HTTP server to make it available for users to download or stream.
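
For a quick test, even Python's built-in HTTP server is enough (a sketch; run it from the folder containing the MP4):

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # Serves the current working directory over HTTP on port 8000, so the
    # file above is then reachable at http://<host>:8000/output.mp4
    HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()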

Note that if you want to stream the frames as a live stream, i.e. as you are creating them one by one, then this is trickier, as you won't simply have a complete MP4 file to stream. If you do want to do this then leveraging an existing implementation would be a good place to start - this one is an example of a point-to-point web-socket-based live stream, and it is open source and Python based:
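
If you go down the live route, one common pattern is to pipe your frames into an ffmpeg subprocess that produces a fragmented MP4, which can be forwarded chunk by chunk (e.g. over a web socket) rather than as a finished file. A minimal sketch, assuming the ffmpeg binary is on your PATH and BGR frames as OpenCV produces them:

    import subprocess

    width, height, fps = 640, 480, 30
    proc = subprocess.Popen(
        [
            "ffmpeg",
            "-f", "rawvideo", "-pix_fmt", "bgr24",
            "-s", f"{width}x{height}", "-r", str(fps),
            "-i", "-",                                # raw frames on stdin
            "-c:v", "libx264",
            "-preset", "ultrafast", "-tune", "zerolatency",
            "-movflags", "frag_keyframe+empty_moov",  # fragmented MP4 for live use
            "-f", "mp4", "-",                         # fragmented MP4 on stdout
        ],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
    )

    # Feed frames with proc.stdin.write(frame.tobytes()) and read encoded
    # chunks with proc.stdout.read(4096) to forward over your socket. In
    # practice, do the reading in a separate thread to avoid pipe deadlock.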

Mick
  • Thanks for the reply. I need to write the frames into a container and then stream? Is that what you mean? – Raksha B Feb 12 '20 at 12:05

If you want to stream data over a UDP socket, use the RTP protocol for streaming.

Please go through the RFC specification, RFC 6184 (RTP Payload Format for H.264 Video).

Media pipeline for processing the camera data:

Camera raw data (RGB/YUV/NV12) -> H.264 encoder -> NALU packets -> RTP packetization -> socket communication.

You can use a Python interface to FFmpeg (for example, the ffmpeg-python bindings) to achieve this goal.
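
A minimal sketch of that pipeline, assuming the ffmpeg-python bindings (pip install ffmpeg-python), the ffmpeg binary on your PATH, and a hypothetical receiver at 192.168.1.10:5004:

    import ffmpeg

    width, height, fps = 640, 480, 30
    # Raw frames in on stdin; libx264 encodes to H.264 NAL units, and the
    # 'rtp' muxer handles the RFC 6184 packetization and the UDP send.
    process = (
        ffmpeg
        .input("pipe:", format="rawvideo", pix_fmt="bgr24",
               s=f"{width}x{height}", r=fps)
        .output("rtp://192.168.1.10:5004", format="rtp",
                vcodec="libx264", preset="ultrafast", tune="zerolatency")
        .run_async(pipe_stdin=True)
    )

    # For each processed frame (an HxWx3 uint8 array):
    #     process.stdin.write(frame.tobytes())
    # When done:
    #     process.stdin.close(); process.wait()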

mail2subhajit