
I'm trying to get familiar with libav in order to process a raw H.264 stream from a GenICam-supporting camera. I'd like to receive the raw data via the interfaces (API) provided by GenICam, and then forward that data into libav in order to produce a container file that is then streamed to a playback device like VLC or (later) to a display implementation of my own.

So far, I have played around with the GenICam sample code, which transfers the raw H.264 data into a "sample.h264" file. I have put this file through the ffmpeg command line tool in order to produce an mp4 container file that I can open and watch in VLC:

command: ffmpeg -i "sample.h264" -c:v copy -f mp4 "out.mp4"

Currently, I am digging through examples and documentation for H.264, ffmpeg, libav, and video processing in general. I have to admit that, as a total beginner, it confuses me a lot. I am at the point where I think I have found the libav functions that should help my undertaking:

I think, basically, I need the functions avcodec_send_packet() and avcodec_receive_frame() (since avcodec_decode_video2() is deprecated). Before that, I set up an AVCodecContext structure and open it with the H.264 codec (AV_CODEC_ID_H264).

So far, my code looks like this (omitting error checking and other stuff):

...
AVCodecContext* avCodecContext = nullptr;
AVCodec *avCodec = nullptr;
AVPacket *avPacket = av_packet_alloc();
AVFrame *avFrame = nullptr;
...
avCodec = avcodec_find_decoder(AV_CODEC_ID_H264);
avCodecContext = avcodec_alloc_context3(avCodec);
avcodec_open2(avCodecContext, avCodec, NULL);
av_init_packet(avPacket);
...

while(receivingRawDataFromCamera)
{
  ...
  // receive raw data via GenICam
  DSGetBufferInfo<void*>(hDS, sBuffer.BufferHandle, BUFFER_INFO_BASE, NULL, pPtr)

  // libav action
  avPacket->data = static_cast<uint8_t*>(pPtr);
  avErr = avcodec_send_packet(avCodecContext, avPacket);
  avFrame = av_frame_alloc();
  avErr = avcodec_receive_frame(avCodecContext, avFrame);

  // pack frame in container? (not implemented yet)
  ...
}

The result of the code above is that both calls, to send_packet() and receive_frame(), return error codes (-22 and -11) which I'm not able to decode via av_strerror() (it only says these are error codes 22 and 11).

Edit: Maybe as additional information for those who wonder whether

avPacket->data = static_cast<uint8_t*>(pPtr);

is a valid operation... After the very first call to this operation, the content of avPacket->data is

{0x0, 0x0, 0x0, 0x1, 0x67, 0x64, 0x0, 0x28, 0xad, 0x84, 0x5,
  0x45, 0x62, 0xb8, 0xac, 0x54, 0x74, 0x20, 0x2a, 0x2b, 0x15, 0xc5,
  0x62}

which somehow looks like something to be expected because of the NAL start code and NAL unit type at the beginning? I don't know, since I'm really a total beginner...

The question now is: am I on the right path? What is missing, and what do the codes 22 and 11 mean?

The next question would be what to do afterwards in order to get a container that I can stream (in real time) to a player.

Thanks in advance, Maik

Kiamur
  • https://github.com/leandromoreira/ffmpeg-libav-tutorial/blob/master/README.md This might be helpful for you. – Vencat Apr 16 '19 at 11:30
  • It is... in fact, I was reading this before I posted here and was very excited because I thought I had found a good starting point. However, I seem to be missing something, because it is not working as I intend it to. A lot of examples deal with first unpacking a container to get down to the raw H264 data stream. In my case, I thought I could just skip the unpacking and jump in at the point where the raw data is processed, but for some reason it does not work. – Kiamur Apr 16 '19 at 12:24
  • Have you tried to play your "sample.h264" file in VLC? If it plays, what does the media info panel show about the stream? Does it have any container? If you are using an ffmpeg version less than 4, you have to call av_register_all() at the beginning. I would recommend you achieve this with the ffmpeg command line before moving to the C API. – Vencat Apr 16 '19 at 18:10
  • Thanks Vencat, I've already used the ffmpeg command line tool to convert the raw data into an mp4 container successfully. Therefore I believe it must be possible to do with the data I receive from the camera. The version is greater than 4, btw. – Kiamur Apr 17 '19 at 05:55
  • In the media info panel of VLC, just the "Codec" tab contains some data: Codec: H264 - MPEG-4 AVC (part 10) (h264); Type: Video; Video Resolution: 1920x1080; Buffer Size: 1920x1088; Refresh Rate: 25. I have a German VLC, so maybe the translation is not exact, but those are all the infos that VLC tells me about my sample.h264 file. It plays correctly, though... – Kiamur Apr 17 '19 at 06:41
  • share 10 sec sample of your recorded video, if possible (upload it on any public cloud and share the link here).I'll try to use C-API from my side. – Vencat Apr 17 '19 at 09:06
  • Thanks Vencat, here we go: http://s000.tinyupload.com/?file_id=41396612773253231019 http://s000.tinyupload.com/?file_id=08265780530731903335 One file is the raw data stream from the camera (sample.h264), the other one is the mp4 container packed by the ffmpeg command line tool. For me, it is important to be able to process the raw data stream. – Kiamur Apr 17 '19 at 09:47
  • 1
    As a quick help, you can use the ffmpeg remuxing example code (https://ffmpeg.org/doxygen/trunk/remuxing_8c-example.html) to mux your raw coded h264 file into an mp4 container. I have tested it in ffmpeg version 4.0 and it works fine. You can use the gcc command below to compile this example code: $ gcc muxing.c -lavformat -lavcodec -lswscale -lavutil -lavfilter $ ./a.out sample.h264 sample.mp4 – Vencat Apr 17 '19 at 12:03
  • Hi Vencat, it compiles and executes and I get an output mp4 file which plays just for one second. Maybe that is the case, because there are problems with the timestamps in my sample.h264 file. Anyway, it gives me something to look into with the debugger, in order to see what is going on. Thanks for that. (I probably come back with more questions ;-) ) – Kiamur Apr 17 '19 at 12:19
  • I think the obvious question now is, how I can use a "live" stream of h264 data, as I receive it from the camera in order to stream it in real time to a player. In the example, I examined the path of the input file argument and found out that all called sub-functions indeed expect a pointer to a file. Although the data I receive from my GenICam buffer is exactly the same data (to the last bit) like is contained in the sample.h264 file, I just cannot figure out how to use the direct stream from the buffer, in order to process the data with libav. It seems, libav is strictly i/o-file based. – Kiamur Apr 18 '19 at 09:24

1 Answer


At least for the initially asked question, I found the solution myself:

In order to get rid of the errors on calling the functions

avcodec_send_packet(avCodecContext, avPacket);
...
avcodec_receive_frame(avCodecContext, avFrame);

I had to manually fill some parameters of 'avCodecContext' and 'avPacket':

avCodecContext->bit_rate = 8000000;
avCodecContext->width = 1920;
avCodecContext->height = 1080;
avCodecContext->time_base.num = 1;
avCodecContext->time_base.den = 25;
...
avPacket->data = static_cast<uint8_t*>(pPtr);
avPacket->size = datasize;
avPacket->pts = frameid;

where 'datasize' and 'frameid' are received via GenICam and may not be the most appropriate values for these fields, but at least I do not get any errors anymore.

Since this answers my initial question of how to get the raw data into the structures of libav, I consider the question answered.

The discussion and suggestions from Vencat in the comments section lead to additional questions, but those should be discussed in a new question, I guess.

Wai Ha Lee
Kiamur