
I have a Java function which receives bytes of an H264 stream as follows:

```java
void bytesReceived(byte[] bytes, int size)
```

Using ffmpeg, how can I transcode these bytes to some sort of image or video format? I would be happy with MP4, JPEG, etc. I've seen lots of examples using files with ffmpeg, but I don't know how to use the command-line operations it offers to handle a stream of bytes.

Thanks.

Mitchell
  • You can pipe this stream to ffmpeg: `ffmpeg -f h264 -i - -c copy file.mp4` – Gyan Jul 19 '17 at 19:19
  • @Mulvya Do you mean to suggest that a bytestream could be passed via the command line? What would this method implementation look like? I am familiar with Java processes / Runtime execution, but would I have to somehow redirect these bytes to standard input in order to continually transcode? The byte array will not be representative of the entire video, just the current portion: I am streaming these from an image source (camera). – Mitchell Jul 19 '17 at 19:27
  • ffmpeg can receive the bitstream on its standard input, so that's where you have to redirect it. Not familiar with Java, so can't help you there. If you start the process mid-stream, some of the initial frames won't get transcoded, as they'll likely be missing their reference frames. – Gyan Jul 19 '17 at 19:30
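A minimal sketch of the redirection described in the comments above, assuming ffmpeg is on the PATH and using the `-f h264 -i - -c copy` flags suggested by Gyan; the class and method names here are illustrative, not part of any existing API:

```java
import java.io.IOException;
import java.io.OutputStream;

public class FfmpegPipe {
    private final Process ffmpeg;
    private final OutputStream ffmpegStdin;

    public FfmpegPipe(String outputFile) throws IOException {
        // Start ffmpeg reading a raw H.264 elementary stream from stdin ("-i -")
        // and remuxing it into an MP4 without re-encoding ("-c copy").
        ffmpeg = new ProcessBuilder(
                "ffmpeg", "-f", "h264", "-i", "-", "-c", "copy", outputFile)
                // Let ffmpeg's log output go to the console so its buffers never fill up.
                .redirectOutput(ProcessBuilder.Redirect.INHERIT)
                .redirectError(ProcessBuilder.Redirect.INHERIT)
                .start();
        ffmpegStdin = ffmpeg.getOutputStream();
    }

    // Called for each chunk received from the camera.
    void bytesReceived(byte[] bytes, int size) throws IOException {
        ffmpegStdin.write(bytes, 0, size);
        ffmpegStdin.flush();
    }

    // Call when the stream ends so ffmpeg can finalize the MP4.
    void close() throws IOException, InterruptedException {
        ffmpegStdin.close();
        ffmpeg.waitFor();
    }
}
```

As noted in the comments, if the process is started mid-stream, the first frames may be dropped until a keyframe arrives.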

1 Answer


Since your incoming data is already compressed (H.264) and you are fine with MP4, I'd use https://github.com/sannies/mp4parser to simply wrap your incoming H.264 stream into an MP4 file in pure Java.
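For reference, wrapping a raw H.264 elementary stream into an MP4 with mp4parser looks roughly like the sketch below. It follows the project's published examples for the classic `com.googlecode.mp4parser` artifact (class names differ in newer `org.mp4parser` releases) and reads from a file rather than a live byte stream, so feeding it directly from `bytesReceived` would need a custom `DataSource`; treat the file names as placeholders.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.nio.channels.FileChannel;

import com.coremedia.iso.boxes.Container;
import com.googlecode.mp4parser.FileDataSourceImpl;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;
import com.googlecode.mp4parser.authoring.tracks.H264TrackImpl;

public class WrapH264 {
    public static void main(String[] args) throws Exception {
        // Parse the raw H.264 elementary stream (here assumed to have been dumped to a file).
        H264TrackImpl h264Track = new H264TrackImpl(new FileDataSourceImpl("capture.h264"));

        // Add the track to a Movie and build the MP4 container around it (no re-encoding).
        Movie movie = new Movie();
        movie.addTrack(h264Track);
        Container mp4 = new DefaultMp4Builder().build(movie);

        // Write the finished MP4 to disk.
        try (FileChannel channel = new FileOutputStream(new File("capture.mp4")).getChannel()) {
            mp4.writeContainer(channel);
        }
    }
}
```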

Markus Schumann
  • Hi @Markus Schumann, where can I get a good example of this? My input is H.264 (video/avc) from Android's MediaCodec, sent over sockets. I want to render the frames as they come in and also save them as MP4 simultaneously. I tried FFMpegGrabber but it hangs on the start method. I also tried Xuggler, but it could not create the needed container. – gbenroscience Mar 16 '20 at 19:00