
I see a ton of info about piping a raspivid stream directly to FFMPEG for encoding, muxing, and restreaming, but these examples are mostly run from bash, similar to:

raspivid -n -w 480 -h 320 -b 300000 -fps 15 -t 0 -o - | ffmpeg -i - -f mpegts udp://192.168.1.2:8090

I'm hoping to utilize the functionality of the Picamera library so I can do concurrent processing with OpenCV and similar while still streaming with FFMPEG. But I can't figure out how to properly open FFMPEG as a subprocess and pipe video data to it. I've seen plenty of attempts, unanswered posts, and people claiming to have done it, but none of it seems to work on my Pi.

Should I create a video buffer with Picamera and pipe that raw video to FFMPEG? Can I use camera.capture_continuous() and pass FFMPEG the bgr24 images I'm using for my OpenCV calculation?

I've tried all sorts of variations and I'm not sure if I'm misunderstanding how to use the subprocess module or FFMPEG, or if I'm simply missing a few settings. I understand the raw stream won't have any metadata, but I'm not completely sure what settings I need to give FFMPEG for it to understand what I'm giving it.

I have a Wowza server I'll eventually be streaming to, but I'm currently testing by streaming to a VLC server on my laptop. I've currently tried this:

import subprocess as sp
import picamera
import picamera.array
import numpy as np

npimage = np.empty(
        (480, 640, 3),
        dtype=np.uint8)
with picamera.PiCamera() as camera:
    camera.resolution = (640, 480)
    camera.framerate = 24

    camera.start_recording('/dev/null', format='h264')
    command = [
        'ffmpeg',
        '-y',
        '-f', 'rawvideo',
        '-video_size', '640x480',
        '-pix_fmt', 'bgr24',
        '-framerate', '24',
        '-an',
        '-i', '-',
        '-f', 'mpegts', 'udp://192.168.1.54:1234']
    pipe = sp.Popen(command, stdin=sp.PIPE,
                    stdout=sp.PIPE, stderr=sp.PIPE, bufsize=10**8)
    if pipe.returncode != 0:
        output, error = pipe.communicate()
        print('Pipe failed: %d %s %s' % (pipe.returncode, output, error))
        raise sp.CalledProcessError(pipe.returncode, command)

    while True:
        camera.wait_recording(0)
        for i, image in enumerate(
                        camera.capture_continuous(
                            npimage,
                            format='bgr24',
                            use_video_port=True)):
            pipe.stdout.write(npimage.tostring())
    camera.stop_recording()

I've also tried writing the stream to a file-like object that simply creates the FFMPEG subprocess and writes to its stdin (camera.start_recording() accepts an object like this when you initialize the picam):

class PipeClass():
    """Start pipes and load ffmpeg."""

    def __init__(self):
        """Create FFMPEG subprocess."""
        self.size = 0
        command = [
            'ffmpeg',
            '-f', 'rawvideo',
            '-s', '640x480',
            '-r', '24',
            '-i', '-',
            '-an',
            '-f', 'mpegts', 'udp://192.168.1.54:1234']

        self.pipe = sp.Popen(command, stdin=sp.PIPE,
                         stdout=sp.PIPE, stderr=sp.PIPE)

        if self.pipe.returncode != 0:
            raise sp.CalledProcessError(self.pipe.returncode, command)

    def write(self, s):
        """Write to the pipe."""
        self.pipe.stdin.write(s)

    def flush(self):
        """Flush pipe."""
        print("Flushed")

usage:
(...)
with picamera.PiCamera() as camera:
    p = PipeClass()
    camera.start_recording(p, format='h264')
(...)

Any assistance with this would be amazing!

3 Answers


I have been able to stream PiCamera output to ffmpeg with something like the following:

import picamera
import subprocess

# start the ffmpeg process with a pipe for stdin
# I'm just copying to a file, but you could stream to somewhere else
ffmpeg = subprocess.Popen([
    'ffmpeg', '-i', '-',
    '-vcodec', 'copy',
    '-an', '/home/pi/test.mpg',
    ], stdin=subprocess.PIPE)

# initialize the camera
camera = picamera.PiCamera(resolution=(800, 480), framerate=25)

# start recording to ffmpeg's stdin
camera.start_recording(ffmpeg.stdin, format='h264', bitrate=2000000)

Or is that not what you're looking for?
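Adapting this to the UDP/MPEG-TS target from the question should only require swapping the output arguments; here's a sketch (untested against real hardware, using the address from the question; only the `ffmpeg`-command helper runs off the Pi):

```python
import subprocess

def build_ffmpeg_cmd(target):
    """Remux the camera's H.264 output into MPEG-TS; no re-encoding needed."""
    return ['ffmpeg', '-i', '-',
            '-vcodec', 'copy',   # the stream is already H.264, just change container
            '-an',               # no audio track
            '-f', 'mpegts', target]

def stream_to_udp(target='udp://192.168.1.54:1234', seconds=60):
    """Run this on the Pi: pipe the camera's encoder output straight into ffmpeg."""
    import picamera  # only available on the Pi itself
    ffmpeg = subprocess.Popen(build_ffmpeg_cmd(target), stdin=subprocess.PIPE)
    with picamera.PiCamera(resolution=(640, 480), framerate=24) as camera:
        camera.start_recording(ffmpeg.stdin, format='h264', bitrate=2000000)
        camera.wait_recording(seconds)   # stream for a while, then stop
        camera.stop_recording()
    ffmpeg.stdin.close()
    ffmpeg.wait()
```

The key point is the same as above: the GPU already produces H.264, so `-vcodec copy` lets ffmpeg just re-package it instead of re-encoding.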

  • This is probably a lot easier :) although putting AVC video in an MPEG-1 PS container is weird — does that even work? In any case, OP is streaming to UDP as TS, so that part is under control. – hobbs Jul 22 '17 at 02:34
  • Seems to work for me! The output is playable in omxplayer. But I'm open to suggestions for improvements! – Kevin Villela Jul 23 '17 at 03:19
  • This is exactly what I needed! I was clearly misusing the pipes to the subprocess, but it helped me realize a few of the settings I was trying to use in FFMPEG were also giving me issues. Thanks! – VeniVidiReliqui Jul 24 '17 at 16:17
  • 1
    @KevinVillela a `.mp4` file (ISO BMFF) would be the usual. – hobbs Jul 25 '17 at 03:32

Two problems that I see at first glance:

  1. In your first example, you're writing your data into the subprocess's stdout instead of its stdin. That definitely doesn't work, and probably causes a hang.

  2. In both examples, you're starting the process with stdout=sp.PIPE, stderr=sp.PIPE and then never reading from those pipes. That means that as soon as ffmpeg writes enough output to fill the pipe buffer, it will block and you'll have a deadlock. Use the default stdout=None, stderr=None to let ffmpeg's output go to your process's stdout and stderr, or connect them to a filehandle opened to /dev/null to discard the output. Or use the communicate method to get the output each time you write some input, and do something useful with it (like monitoring the status of the stream).
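The fix for point 2 is easy to demonstrate with the standard library alone. A sketch, using `cat` as a harmless stand-in for ffmpeg (the point is the stdout/stderr handling, not the command):

```python
import subprocess

# Stand-in for ffmpeg: any subprocess that produces output behaves the same way.
proc = subprocess.Popen(
    ['cat'],                        # reads stdin, echoes everything to stdout
    stdin=subprocess.PIPE,
    stdout=subprocess.DEVNULL,      # discard output instead of letting a PIPE fill up
    stderr=subprocess.DEVNULL,
)
for _ in range(1000):
    proc.stdin.write(b'x' * 65536)  # keeps flowing; with stdout=PIPE and no reader,
                                    # a write like this would eventually deadlock
proc.stdin.close()
proc.wait()
print(proc.returncode)  # → 0
```

With `stdout=subprocess.PIPE` and nobody reading, the same loop would hang once the OS pipe buffer filled.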

hobbs

Hi, you can use OpenCV with FFMPEG and restream it to Wowza or elsewhere.

Here is a sample with OpenCV and FFMPEG (comments translated from Turkish):

#include <opencv2/opencv.hpp>
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <iostream>

using namespace cv;
using namespace std;

int main(int argc, char* argv[])
{
    if (argc < 4){
        cout << "missing parameters" << endl;
        return -1;
    }
    int fps = 1;                        // fps value, defaults to 1
    char *input_adress = argv[1];       // file or address to read the video from
    char *output_adress = argv[3];      // file or address to send the output to
    sscanf(argv[2], "%d", &fps);        // read the fps value
    VideoCapture video(input_adress);   // open the camera/stream
    if (!video.isOpened()){
        cout << "Could not open the stream!!!" << endl;
        getchar();
        return -1;
    }
    Mat frame;                          // frame buffer
    FILE *pipe;                         // input stream of the process opened for the pipe
    char *cmd = (char*)calloc(100 + strlen(output_adress) + 1, sizeof(char));   // space for the command
    sprintf(cmd, "ffmpeg -y -f image2pipe -vcodec mjpeg -r %d -i - -r %d -vcodec libx264 -f flv %s", fps, fps, output_adress);  // the ffmpeg command
    // launch the ffmpeg command (_popen is the Windows variant of popen)
    if (!(pipe = _popen(cmd, "wb"))){
        cout << "Could not open the pipe!!!" << endl;
        return 1;
    }
    float wait_time = 1000.0f / (float)fps;     // how many milliseconds to wait between frames
    while (true)
    {
        try {
            // read the next frame from the video
            video >> frame;
            if (frame.empty())          // an empty frame means the video has ended
                break;
            adjust_brightness(frame);   // adjust the brightness (the answerer's own helper)
            write_jpeg(frame, pipe);    // convert to JPEG and write it to the pipe (the answerer's own helper)
            if (waitKey(wait_time) >= 0) break;
        }
        catch (Exception ex){
        }
    }
    video.release();
    return 0;
}

In the write_jpeg method, fwrite and fflush are called.

It is called like this (note the fps argument the code reads from argv[2]):

testOutput.exe ORIGINAL_SOURCE FPS RTMP_OUTPUT
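The same image2pipe idea translates to Python with OpenCV, which is closer to the OP's stack. A sketch, assuming `cv2` and `ffmpeg` are installed and the RTMP target is hypothetical:

```python
import subprocess

def build_cmd(fps, output):
    """ffmpeg reads a stream of JPEG images from stdin (image2pipe) and re-encodes to FLV."""
    return ['ffmpeg', '-y', '-f', 'image2pipe', '-vcodec', 'mjpeg',
            '-r', str(fps), '-i', '-',
            '-r', str(fps), '-vcodec', 'libx264', '-f', 'flv', output]

def stream(source, fps, output):
    """JPEG-compress each captured frame and feed it to ffmpeg's stdin."""
    import cv2  # OpenCV's Python bindings; imported here so the helper above stays dependency-free
    video = cv2.VideoCapture(source)
    pipe = subprocess.Popen(build_cmd(fps, output), stdin=subprocess.PIPE)
    while True:
        ok, frame = video.read()
        if not ok:                              # a failed read means the stream ended
            break
        ok, jpeg = cv2.imencode('.jpg', frame)  # compress the frame to JPEG bytes
        pipe.stdin.write(jpeg.tobytes())
    pipe.stdin.close()
    pipe.wait()
    video.release()
```

JPEG-per-frame costs more bandwidth and CPU than passing through the Pi's hardware H.264 stream (as in the accepted answer), but it lets you modify each frame with OpenCV before streaming.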

Emre Karataşoğlu