
The goal is to stream video from a Raspberry Pi (Raspivid/H.264) over the network and into an OpenCV application running on a laptop.

The OpenCV capture code is as follows (C++):

#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap;
    cap.open("cam_1"); // cam_1 is a FIFO

    cv::Mat frame;

    while (true) {
        cap >> frame;
        if (frame.empty())   // stop when the stream ends or a read fails
            break;
        cv::imshow("stream", frame);
        cv::waitKey(10);
    }
    return 0;
}

The FIFO is created as follows:

mkfifo cam_1

Once the OpenCV program is running, the netcat listener is started:

ncat --recv-only --keep-open --verbose --listen 5001 > cam_1

Once the netcat listener is running on the laptop, the stream is started from the Raspberry Pi:

raspivid --verbose --nopreview -b 2000000 --timeout 0 -o - | ncat 192.168.LAPTOP.IP 5001

or, for debugging purposes, a local file on the laptop can be streamed into netcat:

cat video.h264 | nc 192.168.LAPTOP.IP 5001 

Both methods give the following error:

Unable to stop the stream: Inappropriate ioctl for device (ERROR)icvOpenAVI_XINE(): Unable to initialize video driver.

What is interesting is that if I start the netcat listener on the laptop, kill it with Ctrl+C, and then start it again before starting the video stream (with either method), the video plays properly.

I cannot figure out why starting the netcat listener, killing it, and then starting it again has an effect, or what that effect is. I have considered that I may need to echo an EOF or BOF into the FIFO before the video, but I am unsure what that syntax would be.

I have tried all flavors of Netcat.

2 Answers


I solved this using the approach from this answer: https://stackoverflow.com/a/48675107/2355051

I ended up adapting this picamera Python recipe.

On the Raspberry Pi: (createStream.py)

import io
import socket
import struct
import time
import picamera

# Connect a client socket to the server (change 10.0.0.3 to the address
# of the machine running the receiving script, which listens on port 777)
client_socket = socket.socket()
client_socket.connect(('10.0.0.3', 777))

# Make a file-like object out of the connection
connection = client_socket.makefile('wb')
try:
    with picamera.PiCamera() as camera:
        camera.resolution = (1024, 768)
        # Start a preview and let the camera warm up for 2 seconds
        camera.start_preview()
        time.sleep(2)

        # Note the start time and construct a stream to hold image data
        # temporarily (we could write it directly to connection but in this
        # case we want to find out the size of each capture first to keep
        # our protocol simple)
        start = time.time()
        stream = io.BytesIO()
        for foo in camera.capture_continuous(stream, 'jpeg', use_video_port=True):
            # Write the length of the capture to the stream and flush to
            # ensure it actually gets sent
            connection.write(struct.pack('<L', stream.tell()))
            connection.flush()

            # Rewind the stream and send the image data over the wire
            stream.seek(0)
            connection.write(stream.read())

            # Reset the stream for the next capture
            stream.seek(0)
            stream.truncate()
    # Write a length of zero to the stream to signal we're done
    connection.write(struct.pack('<L', 0))
finally:
    connection.close()
    client_socket.close()

On the machine that is processing the stream: (processStream.py)

import io
import socket
import struct
import cv2
import numpy as np

# Start a socket listening for connections on 0.0.0.0:777 (0.0.0.0 means
# all interfaces)
server_socket = socket.socket()
server_socket.bind(('0.0.0.0', 777))
server_socket.listen(0)

# Accept a single connection and make a file-like object out of it
connection = server_socket.accept()[0].makefile('rb')
try:
    while True:
        # Read the length of the image as a 32-bit unsigned int. If the
        # length is zero, quit the loop
        image_len = struct.unpack('<L', connection.read(struct.calcsize('<L')))[0]
        if not image_len:
            break
        # Construct a stream to hold the image data and read the image
        # data from the connection
        image_stream = io.BytesIO()
        image_stream.write(connection.read(image_len))
        # Rewind the stream, open it as an image with opencv and do some
        # processing on it
        image_stream.seek(0)
        data = np.frombuffer(image_stream.getvalue(), dtype=np.uint8)
        imagedisp = cv2.imdecode(data, 1)

        cv2.imshow("Frame", imagedisp)
        cv2.waitKey(1)  # imshow will not display a frame unless waitKey is called
finally:
    connection.close()
    server_socket.close()
    cv2.destroyAllWindows()  # clean up the display window on exit
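Both scripts share a simple framing protocol: each JPEG payload is prefixed with its byte length as a little-endian 32-bit unsigned int ('<L'), and a zero-length header marks the end of the stream. A minimal round-trip sketch of that framing (the helper names are my own, not part of the recipe):

```python
import io
import struct

HEADER = '<L'  # little-endian 32-bit unsigned length prefix

def write_frame(f, payload):
    # Prefix each payload with its byte length, then write the payload itself.
    f.write(struct.pack(HEADER, len(payload)))
    f.write(payload)

def read_frame(f):
    # Read the length header; a zero length means the sender is done.
    (length,) = struct.unpack(HEADER, f.read(struct.calcsize(HEADER)))
    if length == 0:
        return None
    return f.read(length)

# Round trip through an in-memory buffer standing in for the socket.
buf = io.BytesIO()
write_frame(buf, b'\xff\xd8 jpeg bytes \xff\xd9')
write_frame(buf, b'')              # zero-length terminator
buf.seek(0)
assert read_frame(buf) == b'\xff\xd8 jpeg bytes \xff\xd9'
assert read_frame(buf) is None     # end of stream
```

Because every frame is length-prefixed, the receiver never has to scan the byte stream for JPEG boundaries, which keeps the reader loop trivial.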

This solution has similar results to the video I referenced in my original question. Larger resolution frames increase latency of the feed, but this is tolerable for the purposes of my application.

First run processStream.py on the laptop, then execute createStream.py on the Raspberry Pi. If this doesn't work, run the Python scripts with sudo.


If you touch the FIFO after OpenCV has started trying to read it, but before you start streaming to it, then it will work.
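A minimal demonstration of why that ordering matters, using cat as a stand-in for the blocked OpenCV reader (the scratch directory and the sleep are illustrative):

```shell
cd "$(mktemp -d)"
mkfifo cam_1

# Stand-in for the OpenCV program: cat blocks inside open() until some
# writer opens the other end of the FIFO.
cat cam_1 > /dev/null &
reader=$!
sleep 0.2   # give the reader time to block on open()

# The priming step: touch briefly opens and closes the FIFO's write end,
# which releases the reader from its blocking open(). (cat then reads
# EOF and exits; OpenCV instead goes on to initialize its decoder.)
touch cam_1
wait "$reader"
echo "reader released"
```

After priming the real FIFO the same way, start the netcat listener (ncat --recv-only --keep-open --listen 5001 > cam_1) and then the stream from the Pi.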
