I'm trying to stream video from different Raspberry Pis: an RPi 2B/2B+ and an RPi 3. I found out how to stream RTMP/FLV video through NGINX and it works fine, but now I need to convert this stream to an MJPEG stream. Actually I need it as input for motion (detection).
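For context, the NGINX side is a plain nginx-rtmp-module setup along these lines (the application name and port here are illustrative, not my exact config):

```nginx
rtmp {
    server {
        listen 1935;            # default RTMP port
        application app {
            live on;            # accept live streams pushed by the Pis
            record off;         # relay only, no recording
        }
    }
}
```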
I've tried mjpg_streamer to publish an MJPEG stream, with FFmpeg/image2 in front to split the stream into image chunks. But it's too slow and too greedy on resources.
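That pipeline looked roughly like this (paths and frame rate are illustrative; the idea is that FFmpeg dumps JPEG frames to a directory via the image2 muxer and mjpg_streamer's input_file plugin serves them over HTTP):

```
# split the RTMP stream into individual JPEG frames (~10 fps) in a tmpfs dir
ffmpeg -i rtmp://127.0.0.1/app/my_stream -r 10 -f image2 /tmp/stream/frame_%04d.jpg &

# serve the newest JPEG from that folder as an MJPEG stream on port 8081
mjpg_streamer -i "input_file.so -f /tmp/stream" -o "output_http.so -p 8081"
```

The per-frame JPEG encode plus disk round-trip is what makes it heavy on a Pi.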
After many searches I installed streameye, which seems perfect, but the process traps about 10 s after a web client connects to the server hosted on the Raspberry Pi (same result on the 2 and the 3, so I don't think it's an ARMv6/ARMv7 issue; same with FFmpeg or avconv).
Moreover, it kills the SSH session used to access the Raspberry device. Can an FFmpeg trap cause an SSH session issue? Note that the Raspberry was not down after the connection loss, as all its services were still up and running; I just needed to connect via SSH again.
I don't know if it's due to my ffmpeg feeding command or if it's a streameye bug. I didn't find any info about it on Google, except a web-client issue that was supposedly fixed in 0.8 (the version I'm running).
From the FLV/h264 stream I run:
avconv -r 15 -f flv -i rtmp://127.0.0.1/app/my_stream -c:v mjpeg -crf 15 -f mjpeg - | streameye -p 8080
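One detail I'm unsure about: as far as I can tell the mjpeg encoder ignores `-crf` (that's an x264-style rate-control option) and takes `-q:v` (1 = best, 31 = worst) for JPEG quality instead, so the equivalent command would be something like:

```
avconv -r 15 -f flv -i rtmp://127.0.0.1/app/my_stream -c:v mjpeg -q:v 5 -f mjpeg - | streameye -p 8080
```

Either way the crash behaviour is the same, so I don't think the quality flag is the cause.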
The source stream is defined as follows (no audio stream):
Input #0, live_flv, from 'rtmp://boxnet-0-eth/live/boxnet-2':
  Metadata:
    Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
    displayWidth    : 480
    displayHeight   : 320
    fps             : 0
    profile         :
    level           :
  Duration: 00:00:00.00, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 480x320, 1k tbr, 1k tbn