
I am generating a video stream from my webcam with ffmpeg on my Raspberry Pi, using the following command:

ffmpeg -nostats -loglevel level+info -f v4l2 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 500k -bf 0 http://192.168.10.6:3030/stream

My API server receives the stream data and sends it through socket.io to my (React) client. I can receive the data on the client, but it arrives as a stream of ArrayBuffers; I'm not sure how to turn this data into an actual video displayed on a canvas. Any ideas?

The API route that handles the stream:

import logger from '@src/startup/logging'
import { Router } from 'express'
import { io } from '@src/index'

const stream = Router()

stream.post('/', (req, res) => {
  logger.info('received')

  req.on('data', (chunk: Buffer) => {
    logger.info(chunk.length)
    io.sockets.emit('data', chunk)
  })

  req.on('end', () => {
    logger.info('Stream Terminated')
    res.end()
  })

  // res.status(200).send('video')
})

export default stream
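
For context, io is the socket.io server instance exported from @src/index. A minimal sketch of such a setup, assuming a plain Express + http server listening on 3030 (the port the ffmpeg command posts to; everything else here is simplified), looks roughly like this:

import express from 'express'
import { createServer } from 'http'
import { Server } from 'socket.io'
import stream from './routes/stream' // hypothetical path to the route above

const app = express()
const httpServer = createServer(app)

// Exported so the /stream route can broadcast the incoming chunks
export const io = new Server(httpServer, { cors: { origin: '*' } })

app.use('/stream', stream)

// 3030 is the port the ffmpeg command posts to
httpServer.listen(3030)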

The client page that should transform the data into images (it doesn't work):

import { useEffect, useState } from 'react'
import socketIoClient from './socketIoClient'

export default function App(): JSX.Element {
  const [srcBlob, setSrcBlob] = useState<string | null>(null)

  useEffect(() => {
    const handleSocketIo = async () => {
      const socket = await socketIoClient()

      socket.on('greeting', (data: string) => {
        console.log(data)
      })

      socket.on('data', (data: ArrayBuffer) => {
        console.log(data)
        // I just copied this from another example, but it does not display anything
        const arrayBufferView = new Uint8Array(data)
        const blob = new Blob([arrayBufferView], { type: 'image/jpeg' })
        const urlCreator = window.URL || window.webkitURL
        setSrcBlob(urlCreator.createObjectURL(blob))
      })

      socket.emit('greeting', 'hello from client')
    }

    handleSocketIo()
  }, [])

  if (!srcBlob) {
    return <div>Loading...</div>
  }

  return <img src={srcBlob} />
}
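
(socketIoClient is not shown above; assuming it is a thin wrapper around socket.io-client that resolves once the connection is established, it would be roughly equivalent to this sketch, where the server URL is an assumption:)

import { io, Socket } from 'socket.io-client'

// Connects to the API server and resolves with the socket once connected.
// The URL is assumed from the ffmpeg target above.
export default function socketIoClient(): Promise<Socket> {
  return new Promise((resolve) => {
    const socket = io('http://192.168.10.6:3030')
    socket.on('connect', () => resolve(socket))
  })
}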

This is part of the output I get when I use ffmpeg:

[info] Input #0, video4linux2,v4l2, from '/dev/video0':
[info]   Duration: N/A, start: 174842.704640, bitrate: 147456 kb/s
[info]     Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
[info] Stream mapping:
[info]   Stream #0:0 -> #0:0 (rawvideo (native) -> mpeg1video (native))
[info] Press [q] to stop, [?] for help
[info] Output #0, mpegts, to 'http://192.168.10.6:3030/stream':
[info]   Metadata:
[info]     encoder         : Lavf58.20.100
[info]     Stream #0:0: Video: mpeg1video, yuv420p, 640x480, q=2-31, 500 kb/s, 30 fps, 90k tbn, 30 tbc
[info]     Metadata:
[info]       encoder         : Lavc58.35.100 mpeg1video
[info]     Side data:
[info]       cpb: bitrate max/min/avg: 0/0/500000 buffer size: 0 vbv_delay: -1

This is an example of the data received by the client (console.log):

(screenshot of the browser console showing the received ArrayBuffer objects)
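
A quick way to sanity-check that these chunks are raw MPEG-TS data (a sketch, not part of the setup above) is to look for the 0x47 sync byte that starts every 188-byte TS packet:

// MPEG-TS packets are 188 bytes long and each one starts with the sync byte 0x47.
// This assumes chunk boundaries line up with packet boundaries, which they
// usually do when ffmpeg's mpegts muxer writes straight to the HTTP request.
function looksLikeMpegTs(data: ArrayBuffer): boolean {
  const bytes = new Uint8Array(data)
  const start = bytes.indexOf(0x47)
  if (start < 0) return false
  for (let offset = start; offset + 188 <= bytes.length; offset += 188) {
    if (bytes[offset] !== 0x47) return false
  }
  return true
}

// e.g. inside the socket handler:
// socket.on('data', (data: ArrayBuffer) => console.log(looksLikeMpegTs(data)))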

  • Done something similar. The frames were MPEG images sent one after the other, so I picked the data as I would from a file and constructed an image for each. – Tarik Jan 03 '21 at 02:56
  • I am using mpeg as well; how do you build images? – devamat Jan 03 '21 at 02:58
  • Well, the first thing is to determine how you are getting the image data. Could you elaborate? Once you know how the images are sent and how to receive them, the rest is trivial. – Tarik Jan 03 '21 at 03:00
  • See https://stackoverflow.com/questions/59091927/displaying-png-image-in-arraybuffer-in-react – Tarik Jan 03 '21 at 03:07
  • Thanks, I added more details to my question! – devamat Jan 03 '21 at 03:08
