
I am looking to see whether I can run the Raspberry Pi and FLIR Lepton cameras simultaneously with Python and OpenCV on a Jetson Nano. I was able to run both cameras with this command from the terminal:

gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM),width=3264, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw,width=800, height=600' ! videoconvert ! ximagesink & gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,format=UYVY ! videoscale ! video/x-raw,width=800,height=600 ! videoconvert ! ximagesink

and I am wondering whether I can implement the above command in this Python/OpenCV code:

import cv2
print(cv2.__version__)
dispW=640
dispH=480
flip=2

camSet='nvarguscamerasrc !  video/x-raw(memory:NVMM), width=3264, height=2464, format=NV12, framerate=21/1 ! nvvidconv flip-method='+str(flip)+' ! video/x-raw, width='+str(dispW)+', height='+str(dispH)+', format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink'
cam= cv2.VideoCapture(camSet)
 
while True:
    ret, frame = cam.read()
    cv2.imshow('nanoCam',frame)
    if cv2.waitKey(1)==ord('q'):
        break
cam.release()
cv2.destroyAllWindows()
  • Do both cameras provide a common framerate? You may check with v4l2-ctl -d0 --list-formats-ext for video0, or v4l2-ctl -d1 --list-formats-ext for video1. Note that the v4l2-ctl command is provided by the v4l-utils package. – SeB Dec 09 '21 at 18:48
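
(For reference, a minimal sketch of running those v4l2-ctl checks from Python, assuming the v4l-utils package is installed and that the devices are /dev/video0 and /dev/video1:)

import subprocess

# Sketch: print each device's supported pixel formats, frame sizes and intervals.
for dev in ('/dev/video0', '/dev/video1'):
    print('=== ' + dev + ' ===')
    out = subprocess.run(['v4l2-ctl', '-d', dev, '--list-formats-ext'],
                         stdout=subprocess.PIPE, universal_newlines=True)
    print(out.stdout)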

1 Answer


Although I understand that your example gives some flexibility, it doesn't make much sense to capture at a higher resolution than you need for processing, and doing so costs resources and time.

nvarguscamerasrc should, to some extent, be able to manage the scaling.

Assuming your FLIR camera can achieve 30 fps, you could try:

import cv2
print(cv2.__version__)

cam0 = cv2.VideoCapture('nvarguscamerasrc ! video/x-raw(memory:NVMM), width=800, height=600, format=NV12, framerate=30/1 ! nvvidconv flip-method=2 ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink drop=1', cv2.CAP_GSTREAMER)
if not cam0.isOpened():
    print('Failed to open cam0')
    exit(-1)

cam1 = cv2.VideoCapture('v4l2src device=/dev/video1 ! video/x-raw,format=UYVY,framerate=30/1 ! videoscale ! video/x-raw,width=800,height=600 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1', cv2.CAP_GSTREAMER)
if not cam1.isOpened():
    print('Failed to open cam1')
    exit(-1)

while True:
    ret0, frame0 = cam0.read()
    ret1, frame1 = cam1.read()
    if ret0:
        cv2.imshow('Cam0', frame0)
    if ret1:
        cv2.imshow('Cam1', frame1)
    if cv2.waitKey(1) == ord('q'):
        break

cam0.release()
cam1.release()
cv2.destroyAllWindows()

[EDIT: not sure if the following works with USB cams]

As your FLIR camera provides UYVY, you may try the nvv4l2camerasrc plugin instead of v4l2src. Make sure your sensor is in UYVY mode, and try the following instead to leverage scaling and color conversion in hardware:

cam1= cv2.VideoCapture('nvv4l2camerasrc device=/dev/video1 ! video/x-raw(memory:NVMM),format=UYVY,framerate=30/1 ! nvvidconv ! video/x-raw,format=BGRx,width=800,height=600 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1', cv2.CAP_GSTREAMER)
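
A quick sanity check for this pipeline (a minimal sketch, reusing cam1 from the line above and assuming /dev/video1 is the Lepton) is to confirm it opened and that a frame of the expected size comes through:

# Sketch: confirm the capture opened and report the size of one frame.
if not cam1.isOpened():
    raise RuntimeError('Failed to open the /dev/video1 pipeline')
ret, frame = cam1.read()
if ret:
    print('Got frame:', frame.shape)  # expecting (600, 800, 3) for 800x600 BGR
else:
    print('Pipeline opened but no frame was returned')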
SeB
  • Hi @Seb, thanks for your reply. For the FLIR Lepton camera, do I need to add cv2.CAP_GSTREAMER at the end, since it is a USB camera? – Abdussalam Elhanashy Dec 09 '21 at 20:16
  • Yes: as these VideoCaptures use a GStreamer pipeline, the GStreamer backend will be used. The source plugin of each pipeline addresses the correct device (nvarguscamerasrc for the CSI camera through Argus, v4l2src or nvv4l2camerasrc for USB cameras with V4L2 support, such as UVC-compliant ones). You may post the output of the v4l2-ctl commands above for better advice. – SeB Dec 09 '21 at 20:42
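
(A minimal sketch of that point, using a simplified and assumed v4l2src pipeline for /dev/video1: pass cv2.CAP_GSTREAMER explicitly and check which backend OpenCV actually selected; getBackendName() is available in OpenCV 3.4.2 and later:)

import cv2

# Assumed, simplified pipeline for the USB camera; adjust the caps to your sensor.
pipeline = ('v4l2src device=/dev/video1 ! video/x-raw,format=UYVY ! '
            'videoconvert ! video/x-raw,format=BGR ! appsink drop=1')
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if cap.isOpened():
    print('Backend in use:', cap.getBackendName())  # should report GSTREAMER
else:
    print('Pipeline failed to open; check the device path and caps')
cap.release()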