
I am working on a MediaPipe real-time hand-tracking application for Android. The provided demo uses camera input from a SurfaceTexture's ExternalOES texture, but I want to use a network stream coming from WebRTC instead. The network stream is in YUV_I420 format, so I am converting it to RGB and creating an RGBA packet with AndroidPacketCreator, like this:

Packet imagePacket = packetCreator.createRgbaImageFrame(yuv_converted_bitmap);
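The conversion step that produces `yuv_converted_bitmap` is not shown. As a rough illustration of the per-pixel work that step has to do (which is the likely source of the slowdown), here is a plain-Java I420 → ARGB conversion; the class and method names are hypothetical, not part of the MediaPipe or WebRTC APIs:

```java
// Hypothetical helper: plain-Java I420 -> ARGB conversion (BT.601 full range).
// Illustrates the per-pixel cost of the Bitmap route; on a device this array
// could be turned into a Bitmap via Bitmap.createBitmap(argb, width, height, ...).
public class I420ToArgb {
    public static int[] convert(byte[] y, byte[] u, byte[] v, int width, int height) {
        int[] argb = new int[width * height];
        int chromaWidth = width / 2;
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                // Luma is full resolution; chroma planes are subsampled 2x2.
                int yv = y[row * width + col] & 0xFF;
                int uv = (u[(row / 2) * chromaWidth + col / 2] & 0xFF) - 128;
                int vv = (v[(row / 2) * chromaWidth + col / 2] & 0xFF) - 128;
                int r = clamp((int) (yv + 1.402f * vv));
                int g = clamp((int) (yv - 0.344f * uv - 0.714f * vv));
                int b = clamp((int) (yv + 1.772f * uv));
                argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int c) {
        return c < 0 ? 0 : (c > 255 ? 255 : c);
    }
}
```

Doing this on the CPU for every frame (plus the Bitmap allocation and the RGBA copy inside `createRgbaImageFrame`) adds several full-frame passes per frame compared to the zero-copy ExternalOES camera path.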

and then passing it to the MediaPipe graph in the FrameProcessor class:

mediapipeGraph.addConsumablePacketToInputStream(
          videoInputStream, imagePacket, custom_timestamp);

This works, but performance degrades: where the camera stream processes 4-5 FPS, the YUV-to-RGB approach processes only 2-3 FPS. I want to find another approach where I can send the YUV stream directly to the MediaPipe graph. I did some research but could not find anything. Does anyone have an idea how to do that?

Afsar edrisy
  • Cool, I want to do the same. Do you have a repository I can follow? I'm interested in how exactly you provide the network video stream. – hannes ach Apr 10 '21 at 08:22

0 Answers