
I just purchased an Onwave IP camera; the main aim is to do image processing and monitoring on an Android tablet. When coding natively in C++, Python and Java with OpenCV built against FFMPEG, the `VideoCapture` class works flawlessly with the camera's URL: it shows the frames of the IP camera, which uses the RTSP protocol for streaming. For example, in C++:

    #include <opencv2/opencv.hpp>
    using namespace cv;

    Mat frame;
    VideoCapture cap;
    cap.open("rtsp://..."); // the camera URL
    while (true)
    {
        cap.read(frame);
        imshow("stream", frame);
        waitKey(1);
    }

The code works flawlessly: it gives me the frames from the camera stream on my LAN with little or no delay. The same goes for Python, and for Java once compiled.
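For reference, the Java (desktop) version looks roughly like this (a minimal sketch, assuming the OpenCV Java bindings built with FFMPEG support):

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.videoio.VideoCapture;

    public class RtspGrab {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
            VideoCapture cap = new VideoCapture();
            cap.open("rtsp://..."); // the camera URL
            Mat frame = new Mat();
            while (cap.read(frame)) {
                // process the frame here
            }
            cap.release();
        }
    }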

However, the problem comes with Android, as the OpenCV SDK for Android does not natively support FFmpeg. At first I did not want to compile OpenCV with FFmpeg for Android all over again, so instead I went for JavaCV, which ships with a prebuilt FFmpegFrameGrabber class and also preserves OpenCV's native source code. However, the FrameGrabber failed me when I tried to show the frames on a Bitmap: there was a huge rendering problem with packet loss and the frames came out all garbled (a sketch of roughly what I tried is below). I also tried the FrameRecorder class and recorded the file in the background, with the same result. Later I tried the Android MediaPlayer; attached after the sketch is my code using MediaPlayer.
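Here is roughly the shape of my FFmpegFrameGrabber attempt (a minimal sketch, not my exact code; the `rtsp_transport` option shown is a standard FFmpeg RTSP option that forces TCP instead of UDP, often suggested against packet loss):

    import android.graphics.Bitmap;

    import org.bytedeco.javacv.AndroidFrameConverter;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;

    public class GrabberThread extends Thread {
        @Override
        public void run() {
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("rtsp://192.168.1.7:554/onvif1");
            AndroidFrameConverter converter = new AndroidFrameConverter();
            try {
                grabber.setOption("rtsp_transport", "tcp"); // TCP to avoid UDP packet loss
                grabber.start();
                Frame frame;
                while (!isInterrupted() && (frame = grabber.grabImage()) != null) {
                    Bitmap bmp = converter.convert(frame);
                    // hand bmp over to the UI thread / motion detector here
                }
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                try {
                    grabber.release();
                } catch (Exception ignored) {
                }
            }
        }
    }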

    package com.example.rob.androidipcamera4;

    import android.app.Activity;
    import android.content.Context;
    import android.media.MediaPlayer;
    import android.net.Uri;
    import android.os.Bundle;
    import android.util.Base64;
    import android.util.Log;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.Window;
    import android.view.WindowManager;

    import java.util.HashMap;
    import java.util.Map;

    public class MainActivity extends Activity implements MediaPlayer.OnPreparedListener, SurfaceHolder.Callback {

        final static String RTSP_URL = "rtsp://192.168.1.7:554/onvif1";
        private static String USERNAME = "";
        private static String PASSWORD = "";

        private MediaPlayer mediaplayer;
        private SurfaceHolder surfaceholder;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Render the stream full-screen on a black background
            requestWindowFeature(Window.FEATURE_NO_TITLE);
            Window window = getWindow();
            window.setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                    WindowManager.LayoutParams.FLAG_FULLSCREEN);
            window.setBackgroundDrawableResource(android.R.color.black);
            setContentView(R.layout.activity_main);
            // Configure the SurfaceView the stream will be rendered on
            SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
            surfaceholder = surfaceView.getHolder();
            surfaceholder.addCallback(this);
            surfaceholder.setFixedSize(320, 320);
        }

        @Override
        public void onPrepared(MediaPlayer mp) {
            // Start playback as soon as the stream has been buffered
            mediaplayer.start();
        }

        @Override
        public void surfaceCreated(SurfaceHolder sh) {
            mediaplayer = new MediaPlayer();
            mediaplayer.setDisplay(surfaceholder);
            Context context = getApplicationContext();
            Map<String, String> headers = getRTSPHeaders();
            Uri source = Uri.parse(RTSP_URL);
            try {
                mediaplayer.setDataSource(context, source, headers);
                mediaplayer.setOnPreparedListener(this);
                mediaplayer.prepareAsync();
            } catch (Exception e) {
                Log.e("MainActivity", "Could not open the media source", e);
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder sh, int f, int w, int h) {}

        @Override
        public void surfaceDestroyed(SurfaceHolder sh) {
            mediaplayer.release();
        }

        private Map<String, String> getRTSPHeaders() {
            Map<String, String> headers = new HashMap<String, String>();
            String basicAuthValue = getBasicAuthValue(USERNAME, PASSWORD);
            // The HTTP header name must be spelled "Authorization"
            headers.put("Authorization", basicAuthValue);
            return headers;
        }

        private String getBasicAuthValue(String usr, String pwd) {
            String credentials = usr + ":" + pwd;
            // Basic auth uses the standard Base64 alphabet, not URL_SAFE;
            // NO_WRAP keeps the value on a single line
            int flags = Base64.NO_WRAP;
            byte[] bytes = credentials.getBytes();
            return "Basic " + Base64.encodeToString(bytes, flags); // note the space after "Basic"
        }
    }

The frames came in at good resolution and this approach even gives me the option of grabbing each frame and doing some motion detection on it, but there was about a 7-second lag in the live stream, which is not at all acceptable for monitoring.

So I think I am back to square one: compiling FFmpeg for Android. I just have doubts. Since OpenCV compiled with FFmpeg worked flawlessly in C++ and Python (on Linux), giving me about 0.2 s of lag, will compiling FFmpeg for Android give me the same result, and can I use the VideoCapture class on Android the same way I did in C++, without using the NDK? It would be really helpful if anyone has tried this on Android tablets and phones with an IP cam using the official SDK. Or is there another way, using MediaPlayer or JavaCV, that gives me little or no delay without any garbled frames?

rob

2 Answers


It will take some time to configure everything you want for FFmpeg, though I have not touched it for a while, so maybe something has changed. Better to start off by looking for a GitHub project that already has it integrated and build on that; there should be plenty of them lying around (find a recent one). Back when I was working on video calls, about three years ago, there was no appropriate Android media API; currently there are low-level callbacks, so you should be able to implement anything you want.
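I assume the "low-level callbacks" here refer to MediaCodec's asynchronous mode (API 21+). A minimal sketch, assuming an H.264 stream and that the RTSP/RTP depacketising is handled elsewhere:

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    public final class AsyncDecoder {
        // Creates an H.264 decoder that renders straight to the given Surface.
        public static MediaCodec start(Surface surface, int width, int height) throws IOException {
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.setCallback(new MediaCodec.Callback() {
                @Override
                public void onInputBufferAvailable(MediaCodec codec, int index) {
                    // Fill codec.getInputBuffer(index) with H.264 NAL units
                    // from the network, then call codec.queueInputBuffer(...).
                }

                @Override
                public void onOutputBufferAvailable(MediaCodec codec, int index,
                                                    MediaCodec.BufferInfo info) {
                    codec.releaseOutputBuffer(index, true); // render to the Surface
                }

                @Override
                public void onError(MediaCodec codec, MediaCodec.CodecException e) { }

                @Override
                public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) { }
            });
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            decoder.configure(format, surface, null, 0);
            decoder.start();
            return decoder;
        }
    }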

0

I actually solved the problem by compiling the OpenCV libraries for Android (armhf) from source, together with the FFmpeg libraries (libavcodec/libavformat, libswscale, etc.). I first captured the frames using FFmpeg's AVFrame, converted each frame to OpenCV's Mat in a separate pthread and applied all the image-processing algorithms there, finally calling into it from the main program through JNI.
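In case it helps anyone, the Java side of my JNI bridge looked roughly like this (a sketch; the native method and library names are placeholders, and the real work of the AVFrame-to-Mat conversion and the processing pthread lives in the native .so):

    public class NativePipeline {
        static {
            System.loadLibrary("opencv_java");     // OpenCV built for armhf
            System.loadLibrary("nativepipeline");  // placeholder name for the FFmpeg+OpenCV code
        }

        // Opens the RTSP stream and starts the native grab/convert thread.
        public static native long open(String rtspUrl);

        // Copies the latest processed frame into the cv::Mat whose native
        // address is matNativeAddr (obtained via Mat.getNativeObjAddr()).
        public static native boolean latestFrame(long handle, long matNativeAddr);

        public static native void close(long handle);
    }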

rob