
I want to show the camera preview and the lines I detect in real time (on an i.MX6 board running Android 4.4). I use the Android Camera API's addCallbackBuffer to get frames and a TextureView to show the camera preview.

JNI (C++): get the frame buffer -> convert the byte[] to a Mat -> then use OpenCV to do the image processing.

The JNI side can either return a Mat (via .getNativeObjAddr()) with the lines already drawn on it, or return two coordinates: the start and end points of the detected line.

The code below creates the Mat in JNI and is meant to return just the two coordinates. If that approach can't work, I will create the Mat in Java, pass .getNativeObjAddr() to JNI, and have JNI fill in the Mat. But then how do I show that Mat in the TextureView?

Question: how do I show the camera preview and the detected lines at the same time using a TextureView?
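One way to make the "return two coordinates" option concrete is to have the native method return the endpoints as a flat array {x1, y1, x2, y2} and unpack them on the Java side. The sketch below is plain Java (no Android dependencies); the class name and packing convention are assumptions for illustration, not part of the code in this question.

```java
// Sketch: a detected line handed back from JNI as a flat array {x1, y1, x2, y2}.
public class LineResult {
    public final float x1, y1, x2, y2;

    public LineResult(float[] packed) {
        if (packed == null || packed.length < 4)
            throw new IllegalArgumentException("expected {x1, y1, x2, y2}");
        x1 = packed[0]; y1 = packed[1];
        x2 = packed[2]; y2 = packed[3];
    }

    // Pack endpoints the same way the native side would before returning them
    public static float[] pack(float x1, float y1, float x2, float y2) {
        return new float[] { x1, y1, x2, y2 };
    }
}
```

On the native side this would correspond to filling a jfloatArray with the same four values, so the Java caller never has to touch a Mat at all.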

In MainActivity.java

public class MainActivity extends Activity implements TextureView.SurfaceTextureListener, PreviewCallback {
protected Camera mCamera;
private TextureView mTextureView;
public byte[][] cameraBuffer;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    this.requestWindowFeature(Window.FEATURE_NO_TITLE);
    setContentView(R.layout.activity_main);

    mTextureView = new TextureView(this);
    mTextureView.setSurfaceTextureListener(this);
    setContentView(mTextureView);
}
....
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
    mCamera = Camera.open(0);
    if (mCamera == null) {
        throw new RuntimeException("Default camera not available");
    }

    // The buffer must fit a *preview* frame, so use getPreviewSize(), not getPictureSize()
    Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
    int bufferSize = previewSize.width * previewSize.height
            * ImageFormat.getBitsPerPixel(mCamera.getParameters().getPreviewFormat()) / 8;
    cameraBuffer = new byte[3][bufferSize];

    thread = new getFrameThread(this, kCameraWidth, kCameraHeight);
    for (int i = 0; i < 3; ++i)
        mCamera.addCallbackBuffer(cameraBuffer[i]);

    mCamera.setPreviewCallbackWithBuffer(this);
    thread.start();

    try {
        mCamera.setPreviewTexture(surface);
        mCamera.startPreview();
    } catch (IOException ioe) {
        // Something bad happened
    }
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    camera.addCallbackBuffer(data); // return the buffer to the camera for reuse
    thread.refresh(data, countFrame); // hand the new frame to the JNI thread
}
....
static {
    System.loadLibrary("opencv_java"); //load opencv_java lib
    System.loadLibrary("testLib");
}
public native void getRawFrame(byte[] data, int width, int height); // matches the 3-argument JNI function

}
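The callback-buffer size computed in onSurfaceTextureAvailable() reduces to width * height * bitsPerPixel / 8, where the Android YUV preview formats (NV21, YV12) use 12 bits per pixel. A minimal, Android-free sketch of that arithmetic (the 480x720 size is the one getFrameThread passes to JNI):

```java
// Sketch: callback-buffer size for a YUV preview frame.
// NV21/YV12 are 12 bits per pixel, as ImageFormat.getBitsPerPixel reports.
public class BufferSize {
    static int previewBufferSize(int width, int height, int bitsPerPixel) {
        return width * height * bitsPerPixel / 8;
    }
}
```

For a 480x720 frame at 12 bpp this gives 518400 bytes, i.e. width * height * 3/2, which is also why the JNI code wraps the buffer as a Mat with height * 3/2 rows.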

In getFrameThread (thread to run JNI function 'getRawFrame' )

public class getFrameThread extends Thread{
  public byte[] data;
  .....
  @Override
  public void run() {
      .....
      mainActivity.getRawFrame(data, 480, 720);
      .....
  }

  public void refresh(byte[] data, int countFrame){
      this.data = data;
  }
}
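Note that refresh() above publishes the new frame reference without any synchronization, so the worker thread may never observe it, or may process the same frame twice. A minimal sketch of a safer handoff (class and method names are my own, not from the question):

```java
// Sketch: thread-safe frame handoff between onPreviewFrame() and a worker.
class FrameHandoff {
    private byte[] frame;
    private boolean fresh = false;

    public synchronized void refresh(byte[] data) {
        frame = data;
        fresh = true;
        notifyAll();                 // wake the worker blocked in take()
    }

    public synchronized byte[] take() throws InterruptedException {
        while (!fresh)
            wait();                  // block until a new frame arrives
        fresh = false;               // consume the frame exactly once
        return frame;
    }
}
```

The worker's run() loop would then call take() instead of reading the shared field directly, guaranteeing each frame is processed at most once.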

In JNI

#include <jni.h>
#include <stdio.h>
#include <stdlib.h>

#include <opencv2/opencv.hpp>
using namespace cv;

extern "C"
{
.....
JNIEXPORT void JNICALL Java_com_example_adas_MainActivity_getRawFrame(JNIEnv* env, jobject thisObject,
        jbyteArray data, jint width, jint height) {

    unsigned char* bufferIn = (unsigned char*) env->GetPrimitiveArrayCritical(data, NULL);

    // Wrap the YUV buffer without copying: height * 3/2 rows (Y plane + chroma planes)
    Mat yuvMat(height * 3 / 2, width, CV_8UC1, bufferIn);
    Mat grayMat;
    cvtColor(yuvMat, grayMat, cv::COLOR_YUV2GRAY_I420);

    // Do line detection
    .....
    // JNI_ABORT: the buffer was not modified, so no need to copy it back
    env->ReleasePrimitiveArrayCritical(data, bufferIn, JNI_ABORT);
}
}
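For I420 (and NV21) buffers, the COLOR_YUV2GRAY_I420 conversion amounts to taking the first width * height bytes of the buffer: that is the Y (luma) plane, which is exactly the grayscale image. A pure-Java equivalent, useful for checking what the JNI side sees:

```java
// Sketch: extract the grayscale (Y) plane from an I420/NV21 buffer.
// The first width*height bytes are luma; the remaining bytes are chroma.
public class YuvGray {
    static byte[] yuvToGray(byte[] yuv, int width, int height) {
        byte[] gray = new byte[width * height];
        System.arraycopy(yuv, 0, gray, 0, width * height); // copy luma plane only
        return gray;
    }
}
```

This is also why cvtColor here is cheap: no color math is needed, only a plane copy.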
— Julia Ding

1 Answer

You can overlay the live camera TextureView with an ImageView; it's OK for this image to be partially transparent. But this is not a good solution for augmented reality. The reason is that the content generated by OpenCV comes with a delay. The delay may not be very significant, but it can be very annoying to the user.
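A minimal layout sketch of that overlay idea (the view ids are placeholders):

```xml
<!-- Sketch: TextureView with a transparent ImageView stacked on top -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextureView
        android:id="@+id/camera_preview"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <ImageView
        android:id="@+id/overlay"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:background="@android:color/transparent" />
</FrameLayout>
```

The OpenCV result would be drawn into a Bitmap and set on the ImageView, while the TextureView keeps showing the live preview underneath.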

The result is much better if you hide the live camera stream and instead display the image (coming from onPreviewFrame()) after OpenCV processing. What you display will be a few milliseconds behind the live camera view, but the scene will be consistent with the detected lines. I have found a good example (with detailed programming instructions) in this blog post.

You still need the TextureView, because the Android Camera API requires a live preview in order to deliver onPreviewFrame() callbacks. To hide it, you can overlay it with other views, or manipulate the texture at the OpenGL level.

You can also use OpenGL to display the result of OpenCV: the performance of such a workflow is often much better than other ways of displaying the images.

You can also use OpenGL to draw the lines (the results of OpenCV processing) as geometry rather than a bitmap, and use the camera preview as a separate texture.

— Alex Cohn
  • Thanks, I'll try the 'OpenGL to display the result of OpenCV' method. – Julia Ding Sep 22 '17 at 01:41
  • Can you give me a hint or a function to use OpenGL to display the result of OpenCV? Thanks. – Julia Ding Sep 22 '17 at 01:53
  • I can recommend [grafika](https://github.com/google/grafika) as a good starting point to learn how OpenGL relates to the Android camera. You may also find this [article](http://www.anandmuralidhar.com/blog/android/corners/) useful. It illustrates my main point: the red dots are pinned to the correct pixels on the screen, even if the whole flow is a bit delayed. – Alex Cohn Sep 22 '17 at 12:30
  • Thanks again. The article explained it in great detail. – Julia Ding Sep 25 '17 at 01:56