You can use the standard Camera API to show the preview on a SurfaceView, grab the frames from the camera, encode them and send them to the network. You can find all the details on Stack Overflow; below are the main ideas.
Open the camera:
c = Camera.open(index);
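Camera.open(index) throws a RuntimeException when the camera is busy or the index is invalid, so in practice it is worth guarding it, for example:

Camera c = null;
try {
    c = Camera.open(index);   // index goes from 0 to Camera.getNumberOfCameras() - 1
} catch (RuntimeException e) {
    // camera is in use by another process or does not exist
}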
Set the camera parameters and attach the surface to show the preview on screen through a SurfaceHolder:
/*
init parameters first, like preview size, fps range, color format
*/
surfaceHolder = surfaceView.getHolder();
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
surfaceHolder.addCallback(this);
c.setPreviewDisplay(surfaceHolder);   // throws IOException, wrap in try/catch
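The parameter setup mentioned in the comment could look roughly like this (the concrete size and fps range here are only placeholders; query the supported values from Camera.Parameters first):

Camera.Parameters params = c.getParameters();
params.setPreviewSize(640, 480);             // pick one of getSupportedPreviewSizes()
params.setPreviewFpsRange(15000, 30000);     // pick one of getSupportedPreviewFpsRange()
params.setPreviewFormat(ImageFormat.NV21);   // NV21 is always supported for preview callbacks
c.setParameters(params);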
Receive frames from the camera:
private Camera.PreviewCallback previewListener = new Camera.PreviewCallback()
{
    @Override
    public void onPreviewFrame(byte[] data, Camera camera)
    {
        // put data into the encoder
    }
};
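The callback has to be registered and the preview started, otherwise onPreviewFrame() is never called. A minimal sketch using callback buffers (the buffer size math assumes the NV21 preview format, 12 bits per pixel):

Camera.Parameters params = c.getParameters();
Camera.Size size = params.getPreviewSize();
int bufferSize = size.width * size.height * 3 / 2;   // NV21 = 12 bits per pixel
c.addCallbackBuffer(new byte[bufferSize]);
c.setPreviewCallbackWithBuffer(previewListener);
c.startPreview();
// inside onPreviewFrame(), hand the buffer back once the frame has been copied/queued:
// camera.addCallbackBuffer(data);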
Put the data into the encoder as described in http://developer.android.com/reference/android/media/MediaCodec.html
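Roughly, the encoder input side could look like the sketch below (synchronous MediaCodec API of that time; the bitrate, frame rate and COLOR_FormatYUV420SemiPlanar here are assumptions, the actually supported color formats and a possible NV21 to NV12 conversion depend on the device):

// encoder setup, call once; color format support varies per device
private MediaCodec createEncoder(int width, int height) throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();
    return encoder;
}

// feed one preview frame (the data array from onPreviewFrame) to the encoder
private void queueFrame(MediaCodec encoder, byte[] data, long presentationTimeUs) {
    int inputBufferIndex = encoder.dequeueInputBuffer(10000);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = encoder.getInputBuffers()[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);
        encoder.queueInputBuffer(inputBufferIndex, 0, data.length, presentationTimeUs, 0);
    }
}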
Use ffmpeg for streaming: init the RTSP connection and send the encoded frames to the ffmpeg muxer/streamer.
/*
outputBufferIndex is the index of an output buffer the encoder has filled
*/
ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size];
outputBuffer.get(outData);
// something that sends the data to the ffmpeg muxer / network
writeVideoFrame(outData, outData.length, bufferInfo.presentationTimeUs);
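That snippet normally sits inside a drain loop like the one below (writeVideoFrame is just a placeholder for whatever hands the data to the ffmpeg muxer over JNI):

MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
ByteBuffer[] outputBuffers = encoder.getOutputBuffers();
int outputBufferIndex = encoder.dequeueOutputBuffer(bufferInfo, 10000);
while (outputBufferIndex >= 0) {
    ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
    outputBuffer.position(bufferInfo.offset);
    outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
    byte[] outData = new byte[bufferInfo.size];
    outputBuffer.get(outData);
    writeVideoFrame(outData, outData.length, bufferInfo.presentationTimeUs);
    encoder.releaseOutputBuffer(outputBufferIndex, false);
    outputBufferIndex = encoder.dequeueOutputBuffer(bufferInfo, 0);
}
// also handle the negative INFO_OUTPUT_BUFFERS_CHANGED / INFO_OUTPUT_FORMAT_CHANGED return values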
The one tricky thing is to build ffmpeg for Android and create a JNI layer to pass the data from the Java level down to the native level. I remember there were some prebuilt ffmpeg binaries for Android, and there are also instructions on how to build it yourself.
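The Java side of such a JNI bridge can be as small as the sketch below; the class name, library name and method signatures are only an example and have to match the native code you build against the ffmpeg libraries:

public class FfmpegStreamer {
    static {
        System.loadLibrary("ffmpegstreamer");   // your own libffmpegstreamer.so built with the NDK
    }

    // opens an RTSP output with libavformat on the native side, returns 0 on success
    public native int openStream(String url, int width, int height);

    // hands one encoded H.264 frame to the native muxer (a writeVideoFrame like the one called above)
    public native int writeVideoFrame(byte[] data, int length, long presentationTimeUs);

    public native void closeStream();
}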