
I receive a continuous stream of UDP data (raw images) from my device, decode it, and display each frame as a Bitmap on my Android phone so that it looks like video. The problem is that the frames are not shown continuously: the displayed frame only changes roughly every 2~3 seconds, but I want to show a new frame every 1/30 of a second.

Below is part of my code:

    public class ArDroneMain extends Activity {
        Bitmap image = null;
        public ImageView myImageView;
        AnimationDrawable animationDrawable;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // layout setup (setContentView) omitted from this snippet
            animationDrawable = new AnimationDrawable();
            myImageView = (ImageView) findViewById(R.id.imageView);
            thread.start();
        }

        Thread thread = new Thread(new Runnable() {
            Message message;
            String obj = "run";

            @Override
            public void run() {
                while (true) {
                    try {
                        Log.e("enter_video_thread", "enter_video_thread");
                        receiveVideoRawData();
                        Log.e("receiveVideoRawData", "receiveVideoRawData");
                        message = handler.obtainMessage(1, obj);
                        handler.sendMessage(message);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        });
        public Handler handler = new Handler() {
            @SuppressLint("NewApi")
            @Override
            public void handleMessage(Message msg) {
                super.handleMessage(msg);
                String MsgString = (String) msg.obj;
                if (MsgString.equals("run")) {
                    Drawable ardrone_Frame_Drawable = new BitmapDrawable(getResources(), image);
                    animationDrawable.addFrame(ardrone_Frame_Drawable, 100);
                    myImageView.setBackground(animationDrawable);
                }
            }
        };


        public void sendTriggerCommand() throws IOException {
            byte[] ip_bytes = new byte[] {(byte) 192, (byte) 168, (byte) 1, (byte) 1}; // drone address
            byte[] buf_snd = {0x01, 0x00, 0x00, 0x00}; // trigger buffer
            inet_addr = InetAddress.getByAddress(ip_bytes);
            DatagramPacket packet_snd = new DatagramPacket(buf_snd, buf_snd.length, inet_addr, ardrone_video_port);
            socket_video.send(packet_snd);
        }

        public void receiveVideoRawData() throws IOException {
            Log.e("enter_receiveVideoRawData()", "enter_receiveVideoRawData()");
            socket_video = new DatagramSocket();
            byte[] buf_rcv = new byte[153600];
            DatagramPacket rawData = new DatagramPacket(buf_rcv, buf_rcv.length);
            sendTriggerCommand();
            Log.e("sendTriggerCommand();", "sendTriggerCommand();");
            socket_video.receive(rawData);
            Log.e("socket_video.receive(rawData);", "socket_video.receive(rawData);");
            ReadRawFileImage readMyRawData = new ReadRawFileImage();
            image = readMyRawData.readUINT_RGBImage(buf_rcv);
        }
    }

DekangHu
1 Answer


I really doubt you can receive data that fast, especially a raw image.

What you could do to improve your program:

  • send and receive compressed data
  • remove the while(true) loop; it will never end. You should use a boolean flag to control your thread.
  • use an AsyncTask to receive the data and decode it in the background, then display it on the UI thread
  • use two threads with synchronisation: one that reads data from the network, and a second one that consumes this data and displays it. That would be a much better architecture than doing both tasks in the same loop (see the sketch after this list).
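Here is a minimal sketch of the two-thread producer/consumer idea, using a volatile boolean flag and a Handler bound to the main looper. The class name, the decodeFrame() helper (your ReadRawFileImage.readUINT_RGBImage would play that role) and the 153600-byte buffer size are assumptions taken from the question, not working code for your device:

    import android.graphics.Bitmap;
    import android.os.Handler;
    import android.os.Looper;
    import android.widget.ImageView;

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.util.Arrays;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class VideoStreamer {

        private volatile boolean running = false;            // boolean flag instead of while(true)
        private final BlockingQueue<byte[]> frames = new ArrayBlockingQueue<byte[]>(2);
        private final Handler uiHandler = new Handler(Looper.getMainLooper());

        public void start(final DatagramSocket socket, final ImageView view) {
            running = true;

            // Producer: network I/O only, no decoding.
            new Thread(new Runnable() {
                @Override public void run() {
                    byte[] buf = new byte[153600];
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    while (running) {
                        try {
                            packet.setLength(buf.length);     // reset length before each receive
                            socket.receive(packet);
                            frames.poll();                    // drop the oldest frame if the consumer is behind
                            frames.offer(Arrays.copyOf(buf, packet.getLength()));
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }
            }).start();

            // Consumer: decode off the UI thread, then post the result to the UI thread.
            new Thread(new Runnable() {
                @Override public void run() {
                    while (running) {
                        try {
                            byte[] raw = frames.take();
                            final Bitmap frame = decodeFrame(raw);   // hypothetical decoder
                            uiHandler.post(new Runnable() {
                                @Override public void run() {
                                    view.setImageBitmap(frame);      // cheaper than growing an AnimationDrawable
                                }
                            });
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                        }
                    }
                }
            }).start();
        }

        public void stop() {
            running = false;
        }

        private Bitmap decodeFrame(byte[] raw) {
            // Placeholder: plug in your own decoding, e.g. ReadRawFileImage.readUINT_RGBImage(raw).
            throw new UnsupportedOperationException("decoding not shown");
        }
    }

With this split the UI thread only ever runs setImageBitmap(), reception and decoding never block each other, and dropping stale frames keeps latency low if decoding cannot keep up with 30 fps.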
Snicolas