
I'm trying to build an audio recorder app for Android Wear. Right now, I'm able to capture audio on the watch, stream it to the phone, and save it to a file. However, the resulting audio file has gaps or cropped sections.

I found these answered questions related to my problem (link1, link2), but they didn't help me.


Here is my code:

First, on the watch side, I create the channel using the ChannelApi and successfully send the audio captured on the watch to the smartphone.

// Here are the variable values that I used.

// 44100 Hz is currently the only rate that is guaranteed to work on all devices,
// but other rates such as 22050, 16000, and 11025 may work on some devices.

private static final int RECORDER_SAMPLE_RATE = 44100; 
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
int BufferElements2Rec = 1024; // number of shorts read per AudioRecord.read() call
int BytesPerElement = 2;       // bytes per short (16-bit PCM)

//start the process of recording audio
private void startRecording() {

    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            RECORDER_SAMPLE_RATE, RECORDER_CHANNELS,
            RECORDER_AUDIO_ENCODING, BufferElements2Rec * BytesPerElement);

    recorder.startRecording();
    isRecording = true;
    recordingThread = new Thread(new Runnable() {
        public void run() {
            writeAudioDataToPhone();
        }
    }, "AudioRecorder Thread");
    recordingThread.start();
}
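
For reference, recording stops when the isRecording flag is cleared, which lets the streaming loop below exit and close the output stream. A minimal version of that stop logic (not part of my original snippet, so take it as a sketch) looks like:

//stop the process of recording audio (sketch; not shown in the snippet above)
private void stopRecording() {
    isRecording = false;        // lets the loop in writeAudioDataToPhone() finish
    if (recorder != null) {
        recorder.stop();        // stop capturing; the loop exits on its next iteration
        recorder.release();
        recorder = null;
    }
    recordingThread = null;
}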

private void writeAudioDataToPhone(){

    short[] sData = new short[BufferElements2Rec];
    ChannelApi.OpenChannelResult result = Wearable.ChannelApi.openChannel(googleClient, nodeId, "/mypath").await();
    channel = result.getChannel();

    Channel.GetOutputStreamResult getOutputStreamResult = channel.getOutputStream(googleClient).await();
    OutputStream outputStream = getOutputStreamResult.getOutputStream();

    while (isRecording) {
        // read PCM samples from the microphone into the short buffer
        // (note: the return value of read() is ignored here)
        recorder.read(sData, 0, BufferElements2Rec);
        try {
            // convert to little-endian bytes and stream them to the phone
            byte[] bData = short2byte(sData);
            outputStream.write(bData);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    try {
        outputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
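
The short2byte helper used above packs each 16-bit sample into two little-endian bytes. My actual helper isn't shown, but a typical implementation is:

// minimal sketch of the short-to-byte conversion (little-endian)
private byte[] short2byte(short[] sData) {
    byte[] bytes = new byte[sData.length * 2];
    for (int i = 0; i < sData.length; i++) {
        bytes[i * 2] = (byte) (sData[i] & 0x00FF); // low byte first
        bytes[i * 2 + 1] = (byte) (sData[i] >> 8); // then high byte
    }
    return bytes;
}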

Then, on the smartphone side, I receive the audio data from the channel and write it to a PCM file.

public void onChannelOpened(Channel channel) {
    if (channel.getPath().equals("/mypath")) {
        Channel.GetInputStreamResult getInputStreamResult = channel.getInputStream(mGoogleApiClient).await();
        inputStream = getInputStreamResult.getInputStream();

        writePCMToFile(inputStream);

        MainActivity.this.runOnUiThread(new Runnable() {
            public void run() {
                Toast.makeText(MainActivity.this, "Audio file received!", Toast.LENGTH_SHORT).show();
            }
        });
    }
}

public void writePCMToFile(InputStream inputStream) {
    OutputStream outputStream = null;

    try {
        // write the inputStream to a FileOutputStream
        outputStream = new FileOutputStream(new File("/sdcard/wearRecord.pcm"));

        int read = 0;
        byte[] bytes = new byte[1024];

        while ((read = inputStream.read(bytes)) != -1) {
            outputStream.write(bytes, 0, read);
        }

        System.out.println("Done writing PCM to file!");

    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (inputStream != null) {
            try {
                inputStream.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        if (outputStream != null) {
            try {
                // outputStream.flush();
                outputStream.close();
            } catch (Exception e) {
                e.printStackTrace();
            }

        }
    }
}

What am I doing wrong, and what would you suggest to get a perfectly gapless audio file on the smartphone? Thanks in advance.

Reslley Gabriel
  • Have you tried making a much larger buffer? 1024 bytes is very small, and it would be easy for a delay on the phone to cause you to miss some audio. – Wayne Piekarski Aug 06 '15 at 15:46
  • Yes. I also tried using ten times the value returned by the AudioRecord.getMinBufferSize function, but the result was the same. – Reslley Gabriel Aug 06 '15 at 16:47

1 Answer


I noticed in your code that you are reading everything into a short[] array and then converting it to a byte[] array for the Channel API to send. Your code also creates a new byte[] array on each iteration of the loop, which creates a lot of work for the garbage collector. In general, you want to avoid allocations inside loops.

I would allocate one byte[] array up front and let the AudioRecord class store the audio directly into that byte[] array (just make sure you allocate twice as many bytes as you had shorts), with code like this:

mAudioTemp = new byte[bufferSize];

int result;
// read() returns the number of bytes actually delivered; write exactly that many
while ((result = mAudioRecord.read(mAudioTemp, 0, mAudioTemp.length)) > 0) {
  try {
    mAudioStream.write(mAudioTemp, 0, result);
  } catch (IOException e) {
    Log.e(Const.TAG, "Write to audio channel failed: " + e);
  }
}
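
Note that this loop writes exactly result bytes, the number that read() actually returned, instead of assuming the buffer was filled. Writing the whole buffer after a short read, as your original loop does, would push stale data into the stream.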

I also tested this with a 1-second audio buffer, using code like this, and it worked nicely. I'm not sure what the minimum buffer size is before problems appear:

int bufferSize = Math.max(
  AudioRecord.getMinBufferSize(44100, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT),
  44100 * 2);  // at least one second of 16-bit mono audio at 44100 Hz
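
With that buffer size, the AudioRecord from your question would then be constructed like this (a sketch reusing your own parameters):

recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
        44100, AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, bufferSize);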
Wayne Piekarski