I'm trying to use Java Sound (with MP3SPI and VorbisSPI) to read audio data from WAV, MP3 and OGG files into a byte array, so that I can play it later from there. To do so, I use code like this:
public Sound loadSound(File file) throws UnsupportedAudioFileException, IOException {
    AudioInputStream baseStream = AudioSystem.getAudioInputStream(file);
    AudioFormat baseFormat = baseStream.getFormat();
    AudioFormat decodedFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
            baseFormat.getSampleRate(),
            16,                               // sample size in bits
            baseFormat.getChannels(),
            baseFormat.getChannels() * 2,     // frame size in bytes
            baseFormat.getSampleRate(),       // frame rate
            false);                           // little-endian
    AudioInputStream decodedStream = AudioSystem.getAudioInputStream(decodedFormat, baseStream);

    // Buffer audio data from the decoded stream
    ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
    byte[] buffer = new byte[4096];
    int nBytesRead = 0;
    while (nBytesRead != -1) {
        nBytesRead = decodedStream.read(buffer, 0, buffer.length);
        // Write read bytes to the out stream
        byteOut.write(buffer);
    }
    // Close streams and clean up
    decodedStream.close();
    baseStream.close();

    byte[] samples = byteOut.toByteArray();
    return new Sound(samples, decodedFormat);
}
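For completeness, the Sound class I use is essentially just this minimal sketch (the field and getter names are mine, matching the calls made in play below):

```java
import javax.sound.sampled.AudioFormat;

// Minimal immutable holder pairing decoded sample bytes with their format
public class Sound {
    private final byte[] samples;
    private final AudioFormat format;

    public Sound(byte[] samples, AudioFormat format) {
        this.samples = samples;
        this.format = format;
    }

    public byte[] getSamples() { return samples; }
    public AudioFormat getFormat() { return format; }
}
```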
The Sound class is basically a (byte[] samples, AudioFormat format) pair. With that done, I try to read back and play the byte array as follows:
public void play(Sound sound) {
    InputStream source = new ByteArrayInputStream(sound.getSamples());
    AudioFormat format = sound.getFormat();
    // Use a short, 100ms (1/10th sec) buffer for real-time changes to the sound stream
    int bufferSize = format.getFrameSize() * Math.round(format.getSampleRate() / 10);
    byte[] buffer = new byte[bufferSize];

    // Create a line to play to
    SourceDataLine line;
    try {
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
        line = (SourceDataLine) AudioSystem.getLine(info);
        line.open(format, bufferSize);
    } catch (LineUnavailableException ex) {
        ex.printStackTrace();
        return;
    }

    // Start the line
    line.start();

    // Copy data to the line
    try {
        int numBytesRead = 0;
        while (numBytesRead != -1) {
            numBytesRead = source.read(buffer, 0, buffer.length);
            if (numBytesRead != -1) {
                line.write(buffer, 0, numBytesRead);
            }
        }
    } catch (IOException ex) {
        ex.printStackTrace();
    } catch (ArrayIndexOutOfBoundsException ex2) {
        // ignored
    }

    // Wait until all data is played, then close the line
    line.drain();
    line.close();
}
It all works fine for WAV and MP3, but OGG playback is distorted and somewhat slowed down, as if the format were wrong. I'd love it if someone could point out why that happens.
Note: I more or less solved the problem by loading the whole audio file into a byte array with a FileInputStream, and then playing it with an AudioInputStream over a ByteArrayInputStream wrapping that array. This way, though, the byte array holds the encoded data instead of the decoded data: in practice, not a big difference to me, but I'm flat-out curious why the previous approach didn't work for OGG. Thanks to everyone who'll try to answer.
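For the record, the workaround described above looks roughly like this (a sketch assuming the whole file fits in memory; the class and method names are mine):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.UnsupportedAudioFileException;

public class EncodedSoundLoader {
    // Slurp the still-encoded file bytes into memory
    public static byte[] loadBytes(File file) throws IOException {
        try (FileInputStream in = new FileInputStream(file);
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[4096];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);   // write only the bytes actually read
            }
            return out.toByteArray();
        }
    }

    // Wrap the encoded bytes back into an AudioInputStream for playback;
    // ByteArrayInputStream supports mark/reset, which getAudioInputStream requires
    public static AudioInputStream openStream(byte[] encoded)
            throws UnsupportedAudioFileException, IOException {
        return AudioSystem.getAudioInputStream(new ByteArrayInputStream(encoded));
    }
}
```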