
I'm playing an audio stream from the internet in my app, and I would like to display a graphic equalizer. The library that I'm using for the streaming is FreeStreamer. For drawing the graphic equalizer I'm using ZLHistogramAudioPlot. These two libraries are the only ones that fit my needs. The problem is I can't get them to work together.

The ZLHistogramAudioPlot requires a buffer and a buffer size in order to update its view. Here is its update method:

- (void)updateBuffer:(float *)buffer withBufferSize:(UInt32)bufferSize {
    [self setSampleData:buffer length:bufferSize];
}

Unfortunately, the FreeStreamer library doesn't provide a method to read the audio output as it goes out towards the sound card. So, what I need is a way to read the audio output stream that's about to play through the speakers (not the byte stream from the internet, because that's received in chunks and then buffered, which means the histogram wouldn't be in real time).

I've discovered that AURemoteIO from Apple's Core Audio framework can be used to do this, but Apple's sample project is complex beyond my understanding, and there are few to no examples online of how to use AURemoteIO.

Is this the best way to achieve this? If so, any helpful info/links would be greatly appreciated.

damjandd
  • How are you playing the internet audio stream? – Rhythmic Fistman May 20 '15 at 12:36
  • I'm using [FreeStreamer](https://github.com/muhku/FreeStreamer) to play the stream via url. – damjandd May 20 '15 at 13:01
  • Can you get at freestreamers's internals? Or replace it with an AVPlayer? – Rhythmic Fistman May 20 '15 at 13:04
  • I can get at its internals, but they are very complicated, and I can't see what I would need to change in order for this to work. I could use AVPlayer instead, but I chose FreeStreamer because it's very optimized and uses only 1% CPU to run. – damjandd May 20 '15 at 13:12
  • AVPlayer, like FreeStreamer, uses AudioQueues, which allow hardware decoding of aac/mp3. Getting 1% CPU usage with a hardware decoder might not actually be that special. Anyhow, if you do choose AVPlayer, you can easily tap into its output stream and feed it into your visualizer. – Rhythmic Fistman May 20 '15 at 13:20
  • Could you point out how I would use AVPlayer to view the output stream? – damjandd May 20 '15 at 14:29

1 Answer


Here is a possible answer from looking through the FreeStreamer headers:

#import <Accelerate/Accelerate.h>
#import "TPCircularBuffer.h"

#define minForSpectrum 1024

@implementation MyClass {
    TPCircularBuffer SpectrumAnalyzerBuffer;
    ZLHistogramAudioPlot *histogram; // the visualizer view
}

- (void)dealloc {
    TPCircularBufferCleanup(&SpectrumAnalyzerBuffer);
}

- (instancetype)init {
    self = [super init];
    if (self) {
        TPCircularBufferInit(&SpectrumAnalyzerBuffer, 16384);
        self.audioController.activeStream.delegate = self;
    }
    return self;
}

- (void)audioStream:(FSAudioStream *)audioStream samplesAvailable:(const int16_t *)samples count:(NSUInteger)count {
    // Incoming data is 16-bit integer PCM; convert to float for the visualizer.
    Float32 *floatBuffer = malloc(sizeof(Float32) * count);
    vDSP_vflt16(samples, 1, floatBuffer, 1, count);

    // Scale to [-1, 1]
    static const float scale = 1.f / INT16_MAX;
    static const float zero = 0.f;
    vDSP_vsmsa(floatBuffer, 1, &scale, &zero, floatBuffer, 1, count);

    TPCircularBufferProduceBytes(&SpectrumAnalyzerBuffer, floatBuffer, (int32_t)(count * sizeof(Float32)));

    free(floatBuffer);
}

- (void)timerCallback:(NSTimer *)timer {
    int32_t availableBytes = 0;
    Float32 *spectrumBufferData = TPCircularBufferTail(&SpectrumAnalyzerBuffer, &availableBytes);

    // TPCircularBuffer counts bytes, not samples, so compare and consume in bytes.
    if (availableBytes >= (int32_t)(minForSpectrum * sizeof(Float32))) {
        // Note: the visualizer may want chunks of a fixed size if it's doing an FFT.
        [histogram updateBuffer:spectrumBufferData withBufferSize:minForSpectrum];
        TPCircularBufferConsume(&SpectrumAnalyzerBuffer, (int32_t)(minForSpectrum * sizeof(Float32)));
    }
}
@end
Daniel Broad
  • No, these are the samples as they are received from the stream, they come in chunks, every hundred milliseconds or so, not continuously, so the histogram comes out twitchy, it doesn't update in real time. – damjandd May 21 '15 at 10:12
  • There must be enough samples to make the audio in total, use TPCircularBuffer to put the samples in and then take them out again. (Assuming your graphic is expecting the same sample rate). The FreeStreamer docs say you shouldn't be blocking actually so a buffer is mandatory really. – Daniel Broad May 21 '15 at 11:08
  • Added the code for the buffer to the answer. Also had the thought that if your visualizer is like the one i use, then it will want fixed chunks data of a size it is configured for, you can't just throw all the data at it each time. – Daniel Broad May 21 '15 at 11:19
  • Yes, but it wants a continuous stream of those chunks. The `samplesAvailable` method gets called every few milliseconds, so it does not provide a continuous stream of chunks. I will try it with the buffer. Just curious, why did you abandon the AVPlayer method? I thought that was going to be easier. – damjandd May 21 '15 at 11:55
  • AVPlayer with an AudioProcessingTap would be easier; it's what I use in one of my apps. However, for m3u links the audio tap apparently doesn't work. – Daniel Broad May 21 '15 at 12:28
  • Who calls the timerCallback function? I don't see you starting a timer anywhere. – damjandd May 21 '15 at 13:54
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/78430/discussion-between-dorada-and-damjandd). – Daniel Broad May 21 '15 at 13:54
  • You do not want a "continuous stream" of audio. You want blocks of audio samples, either of the size that RemoteIO requests, or sized to synchronize with your visualization refresh rate (usually 30 or 60 Hz to match the CADisplaylink timer). You can get both from a dual-ported circular FIFO. – hotpaw2 May 23 '15 at 02:39
  • @dorada The code you wrote doesn't work for 32bit architecture devices, the TPCircularBuffer is empty in this case. Do you have any idea what the problem might be? – damjandd Jun 08 '15 at 09:50