
I have the following setup:

Core Audio Callback -> Circular Buffer -> libLAME -> Circular Buffer -> libShout

This all works fine until I start doing any intensive work, at which point the threads doing the encoding (libLAME) and the streaming (libShout) are throttled and things go downhill very quickly (basically the server receives audio every 2-5 seconds rather than every 200 ms or less as it should, and the stream becomes garbled).

I have tried the following:

  1. Encoding and streaming on a single thread
  2. Encoding and streaming on separate threads
  3. Setting the thread(s) to high priority
  4. Making the thread(s) realtime threads (which mostly fixes it, except that everything else is then throttled far too heavily)

I am pretty much using the stock-standard example code for libLAME and libShout, i.e. set them up for the audio format and server, then loop while data is available in the buffers.

What I don't understand is why the threads are being throttled when CPU usage maxes out at 80% and the threads aren't blocking while waiting on each other.

Any help with this would be greatly appreciated.

Anthony Myatt
  • Is your circular buffer a blocking class? Are you queueing audio data, or pointers to buffers containing audio data? – Martin James Feb 21 '14 at 14:20
  • @MartinJames, I am using TPCircularBuffer (https://github.com/michaeltyson/TPCircularBuffer). – Anthony Myatt Feb 21 '14 at 15:21
  • Have you tried sticking the CPU time analyser on your app? That might shed light on what's going on - it just sounds like something in there's taking too much time. Are you definitely following the usual realtime thread guidelines - don't allocate/dealloc, don't wait on locks, don't call Objective-C stuff? – Michael Tyson Feb 25 '14 at 03:48
  • Hi Michael, thanks for taking a look at this. I did have some Obj-C methods in my background thread that were doing the encoding and streaming. I am currently in the process of rewriting this into a C++ class. As for the CPU time analyser, it simply showed that as another background thread used more time, my encoding thread used less, even though 110% of my dual-core iPad's time was free. I'll write the C++ class and make the other changes suggested [here](http://stackoverflow.com/questions/21978305/synchronising-with-core-audio-thread/21999628?noredirect=1#21999628) and see how that goes. – Anthony Myatt Feb 25 '14 at 04:49

0 Answers