The AVAudioPlayerNode is an instance variable; the code is as follows:

class HXAudioEngine {
   private var audioEngine: AVAudioEngine = AVAudioEngine()

   var digitFileUrl: URL? {
      didSet {
          if let digitUrl = digitFileUrl {
              do {
                  digitAudioFile = try AVAudioFile(forReading: digitUrl)
              } catch {
                  print("Error loading digit file: \(error.localizedDescription)")
              }
          }
      }
   }
   var digitAudioFile: AVAudioFile? {
      didSet {
          if let digitFile = digitAudioFile {
              digitAudioFormat = digitFile.processingFormat
              digitFileBuffer = AVAudioPCMBuffer(pcmFormat: digitFile.processingFormat, frameCapacity: UInt32(digitFile.length))
          }
      }
   }
   var digitFileBuffer: AVAudioPCMBuffer? {
      didSet {
          if let buffer = digitFileBuffer {
              do {
                  try digitAudioFile?.read(into: buffer)
              } catch let error {
                  print("Error loading digit file into buffer: \(error.localizedDescription)")
              }
          }
      }
  }
  var digitAudioFormat: AVAudioFormat?
  var digitPlayer: AVAudioPlayerNode = AVAudioPlayerNode()

  func playDigit() {
      let file = "d0p1m00db"
      digitFileUrl = Bundle.main.url(forResource: file, withExtension: "wav")
      audioEngine.attach(digitPlayer)
      audioEngine.connect(digitPlayer, to: audioEngine.mainMixerNode, format: digitAudioFormat)
      audioEngine.prepare()

      do {
           try audioEngine.start()
      } catch let error {
           print(error)
      }

      guard let digitBuffer = digitFileBuffer else { return }
      guard let digitAudioFile = digitAudioFile else { return }

      digitPlayer.scheduleBuffer(digitBuffer, at: nil, options: .interrupts) {
             print("Done playing digit")
      }
      digitPlayer.play()
   }
}

For some reason, all that ends up being played is a click sound. The audio file is a one-second recording of a woman speaking a single digit.

It almost seems like the player speeds through the sound file and thereby obscures the speech. The sample rate is 44100 Hz. Reading on Stack Overflow, there are a lot of answers about the audio player being deallocated before the sound file finishes playing.

But I have tested with a separate app built by following the raywenderlich.com tutorial (https://www.raywenderlich.com/5154-avaudioengine-tutorial-for-ios-getting-started), which shows the length of the sound file and the playback progress. Everything there looks correct, yet even then the audio file doesn't play properly.

Is there an issue with my implementation, or is it the audio file? The audio file plays fine in iTunes.

Alexander
  • Interestingly, the audio file doesn't play properly when the output is the device speaker, but plays normally when headphones are plugged in. Does anyone know why this would be the case? – Alexander Dec 28 '18 at 08:27
  • Could it be that your phone is in silent mode? – Guy Feb 25 '20 at 14:25

1 Answer

If you need playback routed to a particular output device (the built-in speaker rather than headphones), you must set the audio session category accordingly:

try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: .defaultToSpeaker)
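
For context, here is a minimal sketch of how the session setup could be wired in before the engine starts. The `configureAudioSession` helper name is mine, not part of AVFoundation; it would be called once, e.g. at the top of `playDigit()` or during app setup:

```swift
import AVFoundation

// Sketch: configure the shared audio session before starting AVAudioEngine.
// With .playAndRecord, audio defaults to the quiet receiver earpiece on
// iPhone; the .defaultToSpeaker option routes it to the loudspeaker instead.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                mode: .default,
                                options: [.defaultToSpeaker])
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error.localizedDescription)")
    }
}
```

If the app only plays audio and never records, the simpler `.playback` category should also work and routes to the speaker by default.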