
I'm new to Swift and am trying to build a step sequencer as a project.

You tap on a pattern, and the code runs through each step at a specified BPM, playing the selected sound if the step is on. Fine so far. But I can't figure out how to have two different sounds play simultaneously, which can happen when there's more than one instrument.

I looked at async and concurrency a few times but I can't wrap my head around it.

If a sound and the same sound with an inverted phase are played at the same time, the waves cancel each other out, so nothing should be heard. This doesn't happen.

Here's a test I made:

import SwiftUI
import AVFoundation
import Foundation

struct ContentView: View {

    @State var playerA: AVAudioPlayer!
    @State var playerB: AVAudioPlayer!

    func playSound() {
        let urlA = Bundle.main.url(forResource: "sound", withExtension: "wav")
        let urlB = Bundle.main.url(forResource: "sound_inverted", withExtension: "wav")
        playerA = try! AVAudioPlayer(contentsOf: urlA!)
        playerB = try! AVAudioPlayer(contentsOf: urlB!)
        playerA.play()
        playerB.play()
    }

    var body: some View {
        HStack {
            Button(action: {
                playSound()
            }, label: {
                Text("TRY")
            })
        }
    }
}


struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

What happens is that when the button is pressed, both sounds play and are clearly audible.

I tried async, but I don't understand where to put what or how to structure the whole thing; it always throws up errors.

Konstantin
  • Does this answer your question? [Get AVAudioPlayer to play multiple sounds at a time](https://stackoverflow.com/questions/36865233/get-avaudioplayer-to-play-multiple-sounds-at-a-time) – timbre timbre Nov 18 '22 at 19:15

1 Answer


If a sound and the same sound with an inverted phase are played at the same time, the waves cancel each other out, so nothing should be heard.

This is only true if the two sounds are playing precisely in sync, within some error related to their frequencies. The higher the frequencies, the tighter the tolerances you need.
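
For a concrete sense of those tolerances: a 1 kHz tone repeats every 1 ms, so if the inverted copy starts just 0.5 ms late, the two signals are fully back in phase and you'd hear roughly a 6 dB boost instead of silence. At 10 kHz, that same flip happens at a 0.05 ms offset.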

The way the above code works, when you call the first play, it starts reading the file from disk. While it's doing that, it moves on to the second play and starts reading that one from disk. When each of them gets its data into memory (the timing of which is not at all certain, since hardware is involved, plus caching, plus other filesystem operations that might be going on), then they'll start playing. It would be shockingly lucky for them to start playing within the fraction of a millisecond required to cancel. (There are also several other issues I'm glossing over, such as memory allocations, which take varying amounts of time; acquiring and configuring the audio hardware, which only needs to be done once; and the uncertainty about exactly when each thread will be run on a core.)
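
A hedged aside: if you do stay with AVAudioPlayer, you can at least remove the loading variance just described by preloading both players with prepareToPlay() and scheduling both starts at the same point on the shared device clock with play(atTime:). Even then, as discussed below, this may still not be tight enough for cancellation. A minimal, untested sketch; the function name and the 0.2-second lead time are just illustrative:

import AVFoundation

// Preload two players, then schedule both to start at the same point on
// the audio device's clock, so loading time no longer affects the offset.
func playTogether(_ urlA: URL, _ urlB: URL) throws -> (AVAudioPlayer, AVAudioPlayer) {
    let playerA = try AVAudioPlayer(contentsOf: urlA)
    let playerB = try AVAudioPlayer(contentsOf: urlB)

    // Pull the audio data into memory and configure the hardware up front,
    // so neither cost lands between the two play calls.
    playerA.prepareToPlay()
    playerB.prepareToPlay()

    // Both players share the same device clock; schedule both starts a
    // fixed 0.2 s in the future.
    let startTime = playerA.deviceCurrentTime + 0.2
    playerA.play(atTime: startTime)
    playerB.play(atTime: startTime)

    return (playerA, playerB) // the caller must keep these alive while they play
}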

Tools like async/await or DispatchQueues do not make the kind of timing promises you need to do this kind of synchronization. It's not what they're for. This is not a real-time operating system.

To get tight synchronization, you cannot use a high-level API like AVAudioPlayer(contentsOf:). You'll need to manage more things yourself. The tool you want to use is an AVComposition, which will allow you to play two tracks in sync. It may still not be precise enough to get cancellation, but it has a pretty reasonable chance if done carefully.

See Composite Assets for the tools you'll want to explore this.
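
A minimal, untested sketch of that direction, reusing the two file names from the question and loading tracks synchronously for brevity:

import AVFoundation

// Build one composition with each file on its own audio track. Both tracks
// share the composition's timeline, so one player starts them together.
func makeSyncedPlayer() throws -> AVPlayer {
    let composition = AVMutableComposition()

    for name in ["sound", "sound_inverted"] {
        guard let url = Bundle.main.url(forResource: name, withExtension: "wav") else {
            throw CocoaError(.fileNoSuchFile)
        }
        let asset = AVURLAsset(url: url)
        guard let sourceTrack = asset.tracks(withMediaType: .audio).first,
              let track = composition.addMutableTrack(withMediaType: .audio,
                                                      preferredTrackID: kCMPersistentTrackID_Invalid)
        else {
            throw CocoaError(.fileReadCorruptFile)
        }

        // Insert the whole file at t = 0 on its own track.
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                  of: sourceTrack,
                                  at: .zero)
    }

    return AVPlayer(playerItem: AVPlayerItem(asset: composition))
}

Calling play() on the returned player (while keeping a strong reference to it) then starts both tracks from the same timeline.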

Rob Napier