
I've been trying to use AVFoundation's AVFAudio framework to record audio, play it back, and modify the audio data according to the user's selected presets. I've also been trying to work out how to save files locally to the user's device; however, after reading Apple's documentation on AVFAudio, I can hardly make sense of which steps to take when creating these files. I've been following along with https://www.raywenderlich.com/21868250-audio-with-avfoundation/lessons/1 and managed to set up some functions here.

Here is how I have set up saving the audio, but as you can see, this only saves it to a temporary directory. I am wondering how I can save the audio file locally to the user's device instead.

// MARK: Saving audio
    var urlForVocals: URL {
        let fileManager = FileManager.default
        let tempDirectory = fileManager.temporaryDirectory
        let filePath = "TempVocalRecording.caf"
        return tempDirectory.appendingPathComponent(filePath)
    }
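For contrast, here is a minimal sketch (keeping the same property name) that points at the app's Documents directory instead, which persists across launches; the answer below takes the same approach:

// MARK: Saving audio persistently (sketch)
    var urlForVocals: URL {
        let fileManager = FileManager.default
        // the Documents directory survives relaunches, unlike temporaryDirectory
        let documentsDirectory = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0]
        return documentsDirectory.appendingPathComponent("VocalRecording.caf")
    }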
    

I am generally confused by the AVFoundation framework when using AVFAudio, and the documentation (https://developer.apple.com/documentation/avfaudio) does not go into the specifics of how to implement each method. For example, the docs state that to create an audio player we need init(contentsOf:), but they do not explain what the URL is or why we are using it. Can anyone help me understand what steps to take from here? I feel like I'm running around in circles trying to understand this framework and the Apple documentation.
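To make the init(contentsOf:) part concrete: the URL is simply the location on disk of the audio file the player should load. A minimal sketch, where "recording.caf" is a hypothetical file name:

import AVFoundation

// point at an audio file somewhere in the app's sandbox
let fileURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("recording.caf")

do {
    // the player reads its audio data from the file at fileURL
    let player = try AVAudioPlayer(contentsOf: fileURL)
    player.play()
} catch {
    print("Could not load audio file:", error)
}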

Richard
  • "We need to init(contentsOf:url), but does not go into what the url is and why we are using it" -- the URL is the file that you want to play. Otherwise, the player doesn't know where you play audio data from. – jnpdx Dec 28 '21 at 21:10
  • @jnpdx I do not understand, How can I play a file, when it hasn't been recorded yet? Are we creating that file itself? Can you go a bit more in depth? – Richard Dec 28 '21 at 21:34
  • I'm specifically referring to the second part of your question where you ask about creating a player. You don't do that until you've recorded a file. If I have time later I can write up an answer including the recording part. – jnpdx Dec 28 '21 at 21:35
  • @jnpdx yes please, id appreciate that a lot bec i've spent the last two weeks trying to understand the apple documentation and how to use this SDK. I understand the flow of the SDK, but I don't understand how to use it – Richard Dec 28 '21 at 21:43

1 Answer


Here's a relatively bare-bones version. See inline comments for what is happening.

import SwiftUI
import AVFoundation

class AudioManager : ObservableObject {
    @Published var canRecord = false
    @Published var isRecording = false
    @Published var audioFileURL : URL?
    private var audioPlayer : AVAudioPlayer?
    private var audioRecorder : AVAudioRecorder?
    
    init() {
        //ask for record permission. IMPORTANT: Make sure you've set `NSMicrophoneUsageDescription` in your Info.plist
        AVAudioSession.sharedInstance().requestRecordPermission() { [unowned self] allowed in
            DispatchQueue.main.async {
                if allowed {
                    self.canRecord = true
                } else {
                    self.canRecord = false
                }
            }
        }
    }

    //the URL where the recording file will be stored
    private var recordingURL : URL {
        getDocumentsDirectory().appendingPathComponent("recording.caf")
    }

    private func getDocumentsDirectory() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0]
    }
    
    func recordFile() {
        do {
            //set the audio session so we can record
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default)
            try AVAudioSession.sharedInstance().setActive(true)
            
        } catch {
            print(error)
            self.canRecord = false
            fatalError()
        }
        //this describes the format that the file will be recorded in
        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 12000,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        do {
            //create the recorder, pointing towards the URL from above
            audioRecorder = try AVAudioRecorder(url: recordingURL,
                                                settings: settings)
            audioRecorder?.record() //start the recording
            isRecording = true
        } catch {
            print(error)
            isRecording = false
        }
    }
    
    func stopRecording() {
        audioRecorder?.stop()
        isRecording = false
        audioFileURL = recordingURL
    }
    
    func playRecordedFile() {
        guard let audioFileURL = audioFileURL else {
            return
        }
        do {
            //create a player, again pointing towards the same URL
            self.audioPlayer = try AVAudioPlayer(contentsOf: audioFileURL)
            self.audioPlayer?.play()
        } catch {
            print(error)
        }
    }
}

struct ContentView: View {
    
    @StateObject private var audioManager = AudioManager()
    
    var body: some View {
        VStack {
            if !audioManager.isRecording && audioManager.canRecord {
                Button("Record") {
                    audioManager.recordFile()
                }
            } else {
                Button("Stop") {
                    audioManager.stopRecording()
                }
            }
            
            if audioManager.audioFileURL != nil && !audioManager.isRecording {
                Button("Play") {
                    audioManager.playRecordedFile()
                }
            }
        }
    }
}
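One refinement worth considering (a sketch of my own, not part of the answer's code above): deactivating the shared audio session when recording stops, so that audio from other apps can resume:

func stopRecording() {
    audioRecorder?.stop()
    isRecording = false
    audioFileURL = recordingURL
    // hand the session back to the system; safe to skip if playback follows immediately
    try? AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
}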
jnpdx
  • Thank you for sharing this. A lot of the online examples I'm seeing use SwiftUI, and I am wondering if there is a reason for this? The project I've created is in UIKit, but I'm wondering if it's better to use SwiftUI for this particular project, and why? – Richard Dec 28 '21 at 23:12
  • UIKit vs SwiftUI is inconsequential for all of the code related to recording and playing audio -- it only has to do with the UI presentation. I chose to use SwiftUI here because I can create a minimal example with many fewer lines of UI code. If this answered your question, please mark it as accepted using the green checkmark. – jnpdx Dec 28 '21 at 23:19
  • Thank you. I am still lost on the question of saving the files locally to the device. If you look at my original code, it says that the file path is a tempDirectory; I do not believe a temporary directory would let users keep the file on their device. The file path also states that the format is .caf, and I would like to save the files in MP3 format. How can I do this? – Richard Dec 28 '21 at 23:31
  • What does "save locally to the device" mean? I'm using the documents directory -- that *is* on the device. Converting files to MP3 should be a new question (also, you likely won't end up with MP3 without a lot of trouble, but [mp4 is trivial](https://stackoverflow.com/questions/45974974/how-to-convert-caf-audio-file-into-mp4-file-in-swift); see the sketch after these comments) -- questions on SO should deal with *one* issue at a time. – jnpdx Dec 28 '21 at 23:33
  • See this for making your documents directory visible in the Files app, if that's what you're getting at: https://nemecek.be/blog/57/making-files-from-your-app-available-in-the-ios-files-app – jnpdx Dec 28 '21 at 23:36
  • So basically these two lines of code -- let tempDirectory = fileManager.temporaryDirectory; let filePath = "TempVocalRecording.caf" -- will save the recorded audio files into a temporary directory and will save them as .caf. I want users to be able to save whichever audio file they record onto their iPhones or into the app. That's what I mean by locally, and rather than .caf, it should be .mp3; mp4 is a video format. – Richard Dec 29 '21 at 13:52
  • "Into their iPhones or onto the app" is still unclear to me, unfortunately. Like I said, the conversion step should be searchable depending on your format (although it won't be MP3, as iOS doesn't come with the encoding codecs to store it). – jnpdx Dec 29 '21 at 15:15
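Since the conversion question recurs in the comments above, here is a minimal sketch of the .caf-to-.m4a export that the comment above links to, using AVAssetExportSession; the function name and URLs are hypothetical:

import AVFoundation

// exports a recorded audio file (e.g. AAC in a .caf container) to an .m4a container
func exportToM4A(from sourceURL: URL, to destinationURL: URL,
                 completion: @escaping (Error?) -> Void) {
    let asset = AVURLAsset(url: sourceURL)
    guard let exporter = AVAssetExportSession(asset: asset,
                                              presetName: AVAssetExportPresetAppleM4A) else {
        completion(NSError(domain: "ExportToM4A", code: -1))
        return
    }
    exporter.outputFileType = .m4a
    exporter.outputURL = destinationURL
    exporter.exportAsynchronously {
        completion(exporter.error) // nil on success
    }
}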