
I am trying to convert speech to text and display it in a UILabel using the Speech framework. The user has already granted permission to use the microphone and speech recognition.
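For reference, a minimal sketch of how that authorization is typically requested (the method name and log messages are illustrative, not my exact code):

#import <Speech/Speech.h>
#import <AVFoundation/AVFoundation.h>

- (void)requestPermissions {
    // Ask for speech recognition authorization.
    [SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus status) {
        NSLog(@"Speech recognition authorization status: %ld", (long)status);
    }];

    // Ask for microphone access.
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        NSLog(@"Microphone permission granted: %d", granted);
    }];
}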

Here's my code

- (void)startRecording {
    if (_recognitionTask != nil) {
        [_recognitionTask cancel];
        _recognitionTask = nil;
    }

    NSError *error;

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:&error];
    [audioSession setMode:AVAudioSessionModeMeasurement error:&error];
    [audioSession setActive:YES error:&error];

    _recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
    _recognitionTask = [[SFSpeechRecognitionTask alloc] init];

    AVAudioInputNode *inputNode = [_audioEngine inputNode];

    _recognitionRequest.shouldReportPartialResults = YES;

    _recognitionTask = [_speechRecognizer recognitionTaskWithRequest:_recognitionRequest resultHandler:^(SFSpeechRecognitionResult *result, NSError *error) {
        BOOL isFinal = NO;

        if (result != nil) {
            _textLabel.text = [[result bestTranscription] formattedString];
            isFinal = result.isFinal;
        }

        NSLog(@"%@", error);

        if (error != nil || isFinal) {
            _textLabel.text = [NSString stringWithFormat:@"%@", error];
            [inputNode removeTapOnBus:0];
            [_audioEngine stop];
            _recognitionRequest = nil;
            _recognitionTask = nil;
        }
    }];

    [_audioEngine prepare];
    [_audioEngine startAndReturnError:nil];
}

While debugging, execution does enter the recognitionTaskWithRequest result handler, but result is nil and I get an error like this:

Error Domain=kAFAssistantErrorDomain Code=203 "Corrupt" UserInfo={NSUnderlyingError=0x14651450 {Error Domain=SiriSpeechErrorDomain Code=102 "(null)"}, NSLocalizedDescription=Corrupt}


1 Answer


The problem is that this code never appends the captured audio buffers to the recognition request, so the recognizer receives no audio. To fix it, add the code below before the line [_audioEngine prepare];

[_audioEngine.inputNode installTapOnBus:0 bufferSize:1024 format:[inputNode inputFormatForBus:0] block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    //NSLog(@"Tapped");
    [self.recognitionRequest appendAudioPCMBuffer:buffer];
}];

This solved my issue. Hope it helps you as well.
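For context, here is a sketch of where the tap sits relative to preparing and starting the engine (it assumes the same ivars/properties as in the question; it is a sketch, not your exact code):

AVAudioInputNode *inputNode = [_audioEngine inputNode];
AVAudioFormat *recordingFormat = [inputNode outputFormatForBus:0];

// 1. Install the tap so microphone buffers are fed to the recognition request.
[inputNode installTapOnBus:0
                bufferSize:1024
                    format:recordingFormat
                     block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    [self.recognitionRequest appendAudioPCMBuffer:buffer];
}];

// 2. Only then prepare and start the engine.
[_audioEngine prepare];

NSError *startError = nil;
if (![_audioEngine startAndReturnError:&startError]) {
    NSLog(@"Audio engine failed to start: %@", startError);
}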

  • thanks. But I already have this in my code, and I still get the problem randomly. The problem for me is I also get the issue '[avas] ERROR: AVAudioSession.mm:1049: -[AVAudioSession setActive:withOptions:error:]: Deactivating an audio session that has running I/O. All I/O should be stopped or paused prior to deactivating the audio session.' and am not sure how to debug the problem. Can you pls help? – csharpnewbie Apr 27 '17 at 16:04
  • Do you have this code before activating the session.. if (_audioEngine.isRunning) { [_audioEngine stop]; [_recognitionRequest endAudio]; } – Ravi Kiran Apr 28 '17 at 04:49
  • yes. I do have that piece of code. I added logger in that and it does not show any audio engine running as well. – csharpnewbie Apr 28 '17 at 12:43
  • @csharpnewbie any chance do you remember if you solved this. – BooRanger Feb 13 '19 at 13:22
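Regarding the deactivation error discussed in the comments ("Deactivating an audio session that has running I/O"), one common cause is deactivating the session while the engine or the recognition request is still live. A sketch of a teardown that stops all I/O first (assuming the same ivars as above; stopRecording is an illustrative name):

- (void)stopRecording {
    if (_audioEngine.isRunning) {
        // Remove the tap and stop the engine before ending the request.
        [[_audioEngine inputNode] removeTapOnBus:0];
        [_audioEngine stop];
        [_recognitionRequest endAudio];
    }

    // Deactivate the session only after all I/O has stopped.
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setActive:NO
                                   withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                                         error:&error];
    if (error) {
        NSLog(@"Failed to deactivate audio session: %@", error);
    }
}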