Hello, I'm trying to convert an integer that represents a frame rate into a CMTime so I can set minFrameDuration on an AVCaptureScreenInput.
So far I have:
int numberOfFrames = [self convertFrameRateStringToInt:[parameters objectForKey:@"frameRate"]];
Float64 frameDuration = 1.0 / numberOfFrames; // e.g. 15 fps -> 0.066667 s per frame
NSLog(@"numberOfFrames %i - frameDuration %f", numberOfFrames, frameDuration);
NSLog(@"timeshow %f", CMTimeGetSeconds(CMTimeMakeWithSeconds(frameDuration, 1)));
return CMTimeMakeWithSeconds(frameDuration, 1);
If numberOfFrames is set to 15 frames per second, then frameDuration is 0.066667, so the arguments to CMTimeMakeWithSeconds are 0.066667 and 1. I expected CMTimeGetSeconds to give back 0.066667 / 1 = 0.066667, but instead it returns 0.
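To double-check, I dumped the fields of the CMTime that CMTimeMakeWithSeconds returns. This is a quick standalone check, assuming CoreMedia is imported and 15 fps as above:

#import <CoreMedia/CoreMedia.h>

CMTime t = CMTimeMakeWithSeconds(1.0 / 15.0, 1); // preferredTimescale of 1
NSLog(@"value %lld, timescale %d", t.value, t.timescale); // value 0, timescale 1
NSLog(@"seconds %f", CMTimeGetSeconds(t));                // 0.000000
CMTimeShow(t);                                            // prints something like {0/1 = 0.000}

So the value field really is 0, not just the seconds conversion.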
Can someone figure out what I'm doing wrong and explain why? Thanks.
Edit 1: I got it to work by changing the function like this:
int numberOfFrames = [self convertFrameRateStringToInt:string];
// CMTimeMake(value, timescale) builds the rational value/timescale,
// so 1/15 is exactly one frame at 15 fps.
NSLog(@"timeshow %f", CMTimeGetSeconds(CMTimeMake(1, (int32_t)numberOfFrames)));
return CMTimeMake(1, (int32_t)numberOfFrames);
I still can't understand why the original approach didn't work.
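One thing I did notice while experimenting: the original approach does return the expected value if I pass a larger preferredTimescale. Here is a standalone check (not my actual function), using 600, a timescale commonly recommended for video:

CMTime t = CMTimeMakeWithSeconds(1.0 / 15.0, 600);
NSLog(@"timeshow %f", CMTimeGetSeconds(t)); // 0.066667
CMTimeShow(t);                              // prints something like {40/600 = 0.067}

So the second argument clearly matters, but I don't see why a timescale of 1 collapses the value to 0.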