I have a video recording class that I wrote in Objective-C. It works in a native app: I can view every frame from the recording when I run it there. The class creates and configures an AVCaptureSession; all you need to do is pass it a UIView and it sets up the preview view for you.
I added the same class to a NativeScript plugin I am working on. The preview view gets created, but when I start recording, after about one second every frame is dropped. I am running exactly the same class and calling the same functions as in the native iOS app, yet under NativeScript the class behaves differently and drops all the frames.
My hunch is that this has something to do with ARC on iOS: maybe something is being deallocated, which breaks the class under NativeScript but not in a regular native iOS app. What could be causing this, and how would I fix it?
This is the code in the NativeScript plugin that calls the Objective-C class. It is supposed to always invoke the processFrame block, but under NativeScript it invokes the processDroppedFrame block instead. It works properly in plain Objective-C.
this._videoRecorder = VideoRecorder.new();
this._videoRecorder.createCameraPreviewView(this.nativeView);

// Called for every frame that is captured successfully.
this._videoRecorder.processFrame = () => {
    console.log("Frames");
};

// Called whenever a frame is dropped.
this._videoRecorder.processDroppedFrame = () => {
    console.log("Dropping Frames");
};

this._videoRecorder.startRunningSession();
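
In case the problem really is the lifetime of the callbacks, one workaround I have been considering is to keep strong references to the recorder and to both callback functions on the owning view class, so that neither the JavaScript GC nor ARC can release them while the session is running. This is just a sketch; the _onFrame, _onDroppedFrame, and startRecorder names are mine, and it assumes the code lives in the same view class as the snippet above:

// Hypothetical sketch, inside the same view class as the code above.
// Strong instance fields pin the recorder and both callbacks so they
// cannot be collected while the capture session runs.
private _videoRecorder: VideoRecorder;
private _onFrame: () => void;
private _onDroppedFrame: () => void;

private startRecorder(): void {
    this._videoRecorder = VideoRecorder.new();
    this._videoRecorder.createCameraPreviewView(this.nativeView);

    // Store the functions on `this` before handing them to the native
    // side, so they live as long as the view does, not just this call.
    this._onFrame = () => console.log("Frames");
    this._onDroppedFrame = () => console.log("Dropping Frames");

    this._videoRecorder.processFrame = this._onFrame;
    this._videoRecorder.processDroppedFrame = this._onDroppedFrame;
    this._videoRecorder.startRunningSession();
}

The idea is simply to pin the JS functions (and therefore whatever blocks NativeScript creates from them) for the lifetime of the view rather than the lifetime of a single call. I am not sure whether the callbacks are actually being collected, so pointers on how to verify that would also help.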