I am trying to use the camera plugin for two purposes in my app:
- Allow the user to record a video
- Process frames with some AI models while the recording is in progress

I tried using the camera plugin with two controllers, both backed by the same camera:
```dart
CameraController _cameraRecordingController;
CameraController _cameraImageStreamController;

// ...
// Initialize the controllers.
if (_cameraRecordingController == null ||
    !_cameraRecordingController.value.isInitialized) {
  _cameraRecordingController = CameraController(
    CameraBloc.cameras[0],
    ResolutionPreset.high,
    enableAudio: false,
  );
  await _cameraRecordingController.initialize();
  await _cameraRecordingController.prepareForVideoRecording();
}
if (_cameraImageStreamController == null ||
    !_cameraImageStreamController.value.isInitialized) {
  _cameraImageStreamController = CameraController(
    CameraBloc.cameras[0],
    ResolutionPreset.medium,
    enableAudio: false,
  );
  await _cameraImageStreamController.initialize();
}
// ...

// Start recording and the image stream, one after the other.
await _cameraRecordingController.startVideoRecording(event.filePath);
await _cameraImageStreamController.startImageStream((img) {
  // img is a CameraImage; each frame is handed to the AI models here
  // (sketched below the snippet).
});
// ...

// Finally, stop the controllers.
if (_cameraImageStreamController.value.isStreamingImages) {
  await _cameraImageStreamController.stopImageStream();
}
if (_cameraRecordingController.value.isRecordingVideo) {
  await _cameraRecordingController.stopVideoRecording();
}
```
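For context, the per-frame work inside the image stream callback is roughly this shape (a minimal sketch with placeholder names, not my real model code): it flattens the image planes into a single byte buffer that an on-device model runner can consume.

```dart
import 'dart:typed_data';

import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';

// Hypothetical helper (placeholder name): flatten the planes of a
// CameraImage into one contiguous byte buffer for an on-device model.
Uint8List bytesFromCameraImage(CameraImage img) {
  final WriteBuffer allBytes = WriteBuffer();
  for (final Plane plane in img.planes) {
    allBytes.putUint8List(plane.bytes);
  }
  return allBytes.done().buffer.asUint8List();
}
```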
The two controllers handle recording and the image stream respectively. Is this the right way to use the plugin for my use case? I was seeing some odd behavior with this setup, e.g. the CameraPreview often froze on a single frame. Is there a better way of doing this (even if it means not using this plugin)?
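To clarify what I am hoping for: ideally a single controller would do both jobs, along the lines of the sketch below. I believe newer versions of the camera plugin (0.9.8 and later, if I am reading the changelog correctly) added an `onAvailable` callback to `startVideoRecording` for exactly this, but I have not verified it myself, so treat the API in this sketch as an assumption.

```dart
import 'package:camera/camera.dart';

// Assumed API: camera >= 0.9.8, where startVideoRecording takes an
// optional onAvailable callback and stopVideoRecording returns an XFile.
Future<void> recordAndProcess(CameraDescription camera) async {
  final controller = CameraController(
    camera,
    ResolutionPreset.high,
    enableAudio: false,
  );
  await controller.initialize();

  // One controller records and streams frames at the same time.
  await controller.startVideoRecording(
    onAvailable: (CameraImage img) {
      // Hand each frame to the AI models here.
    },
  );

  // ... record for a while ...

  final XFile video = await controller.stopVideoRecording();
  print('Recording saved to ${video.path}');
  await controller.dispose();
}
```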