I'm trying to implement the following flow:
- Recording a video (it's automatically saved to the app's sandboxed storage)
- Moving the video file from the app's storage to the gallery (in this step we also save the path to the newly created video in the gallery)
- Displaying the video from the gallery
Steps 1 and 3 are written in Flutter; step 2 is implemented natively in Swift.
For some reason this feature sometimes works and sometimes doesn't. When it fails at step 3, I receive the error PlatformException(VideoError, Failed to load video: The requested URL was not found on this server., null, null).
I've also tried taking the path returned by step 2 and constructing a File from it directly with File(path), but then I also get an error that the file was not found (OS Error: No such file or directory, errno = 2).
I suspect that on iOS this is caused by the App Sandbox. Is that correct? But if it is, why does it sometimes work and sometimes not? Maybe there is something in my code that I could fix?
For step 1 I use the camera package with essentially the same code as in its example. Then, after receiving an XFile, I run native iOS code to save the video to the gallery and get its path:
import UIKit
import Flutter
import Photos

@UIApplicationMain
@objc class AppDelegate: FlutterAppDelegate {
  override func application(
    _ application: UIApplication,
    didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
  ) -> Bool {
    let controller: FlutterViewController = window?.rootViewController as! FlutterViewController
    let videoSavingChannel = FlutterMethodChannel(
      name: "app.package.name/camera/video_to_gallery",
      binaryMessenger: controller.binaryMessenger)
    videoSavingChannel.setMethodCallHandler({
      [weak self] (call: FlutterMethodCall, result: @escaping FlutterResult) -> Void in
      // This method is invoked on the UI thread.
      guard call.method == "saveVideoToGallery" else {
        result(FlutterMethodNotImplemented)
        return
      }
      // The Dart side passes the XFile path under the "fileURL" key.
      guard let args = call.arguments as? Dictionary<String, Any>,
            let fileUrl = args["fileURL"] as? String else {
        result(nil)
        return
      }
      self?.saveVideoToGallery(fileURL: fileUrl, result: result)
    })
    GeneratedPluginRegistrant.register(with: self)
    return super.application(application, didFinishLaunchingWithOptions: launchOptions)
  }
  func saveVideoToGallery(fileURL: String, result: @escaping FlutterResult) {
    PHPhotoLibrary.shared().performChanges({
      PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: URL(fileURLWithPath: fileURL))
    }) { saved, error in
      guard saved else {
        // Report failure instead of never calling result, which would hang the Dart future.
        result(nil)
        return
      }
      // Assumes the newest video in the library is the one just saved;
      // this can race with other inserts happening at the same time.
      let fetchOptions = PHFetchOptions()
      fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
      guard let newestVideo = PHAsset.fetchAssets(with: .video, options: fetchOptions).firstObject else {
        result(nil)
        return
      }
      newestVideo.getURL { urlFromGallery in
        guard let absoluteUrl = urlFromGallery?.absoluteString else {
          result(nil)
          return
        }
        result(absoluteUrl)
      }
    }
  }
}
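A variant I've been considering (a sketch, not what I currently run; saveVideoToGalleryById is my own name) captures the exact asset that was just created via its placeholder's local identifier instead of fetching the newest video, to rule out a race with other inserts:
func saveVideoToGalleryById(fileURL: String, result: @escaping FlutterResult) {
  var localId: String?
  PHPhotoLibrary.shared().performChanges({
    let request = PHAssetChangeRequest.creationRequestForAssetFromVideo(
      atFileURL: URL(fileURLWithPath: fileURL))
    // Remember which asset this change request is about to create.
    localId = request?.placeholderForCreatedAsset?.localIdentifier
  }) { saved, error in
    guard saved, let id = localId,
          let asset = PHAsset.fetchAssets(withLocalIdentifiers: [id], options: nil).firstObject else {
      result(nil)
      return
    }
    asset.getURL { url in
      result(url?.absoluteString)
    }
  }
}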
Getting the video path:
extension PHAsset {
  func getURL(completionHandler: @escaping ((_ responseURL: URL?) -> Void)) {
    if self.mediaType == .image {
      let options = PHContentEditingInputRequestOptions()
      // Accept any adjustment data so the original can be returned.
      options.canHandleAdjustmentData = { (adjustment: PHAdjustmentData) -> Bool in
        return true
      }
      self.requestContentEditingInput(with: options) { (contentEditingInput: PHContentEditingInput?, info: [AnyHashable: Any]) in
        completionHandler(contentEditingInput?.fullSizeImageURL)
      }
    } else if self.mediaType == .video {
      let options = PHVideoRequestOptions()
      options.version = .original
      PHImageManager.default().requestAVAsset(forVideo: self, options: options) { (asset: AVAsset?, audioMix: AVAudioMix?, info: [AnyHashable: Any]?) in
        if let urlAsset = asset as? AVURLAsset {
          // Note: this URL points into the Photos library, outside the app sandbox.
          completionHandler(urlAsset.url)
        } else {
          completionHandler(nil)
        }
      }
    }
  }
}
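Related to the sandbox suspicion: the AVURLAsset URL above points inside the Photos library, which my app may not be allowed to read directly. One workaround I've been looking at (a sketch under that assumption; exportToSandbox is my own name) copies the asset's video data into the app's own temporary directory and hands that path to Flutter instead:
func exportToSandbox(asset: PHAsset, completion: @escaping (URL?) -> Void) {
  // Pick the original video resource backing the asset.
  guard let resource = PHAssetResource.assetResources(for: asset)
      .first(where: { $0.type == .video }) else {
    completion(nil)
    return
  }
  // Write it to the app's temporary directory, which the app can read.
  let destination = FileManager.default.temporaryDirectory
      .appendingPathComponent(resource.originalFilename)
  try? FileManager.default.removeItem(at: destination)
  let options = PHAssetResourceRequestOptions()
  options.isNetworkAccessAllowed = true
  PHAssetResourceManager.default().writeData(for: resource, toFile: destination, options: options) { error in
    completion(error == nil ? destination : nil)
  }
}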
And then, to display the video in Flutter, I use video_player, again with a pretty basic implementation:
VideoPlayerController controller =
    VideoPlayerController.file(File(_videoPathFromGallery));
controller.initialize().then((_) {
  // ...some stuff here
});
It's probably more of an iOS question than a Flutter one.