I'm having some problems using NSURLSession
to upload photos from the Asset Library to our server.
First, NSURLSession
doesn't support streamed uploads in a background session. I got an exception when trying to use one:
@property (nonatomic, strong) NSURLSession *uploadSession;
...
_uploadSession = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration
backgroundSessionConfiguration:kUploadBackgroundURLSessionIdentifier] delegate:self delegateQueue:nil];
...
NSURLSessionUploadTask *task = [self.uploadSession uploadTaskWithStreamedRequest:URLRequest];
This is the exception:
Terminating app due to uncaught exception 'NSGenericException', reason: 'Upload tasks in background sessions must be from a file'
That's really strange, because Apple's documentation doesn't mention that background sessions are limited to uploadTaskWithRequest:fromFile:.
What if I want to upload a really huge video file from the Asset Library? Should I save a copy to my tmp directory first?
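For reference, this is the kind of workaround I'm considering: copy the asset's bytes to a tmp file with ALAssetRepresentation, then hand that file to the background session. This is only a sketch; the buffer size, method name, and tmp file naming are my own choices, not anything from Apple's docs.

```objc
// Sketch: export an ALAsset to a tmp file so it can be used with
// uploadTaskWithRequest:fromFile:. Buffer size and file naming are arbitrary.
- (NSURL *)exportAssetToTmpFile:(ALAsset *)asset error:(NSError **)error {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSString *path = [NSTemporaryDirectory()
        stringByAppendingPathComponent:([rep filename] ?: @"upload.tmp")];
    [[NSFileManager defaultManager] createFileAtPath:path
                                            contents:nil
                                          attributes:nil];
    NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:path];
    if (!handle) return nil;

    static const NSUInteger kBufferSize = 256 * 1024;
    uint8_t *buffer = malloc(kBufferSize);
    long long offset = 0;
    // Copy the asset representation to disk in chunks.
    while (offset < rep.size) {
        NSUInteger bytesRead = [rep getBytes:buffer
                                  fromOffset:offset
                                      length:kBufferSize
                                       error:error];
        if (bytesRead == 0) { free(buffer); [handle closeFile]; return nil; }
        [handle writeData:[NSData dataWithBytesNoCopy:buffer
                                               length:bytesRead
                                         freeWhenDone:NO]];
        offset += bytesRead;
    }
    free(buffer);
    [handle closeFile];
    return [NSURL fileURLWithPath:path];
}
```

Then the upload task would be created from that file instead of a stream:

```objc
NSURL *fileURL = [self exportAssetToTmpFile:asset error:NULL];
NSURLSessionUploadTask *task =
    [self.uploadSession uploadTaskWithRequest:URLRequest fromFile:fileURL];
```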
It looks like the only option is to use uploadTaskWithRequest:fromFile:
anyway, right? But then, how does the server know which part of the file is being uploaded if the transfer was interrupted and the next part is uploaded later in the background?
Do I have to manage that myself? Previously I set a Content-Range header on the URL request when I wanted to resume an upload that had been partially transferred. Now I can't do that: I have to create the URL request before creating the upload task, so it seems NSURLSession
would have to handle something like that for me automatically?
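For context, this is roughly how I handled resuming before; the offset and total-length values here are placeholders that I would normally compute from what the server reports it has already received:

```objc
// Sketch: resuming a partial upload manually with a Content-Range header,
// the way I did it before trying NSURLSession background tasks.
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
request.HTTPMethod = @"PUT";
long long offset = 1048576;      // placeholder: bytes the server already has
long long totalLength = 5242880; // placeholder: total file size
// "Content-Range: bytes <first>-<last>/<total>" per the HTTP range spec.
[request setValue:[NSString stringWithFormat:@"bytes %lld-%lld/%lld",
                   offset, totalLength - 1, totalLength]
    forHTTPHeaderField:@"Content-Range"];
```

With a background session this request has to exist before the task is created, so I see no obvious place to update that header when the transfer resumes.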
Has anyone done something like this already? Thanks.