I have some remote .tar.gz files that I am downloading into my app that I then need to unarchive. I am writing the file to disk using an output stream so that my memory usage stays low. Now, when I want to untar the file, I have to load it as NSData. The file is around 1.2 GB, so this is where I run into a problem.
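For context, the download side writes each received chunk straight to disk, along these lines (a simplified sketch; self.outputStream is assumed to be an NSOutputStream opened to the destination path, and the real code has more error handling):

// Assumed setup, done before the download starts:
// self.outputStream = [NSOutputStream outputStreamToFileAtPath:destinationPath append:YES];
// [self.outputStream open];
- (void)URLSession:(NSURLSession *)session
          dataTask:(NSURLSessionDataTask *)dataTask
    didReceiveData:(NSData *)data {
    // Append each chunk to the file on disk instead of accumulating it in memory.
    [data enumerateByteRangesUsingBlock:^(const void *bytes, NSRange byteRange, BOOL *stop) {
        [self.outputStream write:(const uint8_t *)bytes maxLength:byteRange.length];
    }];
}

The part that fails is reading the finished file back: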
NSString *path = [self pathForFileName:vectorFile.fileName];
NSError *error = nil;
// Try to read the entire ~1.2 GB archive into memory in one go.
NSData *data = [NSData dataWithContentsOfFile:path options:0 error:&error];
if (!data) {
    NSLog(@"%@", error.localizedDescription);
}
When I do this and try to read the file into NSData, I get the following error:
malloc: *** mach_vm_map(size=1005060096) failed (error code=3)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
The file “test.tar.gz” couldn’t be opened.
I am assuming that the file is simply too large for my iPhone 6 to handle: loading it into an NSData object pulls the whole thing into RAM, which is more memory than the system will allow a single app to use.
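One thing I have not tried yet is asking NSData to memory-map the file instead of copying it into RAM, something along these lines (NSDataReadingMappedIfSafe is the documented mapping option, but I don't know whether it actually avoids the allocation failure for a compressed archive this size):

// Untried sketch: map the file instead of reading it all into memory up front.
NSData *mapped = [NSData dataWithContentsOfFile:path
                                        options:NSDataReadingMappedIfSafe
                                          error:&error];
// If the mapping succeeds, bytes should be paged in from disk on demand.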
If this IS the case, is there any other way around this? How can one uncompress a large file like this on iOS?