I have a large tar.gz archive (the extracted size is roughly 50 GB), and I want to extract it in memory (or into a temp directory). When I extract the files into memory with the script below, the server's memory fills up completely (the server has 64 GB of RAM, so the available memory is less than the extracted size).
import tarfile

with tarfile.open("large.tar.gz", "r") as tar:
    members = tar.getmembers()
    for member in members:
        f = tar.extractfile(member)
        if f is not None:
            # Reads the whole member into memory before splitting it into lines.
            txt = str(f.read(), "ISO-8859-1").replace('\t', ' ').splitlines()
            do_something(txt)
How can I extract it without running out of memory? (I don't mind whether it goes into memory or a temp directory.) Thanks.
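Would a streaming approach like the sketch below be the right direction? It is only a rough sketch of what I have in mind: it opens the archive in stream mode ("r|gz") and reads each member line by line instead of loading the whole file with f.read(). It assumes that my do_something can be adapted to take one line at a time, which may not hold for the real code.

import tarfile

# Sketch: stream the archive sequentially instead of loading whole members.
# "r|gz" reads the gzip stream in order without building an index, and each
# member is processed line by line, so only one line is held in memory at a time.
with tarfile.open("large.tar.gz", "r|gz") as tar:
    for member in tar:                 # members become available as the stream is read
        f = tar.extractfile(member)
        if f is None:                  # e.g. directories have no file object
            continue
        for raw_line in f:             # one line at a time instead of f.read()
            line = raw_line.decode("ISO-8859-1").replace('\t', ' ').rstrip('\n')
            do_something(line)         # assumes do_something can handle single lines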