
I have a large tar.gz (the extracted size is around 50GB), and I want to extract it in memory (or to a temp directory). When I extract files into memory using the script below, the server's memory is fully used (the server has 64GB of memory, and available memory is less than the extracted size):

import tarfile

with tarfile.open("large.tar.gz", "r:gz") as tar:
    for member in tar.getmembers():
        f = tar.extractfile(member)
        if f is not None:
            # f.read() loads the entire member into memory at once
            txt = str(f.read(), "ISO-8859-1").replace('\t', ' ').splitlines()
            do_something(txt)

How can I extract it without running out of memory? (I don't mind whether it goes to memory or a temp directory.) Thanks.
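For context, a minimal sketch of the streaming approach I've tried to describe: instead of calling `f.read()` on each member, wrap the file object so lines are decoded lazily. The `stream_tar_members` helper and its `process` callback are hypothetical names, assuming the per-line processing can work incrementally:

```python
import io
import tarfile

def stream_tar_members(path, process, encoding="ISO-8859-1"):
    """Stream each member of a tar.gz line by line instead of
    loading whole files into memory with f.read()."""
    with tarfile.open(path, "r:gz") as tar:
        # iterating over the TarFile yields members one at a time
        for member in tar:
            f = tar.extractfile(member)
            if f is None:  # skip directories and special entries
                continue
            # TextIOWrapper decodes incrementally, so only one buffered
            # chunk of the member is held in memory at a time
            reader = io.TextIOWrapper(f, encoding=encoding)
            process(member.name,
                    (line.rstrip("\n").replace("\t", " ")
                     for line in reader))
```

With this shape, `process` receives a generator of cleaned lines per member, so peak memory stays roughly at one buffer rather than one whole file.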

Donghyun
  • See this: https://stackoverflow.com/questions/339053/how-do-you-unzip-very-large-files-in-python – Yuval.R Jan 14 '22 at 12:15

0 Answers