I recently downloaded a CSV that turned out to be much larger than I anticipated (the size wasn't available until the download finished). The file is >100 GB and my drive only has around 25 GB free at this point.
Since CSV is not a very space-efficient format, I'm wondering if there's a way to 'stream' the data from CSV into a more compressed format. That is: read the CSV in chunks, write each chunk to feather or parquet, delete the written rows from the CSV to free up space, and continue until the CSV has been 'emptied.'
A python solution would be ideal.
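
To sketch what I have in mind (file names and chunk size are just placeholders), the chunked conversion part seems straightforward with pandas; it's the "delete the rows I've already written" step that I don't know how to do without rewriting the whole file:

```python
import pandas as pd

# Read the CSV lazily in chunks and write each chunk to its own parquet file.
# "huge_file.csv" and the chunk size are placeholders.
chunks = pd.read_csv("huge_file.csv", chunksize=1_000_000)
for i, chunk in enumerate(chunks):
    chunk.to_parquet(f"part_{i:05d}.parquet")
    # Here I'd like to drop the rows just converted from the CSV so the
    # disk space is reclaimed as I go, but deleting lines from the front
    # of a file doesn't actually shrink it.
```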