I have a Django model like this:
class Todo(models.Model):
    big_file = models.FileField(blank=True)
    status = models.PositiveSmallIntegerField(default=0)
    progress = models.IntegerField(default=0)
I'd like to do two operations:
- first, make an empty zipfile out of big_file (what I mean by "empty" is sketched in plain Python right after this list)
- (less important) and then progressively add files into my zipfile (and save it iteratively)
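To clarify the first step: outside of Django, an "empty zipfile" is simply what the zipfile module produces when an archive is closed without any members being added. A minimal plain-Python sketch (empty.zip is just an example path):

import io, zipfile

io_bytes = io.BytesIO()
with zipfile.ZipFile(io_bytes, 'w') as zip_fd:
    pass  # no members added; closing still writes a valid (empty) archive

# io_bytes.getvalue() now holds a complete empty zip; it could be dumped to disk:
with open('empty.zip', 'wb') as out:
    out.write(io_bytes.getvalue())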
The overall process would look like this:

from django.core.files.base import File
import io, zipfile

def generate_data(todo):
    io_bytes = io.BytesIO(b'')
    # 1. Save an empty zip archive:
    with zipfile.ZipFile(io_bytes, 'w') as zip_fd:
        todo.big_file.save('heavy_file.zip', File(zip_fd))

    # 2. Progressively fill the zip archive:
    with zipfile.ZipFile(io_bytes, 'w') as zip_fd:
        for filename, data_bytes in long_iteration(todo):
            with zip_fd.open(filename, 'w') as in_zip:
                in_zip.write(data_bytes)
            if condition(something):
                todo.big_file.save()  # that does not work
                todo.status = 1
                todo.progress = 123
                todo.save()
    todo.status = 2
    todo.save()
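In plain Python (no Django involved) the "progressively add" part seems straightforward with append mode; what I can't work out is how to do the equivalent through the FileField. A sketch of the plain-Python part, reusing heavy_file.zip from above and a made-up member name:

import zipfile

# Reopen the existing archive in append mode and add one more member.
with zipfile.ZipFile('heavy_file.zip', 'a') as zip_fd:
    with zip_fd.open('part-0001.txt', 'w') as in_zip:  # made-up member name
        in_zip.write(b'some data')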
But I can't figure out the right combination of file descriptor / file-like object / file path / Django File object... And it seems that in Django I always have to call save(filename, content). But my content could be gigabytes, so it does not sound reasonable to store it all in a "content" variable?
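For reference, the only pattern I know of is the usual FieldFile.save(name, content) call with a ContentFile, which (if I understand correctly) needs the whole payload in memory at once. A minimal sketch of that pattern, with save_whole_archive and archive_bytes as made-up names:

from django.core.files.base import ContentFile

def save_whole_archive(todo, archive_bytes):
    # archive_bytes would have to hold the entire zip in memory,
    # which is exactly what I want to avoid for multi-gigabyte archives.
    todo.big_file.save('heavy_file.zip', ContentFile(archive_bytes))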