
I am trying to update multiple files stored in a FileField in Django. I fetch the objects from the database, assign the new file to each one while iterating over them, and collect the objects in a list. If I use bulk_update, it updates all the fields, including the FileField, but the file itself is never uploaded to storage. If I instead iterate over each object and call .save(), it works fine.

But calling .save() on every object hits the database once per object, so I would like to use bulk_update instead.

Code

update_list = []
t_obj = FileFieldDetails.objects.filter(ques_key__in=q_key)
for t in t_obj.iterator():
    t.value = request.FILES[0]
    update_list.append(t)

# Not Working
FileFieldDetails.objects.bulk_update(update_list,['value'])

# Working
for i in update_list:
    i.save()
Arsh Doda

1 Answer


You could wrap your for loop inside a transaction.atomic() block.

Atomicity is the defining property of database transactions. atomic allows us to create a block of code within which the atomicity on the database is guaranteed. If the block of code is successfully completed, the changes are committed to the database. If there is an exception, the changes are rolled back.

Like so:

from django.db import transaction

with transaction.atomic():
    for i in update_list:
        i.save()

That will not reduce the number of UPDATE queries, but with the loop inside a single transaction the database commits once at the end instead of after every .save(), which cuts down the overhead of all those hits.
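
If you do want a single bulk_update, one possible approach (a sketch only, not part of the answer above) is to upload each file to storage yourself through the field's FieldFile.save(name, content, save=False), which writes the file and updates the in-memory field without touching the database, and then let bulk_update persist the new paths in one query. This mirrors the question's code, including reusing the same uploaded file for every row:

from django.db import transaction

# Sketch only: request.FILES[0] is taken verbatim from the question.
uploaded = request.FILES[0]

update_list = []
for t in FileFieldDetails.objects.filter(ques_key__in=q_key).iterator():
    # Write the file to storage and set the field, but skip the per-object UPDATE.
    t.value.save(uploaded.name, uploaded, save=False)
    update_list.append(t)

with transaction.atomic():
    # A single bulk UPDATE persists the new file paths.
    FileFieldDetails.objects.bulk_update(update_list, ['value'])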

Toan Quoc Ho
  • @ArshDoda, yes and no. If you've got 1000 rows to insert, committing after every row can cause quite a performance hit compared to committing once after all the inserts. Of course, this can work the other way too — if you do too much work in a transaction then the database engine can consume lots of space storing the not-yet-committed data or caching data for use by other database connections in order to maintain consistency, which causes a performance hit. Here is the reference: https://www.justsoftwaresolutions.co.uk/database/database_tip_use_transactions.html – Toan Quoc Ho Sep 27 '19 at 16:55
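
A minimal sketch of the middle ground the comment describes, assuming the update_list from the question and an arbitrary chunk size of 500:

from django.db import transaction

CHUNK = 500  # assumption: pick a batch size that suits your data

for start in range(0, len(update_list), CHUNK):
    # One commit per chunk: far fewer commits than one per row,
    # without letting a single transaction grow too large.
    with transaction.atomic():
        for obj in update_list[start:start + CHUNK]:
            obj.save()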