
This code is returning an error that I don't understand:

query = Analytic.objects(uid__type="binData")
analytics = []
for analytic in query:
    analytic.sessionId = str(analytic.sessionId)
    analytic.uid = str(analytic.uid)
    analytics.append(analytic)
    if len(analytics) % 10000 == 0:
        print(".")
    if len(analytics) == 100000:
        Analytic.objects.update(analytics, upsert=False)
        analytics = []

TypeError: update() got multiple values for argument 'upsert'
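(For context: MongoEngine's `QuerySet.update()` takes the new values as keyword arguments; its signature is roughly `update(upsert=False, multi=True, **update)`. Passing the list positionally binds it to `upsert`, which then collides with the explicit `upsert=False`. A minimal reproduction with a stand-in function of that rough shape:)

```python
# Stand-in with roughly the same signature shape as QuerySet.update():
# the first positional argument lands on `upsert`, so a second
# upsert=False keyword triggers the TypeError from the question.
def update(upsert=False, multi=True, **values):
    return values

try:
    update(["doc1", "doc2"], upsert=False)
except TypeError as e:
    print(e)  # → update() got multiple values for argument 'upsert'
```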

Badjano
  • what is your code doing? is just converting field types? – Erdenezul May 19 '18 at 01:19
  • Yes, but I wanted to be able to do other stuff as well. Since inserting multiple documents is faster than saving each one individually, I thought I might want to update multiple documents in one command too – Badjano May 20 '18 at 02:16

2 Answers

5

To update multiple documents at the same time, I was able to get it working using the atomic updates section of the user guide in the docs: atomic-updates

So your update should look something like:

Analytic.objects(query_params='value').update(set__param='value')

or

query = Analytic.objects(query_params='value')
query.update(set__param='value')

The section has a list of modifiers that you might want to look at. You'll still want to do the update outside of your loop; otherwise you'll be updating your query many times over.
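As a sketch of how those `set__` keywords compose (the helper name is my own, not part of MongoEngine), you can build them from a plain dict and splat them into `update()`:

```python
def to_atomic_kwargs(changes):
    """Turn {"field": value} into the set__field=value keyword form
    that MongoEngine's QuerySet.update() expects."""
    return {"set__" + field: value for field, value in changes.items()}

kwargs = to_atomic_kwargs({"param": "value"})
print(kwargs)  # → {'set__param': 'value'}
# Then: Analytic.objects(query_params='value').update(**kwargs)
```

Note this still applies the same values to every matched document, which is the limitation raised in the comments below.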

Joseph Vargas
  • in your example I could only update all documents with a single value; I was attempting to update many documents with individual values. I'm starting to think that this isn't possible – Badjano Jun 02 '18 at 02:19
  • Oh, I see. Yeah, I think it might be impossible to do in a single database call. – Joseph Vargas Jun 03 '18 at 02:44
1

It looks like you are already looping through all the objects in the queryset.

query = Analytic.objects(uid__type="binData")

Then for every iteration of the loop that satisfies:

if len(analytics) == 100000:
    Analytic.objects.update(analytics, upsert=False)
    analytics = []

you start another query and set analytics back to an empty array, so you are retrieving many objects per query. Since you are already inside a loop, I think you want to:

analytics_array = []

...

analytic.save()                   # save each object as you iterate
analytics_array.append(analytic)

The save will update objects that are already created. Not sure if that's what you wanted, but the error is definitely coming from the line that reads `Analytic.objects.update(analytics, upsert=False)`. Hope this helps!

Joseph Vargas
  • Iterating through the query without saving is a lot faster, but I was wondering whether I could do a bulk update, just like "insert", so that all the updates go to MongoDB in a single request – Badjano May 28 '18 at 19:23
  • Not sure, but I think you have to pass in the fields to be updated. I'm a little confused by the docs. Have you looked into using a straight pymongo query? – Joseph Vargas May 28 '18 at 23:35
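(Following up on the last two comments: at the driver level a single-request bulk update with per-document values is possible. The sketch below, with my own function name and an assumed document shape, builds one operation per document in the shape of MongoDB's raw `update` command; with pymongo you would instead build `UpdateOne(filter, update)` objects and pass them to `collection.bulk_write()`.)

```python
def build_bulk_updates(docs):
    """Build one raw update operation per document, each with its own
    filter and its own $set values, so all of them can be sent to
    MongoDB in a single bulk request."""
    ops = []
    for doc in docs:
        ops.append({
            "q": {"_id": doc["_id"]},                 # per-document filter
            "u": {"$set": {"sessionId": str(doc["sessionId"]),
                           "uid": str(doc["uid"])}},  # per-document values
            "upsert": False,
            "multi": False,
        })
    return ops

ops = build_bulk_updates([{"_id": 1, "sessionId": "s1", "uid": "u1"},
                          {"_id": 2, "sessionId": "s2", "uid": "u2"}])
print(len(ops))  # → 2
```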