
I have a collection named "adverts"

/adverts/{advert_id} <-- where advert_id is auto generated by firestore.

And I have my collection "users": /users/{user_id} <--- where user_id is defined by a username.

So inside the "adverts" docs I have the following map:

user_data:{
    avatar_url: "",
    first_name: "example name",
    last_name: "example last name",
    rating: 5,
    username: "exampleusername"
}

This info is copied from the user document each time an advert is created. So I want to update this map in the adverts collection every time the user updates their data.

Is it possible to update these fields with a batch, assuming that more than one matching document could exist in adverts? (I'm trying to avoid reading all the documents and rewriting them; I just want to write.)

I was trying to achieve this as follows (inside an onUpdate Cloud Function):

const before = change.before.data(); // Data before the update
const after = change.after.data(); // Data after the update
const user_id = after.username;

let batch  = db.batch()
let advertsRef = db.collection("adverts").where("user_data.username", "==", user_id)

batch.update(advertsRef, {
    "user_data.avatar_url": after.avatar_url,
    "user_data.first_name": after.first_name,
    "user_data.last_name": after.last_name,
    "user_data.overall_adverts_rating": after.overall_adverts_rating,
    "user_data.username": after.username,
 })

batch.commit().then(()=>{
  console.log("done")
})
.catch(error =>{
  console.log(error)
})

But I'm getting the following error:

Error: Value for argument "documentRef" is not a valid DocumentReference.
at Object.validateDocumentReference (/workspace/node_modules/@google-cloud/firestore/build/src/reference.js:2034:15)
at WriteBatch.update (/workspace/node_modules/@google-cloud/firestore/build/src/write-batch.js:312:21)
at /workspace/index.js:147:9
at cloudFunction (/workspace/node_modules/firebase-functions/lib/cloud-functions.js:134:23)
at /layers/google.nodejs.functions-framework/functions-framework/node_modules/@google-cloud/functions-framework/build/src/invoker.js:199:28
at processTicksAndRejections (internal/process/task_queues.js:97:5) 

I guess it's because my .where() is not referring to a specific document.

Luis Quiroga
1 Answer


In your Cloud Function, you'll need to iterate through each matching advert. That is, you are rewriting all of the documents that match the query, which means you'll need to read each one and update each one, e.g.:

// Run the query to get the matching documents first; a query
// cannot be passed to batch.update(), only a DocumentReference can.
const adverts = await db.collection("adverts").where("user_data.username", "==", user_id).get();
const batch = db.batch();
for (const doc of adverts.docs) {
  // Update each document by its reference, merging the new user data.
  batch.update(doc.ref, { user_data: after });
}
await batch.commit();
Jim Morrison
  • How is this different from a traditional read-and-write (update, in this case) of the documents? I mean, if I use a simple read and update I can do unlimited writes. With this batch operation I'm limited to 500 writes. – Luis Quiroga Apr 14 '21 at 15:27
  • Batches are meant to reduce the total number of RPCs sent. The 10k document write/s limit (https://firebase.google.com/docs/firestore/quotas) applies if you are batching or not. – Jim Morrison Apr 17 '21 at 01:52
  • I'm not referring to Firestore pricing. What I mean is this: "A batched write can contain up to 500 operations." ( https://firebase.google.com/docs/firestore/manage-data/transactions#batched-writes ) – Luis Quiroga Apr 20 '21 at 19:22
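
Regarding the 500-operation limit discussed above: one common workaround (not from the answer itself, just a sketch) is to split the query results into groups of at most 500 and commit one batch per group. The `chunk` helper below is hypothetical; the surrounding Firestore calls assume the same `db`, `user_id`, and `after` variables as the question.

```javascript
// Hypothetical helper: split an array into slices of at most `size` items,
// since a single Firestore WriteBatch accepts at most 500 operations.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Sketch of usage inside the Cloud Function (assumes db, user_id, after):
async function updateAdverts(db, user_id, after) {
  const snapshot = await db
    .collection("adverts")
    .where("user_data.username", "==", user_id)
    .get();

  // One batch per group of 500 documents, committed sequentially.
  for (const docs of chunk(snapshot.docs, 500)) {
    const batch = db.batch();
    docs.forEach((doc) => batch.update(doc.ref, { user_data: after }));
    await batch.commit();
  }
}
```

Each commit is an independent batch, so with more than 500 matching adverts the updates are no longer atomic as a whole; each group of up to 500 is still committed atomically.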