So I have a Webhook that delivers a JSON payload to my cloud function URL.
Within that cloud function, what are my limitations for writing the JSON to my Cloud Firestore?
I can't dump the whole JSON payload into one document in a collection, so I need to parse it out into different fields.
So my cloud function will look something like:
await admin.firestore().collection("collection1").doc(doc1).set({
  field1: data.fieldFromJson1
});

await admin.firestore().collection("collection1").doc(doc1).collection("sub-collection1").doc(doc2).set({
  field2: data.fieldFromJson2
});
Can I do this in one Cloud Function, or do I need two functions?
I have about 100 JSON lines to parse out all over my Cloud Firestore, so this example is very simplified.
References & Documentation:
https://cloud.google.com/functions/docs
https://github.com/firebase/functions-samples
CodePudding user response:
Technically there isn't any limit on the number of writes a single function can perform; as long as you stay within the rate limits defined in the documentation, you should be fine. If a single webhook delivers all the data, you can write all the documents in one go. You can either use Promise.all() or Batch Writes (if writing a maximum of 500 documents per batch).
// parse the payload and map it into an array of write promises as shown below
const promises = [
  admin.firestore().collection("collection1").doc(doc1).set({
    field1: data.fieldFromJson1
  }),
  admin.firestore().collection("collection1").doc(doc1).collection("sub-collection1").doc(doc2).set({
    field2: data.fieldFromJson2
  })
];

await Promise.all(promises);

// any other processing

res.status(200).end(); // terminate the HTTP function by sending a response
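If you go the batched-write route instead, remember the 500-operation cap per batch. A rough sketch of how you might split a larger set of parsed writes into batches (the `writes` array of `{ ref, data }` pairs and the `db` parameter are assumptions, built from your own parsing logic and a Firestore instance):

```javascript
// Split an array into groups of at most `size` items.
function chunk(items, size) {
  const groups = [];
  for (let i = 0; i < items.length; i += size) {
    groups.push(items.slice(i, i + size));
  }
  return groups;
}

// `db` is admin.firestore(); `writes` is an array of { ref, data } pairs
// built while parsing the JSON payload.
async function writeAll(db, writes) {
  for (const group of chunk(writes, 500)) {
    const batch = db.batch();
    group.forEach(({ ref, data }) => batch.set(ref, data));
    await batch.commit(); // each batch commits atomically (all-or-nothing)
  }
}
```

Note that atomicity only holds per batch; if you need all 100 writes to succeed or fail together and they fit under 500 operations, use a single batch.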
If you expect a large amount of data that can take some time to parse, make sure you set a higher timeout for the function (it defaults to 60 seconds).
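With the 1st-gen firebase-functions SDK, the timeout can be raised via runWith when the function is declared. A deployment-configuration sketch (the export name `webhook` and the memory value are assumptions):

```javascript
const functions = require("firebase-functions");

exports.webhook = functions
  // raise the timeout from the 60s default; 1st-gen HTTP functions allow up to 540s
  .runWith({ timeoutSeconds: 300, memory: "512MB" })
  .https.onRequest(async (req, res) => {
    // ... parse req.body and write to Firestore ...
    res.status(200).end();
  });
```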