Backing up Firestore data incrementally


I'm trying to think of the best (read: automated, cheap, and easy to use) way to back up Firestore data for a production app.

I'm aware I could automate exports through a scheduled Cloud Function and send them over to a Cloud Storage bucket. The problem I have with this approach is that it does not allow for "incremental updates of the new and updated documents", only for backing up entire collections. This means that most of the data will be backed up every single time, even though it hasn't changed since the last backup, skyrocketing the cost for no reason.
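For reference, the scheduled-export approach I mean might look something like the sketch below, using the Firestore Admin client's `exportDocuments` call; the bucket name `gs://my-app-backups` is a placeholder:

```typescript
import * as functions from 'firebase-functions';
import * as firestore from '@google-cloud/firestore';

const client = new firestore.v1.FirestoreAdminClient();

// Runs daily and kicks off a managed export of the whole database.
// Note: every document is exported on every run, which is the cost problem.
export const scheduledExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    const projectId = process.env.GCLOUD_PROJECT!;
    const databaseName = client.databasePath(projectId, '(default)');
    await client.exportDocuments({
      name: databaseName,
      outputUriPrefix: 'gs://my-app-backups', // placeholder bucket
      collectionIds: [], // empty array = export all collections
    });
  });
```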

The approach that came to mind was having a Cloud Function in the "my-app" project that would listen to each and every change in Firestore, and perform the same change in the Firestore of a "my-app-backup" project.

This way, I only back up the changed data. Furthermore, backed up data would never become stale (as it's backed up in real-time), unlike the first approach where automated backups happen e.g. daily or weekly.

Is this even possible, having a single Cloud Function in the first Firebase project write data into another Firebase project? If not, could I write the data elsewhere (not in another Firebase project)? Does the approach even make sense, or do you have a better suggestion?
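To make the idea concrete, here is a rough sketch of what I mean, assuming the backup project's service-account key file is bundled with the function and that only top-level collections need mirroring (subcollections would need deeper wildcards); all names are placeholders:

```typescript
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

// Default app: the source ("my-app") project.
admin.initializeApp();

// Second app initialized with a service-account key downloaded from the
// "my-app-backup" project. The filename is a placeholder.
const backupApp = admin.initializeApp(
  { credential: admin.credential.cert(require('./my-app-backup-key.json')) },
  'backup'
);
const backupDb = backupApp.firestore();

// Mirror every create/update/delete in any top-level collection.
export const mirrorWrites = functions.firestore
  .document('{collectionId}/{documentId}')
  .onWrite(async (change) => {
    const ref = backupDb.doc(change.after.ref.path);
    if (!change.after.exists) {
      await ref.delete(); // document was deleted in the source project
    } else {
      await ref.set(change.after.data()!); // create or update
    }
  });
```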

CodePudding user response:

If you want to export only updated documents, you can store an updatedAt field on each document and query with where("updatedAt", ">", lastExportTime), where lastExportTime is the timestamp of the previous export. You can then run a scheduled Cloud Function periodically to export just these documents. This should only cost N reads (N = number of updated documents) every time the function runs.
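A minimal sketch of that approach, assuming a posts collection whose documents have updatedAt set with FieldValue.serverTimestamp() on every write, a meta/lastExport document recording the previous run, and a placeholder bucket name:

```typescript
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import { Storage } from '@google-cloud/storage';

admin.initializeApp();
const db = admin.firestore();
const storage = new Storage();

export const incrementalExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async () => {
    // Read the timestamp of the previous export (epoch on the first run).
    const metaRef = db.doc('meta/lastExport');
    const meta = await metaRef.get();
    const lastExportTime = meta.exists
      ? (meta.get('timestamp') as admin.firestore.Timestamp)
      : admin.firestore.Timestamp.fromMillis(0);

    // Only documents touched since the last run are read (and billed).
    const snapshot = await db
      .collection('posts')
      .where('updatedAt', '>', lastExportTime)
      .get();

    const docs = snapshot.docs.map((d) => ({ id: d.id, data: d.data() }));
    await storage
      .bucket('my-app-backups') // placeholder bucket
      .file(`incremental/${Date.now()}.json`)
      .save(JSON.stringify(docs));

    // Record this run's time for the next invocation.
    await metaRef.set({ timestamp: admin.firestore.Timestamp.now() });
  });
```

Two caveats: this only works if every write path reliably sets updatedAt, and plain deletions won't show up in the query, so you would need soft deletes (e.g., a deleted flag) to capture them.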

Furthermore, backed up data would never become stale (as it's backed up in real-time)

This works too, but it can also get expensive if document updates are very frequent: every write in the source project triggers a function invocation plus a corresponding write in the backup project.
