How to handle mass changes to a large document with frequent reads in MongoDB?


Let's say I have a document in the following format in MongoDB:

students: Array
grades: Array

I currently have about 10,000 students and grades that are constantly changing. The number of students is constantly growing and students are removed from the document. I have a process to update the document every 30 minutes. At the same time, I've built an ExpressJS API where various teachers query the database as often as every minute to view info about their students.

  1. What is the best way to update the data? Since students can be added or removed and grades change constantly, should I just overwrite the whole document every 30 minutes? The entire dataset is only a couple of MB.
  2. How can I ensure that the teachers will have no downtime if I happen to be updating at the same time they're making a GET request?

CodePudding user response:

  1. With that many changes, it is better to overwrite the document with all of the changes at once than to issue a separate update for every single student change.
  2. The teachers will see no downtime: operations on a single document are atomic in MongoDB, so a read returns either the previous version of the document or the new one, never a partially updated state.
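A minimal sketch of the overwrite approach, assuming a single snapshot document in a `snapshots` collection; the collection name, `_id` value, and field shapes are placeholders, not from the question:

```javascript
// Build the full replacement snapshot (assumed shape).
const snapshotFilter = { _id: "roster" };
const newSnapshot = {
  _id: "roster",
  students: [{ id: 1, name: "Ann" }],
  grades: [{ studentId: 1, grade: 92 }],
  updatedAt: new Date(),
};

// With the official Node.js driver, the 30-minute job would run:
// await db.collection("snapshots")
//   .replaceOne(snapshotFilter, newSnapshot, { upsert: true });
// replaceOne on a single document is atomic, so concurrent GET
// requests see either the old snapshot or the new one in full.
```

`upsert: true` creates the document on the first run if it does not exist yet.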

With that many elements (10k students/grades in a single array), it may be better to store one document per student/grade in the collection, so that you update only the relevant student's document.
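Under that one-document-per-student model, a grade change becomes a small targeted update instead of a full overwrite. A sketch, where the `students` collection and the `studentId`/`grades` field names are illustrative assumptions:

```javascript
// Target one student's document and change only one grade field.
const filter = { studentId: 1234 };
const update = {
  $set: { "grades.math": 88, updatedAt: new Date() },
};

// Node.js driver call (not executed in this sketch):
// await db.collection("students").updateOne(filter, update);
// Only the matched document is touched, so the 30-minute batch
// can be replaced by many cheap per-student updates.
```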

You need to adapt the schema to your use cases. Most likely teachers do not need to read the full list of students/grades every time, but only the students in a particular class, lesson, or school. I'm guessing here, since I don't see your exact use case and document example.
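If teachers do query per class, that access pattern can be served by a field such as `classId` backed by an index. The field, collection, and projection names below are assumptions for illustration:

```javascript
// Query only the students a given teacher cares about.
const teacherQuery = { classId: "7B" };
const projection = { name: 1, grades: 1, _id: 0 };

// Driver calls (not executed in this sketch):
// await db.collection("students").createIndex({ classId: 1 });
// const roster = await db
//   .collection("students")
//   .find(teacherQuery, { projection })
//   .toArray();
// The index keeps the once-a-minute teacher reads cheap even as
// the number of students grows.
```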
