Node.js architecture to handle a huge amount of data returned by the DB in the best possible way


We have a Node.js application and a SQL Server database, and there are a couple of badly written queries with a lot of inner joins.

Problem and Use Case

We have a use case of generating reports (15-20 thousand of them) in PDF / Excel format, and the underlying query has a lot of joins and takes almost 8-9 seconds, because there is a huge amount of data: the 2-3 tables used in the query each have a few million rows.

For report generation we don't need real-time data; day-old or even week-old data is fine.

What I'm looking for: a few suggestions on how to handle this situation in the best possible way.

We have a few options on the table:

  1. Dump the data from the multiple queries into a separate table and use that for reports (we are planning to do this periodically with the help of a scheduler or something along similar lines; see the sketch after this list).

  2. Use a time-series DB to store the query results (populated by a scheduler) and use it at report-generation time.

  3. Limit report generation to at most the last 1 year of data.

  4. Implement sharding in SQL Server
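For option 1, a minimal T-SQL sketch of the periodic refresh might look like the following. The reporting table dbo.ReportData and the source tables dbo.Orders / dbo.Customers are hypothetical placeholders; the real joined query and schema would be your own. A SQL Server Agent job, or a cron-style job in the Node.js app, would run this on whatever cadence the staleness tolerance allows (nightly, weekly, etc.).

```sql
-- Hypothetical reporting table and source tables; substitute your own schema
-- and the real multi-join query.
TRUNCATE TABLE dbo.ReportData;

INSERT INTO dbo.ReportData (CustomerId, Region, OrderCount, TotalAmount, RefreshedAt)
SELECT
    o.CustomerId,
    c.Region,
    COUNT(*),
    SUM(o.TotalAmount),
    SYSUTCDATETIME()                      -- record when this snapshot was taken
FROM dbo.Orders AS o
INNER JOIN dbo.Customers AS c
    ON c.CustomerId = o.CustomerId
GROUP BY o.CustomerId, c.Region;
```

The Node.js report code then reads from dbo.ReportData with a cheap single-table query instead of re-running the expensive joins for each of the 15-20 thousand reports.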

And yes, improving the query itself is also something we are working on; but I think there is scope to make the overall approach better, and that's why I'm reaching out here for a few more suggestions.

CodePudding user response:

Denormalization is a tried-and-true method of speeding up reporting. As Preben suggested, creating an indexed view in SQL Server is an efficient way to do this with minimal plumbing. Alternatively, it may be worth thinking about whether a data warehouse implementation is needed for future queries.

If this is a one-off issue, put together your indexed view (pay attention to the requirements) and move on. If this is the first of many reports that you need to optimize, think about building a more substantial solution.
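As an illustration only, an indexed view over the hypothetical dbo.Orders / dbo.Customers tables from the earlier sketch might look like this. The key requirements are SCHEMABINDING, two-part table names, COUNT_BIG(*) when grouping, and a unique clustered index to materialize the view (see the SQL Server documentation for the full list).

```sql
-- Hypothetical example; adapt the joins, columns, and names to the real reporting query.
CREATE VIEW dbo.vw_ReportSummary
WITH SCHEMABINDING
AS
SELECT
    o.CustomerId,
    c.Region,
    COUNT_BIG(*)       AS OrderCount,     -- COUNT_BIG(*) is mandatory when the view uses GROUP BY
    SUM(o.TotalAmount) AS TotalAmount     -- assumes TotalAmount is NOT NULL (required for SUM in indexed views)
FROM dbo.Orders AS o
INNER JOIN dbo.Customers AS c
    ON c.CustomerId = o.CustomerId
GROUP BY o.CustomerId, c.Region;
GO

-- The unique clustered index is what actually materializes the view;
-- SQL Server keeps it up to date as the base tables change.
CREATE UNIQUE CLUSTERED INDEX IX_vw_ReportSummary
    ON dbo.vw_ReportSummary (CustomerId, Region);
GO
```

Note that on non-Enterprise editions the optimizer will generally only use the materialized data if you query the view WITH (NOEXPAND).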
