How to implement cache across multiple dynos


Let's say I have a Node/Express app hosted on Heroku. I've implemented scalability through horizontal scaling, running the server across multiple dynos.

I have a CMS panel for managing the app's content; it writes new content to the DB, and that content is then served to end users through the server API.

I want to add a caching layer to the back-end API to cut down on DB trips, because the app sees heavy traffic from users during the day.

An initial solution is to set up a simple in-memory cache with the node-cache package inside each server instance (dyno). But how do I flush that cache from the CMS?
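
Roughly what I have in mind per dyno (the route and fetchContentFromDb are placeholders for my real handlers and DB query):

```js
const express = require("express");
const NodeCache = require("node-cache");

const app = express();
const cache = new NodeCache({ stdTTL: 300 }); // entries expire after 5 minutes

// Serve content from the in-memory cache when possible.
app.get("/api/content/:id", async (req, res) => {
  const key = `content:${req.params.id}`;
  const cached = cache.get(key);
  if (cached !== undefined) {
    return res.json(cached); // cache hit: no DB trip
  }
  const content = await fetchContentFromDb(req.params.id); // placeholder DB query
  cache.set(key, content);
  res.json(content);
});

// A flush endpoint the CMS could call, but it only clears the memory
// of whichever dyno happens to receive the request.
app.post("/api/cache/flush", (req, res) => {
  cache.flushAll();
  res.sendStatus(204);
});
```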

If I send a request to flush the cache, it only reaches one dyno at a time, so the data isn't consistent across all dynos.

How can I trigger a cache flush on all dynos, or is there a better way to handle caching?

CodePudding user response:

Instead of a per-dyno cache, you can use something outside of your dynos entirely. Heroku offers several add-ons for common products. I've never used node-cache, but it is described like this:

A simple caching module that has set, get and delete methods and works a little bit like memcached.

That suggests that Memcached might be a good choice. The Memcached Cloud add-on has a free 30MB tier and the MemCachier add-on has a free 25MB tier.

In either case, or if you choose to host your cache elsewhere, or even if you choose another tool entirely, you would then connect each of your dynos to the same cache (a connection sketch follows the list below). This has several benefits:

  • Expiring items would then impact all dynos
  • Once an item is cached via one dyno it is already in the cache for other dynos
  • Cache content survives dyno restarts, which happen at least daily, so you'll have fewer misses
  • Etc.
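
For illustration, here is a sketch of the shared setup, assuming the MemCachier add-on and the memjs client it supports (Memcached Cloud works the same way with its own connection env vars); the route and fetchContentFromDb are placeholders for your own handlers and DB query:

```js
const express = require("express");
const memjs = require("memjs");

const app = express();
// memjs picks up MEMCACHIER_SERVERS / MEMCACHIER_USERNAME / MEMCACHIER_PASSWORD
// from the environment once the add-on is provisioned.
const cache = memjs.Client.create();

app.get("/api/content/:id", async (req, res) => {
  const key = `content:${req.params.id}`;
  const { value } = await cache.get(key);
  if (value) {
    return res.json(JSON.parse(value.toString())); // hit: shared by every dyno
  }
  const content = await fetchContentFromDb(req.params.id); // placeholder DB query
  await cache.set(key, JSON.stringify(content), { expires: 300 }); // 5-minute TTL
  res.json(content);
});

// One flush request from the CMS, routed to any dyno, clears the cache
// that all dynos read from.
app.post("/api/cache/flush", async (req, res) => {
  await cache.flush();
  res.sendStatus(204);
});
```

Because every dyno reads and writes the same Memcached store, the CMS only needs to hit the flush endpoint once; alternatively, it could delete just the affected keys instead of flushing everything.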