Best practices for a Ruby on Rails 7 API backend: filtering and pagination with a PostgreSQL database


Still new here. I'm currently facing this issue:

We have a Vue frontend with a view/page that lists products, sort of like what Amazon does in a category, and we have a component just like Amazon's right sidebar that allows users to filter via multi-option select checkboxes.

For a bit of context: we first make a request to the backend for an initial search (imagine going to Amazon's pet products category page), and we return a filtered first list. We also implemented offset pagination, so that clicking page 2 requests the corresponding product list from the backend.

The issue arises when we try to filter the results while maintaining the per-page product count.

I thought of making a request to the backend each time a change occurs in the frontend's "search" component, whether it's the text box/input or the multi-select checkboxes, and having the backend (Rails) run a SQL search, something like this:

products = Product.where("title ILIKE ?", "%#{text_input}%")
                  .where(category: checked_category_options).limit(per_page).offset(page_offset)
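
Fleshed out, what I have in mind is roughly this (the param and variable names are made up for illustration; it assumes a Product model with title and category columns):

  def index
    page     = params.fetch(:page, 1).to_i
    per_page = 24

    scope = Product.all
    scope = scope.where("title ILIKE ?", "%#{params[:q]}%") if params[:q].present?
    scope = scope.where(category: params[:categories]) if params[:categories].present?

    render json: scope.order(:id).limit(per_page).offset((page - 1) * per_page)
  end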

Is making a request each time the search input changes bad practice? I've been thinking for days about how I could cache or filter the search more efficiently, especially regarding the multi-select checkboxes, since I have the feeling the results have to be fast or prefetched (not an option, as other filtering conditions are more complex) so the displayed products can switch quickly.

What's the correct way of doing this?

Thanks in advance

The offset pagination and filtering work fine; I just wondered whether there is a standard or best-practice way of doing this.

CodePudding user response:

Making a query every time the search box or the filter options change isn't bad practice, as long as you debounce the request with { leading: false, trailing: true } (which waits until x milliseconds after the user stops typing before sending the request; see https://lodash.com/docs/4.17.15#debounce if you're unfamiliar with debouncing). After that, it's just a matter of making sure you have indexes in place and your queries are efficient (i.e. no N+1 queries).
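
On the index side, a minimal migration sketch (assuming a products table with title and category columns; the pg_trgm trigram index is a PostgreSQL-specific option for speeding up ILIKE '%term%' matches):

  class AddSearchIndexesToProducts < ActiveRecord::Migration[7.0]
    def change
      # Plain b-tree index for the category IN (...) filter
      add_index :products, :category

      # Trigram-backed GIN index so ILIKE searches on title can use an index
      enable_extension "pg_trgm"
      add_index :products, :title, using: :gin, opclass: :gin_trgm_ops
    end
  end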

If you are handling very high-volume searches, or the searches are often repetitive (i.e. people going to the collection page and hitting specific filter conditions from the multi-select), you could also do some caching to speed things up. Implementing backend caching is a bit outside the scope of this answer, but as a quick working example, you could use my gem trax_model's caching functionality (https://github.com/jasonayre/trax_model/blob/master/lib/trax/model/mixins/cached_methods.rb) and have a working cache system by doing the following in your model:

  # Define the search method, then wrap it with the gem's method-level cache
  def self.search(**options)
    where(**options).to_a
  end
  cached_class_method :search

You could also implement it yourself with the Rails cache (again outside the scope of this answer); it's basically just a matter of sorting the options hash, turning it into a string, and using that as the cache key. For example, if I have a model called Graph and use the code above to implement caching, calling Graph.search(:member_id => "1a705771-3089-4fb8-85b2-652a16516982")

will produce the cache key graph.search/member_id/1a705771-3089-4fb8-85b2-652a16516982
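
If you roll it yourself, a minimal sketch of that idea with Rails.cache (the key format and expiry here are illustrative, not what trax_model actually generates):

  def self.search(**options)
    # Sort the options so that { a: 1, b: 2 } and { b: 2, a: 1 } share one key
    key = "product.search/" + options.sort.map { |k, v| "#{k}/#{Array(v).join(',')}" }.join("/")

    Rails.cache.fetch(key, expires_in: 10.minutes) do
      where(**options).to_a
    end
  end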

-- A lot depends on your use case, but generally the above works for most cases, and anything more complex (e.g. switching to something like Elasticsearch) is unnecessary.
