Allow search engine bots to crawl a site while limiting access for users by storing a page view count


We would like to implement a client-side free usage limit for non-authenticated users. For instance, a user who is not logged in would no longer be able to see content after exceeding a daily limit (say, 5 articles per day). We plan to store this information in local storage and simply prevent the client from making the API call once the user has exhausted the free quota for the day.

However, it is essential that this limit does not affect web crawlers such as Googlebot, which crawl the website for indexing. Does Googlebot start with fresh local storage for each page it crawls from my website? Or do you have any good suggestions on how a free usage limit can be implemented for non-authenticated users?

CodePudding user response:

Googlebot and other search engine bots do not persist state between page loads: they fetch every page with an empty local storage and an empty cookie jar, so a counter stored in local storage will never accumulate for them.

Your plan is fine for search engine optimization.
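For reference, here is a minimal sketch of the local-storage counter the question describes, assuming a plain browser environment. The names `FREE_QUOTA`, `STORAGE_KEY`, and `canViewArticle` are illustrative, not from any library:

```typescript
const FREE_QUOTA = 5;                 // free articles per day (assumed limit)
const STORAGE_KEY = "articleViews";   // hypothetical local storage key

interface ViewRecord {
  date: string;   // YYYY-MM-DD the counter applies to
  count: number;  // articles viewed so far that day
}

// Returns true if the visitor may load another article, and bumps the counter.
function canViewArticle(): boolean {
  const today = new Date().toISOString().slice(0, 10);
  let record: ViewRecord = { date: today, count: 0 };

  try {
    const raw = localStorage.getItem(STORAGE_KEY);
    if (raw) {
      const parsed = JSON.parse(raw) as ViewRecord;
      // Keep the stored counter only if it belongs to today;
      // otherwise a new day starts with a fresh count.
      if (parsed.date === today) record = parsed;
    }
  } catch {
    // Corrupt or inaccessible storage: fall back to a fresh counter.
  }

  if (record.count >= FREE_QUOTA) return false;

  record.count += 1;
  localStorage.setItem(STORAGE_KEY, JSON.stringify(record));
  return true;
}

// Gate the API call on the quota check.
if (canViewArticle()) {
  // proceed with the normal content request, e.g. fetch("/api/articles/...")
} else {
  // show the "sign in to keep reading" prompt instead
}
```

Keep in mind this is a soft limit by design: since the check runs entirely on the client, a user can reset it by clearing local storage or opening a private window. That trade-off is what makes it transparent to bots, which start every fetch with empty storage anyway.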
