htaccess rule blocking robots.txt file


I have the following rule in my .htaccess file:

RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)([^/])$        /$1$2/ [L,R=301]

I added it to make sure all pages on my website are 301-redirected to URLs with a trailing slash. The problem is that Google is now reporting coverage issues with my robots.txt file, because it is getting a slash appended as well. How do I exclude robots.txt from that rule?

Thanks!

CodePudding user response:

Try including a condition that excludes requests that look as if they have a file extension:

RewriteCond %{REQUEST_URI} !\.\w{2,4}$
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)([^/])$        /$1$2/ [L,R=301]

If nothing else, this should serve as an optimisation, since it short-circuits the rule for anything that looks like a static asset. However, as noted in the comments, the existing RewriteCond directive that checks REQUEST_FILENAME against `-f` should already exclude requests for /robots.txt, assuming it exists as a physical file in the document root and is not internally rewritten to another file or routed through your application to generate robots.txt dynamically.
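If you prefer not to rely on a file-extension pattern, a more targeted sketch is to exclude the robots.txt path explicitly (the simplified capture `^(.*[^/])$` below is an assumption on my part; it behaves the same as the original two-group pattern for this purpose):

```apache
# Explicitly skip robots.txt before appending the trailing slash.
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# Keep the existing check: never rewrite requests for real files.
RewriteCond %{REQUEST_FILENAME} !-f
# Redirect anything not already ending in a slash.
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```

Both conditions must match for the rule to fire, so a request for /robots.txt fails the first condition and is served untouched.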
