I have deployed a React-based web app in Azure App Services. The website is working as it is supposed to, but according to https://search.google.com/test/mobile-friendly, Google is not able to reach it.
Google's guess is that my robots.txt is blocking it, but I don't think that is the case.
Below is my robots.txt:
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:
Does anyone know if Azure App Services could be blocking Googlebot by default? If so, how do I whitelist it?
Update: After some investigation, I think Cloudflare is actually responsible for preventing Googlebot from crawling. Does anyone know how to get around this problem?
CodePudding user response:
When several user agents are recognized in the robots.txt file, Google follows the most specific group of rules.
If you want all of Google to be able to crawl your pages, you don't need a robots.txt file at all.
If you want to block or allow all of Google's crawlers for some of your content, you can do so by specifying Googlebot as the user agent.
User-agent: Googlebot
Disallow:
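As a hypothetical illustration of the "most specific group wins" rule (the /admin/ path here is just a placeholder), a file could contain both a wildcard group and a Googlebot-specific group:
User-agent: *
Disallow: /admin/
User-agent: Googlebot
Disallow:
With this file, other crawlers are blocked from /admin/, while Googlebot matches its own, more specific group and ignores the wildcard group, so it may crawl everything.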
In case Cloudflare is responsible for preventing Googlebot from crawling, you can change the following settings: go to Firewall settings > Managed Rules and turn off Cloudflare Specials.
Better yet, disable the offending rules individually so you don't lose all the other Cloudflare Specials benefits. For reference, please check the Cloudflare Managed Special rules documentation.
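If you want a rough sanity check of what the Cloudflare edge returns before and after changing the rules, a sketch like the one below (Python, with a placeholder domain) can help. Keep in mind that Cloudflare can verify crawler IPs, so a spoofed Googlebot user agent from your own machine is not a perfect stand-in for real Googlebot traffic; the authoritative check is still Google's mobile-friendly test or Search Console.
import requests

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

# Placeholder URLs -- replace example.com with your own domain.
URLS = [
    "https://www.example.com/robots.txt",
    "https://www.example.com/",
]

for url in URLS:
    # Fetch once with a generic browser-like UA and once pretending to be
    # Googlebot; a 403/503 only for the Googlebot UA suggests an edge rule
    # (e.g. a Cloudflare managed rule) is blocking crawler-looking traffic.
    for label, ua in (("browser", "Mozilla/5.0"), ("googlebot", GOOGLEBOT_UA)):
        resp = requests.get(url, headers={"User-Agent": ua}, timeout=10)
        print(f"{label:10s} {resp.status_code} {url}")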