robots.txt on the server is unreachable
We run Google Shopping ads for popular products on our website. As of this morning, most of our products were disapproved with a “robots.txt on the server is unreachable” error. The Yoast sitemap is still showing as “successful” in Google Search Console. I have verified with our cloud hosting provider that no changes were made on their end to firewall or security features that could mistakenly block crawlers.
In response I have updated the robots.txt file to read:

User-agent: *
Disallow:

Our ads are still down, but that rule should ensure that the next time Google tries to crawl our pages, its bots will not meet any resistance. Hopefully this will get the ads re-approved.
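For anyone wanting to sanity-check those rules locally, Python's standard-library robot parser can confirm what the file permits without touching the server (the product URL below is just a placeholder, not our real site):

```python
from urllib.robotparser import RobotFileParser

# The exact rules from the updated robots.txt, parsed locally (no network call).
rules = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(rules)

# An empty Disallow value permits every path for every crawler,
# so Googlebot should be allowed to fetch any product page.
print(parser.can_fetch("Googlebot", "https://example.com/any-product-page"))
```

This prints True, confirming the rules themselves allow crawling; if Google still reports the file as unreachable, the problem is the server's response (timeouts, 5xx errors), not the file's contents.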
I'm unsure whether this is related to changes in the last update on March 11th or caused by something else entirely.
Any help/ideas? I have already ruled out every other possibility I can think of besides Yoast that could have caused this issue.
Edit:
I can see that on March 4th there was an update aimed more at WordPress 5.7 and its “robots” changes. If an update caused the issue, it is more likely that one than the March 11th update.