Google Search Console Blocked: Robots.txt unreachable
Hello all,
I am trying to get my new WordPress site crawled by Google Search Console after migrating from Squarespace. My last crawl was indexed on Squarespace on May 30th. Since moving, I have received the following error message in GSC:
Page Fetch – Failed: Robots.txt unreachable
I have worked with several agents at HostGator, who say that all security mods and DNS look fine and shouldn't cause issues. They are unable to assist any further and attribute the problem to the site's code.
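In case it helps with diagnosis, robots.txt reachability can be checked from outside the host with curl; the status-code shorthand below is my own sketch, not anything official:

```shell
# Interpret the HTTP status code returned for a robots.txt fetch.
# (My own rough classification of the common cases.)
classify_fetch() {
  case "$1" in
    200)     echo "reachable" ;;               # Googlebot should be able to read it
    401|403) echo "blocked" ;;                 # security rule or firewall denying access
    5*)      echo "server error" ;;            # host-side problem
    *)       echo "check DNS/redirects" ;;     # e.g. 3xx loops or unresolved domain
  esac
}

# Example usage (requires network, so commented out):
# code=$(curl -sS -o /dev/null -w "%{http_code}" https://www.rusticluxurycabins.com/robots.txt)
# classify_fetch "$code"
```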
I have installed the LiteSpeed Cache, Yoast SEO, and WP External Links plugins but didn't notice any setting that would trigger a nofollow. I have also disabled the CDN in LiteSpeed Cache.
I have amended my robots.txt several times and settled on letting Yoast generate it; the same goes for my sitemap. When I test the URL with Google's Rich Results Test, the crawl fails with "URL is not available to Google: Robots.txt unreachable".
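For reference, the robots.txt that Yoast generates is typically a minimal, allow-everything file along these lines (the exact sitemap URL here is my assumption based on my domain):

```
User-agent: *
Disallow:

Sitemap: https://www.rusticluxurycabins.com/sitemap_index.xml
```

So the file contents themselves shouldn't be blocking anything; the error seems to be about Google reaching the file at all.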
I am at a loss and would appreciate any help with the error.
Thank you,
Kevin
https://www.rusticluxurycabins.com