• Resolved mmmkrust0

    (@mmmkrust0)


    Hello all,

    I am trying to get my new WordPress site crawled by Google Search Console after migrating from Squarespace. My last crawl was indexed on Squarespace on May 30th. Since moving, I have received the following error message in GSC:

    Page Fetch – Failed: Robots.txt unreachable

    I have worked with several agents at HostGator, and they have said all security mods and DNS settings look fine and shouldn’t cause issues. They are unable to assist any further and attribute the problem to the site’s code.

    I have installed the LiteSpeed Cache, Yoast SEO, and WP External Links plugins, but didn’t notice any setting that would trigger a nofollow. I have disabled the CDN in LiteSpeed Cache.

    I have amended my robots.txt several times and eventually settled on letting Yoast generate it; the same goes for my sitemap. I have tested with Google’s Rich Results Test (RRT), and the crawl fails with “URL is not available to Google” and “Robots.txt unreachable”.
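
    For what it’s worth, the file can also be fetched directly, outside of GSC, to see whether the server even serves it. A minimal Python 3 sketch (standard library only; the URL is my site from below, so adjust as needed):

    ```python
    # Fetch robots.txt directly to check reachability outside of Google Search Console.
    # Minimal sketch; assumes only Python 3's standard library and the site URL.
    import urllib.request

    url = "https://www.rusticluxurycabins.com/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print("HTTP status:", resp.status)   # a healthy file returns 200
            print(resp.read().decode("utf-8"))   # the robots.txt body as served
    except Exception as exc:
        # A timeout, TLS failure, or 5xx here would match GSC's "unreachable" error
        print("Fetch failed:", exc)
    ```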

    I am at a loss and would appreciate any help with the error.

    Thank you,

    Kevin

    https://www.rusticluxurycabins.com


  • Plugin Support Maybellyne

    (@maybellyne)

    Hello @mmmkrust0

    Thanks for reaching out about your robots.txt file. Please edit your robots.txt file with Yoast SEO under WordPress > Yoast SEO > Tools > File editor and change Disallow: / to Disallow: (that is, remove the forward slash).
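
    For reference, after the edit the file should look something like this minimal example (the Sitemap line is optional and should point at whatever sitemap Yoast generates for you; sitemap_index.xml is its usual default):

    ```
    User-agent: *
    Disallow:

    Sitemap: https://www.rusticluxurycabins.com/sitemap_index.xml
    ```

    An empty Disallow: means nothing is blocked, whereas Disallow: / blocks the entire site.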

    Thread Starter mmmkrust0

    (@mmmkrust0)

    Thanks for the reply. I have adjusted the robots.txt file with the same result. I even added an Allow rule for Googlebot, but that didn’t change anything.
    There isn’t any nofollow in my HTML, I have rebuilt the robots file and sitemap a couple of times, and I removed LiteSpeed Cache and Yoast, all with the same result.
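
    In case it helps to rule out the rules themselves: Python’s standard-library parser can check whether Googlebot is allowed, using the same fetch Google would make. A minimal sketch (same site URL as above; if the read itself fails here, that points at the server rather than the file’s contents):

    ```python
    # Parse the live robots.txt and ask whether Googlebot may crawl the homepage.
    # Minimal sketch using only the standard library; URL taken from this thread.
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://www.rusticluxurycabins.com/robots.txt")
    rp.read()  # if this errors, the file is unreachable, matching GSC's report

    print(rp.can_fetch("Googlebot", "https://www.rusticluxurycabins.com/"))
    ```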

    Any other thoughts?

  • The topic ‘Google Search Console Blocked: Robots.txt unreachable’ is closed to new replies.