• [Resolved] Steve

    (@puddesign)


    Whenever I add a new sitemap to Google Search Console after deselecting “Discourage search engines from indexing this site” in the WP settings, GSC shows the error “sitemap.xml blocked by robots.txt”. This goes away on its own after about 5 days, though.

Viewing 3 replies - 1 through 3 (of 3 total)
  • Plugin Author Sybre Waaijer

    (@cybr)

    Hello!

    Yes, this is expected behavior. Google caches the robots.txt file for a few days before reparsing it. They probably found your site before it went live, and then marked the sitemap as an error.

    I see you’ve figured out that this resolves on its own. 🙂 So, I’m marking this topic as resolved.

    If you have more questions, feel free to reach out! Cheers!
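    As an aside, you can verify locally what Google will see once its cache refreshes. The sketch below uses Python’s standard `urllib.robotparser` to test whether a sitemap URL is fetchable under a given robots.txt. The rules shown are hypothetical samples: the first resembles the robots.txt WordPress typically serves when “Discourage search engines” is off, and the second (`Disallow: /`) is what that setting produces when it is on, which blocks the sitemap.

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt similar to what WordPress serves when
    # "Discourage search engines" is OFF:
    OPEN_RULES = [
        "User-agent: *",
        "Disallow: /wp-admin/",
        "Allow: /wp-admin/admin-ajax.php",
    ]

    rp = RobotFileParser()
    rp.parse(OPEN_RULES)
    # sitemap.xml is not disallowed, so Googlebot may fetch it:
    print(rp.can_fetch("Googlebot", "https://www.example.com/sitemap.xml"))  # True

    # With "Discourage search engines" ON, WordPress emits "Disallow: /",
    # which blocks everything, including the sitemap:
    blocked = RobotFileParser()
    blocked.parse(["User-agent: *", "Disallow: /"])
    print(blocked.can_fetch("Googlebot", "https://www.example.com/sitemap.xml"))  # False
    ```

    If the second case matches your live robots.txt, fix the setting first; if the first case matches, the GSC error should clear on its own once Google re-fetches the file.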

    Thread Starter Steve

    (@puddesign)

    Ah, OK, that explains things. Great, thanks!

    Hello guys, I have the same error on my site probuzzing.com, with the same message from Google.
    I went and unchecked the “Discourage …” option under Reading in the dashboard, but after 4 days I still got this message from Google Search Console:
    To owner of https://www.probuzzing.com/,

    Search Console has identified that your site is affected by 1 new Coverage related issue. This means that Coverage may be negatively affected in Google Search results. We encourage you to fix this issue.

    New issue found:
    Indexed, though blocked by robots.txt

  • The topic ‘Google Search Console sitemap.xml blocked by robots.txt’ is closed to new replies.