• Resolved rajamdade

    (@rajamdade)


    Hi,

Googlebot is not able to crawl the sitemap of our website. We have validated our sitemap.xml using xml-sitemaps.com, but Googlebot is still unable to crawl it. Moreover, in the live URL test, Googlebot failed to fetch the page with the error:
    Page fetch Failed: Blocked by robots.txt

Kindly help us resolve the issue as soon as possible.

    Regards,
    Rahul

    The page I need help with: [log in to see the link]

Viewing 2 replies - 1 through 2 (of 2 total)
  • Plugin Author Jeff Starr

    (@specialk)

Googlebot is whitelisted (always allowed) by the plugin. It is never blocked unless you or another admin has changed the default settings. If you are having issues, the best advice is either to reset the plugin settings to their defaults or simply to remove the plugin.

I hope this helps; let me know if I can provide any further info, glad to help anytime.
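As a quick way to confirm whether a robots.txt policy is what blocks Googlebot, you can test the rules locally with Python's standard-library robot parser. This is a minimal sketch: the URL and the rules shown are hypothetical placeholders, not the poster's actual site or the plugin's actual output.

```python
# Check whether a given user-agent may fetch a URL under a
# robots.txt policy, using Python's stdlib robotparser.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content that would block every crawler,
# including Googlebot, from the whole site (and thus the sitemap).
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard group, so the fetch is denied:
print(parser.can_fetch("Googlebot", "https://example.com/sitemap.xml"))  # False
```

Running the same check against your live robots.txt (via `parser.set_url(...)` and `parser.read()`) shows whether the rules, rather than the plugin itself, are what's rejecting the crawl.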

    Plugin Author Jeff Starr

    (@specialk)

Also, if you decide to remove the plugin, remember to remove any rules it added to your robots.txt file. FYI, the plugin documentation explains more about the uninstall process.
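For illustration only (the exact rules depend on the plugin and the site's configuration), a leftover blanket rule like the following in robots.txt would continue to block all crawlers, including Googlebot, even after the plugin is removed:

```
User-agent: *
Disallow: /
```

Removing such a rule, or narrowing it to the specific paths you actually want hidden, restores normal crawling.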

  • The topic ‘Googlebot blocked by robots.txt’ is closed to new replies.