• I searched the forum but did not see an answer to this question anywhere.

    I have two sites, and both have the sitemap plugin installed with the sitemap.xml in the root directory:
    virtual-rent-to-own.com/sitemap.xml
    triadcreditconsultants.com/sitemap.xml

    Both are getting two errors in Google Webmaster Tools: one under “In Sitemaps” that says “URL restricted by robots.txt”, and the same message again under “Restricted by robots.txt”.

    On one site it says the error was “detected” on Nov 8, and on the other Nov 9.

    I have used 3rd-party tools to check the robots.txt files (also in the root directories), and they reported the files were fine and allowed access to all robots.

    Under Site configuration: Crawler access, I get this:
    “Allowed by line 2: Disallow: Detected as a directory; specific files may have different restrictions”
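
    If it helps, a robots.txt whose second line is an empty Disallow (which permits all crawling) would produce exactly that message. I haven’t pasted my actual files here, but they should look roughly like this:

        User-agent: *
        Disallow:

        Sitemap: http://virtual-rent-to-own.com/sitemap.xml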

    I have worked hard to solve this problem, but have been unsuccessful. I don’t know whether I just need to wait or whether I need to do something.

    Any thoughts would be greatly appreciated.

  • eviternity (@eviternity)

    If I understand it correctly, “Allowed by line 2: Disallow: Detected as a directory; specific files may have different restrictions” means that Google determined it is allowed to crawl that directory. It just can’t guarantee the same for every file inside it, since individual files and subdirectories may carry their own restrictions.

    You don’t need to solve this, as it isn’t a problem. This is what you want.
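
    If you want to double-check on your side, Python’s standard urllib.robotparser module applies the same basic robots.txt rules a crawler does. A quick sketch, using your first site’s URLs (swap in the second site to test it too):

        import urllib.robotparser

        # Load and parse the live robots.txt for the site.
        rp = urllib.robotparser.RobotFileParser()
        rp.set_url("http://virtual-rent-to-own.com/robots.txt")
        rp.read()

        # can_fetch() reports whether the given user agent may fetch the URL;
        # True means the sitemap is not blocked by robots.txt.
        print(rp.can_fetch("Googlebot", "http://virtual-rent-to-own.com/sitemap.xml"))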

  • The topic ‘[Plugin: Google XML Sitemaps] URL restricted by robots.txt’ is closed to new replies.