[Plugin: Google XML Sitemaps] URL restricted by robots.txt
-
I searched the forum but did not see an answer to this question anywhere.
I have two sites and both have the sitemap plugin installed into the root directory.
virtual-rent-to-own.com/sitemap.xml
triadcreditconsultants.com/sitemap.xml
Both are getting 2 errors in Google Webmaster Tools: one under "In Sitemaps" that says "URL restricted by robots.txt", and one under "Restricted by robots.txt" that says the same.
On one site the error says it was "detected" on Nov 8, and on the other Nov 9.
I have used 3rd party tools to check the robots.txt files (in the root directories as well) and they said they were fine and allowing access to all robots.
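For what it's worth, a robots.txt file can also be sanity-checked locally with Python's urllib.robotparser. This is just a sketch; the robots.txt content below is my guess at a fully permissive file like the ones described, and the sitemap URL is taken from above:

```python
from urllib.robotparser import RobotFileParser

# Assumed robots.txt content: an empty Disallow permits all robots
robots_txt = """User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether Googlebot is allowed to fetch the sitemap URL
print(rp.can_fetch("Googlebot", "http://virtual-rent-to-own.com/sitemap.xml"))
# prints True
```

If this prints True but Webmaster Tools still reports the URL as restricted, the discrepancy is more likely stale crawl data on Google's side than a problem in the file itself.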
Under Site configuration: Crawler access, I get this:
"Allowed by line 2: Disallow: Detected as a directory; specific files may have different restrictions"

I have worked hard to solve this problem but have been unsuccessful. I don't know whether I just need to wait or whether I need to do something.
Any thoughts would be greatly appreciated.
- The topic ‘[Plugin: Google XML Sitemaps] URL restricted by robots.txt’ is closed to new replies.