Warning – Url blocked by robots.txt
- [ Moderator note: moved to Fixing WordPress. ]
Hi,
I’ve generated a sitemap.xml for each of my WordPress sites and submitted them in my Google Webmaster account. However, one of them is giving a warning: ‘URL blocked by robots.txt’. It says the issue count is 47 (the same as the number of pages in the site), but my robots.txt is not blocking them! I’ve checked <mysite>/robots.txt, and all it contains is:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Also, when I run the robots.txt tester on any of the individual pages in my site, they all show up as ‘Allowed’, even though they are reported as being blocked by the robots file.
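In case it helps with the diagnosis, a quick way to sanity-check the live robots.txt outside of the Search Console tester is Python’s urllib.robotparser (the example.com URLs below are just placeholders for my actual domain and pages):

# Fetch the live robots.txt and test a few URLs against it,
# independently of the Search Console tester.
# "example.com" stands in for my real domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# A couple of the pages flagged in the warning (placeholders).
for url in ("https://example.com/", "https://example.com/sample-page/"):
    print(url, "->", "Allowed" if rp.can_fetch("*", url) else "Blocked")

With only /wp-admin/ disallowed, that check should report every normal page as Allowed, which matches what the tester tells me.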
Anyone have any idea what’s going on here?
Many thanks,
WP.