• [ Moderator note: moved to Fixing WordPress. ]

    Hi,

    I’ve generated a sitemap.xml for each of my WordPress sites and submitted them in my Google Webmaster account. However, one of them is giving a warning: ‘URL blocked by robots.txt’. It reports an issue count of 47 (the same as the number of pages on the site), but my robots.txt is not blocking them! I’ve checked <mysite>/robots.txt, and all it says is

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Also, when I run the robots.txt Tester on any of the individual pages in my site, they all show up as ‘Allowed’, even though the warning says they are blocked by the robots file.
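    For what it’s worth, parsing those three rules locally agrees with the Tester. Here’s a minimal sketch using Python’s standard-library parser (example.com and the sample page URL are placeholders, not my real site):

    from urllib.robotparser import RobotFileParser

    # The exact rules served at <mysite>/robots.txt
    rules = """\
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # An ordinary page: no rule matches it, so the default (allow) applies.
    print(parser.can_fetch("Googlebot", "https://example.com/sample-page/"))  # True

    # The admin area is the only thing the Disallow rule blocks.
    print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/"))     # False

    # Caveat: Python applies rules in file order, so it would report
    # admin-ajax.php as blocked here, whereas Googlebot uses longest-match
    # precedence and allows it.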

    Anyone have any idea what’s going on here?

    Many thanks,
    WP.

  • Moderator Steven Stern (sterndata)

    (@sterndata)

    Volunteer Forum Moderator

    Give Google a couple of days to re-check your site. It’s probably relying on a cached copy of robots.txt it found before you made your site visible to search engines.
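    In the meantime, you can see exactly what your server is serving right now, as opposed to whatever copy Google cached earlier. A minimal sketch (example.com stands in for your site):

    import urllib.request

    # Fetch the live robots.txt; if it shows the three rules you posted,
    # the Search Console warning is just working from a stale cached copy.
    with urllib.request.urlopen("https://example.com/robots.txt") as resp:
        print(resp.read().decode("utf-8"))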

    Thread Starter whitephantom

    (@whitephantom)

    Ok, thanks Steve, I’ll keep an eye on it and see.

  • The topic ‘Warning – Url blocked by robots.txt’ is closed to new replies.