• We have just launched a website (www.gocar.ie). For the first day, the box asking search engines not to index the site was ticked – I have since changed the setting to allow search engines to index the site.

    However, I am having an issue with the robots.txt file in Google Webmaster Tools – no matter what I try, I get this message: “The page could not be crawled because it is blocked by robots.txt”.

    My robots.txt is as follows, but I don’t believe it to be the issue (a quick way to verify it is sketched after this post):

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Anybody got any ideas? Google Webmaster Tools is saying Googlebot is blocked from the site, and the sitemap has 800 issues (www.gocar.ie/sitemap.xml) despite the fact that there are only 32 pages in sitemap.xml.

    All help is appreciated – can’t get my head around this issue.
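
    For reference, here is a quick way to confirm what the live robots.txt actually allows, using only the Python standard library (a minimal sketch; the URLs are the ones from this thread):

    # Minimal sketch: fetch the live robots.txt and test whether Googlebot
    # may crawl the home page and an admin URL. Standard library only;
    # the URLs are taken from this thread.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.gocar.ie/robots.txt")
    rp.read()  # downloads and parses the file

    for url in ("https://www.gocar.ie/", "https://www.gocar.ie/wp-admin/"):
        allowed = rp.can_fetch("Googlebot", url)
        print(f"Googlebot {'may' if allowed else 'may NOT'} fetch {url}")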

  • You need to fix the sitemap, but the robots.txt status can take some time to change: Googlebot needs to revisit your website and re-check the robots.txt file before the result is updated in Google Webmaster Tools (one extra check you can run in the meantime is sketched below this reply).

    How have you created your sitemap?
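
    While you wait, one thing worth ruling out (not something raised in the thread, just a suggestion): make sure the live robots.txt is actually served with an HTTP 200 status, since a fetch error can also leave Googlebot treating the site as blocked. A minimal sketch using only the Python standard library:

    # Minimal sketch: confirm robots.txt is reachable and returns HTTP 200,
    # and print what it serves. Standard library only; URL from the thread.
    import urllib.request

    req = urllib.request.Request(
        "https://www.gocar.ie/robots.txt",
        headers={"User-Agent": "Mozilla/5.0 (compatible; robots-check)"},
    )
    with urllib.request.urlopen(req) as resp:
        print("Status:", resp.status)       # expect 200
        print(resp.read().decode("utf-8"))  # should match the rules above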

    Thread Starter menton (@menton)

    I have created it at https://www.gocar.ie/sitemap.xml using the Google Sitemap Generator plugin.

    You can check to see if it validates – I have just run it through two validators and they say it is OK. Again, you will have to keep checking back, as you will have to wait for Googlebot to re-check things (a quick local check is also sketched below).

    Maybe delete it and resubmit again just in case.
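
    If you want a quick local check as well, the short script below downloads the sitemap, confirms it parses as XML, and counts the entries it contains (a minimal sketch using only the Python standard library; the sitemap URL is the one from this thread):

    # Minimal sketch: download the sitemap, confirm it is well-formed XML,
    # and count its entries. Handles both a plain sitemap (<url> entries)
    # and a sitemap index (<sitemap> entries). Standard library only.
    import urllib.request
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen("https://www.gocar.ie/sitemap.xml") as resp:
        root = ET.fromstring(resp.read())  # raises ParseError if malformed

    urls = root.findall("sm:url", NS)
    subs = root.findall("sm:sitemap", NS)
    print("Root element:", root.tag)
    print(f"<url> entries: {len(urls)}, <sitemap> (index) entries: {len(subs)}")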

  • The topic ‘Googlebot is blocked from Website’ is closed to new replies.