• Hello,
    When I recently launched a new WordPress site, I had the Search Engine Visibility option ticked to discourage search engines from indexing the site.
    A few hours after launching I unticked this option.

    I uploaded a sitemap to Webmaster Tools; however, it’s saying that the URLs are being blocked. The contents of the robots.txt file are:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    I noticed wp-content is not in there, so is my robots.txt file OK? Will Google eventually index the URLs, and it’s just taking time?
    OR
    Is this robots.txt file wrong? If so, how would I fix it?

    Thank you

  • Are you using any server-side caching methods/plugins (including at the host account/setup level)?

    Hi. I’m having precisely the same problem as whatachamp, with the same robots.txt content. I don’t believe I have any issues at the server end, as I was running another site there with no issues. The only indication I’m getting is Google saying the site is blocked at the directory level.

    Have you tried using the Fetch as Google tool? It will get your page indexed immediately if there is no problem.

    Two other things regarding my earlier query. I would greatly appreciate it if anyone can advise.

    1. When I do a blocked URLs search in Google Webmaster Tools, it says that Googlebot is blocked from my whole site and, in the test window, it says my robots.txt file contains:

    User-agent: *
    Disallow: /

    It then says that this command blocks the Googlebot.

    However, when I click the link on the blocked URLs page to my actual robots.txt file, it comes up with the following, which should be okay:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    I have installed the wp robots.txt plugin. It is also reading the second set of rules above and, when I experiment with it, it varies the contents of the robots.txt file as it should.

    2. Just to complicate matters a bit more, I’ve searched and can’t actually find a robots.txt file in my directory, even though both Google and the wp robots.txt plugin seem to be reading it.

    I’m stumped! Would really value any advice. Thanks.

    kulwant2sin,

    Sorry, I just saw your response. I have tried Fetch as Google; it comes up as successful, but the blocked URLs issue remains.

    Thanks

    First of all, Google takes time to refresh the data in Webmaster Tools, roughly 24 to 48 hours. If your site is new, there is a very good chance the bot is not visiting it frequently, so it will take some time for Google to refresh the copy of robots.txt cached on its server. The Disallow: / rule it is showing is most likely the old robots.txt that WordPress serves while “Discourage search engines” is ticked, and Google simply hasn’t re-fetched the new one yet.

    Secondly, in Webmaster Tools you can test corrections to robots.txt: check there whether your URLs are allowed or blocked, then copy the corrected robots.txt from Webmaster Tools to your site’s root directory. You can put the .txt file directly in the home directory (site.com/robots.txt) or use a plugin. Hope that helps.
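
    For example, a minimal static robots.txt placed in the site root could look like this (the sitemap URL is just a placeholder; replace it with your own):

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Sitemap: http://example.com/sitemap.xml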

    Thread Starter whatachamp

    (@whatachamp)

    Hi wayneiac,

    My sites did eventually get recognised by Google. It took about 4-5 days. It could be my imagination, but I’m sure it used to be quicker. The whole time, the thumbnail image of the site in Webmaster Tools showed a picture of the old site. Then, once the sitemap URLs were indexed, the thumbnail also refreshed to show the new site.

    I’ve noticed that in recent months new or re-launched sites take quite a while to be indexed.

    In my WordPress install there isn’t a robots.txt file as such; it’s created dynamically by WordPress, I believe.
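
    As I understand it, WordPress answers requests for /robots.txt itself via do_robots() and passes the generated rules through the 'robots_txt' filter before output, which is why there is no physical file. A minimal sketch, assuming you want to append a sitemap line from a theme’s functions.php or a small plugin (the function name and sitemap path are just illustrative):

    <?php
    // Hypothetical example: append a Sitemap line to WordPress’s
    // dynamically generated robots.txt via the 'robots_txt' filter.
    function whatachamp_robots_txt( $output, $public ) {
        // $public is the 'blog_public' option: '0' while "Discourage
        // search engines" is ticked, '1' once the site is visible again.
        if ( '0' !== $public ) {
            // The sitemap path is an assumption; adjust it to your setup.
            $output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";
        }
        return $output;
    }
    add_filter( 'robots_txt', 'whatachamp_robots_txt', 10, 2 );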

    Thanks for your responses, kulwant2sin and whatachamp. Hopefully all I need to do is give it time. I noticed this morning that, while the URLs are still showing up as blocked in Webmaster Tools, my site is slowly starting to appear in Google’s results.

    I really appreciate your help and will let you know how things go over the next few days.

    Wayne

  • The topic ‘Search Engine Visibility allowed – still being blocked’ is closed to new replies.