• charmedattic

    (@charmedattic)


    Hi,

    I’m doing a soft launch of my company’s corporate blog, so it should be visible to normal visitors only. I understand that selecting the Privacy setting “I would like to block search engines, but allow normal visitors” causes <meta name='robots' content='noindex,nofollow' /> to be generated in the <head> </head> section of the site’s source (provided the theme calls wp_head), which tells search engine spiders to ignore the site.
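
    (For context, here is a minimal sketch of the relevant part of a theme’s header.php; the markup around the wp_head() call is purely illustrative:)

        <!-- sketch of a theme's header.php -->
        <head>
            <title><?php wp_title(); ?></title>
            <?php
            // wp_head() is the hook WordPress uses to inject extra tags;
            // with "block search engines" enabled it outputs:
            //   <meta name='robots' content='noindex,nofollow' />
            wp_head();
            ?>
        </head>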

    Hence, is there a need to manually create a robots.txt file in the root folder? Or will one be created automatically once we turn on that setting?

    How will the robots.txt be affected once I decide to open my blog to search engines?

  • Moderator James Huff

    (@macmanx)

    No, a virtual robots.txt file will be created, so there’s no need to make a physical one. Once you deactivate the privacy settings, you can safely create your own robots.txt file.
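
    For illustration (the exact output varies slightly by WordPress version), the virtual file served at /robots.txt contains something like this while search engines are blocked:

        User-agent: *
        Disallow: /

    Once you allow search engines again, that blanket Disallow: / rule goes away.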

    Thread Starter charmedattic

    (@charmedattic)

    So once I deactivate the privacy setting, creating my own robots.txt file wouldn’t clash with the virtual file, I assume?

    Moderator James Huff

    (@macmanx)

    That’s correct. The virtual robots.txt checks for the existence of a physical robots.txt before enabling itself.
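
    If you only want small tweaks, you can also keep using the virtual file and adjust it through the robots_txt filter (assuming your WordPress version provides that hook). A minimal sketch, where the function name and the extra rule are just examples:

        <?php
        // Only affects the *virtual* robots.txt: a physical file in the web
        // root is served directly by the web server and bypasses WordPress.
        function my_extra_robots_rules( $output, $public ) {
            if ( $public ) {
                // Site is open to search engines: append an example rule.
                $output .= "Disallow: /private/\n";
            }
            return $output;
        }
        add_filter( 'robots_txt', 'my_extra_robots_rules', 10, 2 );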

    Thread Starter charmedattic

    (@charmedattic)

    Ok, thank you for your help!

    Moderator James Huff

    (@macmanx)

    You’re welcome!

    I have created and placed a robots.txt file in my root directory, and it is not being recognized. I had the privacy setting set to block search engines; after changing the setting to allow engines, Google’s bots are still blocked (verified using Webmaster Tools). The robots.txt file I created is not overriding the virtual file, nor is the virtual file updating to reflect the allow-all setting. Please help.

  • The topic ‘Privacy Settings – robots.txt’ is closed to new replies.