• Hi,

    Google Webmaster Tools cannot access my sitemap.xml file. It says the file is blocked by my robots.txt, which has the content:
    User-agent: *
    Disallow: /
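    (For anyone wondering why this blocks the sitemap: `Disallow: /` under `User-agent: *` disallows every path for every crawler. A minimal sketch with Python's standard `urllib.robotparser`, using example.com as a placeholder domain:)

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content reported by Google Webmaster Tools.
rules = ["User-agent: *", "Disallow: /"]

parser = RobotFileParser()
parser.parse(rules)

# "Disallow: /" blocks every URL on the host for every crawler,
# including the sitemap itself (example.com is a placeholder).
print(parser.can_fetch("Googlebot", "https://example.com/sitemap.xml"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/"))             # False
```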
    But the file it shows is not the one in my root directory, which has completely different content.

    I tried to upload a new robots.txt file, but with no success.

    Under Settings->Reading, the Search Engine Visibility option to block search engines from visiting this site is NOT checked.

    I have Google XML Sitemaps installed, and I tried enabling/disabling its option “Add sitemap URL to the virtual robots.txt file” (to try to force it to create a new robots.txt), but the robots.txt does not change when I check it with Google Webmaster Tools.

    Is it WordPress itself that creates the virtual robots.txt, or is it a plugin?
    I have tried deactivating the following plugins: Wordfence, Jetpack, W3 Total Cache, Google Analytics, and Google XML Sitemaps, but with the same result: a virtual robots.txt is created and blocks my sitemap.xml file from being read by Google Webmaster Tools.

    Please help.

Viewing 2 replies - 1 through 2 (of 2 total)
  • Thread Starter peterbredahldam

    (@peterbredahldam)

    Update: I just went to my web host’s online file manager and tried to access the robots.txt from there. At first I can see the physical file, but after a second it is replaced with the virtual one:

    User-agent: *
    Disallow: /

    I don’t know if that helps.

    Thread Starter peterbredahldam

    (@peterbredahldam)

    I think I found the solution.
    Google Webmaster Tools was looking at https://www.domain.com and not domain.com.
    I still find it strange that I get two different robots.txt files, since both point to the same domain.
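    (It is actually expected behaviour: crawlers fetch robots.txt once per scheme+host, so https://www.domain.com and http://domain.com are treated as separate origins and can serve entirely different robots.txt files unless one redirects to the other. A small stdlib sketch, with domain.com as a placeholder:)

```python
from urllib.parse import urlsplit

# Crawlers request robots.txt separately for each scheme+host
# combination, so these two URLs (domain.com is a placeholder)
# resolve to two distinct robots.txt locations.
for url in ("https://www.domain.com/sitemap.xml",
            "http://domain.com/sitemap.xml"):
    parts = urlsplit(url)
    print(f"{parts.scheme}://{parts.netloc}/robots.txt")
```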

  • The topic ‘My robots.txt is overwritten – problem with Google Webmaster tool’ is closed to new replies.