• For the past month, my robots.txt has had a caching problem that is leading to Google blocking my website. The robots.txt file is there, yet even after deleting every type of cache, creating cache exceptions, and even changing hosting, the problem still persists.

    I even changed the permissions of the robots.txt file to read-only (r--r--r--), but it still has the caching problem. I don’t know what else to try. I also tried disabling all the plugins and themes, but I got the same result.

  • What’s the exact nature of this “caching problem”? A stale cache that’s not getting purged?

    While I see no reason why a robots.txt file should be cached in the first place, it doesn’t change often enough to need purging after every site update, even if it has been cached.

    That’s why I’m wondering what sort of “caching problem” you’re having.

    And what sort of caching are you doing?

    Thread Starter sarfrazk9 (@sarfrazk9)

    I don’t know why the robots.txt file appears different in different browsers and in incognito mode, even though it is a static file on my site. This has led to Googlebot seeing an old version of my robots.txt, which is blocking the whole site. I have no idea what is going on.
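
    One way to narrow down which layer is serving the stale copy is to fetch the live file and inspect the response headers (for example cache-control, age, or cf-cache-status). A minimal sketch in PHP, assuming example.com is replaced with the real site address:

        <?php
        // Fetch robots.txt and dump the response headers so any caching layer
        // (CDN, proxy, or plugin) shows up via cache-control / age / cf-cache-status.
        $url = 'https://example.com/robots.txt'; // hypothetical URL, replace with the real site
        $headers = get_headers($url, true);      // associative array of response headers
        print_r($headers);
        echo "\n--- body as currently served ---\n";
        echo file_get_contents($url);

    Running this from a machine outside the site (or comparing it against a request made directly to the origin, bypassing the CDN) should show whether the stale content comes from the host or from a layer in front of it.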

    You wouldn’t be using Cloudflare, by any chance?
    Cloudflare caches robots.txt and it is, apparently, impossible to control. I tried creating a robots.php file with a rewrite in .htaccess from robots.txt to it, and adding no-cache headers to it. I’ve also tried changing the caching with a FilesMatch directive in .htaccess, all to no avail. And I even have CF set to “respect existing headers”, but it doesn’t. (A rough sketch of that setup follows this reply.)
    This is one of the reasons that I’ve about had it with CF.
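
    For reference, a minimal sketch of the rewrite approach described above, assuming Apache with mod_rewrite and mod_headers enabled; the file names and directives mirror what was attempted, not a confirmed fix:

        # .htaccess: serve robots.txt from a PHP script and mark it uncacheable
        RewriteEngine On
        RewriteRule ^robots\.txt$ robots.php [L]

        <FilesMatch "robots\.(txt|php)$">
            Header set Cache-Control "no-cache, no-store, must-revalidate"
            Header set Pragma "no-cache"
            Header set Expires "0"
        </FilesMatch>

        <?php
        // robots.php: emit the robots rules with explicit no-cache headers
        header('Content-Type: text/plain');
        header('Cache-Control: no-cache, no-store, must-revalidate');
        header('Pragma: no-cache');
        header('Expires: 0');
        echo "User-agent: *\n";
        echo "Disallow:\n"; // empty Disallow allows crawling of the whole site

    Whether Cloudflare actually honours those headers for robots.txt is exactly the open question in this thread; the snippet only documents the attempt.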

  • The topic ‘Robots.txt caching problem’ is closed to new replies.