  • There is a default list of excluded User Agents, and I believe ‘bot’ is on that list. You generally don’t want spiders to receive a cached copy; you want them to pull the newest version to index. So most caching systems are set up to recognize bots and bypass the cache for them (a rough sketch of that bypass logic follows the replies below).

  • Plugin Contributor Frederick Townes (@fredericktownes):

    Also, some tools ignore caching headers in order to work.
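
Taken together, the replies describe a User-Agent check that runs before the page cache is consulted. Below is a minimal sketch of that bypass logic in Python, assuming a simple substring reject list; it is not W3 Total Cache's actual implementation, and all names (REJECTED_USER_AGENT_SUBSTRINGS, should_bypass_cache, handle_request, render_fresh_page) are illustrative assumptions.

```python
# Minimal sketch (not W3 Total Cache's code) of User-Agent-based cache bypass:
# match the request's User-Agent against a reject list and, on a match,
# serve a freshly rendered page instead of a cached copy.

REJECTED_USER_AGENT_SUBSTRINGS = [
    "bot",      # covers Googlebot, Bingbot, and most other crawlers
    "crawler",
    "spider",
    "slurp",
]


def should_bypass_cache(user_agent: str) -> bool:
    """Return True when the User-Agent looks like a bot or spider."""
    ua = user_agent.lower()
    return any(token in ua for token in REJECTED_USER_AGENT_SUBSTRINGS)


def render_fresh_page(url: str) -> str:
    """Placeholder for the real page-generation step."""
    return f"<html><!-- freshly rendered {url} --></html>"


def handle_request(user_agent: str, cache: dict, url: str) -> str:
    """Serve from the page cache unless the client matches the reject list."""
    if should_bypass_cache(user_agent):
        return render_fresh_page(url)         # bots always get a fresh render
    if url not in cache:
        cache[url] = render_fresh_page(url)   # populate cache for normal visitors
    return cache[url]


if __name__ == "__main__":
    cache: dict = {}
    # A crawler User-Agent matches "bot" and skips the cache entirely.
    print(handle_request("Mozilla/5.0 (compatible; Googlebot/2.1)", cache, "/about"))
    # A regular browser populates and then reuses the cached copy.
    print(handle_request("Mozilla/5.0 (Windows NT 10.0)", cache, "/about"))
```

The key point, as in the first reply, is that the check is a substring match, so a single entry such as "bot" catches most crawler User-Agent strings, and any match skips both reading from and writing to the cache.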

  • The topic ‘Is it a header that tells W3 to cache?’ is closed to new replies.