• Resolved tavakal4devs

    (@tavakal4devs)


    Hi, is there any option to prevent crawlers from caching my pages?
    Or are there any filters that I can use?

  • Plugin Contributor Marko Vasiljevic

    (@vmarko)

    Hello @tavakal4devs

    Thank you for reaching out; I am happy to assist you with this.
    Can you please expand on this question? I am not sure which crawlers you are referring to.
    The pages are cached when they are visited, and if you are referring to search engine crawlers, they should not be caching the pages.
    Thanks!

    Thread Starter tavakal4devs

    (@tavakal4devs)

    Yes, I am speaking about search engine bots.
    If they are not caching, then another kind of bot is.
    Is there any filter that I can use to abort caching?

    Plugin Contributor Marko Vasiljevic

    (@vmarko)

    Hello @tavakal4devs

    The only way is to exclude specific pages from being cached by adding them to the Performance > Page Cache > Advanced > “Never cache the following pages” field, for example:
    /some-page/
    I hope this helps!
    Thanks!
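    For reference, that field takes one entry per line, so a list of exclusions might look like the following (these paths are purely illustrative, not taken from this thread):

        /some-page/
        /another-page/
        /private-area/

    Requests matching an entry should be served without being written to the page cache.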

    Thread Starter tavakal4devs

    (@tavakal4devs)

    No, I have a ton of pages, so this is not helpful; my disk space is running out.

    So, there is no filter?

    Plugin Contributor Marko Vasiljevic

    (@vmarko)

    Hello @tavakal4devs

    As I’ve mentioned before, the page is cached once visited.
    There is no filter to prevent page visits. The only way is to exclude a page from being cached.
    Also, adding define('DONOTCACHEPAGE', true); to a specific template will ensure that pages using that template are not cached.
    Thanks!
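    As a minimal sketch of what that might look like in practice, assuming a custom page template named page-landing.php (the file name, template name, and comments are illustrative, not from this thread), the constant can be defined near the top of the template:

        <?php
        /* Template Name: Landing (illustrative example) */

        // Defining DONOTCACHEPAGE tells W3 Total Cache (and other caching
        // plugins that honor this constant) not to store this page in the
        // page cache.
        if ( ! defined( 'DONOTCACHEPAGE' ) ) {
            define( 'DONOTCACHEPAGE', true );
        }

        get_header();
        ?>
        <!-- template markup goes here -->
        <?php
        get_footer();

    The constant is typically checked when the generated output is about to be stored, so defining it anywhere before the template finishes rendering should be enough; it does not, however, stop crawlers from visiting the page in the first place.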

  • The topic ‘how to prevent crowlers from caching my page’ is closed to new replies.