• Resolved Ben

    (@benrecords)


    Hi,

    I’m facing a problem with LiteSpeed Cache. After I make a change on any page, the first refresh gives me the following cache-related headers:

    Cache-Control: no-store, no-cache

    X-Litespeed-Cache-Control: no-cache

    X-Litespeed-Tag: b90_HTTP.200

    So the page is not cached yet.

    I refresh the page again and it gives me:

    X-Litespeed-Cache-Control: public,max-age=604800

    X-Litespeed-Tag: b90_HTTP.200,b90_front,b90_URL./,b90_F,b90_Po.11,b90_PGS,b90_,b90_MIN.673fe16fd8a74456bed2bdcf64cdfb1f.js

    X-Lsadc-Cache: miss

    I then refresh a third time and I finally get X-Lsadc-Cache: hit.

    Why do I need to refresh twice every time to get my pages cached?
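    This double-refresh behavior is easier to observe without a browser (and without browser-cache noise). A minimal sketch, assuming curl is available; the URL is a placeholder for your own page:

```shell
# Hedged sketch: inspect only the cache-related response headers of a page,
# so each refresh's cache state is visible without browser interference.

cache_headers() {
  # keep only the cache-related headers from a full header dump
  grep -iE '^(cache-control|x-litespeed-cache-control|x-litespeed-tag|x-lsadc-cache):'
}

check_cache() {
  # -s silent, -I send a HEAD request and print the response headers only
  curl -sI "$1" | cache_headers
}

# Usage (placeholder URL) -- run it twice in a row; on a warmed page the
# second call should show "X-Lsadc-Cache: hit" instead of "miss":
#   check_cache "https://example.com/"
```

Running it repeatedly after a purge makes the no-cache, miss, hit progression described above easy to reproduce.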

    The same problem occurs with the crawler: my pages get crawled with the blue indicator. Normally that would mean my pages have been cached correctly.

    Unfortunately, once I visit my web pages, I still need to refresh each page one more time before it is actually cached.

    What am I doing wrong?

    Thank you for your help

    • This topic was modified 1 year ago by Ben.
Viewing 15 replies - 1 through 15 (of 15 total)
  • Plugin Support qtwrk

    (@qtwrk)

    are you using Divi, Elementor, or that kind of page builder?

    Thread Starter Ben

    (@benrecords)

    Yes, I’m using Divi

    Thread Starter Ben

    (@benrecords)

    I added add_filter( 'litespeed_const_DONOTCACHEPAGE', '__return_false' ); and it seems to work for Divi (as I read in another topic).

    When I update my page, I can now get it cached with a single refresh instead of two.
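    For reference, one way to keep that filter independent of the theme is to drop it into a small must-use plugin. A hedged sketch, written from the shell; the wp-content path and file name are assumptions to adapt to your install:

```shell
# Hedged sketch: install the litespeed_const_DONOTCACHEPAGE filter as a
# must-use plugin so it survives theme updates. Adjust MU_DIR to your
# install; the file name is arbitrary.
MU_DIR="wp-content/mu-plugins"
mkdir -p "$MU_DIR"
cat > "$MU_DIR/litespeed-divi-cache-fix.php" <<'PHP'
<?php
// Let LiteSpeed cache pages even when the page builder sets DONOTCACHEPAGE.
add_filter( 'litespeed_const_DONOTCACHEPAGE', '__return_false' );
PHP
```

Must-use plugins load automatically on every request, so no activation step is needed.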

    But I’m still facing the issue of the crawler not doing its job. As soon as it starts, it aborts, saying the cache is already warmed up:

    03/15/24 01:15:03.425 [109.234.166.218:45464 1 faA] ??? Init w/ CPU cores=80
    03/15/24 01:15:03.425 [109.234.166.218:45464 1 faA] ??? Take over lane as lane is free: /home/nona3840/www.dptelec.fr/wp-content/litespeed/crawler/meta.data.pid
    03/15/24 01:15:03.438 [109.234.166.218:45464 1 faA] ??? ......crawler started......
    03/15/24 01:15:03.439 [109.234.166.218:45464 1 faA] ??? Cron abort: cache warmed already.
    03/15/24 01:15:03.439 [109.234.166.218:45464 1 faA] ??? Release lane
    03/15/24 01:15:03.442 [109.234.166.218:45464 1 faA] ?? [Tag] Add --- HTTP.200

    In fact it is not, and the page I updated needs to be refreshed manually to actually be cached.

    Why is the crawler not visiting the updated, cache-purged page?

    Plugin Support qtwrk

    (@qtwrk)

    please provide the report number

    you can get it in toolbox -> report -> click “send to LiteSpeed”

    and also a full screenshot on crawler summary page

    Thread Starter Ben

    (@benrecords)

    Here is the Crawler summary screenshot https://i.postimg.cc/05LHhcXC/image.png

    The report number is: NWEAFHWZ

    Plugin Support qtwrk

    (@qtwrk)

    the image is too blurry, I can’t read anything

    please use another image host, for example https://prnt.sc/

    Thread Starter Ben

    (@benrecords)

    You can just click on the “blurry” image and it will be displayed full screen and sharp.

    Plugin Support qtwrk

    (@qtwrk)

    please try lowering the crawler interval from 302400 to 600, then wait 10 minutes and run it again.

    Thread Starter Ben

    (@benrecords)

    I made the change and waited for 20 minutes.

    The first 3 crawlers had run successfully. I ran the 4th manually and here is the updated screenshot: https://prnt.sc/e_uNL8Qx8pu7

    What is weird is that “Current sitemap crawl started at” refers to the first crawl only. It says the next complete sitemap crawl will start 10 min later, but when the time comes, the “Current sitemap crawl started at” value doesn’t change.

    In the logs I still see the same data:

    03/18/24 17:55:02.875 [109.234.166.218:27004 1 U8T] [Router] parsed type: crawler => LiteSpeed\Router::verify_type()@475 => LiteSpeed\Task::async_litespeed_handler()@79 => WP_Hook->apply_filters(,ARRAY)@324
    03/18/24 17:55:02.875 [109.234.166.218:27004 1 U8T] ? type=crawler
    03/18/24 17:55:02.875 [109.234.166.218:27004 1 U8T] ??? ------------async-------------start_async_handler
    03/18/24 17:55:02.893 [109.234.166.218:27004 1 U8T] ??? Init w/ CPU cores=80
    03/18/24 17:55:02.893 [109.234.166.218:27004 1 U8T] ??? Take over lane as lane is free: /home/nona3840/www.dptelec.fr/wp-content/litespeed/crawler/meta.data.pid
    03/18/24 17:55:02.906 [109.234.166.218:27004 1 U8T] ??? ......crawler started......
    03/18/24 17:55:02.907 [109.234.166.218:27004 1 U8T] ??? Cron abort: cache warmed already.
    03/18/24 17:55:02.907 [109.234.166.218:27004 1 U8T] ??? Release lane
    03/18/24 17:55:02.908 [109.234.166.218:27004 1 U8T] ?? [Tag] Add --- HTTP.200
    03/18/24 17:55:02.908 [109.234.166.218:27004 1 U8T] [Core] CHK html bypass: miss footer const
    03/18/24 17:55:02.909 [109.234.166.218:27004 1 U8T] [Ctrl] not cacheable before ctrl finalize
    03/18/24 17:55:02.909 [109.234.166.218:27004 1 U8T] [Router] get_role:
    03/18/24 17:55:02.909 [109.234.166.218:27004 1 U8T] [Vary] role id: failed, guest

    And then the Crawler summary page remains in the “Blue” stage.

    I then edited 1 page and a Purge All was triggered according to my purge settings.

    The next sitemap crawl got everything cached again.

    But when I don’t update any page or post, the crawling state is always the same (blue status). The crawler aborts because the pages are actually cached, but it doesn’t update the crawler status.

    Is this the normal behavior?
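    As a stop-gap while the crawler misbehaves, its warm-up can be approximated by hand: pull the URLs out of the sitemap and request each one once. A hedged sketch; the sitemap URL is a placeholder, and note this only warms the cache variation that matches curl’s own request headers (plain guest, no WebP):

```shell
# Hedged sketch: extract <loc> URLs from an XML sitemap and fetch each
# once to warm the cache for the default (guest) variation.

extract_locs() {
  # pull the text inside each <loc>...</loc> element
  grep -oE '<loc>[^<]+</loc>' | sed -E 's|</?loc>||g'
}

warm_from_sitemap() {
  curl -s "$1" | extract_locs | while read -r url; do
    # print the HTTP status code for each warmed URL
    curl -s -o /dev/null -w "%{http_code} $url\n" "$url"
  done
}

# Usage (placeholder URL):
#   warm_from_sitemap "https://example.com/sitemap.xml"
```

This is only a rough substitute for the built-in crawler, which handles the separate guest-mode and WebP variations itself.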

    • This reply was modified 1 year ago by Ben.
    Plugin Support qtwrk

    (@qtwrk)

    Cron abort: cache warmed already.

    this message indicates the crawler’s last run has not yet reached the interval time; try setting it to 61

    Thread Starter Ben

    (@benrecords)

    I made the change. Here are current settings : https://prnt.sc/dvu0w2vLK577

    What is weird is that the crawlers wait 5 minutes before running despite the 60-second setting.

    Here is my crawler status for the last minutes : https://prnt.sc/dA581a85F4DI

    In the debug logs I no longer see any “Cron abort: cache warmed already” messages, but I’m still wondering why my crawler statuses remain blue: https://prnt.sc/HchPK2kDIO_I

    Last question: is it normal that the crawler resets the cache every time it restarts the global sitemap crawl, even if no changes were made since the last sitemap crawl?

    Unfortunately, with these settings, my website gets uncached for varying durations:

    • Crawler #1: Guest – WebP ==> all cache varies get uncached (the sitemap is truncated and regenerated) and #1 gets cached at T
    • Crawler #2: Guest – WebP (Guest Mode) ==> gets uncached at T (for 5 min) and cached again at T+5 min
    • Crawler #3: Guest ==> gets uncached at T (for 10 min) and cached again at T+10 min
    • Crawler #4: Guest (Guest Mode) ==> gets uncached at T (for 15 min) and cached again at T+15 min
    • Crawler #1 gets uncached at T+20 min and the cycle goes on

    • This reply was modified 1 year ago by Ben.
    Plugin Support qtwrk

    (@qtwrk)

    please go to the sitemap settings, set “Drop Domain” to OFF, then regenerate the sitemap list and run it again.

    Thread Starter Ben

    (@benrecords)

    I made the change and it seems to work better now : https://prnt.sc/cRTaMDzNyANG

    I also get a HIT every time for each crawler, no more MISSes:

    03/20/24 19:20:03.656 [109.234.166.218:15579 1 AcT] ??? [status] ? Hit [url] https://www.dptelec.fr/climatisation/choisir-sa-climatisation/
    03/20/24 19:20:03.656 [109.234.166.218:15579 1 AcT] ??? [status] ? Hit [url] https://www.dptelec.fr/climatisation/entretien-de-votre-climatisation/
    03/20/24 19:20:03.656 [109.234.166.218:15579 1 AcT] ??? [status] ? Hit [url] https://www.dptelec.fr/electricite/securite-electrique-10-conseils-maison/
    03/20/24 19:20:03.656 [109.234.166.218:15579 1 AcT] ??? [status] ? Hit [url] https://www.dptelec.fr/electricite/renover-son-installation-electrique/
    03/20/24 19:20:03.656 [109.234.166.218:15579 1 AcT] ??? [status] ? Hit [url] https://www.dptelec.fr/

    I guess it’s OK now?

    Plugin Support qtwrk

    (@qtwrk)

    looks alright

    Thread Starter Ben

    (@benrecords)

    Thanks a lot for your help !

  • The topic ‘Refresh twice to get page cached’ is closed to new replies.