• Resolved kylie12

    (@kylie12)


    I upgraded from 3.3.1 to the latest version, and I found a change in the cache crawler settings:
    refreshing the sitemap now takes longer. Because I use the free Cloudflare plan, an error 524 occurs whenever the execution time exceeds 100 seconds, which leaves the sitemap list incomplete every time. I did not have this problem in version 3.3.1; refreshing the sitemap list was faster and always completed.

    How to solve this problem?

Viewing 7 replies - 1 through 7 (of 7 total)
  • Plugin Support qtwrk

    (@qtwrk)

    Hi,

    Are you running the crawler manually, or was it triggered by a cron job?

    I always see the crawler hit a timeout on manual runs. With or without Cloudflare, there is always a timeout at some point that needs to be extended.

    Best regards,

    Thread Starter kylie12

    (@kylie12)

    I’m not talking about running the crawler. I’m talking about “Refresh the Sitemap”. I think I made that clear in my first post.

    Plugin Support qtwrk

    (@qtwrk)

    my bad

    How long does the sitemap take to load if you visit it manually in a browser?

    Also, please provide the report number; you can get it in Toolbox -> Report -> click “Send to LiteSpeed”.

    Best regards,
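The manual load-time check suggested above can also be scripted. Below is a minimal Python sketch (not part of the plugin) that times a full fetch of a sitemap URL; the URL in the usage comment is a placeholder, not the poster's actual site:

```python
# Time how long a sitemap URL takes to fully load, to see whether it
# exceeds Cloudflare's ~100-second free-plan limit (error 524).
import time
import urllib.request

def fetch_seconds(url: str, timeout: float = 120.0) -> float:
    """Fetch `url` and return the elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # drain the body so transfer time is included
    return time.perf_counter() - start

# Example (hypothetical URL):
# print(fetch_seconds("https://example.com/wp-sitemap.xml"))
```

If this routinely reports more than 100 seconds, the 524 on refresh is expected.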

    Thread Starter kylie12

    (@kylie12)

    I can’t confirm exactly how long it takes, but it is always more than 100 seconds, causing error 524.

    I want to know one thing: does “Refresh the Sitemap” stop as soon as the browser is closed, or is it the PHP execution time limit that stops it?

    If it is PHP, then in theory I can increase the execution time to solve this problem, but I don’t know exactly how “Refresh the Sitemap” works.

    Plugin Support qtwrk

    (@qtwrk)

    Hi,

    You can find it here:

    https://github.com/litespeedtech/lscache_wp/blob/30b85ae31fe407c2eb4f8dfa6c94e86166a16a98/src/crawler-map.cls.php#L453

    It’s done by the WP function wp_remote_get().

    Well, if you cannot access the sitemap manually, then neither will the crawler.

    The PHP execution time is one thing, but Cloudflare has a hard limit on connection timeout.

    Best regards,
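On the PHP side of that distinction: if PHP's max_execution_time really were the binding limit, it could be raised in php.ini (the value below is only an example), but as noted above this does nothing about Cloudflare's own hard cut-off around 100 seconds.

```ini
; php.ini — example only: raise the PHP script execution limit
max_execution_time = 300
```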

    Thread Starter kylie12

    (@kylie12)

    Sorry, I am not familiar with the code, so I don’t know how to use it. So when you press “Refresh Sitemap” and then close the browser, does the refresh of the sitemap list stop?

    And it has nothing to do with the PHP execution time, right?

    Plugin Support qtwrk

    (@qtwrk)

    This can be limited by many things:

    the connection timeout between your browser and CF, the connection timeout between CF and your origin server, the connection timeout between the web server and the PHP process, the PHP max execution time, maybe also the max query time on the database, etc.

    Among all of them, whichever is the shortest takes precedence over all the others.

    That said, if your browser-to-CF connection has a 100-second timeout, then even if you set all the other timeouts to 200 seconds, the connection will still be killed at the 100th second.
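The "shortest timeout wins" rule above can be illustrated with a toy calculation. The 100-second figure is Cloudflare's free-plan limit from this thread; the other values are made up for the example:

```python
# Each hop in the request path has its own timeout; the connection
# dies at the smallest one, no matter how large the others are.
timeouts = {
    "browser <-> Cloudflare": 100,   # CF free plan kills at ~100 s (error 524)
    "Cloudflare <-> origin": 200,    # assumed example value
    "webserver <-> PHP": 300,        # assumed example value
    "PHP max_execution_time": 200,   # assumed example value
}
effective = min(timeouts.values())
print(effective)  # prints 100
```

So raising PHP's limit alone cannot help while the 100-second hop remains in the chain.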

  • The topic ‘The latest version of the cache crawler problem’ is closed to new replies.