• This might not be a problem for most, but we went from about 3,000 files to more than 250,000, which is our shared hosting maximum. Exceeding the file count limit caused all sorts of problems. Deleting the W3 Total Cache plugin and manually deleting its cache files solved our problems.

Viewing 2 replies - 1 through 2 (of 2 total)
  • Plugin Contributor Marko Vasiljevic

    (@vmarko)

    Hello @timecheck00

    First of all, thank you for the review.
    This question has been asked a number of times, and it has been answered every time, so with a little bit of searching you would find the correct way to remedy this problem.
    I am not seeing a support topic opened by you regarding this, so I am guessing you did not try to find the answer.
    When using shared hosting, the only available caching method is Disk. This means you enabled Page Cache (and, I am guessing, everything else like Database Cache and Object Cache) using the Disk caching method. If you had searched and googled a bit, you would have found that caching every single module to disk on shared hosting is not recommended.

    If you are caching everything to disk, the number of files will naturally increase over time, and shared hosting with limited resources will, of course, have something to say about this.
    So the problem is how you configured W3 Total Cache. If you open a new support topic and share your server information and the website URL, I would be more than happy to assist you with the configuration for your website’s optimal performance.

    Thanks!

    Thread Starter timecheck00

    (@timecheck00)

    Thank you. I appreciate the informed reply. I rather naively installed W3 Total Cache based on someone’s recommendation that a cache manager would help my performance, and didn’t do any research other than the number of stars. Now I see that in your installation notes and FAQs you do have a warning:

    Optional: On the “Database Cache” tab, the recommended settings are preset. If using a shared hosting account use the “disk” method with caution, the response time of the disk may not be fast enough, so this option is disabled by default. Try object caching instead for shared hosting.

    I suggest modifying it as follows:

    Optional: On the “Database Cache” tab, the recommended settings are preset. If using a shared hosting account use the “disk” method with caution, the response time of the disk may not be fast enough and it will generate 1000s of files, possibly maxing out your allowed file limit, so this option is disabled by default. Try object caching instead for shared hosting.

    I found that my problem was due to W3 Total Cache almost by chance. The problem I had for several months was that UpdraftPlus would very rarely finish a backup. It would stop with various errors, or with no errors at all. It turns out that when the file limit is hit, GoDaddy doesn’t give an error; whatever process was trying to write a file simply fails. We thought it was cron failures, etc.

    However, I found from cPanel that the file limit had been hit, and while talking with support about how to find what was generating so many files, their level 2 support said W3 Total Cache could generate many files. I then found all the cached files, uninstalled the plugin, deleted the cache files, and got back to normal operations.
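    For anyone in the same situation, here is a minimal sketch of how you could check the file count yourself before the host's limit bites. It assumes the standard W3 Total Cache disk-cache location under `wp-content/cache`; `WP_ROOT` is a placeholder path you would replace with your own install directory.

    ```shell
    # Hypothetical path: point WP_ROOT at your WordPress install.
    WP_ROOT="${WP_ROOT:-/home/user/public_html}"
    CACHE_DIR="$WP_ROOT/wp-content/cache"

    # Count every file held by the disk cache; this is what counts
    # against a shared host's file/inode limit.
    find "$CACHE_DIR" -type f | wc -l
    ```

    After deactivating the plugin, the same directory can be cleared with `rm -rf "$CACHE_DIR"`, which is effectively the manual cleanup described above.
    
    
    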

    At this point I am just happy to be able to run backups and am not going to think about caching for a while.

  • The topic ‘generated 1000s of files’ is closed to new replies.