Is there a solution for large files?
I’m still working through my migration plan from the previous bloated plugin to your nice solution, and I found an issue with large download files. I have one file that is about 215MB. (Because of the file size, it had to be manually uploaded via FTP.)
I am able to get that file set up as a download in Simple Download Counter, but when a user clicks on it, the downloaded file has the correct name but is empty, and in the PHP error log, there is this error:
PHP Fatal error: Allowed memory size of 209715200 bytes exhausted (tried to allocate 169885696 bytes) in /local/path/mydomain.test/wp-content/plugins/simple-download-counter/inc/functions-core.php on line 387
I also get this error at line 342, which makes sense, because I tried to do this both as a local and a remote download with the same trouble each time.
The only idea I have for files such as this is simply to link directly to the resource without going through Simple Download Counter. Do you have another solution, or something I could do on my end?
Or perhaps there is some improvement that could be made to the plugin? I don’t know if this is any real help, but I looked up the PHP docs for the readfile() function used at the two lines where I had the error, and I found this note that may be a clue:

readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().

https://www.php.net/manual/en/function.readfile.php

I followed the suggestion to test with ob_get_level() to see if there is any output buffering, and it returned 2. So just as a test I threw in a quick

while (ob_get_level()) ob_end_clean();

just above the readfile() and tried again. Then the large download worked exactly as I hoped.

Of course, I don’t presume to know what implications that might have for the rest of your code, but I hope it might be a good start. Maybe it is a worthwhile memory optimization even for files that can otherwise fit in allocated memory, because apparently the whole file is being read into memory before the headers are sent out.
(Note: at present this is only on a local dev site. Using current WordPress 6.3.2 and Simple Download Counter 1.7)