Plugin is exhausting total server memory uploading to S3
The current version of the plugin is exhausting the total server memory (1 GB) when attempting to upload to Amazon S3. I thought that was odd, seeing as it is simply uploading a file.
I had raised the memory_limit in php.ini to no avail, and when I contacted hosting support they took a look and confirmed the job was using more memory than the server has.
It is a WP multisite install. I took a couple of sites off the backup list, but I’m still getting the out-of-memory error at the upload-to-S3 step.
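For reference, php.ini accepts shorthand sizes (1G, 768M, etc.), while PHP’s out-of-memory errors report raw byte counts, so the two are easy to misread side by side. A small Python sketch (illustration only, not part of the plugin) that converts the shorthand to bytes for comparison:

```python
# Illustration only: convert php.ini shorthand sizes (e.g. "1G", "256M")
# to bytes, the unit PHP "Out of memory" messages report.
def php_ini_bytes(value: str) -> int:
    value = value.strip()
    multipliers = {"k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    suffix = value[-1].lower()
    if suffix in multipliers:
        return int(value[:-1]) * multipliers[suffix]
    return int(value)  # plain integer means bytes

print(php_ini_bytes("1G"))        # 1073741824
print(774111232 / 1024 ** 2)      # the "allocated" figure from the log, in MiB: 738.25
```

So the run below died after allocating roughly 738 MiB, still short of a 1G limit but close to what the whole server has free.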
Here’s the most recent run log:
[INFO] BackWPup 3.6.10; A project of Inpsyde GmbH
[INFO] WordPress 5.2.2 on https://xxxxxx.com/
[INFO] Log Level: Normal
[INFO] BackWPup job: Weekly Backup – Full Site and DB
[INFO] Logfile is: backwpup_log_24ff1f_2019-08-12_10-49-01.html
[INFO] Backup file is: xxx_backwpup_3f23a9_2019-08-12_10-49-01_DDYQHXTT01.tar.gz
[12-Aug-2019 10:49:01] 1. Try to backup database …
[12-Aug-2019 10:49:02] Connected to database xxxxxx_main on localhost
[12-Aug-2019 10:49:04] Added database dump "xxxxxx_main.sql.gz" with 4.11 MB to backup file list
[12-Aug-2019 10:49:04] Database backup done!
[12-Aug-2019 10:49:04] 1. Trying to make a list of folders to back up …
[12-Aug-2019 10:49:04] Added "wp-config.php" to backup file list
[12-Aug-2019 10:49:04] 1219 folders to backup.
[12-Aug-2019 10:49:04] 1. Trying to generate a file with installed plugin names …
[12-Aug-2019 10:49:04] Added plugin list file "xxxxxx.pluginlist.2019-08-12.txt" with 3.96 KB to backup file list.
[12-Aug-2019 10:49:04] 1. Trying to generate a manifest file …
[12-Aug-2019 10:49:04] Added manifest.json file with 2.00 KB to backup file list.
[12-Aug-2019 10:49:04] 1. Trying to create backup archive …
[12-Aug-2019 10:49:04] Compressing files as TarGz. Please be patient, this may take a moment.
[12-Aug-2019 10:49:41] Backup archive created.
[12-Aug-2019 10:49:41] Archive size is 839.73 MB.
[12-Aug-2019 10:49:41] 6217 Files with 862.30 MB in Archive.
[12-Aug-2019 10:49:41] 1. Trying to send backup file to S3 Service …
[12-Aug-2019 10:49:41] Connected to S3 Bucket "yyyyyyyyy" in us-east-1
[12-Aug-2019 10:49:41] Checking for not aborted multipart Uploads …
[12-Aug-2019 10:49:41] Starting upload to S3 Service …
[12-Aug-2019 10:51:01] ERROR: Out of memory (allocated 774111232) (tried to allocate 5242881 bytes)
[12-Aug-2019 10:51:02] 2. Trying to send backup file to S3 Service …
[12-Aug-2019 10:51:02] Connected to S3 Bucket "yyyyyyyyy" in us-east-1
[12-Aug-2019 10:51:02] Starting upload to S3 Service …
[12-Aug-2019 10:51:24] Backup transferred to https://yyyyyyyyy.s3.amazonaws.com/xxxxxx/xxx_backwpup_3f23a9_2019-08-12_10-49-01_DDYQHXTT01.tar.gz.
[12-Aug-2019 10:51:24] One file deleted on S3 Bucket.
[12-Aug-2019 10:51:24] One old log deleted
[12-Aug-2019 10:51:24] ERROR: Job has ended with errors in 143 seconds. You must resolve the errors for correct execution.
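For context on why an upload can use this much memory: a multipart upload only needs to hold one part in RAM at a time, so if usage climbs toward the full archive size, the file is probably being buffered rather than streamed. A minimal Python sketch of the chunked-read pattern (illustration only, not the plugin’s actual code; the 5 MiB chunk size matches the S3 multipart minimum and lines up with the 5242881-byte allocation in the log above):

```python
# Illustration only (not the plugin's code): read a large archive in fixed-size
# chunks so peak memory stays near one chunk, not the whole ~840 MB file.
import os
import tempfile

CHUNK = 5 * 1024 * 1024  # 5 MiB, the S3 multipart minimum part size

def iter_parts(path, chunk=CHUNK):
    """Yield the file as successive parts of at most `chunk` bytes."""
    with open(path, "rb") as f:
        while True:
            part = f.read(chunk)
            if not part:
                break
            yield part  # each part would be sent as one multipart-upload request

# Demo with a small temp file standing in for the backup archive.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(12 * 1024 * 1024))  # 12 MiB of dummy data

parts = list(iter_parts(tmp.name))
print(len(parts), max(len(p) for p in parts))  # 3 5242880
os.unlink(tmp.name)
```

With this pattern, peak memory stays around one 5 MiB part no matter how large the archive is; holding all parts (or the whole file) in memory at once is what would exhaust a 1 GB server.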
Server info:
$ cat /etc/os-release
PRETTY_NAME="Debian GNU/Linux 9 (stretch)"
NAME="Debian GNU/Linux"
VERSION_ID="9"
VERSION="9 (stretch)"
ID=debian
HOME_URL="https://www.debian.org/"
SUPPORT_URL="https://www.debian.org/support"
BUG_REPORT_URL="https://bugs.debian.org/"
$ lsb_release -a
No LSB modules are available.
Distributor ID: Debian
Description: Debian GNU/Linux 9.9 (stretch)
Release: 9.9
Codename: stretch
$ uname -r
4.14.133-grsec
Any ideas why it is using so much memory for the S3 upload? Let me know if you need any more information.
Thanks.