• Resolved

    arteminfamia (@arteminfamia)

    Hi support,

    We receive errors like the following (1073741824 bytes is a 1 GB PHP memory_limit):
    [08-Sep-2020 03:02:28] Starting upload to S3 Service …
    [08-Sep-2020 03:03:25] ERROR: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 5242912 bytes)
    [08-Sep-2020 03:03:25] 2. Trying to send backup file to S3 Service …
    [08-Sep-2020 03:03:25] Connected to S3 Bucket "infamia-client-backups" in us-east-1
    [08-Sep-2020 03:03:25] Starting upload to S3 Service …

    Our dev team investigated and traced the problem to memory leaks in the AWS SDK library.
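    One way to reproduce the diagnosis is to log memory usage around the suspect call on every loop iteration; if the figure climbs by roughly the same amount per part, something is retaining the chunk data. The sketch below is a hypothetical standalone script, not plugin code: the probe() helper and the simulated 1 MB allocation are stand-ins for the real uploadPart() call.

    <?php
    // Hypothetical diagnostic, not from the plugin: wrap the suspect work in
    // a memory probe. In BackWPup the "work" would be the uploadPart() call;
    // here a string allocation stands in for it so the script runs on its own.
    function probe( int $iteration, callable $work ): void {
        $before = memory_get_usage();
        $work();
        printf(
            "iteration %d: %+.2f MB, peak %.2f MB\n",
            $iteration,
            ( memory_get_usage() - $before ) / 1048576,
            memory_get_peak_usage() / 1048576
        );
    }

    $retained = array();
    for ( $i = 1; $i <= 5; $i++ ) {
        probe( $i, function () use ( &$retained ) {
            // Simulated leak: each pass retains another 1 MB, the same
            // growth pattern the job log above shows before it dies.
            $retained[] = str_repeat( 'x', 1048576 );
        } );
    }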

    To fix the issue, a gc_collect_cycles(); call needs to be added to the upload loop in the file /backwpup/inc/class-destination-s3.php, so the leaked memory is cleaned up after every part:

    while ( ! feof( $file_handle ) ) {
        $chunk_upload_start = microtime( true );
        $part_data          = fread( $file_handle, 1048576 * 5 ); // 5 MB minimum part size.
        $part               = $s3->uploadPart(
            array(
                'Bucket'     => $job_object->job['s3bucket'],
                'UploadId'   => $job_object->steps_data[ $job_object->step_working ]['UploadId'],
                'Key'        => $job_object->job['s3dir'] . $job_object->backup_file,
                'PartNumber' => $job_object->steps_data[ $job_object->step_working ]['Part'],
                'Body'       => $part_data,
            )
        );

        $chunk_upload_time         = microtime( true ) - $chunk_upload_start;
        $job_object->substeps_done = $job_object->substeps_done + strlen( $part_data );
        $job_object->steps_data[ $job_object->step_working ]['Parts'][] = array(
            'ETag'       => $part->get( 'ETag' ),
            'PartNumber' => $job_object->steps_data[ $job_object->step_working ]['Part'],
        );
        $job_object->steps_data[ $job_object->step_working ]['Part'] ++;
        $time_remaining = $job_object->do_restart_time();
        if ( $time_remaining < $chunk_upload_time ) {
            $job_object->do_restart_time( true );
        }
        $job_object->update_working_data();
        gc_collect_cycles(); // Collect the cyclic garbage the SDK leaves behind after each part.
    }
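    For context on why this single call helps: PHP frees objects by reference counting, which can never reclaim objects that point at each other, and the linked SDK issue below describes exactly such reference cycles. gc_collect_cycles() runs the cycle collector immediately instead of waiting for its automatic trigger. Here is a minimal standalone sketch of that behavior; the Node class is an assumed example, unrelated to the plugin:

    <?php
    // Standalone sketch (assumed example, not BackWPup code; needs PHP 7.4+
    // for the typed property): objects in a reference cycle keep each other's
    // refcount above zero, so only the cycle collector can free them, and
    // gc_collect_cycles() runs that collector on demand.
    class Node {
        public ?Node $other = null;
    }

    // Stay below the collector's automatic trigger (roughly 10,000 buffered
    // roots), so the cycles are still sitting in memory when we measure.
    for ( $i = 0; $i < 4000; $i++ ) {
        $a = new Node();
        $b = new Node();
        $a->other = $b; // $a and $b reference each other; once the loop
        $b->other = $a; // reuses the variables, the pair is unreachable.
    }

    printf( "before: %d KB used\n", memory_get_usage() / 1024 );
    printf( "collected %d cycles\n", gc_collect_cycles() );
    printf( "after:  %d KB used\n", memory_get_usage() / 1024 );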

    Here is the old discussion of this issue: https://github.com/aws/aws-sdk-php/issues/1572

  • The topic ‘memory leaks on large websites’ is closed to new replies.