No worries. Remove the old code first and delete the exported-images.csv file, then add this code:
// Export attachment metadata to a CSV file, one batch per page load.
function custom_export_all_the_images() {
	if ( isset( $_GET['magic-export'] ) ) {
		// Resume from where the previous page load stopped.
		$offset = get_option( 'custom-export-all-the-images', 0 );
		$offset = empty( $offset ) ? 0 : absint( $offset );
		$limit  = 500;

		$images = get_posts(
			array(
				'post_type'      => 'attachment',
				'post_mime_type' => 'image',
				'posts_per_page' => $limit,
				'offset'         => $offset,
				'post_status'    => 'any',
			)
		);

		if ( $images ) {
			// Append to the CSV file in the WordPress root directory.
			$fp = fopen( ABSPATH . 'exported-images.csv', 'a' );

			// Write the header row only on the first batch.
			if ( $offset <= 0 ) {
				$header_fields = array( 'ID', 'Title', 'URL', 'Original URL' );
				fputcsv( $fp, $header_fields );
			}

			foreach ( $images as $image ) {
				$title    = get_the_title( $image->ID );
				$orig_url = wp_get_original_image_url( $image->ID );
				$url      = wp_get_attachment_url( $image->ID );
				$fields   = array( $image->ID, $title, $url, $orig_url );
				fputcsv( $fp, $fields );
			}

			fclose( $fp );
		}

		// Remember the new offset so the next reload continues the export.
		update_option( 'custom-export-all-the-images', ( $offset + $limit ) );
	}
}
add_action( 'init', 'custom_export_all_the_images' );
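If you ever need to restart the export from scratch (for example after lowering $limit), the saved offset has to be cleared along with the CSV file. A minimal sketch of how that could look; the magic-reset query parameter and the function name are made up here for illustration:

```php
// Hypothetical helper: visit the site once with ?magic-reset to start over.
function custom_reset_image_export() {
	if ( isset( $_GET['magic-reset'] ) ) {
		// Forget the saved offset so the next ?magic-export load starts at 0.
		delete_option( 'custom-export-all-the-images' );

		// Remove the partially written CSV file, if it exists.
		if ( file_exists( ABSPATH . 'exported-images.csv' ) ) {
			unlink( ABSPATH . 'exported-images.csv' );
		}
	}
}
add_action( 'init', 'custom_reset_image_export' );
```

Deleting the file manually over FTP works just as well; this only saves you a trip to the file manager.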
Then access the page the same way, with ?magic-export, and let the page load completely. Then reload the page and repeat. Each load exports one batch of 500, so with your 3000 images the export finishes after 6 reloads (3000 ÷ 500).
If you see the error on the first load, delete the exported-images.csv file again, then change $limit = 500; to $limit = 200;
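The number of reloads needed is just the image count divided by the batch size, rounded up. In plain PHP (the 3000-image figure is from your site):

```php
<?php
$total_images = 3000;

// With the default batch size of 500 per page load:
echo ceil( $total_images / 500 ), "\n"; // 6 reloads

// With the reduced batch size of 200:
echo ceil( $total_images / 200 ), "\n"; // 15 reloads
```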
Basically, we’re now fetching 500 images per page load instead of all of them at once, to avoid the memory-limit issue.