• Resolved bretticus

    (@bretticus)


    Greetings bstump,

    So I’ve resolved to use this plugin, as it’s the best thing I’ve found, and thank you for sharing this code. I’m working on moving an existing WordPress site to new hosting, using Rackspace Cloud Files for the media.

    One issue I’m struggling with is that I have approx. 438,276 files totaling 31.52 GB. Initially, when I enabled this plugin, it basically crashed my server because those local file arrays got way too large! I was able to use turbolift to upload the files to my CDN (which took forever, as I recall). I can’t remember whether the plugin ended up syncing with the existing files or whether I ramped up the PHP memory limit and timeout until I got it done, but I did have it working at one point.

    Now that I’ve settled on my configuration for the new hosting, I need to cut over the current site. That requires a database dump and an rsync push (I stashed the images that would have been deleted by the plugin and used them to compare against for the subsequent mirror push). However, it’s been several weeks, and this website produces huge amounts of media (it’s a news site). Now, when I enable the plugin after a recent MySQL dump and rsync push, I get the super lag and timeout again. It’s just too much of a pain in the you-know-what.

    Once the website has been moved to the new hosting, this plugin will work perfectly.

    That being said, I have been working on a PHP script today to do the heavy lifting from the command line. Essentially, my goal is to make this a companion script for your plugin that works like turbolift but uses PHP (which might be a little easier for the average web developer). Also, I don’t really want to upload the extra images to the new hosting server, so I plan on making it as easy as pointing the script at a directory on the current server, or at a stash (on my computer) of files that are new since the last sync.
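
    Here’s roughly the shape of what I have so far (a rough sketch using the php-opencloud library; the username, API key, region, and container name are placeholders, and the calls may need adjusting for whatever library version you have):

        <?php
        // upload-to-cloudfiles.php -- bulk-push a local directory of media to
        // Rackspace Cloud Files from the command line.
        // Usage: php upload-to-cloudfiles.php /path/to/wp-content/uploads

        require 'vendor/autoload.php'; // rackspace/php-opencloud via Composer

        use OpenCloud\Rackspace;

        $client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
            'username' => 'YOUR_USERNAME',   // placeholder
            'apiKey'   => 'YOUR_API_KEY',    // placeholder
        ));

        $service   = $client->objectStoreService('cloudFiles', 'DFW');
        $container = $service->getContainer('my-media-container'); // placeholder

        $base = rtrim($argv[1], '/');

        $files = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($base, FilesystemIterator::SKIP_DOTS)
        );

        foreach ($files as $file) {
            if (!$file->isFile()) {
                continue;
            }
            // Keep the uploads layout intact: /base/2013/05/img.jpg -> 2013/05/img.jpg
            $name = ltrim(substr($file->getPathname(), strlen($base)), '/');
            $container->uploadObject($name, fopen($file->getPathname(), 'r'));
            echo "uploaded {$name}\n";
        }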

    I haven’t really jumped into your plugin code to see how you’re updating metadata about images in the CDN yet; I’m only scratching the surface here. But I’d appreciate any critiques/suggestions/alternate solutions you may have. Feel free to fork and send me pull requests if you like (i.e., feel free to contribute or take over, given your greater experience with opencloud). Or let me know if I can accomplish this without resorting to scratching my own itch.

    Thanks again.

  • Plugin Author paypromedia

    (@paypromedia)

    If the files are already in WordPress and you upload them to the CDN in the correct container with the correct file paths, it *should* show all of your files as synced, no?

    Thread Starter bretticus

    (@bretticus)

    Thanks for the reply.

    For now, I’m just going to do a direct cutover. My PHP script took a huge backseat, but perhaps, given your question, it’s unnecessary.

    I assume you mean that your plugin code checks the container first and uses the CDN if the file is available? I’ll also assume that the result is cached? I noticed you were using sessions (which I suppose is as good a cache as any, but that information would not be available to search engine bots).

    Thanks for your work on this. I’ll probably be going this route later this month (at which point I can test your assumption).

    Plugin Author paypromedia

    (@paypromedia)

    Yes, the CDN plugin checks both the local and CDN files and “compares” the two lists. The file lists are cached, based on the timeframe set in the CDN plugin settings.
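
    In outline, the compare step works something like this (a simplified sketch of the idea, not the plugin’s literal code; it assumes a php-opencloud $container like the one in your script, and the local uploads path):

        <?php
        // Sketch of the compare: list local files and CDN objects, diff the
        // names, and cache the remote list so we don't hit the API constantly.
        // Assumes $container is a php-opencloud Cloud Files container and
        // $uploads_dir is the wp-content/uploads path.

        $local = array();
        $it = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($uploads_dir, FilesystemIterator::SKIP_DOTS)
        );
        foreach ($it as $file) {
            if ($file->isFile()) {
                $local[] = ltrim(substr($file->getPathname(), strlen($uploads_dir)), '/');
            }
        }

        // objectList() pages through the container's objects.
        $remote = array();
        foreach ($container->objectList() as $object) {
            $remote[] = $object->getName();
        }

        // Anything local that isn't in the container still needs uploading.
        $missing = array_diff($local, $remote);

        // Cache the remote list for the timeframe set on the settings page
        // (a WordPress transient here; expiry is in seconds).
        set_transient('cdn_remote_file_list', $remote, 6 * HOUR_IN_SECONDS);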

    I will try to think of some ways to improve this.

    Plugin Author paypromedia

    (@paypromedia)

    I’ve released 1.3.0; this should be resolved now. If you upload your WP files directly to the CDN with the folder names intact (year/month/filename.ext), you should be able to log in to WP, go to the CDN plugin page, click “Save”, and it will cache all of the file names.
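
    For reference (an illustration, not code from the plugin): WordPress stores that year/month relative path in each attachment’s _wp_attached_file meta, so you can spot-check that your object names line up with what WP expects:

        <?php
        // Spot-check that CDN object names match WordPress's stored paths.
        // Illustration only -- run inside WordPress; $container is a
        // php-opencloud Cloud Files container set up as in the earlier sketch.

        $attachments = get_posts(array(
            'post_type'      => 'attachment',
            'post_status'    => 'inherit',
            'posts_per_page' => 5, // a handful is enough for a sanity check
        ));

        foreach ($attachments as $attachment) {
            // e.g. "2013/07/photo.jpg" -- the object name must match exactly
            $relative = get_post_meta($attachment->ID, '_wp_attached_file', true);
            try {
                $container->getObject($relative); // throws if no such object
                echo "OK      {$relative}\n";
            } catch (Exception $e) {
                echo "MISSING {$relative}\n";
            }
        }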

  • The topic ‘Working on a sync script for large existing data’ is closed to new replies.