Performing LONG requests, breathing to not exceed timeout? Such as curling to save images
Posted 16 March 2012 - 06:50 PM
I have to be able to, on demand from my admin panel, pull about 1000 images and save to my disk. I'm fetching the images via curl, and everything's working fine.
Except for my design: it's BAD design to run a script for several minutes. This worked before on my Apache setup, but now I'm on Nginx/PHP-FPM and getting 504 Gateway Timeouts, regardless of temporarily setting "set_time_limit(300)" or something higher for this section.
Right now I'm looping through an array of CActiveRecords, performing one "image fetch and save" per record.
If I do this request (currently saving ~260 images), my Nginx times out with a 504; however, PHP-FPM keeps working and completes the request.
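That symptom (PHP-FPM finishing while Nginx gives up) suggests the 504 is coming from Nginx, not PHP: set_time_limit() only raises PHP's execution limit, while Nginx has its own fastcgi_read_timeout (default 60s) after which it abandons the upstream and returns 504. A sketch of the relevant location block, with example paths, assuming a standard PHP-FPM-over-Unix-socket setup:

```nginx
# Example only; adjust the socket path to your setup.
location ~ \.php$ {
    fastcgi_pass unix:/var/run/php-fpm.sock;
    include fastcgi_params;
    # Nginx returns 504 when PHP-FPM hasn't responded within this window
    # (default 60s); raise it to match your set_time_limit() value.
    fastcgi_read_timeout 300s;
}
```

Note that PHP-FPM's own request_terminate_timeout, if set, can also cut the request short independently of both of the above.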
Should I rethink the design? Or let the script run for a long time, but somehow feed something back to Nginx after every record to avoid the timeout (if that's even possible)?
Does anyone have any tips for this?
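One common way to rethink the design is to split the job into many short requests instead of one long one: each HTTP request processes only a small batch of records, so none of them comes anywhere near the timeout, and the admin page keeps calling the action (e.g. via AJAX) with an increasing offset until everything is done. A minimal sketch of the batching logic, with a hypothetical `nextBatch()` helper (the names and shape are assumptions, not the poster's code):

```php
<?php
// Hypothetical helper: given all record IDs, return the slice one request
// should process, plus the offset the client should send on its next call.
// Each request handles only $batchSize records, so no single request runs
// long enough for Nginx to return a 504.
function nextBatch(array $ids, int $offset, int $batchSize): array
{
    $slice = array_slice($ids, $offset, $batchSize);
    return [
        'ids'        => $slice,                   // fetch+save these now
        'nextOffset' => $offset + count($slice),  // resume point for the client
        'done'       => $offset + count($slice) >= count($ids),
    ];
}

// The admin page would loop: call the action with offset=0, do the
// "image fetch and save" for each returned ID, then call again with
// the returned nextOffset, until done is true.
```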
Posted 16 March 2012 - 11:12 PM
And/or you could try to fetch several images at once instead of fetching them one by one. I've never used it myself, but I think curl has support for this (the curl_multi functions)...
Besides that, I guess there's really only the AJAX method.
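For the "several at once" idea, PHP's curl_multi functions can run a batch of transfers in parallel; since most of the per-image time is network wait, this cuts wall-clock time considerably. A minimal sketch (not the poster's code; the `downloadAll()` helper and the example URLs/paths in the usage comment are assumptions):

```php
<?php
// Download a batch of URLs in parallel with curl_multi and write each
// response body to a local path. Returns the number of files saved.
function downloadAll(array $jobs): int
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($jobs as $url => $path) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$path] = $ch;
    }

    // Drive all transfers until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // block until something happens
        }
    } while ($active && $status === CURLM_OK);

    $saved = 0;
    foreach ($handles as $path => $ch) {
        if (curl_errno($ch) === 0) {
            file_put_contents($path, curl_multi_getcontent($ch));
            $saved++;
        }
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $saved;
}

// Usage (hypothetical URLs and paths):
// downloadAll([
//     'http://example.com/a.jpg' => '/tmp/a.jpg',
//     'http://example.com/b.jpg' => '/tmp/b.jpg',
// ]);
```

In practice you'd feed this batches of a reasonable size (say 10-20 URLs) rather than all ~1000 at once, to avoid hammering the remote host.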
Posted 17 March 2012 - 04:39 PM
jelly: I'm making a site that keeps a database of known gaming consoles, and I pull that info from Freebase. This is for the initial import, and for the ability to recreate it from their data if need be.