I’m developing an application that pulls a lot of pictures into my database from freebase.com via their image API.
On demand from my admin panel, I need to pull about 1000 images and save them to disk. I’m fetching the images via curl, and that part works fine.
The problem is the design — it’s bad design to have a script run for several minutes. I’ve done this before on my Apache setup without issues. Now I’m on Nginx/PHP-FPM and I get 504 Gateway Timeouts, regardless of temporarily setting something high like `set_time_limit(300)` for this section.
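For reference, from what I understand the 504 on this stack comes from Nginx giving up on PHP-FPM, not from PHP’s own execution limit, which is why `set_time_limit()` alone doesn’t help. The relevant directive looks roughly like this (the socket path and the 300s value are just examples, not my real config):

```nginx
# nginx site config: how long to wait for PHP-FPM's response
# before returning 504 Gateway Timeout (default is 60s)
location ~ \.php$ {
    fastcgi_pass unix:/var/run/php-fpm.sock;  # example socket path
    fastcgi_read_timeout 300s;
}
```

There’s apparently also a `request_terminate_timeout` on the PHP-FPM pool side that can kill long requests independently of `max_execution_time`.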
Right now I’m looping over an array of CActiveRecords, performing one “fetch image and save” operation per record.
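Roughly, each iteration looks like this — a simplified sketch, where the helper names and the save directory are illustrative, not my real schema:

```php
<?php
// Simplified version of the per-record fetch-and-save step.
// buildSavePath() and the image directory are illustrative names.

function buildSavePath($id, $dir)
{
    // e.g. buildSavePath(42, '/var/www/images') -> '/var/www/images/42.jpg'
    return rtrim($dir, '/') . '/' . (int)$id . '.jpg';
}

function fetchAndSave($url, $path)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);  // per-image timeout in seconds

    $data = curl_exec($ch);
    $ok = ($data !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200);
    curl_close($ch);

    if ($ok) {
        file_put_contents($path, $data);
    }
    return $ok;
}

// The loop over the active records then amounts to:
// foreach ($records as $record) {
//     fetchAndSave($record->imageUrl, buildSavePath($record->id, '/var/www/images'));
// }
```

So for ~260 records the request easily runs past a minute, which is when Nginx gives up.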
If I run this request (currently saving ~260 images), Nginx times out with a 504; however, PHP-FPM keeps working in the background and completes the request.
Should I rethink the design? Or let the script run for a long time but somehow send something back to Nginx for every record to keep the connection alive (if that’s even possible)?
One idea I had was to build a progress bar and make AJAX requests that each process one record at a time, with JavaScript kicking off the next request when the previous one completes. But that seems quite convoluted, no?
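A variant of that idea would be to process a small batch per request instead of a single record, so no one request runs long enough to time out. A sketch of what the server side might look like, assuming the records can be sliced by offset (the function name and chunk size are hypothetical, and the Yii controller plumbing is omitted):

```php
<?php
// One HTTP request handles one slice of the work, then tells the
// client where to resume. The batch size is a guess; it should be
// tuned so a single batch finishes well under the Nginx timeout.

function processBatch(array $records, $offset, $batchSize)
{
    $slice = array_slice($records, $offset, $batchSize);

    foreach ($slice as $record) {
        // fetch and save one image here (curl call omitted)
    }

    $next = $offset + count($slice);
    return array(
        'processed' => $next,
        'total'     => count($records),
        'done'      => $next >= count($records),
    );
}

// The JavaScript side would call this endpoint repeatedly, updating
// a progress bar from processed/total until done is true.
```

That would also give the progress bar its data for free, since each response reports how far along the job is.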
Does anyone have any tips for this?