Performing LONG requests: how to "breathe" so they don't exceed the timeout?

I’m developing an application where I’m pulling a lot of pictures into my database from freebase.com via their image API.

I have to be able to, on demand from my admin panel, pull about 1000 images and save them to disk. I’m fetching the images via curl, and everything’s working fine.

Except for my design — it’s bad design to run a script for several minutes. I’ve done this before on my Apache setup, but now I’m on Nginx/PHP-FPM and I get 504 Gateway Timeouts, regardless of temporarily setting "set_time_limit(300)" or some other high value for this section.

Right now I’m looping through an array of CActiveRecords, performing one “image fetch and save” per record.
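For context, the loop looks roughly like this (a simplified sketch — the GamingConsole model, the imageUrl attribute and the save path are placeholder names, not my actual ones):

```php
<?php
// Simplified sketch of the current approach: one curl fetch + file write per record.
// GamingConsole, imageUrl and the save path are placeholder names.
set_time_limit(300); // raises PHP's limit, but doesn't stop Nginx from returning a 504

$records = GamingConsole::model()->findAll();

foreach ($records as $record) {
    $ch = curl_init($record->imageUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);

    $data = curl_exec($ch);
    curl_close($ch);

    if ($data !== false) {
        file_put_contents('/path/to/images/' . $record->id . '.jpg', $data);
    }
}
```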

If I do this request (currently saving ~260 images), Nginx times out with a 504; however, PHP-FPM keeps working and completes the request.

Should I rethink the design? Or let the script run for a long time, but somehow send feedback to Nginx for every record to avoid the timeout (if that's even possible)?

An idea I had was to make a progress bar and fire ajax requests that process one record at a time, with JavaScript calling for the next one upon completion of the previous — but that seems quite convoluted, no?
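Something like this on the PHP side, I imagine (just a sketch — the imageFetched column and the fetchAndSaveImage() helper are hypothetical):

```php
<?php
// Sketch of the per-request endpoint the progress bar would poll.
// The imageFetched column and fetchAndSaveImage() helper are hypothetical.
class ImageImportController extends CController
{
    public function actionFetchNext()
    {
        $record = GamingConsole::model()->findByAttributes(array('imageFetched' => 0));

        if ($record === null) {
            echo CJSON::encode(array('done' => true));
            return;
        }

        $this->fetchAndSaveImage($record); // wraps the curl download from the loop above

        $record->imageFetched = 1;
        $record->save(false);

        $remaining = GamingConsole::model()->countByAttributes(array('imageFetched' => 0));
        echo CJSON::encode(array('done' => false, 'remaining' => (int) $remaining));
    }
}
```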

Does anyone have any tips for this?

You could turn the process into a console command to prevent timeouts.
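In Yii 1.1 that would be a class under protected/commands, run through yiic, so Nginx and PHP-FPM are never involved. Roughly like this (the command and helper names are just examples):

```php
<?php
// protected/commands/FetchImagesCommand.php — run with: ./protected/yiic fetchimages
// No web server in the chain, so the Nginx 504 can't happen; on the CLI,
// max_execution_time defaults to 0 (unlimited) anyway.
class FetchImagesCommand extends CConsoleCommand
{
    public function run($args)
    {
        $records = GamingConsole::model()->findAll();
        $total   = count($records);

        foreach ($records as $i => $record) {
            $this->fetchAndSaveImage($record); // same curl logic as the web version
            echo sprintf("[%d/%d] %s\n", $i + 1, $total, $record->imageUrl);
        }
    }

    // Placeholder for the curl download; file_get_contents keeps the sketch short.
    private function fetchAndSaveImage($record)
    {
        $data = @file_get_contents($record->imageUrl);
        if ($data !== false) {
            file_put_contents('/path/to/images/' . $record->id . '.jpg', $data);
        }
    }
}
```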

And/or you could try to fetch several images at once instead of fetching them one by one. I've never used it myself, but I think curl has support for this (curl_multi)…
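If the bottleneck is mostly waiting on Freebase, PHP's curl_multi_* functions can run a batch of downloads in parallel. A rough sketch (batch size and save path are arbitrary choices):

```php
<?php
// Minimal curl_multi sketch: download a batch of image URLs in parallel.
// $urls is keyed by record id; batch size and save path are arbitrary choices.
function fetchBatch(array $urls, $saveDir)
{
    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $id => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$id] = $ch;
    }

    // Run all transfers until every handle is finished.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    foreach ($handles as $id => $ch) {
        $data = curl_multi_getcontent($ch);
        if ($data !== null && $data !== '') {
            file_put_contents($saveDir . '/' . $id . '.jpg', $data);
        }
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }

    curl_multi_close($mh);
}

// Usage: process the 1000 images in chunks of e.g. 10 parallel downloads.
// foreach (array_chunk($urls, 10, true) as $batch) {
//     fetchBatch($batch, '/path/to/images');
// }
```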

Besides that, I guess there’s really only the ajax method.

Why exactly do you need to download 1000 images via curl? I think that's the biggest design flaw you have…

Making it a console command was the best option for me, thanks!

jelly: I’m making a site that keeps a database of known gaming consoles, and I pull that info from Freebase. This is for the initial import, and for the ability to recreate it from their information if need be.