We have a data feed management service that processes data feeds containing product information for a particular merchant. At night we send the feeds out via FTP to comparison shopping engines such as shopping.com, PriceGrabber, etc.
There are about 1,000 files total, and maybe 10 different FTP servers we send to. The files range in size from 1 MB to 100 MB.
My question: is there an optimal number of files to send out at once? Transferring too many files concurrently creates the potential for timeouts, whereas sending one at a time is inefficient, especially for large files. Is there a way to determine the happy medium?
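For context, the kind of setup I mean is a bounded worker pool where the concurrency level is a single tunable knob. This is just a sketch; `upload_all`, `transfers`, and `send` are names I made up here, and a real `send` would wrap something like `ftplib.FTP.storbinary` with the actual host credentials:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def upload_all(transfers, send, max_workers=4):
    """Run (host, path) transfers through a pool of at most max_workers
    threads, so concurrency can be tuned with one parameter.

    transfers -- iterable of (host, path) pairs
    send      -- callable send(host, path) that performs one FTP upload
    Returns a dict mapping (host, path) to "ok" or an error message.
    """
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Submit every transfer; the pool only runs max_workers at a time.
        futures = {pool.submit(send, host, path): (host, path)
                   for host, path in transfers}
        for fut in as_completed(futures):
            host, path = futures[fut]
            try:
                fut.result()
                results[(host, path)] = "ok"
            except Exception as exc:
                # A timeout on one file shouldn't abort the whole batch.
                results[(host, path)] = "failed: %s" % exc
    return results
```

With this shape, finding the happy medium becomes an empirical exercise: run the nightly batch at a few different `max_workers` values, log total wall time and failure counts, and pick the setting where throughput stops improving or timeouts start appearing.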