The company I work for has recently added a pipeline to another company: we are sectioning off bandwidth and data access because our company is splitting, and eventually the half using the newly sectioned-off data connection will be off our network.
The Problem:
Since we added the pipeline to their company, we have had intermittent problems with certain apps that require a continuous connection to the internet or to other internet-based servers.
Example: FTP uploads and downloads in a multi-file queue will fail after a large file (~1-2+ MB). The large file uploads successfully, but then the queue stops once the file has finished, as if the client is waiting for an acknowledgment before continuing to the next files. Using BulletProof FTP, the program is able to reconnect to the site and re-upload the file, but this is unacceptable. I believe the FTP control connection is being dropped during the upload: the data connection stays open, but when the upload ends and the FTP program goes back to the control connection, it is no longer there.
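That failure pattern (data channel busy, control channel silently gone when the transfer ends) is consistent with an intermediate device dropping the idle control connection during long transfers. A common mitigation, sketched below under the assumption that this is the cause, is to enable TCP keepalives on the control socket so it is never fully idle; the timeout values here are illustrative, and the TCP_KEEP* options are Linux-specific, hence the hasattr guards.

```python
import socket

def enable_keepalive(sock, idle=60, interval=10, count=5):
    """Turn on TCP keepalive probes for a socket (e.g. an FTP control
    connection) so that an otherwise-idle connection keeps sending small
    probe packets and is less likely to be dropped by a firewall/NAT.
    The idle/interval/count values are assumptions, not measured tuning."""
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # Fine-grained timers exist on Linux; guard for portability.
    if hasattr(socket, "TCP_KEEPIDLE"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle)
    if hasattr(socket, "TCP_KEEPINTVL"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, interval)
    if hasattr(socket, "TCP_KEEPCNT"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, count)
    return sock

s = enable_keepalive(socket.socket(socket.AF_INET, socket.SOCK_STREAM))
print(s.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE))  # nonzero when enabled
s.close()
```

Whether this helps depends on what is actually dropping the connection; if it is packet loss on the line itself rather than an idle timeout, keepalives won't fix it.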
Example: AOL Instant Messenger will lose its connection and reconnect to AOL's servers every 5 minutes or less. I suspect this is also because we are dropping too many packets for the program to keep a stable connection within its tolerances.
Solution: We have been told it may be 1-2 months before this company is off our connection; until then their data is being pushed through our lines. We asked our T-1 provider whether raising the CIR would help, but they said it wouldn't. I would imagine that if the problem is bandwidth-related, raising the CIR would help, but I am not an expert on the matter.
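One hedged way to sanity-check the provider's answer: if raw throughput were the limit, the large files would never finish uploading, but they do. A back-of-envelope calculation (treating 1 MB as 10^6 bytes for simplicity, and using assumed CIR values since the actual CIR isn't stated) shows a 2 MB file fits comfortably even at modest committed rates, which points at intermittent loss rather than bandwidth:

```python
# Rough transfer time for a 2 MB upload at various committed rates on a
# T-1 (1.544 Mbps port). If these finish, raw bandwidth is likely not
# the bottleneck -- intermittent packet loss is the better suspect.
FILE_MB = 2
FILE_BITS = FILE_MB * 8 * 1_000_000  # 1 MB taken as 10^6 bytes

for cir_kbps in (128, 256, 512, 1544):
    seconds = FILE_BITS / (cir_kbps * 1000)
    print(f"{cir_kbps:>5} kbps CIR -> ~{seconds:.1f} s for a {FILE_MB} MB file")
```

This also matches the provider's claim: on frame relay, traffic bursting above the CIR is marked discard-eligible and dropped under congestion, so a higher CIR changes how much traffic is protected, not whether an already-under-CIR connection gets dropped.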
Does anyone know why this would matter, or any good documentation I could read about the limitations of data connections and their thresholds?
I appreciate any help anyone out there in the field can provide.