FTP transfers failing due to unstable internet. Another transfer protocol?

alfa147x

Lifer
Jul 14, 2005
29,307
106
106
We use a VPS as temp storage for transferring video footage. My problem is that the internet connection at my apartment is quite unstable, and the FTP client (Cyberduck) will throw up an error message. I can then hit resume and it'll download the rest of the file. Sometimes a file comes down with no issue, but 75% of the time it'll have errors. Is there a better way to get these files down?

The internet is bundled with rent and they block all P2P traffic so I couldn't even use a torrent setup (if even possible) plus my VPS host would not be fond of that.

These files are not too large. It's mostly video shot with iPhones/iPads and a little from a 5DII, so the total size is approx. 5-10 GB per transfer.

Any ideas?

Thanks,
Alfa
 

skillyho

Golden Member
Nov 6, 2005
1,337
0
76
Is there a specific reason you're using a VPS? I'm assuming your ISP provided it? Just curious...

I would probably set up a FileZilla server on the virtual machine, send the files split into 2 GB RAR chunks, and reassemble them on the other side. That way a timeout/disconnect wouldn't be such a pain, and you could pick back up easily.
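
The chunking idea can be sketched with standard tools in place of RAR (file names here are hypothetical, and the demo uses a small dummy file so it runs quickly; use `-b 2G` for real footage):

```shell
# Create a small dummy file standing in for real footage.
dd if=/dev/urandom of=clip.bin bs=1M count=5 2>/dev/null
# Split into 2 MB chunks (use -b 2G for actual transfers).
split -b 2M clip.bin clip.bin.part_   # produces clip.bin.part_aa, _ab, ...
# ...upload the parts one at a time; on the receiving end, reassemble:
cat clip.bin.part_* > clip_rejoined.bin
cmp clip.bin clip_rejoined.bin && echo "parts reassembled cleanly"
```

The advantage is that a dropped connection only costs you the chunk in flight, not the whole file.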

Why it's happening with the system you have is difficult to troubleshoot without knowing more. Does your internet connection drop, but you maintain network connectivity? Does this happen consistently with large file transfers?
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Sounds like you just need a client that will automatically retry which I thought most did these days. I would probably use a tool like wget which lets you set how many times to retry and how long to wait between them.
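
As a sketch (host, credentials, and path are hypothetical), the relevant wget flags would look something like:

```shell
# -c resumes a partial file instead of starting over;
# --tries and --waitretry control the automatic retry loop.
wget -c --tries=20 --waitretry=30 \
    "ftp://user:pass@vps.example.com/footage/clip1.mov"
```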
 

mikeymikec

Lifer
May 19, 2011
20,547
15,380
136
With FileZilla you can specify the number of auto retries IIRC.

Yup, in preferences, specify the number of retries and the number of seconds between retries.
 

alfa147x

Lifer
Jul 14, 2005
29,307
106
106
Is there a specific reason you're using a VPS? I'm assuming your ISP provided it? Just curious...

I would probably setup a Filezilla server on the virtual machine and send the files in broken RAR chunks (2GB) and let them reassemble on the other side. This wouldn't be such a pain if something timed-out/disconnected and you could pick back up easily.

Why it's happening with the system you have is difficult to troubleshoot without knowing more. Does your internet connection drop, but you maintain network connectivity? Does this happen consistently with large file transfers?

I got the VPS to experiment with, and then we needed a way to transfer large files, so I had the other guys dump files onto it. I still want to keep it to mess with. The reason I think our internet sucks is that the backbone wasn't set up to handle so many people. Under normal usage it's fine, but for transfers that take 4-5 hours to download, at some point the connection breaks and the transfer fails.

Sounds like you just need a client that will automatically retry which I thought most did these days. I would probably use a tool like wget which lets you set how many times to retry and how long to wait between them.

I'll look into using wget. I didn't realize you could use it to pull files over FTP. Thanks. When a client automatically retries, does it start from 0 or pick up where it left off? If it picks up, could that create errors?
 

alfa147x

Lifer
Jul 14, 2005
29,307
106
106
With FileZilla you can specify the number of auto retries IIRC.

Yup, in preferences, specify the number of retries and the number of seconds between retries.

Sweet. I'll have to use that. When it retries, does it start at 0 or pick up where it left off?
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
rsync can also be useful since it does rolling checksums on the data so an invalid chunk gets discarded.

The corruption issue is harder to deal with over FTP. Some servers can be configured to drop a few KB from the end of a file on a lost connection, so that a resume in theory starts just before the junk data.
 

mikeymikec

Lifer
May 19, 2011
20,547
15,380
136
Sweet. I'll have to use that. When it retries, does it start at 0 or pick up where it left off?

It would be bloody silly if it started again at zero, so I assume it doesn't :)

Seriously though, I think resuming transfers over FTP requires server-side support, but that support arrived in mainstream FTP servers more than a decade ago.
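
For the curious, server-side resume support is the FTP REST command. A rough sketch of what a resumed download looks like on the wire (file name and offset are hypothetical):

```
REST 4000000000      client: restart the next transfer at this byte offset
350 Restarting at 4000000000. Send RETR to initiate transfer.
RETR clip1.mov
150 Opening BINARY mode data connection...   (transfer continues mid-file)
```

If the server doesn't implement REST, the client has no choice but to start from byte 0.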
 

alfa147x

Lifer
Jul 14, 2005
29,307
106
106
Okay. Set up ProFTPD on the VPS to allow resuming downloads. Now I'm pulling down an 8 GB file @ 500 KB/s, ETA 9 hours from now. I also set up Cyberduck to retry the download after waiting 40 seconds, with 9 total retry attempts.
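
For anyone following along, the ProFTPD side is a couple of config directives, roughly like this (a sketch; check the defaults for your version):

```
# proftpd.conf
AllowRetrieveRestart on    # let clients resume downloads (REST + RETR)
AllowStoreRestart    on    # let clients resume uploads
```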

Thanks for the help!