
downloading large files over http/ftp

darfur

Member
I'm struggling to get this working, and my biggest successes have come in the form of wget.

However, at around 2 GB the download timed out, and I tried to resume as I had been all along. Except now it gives me an HTTP 206 Partial Content error.

I did some googling of this error, and people's descriptions of it are just as vague as its name.

Any ideas?
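For what it's worth, 206 Partial Content is not actually an error: it is the normal HTTP success status a server sends when it honours a resume (a `Range` request), which is exactly what wget's `-c`/`--continue` does under the hood. A minimal sketch of that exchange, using a throwaway local server and hypothetical data so the status codes can be seen directly:

```python
# Sketch: HTTP 206 is the *success* response to a resumed (Range) request.
# The server, URL, and DATA below are stand-ins, not a real download.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

DATA = bytes(range(256)) * 16  # 4096 bytes of pretend file content

class RangeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        rng = self.headers.get("Range")
        if rng and rng.startswith("bytes="):
            # Client already has the first `start` bytes; send the rest.
            start = int(rng[len("bytes="):].split("-")[0])
            body = DATA[start:]
            self.send_response(206)  # Partial Content: resume accepted
            self.send_header("Content-Range",
                             f"bytes {start}-{len(DATA) - 1}/{len(DATA)}")
        else:
            body = DATA
            self.send_response(200)  # plain full-file response
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RangeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Pretend we already downloaded the first 1000 bytes, then resume.
req = Request(f"http://127.0.0.1:{port}/file",
              headers={"Range": "bytes=1000-"})
with urlopen(req) as resp:
    status = resp.status
    rest = resp.read()
server.shutdown()

print(status)      # 206 — the server honoured the resume
print(len(rest))   # 3096 — only the missing tail was sent
```

So if wget reports 206 while resuming, the server is cooperating; if the transfer still fails, the problem is more likely a timeout or a proxy in the middle than the status code itself.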
 
Would the `--no-clobber' option help? Just a guess.

`--no-clobber'
If a file is downloaded more than once in the same directory, Wget's behavior depends on a few options, including `-nc'. In certain cases, the local file will be clobbered, or overwritten, upon repeated download. In other cases it will be preserved.
When running Wget without `-N', `-nc', or `-r', downloading the same file in the same directory will result in the original copy of file being preserved and the second copy being named `file.1'. If that file is downloaded yet again, the third copy will be named `file.2', and so on. When `-nc' is specified, this behavior is suppressed, and Wget will refuse to download newer copies of `file'. Therefore, "no-clobber" is actually a misnomer in this mode--it's not clobbering that's prevented (as the numeric suffixes were already preventing clobbering), but rather the multiple version saving that's prevented.
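The default behaviour the manual describes (keep the original, name later copies `file.1`, `file.2`, ...) can be sketched in a few lines. This is an illustration of the naming scheme, not wget's actual implementation:

```python
# Sketch of wget's default numeric-suffix naming when none of
# -N, -nc, or -r are given: never overwrite, append .1, .2, ...
import os
import tempfile

def unique_name(directory, name):
    """Return the first unused path: name, then name.1, name.2, ..."""
    candidate = os.path.join(directory, name)
    n = 1
    while os.path.exists(candidate):
        candidate = os.path.join(directory, f"{name}.{n}")
        n += 1
    return candidate

with tempfile.TemporaryDirectory() as d:
    # Simulate downloading the same file three times.
    for _ in range(3):
        open(unique_name(d, "file"), "w").close()
    names = sorted(os.listdir(d))

print(names)  # ['file', 'file.1', 'file.2']
```

Note this also means `--no-clobber' is about repeated *fresh* downloads, not resuming: for continuing a partial file, `-c'/`--continue' is the relevant option.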

 