Originally posted by: Ken_g6
Wget should work for you. To get a page and all associated files, I would use "wget -ENHkp -P.", which saves the page and any associated files to a directory structure matching the source URLs. If you don't want all those directories, you could add the "-nd" switch as well to put everything in the current directory.
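As a sketch of that fetch step (the URL and function name here are placeholders, not from the original setup):

```shell
# -E adds .html extensions where needed, -N re-downloads only changed files,
# -H spans hosts so off-site images/scripts are grabbed too, -k rewrites
# links for local viewing, -p fetches all page requisites, -P . saves under
# the current directory. Add -nd to flatten everything into one directory.
fetch_page() {
  wget -E -N -H -k -p -P . "$1"
}

# Usage (not run here):
# fetch_page "https://example.com/page.html"
```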
Next, you'll need to copy/rename whichever files changed to new filenames, probably based on the date. Hopefully, your "other associated files" - which include all images on the page, scripts, and anything in a frame or iframe - don't change. If something does change, you'll either need to upload all the files for each snapshot or do some automated HTML editing. If nothing else changes, a short script (e.g. a Perl one-liner) can handle the renaming.
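The date-based renaming can be a one-liner in shell too; assuming the main file is called page.html (a hypothetical name):

```shell
# Copy the freshly fetched HTML to a dated snapshot name,
# e.g. page.html -> page-2024-01-31.html
snapshot() {
  src="$1"
  cp "$src" "${src%.html}-$(date +%Y-%m-%d).html"
}
```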
If the other associated files are static, you only have to upload them all once; after that, just automatically upload the current version of the main HTML file into the same directory. You may want to try cURL for automatic uploading; it works with all the protocols I know of.
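A sketch of the upload step with cURL over FTP; the host, path, and credentials here are placeholders you'd replace with your own:

```shell
# Upload one file to a remote directory with curl.
# -T uploads the named file; --user supplies FTP credentials.
upload_snapshot() {
  curl -T "$1" "ftp://ftp.example.com/snapshots/" --user "USER:PASSWORD"
}

# Usage (not run here):
# upload_snapshot "page-2024-01-31.html"
```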