looking for webpage saving software

Red Squirrel

I need to be able to save web pages EXACTLY as they appear, 100% local (download directly linked files to disk too). I can't seem to find a way to do that. Most browsers' save-as features don't work worth crap; half the stuff doesn't get linked properly. PDF writers are very buggy and crash when used intensively. HTTrack is not an option either: it won't save everything, I can't use cookies to log in to a forum, for example, and there's no way to add a delay between requests, so it just hammers the server.
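For example, the cookie-login-plus-delay part is easy enough to script. A rough Python sketch of what I mean (untested, and the login URL and form field names are made up -- you'd have to match the forum's actual login form):

import time
import urllib.parse
import urllib.request
import http.cookiejar

# keep the session cookie around between requests
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

# log in once so the cookie lands in the jar
login = urllib.parse.urlencode({"username": "me", "password": "secret"}).encode()
opener.open("http://forum.example.com/login.php", login)

# then pull each page of the thread with a pause in between
for page in range(1, 11):
    url = "http://forum.example.com/thread.php?t=1234&page=%d" % page
    with open("thread-page-%02d.html" % page, "wb") as f:
        f.write(opener.open(url).read())
    time.sleep(2)  # be polite, don't hammer the server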

I was thinking of something that takes a JPG image of the web page, so it's a 100% replica of it. Print screen sort of works, but when you need to save a thread 10 pages long, it's a job to cut and paste it all together into one JPG. I use Firefox, so an extension that does this would be sweet, but even if another browser has such a feature I don't mind downloading another browser. I just need to be able to quickly save web pages 100%.
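Actually, the gluing part I could script; if the per-page shots were saved as files, stacking them into one tall image is simple. A rough sketch (assumes the Pillow library and same-width screenshots; the page-01.png style names are made up):

from PIL import Image

shots = [Image.open("page-%02d.png" % n) for n in range(1, 11)]

# one canvas tall enough for all the shots stacked vertically
combined = Image.new("RGB", (max(s.width for s in shots),
                             sum(s.height for s in shots)), "white")
y = 0
for s in shots:
    combined.paste(s, (0, y))
    y += s.height

combined.save("thread.jpg", quality=90)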

Oh, and it has to be free; I wouldn't want to pay hundreds just for something like this. I could code it myself, it's just time consuming, and I hate coding with sockets, really tedious.

Also, I have a server, so if there's a way to set up a caching proxy that can be configured to store viewed pages as normal HTML and link them together as I browse, that would be cool too, as I could simply use the proxy whenever I want to save stuff permanently. Maybe some kind of mod to Squid or something.
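As a toy, something along these lines is the idea: a little Python proxy that saves every page it serves (plain HTTP only, no HTTPS, and no link rewriting -- just to show the save-as-you-browse part; a real version would be a Squid add-on or similar):

import hashlib
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

CACHE_DIR = "proxy-cache"

class SavingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # in proxy mode the browser sends the full URL as the request path
        body = urllib.request.urlopen(self.path).read()
        os.makedirs(CACHE_DIR, exist_ok=True)
        name = hashlib.md5(self.path.encode()).hexdigest() + ".html"
        with open(os.path.join(CACHE_DIR, name), "wb") as f:
            f.write(body)  # keep a permanent local copy
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)  # and hand the page to the browser

HTTPServer(("127.0.0.1", 8080), SavingProxy).serve_forever()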
 

jlee

'Save as web page (complete)' has worked for whatever I've needed... but I can't think of anything else. :(
 

Red Squirrel

I find "save web page complete" leaves lot of links linking to the server still, like images. So let's say in years I open it, everything is missing. So that's why I need something more advanced. I'll take a look at what has been mentioned that I have not tried and any further suggestions still welcome. Thanks.
 

Red Squirrel

Will that get everything, or just the links I tell it to? Mind you, if I make my own, I could use a wget backend; handling binary data is not the easiest thing to do, so that would save me a bundle right there.
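If I go that route, the front end could just shell out to wget with the right flags and let it do all the binary handling. Rough sketch (these are standard wget options; untested):

import subprocess

def mirror(url, cookies_file=None, delay=2):
    cmd = [
        "wget",
        "--mirror",           # recurse the site, keep timestamps
        "--page-requisites",  # also grab the images/CSS the pages need
        "--convert-links",    # rewrite links to point at the local copies
        "--wait=%d" % delay,  # pause between requests
    ]
    if cookies_file:
        # cookies.txt exported from the browser (Netscape format)
        cmd.append("--load-cookies=%s" % cookies_file)
    cmd.append(url)
    subprocess.run(cmd, check=True)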