
Is there software to download the entire contents of a website?

magomago

Lifer
Sep 28, 2002
Hi -

The reason I ask is simple: I'm a student. I like to have a copy of all the homework solutions, lectures, etc. because I find myself going back and checking them often. Typically I just spend an hour or two (...yeah...that much stuff) downloading each individual file and organizing everything.

However, this time around I'm wondering if there is a program that will do that for me....kind of like "saving the webpage", but also saving every file that is directly linked from the page, so that I just need to open the .html and click on anything I want, and it already exists locally.

I would prefer some kind of utility for Linux since that is what I use, but I know that most of the solutions people suggest exist only in a Windows context, so I will gladly accept those as well since I have a laptop with XP on it.

Thanks :)
 

screw3d

Diamond Member
Nov 6, 2001
wget is the bomb diggity

I'm surprised that as a Linux user, you don't know wget ;)
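For reference, a typical recursive mirror command (the URL here is just a placeholder, swap in the actual course page) would be something like:

wget --mirror --convert-links --page-requisites --no-parent http://example.edu/course/

--mirror does the recursive download, --convert-links rewrites the links so the local copy is clickable, --page-requisites grabs the images/CSS each page needs, and --no-parent keeps it from climbing above the starting directory.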
 

WildHorse

Diamond Member
Jun 29, 2003
Free Download Manager will download a web site.

Get it from http://www.freedownloadmanager.org/

Click the "HTML Spider" tab, which causes a new drop down menu to appear which is also labeled "HTML Spider."

On that drop down menu click "Download web site" button, then set the depth of downloading you want the spider to follow.

Also notice the drop down menu labeled Options, go into HTML Spider Default Settings. It lets you tune it to also download images and files that link to other sites, or you can tell it to download all such images & files except any particular file extensions you'd like to exclude.
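If you'd rather do the same thing from Linux, the rough wget equivalent of those spider settings (depth limit, extension exclusions; the URL and extensions below are just examples) would be something along the lines of:

wget -r -l 4 -p -k -R "zip,exe" http://example.edu/course/

where -r/-l set the recursion and its depth, -p pulls in the images each page needs, -k fixes up the links for local browsing, and -R skips any extensions you want to exclude.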

 

ElDonAntonio

Senior member
Aug 4, 2001
You can try this open-source program for Windows:
Weblight

It allows you to download all files with a specific extension from a site (e.g. all PDFs). That might not be what you want though. wget should do the job too.
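If grabbing one file type is actually all you need, wget can do that too. Something like this (the URL is only a placeholder) pulls just the PDFs linked from a page:

wget -r -l 1 -np -A pdf http://example.edu/course/notes/

-A keeps only the listed extensions, -l 1 stops it after one level of links, and -np stops it from wandering above the starting directory.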
 

magomago

Lifer
Sep 28, 2002
Hey guys...I'm having some problems >< I've tried to figure this out for a while but I simply can't get it. I've tried both Windows and Linux programs - pretty much everything on the list.

For example this is one of the sites I want to download:

http://eee.uci.edu/07w/15080/

Yet beyond the first page almost everything else fails...and I haven't figured out why :\
 

WildHorse

Diamond Member
Jun 29, 2003
Originally posted by: magomago
bump :(

Just do exactly what I suggested above with FDM and you'll have yourself a real nice download.

Set the spider to crawl say maybe 4 levels, how's that?
That ought to return the entire website, with all its words, pictures and links.

Just go ahead and TRY IT. Works for me, simple as pie, just a few clicks & you're done!

What's blocking you from trying? Is it because of too much sun baking you down there in Temecula? Just do it. It'll take you maybe 2 minutes.