Hi. I want to download the contents of an entire directory online. At the end of the quarter I typically save all the files that the professor posted for future reference. I thought about trying to automate it with wget. Apparently wget can download a webpage along with all associated links and rewrite the links, so I could potentially open the page locally and click through to the files.
However, I tried that and I'm having problems. The best I can get is a download of JUST the .htm files, plus the directory structure all the way up to the original www.bla.com. That is, if I want just the stuff inside bla.com/cool/i/love/at, it instead gives me JUST the html files for all of bla.com.
This is the command I'm currently using:
wget -r -l0 http://website
I searched around and there seem to be a few variations, but this form consistently appears. Does anyone have any advice? And if what I'm asking for isn't possible (i.e., mirroring the webpage), is there a way to just grab all the files within a directory? The files are almost exclusively PowerPoints and PDFs.
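From skimming the wget man page, I think something like this might be what I'm after, though I haven't confirmed it works for my site (the URL and path here are just placeholders):

```shell
# -r            : recursive download
# -np           : --no-parent, don't climb above the starting directory
#                 (this should stop it from pulling in all of bla.com)
# -nH           : don't create a www.bla.com/ host directory locally
# --cut-dirs=3  : drop the leading path components (cool/i/love) locally
# -A            : only keep files matching these suffixes; wget still
#                 fetches .htm pages to find links, but deletes them after
# -k            : --convert-links, rewrite links for local browsing
wget -r -np -nH --cut-dirs=3 -A pdf,ppt,pptx -k http://www.bla.com/cool/i/love/at/
```

If I only want the documents and don't care about clicking through pages locally, I'd guess dropping `-k` and keeping the `-A pdf,ppt,pptx` accept list is enough.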
Thanks!