
Having trouble using wget...

Hi. I want to download the contents of an entire directory online. At the end of the quarter I typically save all the files that the professor posted for future reference. I thought about trying to automate it with wget. Apparently wget can download a webpage and all associated links and rewrite the links, so I could potentially open the page locally and click to open files.

However, I tried that and I'm having problems. The best I can get is a download of JUST the .htm files, plus the directory structure all the way up to the original www.bla.com. i.e.: if I wanted just the stuff inside bla.com/cool/i/love/at, it instead gives me JUST the html files for all of bla.com.

This is the command I'm currently using:

wget -r -l0 http://website

I searched around and there seem to be a few other variants, but this form consistently appears. Does anyone have any advice? And if what I'm asking isn't possible (i.e. mirroring the page itself), is there a way to get all the files within a directory? (The files are almost exclusively PowerPoints and PDFs.)
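For what it's worth, here is a sketch of the wget flags that usually handle this kind of "grab one directory subtree" job. The URL is a placeholder standing in for the professor's directory, and the --cut-dirs count assumes four leading path components as in the bla.com example above; adjust both for the real site.

```shell
# Recursively fetch one subtree, keeping only the file types we care about:
#   -r               recurse through links
#   -np              no-parent: never climb above the starting directory
#   -l inf           no depth limit within that subtree
#   -A pdf,ppt,pptx  accept list: keep only these extensions
#   -nH              don't create a www.bla.com/ directory locally
#   --cut-dirs=4     also drop the 4 leading path components (cool/i/love/at)
wget -r -np -l inf -A 'pdf,ppt,pptx' -nH --cut-dirs=4 \
     http://www.bla.com/cool/i/love/at/
```

One caveat: with an accept list (-A), wget still downloads the HTML pages to find the links but deletes them afterwards, so if you want the clickable local page rather than just the files, drop -A and add -k (--convert-links) to rewrite the links for offline browsing.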

Thanks!
 