If everyone started to read the manual then bloggers like me would…. eh no… I would still blog even if no one reads this.
This is pretty simple, but it is really nice and may inspire you to investigate more. If you have a list of links to download, all you have to do is put them in a text file, one per line. Then fire up wget from the terminal like this:
wget -i path/to/textfile
This will download all of those files one by one and save them in the current working directory. Really handy if you already have a list of links to download. And if the links you want are all on the same page, you can grab them with a download plugin like DownThemAll for Firefox.
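For example, assuming a file called links.txt (the name and URLs here are just placeholders) that looks like this:

http://example.com/file1.zip
http://example.com/file2.zip
http://example.com/file3.zip

you would run

wget -i links.txt -P downloads/

The -P flag tells wget to save everything into the downloads/ directory instead of the current one.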
You can use wget to do a lot more. Read the manual. You can use wget in bash scripts as well, since it is a command line tool. Imagine the possibilities. You could write a bash script that grabs all files matching a certain pattern from a particular location, as sketched below. I once downloaded a full website containing more than 1,000 HTML pages with wget, and it worked really well. That was simple, not much scripting involved. You just have to look for the proper parameters to use with wget.
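As a rough sketch of what I mean (the URL, directory names and file pattern are just placeholders), grabbing every PDF under one section of a site could look something like this:

#!/bin/bash
# Recursively fetch PDFs up to 2 levels deep, without climbing to parent directories
wget -r -np -l 2 -A '*.pdf' -P pdfs/ http://example.com/docs/

And mirroring a whole site of HTML pages is just a different set of flags:

wget --mirror --no-parent --accept html,htm --directory-prefix=site-copy/ http://example.com/

Here -r / --mirror turn on recursive downloading, -np / --no-parent stop wget from wandering up the directory tree, and -A / --accept filters which files get kept.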