Bulk downloads using wget

I recently had to download a large number of ZIP files (14,848) whose links I had collected in a text file. Although they all shared the same directory path, recursive wget was not an option: the server had directory indexes disabled, so trying a recursive pull only returned a 403 Forbidden error.

So I pasted all the links into a text file called dld.txt and wrote a simple bash one-liner to loop through the file and download them one by one. The command (yes, I know it looks ugly) is:

for i in $(cat dld.txt); do wget "$i"; done
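Before launching a loop over 14,848 links, it can be worth a dry run to see exactly which commands the loop will execute. A quick way is to prefix wget with echo; the file name here matches the post, but the URLs are hypothetical placeholders:

```shell
# Build a small sample list (hypothetical URLs, for illustration)
printf '%s\n' \
  'http://example.com/files/a.zip' \
  'http://example.com/files/b.zip' > dld.txt

# Dry run: print the commands the loop would execute, without downloading
for i in $(cat dld.txt); do echo wget "$i"; done
```

Once the printed commands look right, drop the echo and the loop downloads for real.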

If you do a lot of downloading like this and want to automate things you can simply drop a bash script and execute it whenever you need it:

#!/bin/bash
file="$1"
for i in $(cat "$file")
do wget "$i"
done
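One caveat: the for loop splits the file on any whitespace, so a URL containing a space would be mangled. A slightly more defensive sketch (my own variant, not from the original script) reads the file line by line with while read; wget is swapped for echo here so the loop can be demonstrated without hitting the network, and the file name and URLs are hypothetical:

```shell
# Sample list of links (hypothetical URLs, for illustration)
printf '%s\n' \
  'http://example.com/one.zip' \
  'http://example.com/two.zip' > links.txt

# Read one URL per line: -r keeps backslashes literal, IFS= keeps
# leading/trailing whitespace, and blank lines are skipped
while IFS= read -r url; do
  [ -n "$url" ] || continue
  echo wget "$url"   # drop 'echo' to actually download
done < links.txt
```

For completeness, wget can also read a list of URLs straight from a file with its documented -i/--input-file option (`wget -i links.txt`), which does the same job without any loop at all.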

You can save the script above as get.sh (or whatever name you like), make it executable with chmod +x get.sh, and then, once your file of links is ready, run it like this:

./get.sh links.txt

Hope you find this as useful as I did 😀
