I have a web directory where I store some config files. I'd like to use wget to pull those files down and maintain their current structure. For instance, the remote directory...
This version downloads recursively and doesn't recreate the parent directories locally:
wgetod() {
    # Reduce the URL to its path and count the slashes in it
    NSLASH="$(echo "$1" | perl -pe 's|.*://[^/]+(.*?)/?$|\1|' | grep -o / | wc -l)"
    # Cut all leading path components so the requested directory becomes the local root
    NCUT=$((NSLASH > 0 ? NSLASH-1 : 0))
    wget -r -nH --user-agent=Mozilla/5.0 --cut-dirs=$NCUT --no-parent --reject="index.html*" "$1"
}
Usage: add it to ~/.bashrc or paste it into a terminal, then run:
wgetod "http://example.com/x/"
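To make the --cut-dirs arithmetic concrete, here is a walk-through with a hypothetical deeper URL (the path is made up purely for illustration):

wgetod "http://example.com/configs/app/"
# The perl expression strips the scheme and host, leaving the path: /configs/app
# grep -o / | wc -l counts 2 slashes, so NSLASH=2 and NCUT=1
# --cut-dirs=1 drops "configs" and -nH drops the host directory,
# so the files land under ./app/ with their remote structure intact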
You should be able to do it simply by adding -r:
wget -r http://stackoverflow.com/
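Note that -r on its own will recreate the host name as a local directory and can follow links up out of the directory you started from. A rough variant (same idea as the function above, using standard wget options) that keeps the download confined and the local tree flatter:

wget -r --no-parent -nH --reject="index.html*" http://stackoverflow.com/
# --no-parent stops wget from ascending above the starting directory
# -nH skips creating a stackoverflow.com/ host directory locally
# --reject="index.html*" drops the auto-generated directory listing pages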