The examples are divided into three sections loosely based on their complexity.
wget http://fly.srk.fer.hr/
wget --tries=45 http://fly.srk.fer.hr/jpg/flyweb.jpg
wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &
The ampersand at the end of the line makes sure that Wget works in the background. To unlimit the number of retries, use `-t inf'.
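A sketch of the unlimited-retry variant mentioned above, using the same URL and log file:

```shell
# Retry indefinitely instead of 45 times; output is logged to the file
# "log", and the trailing '&' keeps Wget running in the background.
wget -t inf -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &
```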
wget ftp://gnjilux.srk.fer.hr/welcome.msg
wget ftp://prep.ai.mit.edu/pub/gnu/
links index.html
wget -i file
If you specify `-' as file name, the URLs will be read from standard input.
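For instance, assuming a hypothetical file urls.txt containing one URL per line, the standard-input form could be used like this:

```shell
# '-' as the argument to -i tells Wget to read the URL list from
# standard input; here it arrives via shell redirection from urls.txt
# (a hypothetical file, one URL per line).
wget -i - < urls.txt
```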
wget -r http://www.gnu.org/ -o gnulog
wget --convert-links -r http://www.gnu.org/ -o gnulog
wget -p --convert-links http://www.server.com/dir/page.html
The HTML page will be saved to `www.server.com/dir/page.html', and the images, stylesheets, etc., somewhere under `www.server.com/', depending on where they were on the remote server.
wget -p --convert-links -nH -nd -Pdownload \
     http://www.server.com/dir/page.html
wget -S http://www.lycos.com/
wget -s http://www.lycos.com/
more index.html
wget -r -l2 -P/tmp ftp://wuarchive.wustl.edu/
wget -r -l1 --no-parent -A.gif http://www.server.com/dir/
More verbose, but the effect is the same. `-r -l1' means to retrieve recursively (see section Recursive Retrieval), with maximum depth of 1. `--no-parent' means that references to the parent directory are ignored (see section Directory-Based Limits), and `-A.gif' means to download only the GIF files. `-A "*.gif"' would have worked too.
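The quoted-wildcard variant mentioned above would look like this:

```shell
# Same effect as -A.gif; the pattern is quoted so the shell does not
# expand the wildcard before Wget sees it.
wget -r -l1 --no-parent -A "*.gif" http://www.server.com/dir/
```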
wget -nc -r http://www.gnu.org/
wget ftp://hniksic:mypassword@unix.server.com/.emacs
wget -O - http://jagor.srce.hr/ http://www.srce.hr/
You can also combine the two options and make pipelines to retrieve the documents from remote hotlists:
wget -O - http://cool.list.com/ | wget --force-html -i -
crontab
0 0 * * 0 wget --mirror http://www.gnu.org/ -o /home/me/weeklog
wget --mirror --convert-links --backup-converted \
     http://www.gnu.org/ -o /home/me/weeklog
wget --mirror --convert-links --backup-converted \
     --html-extension -o /home/me/weeklog \
     http://www.gnu.org/
Or, with less typing:
wget -m -k -K -E http://www.gnu.org/ -o /home/me/weeklog