Say you want to download a URL. Just type:
wget http://fly.srk.fer.hr/
But what will happen if the connection is slow and the file is large?
The connection will probably fail before the whole file is retrieved,
more than once. In this case, Wget will try getting the file until it
either retrieves the whole of it or exceeds the default number of
retries (this being 20). It is easy to change the number of tries to
45, to ensure that the whole file arrives safely:
wget --tries=45 http://fly.srk.fer.hr/jpg/flyweb.jpg
Now let's leave Wget to work in the background and write its progress
to the log file `log'. Since it is tiring to type `--tries', we shall
use `-t'.
wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &
The ampersand at the end of the line makes sure that Wget works in the
background. To remove the limit on the number of retries, use `-t inf'.
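Once Wget is running in the background, you can check on its progress
by watching the log file it is writing. A minimal sketch, reusing the
example URL and the log file `log' from above:

```shell
# Start the download in the background, writing progress to `log'.
wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &

# Follow the log as Wget appends to it. Ctrl-C stops the watching;
# the download itself keeps running in the background.
tail -f log
```

`tail -f` is a standard way to follow any growing log file, not
specific to Wget.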
Using FTP is just as simple. Wget will take care of the login and
password.
wget ftp://gnjilux.srk.fer.hr/welcome.msg
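For anonymous FTP, as above, no credentials are needed. For a
password-protected account, the username and password can be embedded
in the URL itself; the host and path below are made-up placeholders:

```shell
# Anonymous FTP: Wget logs in as `anonymous' automatically.
wget ftp://gnjilux.srk.fer.hr/welcome.msg

# Non-anonymous FTP: user and password embedded in the URL
# (hypothetical account and host, shown only to illustrate the syntax).
wget ftp://user:password@host/path/file
```

Note that a password given this way is visible to anyone who can see
your command line or shell history.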
If you specify a directory, Wget will retrieve the directory listing,
parse it, and convert it to HTML. Try:
wget ftp://prep.ai.mit.edu/pub/gnu/
links index.html