The wget command is used to retrieve content from servers via HTTP, HTTPS, and FTP. It simplifies many downloading tasks that would otherwise require browsing a website and manually clicking links to save each file. Wget performs the same work from the command line, and adds time-saving abilities such as downloading directories recursively.

In this article, we'll show you what wget is capable of and provide you with example commands that you can use in your own Linux terminal.

In this tutorial you will learn:
  • How to download a file from a website with wget
  • How to download a directory
  • How to mirror a website
  • How to download and untar a file automatically
  • How to authenticate with wget
  • How to use quiet mode with wget
Wget command on Linux
Software Requirements and Linux Command Line Conventions
Category Requirements, Conventions or Software Version Used
System Linux (any distribution)
Software wget
Other Privileged access to your Linux system as root or via the sudo command.
Conventions # - requires given Linux commands to be executed with root privileges either directly as a root user or by use of the sudo command
$ - requires given Linux commands to be executed as a regular non-privileged user

Download a file from a website with wget


Wget makes file downloads painless and easy. It's probably the best command line tool on Linux for the job, though other tools, such as cURL, can also perform the task.

Let's take a look at a few examples of how we could use wget to download a Linux distribution, which developer websites offer as ISO files.

The most basic command you can execute with wget is just supplying the URL of the file you wish to download.

$ wget http://example.com/linux.iso
Downloading an ISO file with Wget on Linux

Wget will download the specified file to whatever location you are running the command from. It will show the progress of the download, current speed, and estimated time of completion. It also spits out some other information about its process of connecting to the server and requesting the file. That output can be helpful when diagnosing a connection issue.

Specify directory and file name

You can also specify a directory for the file to download to, as well as choose a name for the download. Use the -O (output) option and enter the path and file name after the URL. If you only want to choose the directory and keep the original file name, use the -P (directory prefix) option instead.

$ wget http://example.com/linux.iso -O /path/to/dir/myfile.iso
Specify where you want to save the downloaded file

Resuming downloads

A nifty feature of wget is that it can resume downloads. If your file download was interrupted, either unintentionally or because you needed to stop it with Ctrl+C, you can pick up right where you left off by using the -c option. Just make sure you are in the correct directory, or tell wget where to find the partial file with -O.

$ wget -c http://example.com/linux.iso
Wget is resuming an interrupted download, as indicated by Partial Content

Download multiple files

If you want to download more than one file, create a text document that contains a list of download links, with each URL on a separate line. Then, run the wget command with the -i option and specify the path to your text document.

$ wget -i download-links.txt
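The list file is just plain text with one URL per line. A quick sketch of what it might look like, using hypothetical URLs:

```shell
# Create a list of download links, one URL per line (hypothetical URLs).
cat > download-links.txt <<'EOF'
http://example.com/linux-1.iso
http://example.com/linux-2.iso
http://example.com/checksums.txt
EOF

# Then hand the whole list to wget:
# wget -i download-links.txt
```

Wget will fetch each URL in the file, in order, skipping any blank lines.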

Limit download speed

Another handy option is limiting wget's download speed. This is useful if you don't want a large download to consume all of your network bandwidth, which could cause latency for other users on your network. Use the --limit-rate flag and specify k for kilobytes, m for megabytes, or g for gigabytes. For example, this would download a file at a maximum rate of 500 KB per second:

$ wget --limit-rate 500k http://example.com/linux.iso

Download a directory

Wget can download an entire directory, recursively, from either an FTP or web (HTTP/HTTPS) server. For FTP, just use the -r (recursive) option in your command and specify the directory you want to get.

$ wget -r ftp://example.com/path/to/dir

If you are trying to download a directory from a website, the command is much the same, but in most cases you will also want to append the --no-parent (or just -np) option so wget doesn't follow links back up to the index of the site.

$ wget -r -np http://example.com/directory

How to mirror a website

Wget has the ability to follow all the links on a website, downloading everything it comes across as it goes. This makes wget an extremely powerful tool because not only can it download a directory or multiple files, it can actually mirror an entire website.

Websites are made up of HTML files, and usually you'll also find some .jpg or .png image files, .css (style sheets), .js (JavaScript), and a variety of others. Wget can find all these files automatically and download them into the same directory structure as the website, which would essentially give you an offline version of that site.

Include the -m (mirror) flag in your wget command and the URL of the site you want to mirror.

$ wget -m http://example.com

In most cases, you'll also want to include the -p option in your command, which tells wget to download all the files required to display the offline website correctly, such as style sheets. The -k option can also make the site display better, as it rewrites links in the downloaded pages so they point to your local copies. Whether or not you'll need these options just depends on the site you're mirroring.

$ wget -m -p -k http://example.com
Wget command being used to mirror a website

Download and untar a file automatically

You can save some time when downloading a tar archive by piping wget's output to tar, so the file downloads and extracts in one command. To do so, use the -O - option, which tells wget to write the file to standard output, then pipe it directly into your tar command.

For example, to download the latest version of WordPress and extract the tar archive in a single command:

$ wget https://wordpress.org/latest.tar.gz -O - | tar -xz

How to authenticate with wget

If the HTTP or FTP server you are trying to download from requires authentication, you have a couple of options for supplying a username and password with wget. These example commands work with both FTP and HTTP.

The first option is to supply the username and password in the wget command itself, which is not the safest method, since your password is visible to anyone looking at your screen or viewing your user's command history:

$ wget --user=USERNAME --password=SECRET http://example.com/SecureFile.txt

You just need to replace USERNAME and SECRET with the appropriate information.

The second option is to let wget prompt you for a password, which keeps it hidden from those who can see your screen or look through your command history:

$ wget --user=USERNAME --ask-password http://example.com/SecureFile.txt
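A third approach keeps the password out of the command line entirely: a ~/.netrc file, which wget reads by default when a server asks for credentials. A minimal sketch, using a scratch HOME directory and the same placeholder USERNAME/SECRET values as above:

```shell
# Use a throwaway HOME so this demo doesn't touch any real credentials.
export HOME="$(mktemp -d)"

# Store placeholder credentials for a hypothetical host in ~/.netrc.
cat > "$HOME/.netrc" <<'EOF'
machine example.com
login USERNAME
password SECRET
EOF

# Keep the file private, since it stores the password in plain text.
chmod 600 "$HOME/.netrc"

# wget can then pick up the credentials automatically:
# wget http://example.com/SecureFile.txt
```

The trade-off is that the password sits on disk in plain text, so restrictive file permissions are essential.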

Use quiet mode with wget

To suppress all the output that wget normally displays, use the -q (quiet) option. This is especially useful when saving to standard output (-O -), which can otherwise flood your terminal with text. When using -q, you'll know your download has completed when your terminal returns to a normal prompt, since wget won't give you any indication itself.

$ wget -q http://example.com

A somewhat similar option is to background the wget command with -b. This will allow you to close your terminal or continue using it for something else while the wget command continues its job in the background.

$ wget -b http://example.com/linux.iso

Wget will log the usual output in a text file (wget-log by default) and tell you the process ID. If you want to cancel the download, you can always use the kill command followed by the PID.
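Since wget -b prints its PID when it starts, cancelling the download is just a matter of killing that PID. Here is a sketch of the pattern, using a long-running sleep as a stand-in for the backgrounded wget so it runs without a network connection:

```shell
# Stand-in for a backgrounded download; wget -b would print its PID on start.
sleep 300 &
pid=$!

# Cancel the job by PID, then reap it so the PID is fully released.
kill "$pid"
wait "$pid" 2>/dev/null

# Verify it is gone (kill -0 only tests whether the process exists).
if kill -0 "$pid" 2>/dev/null; then
    echo "still running"
else
    echo "stopped"
fi
```

With a real backgrounded wget, you can also check the progress it made before being killed by reading the wget-log file it left behind.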

Using the wget command to background a download

Conclusion

Wget is simply the best command line utility you can use to download files on Linux. It has many options, a lot of them built to save you time, such as the feature to download recursively. In this article, we covered some of the basic uses of the wget command. Believe it or not, this is only scratching the surface of what it can do.
