How To Download A File From The Command Line

Download Files Using wget

Introduction

In this guide, I will show you how to download a file using the Linux command line.

Why would you want to do this? Why wouldn't you just use a web browser in a graphical environment?

Sometimes there isn't a graphical environment. For instance, if you are connecting to your Raspberry Pi over SSH, you are largely limited to the command line.

Another reason for using the command line is that you can create a script with a list of files to download.

You can then execute the script and let it run in the background.
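For instance, a minimal sketch of such a script might look like this (the script name and URLs are placeholders, not real downloads):

#!/bin/bash
# download-files.sh - fetch each file in turn; replace the URLs with your own
wget http://example.com/files/first.iso
wget http://example.com/files/second.iso

Make the script executable with chmod +x download-files.sh, then start it with nohup ./download-files.sh & to leave it downloading in the background.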

The tool that I will be highlighting for this task is called wget.

Installation

Many Linux distributions already have wget installed by default.

If it isn't already installed then try one of the following commands:

  • Ubuntu, Debian, Mint etc - sudo apt-get install wget
  • Fedora, CentOS etc - sudo yum install wget (or sudo dnf install wget on newer Fedora releases)
  • openSUSE - sudo zypper install wget
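To check whether wget is available, and which version you have, run:

wget --version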

Downloading Files

In order to download a file, you need to know at the very least the URL of the file you wish to download.

For instance, imagine you wish to download the latest version of Ubuntu using the command line. You can visit the Ubuntu website and navigate to the downloads page, which provides a download link. Right-click on this link and copy the URL of the Ubuntu ISO you wish to download.

To download the file with wget, use the following syntax:

wget http://releases.ubuntu.com/14.04.3/ubuntu-14.04.3-desktop-amd64.iso
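If the link you copy includes query parameters (the ?_ga=... tracking string that some download pages append, for example), wget will use them as part of the saved filename. The -O switch lets you pick the output filename yourself:

wget -O ubuntu.iso http://releases.ubuntu.com/14.04.3/ubuntu-14.04.3-desktop-amd64.iso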

This is all well and good, but it requires you to know the full URL of the file you want to download.

It is possible to download an entire site by using the following command:

wget -r http://www.ubuntu.com

The above command copies the entire site including all the folders from the Ubuntu website. This is of course not advisable because it would download lots of files you don't need. It is like using a mallet to shell a nut.
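If you do take the recursive approach, the -l switch limits how many levels of links wget will follow. For example, to retrieve only pages one level deep:

wget -r -l 1 http://www.ubuntu.com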

You could, however, download all files with the ISO extension from the Ubuntu website using the following command:

wget -r -A "iso" http://www.ubuntu.com

This is still a bit of a smash and grab approach to downloading the files you need from a website. It is much better to know the URL or URLs of the files you wish to download.

You can specify a list of files to download using the -i switch. You can create a list of URLs using a text editor as follows:

nano filestodownload.txt

Within the file, enter a list of URLs, one per line. Each URL should point directly at the file itself (the addresses below are placeholders):

http://example.com/images/wallpaper-1.jpg
http://example.com/images/wallpaper-2.jpg
http://example.com/images/wallpaper-3.jpg

Save the file by pressing CTRL+O and then exit nano with CTRL+X.
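Alternatively, you can create the same file straight from the shell without opening an editor by using a heredoc (again, the URLs are placeholders):

cat > filestodownload.txt << 'EOF'
http://example.com/images/wallpaper-1.jpg
http://example.com/images/wallpaper-2.jpg
http://example.com/images/wallpaper-3.jpg
EOF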

You can now use wget to download all of the files using the following command:

wget -i filestodownload.txt
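To start the downloads and return to your prompt immediately, add the -b switch. wget detaches and writes its progress to a file named wget-log in the current directory:

wget -b -i filestodownload.txt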

The trouble with downloading files from the internet is that sometimes the file or URL is unavailable.

The default connection timeout can be long, and if you are trying to download lots of files it is counter-productive to wait for each unavailable one to fail.

You can specify your own timeout using the following syntax:

wget -T 5 -i filestodownload.txt
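The -T switch sets the timeout in seconds, so the command above gives up on an unresponsive server after 5 seconds. You can also cap the number of retries per file with the -t switch:

wget -T 5 -t 2 -i filestodownload.txt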

If you have a download limit as part of your broadband deal then you might wish to limit the amount of data that wget can retrieve.

Use the following syntax to apply a download limit:

wget --quota=100m -i filestodownload.txt

The above command will stop downloading files once 100 megabytes have been retrieved. You can also specify the quota in bytes (no suffix) or kilobytes (use k instead of m).
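For example, to stop after roughly 500 kilobytes:

wget --quota=500k -i filestodownload.txt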

You may not have a download limit but you might have a slow internet connection. If you want to download files without destroying everybody's internet time then you can specify a limit which sets a maximum download rate.

For example:

wget --limit-rate=20k -i filestodownload.txt

The above command will limit the download rate to 20 kilobytes per second. You can specify the amount in bytes, kilobytes or megabytes.

If you want to make sure that any existing files aren't overwritten you can run the following command:

wget -nc -i filestodownload.txt

If a file from the list of URLs already exists in the download location, wget skips it rather than overwriting it.
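If, instead, you want wget to fetch a file again only when the copy on the server is newer than your local one, use the -N (timestamping) switch in place of -nc:

wget -N -i filestodownload.txt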

The internet, as we know, isn't always reliable; a download can be only partially complete when your connection drops out.

Wouldn't it be good if you could just continue where you left off? You can continue a download by using the following syntax:

wget -c <url>
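For example, to resume the Ubuntu ISO download from earlier, run the same command again with -c added. wget finds the partial file in the current directory and picks up where it stopped:

wget -c http://releases.ubuntu.com/14.04.3/ubuntu-14.04.3-desktop-amd64.iso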

Summary

The wget command has dozens of other switches that can be applied. To see the full list, run the following command from a terminal window:

man wget