What Is Curl And Why Would You Use It?

The curl Command

The manual page for the "curl" command has the following description:

curl is a tool to transfer data from or to a server, using one of the supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET and TFTP). The command is designed to work without user interaction.

In short, you can use curl to transfer data to or from a server, most commonly to download content from the internet.

For example, if you run the curl command with the web address set to http://linux.about.com/cs/linux101/g/curl.htm, the page at that address will be downloaded.

By default the downloaded content is written to standard output (your terminal), but you can also specify a filename to save it to. The URL can point to a site's home page, such as www.about.com, or to an individual page on the site.
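As a quick sketch of both behaviours (example.com is a placeholder address):

```shell
# Print a page to standard output (the default behaviour)
curl http://example.com/

# Save the same page to a file instead of printing it
curl -o page.html http://example.com/
```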

You can use curl to download web pages, images, documents, and other files. For instance, to download the latest version of Ubuntu Linux you can simply run the following command:

curl -o ubuntu.iso http://releases.ubuntu.com/16.04.1/ubuntu-16.04.1-desktop-amd64.iso

Should I Use Curl Or Wget?

I have been asked "should I use curl or wget?" a number of times, and the answer is that it depends on what you are trying to achieve.

The wget command is used to download files over a network such as the internet.

The main benefit of using the wget command is that it can be used to recursively download files. Therefore if you want to download an entire website you can do so with one simple command. The wget command is also good for downloading lots of files.
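A rough sketch of both wget use cases (www.mysite.com is a placeholder, and urls.txt is a hypothetical list of addresses):

```shell
# Recursively download a site, following links up to two levels deep
wget --recursive --level=2 http://www.mysite.com/

# Download lots of files, listed one URL per line in urls.txt
wget -i urls.txt
```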

The curl command lets you use wildcards to specify the URLs you wish to retrieve.

So if you know that "http://www.mysite.com/images/image1.jpg" and "http://www.mysite.com/images/image2.jpg" are both valid URLs, you can download both images with a single pattern passed to the curl command.
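That example looks like this with curl's URL globbing (the -O flag saves each file under its remote name; quote the URL so the shell does not interpret the brackets):

```shell
# Download image1.jpg and image2.jpg with one globbed URL
curl -O "http://www.mysite.com/images/image[1-2].jpg"
```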

The wget command automatically retries when a download fails, something curl historically could not do (although modern versions of curl provide a --retry option, and both tools can resume interrupted transfers).

You can get a good idea of what each tool can and cannot do from this page. Bizarrely, one of the differences listed there is that you can type wget using just your left hand on a QWERTY keyboard.

Thus far these have mostly been reasons to use wget over curl, so why would you use curl over wget?

The curl command supports more protocols than wget and provides better support for SSL. It also supports more authentication methods, and it works on more platforms than the wget command.
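A small sketch of those strengths, assuming a password-protected site (user, secret, and www.mysite.com are placeholders):

```shell
# Fetch a protected page using HTTP Basic authentication
curl -u user:secret http://www.mysite.com/private/page.html

# curl also speaks non-HTTP protocols, e.g. downloading a file over SFTP
curl -u user:secret -O sftp://www.mysite.com/files/report.pdf
```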

Curl Features

Using the curl command you can specify multiple URLs on the same command line, and if the URLs are on the same site they will all be downloaded over a single re-used connection, which is good for performance.
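For example, fetching two pages from the same (placeholder) site in one invocation, where each -o names the output file for the URL that follows it:

```shell
# Both requests go over the same re-used connection
curl -o page1.html http://www.mysite.com/page1.html \
     -o page2.html http://www.mysite.com/page2.html
```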

You can specify a range to make it easier to download URLs with similar path names.
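Ranges can be numeric, alphabetic, or a comma-separated set, as in this sketch (again with placeholder URLs, quoted so the shell leaves the brackets and braces alone):

```shell
# Numeric sequence: image1.jpg through image100.jpg
curl -O "http://www.mysite.com/images/image[1-100].jpg"

# Alphabetic sequence: sectiona.txt through sectionz.txt
curl -O "http://www.mysite.com/files/section[a-z].txt"

# A comma-separated set of alternatives
curl -O "http://www.mysite.com/{about,contact}.html"
```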

There is also a library called libcurl, which the curl command itself is built on.

It can be used from many programming and scripting languages to retrieve information from web pages.
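A handy bridge between the two is curl's --libcurl flag, which writes out C source code performing the same transfer through libcurl, a useful starting point for your own program (the file names here are placeholders):

```shell
# Perform the download and also emit equivalent libcurl-based C code
curl --libcurl fetch_page.c -o page.html http://example.com/
```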

While content is downloading, a progress meter shows the download or upload speed, how long the command has been running, and how long there is still to go.

The curl command works on large files over 2 gigabytes for both downloading and uploading.

According to this page which compares curl features with other download tools, the curl command has the following functionality:

  • Multiple URLs
  • Usernames and Passwords support
  • IPv6 support
  • Retry failed download
  • URL globbing/sequences
  • Win32 support
  • Large file support
  • GnuTLS support
  • DarwinSSL support
  • Schannel support
  • Cyassl support
  • PolarSSL support
  • AxTLS support
  • SSL Session ID 
  • SSL Private Certificate
  • netrc support
  • Metalink support
  • IDN support
  • Bandwidth limiting
  • Happy eyeballs
  • TFTP
  • SCP upload/download
  • SFTP upload/download
  • HTTP Proxy
  • HTTP Resume
  • HTTP Ranges
  • Follow HTTP Redirects
  • HTTP Post
  • HTTP Post Chunked
  • HTTP Put
  • Cookie support
  • HTTP 1.1
  • HTTP 2 (plain text upgrade)
  • HTTP 2 (TLS NPN)
  • HTTP persistent connections
  • HTTP Digest Auth
  • HTTP NTLM Auth
  • HTTP Negotiate Auth
  • HTTP Multipart POST
  • HTTP gzip/deflate compression
  • FTP resume
  • FTP ranges
  • FTP active mode
  • FTP upload
  • FTP Kerberos
  • FTP Connection re-use
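A few of the features above in action, sketched with placeholder URLs:

```shell
# Bandwidth limiting: cap the transfer at 1 megabyte per second
curl --limit-rate 1M -O http://www.mysite.com/big.iso

# HTTP Resume: continue a partially downloaded file where it left off
curl -C - -O http://www.mysite.com/big.iso

# Follow HTTP redirects, saving and sending cookies between requests
curl -L -c cookies.txt -b cookies.txt http://www.mysite.com/login
```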
