I am keen to know the difference between curl and wget. Both are used to get files and documents, but what is the key difference between them?
Why are there two different programs?
The main differences are:
- wget's major strong point compared to curl is its ability to download recursively.
- wget is command line only. There's no library or anything behind it, whereas curl's features are powered by libcurl.
- curl supports FTP, FTPS, GOPHER, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, LDAP, LDAPS, FILE, POP3, IMAP, SMTP, RTMP and RTSP. wget supports HTTP, HTTPS and FTP.
- curl builds and runs on more platforms than wget.
- wget is released under a free software copyleft license (the GNU GPL). curl is released under a free software permissive license (an MIT derivative).
- curl offers upload and sending capabilities. wget only offers plain HTTP POST support.

You can see more details at the following link:
One thing wget does that is left out of this answer is HTTP mirroring (or "spidering"). curl is very good at what it does, but on its own it is not intended to be used to mirror a web site.
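As a sketch of that mirroring ability (the URL below is a placeholder):

```shell
# Recursively mirror a site: follow links, keep timestamps,
# rewrite links for local viewing, stay below the start path,
# and pause between requests to be polite to the server.
wget --mirror --convert-links --no-parent --wait=1 https://example.com/docs/
```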
cURL.
They were made for different purposes
- wget is a tool to download files from servers
- curl is a tool that lets you exchange requests/responses with a server

wget
Wget solely lets you download files from an HTTP/HTTPS or FTP server. You give it a link and it automatically downloads the file the link points to, building the request for you.
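For example (the URL is a placeholder):

```shell
# Download a file; wget names the local copy after the last
# path component of the URL (here: archive.tar.gz).
wget https://example.com/downloads/archive.tar.gz
```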
curl
Curl, in contrast to wget, lets you build the request as you wish. Combine that with the plethora of protocols supported - FTP, FTPS, Gopher, HTTP, HTTPS, SCP, SFTP, TFTP, Telnet, DICT, LDAP, LDAPS, IMAP, POP3, SMTP, RTSP and more - and you get an amazing debugging tool (for testing protocols, testing server configurations, etc.).
As many have already mentioned, you can download a file with curl. True, but that is just an "extra". In practice, use curl when you want to download a file via a protocol that wget doesn't support.
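As a sketch of that request-building flexibility (the URL and JSON body are placeholders):

```shell
# Build a custom request: choose the method, add a header,
# send a body, and include the response headers in the output.
curl -X POST \
     -H "Content-Type: application/json" \
     -d '{"name": "test"}' \
     -i \
     https://example.com/api/items
```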
wget also follows redirects and saves the response to a file by default, unlike curl, which writes the response to standard output and only follows redirects when told to (with -L). Both can achieve the opposite of their default behaviour: wget -qO - http://google.co.uk/ or curl http://google.co.uk/ > index.html
curl http://google.co.uk/ > index.html is not using built-in functionality, though. Anyway, the main distinction is the purpose each tool was made for. There is no denying that tools evolve and often deviate from their initial trajectory.
curl http://google.co.uk -o index.html would use curl's internals instead of shell output redirection with >.
Actually, the major difference is that curl includes a library (libcurl), and that library is widely used by other applications. wget is standalone.
I did some performance tests with wget and curl; the results are:
100 times tested average run time while downloading 1MB file:
wget: 0.844s
cURL: 0.680s
100 times tested average run time while downloading 5MB file:
wget: 1.075s
cURL: 0.863s
100 times tested average run time while downloading 10MB file:
wget: 1.182s
cURL: 1.074s
10 times tested average run while downloading 67 different images (27MB in total):
wget: 22.410s
cURL: 17.425s
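A sketch of how such a benchmark might be scripted (the URL is a placeholder; the numbers above will vary by system and network):

```shell
# Time 100 quiet downloads of the same file with each tool;
# divide the reported total by 100 for the average per run.
URL=https://example.com/1MB.bin

time (for i in $(seq 1 100); do
  curl -s -o /dev/null "$URL"
done)

time (for i in $(seq 1 100); do
  wget -q -O /dev/null "$URL"
done)
```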
Command size on the system:
wget: 371K
cURL: 182K
curl links to libcurl.so. On my system: wget is 516K, curl is 255K, and libcurl.so.4 is 658K. So curl is a total of 913K.
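You can check the shared-library dependency yourself on Linux (paths and sizes will differ per system):

```shell
# Show that the curl binary is dynamically linked against libcurl;
# running the same ldd command on wget typically shows no libcurl line.
ldd "$(command -v curl)" | grep libcurl
```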
The main differences (1. curl is mainly about communicating over various protocols, while wget is mainly about downloading; 2. curl provides - and is built on - the libcurl library, which other software can use as well, while wget is standalone) have been mentioned in other answers, but here is another difference worth emphasizing, explained with an example.
Another interesting feature of curl not possible with wget is communicating with UNIX sockets (i.e., communication even without a network).
For instance, we can use curl to talk to the Docker Engine through its socket at /var/run/docker.sock to get a list of all pulled Docker images in JSON format (useful for "programming", in contrast to the docker images CLI command, which is good for "readability"):
curl --unix-socket /var/run/docker.sock http://localhost/images/json | jq
See also the curl author's own comparison: daniel.haxx.se/docs/curl-vs-wget.html

The httpie package is available on Debian, and is a neat alternative to curl that does a lot of helpful things when working with REST APIs.