curl & wget — Trivia & Interesting Facts
Surprising, historical, and little-known facts about curl and wget.
curl was first released in 1998 and has had only one lead developer
Daniel Stenberg released curl 4.0 on March 20, 1998 (earlier versions were called httpget and urlget). He has been the lead developer and maintainer for over 26 years, one of the longest continuous maintainer tenures in open source. Stenberg estimates curl is installed on over 20 billion devices worldwide.
wget predates curl by two years
GNU Wget was originally released by Hrvoje Nikšić in 1996 as a successor to an earlier tool called Geturl. While curl focuses on being a data transfer library and command-line tool, wget was designed specifically for recursive downloading and mirroring — it can follow links and download entire websites.
curl supports over 25 network protocols
Beyond HTTP and HTTPS, curl supports FTP, FTPS, SCP, SFTP, TFTP, DICT, LDAP, LDAPS, MQTT, POP3, IMAP, SMB, SMTP, Telnet, Gopher, and more. This protocol breadth is unusual for a single tool and comes from curl's underlying library, libcurl, which is designed to be protocol-agnostic.
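The exact set of protocols depends on how your copy of curl was compiled; the Protocols: line in its version output lists what your build supports:

```shell
# Print version, TLS backend, features, and the "Protocols:" line
# listing every protocol this particular curl build was compiled with.
curl --version
```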
libcurl is used by more software than almost any other C library
libcurl is linked into applications ranging from cars (virtually every connected vehicle uses it) to game consoles, medical devices, and space probes. Apple ships it in macOS and iOS, Microsoft includes it in Windows 10+, and it is a dependency of PHP, Python's PycURL, and countless other language bindings.
curl has a built-in write-out format for timing every phase of a request
The --write-out flag with format variables like %{time_namelookup}, %{time_connect}, %{time_appconnect}, %{time_starttransfer}, and %{time_total} provides microsecond-precision timing for DNS, TCP, TLS, and first-byte latency. This makes curl an essential HTTP debugging tool that replaces the need for separate timing utilities.
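A typical invocation looks like the following sketch, with https://example.com/ standing in for whatever endpoint you are debugging; each variable prints elapsed seconds with microsecond precision:

```shell
# Break down request latency by phase; curl itself expands the
# %{...} variables and the \n escapes in the --write-out format.
curl --silent --output /dev/null \
  --write-out 'dns:         %{time_namelookup}\nconnect:     %{time_connect}\ntls:         %{time_appconnect}\nfirst byte:  %{time_starttransfer}\ntotal:       %{time_total}\n' \
  https://example.com/
```

Each value is cumulative from the start of the request, so the gap between consecutive lines is the cost of that phase alone.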
wget can resume interrupted downloads automatically
wget's -c (continue) flag resumes partially downloaded files by sending an HTTP Range header. Combined with --retry-connrefused and --tries=0 (infinite retries), wget can reliably download large files over unreliable connections. curl can also resume downloads with -C -, but wget's retry behavior is more sophisticated out of the box.
The curl project has a formal security vulnerability process
As of 2024, the curl project has disclosed over 150 CVEs across its history. Daniel Stenberg runs a bug bounty through HackerOne and publishes a detailed write-up for each vulnerability. This level of security discipline is rare for a project of curl's size and is considered a model for open-source security practices.
curl added HTTP/3 and QUIC support before most web servers
curl gained experimental HTTP/3 support in 2019, making it one of the first widely-available tools to support the then-draft protocol. HTTP/3 uses QUIC (UDP-based transport) instead of TCP, reducing connection setup latency. You can test HTTP/3 with curl --http3 https://example.com, provided your curl build was compiled with HTTP/3 support.
wget2 is a complete rewrite with parallel downloads
GNU Wget2, a ground-up rewrite started around 2012, adds features like multi-threaded parallel downloads, HTTP/2 support, and improved memory efficiency. It is designed to be a drop-in replacement for wget but with modern protocol support and significantly better performance for mirroring tasks.
curl can send email, upload to FTP, and query LDAP directories
Because curl speaks so many protocols, you can use it to send SMTP email (curl --mail-from sender@x.com --mail-rcpt recipient@x.com smtp://server), upload files to FTP servers (curl -T file.txt ftp://server/), and query LDAP directories. Most people only use it for HTTP, but it is genuinely a universal network transfer tool.
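Spelled out, the SMTP and FTP invocations look like the sketch below; the server names, addresses, and file names are placeholders, and message.txt is assumed to contain a complete email (headers plus body):

```shell
# Send an email over SMTP; message.txt must hold the full
# RFC 5322 message, including From:/To:/Subject: headers.
curl --url smtp://mail.example.com \
     --mail-from sender@example.com \
     --mail-rcpt recipient@example.com \
     --upload-file message.txt

# Upload a local file to an FTP server with -T (--upload-file).
curl -T file.txt ftp://ftp.example.com/uploads/
```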