wget is a command-line tool for non-interactive downloading of files from the web. Where curl is a generalist protocol client, wget specialises in retrieving content—it excels at recursive mirrors, resuming interrupted downloads, and behaving well over flaky connections. It has been a GNU project since 1996.
wget https://example.com/file.zip # simple download
wget -c https://example.com/big.iso # continue partial
wget -r -np -k https://example.com/docs/ # recursive, no parent dirs, convert links for local viewing
wget -i urls.txt # download list of URLs
wget --mirror --convert-links --adjust-extension site.com # mirror
wget -r (recursive) is the basic way to copy a website for offline browsing; --mirror is the classic shorthand, equivalent to -r -N -l inf --no-remove-listing, and is occasionally useful for archiving documentation or other content. --limit-rate throttles bandwidth, --wait pauses between requests, and --random-wait varies that pause to help avoid tripping rate limits.
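Combining those throttling flags gives a polite mirror invocation. A sketch, printed rather than executed so it can be inspected without network access; the URL, rate, and wait values are placeholder assumptions:

```shell
# Build a polite mirror command; the URL and throttle values are placeholders.
CMD="wget --mirror --convert-links --adjust-extension"
CMD="$CMD --limit-rate=200k --wait=2 --random-wait --no-parent"
CMD="$CMD https://example.com/docs/"
# Print instead of run, so this sketch touches no network.
echo "$CMD"
```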
For single downloads, wget and curl are largely interchangeable (curl needs -O to save under the remote filename, which wget does by default). For scripted mirrors, wget wins; for interacting with APIs or protocols beyond HTTP, curl wins. Most Linux users install both.
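That single-download equivalence can be shown side by side; a sketch with a placeholder URL, printing the commands rather than fetching anything:

```shell
URL=https://example.com/file.zip   # placeholder URL, nothing is downloaded
echo "wget $URL"       # wget saves under the remote filename by default
echo "curl -O $URL"    # curl writes to stdout unless -O (remote name) is given
```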
Related terms: curl
Discussed in:
- Chapter 12: Networking — File Transfer: scp, rsync, curl, wget
Also defined in: Textbook of Linux