curl & wget — Footguns
Mistakes that cause silent failures, security holes, or hours of debugging in production scripts.
1. Using -k/--insecure in production
-k tells curl to skip TLS certificate verification. It exists for debugging self-signed certs in dev. When it leaks into production scripts, health checks, or CI pipelines, you have disabled the entire point of TLS — anyone can MITM the connection.
# This is in way too many production health check scripts
curl -sk https://api.example.com/health
# "We had cert issues during the outage so we added -k and forgot to remove it"
# Now every request from this script trusts ANY certificate
# An attacker on the network can intercept credentials, tokens, API keys
# Fix: install the proper CA bundle
curl --cacert /etc/ssl/certs/internal-ca.pem https://internal.example.com/health
# Or update the system CA trust store
sudo cp internal-ca.pem /usr/local/share/ca-certificates/
sudo update-ca-certificates
Rule: Grep your codebase for curl.*-k and curl.*--insecure periodically. If you find them in anything that touches production, fix them.
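That rule can be enforced mechanically. A minimal CI sketch, assuming a scripts/ directory and a made-up check_insecure_curl helper name; note that -k often hides inside a flag cluster like -sk, so the pattern matches clusters too:

```shell
#!/usr/bin/env bash
# Fail CI if any script disables TLS verification.
# check_insecure_curl and the scripts/ layout are assumptions for this sketch.
check_insecure_curl() {
    # Match --insecure, a bare -k, or -k inside a cluster like -sk or -skL
    grep -rnE 'curl[^|;&]*(--insecure|[[:space:]]-[A-Za-z]*k)' "$1"
}

mkdir -p scripts                                        # demo fixture
printf 'curl -sk https://api.example.com/health\n' > scripts/health.sh

if check_insecure_curl scripts/; then
    echo "ERROR: TLS verification disabled in the lines above" >&2
fi
```

In a real pipeline you would `exit 1` inside the if; the demo only prints so the sketch runs clean.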
2. curl silently succeeds on HTTP errors
By default, curl returns exit code 0 even when the server returns 404, 500, or any other HTTP error. Your shell script thinks the request succeeded.
# This "works" even when the server returns 500
curl -s https://api.example.com/deploy -X POST
echo $? # 0 — "success"
# Your deploy script continues, thinking it triggered a deploy
# Nothing actually happened
# Fix: use -f/--fail to get a non-zero exit code on HTTP errors
curl -sf https://api.example.com/deploy -X POST
echo $? # 22 — curl error for HTTP error response
# Better: capture the status code explicitly
STATUS=$(curl -s -o /dev/null -w "%{http_code}" -X POST https://api.example.com/deploy)
if [ "$STATUS" -ne 200 ]; then
echo "Deploy failed with status $STATUS" >&2
exit 1
fi
Rule: Always use -f in scripts, or check %{http_code} explicitly. Never trust exit code 0 to mean "the API call worked."
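In a script that makes several calls, the explicit status check is worth wrapping once. A minimal bash sketch; api_post is a made-up name and the flags are the ones from the examples above:

```shell
#!/usr/bin/env bash
# api_post: POST to a URL, succeed only on HTTP 200.
# Hypothetical helper; not part of curl itself.
api_post() {
    local url=$1 status
    # -s: quiet, -o /dev/null: discard body, -w: print only the status code
    status=$(curl -s -o /dev/null -w '%{http_code}' -X POST "$url") || return 1
    if [ "$status" -ne 200 ]; then
        echo "POST $url failed with status $status" >&2
        return 1
    fi
}
```

Then `api_post https://api.example.com/deploy || exit 1` cannot mistake a 500 for success.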
3. Not URL-encoding query parameters
Spaces, ampersands, plus signs, and other special characters in query parameters cause subtle bugs. The request either fails silently, returns wrong data, or sends a truncated query.
# BUG: unquoted, so the shell splits on the space and backgrounds on the &
curl https://api.example.com/search?q=hello world&limit=10
# curl gets a URL ending at "hello" plus a bogus second argument "world";
# "&" backgrounds the command and "limit=10" runs as a separate command
# Quoting the URL fixes the shell issues but still sends a raw space,
# which is not a valid URL character
# BUG: plus sign means something different in query strings
curl "https://api.example.com/search?q=C++"
# Most servers decode "+" as a space, so they receive "C  " rather than "C++"
# Fix: use --data-urlencode for GET requests
curl -G https://api.example.com/search \
--data-urlencode "q=hello world" \
--data-urlencode "limit=10"
# Fix: manually encode special characters
curl "https://api.example.com/search?q=hello%20world&limit=10"
curl "https://api.example.com/search?q=C%2B%2B"
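If you are assembling URLs for something other than curl, the encoding itself can be done in pure bash. A sketch; urlencode is a made-up helper name, and it assumes ASCII input (multibyte characters would need byte-wise handling):

```shell
#!/usr/bin/env bash
# urlencode: percent-encode a string for use in a query parameter.
# Hypothetical helper; RFC 3986 unreserved characters pass through.
# Assumes ASCII input.
urlencode() {
    local s=$1 out='' c i
    for ((i = 0; i < ${#s}; i++)); do
        c=${s:i:1}
        case $c in
            [A-Za-z0-9.~_-]) out+=$c ;;
            *) printf -v c '%%%02X' "'$c"; out+=$c ;;  # "'X" yields X's char code
        esac
    done
    printf '%s' "$out"
}

urlencode 'hello world'   # → hello%20world
```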
4. wget mirroring more than you intended
wget --mirror recurses with no depth limit. By default it stays on the starting host, but it will crawl that entire host, climbing up into parent directories, and the moment you add -H/--span-hosts (often copied in alongside --page-requisites) it follows links to any domain. You intended to mirror one docs subtree; wget is now downloading far more than you asked for.
# Risky: crawls the whole host, not just the subtree you wanted
wget --mirror https://docs.example.com/guides/
# --mirror implies infinite recursion; one link back to / pulls in everything
# Add -H, and your docs' link to "See also: https://kubernetes.io/docs/..."
# means wget is now mirroring kubernetes.io
# Fix: never ascend above the starting directory
wget --mirror --no-parent https://docs.example.com/guides/
# Fix: if you need -H for assets hosted elsewhere, whitelist the domains
wget --mirror --page-requisites --span-hosts \
--domains=docs.example.com,cdn.example.com \
--no-parent https://docs.example.com/guides/
# Fix: with -H, exclude known-external patterns instead
wget --mirror --span-hosts --no-parent \
--reject-regex='(github|google|youtube)\.com' https://docs.example.com/guides/
5. Tokens and credentials in shell history
Every curl command with a token or password is saved in ~/.bash_history. Anyone with access to the user account can see your secrets.
# This is now in your shell history forever:
curl -H "Authorization: Bearer eyJhbGciOiJSUzI1NiIs..." https://api.example.com/admin
# And in the process table while it's running:
ps aux | grep curl
# deploy 12345 curl -H Authorization: Bearer eyJhbGciOi... https://api.example.com/admin
# Fix: read from an environment variable (out of history, though the
# expanded value is still visible in the process table via ps)
curl -H "Authorization: Bearer $API_TOKEN" https://api.example.com/admin
# Fix: read the header from a file (curl 7.55+ supports -H @file)
curl -H @auth-header.txt https://api.example.com/admin
# where auth-header.txt contains: Authorization: Bearer <token>
# Fix: use .netrc for basic auth
# ~/.netrc (chmod 600):
# machine api.example.com login admin password s3cret
curl --netrc https://api.example.com/admin
# Fix: keep a one-off command out of history entirely by prefixing it
# with a space (requires HISTCONTROL=ignorespace or ignoreboth)
 curl -H "Authorization: Bearer eyJhbGciOi..." https://api.example.com/admin
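The header file used with -H @auth-header.txt has to be written somewhere, and writing it is itself a chance to leak. A sketch using only bash builtins so the token never appears in another process's argv; the file names are made up for the demo:

```shell
#!/usr/bin/env bash
# Build auth-header.txt without the token touching any command line.
set -euo pipefail
umask 077                         # files created from here on are 0600

# Demo fixture: in real use this is a pre-existing secrets file, e.g. one
# mounted by a secrets manager (the name api_token is an assumption)
printf 's3cret-token\n' > api_token

token=$(< api_token)              # $(< file) reads in-shell, no child process
# printf is a bash builtin, so the token is never in any argv for ps to show
printf 'Authorization: Bearer %s\n' "$token" > auth-header.txt
```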
6. Unquoted JSON body in curl
Passing JSON without proper quoting causes shell variable expansion, broken escaping, and silently malformed requests.
# BUG: unquoted -d value, so the shell strips the quotes and splits on spaces
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d {"name": "test", "role": "$ROLE"}
# curl receives -d '{name:' and treats 'test,', 'role:' and the expanded
# $ROLE as extra URLs; the request is mangled or fails outright
# BUG: same thing with no quotes at all
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d {name: test}
# Word splitting again: curl gets -d '{name:' plus a stray 'test}' argument
# Fix: single quotes around the JSON body (no shell expansion)
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d '{"name": "test", "role": "admin"}'
# Fix: when you need variable interpolation, mix quoting carefully
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d '{"name": "test", "role": "'"$ROLE"'"}'
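The three-part splice above is easier to check if you build the body into a variable first: single-quoted JSON, then the double-quoted variable, then more single-quoted JSON, concatenated by the shell:

```shell
#!/usr/bin/env bash
# '...'"$ROLE"'...' step by step: the literal braces and quotes come from
# the single-quoted pieces; only $ROLE is expanded.
ROLE=admin
body='{"name": "test", "role": "'"$ROLE"'"}'
echo "$body"   # → {"name": "test", "role": "admin"}
```

This still breaks if $ROLE itself contains a double quote or backslash; for untrusted values, the jq approach shown below is the safe one.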
# Fix: use a heredoc for complex JSON
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d @- << EOF
{"name": "test", "role": "$ROLE", "team": "$TEAM"}
EOF
# Fix: use jq to build the JSON (safest — handles escaping)
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d "$(jq -n --arg role "$ROLE" '{"name": "test", "role": $role}')"
7. Forgetting -L for redirects
curl does not follow HTTP redirects by default. You get a 301/302 response and an empty or HTML body, not the resource you wanted. Scripts fail silently.
# Downloads an HTML redirect page, not the actual file
curl -O https://github.com/user/repo/releases/download/v1.0/app.tar.gz
# File contains: <html><body>You are being redirected...</body></html>
# Your checksum doesn't match. Your extraction fails. You blame the mirror.
# Fix: always use -L to follow redirects
curl -LO https://github.com/user/repo/releases/download/v1.0/app.tar.gz
# Limit the number of redirects (prevent infinite loops)
curl -L --max-redirs 5 -O https://example.com/download
# wget follows redirects by default — this is one case where wget is safer
wget https://github.com/user/repo/releases/download/v1.0/app.tar.gz
8. wget user-agent blocking
Many CDNs and web servers block wget's default user-agent (Wget/1.21). You get a 403 Forbidden or a CAPTCHA page instead of the content. The download appears to "work" but you saved an HTML error page.
# Blocked by Cloudflare, Akamai, etc.
wget https://example.com/api/data.json
# 403 Forbidden
# Fix: set a normal user agent
wget --user-agent="Mozilla/5.0" https://example.com/api/data.json
# curl is less commonly blocked but can also be
curl -A "Mozilla/5.0 (compatible; monitoring/1.0)" https://example.com/api/data.json
9. No connect timeout (hanging forever)
Without a timeout, curl and wget hang indefinitely when a server is unreachable or a firewall silently drops packets. Your cron job, health check, or CI pipeline hangs.
# This will hang forever if the server is behind a firewall that drops packets
curl https://api.example.com/health
# ... 10 minutes later, you notice the script is still "running"
# Fix: always set connect timeout AND max time
curl --connect-timeout 5 --max-time 30 https://api.example.com/health
# --connect-timeout: give up if TCP connection isn't established in N seconds
# --max-time: give up if the ENTIRE operation takes longer than N seconds
# wget equivalent
wget --connect-timeout=5 --read-timeout=30 --tries=3 https://example.com/file
# In Docker HEALTHCHECK (the most critical place to set timeouts)
HEALTHCHECK --timeout=5s --interval=30s \
CMD curl -sf --connect-timeout 3 --max-time 5 http://localhost:8000/health || exit 1
10. curl | bash as a security risk
Piping curl into bash is common in install scripts. It downloads and executes arbitrary code. If the connection is intercepted, if the server is compromised, or if the URL is wrong, you just ran someone else's code, as root if you used sudo.
# Every tool's "quick install" page:
curl -sSL https://get.example.com | bash
# or worse:
curl -sSL https://get.example.com | sudo bash
# Risks:
# 1. MITM attack replaces the script (especially without TLS verification)
# 2. Server detects curl user-agent and serves different content than browser
# 3. Partial download: if the connection drops mid-stream, bash executes
# a truncated script
# 4. No audit trail — you have no idea what you just ran
# Safer: download first, inspect, then run
curl -sSL -o install.sh https://get.example.com
less install.sh # read it
sha256sum install.sh # verify checksum if published
bash install.sh
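The sha256sum step is only useful if a mismatch actually stops you. A sketch of that gate; the file content and hash here are a self-contained stand-in (the SHA-256 of the string "hello" plus newline), not a real published checksum:

```shell
#!/usr/bin/env bash
# Refuse to run an installer whose SHA-256 doesn't match the published value.
set -euo pipefail

# Demo stand-in for the downloaded script; in real use install.sh comes from
# curl -o and "expected" comes from the project's release page
printf 'hello\n' > install.sh
expected='5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03'

actual=$(sha256sum install.sh | awk '{print $1}')
if [ "$actual" != "$expected" ]; then
    echo "Checksum mismatch: refusing to run install.sh" >&2
    exit 1
fi
echo "Checksum OK"   # only now is running: bash install.sh  reasonable
```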
# Safest: use your package manager
apt-get install tool-name
# or
brew install tool-name
11. -o /dev/null hiding errors
Redirecting output to /dev/null to "clean up" output also hides error messages in the response body. The API returned a 200 with an error in the JSON body, and you threw it away.
# You think this is a clean health check
STATUS=$(curl -s -o /dev/null -w "%{http_code}" https://api.example.com/health)
# STATUS is 200, but the body was: {"status": "degraded", "db": "unreachable"}
# Fix: capture both status code and body
RESPONSE=$(curl -s -w "\n%{http_code}" https://api.example.com/health)
BODY=$(echo "$RESPONSE" | head -n -1)
STATUS=$(echo "$RESPONSE" | tail -n 1)
if [ "$STATUS" -ne 200 ] || echo "$BODY" | grep -q '"status": *"degraded"'; then
echo "UNHEALTHY: status=$STATUS body=$BODY" >&2
exit 1
fi
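head -n -1 is GNU-specific (it fails on macOS/BSD head). The same last-line split can be done with bash parameter expansion and no external processes at all; split_response is a made-up helper name:

```shell
#!/usr/bin/env bash
# split_response: separate the body from the status code that
# curl -s -w $'\n%{http_code}' appends as a final line.
split_response() {
    local response=$1
    STATUS=${response##*$'\n'}   # text after the last newline (the code)
    BODY=${response%$'\n'*}      # text before the last newline (the body)
}
```

Call it as `split_response "$(curl -s -w $'\n%{http_code}' https://api.example.com/health)"` and then test $STATUS and $BODY as before.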
# Or capture body to a temp file while getting status code
STATUS=$(curl -s -w "%{http_code}" -o /tmp/response.json https://api.example.com/health)
if [ "$STATUS" -ne 200 ]; then
echo "Failed ($STATUS):" >&2
cat /tmp/response.json >&2
exit 1
fi