Bash: Capturing HTTP status code using curl write-out

curl has a “write-out” option that displays information about a completed transfer, which is useful for Bash scripting. In its simplest form, a curl request looks like this:

page="https://www.google.com"
curl $page

But this leaves a lot to be desired in a scripting scenario: curl does not return a non-zero exit code on HTTP errors, its default connection timeout is long, and you would need to parse the HTTP status code out of the response manually.  With a few tweaks to the options, you can get all of these capabilities.

# flags used to make curl more scriptable
options='--fail --connect-timeout 3 --retry 0 -s -o /dev/null -w %{http_code}'

# make curl call, capture return exit code and stdout
outstr=$(curl $options $page)
retVal=$?
echo "curl to $page returned retVal=$retVal, http_code=$outstr"

With these flags, we now have scriptable output from curl.  The “--fail” option tells curl to return a non-zero exit code when the server replies with an HTTP error (4xx/5xx), and “-w” writes a format string to stdout after the transfer completes, giving access to a set of special variables such as http_code.
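Because “--fail” maps HTTP errors to a non-zero exit code, a script can branch on it directly.  A minimal sketch, reusing the $retVal and $outstr variables captured above:

# report success or failure based on the curl exit code
if [ $retVal -ne 0 ]; then
  echo "ERROR pulling from $page, retVal=$retVal, http_code=$outstr"
else
  echo "OK pulling from $page, http_code=$outstr"
fi

The write-out format can also combine several of the man page variables at once, for example -w 'code=%{http_code} time=%{time_total}' to report the total transfer time alongside the status code.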

See my curl_writeout.sh on GitHub for a full example.

 

NOTES

“--head” is another flag that can be sent to curl: it makes a HEAD request so only the response headers are retrieved, not the full content.
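As a quick sketch, “--head” combines naturally with the write-out flag to check reachability without downloading the body:

# HEAD request: headers only, no body is downloaded
http_code=$(curl --head --fail -s -o /dev/null -w '%{http_code}' https://www.google.com)
echo "HEAD request returned http_code=$http_code"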

Pulling from a specific IP address while still using https and a virtual host name, and being explicit about not using a proxy:

# using curl to pull from an explicit IP while keeping the https hostname
# (specifying only a Host header is not sufficient for https)
# add "--insecure" if the cert is self-signed
curl --fail --resolve foo.com:443:1.2.3.4 -x '' https://foo.com

# wget cannot pin an IP for a virtual host the way curl --resolve does,
# so the name needs an entry added to /etc/hosts
# add --no-check-certificate if the cert is self-signed
wget -e use_proxy=no -O - --header="Host: foo.com" https://foo.com
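
As a sketch of that /etc/hosts workaround (foo.com and 1.2.3.4 are the placeholder host and IP from the example above):

# map the virtual host name to its IP so wget resolves it locally
echo "1.2.3.4 foo.com" | sudo tee -a /etc/hosts

# now wget can fetch over https using the real host name
wget -e use_proxy=no -O - https://foo.com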