As of 2014, there are plenty of tools for observing HTTP traffic to and from your browser. By default, modern browsers now ship with integrated developer tools offering, among many other things, a visual interface for this kind of inspection.
Why is it valuable to investigate HTTP connections?
From an SEO standpoint, this kind of investigation is helpful when debugging Ajax problems and Flash app connectivity issues. It is also well worth carrying out to find the cause of a redirection chain or of malformed HTTP responses.
When I audit large corporate websites, it no longer comes as a surprise that many of them contain technical issues simply because their developers have little grasp of the foundations of the web and the Internet.
For instance, one of the most common mistakes is non-existent URLs returning an HTTP status code 200 instead of 404 (the so-called soft 404).
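A quick way to spot this is to request a URL that clearly should not exist and print only the status code with curl; a minimal sketch (the domain and path below are made-up placeholders):

```
# Print only the HTTP status code for a URL that should not exist.
curl -s -o /dev/null -w "%{http_code}\n" http://www.example.com/this-page-should-not-exist
```

A healthy server answers 404 (or 410); a soft 404 answers 200.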
Another error is the Expires field in the HTTP header being set too far into the future, a problem that, combined with a caching mechanism, may deliver a stale version of the page regardless of the changes made on the server.
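To review the caching headers of a page, a HEAD request is usually enough. A sketch, again with a placeholder URL:

```
# Fetch only the response headers and show the caching directives.
curl -sI http://www.example.com/ | grep -iE "expires|cache-control|last-modified"
```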
That is why it is worth doing a preliminary site audit and randomly checking the HTTP responses of a few pages and images.
What should you use?
I do not think my favourite HTTP response checkers have been mentioned before on this blog.
Most modern browsers, such as Chrome and Mozilla Firefox, provide a built-in platform for HTTP debugging. If the project's requirements are limited and you are not too fussy, my recommendation is to go with the built-in tools.
However, when a proper audit is required, other options are available. Aside from the command-line utility curl, which can print the HTTP response on screen, below is a list of the most recent tools I have tested on my Mac.
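curl alone already covers a lot of ground; for example, it can follow a redirection chain and print the headers of every hop (the URL is just an example):

```
# Send HEAD requests (-I), follow redirects (-L) and show each hop's headers.
curl -sIL http://www.example.com/old-page
```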
CharlesProxy: I do not like this tool, for two reasons: the cost, which I found expensive for the output produced, and the app layout, with result panels stretching and contracting in a way that made my eyes cross.
So I tested something more in line with my “server administration skills” and downloaded a command-line HTTP proxy named mitmproxy. This is an interactive console program that lets traffic flow through it while being inspected and edited on the fly.
Based on Python, it runs on OS X Mavericks under Python 2.7 once installed via pip, which in turn requires the Xcode developer tools to be installed.
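The installation itself boils down to a couple of commands, assuming pip is already available on your PATH:

```
# Install the Xcode command-line tools, then mitmproxy via pip.
xcode-select --install
pip install mitmproxy
```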
To get this working, you need to change the proxy settings of your network interface, pointing both the HTTP and HTTPS proxies at the loopback address (127.0.0.1). Once the change is applied, all the HTTP requests suddenly appear on screen, and you will be flabbergasted by how many services transmit data without your knowledge.
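On OS X this can be done in System Preferences or from the terminal with networksetup; a sketch, assuming your network service is called "Wi-Fi" and mitmproxy is listening on its default port 8080:

```
# Point the HTTP and HTTPS proxies of the Wi-Fi service at mitmproxy.
networksetup -setwebproxy "Wi-Fi" 127.0.0.1 8080
networksetup -setsecurewebproxy "Wi-Fi" 127.0.0.1 8080

# Remember to switch the proxies off again once you are done.
networksetup -setwebproxystate "Wi-Fi" off
networksetup -setsecurewebproxystate "Wi-Fi" off
```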
mitmproxy's best companion is mitmdump, a companion tool that lets you record all the HTTP traffic captured over time, programmatically.
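A minimal recording session might look like this (the file name is just an example):

```
# Write every intercepted flow to a file for later inspection.
mitmdump -w captured_traffic.flows

# The saved flows can be reviewed later in the interactive console:
mitmproxy -r captured_traffic.flows
```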
However, I appreciate that out there are SEOs who prefer software with a GUI. If that is what you are after, my last recommendation is a shareware tool whose price is very reasonable and which has the advantage of inspecting traffic without any changes to the network configuration (I suspect it hooks into the HTTP connection at a different layer of the OSI stack, but I am not sure).
The shareware, called HTTP Scoop 1.4, can be downloaded from here.
I have intentionally not included any browser-based HTTP header checkers, as the list could become quite long (and very similar to many other articles already available on the web).
I hope you can find these resources useful.