If you've been involved in more than a few website deployments, I'm positive that you've encountered the following scenario: The website was developed, tested, demonstrated to internal audiences, accepted, deployed, …and failed completely due to unacceptable performance.
A number of factors could contribute to this scenario, ranging from under-provisioned servers to bloated HTML pages and suboptimal database queries. We'll skip these possibilities, as they're covered in numerous articles published by InformIT, and instead focus on two hard-to-diagnose problems:
- Even though the HTML code is lean, the response time is unacceptable over slow links. (Keep in mind that more than 40% of U.S. households still don't have broadband Internet access.)
- While the website performs amazingly well for users who are geographically close to the web servers, far-away users (increasingly important in an era of globalization) report dismal results.
Both symptoms are caused by the Hypertext Transfer Protocol (HTTP; RFC 2616), one of the underlying web technologies that is usually well hidden from web designers and developers. This article describes the physical limitations faced by HTTP and their impact on website performance. Follow-up articles will describe various mechanisms you can use to minimize response-time degradation, and the tools that measure and predict website performance.
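To see why both symptoms trace back to HTTP itself, consider a rough back-of-the-envelope model: when each page object is fetched over its own TCP connection (as in HTTP/1.0 without keep-alive), every fetch pays at least one round-trip time for the TCP handshake and another for the HTTP request/response exchange, on top of the raw transfer time. The sketch below is an illustrative model with assumed numbers (object count, RTT, bandwidth), not a measurement of any real site:

```python
# Back-of-the-envelope model of page load time when every object
# is fetched over a fresh TCP connection (HTTP/1.0-style).
# All inputs are illustrative assumptions, not measurements.

def page_load_time(num_objects, rtt, bandwidth_bps, avg_object_bytes):
    """Estimate sequential load time for a page of num_objects objects.

    Each fetch costs one RTT for the TCP handshake, one RTT for the
    HTTP request/response exchange, plus the body's transfer time.
    """
    per_object = 2 * rtt + (avg_object_bytes * 8) / bandwidth_bps
    return num_objects * per_object

# A 20-object page over a dial-up-grade 56 kbit/s link with a
# 200 ms round-trip time and 5 KB average object size:
t = page_load_time(num_objects=20, rtt=0.2,
                   bandwidth_bps=56_000, avg_object_bytes=5_000)
print(f"{t:.1f} seconds")  # roughly 22.3 seconds
```

Note how the round-trip term dominates as RTT grows: doubling the distance (and thus the RTT) to the server adds a fixed penalty per object that no amount of HTML slimming can remove, which is exactly what far-away users experience.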