How HTTP 2.0 affects existing web optimizations
Today's blog post focuses on how HTTP 2.0 affects the web optimization techniques that dominated the HTTP 1.1 era.
HTTP 2.0 - An Insurgence
HTTP 2.0 grew out of SPDY, a protocol proposed by Google to improve the overall performance of web transactions. The need for HTTP 2.0 was triggered by the significant gap between the growth of network bandwidth capacity and the improvement in web page load times. HTTP 1.1 browsers can only download six resources concurrently per domain.
On the graph above, you would notice that clients with network bandwidths of 5 Mbps and above don't get significant latency gains compared to the clients on the left side of the graph. The decline in latency improvement as bandwidth capacity grows is generally blamed on the fact that HTTP 1.1 browser implementations can only download six resources per domain concurrently.
HTTP 1.1 limitation
HTTP 1.1 Workarounds
To deal with the latency issues that were plaguing the HTTP 1.1 implementation, developers came up with different performance enhancement techniques.
- Inlining of Images - A technique that embeds images in CSS and HTML files by converting them to data URI format, thereby reducing the number of HTTP requests required to render a page.
- Image Spriting - A technique that combines many small images into a single file, so one HTTP request fetches them all; CSS background positioning then displays the desired region.
- Domain Sharding - A technique that enables web pages to download more than six resources concurrently by serving the required resources from multiple domains. NOTE: The HTTP 1.1 concurrent resource download limitation only applies per domain.
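To make image inlining concrete, here is a minimal Python sketch of how an image is turned into a data URI. The `to_data_uri` helper is hypothetical, written for this post, not part of any library:

```python
import base64

def to_data_uri(image_bytes: bytes, mime: str = "image/png") -> str:
    """Encode raw image bytes as a data URI that can be embedded
    directly in CSS or HTML, avoiding a separate HTTP request."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{encoded}"

# In place of a real icon file, the 8-byte PNG signature stands in here.
uri = to_data_uri(b"\x89PNG\r\n\x1a\n")

# The URI is pasted where a URL would normally trigger another request:
css_rule = f".icon {{ background-image: url('{uri}'); }}"
```

The trade-off is visible right away: the image now travels inside the stylesheet, so it can no longer be cached or updated independently.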
HTTP 2.0 to the rescue
To address the issues associated with HTTP 1.1, HTTP 2.0 introduced the features below, which improve page load times and reduce network latency.
- Multiplexing and Concurrency - Eliminates the need to open multiple TCP sockets to retrieve multiple resources from a single domain. HTTP 2.0 specifies that resources are retrieved as streams that are split into frames and interleaved over a single connection.
- Server Push - Allows servers to preemptively push relevant responses to a client's cache.
- Stream Dependencies - Allows clients to inform servers which resources have higher priority.
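The multiplexing idea above can be illustrated with a small Python simulation (the frame size, function names, and payloads are all invented for this sketch; real HTTP 2.0 frames carry headers, flags, and much larger payloads):

```python
from collections import defaultdict

FRAME_SIZE = 4  # bytes per DATA frame payload (tiny, for illustration)

def frames(stream_id, payload):
    """Chop one resource (one stream) into (stream_id, chunk) DATA frames."""
    for i in range(0, len(payload), FRAME_SIZE):
        yield (stream_id, payload[i:i + FRAME_SIZE])

def interleave(*streams):
    """Round-robin frames from several streams onto one connection."""
    iters = [frames(sid, data) for sid, data in streams]
    while iters:
        for it in list(iters):
            frame = next(it, None)
            if frame is None:
                iters.remove(it)  # this stream is finished
            else:
                yield frame

def reassemble(wire):
    """The receiver groups frames by stream id to rebuild each resource."""
    out = defaultdict(bytes)
    for sid, chunk in wire:
        out[sid] += chunk
    return dict(out)

# Two resources share one connection instead of two sockets.
wire = list(interleave((1, b"/styles.css body{}"), (3, b"/app.js alert(1)")))
resources = reassemble(wire)
```

Inspecting `wire` shows frames from streams 1 and 3 alternating on the single "connection", yet `resources` contains both payloads fully reassembled.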
The effect of multiplexing on resources.
How does HTTP 2.0 affect existing optimization techniques?
- HTTP request reduction techniques (concatenation, spriting) are now obsolete, since the concurrency bottleneck is solved at the protocol level.
- Storing images individually is now more viable, because image sprites are difficult to maintain and the entire sprite's cache is invalidated whenever the website is redeployed.
- HTTP 2.0 eliminates the need for domain sharding. This will probably drive down the cost associated with maintaining multiple domains.
- HTTP 2.0 eliminates the need to convert images to data URI format. Data URI encoded images are difficult to locate, especially for developers who did not write the code.
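When migrating to HTTP 2.0, inlined images can be extracted back into individual files. A rough sketch of that reversal in Python (the `extract_data_uri` helper is invented for this post):

```python
import base64
import re

def extract_data_uri(uri):
    """Split a base64 data URI back into (mime_type, raw_bytes) so the
    image can be stored as an individual file again."""
    m = re.fullmatch(r"data:([^;,]+);base64,(.*)", uri, re.S)
    if m is None:
        raise ValueError("not a base64 data URI")
    return m.group(1), base64.b64decode(m.group(2))

# Round-trip a small payload that stands in for an inlined icon.
inlined = "data:image/png;base64," + base64.b64encode(b"\x89PNG").decode("ascii")
mime, raw = extract_data_uri(inlined)
# `raw` can now be written to e.g. icon.png and served as its own resource.
```

With multiplexing, each extracted file costs almost nothing extra to fetch, and it becomes individually cacheable again.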
HTTP 1.1 optimizations are still needed...
HTTP 1.1 optimizations are still useful during the transition period. Below is a list of reasons why HTTP 1.1 optimization techniques are still relevant.
- Website is served over plain HTTP (browsers only support HTTP 2.0 over HTTPS)
- Website is hosted by a server that does not support ALPN and HTTP 2.0
- Website is required to support old browsers (Sensitive and Legacy Systems)
- Website is required to support both HTTP 1.1 and HTTP 2.0 (Graceful Degradation)
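The graceful degradation case can be sketched in Python: the client and server agree on a protocol through ALPN, and the site keeps or drops the legacy optimizations accordingly. The `ssl` calls below are the real standard-library API, but `optimizations_for` and its return values are illustrative assumptions, not an established interface:

```python
import ssl

# Advertise HTTP 2.0 first, with HTTP 1.1 as the ALPN fallback.
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(["h2", "http/1.1"])

def optimizations_for(negotiated):
    """Decide which legacy optimizations to keep, given the protocol the
    server picked (ssl.SSLSocket.selected_alpn_protocol() after the
    handshake; None means ALPN was not supported at all)."""
    if negotiated == "h2":
        return []  # multiplexing makes the workarounds unnecessary
    # HTTP 1.1 (or no ALPN): keep serving the classic workarounds.
    return ["domain sharding", "image inlining", "spriting"]
```

In practice this means building two variants of the site's assets and choosing per connection, which is exactly why the HTTP 1.1 techniques stay relevant during the transition.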