Showing posts from June, 2017

How HTTP 2.0 affects existing web optimizations

Today's blog post focuses on how HTTP 2.0 affects web optimization techniques that were dominant in the HTTP 1.1 era.

HTTP 2.0 - An Insurgence

HTTP 2.0 grew out of SPDY, a protocol proposed by Google to improve the overall performance of web transactions. The need for HTTP 2.0 was triggered by the significant gap between the growth of network bandwidth capacity and the improvement in web page load times: browsers using HTTP 1.1 can only support six concurrent resource downloads per domain. On the graph above, you will notice that clients with network bandwidths of 5 Mbps and above do not get significant latency gains compared to the clients on the left side of the graph. This decline in latency improvement as bandwidth capacity grows is generally attributed to the fact that HTTP 1.1 browser implementations can only download six resources per domain concurrently.

HTTP 1.1 limitation
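The six-connections-per-domain bottleneck can be sketched with a simplified model (the function name and the one-round-trip-per-resource assumption are illustrative, not from the original post):

```python
import math

def http1_load_time_ms(num_resources, ms_per_resource, max_connections=6):
    """Rough page load time under HTTP 1.1: the browser opens at most
    `max_connections` parallel connections per domain, so downloads
    proceed in batches and queued resources wait for a free slot."""
    batches = math.ceil(num_resources / max_connections)
    return batches * ms_per_resource

# 30 resources at 100 ms each: 5 batches of 6, so roughly 500 ms,
# no matter how much raw bandwidth is available.
print(http1_load_time_ms(30, 100))  # 500
# HTTP 2.0 multiplexes all streams over one connection, so the same
# page approaches a single round trip plus transfer time.
```

This is why adding bandwidth past a point stops helping under HTTP 1.1: load time is dominated by queued round trips, not throughput.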

Risks associated with plain text passwords

Today's blog post is about why storing passwords in plain text is a bad practice. We will identify the vulnerabilities associated with this type of password management. This is the pilot post in a series of blog postings on ethical and secure authentication systems.

How user authentication travels

To be capable of identifying the risks and threats associated with plain text passwords, we need a little background on how a password travels from a client application (web, desktop, or mobile) to the server. The diagram below describes the typical flow used by a client application to verify a user's authentication data. The diagram above shows how most authentication systems are implemented. Listed below are the steps of this authentication process:

1. The password is provided by the user to the client application
2. The client application sends the authentication data to…
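The safer alternative to storing the plain text password at the end of this flow is to store only a salted hash. A minimal sketch using Python's standard library (the function names and iteration count are illustrative choices, not the post's implementation):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted hash; only (salt, digest) is stored server-side,
    never the plain text password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from the submitted password and compare in
    constant time; a leaked database exposes no reusable passwords."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))      # True
print(verify_password("wrong guess", salt, digest))  # False
```

Note that the server can still verify the user without ever being able to recover the original password, which is exactly what plain text storage fails to guarantee.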