The Many Faces of Latency

Nothing is more damaging to a website than poor response times. Latency is probably the most serious issue that web application developers have to contend with. Whether it is a retail application or an e-ticketing application, poor response times play havoc with the user experience. Latency has many faces, each contributing in its own way to the overall response time of the application. This article looks at some of the key culprits that contribute to a website's latency.

Link Latencies: This is one of the major contributors. The link speed from the user's computer to the website plays a major role. For applications hosted on the public cloud it makes sense to deploy in multiple availability zones dispersed geographically. This ensures that people across the globe reach the website through the cloud deployment closest to them. Besides, after the recent Amazon EC2 outage it definitely makes sense to be able to deploy across availability zones, promoting geographical resiliency in the application. Dispersing the application geographically helps connect the user over the least number of intervening hops, thus reducing response times.
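Below is a minimal sketch of how one might check which geographically dispersed deployment is closest to a given user by timing a TCP connection to each region's endpoint. The region names and hostnames (us-east.example.com etc.) are purely hypothetical placeholders for your own deployments.

```python
import socket
import time

# Hypothetical region endpoints; substitute the hostnames of your own deployments.
REGION_ENDPOINTS = {
    "us-east": "us-east.example.com",
    "eu-west": "eu-west.example.com",
    "ap-south": "ap-south.example.com",
}

def measure_rtt(host, port=443, timeout=2.0):
    """Time how long it takes to open a TCP connection, as a rough proxy for link latency."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return float("inf")  # endpoint unreachable

def closest_region():
    """Return the region with the lowest measured round-trip time."""
    rtts = {region: measure_rtt(host) for region, host in REGION_ENDPOINTS.items()}
    return min(rtts, key=rtts.get), rtts

if __name__ == "__main__":
    region, rtts = closest_region()
    print("Closest region:", region, rtts)
```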

DNS latencies: This is another area that needs attention. A DNS lookup can be fairly expensive, so it makes sense to speed up lookups by using a DNS service that provides name servers across geographical regions and answers queries from the location nearest the user. Some examples are Amazon's Route 53 and UltraDNS.
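As a quick way to see how much a DNS lookup actually costs, the sketch below times repeated resolutions of a hostname; www.example.com is just a placeholder. The first lookup usually pays the full cost, while later ones may be answered from a local resolver cache.

```python
import socket
import time

def time_dns_lookup(hostname, attempts=3):
    """Time repeated DNS resolutions of a hostname, in milliseconds."""
    timings = []
    for _ in range(attempts):
        start = time.monotonic()
        socket.getaddrinfo(hostname, 443)  # performs the DNS lookup
        timings.append((time.monotonic() - start) * 1000)
    return timings

if __name__ == "__main__":
    print(time_dns_lookup("www.example.com"))
```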

Load Balancer Latencies: In a typical cloud deployment, multiple instances sit behind a load balancer. Depending on the algorithm the load balancer adopts for distributing incoming traffic, it will contribute its own share of latency. Amazon's Elastic Load Balancer, for instance, is itself usually a set of participating IPs.
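To illustrate how the balancing algorithm matters, here is a minimal sketch of two common strategies, round robin and least connections. The backend IPs are hypothetical, and a real load balancer does considerably more work than this.

```python
import itertools

class RoundRobinBalancer:
    """Cycle through the backends in order; cheap per request, but ignores backend load."""
    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        return next(self._cycle)

class LeastConnectionsBalancer:
    """Pick the backend with the fewest active connections; a little more bookkeeping,
    but it spreads uneven request costs more smoothly."""
    def __init__(self, backends):
        self._active = {backend: 0 for backend in backends}

    def pick(self):
        backend = min(self._active, key=self._active.get)
        self._active[backend] += 1
        return backend

    def release(self, backend):
        self._active[backend] -= 1

if __name__ == "__main__":
    backends = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]  # hypothetical instance IPs
    rr = RoundRobinBalancer(backends)
    print([rr.pick() for _ in range(5)])
```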

Application Latencies: When the load balancer forwards the request to the web application, the logic that processes the request is a key contributor. This latency is within the developer's control, so it makes sense to bring it down to the absolute minimum.
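One simple way to keep an eye on this is to time each request handler and log the result, so slow code paths can be found and optimized. The sketch below uses a timing decorator; get_order is just a stand-in for a real handler.

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)

def timed(handler):
    """Decorator that logs how long a request handler takes."""
    @functools.wraps(handler)
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        try:
            return handler(*args, **kwargs)
        finally:
            elapsed_ms = (time.monotonic() - start) * 1000
            logging.info("%s took %.1f ms", handler.__name__, elapsed_ms)
    return wrapper

@timed
def get_order(order_id):
    # Stand-in for real request-processing logic (database calls, templating, etc.).
    time.sleep(0.05)
    return {"order": order_id}

if __name__ == "__main__":
    get_order(42)
```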

Web page Rendering Latencies: A poorly designed web page can also result in large latencies. A web page that needs to download a lot of items before it can render will definitely affect the user's experience, so it is necessary to design an efficient page that renders quickly. A standard technique is to use a Content Delivery Network (CDN) to deliver content. CDNs typically distribute content across multiple servers dispersed geographically, and the server chosen to serve a given user is based on proximity, i.e. the fewest number of intervening hops. Major players in CDNs are Akamai, Edgecast and Amazon's CloudFront.
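A common way to apply this is to reference heavy static assets (images, scripts, stylesheets) from the CDN domain rather than the origin server, so the browser fetches them from the nearest edge. The sketch below assumes a hypothetical CDN hostname, cdn.example.com, mapped to your origin.

```python
# Hypothetical CDN hostname (e.g. a CloudFront or Akamai edge domain mapped to the origin).
CDN_HOST = "https://cdn.example.com"

def cdn_url(asset_path):
    """Build an absolute URL for a static asset served from the CDN edge."""
    return f"{CDN_HOST}/{asset_path.lstrip('/')}"

def render_page():
    # Heavy assets are referenced from the CDN so the browser pulls them
    # from the nearest edge server instead of the origin.
    return f"""<html>
  <head>
    <link rel="stylesheet" href="{cdn_url('/css/site.css')}">
    <script src="{cdn_url('/js/app.js')}" defer></script>
  </head>
  <body><img src="{cdn_url('/img/banner.png')}" alt="banner"></body>
</html>"""

if __name__ == "__main__":
    print(render_page())
```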

These are the many aspects that contribute to overall latency. The focus should be on optimizing all of these areas when deploying a web application, whether in a hosted network or on the public cloud.
