The Trinity of Technical SEO: Latency, Indexation and Bandwidth

As most of us know, SEO goes way beyond Titles, Tags and keyword selection. Gone are the days when heavily optimizing a page for “crazy purple widgets” would get you where you want to be. As the Search Engines focus on their core goal of providing the most authoritative and relevant results for each query, we as Internet Marketers are faced with a new challenge. As web technology progresses, we must look beyond what a site “says” and look at what a site “is”. Structure and construction become even more important.

Site Latency: How fast do your site pages load?
Let’s start with Latency. The Latency of a site is the speed at which its pages load for a visitor. Many factors contribute to the Latency of a web page, including, but not limited to, site construction, reliance on external resources, server capacity, and page size. We do our best to reach a happy medium with the factors we can control. Not every site can afford Google-scale processing power and server capacity; most share their server with hundreds or thousands of other sites. When that is the case, we have to focus on the factors we can influence more directly, like how our site is constructed and how bulky it is. In the end, there is only so much speed you can get out of your site for a given budget.

How does one assess site Latency?
There are many tools out there for testing the latency of a site, some paid and some free, and we use both kinds extensively. One of the most useful resources is Google Webmaster Tools. In April 2010, Google officially announced that it would incorporate site speed as one of the 200+ signals used in determining search rankings, and it has since become something of a de facto authority on the topic. Let’s look at an actual site example using Google Webmaster Tools.
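Before reaching for dedicated tools, you can take a rough first reading yourself. Here is a minimal sketch in Python (the URL is a placeholder) that times a full page download, averaged over a few requests. Note that it captures server response plus transfer time only, not browser rendering:

```python
import time
from urllib.request import urlopen

def measure_latency(url, samples=5):
    """Fetch a URL several times and return the average load time in seconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urlopen(url) as response:
            response.read()  # download the full body, as a browser would
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

# Hypothetical usage:
# avg = measure_latency("https://www.example.com/")
# print(f"Average load time: {avg:.2f}s")
```

Run it at a few different times of day; a single sample can be badly skewed by network conditions or server load.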

This is a site that relies very heavily on external resources, with a lot of multimedia. There is no way around this, so the speed of the site is limited by the speed of the 6 to 12 other servers that provide the data it needs to load a page. Here is a screenshot from Google Webmaster Tools showing the time it takes to load a page:

As you can see, over the last 90 days the site’s pages have loaded in just over 4 seconds on average. This is a site that has been receiving exponentially more traffic over this time frame. I also want to point out that in the third week of June the site was redesigned and recoded for a better user experience and greater efficiency. As such, the line is erratic until July, when the spikes smooth out a bit and the average time decreases. Remember, we are only talking about a spread of 1.3 seconds between the high and the low. Not a huge difference, is it? Most visitors wouldn’t even notice. But Search Engines do.

This brings us to Bandwidth.
Bandwidth is defined as “the maximum amount of information (bits/second) that can be transmitted along a channel”. Why does this matter? Search Engines have a limited amount of capacity available. Granted, their “limited capacity” far exceeds anything most of us can dream of, but it is also a really big Internet to crawl (over a trillion pages). The Search Engines therefore allocate a certain amount of crawl resources to a particular website based on its perceived value (whether they state this or not). CNN.com is going to get a substantially larger portion of Google’s resources than my Dog in Funny Hats blog.
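The arithmetic behind a crawl allowance is simple: a fixed daily bandwidth budget divided by average page weight gives the number of pages a crawler can fetch. A quick sketch, using hypothetical figures:

```python
def pages_per_day(daily_kb_budget, avg_page_kb):
    """How many pages a crawler can fetch from a fixed daily
    bandwidth allowance, assuming a constant average page size."""
    return daily_kb_budget // avg_page_kb

# Hypothetical numbers: a 39,000 KB daily allowance.
print(pages_per_day(39000, 150))  # 260 pages at 150 KB per page
print(pages_per_day(39000, 75))   # 520 pages if page weight is halved
```

The budget is largely out of your hands; the page weight is not, which is why trimming it pays off directly in crawl coverage.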

So what does this mean for the rest of us? It means we have to make the best use of the resources we are given. Basically, when Google comes a-knockin’, it is in your best interest to make sure it has the clearest possible path through your site and can gather as much information as possible before it leaves for its next appointment (probably cnn.com). This is why building the most search-friendly, efficient site is critical.

Below is a screenshot of the same site, same time period as before, this time showing how much bandwidth is afforded this site by Google in a given day:

Notice the trend of increasing bandwidth up until the third week of June, when the new site design was launched. Immediately before the 2.0 version launched, the site was receiving a peak of almost 89,000KB of attention from Google in a day. That figure then dropped immediately and substantially to 12,000KB and has since settled in around 39,000KB. The initial impulse is to look at this and conclude that a mistake was made and Google isn’t as interested in crawling this site anymore. This next screenshot shows the truth: how many pages Google is crawling in a given day:

This chart pretty much speaks for itself. While the Latency of the site has moved only within a narrow band, bandwidth usage has dropped considerably, and Google is now crawling more pages per visit. In other words, each page now costs Google far less bandwidth, so the same crawl allowance covers more of the site. This is a strong case for optimizing your entire site presence, not just your Titles and Tags: during the time period of this example, no on-site SEO elements were changed.
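You can run the same reasoning in reverse: dividing daily bandwidth by pages crawled gives the average weight Googlebot sees per page. The page counts below are hypothetical (the charts above show only the bandwidth figures), but they illustrate why flat latency plus lower bandwidth plus more pages crawled means lighter pages:

```python
def avg_page_weight_kb(daily_kb, pages_crawled):
    """Average KB downloaded per page: daily bandwidth / pages crawled."""
    return daily_kb / pages_crawled

# Bandwidth figures from the charts; page counts are hypothetical.
before = avg_page_weight_kb(89000, 400)  # 222.5 KB per page before the redesign
after = avg_page_weight_kb(39000, 600)   # 65.0 KB per page after the redesign
print(before, after)
```

If your own crawl-stats charts show bandwidth falling while pages crawled rises, that is a sign of a leaner site, not reduced interest from Google.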

By the way, this third chart mirrors the organic site traffic trend as well. How well is your site performing? Contact us today to conduct a Technical Site Assessment and start improving or rebuilding your online presence.
