Posts Tagged ‘search engine optimization’

Oh No Daddy! GoDaddy Redirects Cause Major SEO Issue, now what?

Thursday, July 14th, 2011

If you are a GoDaddy customer you will want to read this.

Part of our weekly SEO routine for clients is to check search engine Webmaster tool accounts, and in doing so we usually find something interesting, disturbing or exciting. This week, we noticed that a domain we had permanently redirected to a new one back in December was showing as a top inbound linking site to the domain to which it was supposed to be redirected.

We asked ourselves, “How could this be?” Was the site moved to a new host without our knowledge? Was the domain being redirected properly?

And so our investigation began.

Our Findings:
This client is a GoDaddy customer, and back in December we had permanently forwarded the domains in GoDaddy’s Domain Control panel, which at that time issued a proper permanent redirect telling servers and search engines that this was indeed a permanent redirect (otherwise known as a 301 redirect).

Today, when we checked the server headers, they said the redirect was a temporary redirect (otherwise known as a 302 redirect). We went a step further and investigated some of the other domains that had also been set up as permanent redirects in the GoDaddy system. To our dismay, they are all showing as temporary redirects.

Arizona SEO: Checking redirects
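If you want to confirm for yourself what a forwarding service is actually sending, request the URL without following the redirect and look at the raw status code. Here is a minimal Python sketch of that check (the URL you pass would be your own forwarded domain; this is an illustration, not the tool we used):

```python
import http.client
from urllib.parse import urlparse

PERMANENT = {301, 308}
TEMPORARY = {302, 303, 307}

def classify(status):
    """Label an HTTP status code as a permanent or temporary redirect."""
    if status in PERMANENT:
        return "permanent"
    if status in TEMPORARY:
        return "temporary"
    return "not a redirect"

def check_redirect(url):
    """Request the URL without following redirects, so the raw status is visible."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https" else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    conn.close()
    # Location tells you where the redirect points; status tells you its type.
    return resp.status, classify(resp.status), resp.getheader("Location")
```

A 301 here means the redirect passes its permanence signal to search engines; a 302, as we found, does not.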

We logged into GoDaddy to make sure the permanent redirect settings were correct for our domains AND…they were.

Phoenix SEO: How to set a redirect

We then called GoDaddy to understand why domains set to permanent redirect (301) were showing as temporary redirects (302) in our third-party checking tool.

The GoDaddy tech support person told us that “all of their forwarding now only issues a temporary 302 redirect” and that if we wanted “to make it a permanent 301 we would have to do that at the hosting level.” They also could not tell us what type of redirect would be issued if we actually selected the temporary redirect option, or why there are two choices if they both do the same thing.

There are three situations in which you would use redirects:*

  1. If you are launching a new site and switching to a new domain.
  2. If you have deleted, removed, changed or are planning to change web page URLs.
  3. If you have a list of URLs that are derivations or misspellings of your company name/main site domain that you want to forward to your main site.

*Redirects can be configured for site domains and specific web pages/URLs.

There are two types of preferred redirects for these situations:

a) 301, for files that have been permanently moved.
b) 302, for files that have been temporarily moved.

Why are redirects such a big deal?

In the SEO world, redirects affect everything. A redirect tells servers, search engines and browsers how to handle the domain when someone requests it. If a page is deleted, a site is relaunched, or a page is renamed, it is important for both SEO and usability to make sure those old pages point to a new relevant page or an appropriate URL.

If URLs/domains are not configured properly you can lose valuable search engine visibility. Keep in mind that people and sites are linking to your content, and inbound links are one of the key influencing factors in search engine rankings. 301 redirects should be used to preserve search engine rankings and any inbound links to a particular URL. This way search engines will index the new address instead of keeping the outdated URL. It is the best option to avoid negatively impacting search engine rankings.

The reason you don’t want to use a 302 redirect is that it signals to the search engine that the old URL should be kept in the index as an active URL that has merely moved for now. The result is that none of your new URLs get indexed.

If a search engine doesn’t know where to go and runs into a dead-end URL/page this can impact your search visibility not to mention your user experience if they follow a link to a URL that no longer exists.

How do I fix my forwarding domains in GoDaddy to be properly configured for permanent redirection?

Note: If you are redirecting a domain, you DON’T want to just switch the DNS to your main site – this will create a mirrored site and a duplicate content issue for you. Search engines will not like that, and you may be penalized for it.

  1. The first step is to remove or turn off the forwarding in your GoDaddy domain control panel. Go to Domain Management, find the domain, go to Forwarding and click Manage next to it, then edit the setting to turn forwarding off.

    Search Optimization: Turning your redirects off in Godaddy

  2. If the web site you are redirecting to is hosted with GoDaddy, add the domain as an additional domain to the root through the Hosting Control panel.
  3. If the web site you are redirecting to is NOT hosted with GoDaddy, change the DNS records in the Domain Control panel to point to the IP address of the site you are redirecting to, then add the domain as an additional domain in your host’s control panel (or in your virtual hosts file).
  4. Add the following code to the .htaccess file for the main site. Create a RewriteCond and RewriteRule pair for each domain you are redirecting, and make sure to redirect both the www and non-www versions of the domain if needed. Always test your .htaccess immediately to make sure there are no errors. .htaccess can be tricky, so it is better to be safe than sorry: back up your .htaccess before making any changes.
    Options +FollowSymlinks
    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
    RewriteCond %{HTTP_HOST} ^www\.olddomain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
    RewriteCond %{HTTP_HOST} ^second-olddomain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]

    (olddomain.com, second-olddomain.com and www.newdomain.com above are placeholders; substitute the domains you are forwarding and the destination site.)

    It is wise to speak with an SEO consultant prior to making your site live so the redirects of old pages and domains are appropriately handled. Are you relaunching a site or looking to increase your online visibility? Give us a call regarding your SEO.

The Trinity of Technical SEO: Latency, Indexation and Bandwidth

Tuesday, September 7th, 2010

As most of us know, SEO goes way beyond Titles, Tags and keyword selection. Gone are the days when heavily optimizing a page for “crazy purple widgets” would get you where you want to be. As the Search Engines focus on their core goal of providing the most authoritative and relevant results for each query, we as Internet Marketers are faced with a new challenge. As web technology progresses, we must look beyond what a site “says” and look at what a site “is”. Structure and construction become even more important.

Site Latency: How fast do your site pages load?
Let’s first start with Latency. The Latency of a site is the speed at which its pages load for a visitor. There are many factors that contribute to the Latency of a web page including, but not limited to, site construction, reliance on external resources, server capacity and page size. We do our best to reach a happy medium with the factors we can control. Not every site can afford Google-size processors and server capacity; most are sharing their server with hundreds or thousands of other sites. When this is the case we have to focus on the factors we have more influence over, like how the site is constructed and how bulky it is. In the end there is only so much speed you can get out of your site for your given budget.
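You can also get a rough latency number yourself before reaching for any tool. A minimal Python sketch (the URL is whatever page you want to measure; note this measures total download time from your location, which is not exactly what a search engine sees):

```python
import time
import urllib.request

def average_load_time(url, runs=3):
    """Download a page several times and return the average wall-clock seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as resp:
            resp.read()  # include body transfer time, not just the headers
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)
```

Averaging several runs smooths out network noise, which is why single spot-checks of page speed can be misleading.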

How does one assess site Latency?
There are many tools out there that can be used for testing the latency of a site. Some are paid, which we use extensively, and some are free, which we also use extensively. One of the most useful resources is Google Webmaster Tools. In April 2010, Google officially announced it would incorporate site speed as one of the 200+ signals used in determining search rankings, and it has since become something of the de facto authority on this topic. Let’s look at an actual site example using Google Webmaster Tools.

This is a site that relies very heavily on external resources, with a lot of multimedia. There is no way around this, so the speed of the site is limited by the speed of the 6 to 12 other servers that provide the data it needs to load a page. Here is a screenshot from Google Webmaster Tools showing the time it takes to load a page:

As you can see, over the last 90 days the site’s pages load in just over 4 seconds on average. This is a site that has been receiving exponentially more traffic over this time frame. I also want to make the point that in the third week of June the site was redesigned and recoded for a better user experience and greater efficiency. As such, you can see the line is very erratic until July, when the spikes smooth out a bit and the average time decreases. Remember, we are only talking about a spread of 1.3 seconds between the high and the low. Not a huge amount, is it? Most visitors wouldn’t even notice the difference. But Search Engines do.

This brings us to Bandwidth.
Bandwidth is defined as “the maximum amount of information (bits/second) that can be transmitted along a channel”. Why does this matter? Search Engines have a limited amount of capacity available. Granted, their “limited capacity” far exceeds anything most of us can dream of, but it is also a really big Internet to crawl (over a trillion pages). Based on this, the Search Engines will allocate a certain amount of resources to crawling a particular website based on its perceived value (whether it is stated or not). A major, high-value site is going to get a substantially larger portion of Google’s resources than my Dog in Funny Hats blog.

So what does this mean for the rest of us? It means we have to make the best use of the resources we are given. Basically, when Google comes a-knockin’ it is in your best interest to make sure it has the clearest path through your site and can get as much information as possible before it leaves for its next appointment. This is why building the most search-friendly, efficient site is critical.

Below is a screenshot of the same site, same time period as before, this time showing how much bandwidth is afforded this site by Google in a given day:

Notice the trend of increasing bandwidth up until the third week of June, when the new site design was launched. Immediately before the 2.0 version of the site launched, the site was receiving a peak of almost 89,000KB of attention from Google per day. Then it decreased substantially and immediately to 12,000KB, and it has since settled in somewhere around 39,000KB. The initial impulse is to look at this and say a mistake was made, that Google isn’t as interested in crawling this site anymore. This next screenshot shows the truth – how many pages Google is crawling in a given day:

This chart pretty much speaks for itself. Based on it we can see that while the Latency of the site has only moved within a narrow band, bandwidth usage has dropped considerably, and yet Google is crawling more pages on a given visit: the leaner pages let the allotted bandwidth go further. This is a strong case for optimizing your entire site presence, not just your Titles and Tags. During the time period of this example, no on-site SEO elements were changed.

And by the way, this third chart mimics the organic site traffic trend as well. How well is your site performing? Contact us today to conduct a Technical Site Assessment and start improving or rebuilding your online presence.

Is Your Website Getting Indexed in Search Engines? Read this 4 Step Process.

Tuesday, August 10th, 2010

A Short How-To On Identifying Indexation Problems during an SEO Audit
If you think you might be having trouble with indexation there are some simple checks to do.

1. Do a physical check of the pages and site structure. This can be done with a database tool or by hand, which also gives you a chance to review the copy on your site, because most likely you haven’t done that in a while.

2. Generate an XML sitemap of the site to get a list of the URLs present on it. It is important to use a web-based or desktop application to do this and not rely on an internal tool that generates the sitemap from a CMS database. Why, you ask? External sitemap generators (web-based or desktop) do not have access to the server, so they must crawl the site from link to link, just like a search engine. This will give you a better understanding of what content is accessible and what is not.
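The crawler-style approach these generators take can be sketched in a few lines of Python. This is only an illustration of the idea (standard library only; the start URL and page limit are hypothetical inputs), not a replacement for a full-featured generator:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collect href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, limit=200):
    """Follow internal links from the start page, the way a search engine would."""
    site = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if "html" not in resp.headers.get("Content-Type", ""):
                    continue  # skip images, PDFs, etc.
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable page: it won't make the sitemap either
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            full = urljoin(url, href).split("#")[0]
            # only queue URLs on the same domain
            if urlparse(full).netloc == site and full not in seen:
                queue.append(full)
    return sorted(seen)
```

Any page that exists on the server but never shows up in `crawl()`’s output is exactly the kind of orphaned content this step is meant to expose.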

3. Check your Analytics program. Run a report of all content on the site that has received non-paid Search Engine traffic over a period of time (how long depends on your site traffic levels).

4. Query “site:yourdomain.com” in any Search Engine to get a list of the URLs from that domain that are included in the index. Also, check the www version of the site to see if there is any variance. Theoretically, these numbers should be nearly identical.

Now that you have this data, what do you do with it?
Now that you have this data, compare the lists to each other. In Step 1 you identified all the pages/URLs that exist on the site. Now compare this list to Step 2: if not all pages that physically exist on the site are present in the sitemap, you have some investigating to do. It may indicate an issue with the structure of the site that is preventing crawlers from reaching those pages.

Next, compare Step 3 with Step 1.
Are there pages that are present on the site but have never received any traffic from Search Engines? This is an indication that these pages may not be indexed by the Search Engines.

There are pages you don’t really want traffic to, like your refund policy, or your list of pending lawsuits (note: if you have this on your site, take it off). But if your product or services page is not receiving traffic, that is something that should be addressed.

Finally, compare Step 4 with Step 1.
What pages on the site are not indexed? If you see that a large number of pages are not included in Step 4, you may have an indexation issue. As a method for double-checking this issue, compare the pages missing from Step 4 with Step 3. If there are pages that are not currently in the index, but have gotten Search traffic in the past, these pages may have gotten de-indexed for some reason. Investigate why this might be, especially if you consider them mission-critical pages. Don’t overlook checking your robots.txt file. It is not uncommon to see large sections of a site disallowed, when they shouldn’t be.
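All of the comparisons above are simple set differences. A short Python illustration, with hypothetical URL lists standing in for the four steps:

```python
# Hypothetical URL lists gathered in Steps 1-4 (placeholders for your own data)
site_pages = {"/", "/products", "/services", "/refund-policy"}  # Step 1: pages on the site
sitemap_urls = {"/", "/products", "/services"}                  # Step 2: crawler-built sitemap
pages_with_traffic = {"/", "/products"}                         # Step 3: organic landing pages
indexed_urls = {"/", "/products", "/refund-policy"}             # Step 4: site: query results

unreachable = site_pages - sitemap_urls        # exists, but crawlers can't reach it
no_traffic = site_pages - pages_with_traffic   # exists, but never earned search visits
not_indexed = site_pages - indexed_urls        # exists, but missing from the index
deindexed = pages_with_traffic - indexed_urls  # had traffic once, now out of the index

print(sorted(unreachable))  # ['/refund-policy']
print(sorted(not_indexed))  # ['/services']
print(sorted(deindexed))    # []
```

Each non-empty difference points you at a specific class of problem: structure (unreachable), visibility (no traffic) or indexation (not indexed or de-indexed).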

How well is your site performing? Contact us today to conduct a Technical SEO Site Assessment and start improving or rebuilding your online presence for maximum visibility.

Turkey Teaches SEO and Social Media #thanksgiving

Monday, November 30th, 2009

Last year in my hurry to get everything prepared, cooked and ready for our glorious Thanksgiving spread, I forgot to change the setting on my oven after roasting veggies from “broil” to “bake” and my turkey only cooked halfway. The top of the turkey was juicy and a gorgeous golden brown, but the bottom was severely undercooked and I had 14 hungry turkey bird eaters ready to feast. And well, my turkey mishap got me thinking about similarities between SEO and Social Media.

Turkey Mishap teaches SEO

    1. Due to appropriate preparation and continuous basting, the turkey browned nicely. It looked good from the outside but this was merely on the surface. Similar to optimizing your web site for search engines, optimizing your titles and metadata isn’t enough. Make sure all elements of your site are optimized, from site architecture to content and multimedia to alt tags. Some of these elements are proving to be more influential and are often overlooked by many so-called SEO practitioners.

    2. Once the turkey was in, it was on the “set it and forget it” method. Like cooking a turkey, SEO is not a “set it and forget it” initiative. Just as I should have been monitoring the progress of the turkey, your SEO efforts need to be monitored continuously for progress and adjusted accordingly to achieve desired results.

Turkey Mishap teaches Social Media

    1. Although I was mortified by my mistake, I assembled the appropriate team to help me deal with the situation and I didn’t hide from my mistake (although I wanted to). In your Social Media communication it is important to remember that your customers know you will make mistakes, but it is how you handle them that they will be watching. Remember, be human. People make mistakes and don’t be afraid to get others involved to help you solve the issue.

    2. There was no mistaking that the turkey was undercooked when we cut into it. Same for your Social Media communication, people will see through the golden brown coating if you don’t have a well planned and executed social media strategy. A good product and great customer service are only two of the ingredients in a recipe for social media success.

    3. If at first you don’t succeed, fail fast, and then re-evaluate how to resolve and respond to the situation. Then, move forward quickly, the side dishes are getting cold and people will start to leave if you are unable to respond to the situation or provide an adequate solution.

    4. Don’t try to do too many things at once, but if you do, have a plan. Most importantly surround yourself with people who can help you implement and execute your plan seamlessly.

Gobble Gobble Lessons for SEO and Social Media

    1. Track and monitor progress; adjust along the way. Don’t get to the end to find out only half the plan worked or yielded you half the results you were expecting.

    2. Check to see if the oven is on. Review all elements and components of your Social Media Plan and on-site and off-site SEO. Sometimes it can be the most simple and obvious things that can foil your plan.

    3. Share your experience. Mistakes create an opportunity…for content! Case studies, articles, contests, etc. How can you share your experience to benefit your customers or provide added value to others?

Happy to say this year the turkey was juicy and delicious…and cooked all the way through. Picture above. Happy Holidays.