SEO Tips for Business Websites

While uncovering and fixing technical issues has always been an important part of SEO, in the wake of Panda and Penguin, technical SEO has moved closer to the forefront. You may have thought that the better Google gets, the less effect technical problems would have on SEO — I know I did. But in fact, it has been the opposite. It’s not that Google can’t figure out technical SEO problems and work around them — they most certainly can and have done it for years. But it seems that they have decided to force webmasters to clean up their sites now.

It does make sense from Google’s perspective. Why should they waste their computing power to sort through badly coded websites and misconfigured servers? I can totally see them deciding that if you don’t have the time or wherewithal to fix blatant errors, then why should they show your website to their users (aka the searchers)?

Enter Google’s Webmaster Tools

For many years Google has provided a host of free webmaster tools to diagnose technical SEO issues. Yet I imagine that only a very small percentage of website owners actually use the tools, and an even smaller percentage are likely to fix the problems. So it seems that Google eventually decided to take drastic measures by downgrading sites that had the most egregious technical issues.

What Better Way to Make Site Owners Take Notice than Taking Away Some of Their Traffic?

Now, I’m not saying that all sites with any technical problems are being downgraded by Google. They’re most certainly not. But if a site has other issues that Panda and Penguin caught, PLUS a lot of technical issues, it’s easy to imagine the creation of a perfect storm, so to speak. That’s partly why some sites that fix their spammy SEO issues without fixing their technical ones may never quite recover.

Why Would Google Care About Technical Website Problems?

In most cases, it’s not the technical issues themselves that are hurting your SEO efforts, but the poor user experience those errors create. For instance, most of us agree that Google has made a big push toward showing the most user-friendly sites first in their search results. Well, what’s less user friendly than a site where many of the links produce “Internal Server Error” pages instead of what they’re supposed to show?

Google is Really a “Referral Engine.”

Think of it this way: What if I recommended a particular product to you, but after you bought it, it didn’t work very well? Would you trust me for future product recommendations? Probably not. It’s the same with Google. They need to refer searchers to the most relevant results that also work as they should. A site with lots of technical errors *should* be downgraded by Google because it provides a poor user experience.

By now you’re probably wondering what sorts of technical SEO issues might cause Google’s black-and-white animal hitmen to downgrade your site. While the list is long, below are the ones I’ve compiled that I see the most often when I’m auditing penalized sites. They’re generally ones that either cause a poor user experience or simply make your site harder for Google to crawl, index, or read.

Technical SEO issues include (but are definitely not limited to):

  1. Server errors: This includes tons of 404 errors on a site (especially bad if the rest of the site is internally linking to them), what Google calls soft 404s, 500 server errors, and, generally, any pages that can’t be accessed by Google (or any other spider).
  2. Incorrect HTTP header responses: This includes redirects that simply don’t redirect at all, ones that return 302 responses instead of 301s, and 404 pages that respond with a 200, 301, or 302 instead of a 404.
  3. Multiple redirects: This includes any redirect that makes more than one hop before a user lands on the page they’re ultimately supposed to reach. While Google can and does handle one or two hops, it’s prudent to set your redirects to go directly to the final URL you want your users to land on, without any stops in the middle.
  4. Redirect loops: This is when you redirect a URL to a different URL that is redirecting back to the first URL (yes, I’ve actually seen this in action!).
  5. Misconfigured canonical link elements: Ever see a site that inadvertently pointed all of its pages to the home page via rel="canonical"? I’ve seen many. (True confession here — when the tag was new I even did it myself once with my forum…oops!)
  6. Requiring JavaScript or cookies to view content: Search engines traditionally don’t execute JavaScript or accept cookies, so if the only way to see something on your site requires them, there’s a good chance none of that information will be indexed.
  7. Pages indexed that shouldn’t be: I’ve seen all sorts of these, from server index pages to those that pop up Ajax errors.
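A quick way to spot several of the issues above — bad status codes, 302s where 301s belong, multiple hops, and redirect loops — is to trace a URL’s redirect chain yourself. Below is a minimal sketch using only Python’s standard library; the `trace_redirects` function name and the hop limit are my own illustrative choices, not anything from a specific SEO tool.

```python
import http.client
from urllib.parse import urljoin, urlsplit

def trace_redirects(url, max_hops=5):
    """Follow a URL's redirect chain one hop at a time.

    Returns a list of (url, status_code) tuples, so you can spot
    multiple hops, 302s that should be 301s, 404 pages answering
    with a 200, and redirect loops (the same URLs repeating until
    max_hops is exhausted).
    """
    chain = []
    for _ in range(max_hops):
        parts = urlsplit(url)
        conn_cls = (http.client.HTTPSConnection
                    if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc, timeout=10)
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        conn.request("GET", path)
        resp = conn.getresponse()
        chain.append((url, resp.status))
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 303, 307, 308) and location:
            url = urljoin(url, location)  # Location may be relative
        else:
            break  # final page (2xx), an error (4xx/5xx), or a broken redirect
    return chain
```

If the chain shows more than one redirect before the final 200, or an error page that answers with a 200, that’s exactly the kind of misconfiguration described in items 1–4.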

How to Diagnose Technical SEO Issues

As previously mentioned, you can find most of these issues by digging into your Google Webmaster Tools account. You’ll find lists of 404 errors, soft 404s, crawl errors, and pages that Google simply can’t access. You can even fetch problematic pages as Googlebot to gain additional insight.

I also highly recommend using a spidering tool such as Screaming Frog. It will crawl your entire site and provide you with all kinds of feedback. One thing to remember with a tool like this, however, is that just because it finds all kinds of strange things doesn’t mean Google is also finding them. Be sure to double-check Google’s index before you panic!

The key takeaway here is not just to find your site’s technical errors, but to actually fix them. Even if your site hasn’t lost any traffic over the years, a batch of technical errors could be keeping you from receiving all the search engine traffic you deserve. It’s very possible that spending a day fixing these problems could pay off handsomely in the long run. If nothing else, it will certainly keep your users happier!

By Jill Whalen