SEO Tips for Business Websites

Yes, today’s world of SEO can be one overwhelming wave. With a never-ending flow of articles and videos trumpeting the many steps required to appease Google and friends, high page rankings can feel out of reach. Sometimes the less intimidating approach is to examine the common mistakes made by SEO do-it-yourselfers and their professional counterparts. By first ensuring you’re not guilty of any of today’s most frequent blunders, you can feel confident that nothing is amiss. Then move on to the myriad how-tos and enjoy the fruits of your labors.

So what flubs are all the rage in SEO realms? You already know the importance of quality content, so for the sake of this article, we’ll focus primarily on on-site SEO oversights.

1. The Overuse of Keywords

Keywords are a critical component of great SEO, but key phrases are currently being massively overused across thousands of websites. The result is an over-optimized page, and when the stuffing is obvious, it makes for a terrible user experience. Visitors are savvy; they know when a search engine is being courted over their own involvement. Use your researched keywords precisely and sparingly.

2. Duplicate Content and Canonicalization

Most site owners now understand that duplicate content is a big no-no, but the infraction is still rampant.

Here are the steps to take to ensure you’re not guilty as charged:

  • Comb your own site to be sure you haven’t duplicated content anywhere.
  • Use a site like Copyscape to search the web and see if any of your content has been scraped.
  • If you must keep duplicate content for any reason, add a 301 redirect from the duplicate to the original so Google does not read both as original posts (see the sketch after this list).
  • Make sure you are implementing your canonical tags correctly (rel=canonical), as shown below. This tells search engines which URL is the authoritative version of each page. Failure to add these across your site can cause a significant drop in SERPs.
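
As a minimal sketch of both fixes (an Apache server is assumed, and the URLs are hypothetical), the 301 redirect lives in your .htaccess file, while the canonical tag goes in the head of each page:

  # .htaccess (Apache): permanently redirect the duplicate to the original
  Redirect 301 /old-duplicate-page/ https://www.example.com/original-page/

  <!-- In the <head> of every indexable page: name the authoritative URL -->
  <link rel="canonical" href="https://www.example.com/original-page/" />

Note that the two are alternatives, not a pair: a redirected page is never served at all, while a canonicalized page still loads but tells search engines where the credit belongs.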

3. Incorrect Linking Habits

Just like keywords, internal links are often overdone. Remember that if you attempt to use any SEO tactics in an excessive manner, Google will likely catch on and assume you’re being sneaky.

Internal links are important for credibility and an overall good user experience, but as a rule, limit your links to 1-2 per 500 words. Likewise, be methodical about the anchor text you use. Anchor text speaks directly to search engine bots; resist the urge to be creative, and make each anchor descriptive, concise, and on point.
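
For illustration (the URL and link text here are hypothetical), compare a vague anchor with a descriptive one:

  <!-- Vague: tells bots and readers nothing about the destination -->
  <a href="/services/seo-audit/">click here</a>

  <!-- Descriptive, concise, and on point -->
  <a href="/services/seo-audit/">our SEO audit service</a>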

4. Problems with Hosting and Servers

If your site is hosted on a shared server alongside many other sites, Google may ding your rankings. It is now recommended that you have your very own dedicated server, so that you are not affected by the mistakes your server neighbors make.

Ensure your caching and compression tools are in top shape and your response times are adequately fast. Most importantly, monitor your site’s stability and note any significant downtime; if you’re down for more than 24 hours, search engines will notice and will likely penalize you. Stability is critical.
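
As a minimal sketch of the compression and caching side (assuming an nginx server; the values shown are illustrative, not recommendations), the relevant configuration might look like this:

  # nginx (inside the server block): compress text assets and
  # let browsers cache static files
  gzip on;
  gzip_types text/css application/javascript application/json;

  location ~* \.(css|js|png|jpg|svg)$ {
      expires 30d;
      add_header Cache-Control "public";
  }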

5. Site Maps and a Crawlable Website

It’s far too common for site owners to generate a sitemap at launch and then swiftly forget it exists. It’s crucial to keep your sitemap current, with comprehensive links to all indexable pages, and to let the search engines know every time you update it. This ensures the engines know about all of your content, especially those pages a standard crawl might miss.
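
For reference (the domain and date are hypothetical), a sitemap is just a short XML file listing each indexable URL, which you then submit through a tool such as Google Search Console:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/original-page/</loc>
      <lastmod>2016-01-15</lastmod>
    </url>
  </urlset>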

And, of course, your website as a whole should be crawlable. It’s shocking how many are not. Crawlable means none of the pages on your site are hidden; that is, every page is reachable through simple navigation links. If users can’t find your pages, neither can Google.

If you have content that you’d like to temporarily keep hidden, use robots.txt to prevent those pages from being crawled. Here’s an actual quote from Google about how to use it correctly:

“While Google won’t crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project, can appear in Google search results. To entirely prevent a page’s contents from being listed in the Google web index even if other sites link to it, use a noindex meta tag or x-robots-tag.”
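
As a minimal sketch of both approaches (the path and domain are hypothetical), a robots.txt rule blocks crawling, while a noindex directive keeps a page out of the index:

  # robots.txt: keep all crawlers out of a staging area (blocks crawling only)
  User-agent: *
  Disallow: /staging/

  # Optional but helpful: point crawlers at your sitemap
  Sitemap: https://www.example.com/sitemap.xml

  <!-- noindex meta tag, placed in the page’s <head> -->
  <meta name="robots" content="noindex">

Note the catch implied by Google’s quote: a crawler can only see a noindex tag on a page it is allowed to crawl, so don’t block a page in robots.txt and rely on its noindex tag at the same time.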

6. Sluggish Load Times and Poor Usability

If you really want to see your SERPs plummet, create a user interface that’s confusing, convoluted, and slow to load. Page speed does in fact matter, as does intuitive navigation. Studies show you have only a few seconds to establish engagement and trust with a new visitor. This means the site needs to communicate what it is, clearly and quickly.

If your site doesn’t pass this test, Google will notice eventually. More importantly, so will your users. If you’re not doing a decent job of guiding visitors down the ideal path, your conversion rates are surely reflecting this failure.

Next, clock the speed it takes for your site to load. Many debate how much load speed actually moves rankings; the fact is, even if only a small percentage of pages are affected by this factor, a sluggish site is foolish from a user perspective. Aim for a PageSpeed score above 90, and you might see an increase in rankings; you will surely see a drop in your bounce rates.
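
For a quick first measurement (assuming curl is installed; the URL is hypothetical), you can time a page fetch from the command line before reaching for a full audit tool such as PageSpeed Insights:

  # Print the total time taken to fetch the page, in seconds
  curl -o /dev/null -s -w "Total time: %{time_total}s\n" https://www.example.com/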

Remember – what’s good for search engines is also good for users; create a highly usable site, and you are poised for a win-win.

By Tina Courtney-Brown