Technical SEO Issues – 7 Top Offenders to Watch For


Jenny’s Note: I love SEO, but I have a confession: the technical side is my least favorite part. Give me keyword research, on-page optimization, or content strategy all day long.

Occasionally, a gem of an article about technical SEO comes my way, and I’ve shared one below. This one comes courtesy of Jill Whalen, an industry expert, who has graciously permitted me to reprint it here. It’s a great refresher on WHY technical SEO is important and on the biggest offenders to look for, and it passed my stringent “non-boring-and-actually-helpful-for-a-technical-SEO-topic” standards. Enjoy!

Technical SEO Issues by Jill Whalen

While uncovering and fixing technical issues has always been an important part of SEO, in the wake of Panda and Penguin, technical SEO has moved closer to the forefront. You may have thought that the better Google gets, the less effect technical problems would have on SEO — I know I did. But in fact, it has been the opposite. It’s not that Google can’t figure out technical SEO problems and work around them — they most certainly can and have done it for years. But it seems that they have decided to force webmasters to clean up their sites now.

It does make sense from Google’s perspective. Why should they waste their computing power to sort through badly coded websites and misconfigured servers? I can totally see them deciding that if you don’t have the time or wherewithal to fix blatant errors, then why should they show your website to their users (aka the searchers)?

Enter Google’s Webmaster Tools

For many years Google has provided a host of free webmaster tools to diagnose technical SEO issues. Yet I imagine that only a very small percentage of website owners actually use the tools, and an even smaller percentage are likely to fix the problems. So it seems that Google eventually decided to take drastic measures by downgrading sites that had the most egregious technical issues.

What better way to make site owners take notice than taking away some of their traffic?

Now, I’m not saying that all sites with any technical problems are being downgraded by Google. They’re most certainly not. But if sites have other issues that Panda and Penguin caught, PLUS they have a lot of technical issues, it’s easy to imagine the creation of a perfect storm, so to speak. That is partly why some sites that fix their spammy SEO issues without fixing their technical ones may never quite recover.

Why would Google care about technical website problems?

In most cases, it’s not the technical issues themselves that hurt your SEO efforts, but the poor user experiences those errors create. For instance, most of us agree that Google has made a big push toward showing the most user-friendly sites first in its search results. Well, what’s less user-friendly than a site where many of the links produce “Internal Server Error” pages instead of what they’re supposed to show?

Google is really a “referral engine.”

Think of it this way: What if I recommended a particular product to you, but after you bought it, it didn’t work very well? Would you trust me for future product recommendations? Probably not. It’s the same with Google. They need to refer searchers to the most relevant results that also work as they should. A site with lots of technical errors *should* be downgraded by Google because it provides a poor user experience.

By now you’re probably wondering what sorts of technical SEO issues might cause Google’s black-and-white animal hitmen to downgrade your site. While the list is long, below are the ones I’ve compiled that I see the most often when I’m auditing penalized sites. They’re generally ones that either cause a poor user experience or simply make your site harder for Google to crawl, index, or read.

Technical SEO issues include (but are definitely not limited to):

  • Server errors: This includes tons of 404 errors on a site (especially bad if the rest of the site is internally linking to them), what Google calls soft 404s, plus 500 server errors, and generally any pages that can’t be accessed by Google (or any spider).
  • Incorrect HTTP header responses: This includes redirects that simply don’t redirect at all, ones that return a 302 response when a permanent 301 is intended, and 404 pages that respond with a 200, 301, or 302 instead of a 404.
  • Multiple redirects: This includes any redirect that makes more than one hop before a user lands on the page they’re ultimately supposed to land on. While Google can and does handle one or two hops, it’s prudent to set your redirects to go directly to the final URL without any stops in the middle (the first sketch after this list flags long chains and loops).
  • Redirect loops: This is when you redirect a URL to a different URL that is redirecting back to the first URL (yes, I’ve actually seen this in action!).
  • Misconfigured canonical link elements: Ever see a site that inadvertently pointed all of its pages to the home page via rel=”canonical”? I’ve seen many. (True confession here: when the tag was new, I even did it myself once with my forum…oops!) The second sketch after this list shows a quick check for this.
  • Requiring JavaScript or cookies to view something: Search engines traditionally don’t use JavaScript or cookies, so if the only way to see something on your site requires them, there’s a good chance none of that information will be indexed.
  • Pages indexed that shouldn’t be: I’ve seen all sorts of these, from server index pages to those that pop up Ajax errors.
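To make the first few items concrete, here is a minimal diagnostic sketch, assuming illustrative URLs, a hop limit of 5, and the third-party Python “requests” library (this is my illustration, not any official tool). It follows each redirect hop manually, flags 302s where a permanent 301 may be intended, reports long chains, loops, and server errors, and runs a crude soft-404 test by requesting a deliberately bogus path, which a correctly configured server should answer with a real 404:

    # Redirect/status audit - a minimal sketch with assumed URLs.
    import requests
    from urllib.parse import urljoin

    MAX_HOPS = 5  # assumed limit; more than one hop is already worth fixing

    def audit(url):
        hops, current, seen = 0, url, {url}
        while hops <= MAX_HOPS:
            resp = requests.get(current, allow_redirects=False, timeout=10)
            code = resp.status_code
            if code in (301, 302, 303, 307, 308):
                if code == 302:
                    print(f"WARN  {current}: 302 where a permanent 301 may be intended")
                nxt = urljoin(current, resp.headers.get("Location", ""))
                if nxt in seen:
                    print(f"LOOP  {url}: {current} redirects back into its own chain")
                    return
                seen.add(nxt)
                current, hops = nxt, hops + 1
            else:
                if hops > 1:
                    print(f"CHAIN {url}: {hops} hops before settling on {current}")
                if code >= 500:
                    print(f"5xx   {current}: server error {code}")
                elif code == 404:
                    print(f"404   {current}")
                return
        print(f"CHAIN {url}: gave up after {MAX_HOPS} hops")

    def soft_404_check(base):
        # A healthy server should answer a nonsense path with a real 404.
        bogus = urljoin(base, "/no-such-page-xyz-123456")
        if requests.get(bogus, timeout=10).status_code == 200:
            print(f"SOFT  {base}: bogus path returned 200 instead of 404")

    for u in ("https://example.com/old-page", "https://example.com/moved"):
        audit(u)
    soft_404_check("https://example.com/")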
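The canonical misconfiguration is just as easy to screen for: fetch a handful of pages and warn whenever a page other than the home page declares the home page as its canonical URL. Again a minimal sketch with assumed URLs, built on the standard library’s html.parser:

    # Canonical check - flags pages whose rel="canonical" points home.
    from html.parser import HTMLParser
    import requests

    class CanonicalFinder(HTMLParser):
        """Records the href of the first <link rel="canonical"> tag."""
        def __init__(self):
            super().__init__()
            self.canonical = None
        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "link" and (a.get("rel") or "").lower() == "canonical":
                self.canonical = self.canonical or a.get("href")

    HOME = "https://example.com/"  # assumed site
    for url in (HOME, HOME + "about/", HOME + "blog/some-post/"):
        finder = CanonicalFinder()
        finder.feed(requests.get(url, timeout=10).text)
        if url != HOME and finder.canonical and finder.canonical.rstrip("/") == HOME.rstrip("/"):
            print(f"WARN {url}: canonical points at the home page")
        else:
            print(f"OK   {url}: canonical = {finder.canonical}")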

How to diagnose technical SEO issues

As previously mentioned, you can find most of these issues by digging into your Google Webmaster Tools account. You’ll find lists of 404 errors, soft 404s, crawl errors, and pages that Google simply can’t access. You can even try to fetch problematic pages as Googlebot to gain additional insight.
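If you want a rough approximation of a Googlebot fetch outside the tools, you can request a page with Googlebot’s user-agent string and compare the response against a normal browser fetch. A minimal sketch with an assumed URL, using the third-party requests library (this only mimics the user-agent; the real Fetch as Googlebot does more):

    # Rough stand-in for "Fetch as Googlebot" - compares responses by UA.
    import requests

    URL = "https://example.com/problem-page"  # assumed
    UAS = {
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
        "browser": "Mozilla/5.0",
    }
    for label, ua in UAS.items():
        resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
        print(f"{label:9} status={resp.status_code} bytes={len(resp.content)}")
    # A different status code or a wildly different size between the two
    # fetches suggests the server treats crawlers differently.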

I also highly recommend using a spidering tool such as ScreamingFrog. This tool will spider your entire site and provide you with all kinds of feedback. One thing to remember if you use a tool like this, however, is that just because the tool finds all kinds of strange things, it doesn’t mean that Google is also finding them. Be sure to double-check Google’s index before you panic!
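For the curious, the core of what such a spidering tool does fits in a few lines: crawl one host, record each page’s status code, and report the URLs that answer with errors. A toy sketch (not ScreamingFrog itself; the start URL and page cap are assumptions):

    # Toy site spider - reports pages on one host that return errors.
    import re
    import requests
    from urllib.parse import urljoin, urlparse

    START_URL = "https://example.com/"  # assumed
    MAX_PAGES = 200                     # assumed safety cap
    host = urlparse(START_URL).netloc

    queue, seen = [START_URL], set()
    while queue and len(seen) < MAX_PAGES:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"FAIL {url}: {exc}")
            continue
        if resp.status_code >= 400:
            print(f"{resp.status_code} {url}")
            continue
        # naive href extraction is fine for a quick audit sketch
        for href in re.findall(r'href=["\'](.*?)["\']', resp.text):
            link = urljoin(url, href).split("#")[0]
            if urlparse(link).netloc == host:
                queue.append(link)
    print(f"Checked {len(seen)} pages")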

The key takeaway here is to not just find your site’s technical errors, but to actually fix them. Even if your site hasn’t lost any traffic over the years, if you find a bunch of technical errors, they could be keeping you from receiving all the search engine traffic you deserve. It’s very possible that spending a day fixing these problems could pay off handsomely in the long run. If nothing else, it will certainly keep your users happier!

Jill 

Jill Whalen has been an SEO Consultant and the CEO of High Rankings, a Boston-area SEO company, since 1995. Follow her on Twitter @JillWhalen.

If you learned from this article, be sure to sign up for the High Rankings Advisor SEO Newsletter so you can be the first to receive similar articles in the future!



Jenny Munn

Jenny is an independent Digital Marketer and SEO Consultant with more than 10 years of experience helping companies and content creators generate brand awareness, traffic, and conversions with SEO. She is a frequent speaker, serves on the faculty of the AMA (American Marketing Association), and has taught SEO to thousands of marketers over the past 10 years.