How to Conduct a Technical SEO Site Audit

Once you’ve finished your on-page SEO audit, the next step is to conduct a technical site audit. You don’t want technical issues to make it difficult for search engines to crawl and index your site, as that will make it tougher to rank highly and cost you traffic or potential business.

Information Architecture

Information architecture can be slightly tricky because it is a broad topic that encompasses many different areas. At its core, information architecture describes how information is organized and flows through your site.

There are three main areas of information architecture to watch out for.

  • Site navigation: Navigation should be intuitive to the user. Grouping similar content or product categories together helps keep it clear. If the site is intuitive and easy for users to navigate, then in all probability it will also be easy for search engine spiders to crawl.
  • Labeling and naming conventions: Labeling refers to how you name your categories and navigation links. Labeling incorrectly, or failing to use targeted keywords, can hurt your rankings, so it's important to get this right.
  • Directory structures: Directory structure refers to the folder hierarchy on your site. A recommended approach is to keep pages no more than three folders deep from the root; a product page, for example, might live at example.com/category/subcategory/product.
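To audit directory depth at scale, you can measure how many folders deep each URL sits. Here is a minimal sketch using only Python's standard library; the example URL is hypothetical:

```python
from urllib.parse import urlparse

def folder_depth(url):
    """Count how many folders deep a URL's path goes, excluding the page itself."""
    path = urlparse(url).path.strip("/")
    if not path:
        return 0
    segments = path.split("/")
    # The final segment is the page; everything before it is a folder.
    return len(segments) - 1

# A product page three folders deep -- at the recommended limit.
print(folder_depth("https://example.com/clothing/mens/shirts/blue-oxford"))  # 3
```

Running this over a full URL list (e.g. from a crawl export) quickly surfaces pages buried deeper than the three-folder guideline.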


Identifying and fixing broken links to repair 404s definitely needs to be part of your audit checklist. Broken links can cause spider traps and waste crawl budget. They can also signal to search engine spiders that the site hasn't been updated recently, causing indexing issues.

It’s always a good idea to use a tool like Xenu Link Sleuth or Screaming Frog to check for broken links. Google Webmaster Tools also provides a detailed list of broken links on your site and pinpoints the exact pages where they originate. If your site is verified in Google Webmaster Tools, you may receive a crawl-error notification there.
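If you want to spot-check a handful of links without a full crawler, the idea behind these tools can be sketched in a few lines of standard-library Python. This is a simplified sketch, not a replacement for Screaming Frog or Xenu:

```python
import urllib.request
import urllib.error

def check_link(url, timeout=10):
    """Return the HTTP status code for a URL, or None if the request fails outright."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # e.g. 404, 500
    except urllib.error.URLError:
        return None  # DNS failure, refused connection, etc.

def is_broken(status):
    """Treat client errors, server errors, or no response at all as broken."""
    return status is None or status >= 400

# Usage (network call, so shown as a comment):
# status = check_link("https://example.com/old-page")
# if is_broken(status):
#     print(f"BROKEN ({status}): https://example.com/old-page")
```

A real audit would feed this a full list of internal URLs and log every broken result alongside the page it was found on.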


Crawling/Indexing Issues

If the pages on your site can’t be crawled, then they can’t be indexed by search engines. Many common issues impede a search bot’s ability to crawl your site. Most notable among them are broken links (404s), content in Flash, AJAX, or iFrames, and links embedded inside JavaScript.
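The JavaScript problem is easy to demonstrate: a crawler parsing your HTML only sees plain anchor tags, so a "link" wired up through an onclick handler is invisible to it. Here is a small illustration using Python's built-in HTML parser (the HTML snippet is hypothetical):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from plain <a> tags -- the links a crawler can follow.
    Navigation generated by JavaScript (onclick handlers, AJAX-loaded content)
    never shows up here, which is exactly why crawlers can miss it."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = """
<a href="/products/widgets">Widgets</a>
<span onclick="location.href='/hidden-page'">Hidden</span>
<a href="/about">About</a>
"""
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # /hidden-page never appears
```

If an important page is only reachable through JavaScript navigation like the span above, add a plain HTML link to it as well.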

Site Loading Time

A slow-loading site will result in a higher visitor drop-out rate and, eventually, fewer conversions. Keep an eye on page weight: excessive scripts, heavy images, and Flash files all increase it. Ideally, keep maximum page weight to around 130K for your home page and 180K for all other pages.
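Checking a page against those budgets is simple arithmetic once you have the byte sizes of its resources. The sizes below are made-up illustrative numbers, not measurements of any real page:

```python
# Hypothetical resource sizes in bytes for a home page's assets.
resources = {
    "index.html": 24_000,
    "styles.css": 18_000,
    "app.js": 55_000,
    "hero.jpg": 48_000,
}

HOME_PAGE_BUDGET = 130 * 1024  # the ~130K home-page guideline above

total = sum(resources.values())
print(f"Total page weight: {total / 1024:.1f}K")
if total > HOME_PAGE_BUDGET:
    print("Over budget -- consider compressing images or deferring scripts.")
```

In this sketch the page comes in over budget, and the largest single items (the script and the hero image) are the natural first targets for trimming.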

Google’s recent focus on faster-loading sites makes it critical for SEOs to include site loading time as part of a technical site audit. Site performance tools like Gomez and Pingdom can help you determine your site’s loading times, and Google Webmaster Central also reports on how your site performs.

