Google reportedly uses more than 200 signals to decide where a site is positioned in its search results. Most of these signals are unconfirmed, but years of testing by industry experts have given us a good idea of what makes a site rank.
Auditing how SEO-friendly a site is can be an overwhelming task for those just starting out. That’s where an SEO audit checklist comes in handy: it points out the top items to look for and explains why each one matters to the health of your site. So clear your mind and follow along as we look at how this all works.
What Do We Inspect During an SEO Audit?
Reviewing the health of your backlink profile can reveal whether your site has been hit with a penalty. We’ve had clients who didn’t realize what was in their backlink profile because they let other SEO “experts” handle their site using black-hat techniques. Often, the SEO work was completed long ago, back when buying links was more common.
During our SEO audit, we check for paid and spammy backlinks. These should be disavowed immediately. We also check the anchor text of the backlinks for a balanced mix of branded, naked, and generic anchors.
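For reference, a disavow file is a plain-text list you upload through Google Search Console: one URL per line, a `domain:` prefix to disavow an entire domain, and `#` for comments. The domains below are placeholders, not real spam sources:

```text
# Spammy paid links found during the audit (placeholder domains)
domain:spammy-link-farm.example
http://paid-directory.example/listing/our-client.html
```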
To avoid duplicate content issues, your pages should have canonical tags.
“A canonical tag (aka “rel canonical”) is a way of telling search engines that a specific URL represents the master copy of a page. Using the canonical tag prevents problems caused by identical or ‘duplicate’ content appearing on multiple URLs. Practically speaking, the canonical tag tells search engines which version of a URL you want to appear in search results.” – Moz
In situations where your site has multiple URLs for the same content, a canonical tag identifies the correct version among the duplicates. This consolidates all the page authority in one place instead of splitting its value.
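As a quick illustration, with example.com standing in for your domain, the canonical tag goes in the `<head>` of each duplicate variant and points to the preferred URL:

```html
<!-- On https://example.com/shoes?sort=price (and any other variant) -->
<head>
  <link rel="canonical" href="https://example.com/shoes">
</head>
```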
Above all, your website needs to be crawlable. If you run a large site, exhausting your crawl budget is a real possibility, so the efficiency of the crawl determines how quickly your pages are indexed or updated. A crawl-friendly site means fast page load speed and few server errors that would slow down a crawl.
Some websites throw images onto a page without much thought about proper formatting. Unoptimized images lead to large file sizes, which can cause a page to load slowly.
Modern website designs tend to use more imagery, and this can hinder page speed performance if not properly optimized. Search engine bots can’t read or understand an image (yet), so it’s our job to describe it to them in alt text. When it comes to alt text, we check that each image is labeled clearly and concisely.
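A minimal sketch of the difference, using a hypothetical file name and description:

```html
<!-- Vague alt text tells the bot nothing -->
<img src="chart.png" alt="image1">

<!-- Clear, concise alt text describes what the image shows -->
<img src="chart.png" alt="Bar chart of monthly organic traffic growth">
```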
The pages on your site should support one another by linking among themselves. A strong internal linking strategy streamlines navigation and creates a clear path for search engine bots.
How you link between your pages determines how page authority is spread, so avoid broken links. Don’t force internal links; make them relevant and natural to the reader. During our SEO audit, we check that you’re using keyword-rich anchor text, since this signals to search engine bots what the linked page is about.
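A quick sketch of that anchor-text advice, using a hypothetical URL:

```html
<!-- Generic anchor text wastes the signal -->
Read more about audits <a href="/seo-audit-checklist">here</a>.

<!-- Keyword-rich anchor text tells bots what the linked page covers -->
Follow our <a href="/seo-audit-checklist">SEO audit checklist</a>.
```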
Most SEO experts check metadata at the beginning of each audit. When you think SEO, you probably think about how you’re going to rank for keywords related to your industry. This is where metadata becomes so important.
Are the title tags, meta descriptions, and H1 tags formatted properly? Optimized metadata can make a significant difference in how a page performs. Websites that ignore metadata miss an opportunity to maximize organic traffic, so check against best practices for character length and placement.
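As a rough sketch of those best practices (titles of roughly 50–60 characters, descriptions of roughly 150–160, and a single H1 per page), with placeholder copy:

```html
<head>
  <!-- Title tag: keep to roughly 50-60 characters -->
  <title>SEO Audit Checklist: What to Review First</title>
  <!-- Meta description: keep to roughly 150-160 characters -->
  <meta name="description" content="Use this SEO audit checklist to review backlinks, canonical tags, metadata, page speed, and more on your own website.">
</head>
<body>
  <!-- One H1 per page, aligned with the target keyword -->
  <h1>SEO Audit Checklist</h1>
</body>
```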
You may have heard that Google confirmed page load speed as a ranking signal long ago, and numerous studies since then have shown the benefits of a fast website. Taking all of this into account, page load speed should be taken seriously.
In this day and age, your users expect things quickly. The optimal page load time is under three seconds; beyond that, you can expect impatient users to start abandoning the page.
The robots.txt file is the gatekeeper to your site. It controls what can be crawled by providing directives to search engine bots. However, it can also harm your site if used improperly. During SEO audits, our team checks robots.txt to make sure nothing is blocked that search engines should be able to crawl.
“Think of your robots file as one-stop shopping for the rules to your website. The robots file should contain disallow statements for those URLs that Google can safely ignore as well as a link to your XML sitemap that includes the URLs Google should crawl and index.”
Seth Nickerson, Senior SEO Strategist
You can quickly review this file by appending /robots.txt to your website’s URL.
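Putting that advice together, a minimal robots.txt might look like this, with placeholder paths and domain:

```text
# Example robots.txt (placeholder paths)
User-agent: *
Disallow: /admin/
Disallow: /thank-you/

Sitemap: https://example.com/sitemap.xml
```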
How are your website’s organic traffic, keyword rankings, and authority metrics? Loss of traffic can be caused by many variables, so to better understand your website’s performance, our team reviews keyword rankings and authority metrics (including page/domain authority, PageRank, and backlinks).
Keyword information is the key to making future recommendations for winning new traffic or reclaiming lost traffic. How the site performs on authority scores tells us how much work will be needed to increase performance and meet established goals.
Are the website’s pages organized in a clear and logical hierarchy that’s easy for the user and search engine bots to navigate? Pages should be organized under their logical categories and should be reflected in the URL structure.
The category name should be related to the pages beneath it in the hierarchy, helping users and search engine bots understand how the site is structured. For example, this blog post’s URL should live in a blog category folder that uses an SEO-related keyword phrase and follows the root domain.
It can be difficult to organize all the pages on the internet, and that’s why structured data is critical to an effective SEO audit. Structured data helps search engines classify pages in their index by using schema types.
When your pages are marked up correctly, they become eligible for search features in Google. These make your listing stand out and increase its appeal to searchers. Here are some examples of structured data.
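One common form is JSON-LD markup using a schema.org type, placed in the page’s HTML. All of the values below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO Audit Checklist",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-01-15"
}
</script>
```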
Sitemaps are just what they sound like: a map of your site, kind of like a treasure map. A sitemap helps search engine bots discover all the pages on your site. Make sure to include every URL you want indexed. Too often, sitemap plugins automatically add URLs that shouldn’t be indexed, such as thank-you pages. We keep an eye on what’s in your sitemap to ensure your website is properly optimized for search.
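For reference, an XML sitemap is simply a list of `<url>` entries; the URLs below are placeholders, and note that a thank-you page is deliberately left out:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/seo-audit-checklist</loc>
  </url>
  <!-- /thank-you/ intentionally excluded from the sitemap -->
</urlset>
```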
Completing a Full Website SEO Audit
No two SEO audits end up the same because every website is different and requires different evaluations. For me, it makes the audit process more interesting because of the endless things you can discover.
There’s no better feeling for our SEO team than finding an optimization mistake on a site, correcting it, and watching our clients enjoy the positive results. Try it for yourself! Tap into those problem-solving skills and follow the SEO audit checklist to see what you come up with: