24 Mar 2015

3 Technical SEO Fundamentals You Shouldn’t Neglect

According to one of my favorite motivational speakers, Jim Rohn, “Success is neither magical nor mysterious. Success is the natural consequence of consistently applying the basic fundamentals.”

This is especially true in the context of SEO.

If you don’t know the difference between a canonical tag and an hreflang tag; how responsive web design compares to adaptive web design; or how best practices in information architecture can bolster your SEO efforts, this article is probably for you.

In the age of Penguin, Panda, and Hummingbird, more business owners than ever are moving towards content marketing, defined by Content Marketing Institute as “creating and distributing valuable, relevant, and consistent content in an effort to attract and retain a clearly-defined audience.”

A positive by-product of creating and distributing high-quality, audience-aligned content is that it earns links, allowing business owners to move away from having to actively build links while simultaneously bolstering the strength of their search presence.

Unfortunately, all the high-quality content and earned links in the world won’t do much to move your site up the search engine results pages (SERPs) if you’re neglecting the fundamentals of technical SEO.

1. Duplicate Content

According to Google, duplicate content is content “that either completely match[es] other content or [is] appreciably similar.” What’s interesting is that Google also makes a point of stating that duplicate content is mostly not deceptive in origin.


Non-deceptive duplicate content is among the most frequent technical SEO issues we see here at Vertical Measures, rearing its ugly head in everything from printer-only versions of web pages to sites that express no preference for whether search engines should serve the “www” or “non-www” version of their content.
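Not sure whether your own site enforces a single hostname? It takes seconds to check. Below is a minimal sketch in Python (assuming the requests library is installed; example.com stands in for your own domain) that fetches both versions of a homepage and reports whether one permanently redirects to the other:

    # Check whether a site enforces a single hostname: fetch the "www" and
    # "non-www" versions of the homepage and see whether one of them
    # 301-redirects to the other. example.com is a placeholder domain.
    import requests

    def check_www_preference(domain):
        for url in ("http://{}/".format(domain), "http://www.{}/".format(domain)):
            response = requests.get(url, allow_redirects=False, timeout=10)
            if response.status_code == 301:
                print("{} redirects permanently to {}".format(
                    url, response.headers.get("Location")))
            else:
                print("{} answers directly with HTTP {}".format(
                    url, response.status_code))

    check_www_preference("example.com")

If both versions answer with HTTP 200, search engines are potentially seeing two copies of every page on your site.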

Why is this such an issue?

The goal of Google’s web crawler – Googlebot – is to crawl the web to discover new and updated pages to be added to Google’s index. It also happens to run on a huge set of computers.

Like any other computer, it is only capable of doing so much at once. In order to maximize efficiency, Googlebot uses an algorithmic process to determine which sites to crawl, how often, and how many pages to fetch from each site.

Duplicate content hampers Googlebot in achieving its goal. Remember: Googlebot’s goal is to discover new and updated pages. By definition, duplicate content is neither new nor updated, and after a while, Googlebot will start to crawl your site less often and less deeply.

As a result, any new content you create, Pulitzer Prize material or not, has less of a chance of being found by Googlebot, let alone indexed and shown in the SERPs for its target keyword.

So, what can you do? A lot.

Luckily, Art, our Director of SEO Services, held a webinar on this topic entitled Prepare for Panda: How to Destroy ALL Your Duplicate Content. In it, you can find a boatload of actionable tips for cleaning up your site and setting systems in place to prevent any unintentional duplicate content in the future.
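As one rough starting point (a sketch of my own, not a substitute for the webinar): pages that fail to declare a canonical URL are a common source of unintentional duplicates. This Python sketch, which assumes the requests and beautifulsoup4 packages and uses placeholder URLs, flags pages missing a rel="canonical" link tag:

    # Flag pages that lack a rel="canonical" link tag, one common way
    # unintentional duplicates slip into the index. The URLs below are
    # placeholders; swap in pages from your own site.
    import requests
    from bs4 import BeautifulSoup

    def find_missing_canonicals(urls):
        missing = []
        for url in urls:
            html = requests.get(url, timeout=10).text
            soup = BeautifulSoup(html, "html.parser")
            if soup.find("link", rel="canonical") is None:
                missing.append(url)
        return missing

    pages = [
        "http://www.example.com/",
        "http://www.example.com/print/some-page.html",
    ]
    for page in find_missing_canonicals(pages):
        print("No canonical tag on: " + page)

Anything a check like this flags, especially printer-only versions and parameter-laden URLs, deserves a closer look in your next duplicate content audit.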

2. Mobile-Friendliness

On February 26, 2015, Google published a post on their Webmaster Central blog titled Finding more mobile-friendly search results.

In it, they state:

“Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.”

Let me repeat that…

“Starting April 21 … [Google’s] use of mobile-friendliness as a ranking signal … will have a significant impact in [their] search results.”

If the importance of mobile-friendliness is a surprise to you, here are some mobile statistics that’ll help you understand why Google is suddenly announcing the date of an algorithm change in advance (something they rarely do):

  • “Among U.S. adults, 22.9% of all media time in 2014 was spent on mobile” ~ Kapost
  • “Mobile searches (roughly 85.9 billion) will surpass desktop searches in 2015” ~ Business2Community
  • “57% of the United States owns a smartphone” ~ Business2Community
  • “81% of conversions from mobile search happen within five hours of the search” ~ Business2Community


To remain competitive, and to stay true to its belief that if you focus on the user, all else will follow, Google must adapt its algorithm to the significant role mobile-friendliness plays in the overall user experience of the search engine.

If Google continued to serve up sites that weren’t friendly to mobile devices, it’d likely result in an immediate loss of search engine market share.

That said, how do you know if Google considers your site mobile-friendly?

Luckily for us, they’ve provided two tools that tell you exactly whether they consider your site mobile-friendly, and if not, why: the Mobile-Friendly Test and the Mobile Usability Report.

The first is a public-facing tool; the other is accessible only via Google Webmaster Tools.

For those without Webmaster Tools, plug your website’s URL (or multiple URLs from subdirectories of your website) into the Mobile-Friendly Test, and within a matter of seconds you’ll know whether Google considers those URLs mobile-friendly.

For those with Webmaster Tools, simply click the 2nd link above.

Regardless of which tool you use, if your site doesn’t pass as mobile-friendly, you’ll be given reasons why along with links to resources on how to fix the issues.

Identify the issues, plug them into a spreadsheet, and begin working on addressing them. If you have a web developer, even better! Simply shoot the spreadsheet over to them and let them do what they do best.
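If you’d like a head start before feeding every URL through Google’s tools, a missing viewport meta tag is one of the most common reasons a page fails the Mobile-Friendly Test. Here’s a rough Python sketch (again assuming requests and beautifulsoup4, with placeholder URLs) that pre-screens a list of pages for the tag and logs the results to a spreadsheet-ready CSV:

    # Pre-screen URLs for a <meta name="viewport"> tag and write the
    # results to a CSV you can hand off to a developer. The URLs are
    # placeholders; this is a first-pass check, not a replacement for
    # Google's Mobile-Friendly Test.
    import csv
    import requests
    from bs4 import BeautifulSoup

    def audit_viewport(urls, outfile="mobile-audit.csv"):
        with open(outfile, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["url", "has_viewport_meta"])
            for url in urls:
                html = requests.get(url, timeout=10).text
                soup = BeautifulSoup(html, "html.parser")
                has_viewport = soup.find("meta", attrs={"name": "viewport"}) is not None
                writer.writerow([url, "yes" if has_viewport else "no"])

    audit_viewport(["http://www.example.com/", "http://www.example.com/contact/"])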

3. Site Structure

There is no question that a good site structure makes for a great user experience, and that a great user experience often results in search engine benefits, but why?

Consider for a second that your site is suddenly stripped of its colors and fonts, its kerning and images – everything that makes it pretty.

What’s left? Text in the raw.

Now imagine someone visits your site at this exact moment. How confident are you that they’ll be able to find what they’re looking for without any visual cues? My guess: not very.

In the 1900s, Swiss psychologist Jean Piaget developed the concept of cognitive equilibrium. Cognitive equilibrium, Piaget stated, is a state of balance between an individual’s expectations of how something should be and the degree to which whatever they’re interacting with meets those expectations.

Admittedly, knowing an individual’s exact expectations for a website isn’t possible, but you can be fairly confident that at the core of those expectations is the ability to find and navigate to the information they’re looking for in a way that makes logical sense to them.

What’s interesting is that Googlebot functions in much the same way!

When Googlebot hits your site, it gets your site’s text in the raw. Its cognitive equilibrium can only be achieved if it can find and navigate to the information it’s looking for, with the added requirement that it must also be able to understand what that content is about.

How can you accomplish this?

There are many ways, but in the context of site structure, here are two steps you can take to start the process:

  1. Indexable content. Ensure your content is indexable in the first place by putting your most important content in HTML text format. A common issue business owners run into is that they fall in love with aesthetics at the expense of crawlability. Instead of keeping their most important content in HTML text format, they trap it in images, Flash files, and Java applets, making it difficult for crawlers to figure out what the content is about, let alone index it.
  2. Crawlable link structures. Crawlable links, by definition, are those that enable crawlers to browse the pathways of a website. A common issue with link structures is that content sometimes sits too deep within a site’s architecture. Googlebot isn’t able to navigate to it as quickly as it would like, and because of this, it halts all attempts to find it after an appreciable amount of time. Generally, you should make all of your content reachable in as few clicks as possible, ideally within three, e.g. http://www.example.com/category-keyword/subcategory-keyword/primary-keyword.html (see the crawl sketch just after this list).
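To see how deep your own content actually sits, a simple breadth-first crawl from the homepage is enough. Below is a minimal sketch in Python (assuming requests and beautifulsoup4, using example.com as a placeholder, and skipping niceties like robots.txt handling and crawl delays) that records the click depth of every internal page and prints anything buried more than three clicks down:

    # Breadth-first crawl that records each internal page's click depth
    # from the homepage. example.com is a placeholder; a real crawl should
    # also respect robots.txt and throttle its requests.
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    def crawl_depths(start_url, max_depth=5):
        site = urlparse(start_url).netloc
        depths = {start_url: 0}           # URL -> clicks from the homepage
        queue = deque([start_url])
        while queue:
            url = queue.popleft()
            if depths[url] >= max_depth:  # stop expanding past the depth we care about
                continue
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                link = urljoin(url, a["href"]).split("#")[0]
                if urlparse(link).netloc == site and link not in depths:
                    depths[link] = depths[url] + 1
                    queue.append(link)
        return depths

    for url, depth in crawl_depths("http://www.example.com/").items():
        if depth > 3:
            print("{} clicks deep: {}".format(depth, url))

Anything the crawl flags is a candidate for a link from a hub page or a spot higher up in your navigation.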

To wrap things up, here’s a quote from another one of my favorite motivational speakers, John C. Maxwell:

“Small disciplines repeated with consistency every day lead to great achievement gained slowly over time.”

If you want to achieve great success for your business in search, you must stick to the small disciplines — the seemingly minor and inconsequential things that make up the fundamentals of technical SEO — and you must do them often.

There’s no better time to start than now. Review the three things above and audit your site for them now. Set things in motion to address them now. Schedule regular audits to ensure everything is sound now, because if you aren’t doing these things, your competition probably is, and if they’re not, don’t you want to get ahead of them?

Remember: Success is neither magical nor mysterious. Go back to basics today and stick to them. Before you know it, you’ll achieve more than you’ve ever imagined possible.

See you at the top!

12 Comments

  • Justin P Lambert Mar 24, 2015

    Great overview of 3 important points here. I think the first one – duplicate content – is probably most important just because resolving it is low-hanging fruit for most of us, but it can kill us if we overlook it. But they’re all important. Thanks!

  • Houston Barnett-Gearhart Mar 25, 2015

    Justin, thanks so much for your comment. I appreciate it.

    Though we briefly discussed the “hierarchy of needs” with regards to the 3 points via Twitter, I 110% agree with you that dupe content should take priority over mobile optimization and flattening your site’s structure. Unintentional dupe content is rampant, especially when you consider that the most common CMSs for SMBs, e.g. WordPress, create an environment that can have you creating dupe content left, right, up, down, and every which way in between with you being none the wiser.

    Glad you took the time to leave a comment, Justin. Again, much appreciated. Feel free to reach out to me via email at houstonb@verticalmeasures.com if you ever have any questions you just can’t quite figure out the answer to in regards to anything I discussed in the article.

  • Michael Pomposello Mar 25, 2015

    It’s so crucial to get the client on board to prevent duplicate content on their end. They and their design companies usually mean well, but I’ve noticed that when they’re juggling a ton of moving parts, it’s important for the SEO agency to stay on top of dupe content, not just at the start of the campaign but also through multiple iterations of the site.

    Great article Houston, definitely going to keep this on hand.

  • Houston Barnett-Gearhart Mar 25, 2015

    Hi, Michael! Absolutely agree. Dupe content is one of those things that can be nipped in the bud if you’re lucky enough to partner with a client and their web developers at the outset of a new website redesign project.

    Your point about staying on top of everything through multiple iterations is spot on. In fact, I would even go so far as to say quelling dupe content should be a KPI, or at least treated like one.

    Thanks for weighing in, Michael. Your time is appreciated; your input even more so.

  • Sean Maki Mar 25, 2015

    Great overview, Houston. I totally agree that technical SEO as a whole is often neglected, and among larger sites I definitely see duplicate content and site structure as common problems needing to be addressed. Proper internal linking may fit well under site structure.

    One thing I find interesting is how easily issues compound to create big problems from a technical perspective when the fundamentals aren’t addressed: not forcing a single version of the domain/URL, missing canonicals, duplicate content issues, and messy internal linking (linking to different versions of a page, large numbers of redirects, and/or many 404s).

  • Andrew Cranmer Mar 25, 2015

    Hey Houston,

    Great blog post! I couldn’t agree more, and I thought you brought up 3 very important points. It’s so weird how nobody seems to know that mobile-friendly websites are given a higher relevancy score by Googlebot. I haven’t seen a lot of results in the earned-link category, so I need to look at creating better content to make this possible. Thanks for the tip.

    On another note, a great account structure can also give you a better quality score for keywords in your paid search campaigns. Very important when trying to lower CPCs.

    Great post!

    Andrew

  • Matt Fielding Mar 25, 2015

    The mobile-friendly update should be great for the web. It’s driving us towards a time where we can replace the term ‘responsive website’ with ‘website’. Mobile should not be an add-on!

  • Houston Barnett-Gearhart Mar 25, 2015

    Matt, thanks so much for sharing your thoughts!

    I definitely agree that the mobile-friendly algo component is a net-positive thing for the web at large. No doubt many will continue to debate the merits of responsive over adaptive web design or take this as another platform to whine about Google being a “monopoly,” but the fact remains that user experience should trump all.

    It’s why I drew the comparison between Googlebot and human beings with regard to cognitive equilibrium, and your thoughts are evidence of it: we’re being driven towards a time when you don’t refer to a website as “responsive” or “adaptive,” but simply as a “website,” because the former two are adjectives for something that should be automatic anyway.

    Thanks again, Matt! I appreciate you and the time you spent on your feedback.

  • Marco Novo Mar 25, 2015

    Great post Houston,
    Most of all, and we should not forget it, when we do SEO we are simplifying things and adding value for our site visitors. If they find great, unique content that Google can easily index; if they can access our website on all platforms, including mobile, of course; and if it’s easy and intuitive to navigate and browse our site, Google will notice and reward us. So, simplifying the user experience and giving visitors good reasons to stay, share, and come back to our site is a way of doing SEO that isn’t rocket science.

  • Rachel Gray Mar 25, 2015

    I love how this topic is introduced. It is important to make it clear that SEO is not a mystery, but a puzzle that is constructed.

  • Houston Barnett-Gearhart Mar 25, 2015

    You nailed it, Marco! A draft of this article actually had a section dedicated to showcasing how best practices in information architecture (IA) are finally being adopted by SEOs. If you think about it, this makes absolute sense. According to IA consultant Peter Morville, the purpose of IA is to help users understand where they are, what they’ve found, what’s around, and what to expect.

    Hm… sounds a lot like the goal of web crawlers, except there’s an element of speed to the entire process.

    Anyways, you’re exactly on point. Thanks for your input, Marco. Always appreciated!

  • Jordan White Mar 26, 2015

    This article is great.

    We consult for various SEO & SEM agencies in Southern California. We commonly see these agencies telling their clients what they need to do to improve their overall rankings.

    The challenge we see resurfacing for the majority of the agencies we work with is their inability to connect the “Why” for their clients. Put simply, marketers need to understand that their prospects never buy WHAT they do but rather WHY they do it. To have the greatest impact on each prospect, agencies must align their WHY with the prospect’s WHY.

    So how does this all tie into this particular article?

    Houston really bridges the gap between the duties agencies perform and WHY those duties are important to a prospect.

    – I know I need original content, but WHY does that lead to more business for me?

    – I know I need to upgrade my site, but WHY do I need to do it now?

    This article definitely hammers home the key fundamentals that average agencies neglect, while also explaining the WHY, HOW & WHAT.

    Great article Houston! I look forward to reading your future posts.

    Jordan White