Expert SEO Interview with Scott Hendison

May 25th, 2011 • By: Elise Redlin-Cook • Expert Interviews


This week, I met (virtually) with Scott Hendison, the CEO of Search Commander Inc. and the creator of SEO Automatic. Based in Portland, Oregon, Scott was a computer retailer from 1997–2002 and built a successful computer repair business through PPC and organic search. This early entry into search marketing earned his business front-page search results for many phrases that remain entrenched at the top of the SERPs to this day. I picked his brain with some questions on SEO and social media optimization.


Elise Redlin-Cook: What are the most important steps a webmaster should take to ensure a website is properly optimized for search?

Scott Hendison: I strongly believe every business should have their own Google Analytics account and their own Google Webmaster Tools account, and that those should be used for site measurement. I see far too many businesses where one or both are missing, or where the code is under someone else’s Google account, usually that of a former webmaster, SEO firm, or employee. The first thing I recommend for anyone is to get those in place.

Unless I’m using a tool, the first thing I usually look at is a site:domain.com search in Google to see how my pages are indexed, and whether or not they have unique and relevant title tags and description tags. If they don’t, then that’s probably the first thing to pay attention to. As far as “on page” optimization goes, the site recommendations in Google Webmaster Tools show you what THEY say is important, and I’d say to take care of those. It’s interesting that WMT points out description tag duplicates, even though they’re not a ranking factor.
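[Editor’s note: here’s a minimal sketch of that same title/description check scripted in Python. This is my own illustration, not Scott’s tool; it assumes the requests and beautifulsoup4 libraries, and the URLs are placeholders.]

```python
# Rough sketch: fetch a few pages and report their title and meta
# description tags. Assumes `pip install requests beautifulsoup4`;
# the URLs below are placeholders, not a real site.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/",
    "https://example.com/services",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(missing)"
    meta = soup.find("meta", attrs={"name": "description"})
    desc = meta.get("content", "").strip() if meta else "(missing)"
    print(f"{url}\n  title:       {title}\n  description: {desc}")
```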

For more in-depth on-page analysis, we developed a URL checker tool that examines 18 on-page ranking factors. It covers some SEO 101 stuff and a few more advanced options, including instructions on how to fix many of the issues yourself.

Elise: Great tip! When performing an SEO audit, what are the most common problems that you encounter?

Scott: Invariably, when looking at the average Joe’s website, we find lots of duplicate or missing title tags, duplicate or missing description tags, and duplicate and overstuffed keyword meta tags. We also usually find very little (or improper) use of H1 and H2 tags, bold text, and bullet lists, which I believe can help rankings. Admittedly I’m in the minority among SEOs who think so, but I DO believe they make it easier for the reader to understand what a page is about and provide a more enjoyable page view, where they can scan before reading in depth.
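[Editor’s note: a hedged sketch of how you might flag those duplicates yourself, with made-up placeholder data standing in for a real crawl.]

```python
# Illustrative only: flag missing and duplicate title tags across a set
# of pages. `page_titles` is placeholder data standing in for a crawl;
# the same approach works for description tags.
from collections import Counter

page_titles = {
    "/":         "Acme Widgets | Home",
    "/about":    "Acme Widgets | Home",    # duplicates the homepage title
    "/services": "Services | Acme Widgets",
    "/contact":  "",                       # missing title
}

counts = Counter(t for t in page_titles.values() if t)
for path, title in page_titles.items():
    if not title:
        print(f"MISSING title: {path}")
    elif counts[title] > 1:
        print(f"DUPLICATE title: {path} -> {title!r}")
```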

Logically, if I were Google, all things being equal, I’d rank an easy-to-understand page higher than a poorly formatted one with no headlines or subheads. (But then what do I know? I still believe that keyword meta tags could come back some day!)

We also almost always find some internal links to missing pages that give 404 errors, and very frequently find lots of 302 redirects in place when a 301 redirect would’ve been the right choice.
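[Editor’s note: both problems are easy to spot-check with a short script before reaching for a full crawler. A sketch, with placeholder URLs and the requests library assumed:]

```python
# Sketch: report broken links (404) and temporary redirects (302) that
# probably should have been permanent (301). URLs are placeholders.
import requests

urls = [
    "https://example.com/old-page",
    "https://example.com/moved-product",
]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "?")
    if resp.status_code == 404:
        print(f"404 broken link:        {url}")
    elif resp.status_code == 302:
        print(f"302 (consider a 301):   {url} -> {location}")
    elif resp.status_code == 301:
        print(f"301 permanent redirect: {url} -> {location}")
```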

Assuming that the person auditing the site knows what they’re looking for, then hands down, the best tool I know of for spotting and filtering ALL of these things is from Screaming Frog.

Elise: What are the most important local search sites that a business should get listed on today?

Scott: The benefits of linking on the local specialty sites will always vary by industry and geographic location, but as a rule of thumb, Google, Yahoo, and Bing are obviously at the top of the list, followed by Localeze, Info USA, and Acxiom. The best resource I know of for local search is Get Listed.

Elise: As you know, Google recently made a massive change to its algorithm, called the Panda update. Would you like to discuss what this means for the average website seeking rankings?

Scott: Actually, to the average website, it probably doesn’t mean a whole lot, since most websites really weren’t affected. This change only affected 12% of search results, and unless you’re in the minority who saw huge traffic drops, or who now see other websites ranking for scraped copies of their content, you really shouldn’t pay it much attention. Continue to do what you know to be right, and what the Google Police are preaching: create good content in a search-friendly environment with basic SEO best practices in place.

Elise: If a website has been hurt by poor SEO, how can a good SEO company help restore a site’s ranking and reputation? Does your company fix a lot of sites that have been harmed by bad SEO companies?

Scott: Well, I’d argue that most websites aren’t hurt by poor SEO, and that a “good SEO company” is a subjective term. What you, I, and a business owner down the street might call “good SEO” are likely VASTLY different things.


Perhaps to you, a good SEO is someone who never pushes the envelope of Google’s Webmaster Guidelines and is 100% “white hat”, maybe even to the point of outing other SEO activity that they think is “crossing the line”.

To me, however (and often to a site owner), a “good SEO” is one who can actually deliver results and make the site owner money, by recognizing that Google’s guidelines are not laws, and by outlining any and all risks throughout the process. It’s my contention that there are plenty of markets where a self-proclaimed white hat “good SEO firm” simply has no chance to compete.

And for the second part of the question; no, we’ve not fixed many sites that have been harmed by bad SEO companies… mostly just ones that we’ve screwed up on our own.

Elise: LOL! So, what are some of the common obstacles with large retailer web sites when it comes to SEO?

Scott: Technically, the biggest obstacles seem to be ensuring that the site is crawled deeply and without duplicates, and that the e-commerce software is flexible enough to implement the SEO recommendations.

For the SEO firm, however, the biggest obstacle is often that many of their most fundamental recommendations can take weeks or months (or forever) to be implemented. This is due either to technical limitations (i.e., the shopping cart sucks) or, even more often, to personnel issues (i.e., some people are just lazy and unaccountable).
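[Editor’s note: on the duplicate-crawl point, one concrete check you can script yourself is whether a parameter-laden URL declares a rel=canonical pointing at the clean version. A sketch under the same assumptions as above, not specific to any shopping cart:]

```python
# Sketch: does this page declare a rel="canonical" link, and where does
# it point? Faceted-navigation URLs like this one often spawn duplicates.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/widgets?sort=price&page=2"   # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
canonical = soup.find("link", rel="canonical")

if canonical is None:
    print("no canonical tag: parameter variants may be indexed as duplicates")
else:
    print(f"canonicalizes to {canonical.get('href')}")
```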

Elise: Do you think it’s worth optimizing content for search within social media sites?

Scott: Sure, because you simply have to consider the audience. Experts have been speaking for years about the different demographics of the top users of different social networking and bookmarking services, from Digg to MySpace, and you do have to know your market. Often, though, it doesn’t really take all that much effort, and it will happen naturally.

For example, if you have a technical white paper you want to socialize and share on Facebook, then yeah, you’d better cut it down a bit and have some bullet points, as well as an interesting hook. Want to share that same thing via Twitter? You’ll have to cut it down to a compelling 140 characters MINUS the number of characters in your username for potential Retweet value.
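[Editor’s note: the arithmetic is trivial but worth making explicit. A toy calculation around the manual “RT @username:” convention; the handle is a placeholder.]

```python
# Toy calculation: characters left for your message once you reserve room
# for an "RT @username: " prefix on a manual retweet. Placeholder handle.
TWEET_LIMIT = 140

def retweetable_budget(username: str) -> int:
    prefix = f"RT @{username}: "
    return TWEET_LIMIT - len(prefix)

print(retweetable_budget("yourname"))  # "RT @yourname: " is 14 chars -> 126 left
```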

Before you ask “who would create multiple versions of the same content to appeal to different demographics?” I’d point out that the exact same news story reported on NBC or Fox or MTV would usually be pretty different…

Elise: Any comment on how powerful/useful Google’s +1 button is going to be for social and SEO?

Scott: Backlinking has become so abused and manipulated, and “on page” SEO can only take you so far, so I think things like Google’s +1 button are going to continue to matter more and more.

Currently the +1 is an opt-in only tool in Google Labs, and I did opt in on the first day, but I’ve not used it much. Just like all sorts of social signals, I think it’s going to weigh more and more heavily on rankings while at the same time creating more and more opportunities for abuse.

In order to combat the abuse, I’d guess we’ll likely see a continued expansion of evaluating trust factors, in other words, it’s WHO is doing the voting that matters most. In the offline world, you hear expressions like “location is everything” and “it’s not what you know, but who you know that counts” because both of those things build trust.

I think that Google’s algorithm is likely taking into account things like bookmarks, their own +1 votes, Twitter retweets (and someday maybe Facebook “Likes”), and more, all as “ranking factors”, but just like backlinks, it won’t simply be volume that counts as much as the trust factor: looking at the sources that those “votes” are coming from. That’s what I’d do if I were Google.
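[Editor’s note: purely as an illustration of that volume-versus-trust distinction; all weights below are invented for the example, not anything Google has published.]

```python
# Toy model of trust-weighted voting: a few votes from trusted sources
# outweigh a flood of votes from unknown ones. All numbers are made up.
trust = {
    "established_news_site": 0.9,
    "industry_blog":         0.6,
    "brand_new_account":     0.01,
}

votes = [
    ("established_news_site", 3),    # a few votes from a trusted source
    ("industry_blog",         5),
    ("brand_new_account",     200),  # a flood of low-trust votes
]

score = sum(trust[source] * count for source, count in votes)
print(f"trust-weighted score: {score:.1f}")  # 2.7 + 3.0 + 2.0 = 7.7
```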

Elise: Good advice, Scott! Thanks for taking the time to answer my questions!

Elise Redlin

Elise is the Content & Marketing Manager at Vertical Measures, an internet marketing company in sunny Arizona providing services ranging from content marketing to social media marketing, link building, and advanced SEO. She has fully immersed herself in the world of content marketing and content strategy and is the managing editor of this blog. +Elise Redlin


