Halloween is the time of year when people dress up as ghosts, ghouls, goblins and other malevolent creatures that patrol the streets in search of candy and a good scare or two. In the SEO world, terrors lurk around every corner of the web all year long – pitfalls that can cause your site to fall in the rankings or, even worse, drop from the SERPs altogether. While the spooky side of Halloween lasts just one night, a penalty from a search engine can affect you for years. Here are the top 10 SEO problems an Internet marketing services company can help you prevent this Halloween:
Cloaking is a black hat search engine optimization technique in which the content presented to the search engine spider is different from that presented to the user’s browser. This is done by delivering content based on the IP address or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page. The purpose of cloaking is to deceive search engines so they display the page when it would not otherwise be displayed.
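To make the mechanism concrete, here is a minimal sketch of how a cloaking script decides what to serve – purely illustrative (the function names and User-Agent fragments are our own), and shown as an example of what search engines penalize, not something to deploy:

```python
# Illustrative sketch of the cloaking mechanism described above.
# All names here are hypothetical; this shows how the technique works,
# not a recommendation to use it.

SPIDER_SIGNATURES = ("googlebot", "bingbot", "slurp")  # common crawler User-Agent fragments

def is_search_spider(user_agent: str) -> bool:
    """Return True if the User-Agent header looks like a search engine crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in SPIDER_SIGNATURES)

def select_page(user_agent: str) -> str:
    """A cloaking server returns different content to spiders than to visitors."""
    if is_search_spider(user_agent):
        return "<html>keyword-rich page shown only to crawlers</html>"
    return "<html>the page human visitors actually see</html>"
```

Because the visible page and the indexed page differ, a manual reviewer (or a crawl from an undisclosed IP) can catch the deception easily – which is exactly why the penalty risk is so high.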
From a search engine spider’s standpoint, a broken link is a dead end. If the missing page returns a 404 error, the search engine identifies the page as non-existent and takes note of the pages that link to it. A page that continues to link to 404 errors for too long, or that links to too many of them, will in all likelihood see a negative effect on its perceived “Quality Score.”
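Catching these dead ends before a crawler does is straightforward. Here is a minimal sketch using only the Python standard library, assuming you have already gathered the links you want to test into a list:

```python
# A minimal broken-link check, assuming the URLs to test are already
# collected in a list. Uses only the standard library.
import urllib.request
import urllib.error

def link_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for a URL (404 means a dead end for crawlers)."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 for a missing page

def find_dead_links(urls, status=link_status):
    """Return the subset of URLs that respond with 404."""
    return [u for u in urls if status(u) == 404]
```

The `status` parameter is there so the filter can be tested without network access; in day-to-day use you would simply call `find_dead_links(my_urls)`.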
Keyword stuffing is considered an unethical search engine optimization technique. It occurs when a web page is loaded with keywords in its meta tags or its content. The repetition of words in meta tags may explain why many search engines no longer use those tags. Keyword stuffing was once used to obtain maximum search engine ranking and visibility for particular phrases, but the method is completely outdated and adds no value to rankings today.
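A quick sanity check on your own copy is to measure keyword density. The sketch below does this with a simple word count; the 5% threshold is our own illustrative assumption, not a published search engine rule:

```python
# A rough keyword-density check; the 5% threshold is an illustrative
# assumption, not a published search engine rule.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are the keyword (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.05) -> bool:
    """Flag copy whose keyword density exceeds the (assumed) threshold."""
    return keyword_density(text, keyword) > threshold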
Paid links can land you in a lot of hot water with the search engines, so be smart about how you handle them. The kind of link buying Google has a distaste for is links exchanged directly for cash. Modify your way of thinking just a little, though, and there is a wide array of easy-to-obtain, high-value links out there for you to get. The key to keeping a low risk profile is to make the link appear indirect.
Hiding text from the visitor is done in many different ways. Common techniques include coloring text to blend with the background, CSS z-index positioning that places text “behind” an image – and therefore out of the visitor’s view – and CSS absolute positioning that pushes text far from the page center. By 2005, many invisible-text techniques were easily detected by the major search engines. “Noscript” tags are another way to place hidden content within a page. Although this is a valid optimization method for displaying an alternative representation of scripted content, these tags may be abused, since search engines may index content that is invisible to most visitors.
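Before an audit or a redesign goes live, you can scan your own templates for the most obvious of these patterns. This is a crude string-matching sketch – real crawlers render CSS and catch far more, and the pattern list here is our own partial example – but it catches careless inline styles:

```python
# A crude scan for the hidden-text patterns mentioned above. Real crawlers
# render CSS; this string-matching sketch only catches obvious inline cases.
import re

HIDDEN_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"text-indent\s*:\s*-\d{3,}px",   # text pushed far off-screen
]

def flag_hidden_text(html: str):
    """Return the suspicious inline-style patterns found in the page source."""
    return [p for p in HIDDEN_PATTERNS if re.search(p, html, re.IGNORECASE)]
```

Matches are not proof of abuse – `display:none` has many legitimate uses in scripted pages – so treat any hit as a prompt to review the element by hand.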
When a website goes offline, the impact on your SEO can be bigger than you think. If Google accesses your website while it is down, Google could classify your site as untrustworthy. The more often your site is offline, the less Google will trust it, and the lower it will position you in the listings.
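The first line of defense is simply knowing when you are down. Here is a minimal uptime probe, assuming you run it on a schedule such as cron; note that during planned maintenance, serving a 503 status (rather than a 200 or 404) is the conventional way to tell crawlers the outage is temporary:

```python
# A minimal uptime probe, assuming you run it on a schedule (e.g. cron).
# During planned maintenance, serving 503 rather than 200/404 is the
# conventional way to tell crawlers an outage is temporary.
import urllib.request
import urllib.error

def probe(url: str, timeout: float = 10.0) -> int:
    """Return the site's HTTP status code, or 0 if it is unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code
    except (urllib.error.URLError, OSError):
        return 0

def is_healthy(status: int) -> bool:
    """200-range responses count as up; 503 signals temporary maintenance."""
    return 200 <= status < 300
```

A run of unhealthy probes is your cue to act before the crawlers draw their own conclusions.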
A link farm is any group of websites that all link to every other site in the group. Although some link farms can be created by hand, most are created through automated programs and services. A link farm is a form of spamming a search engine’s index. Search engines recommend that webmasters pursue relevant links to their sites (conduct a link campaign) but avoid participating in link farms. According to Google, a site that participates in a link farm may have its search rankings penalized. Links from related sites carry more weight than those from irrelevant sites.
In truth, search engines don’t like to crawl websites with too many parameters in the URL. Search engine software engineers have a considerable amount of search data: they recognize URL patterns that are potentially problematic, and content management systems often generate exactly those URLs. Additionally, search engines limit the number of characters they’ll crawl in a URL. This is due partly to known problems in URL structures and partly to website usability.
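One practical countermeasure is to prune the query string down to the parameters that actually select content. The sketch below does this with the standard library; the whitelist is a hypothetical example for an imagined product catalog, so substitute your own CMS’s meaningful parameters:

```python
# A sketch of parameter pruning: keep only the query parameters that
# actually select content, dropping session/tracking noise. The whitelist
# is a hypothetical example for an imagined product catalog.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

KEEP_PARAMS = {"id", "page"}  # assumption: only these affect page content

def canonicalize(url: str) -> str:
    """Rebuild a URL keeping only whitelisted query parameters."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in KEEP_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
```

Serving (or at least declaring, via a canonical link element) one clean URL per page keeps crawlers from burning their budget on session IDs and tracking parameters.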
Bad link neighborhoods, to search engines, are typically identified by spammy on-page ‘SEO’ techniques and dubious backlink and interlink profiles. They also refer to the dreaded PPC sites: porn, pills and casinos. You do not want to link to these neighborhoods, because who you link to matters. By linking to them, your site can be grouped with these less-than-desirable sites, which can in turn lower your rankings.
Keyword dilution is a very common SEO mistake. Each page of a website should target one to three keyword phrases at most – and even fewer if the phrases are extremely competitive. Targeting too many keywords on a page detracts from its overall topical relevance and reduces the importance of your REALLY important keywords.
This entry was posted on Tuesday, October 12th, 2010 and is filed under Search Engine Optimization.