Determining why a Web site will not rank well can be a complex issue. It can take several days to completely diagnose the possible range of problems with an average-size Web site. Part of the issue is that proper search engine optimization techniques are not taught in Web design classes, so many designers unwittingly build search engine barriers into their Web site designs. We still hear from designers who were taught that meta tags are the key to better search engine rankings. Techniques like that may have worked in the 1990s, but they have been hopelessly ineffective since then.
Two major components of good search engine rankings
First, all the issues that can lead to search engine penalties must be identified and corrected. Many Web sites that do not rank well have accumulated penalties, each of which reduces the site's natural rankings. We've seen a simple issue, such as linking to a banned Web site (a site removed from a search engine's index), reduce a site's positions in Google by 30 to 50 places. Other issues, such as hidden text, hidden links, or spammy, keyword-stuffed content, can have a similar effect. With the rise of Supplemental Result penalties, a number of issues can cause individual pages to be all but banished from Google's search results. A large percentage of penalties stem from techniques that Webmasters, designers, and developers have commonly used for years. These techniques are often quite logical and not intended to cheat search engines in any way, but they nonetheless violate a search engine guideline and are therefore subject to possible penalization.
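Some of these issues are detectable in a page's own markup. As a minimal, illustrative sketch, the Python script below flags text wrapped in elements carrying common "hiding" inline styles. The style patterns and void-element list here are our own assumptions for the example, not a definitive list of what any search engine checks, and real pages can hide text in many other ways (external stylesheets, off-screen positioning, text colored to match the background).

```python
# Heuristic scan for hidden text in raw HTML (illustrative only).
import re
from html.parser import HTMLParser

# Inline-style patterns commonly used to hide content (assumed list).
HIDDEN_STYLE = re.compile(
    r"display\s*:\s*none|visibility\s*:\s*hidden|font-size\s*:\s*0", re.I
)

# Elements that never take a closing tag; skipped to keep the stack balanced.
VOID = {"br", "img", "hr", "input", "meta", "link"}

class HiddenTextScanner(HTMLParser):
    """Collects text that appears inside an element with a hiding style."""

    def __init__(self):
        super().__init__()
        self.stack = []      # one flag per open tag: True if its style hides text
        self.findings = []   # hidden text fragments found so far

    def handle_starttag(self, tag, attrs):
        if tag in VOID:
            return
        style = dict(attrs).get("style") or ""
        self.stack.append(bool(HIDDEN_STYLE.search(style)))

    def handle_endtag(self, tag):
        if tag in VOID:
            return
        if self.stack:
            self.stack.pop()

    def handle_data(self, data):
        # Text is "hidden" if any enclosing element carries a hiding style.
        if any(self.stack) and data.strip():
            self.findings.append(data.strip())

def find_hidden_text(html):
    """Return text fragments inside elements styled to be invisible."""
    scanner = HiddenTextScanner()
    scanner.feed(html)
    return scanner.findings
```

For example, `find_hidden_text('<div style="display:none">secret keywords</div>')` returns `['secret keywords']`. A real audit tool would also resolve external CSS and computed styles; this sketch assumes well-formed markup and inline styles only.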
The best way to identify these issues is with a proper and thorough Web site evaluation. A good site review examines more than 100 factors that either help or hurt a site's search engine performance. Resolving the issues it uncovers puts a Web site on the path to better rankings and higher levels of free traffic from search engines.