Supplemental Results were a mechanism that Google once used to weed out web pages it felt were not worthy of ranking well. It did not work very well, and perhaps millions of web pages were dumped into a secondary database, frequently for reasons that were not apparent. It is now an outdated concept, but we explain what it was here for those not familiar with the issue.
In August of 2007 Google finally found a way to resolve the supplemental results issue that I describe below. Rather than rectify the issue with their algorithm that was apparently dumping millions of web pages into their supplemental results database, they simply removed the indicator that identified those pages in their index. That does not mean that the supplemental results database is gone, nor does it mean that pages deemed less worthy will now start appearing in searches. All they did was remove what had become a valuable tool for identifying problem pages in a web site. The result is that you will no longer see the supplemental results indicator next to problem pages when you list all of a site’s pages in Google’s index.
I personally felt that the indicator was useful for identifying pages that had little chance of ranking well. Google had also gotten much better at refreshing the supplemental results database regularly, so once the issues with a web page were resolved, it moved back to the main database much more quickly.
Have you noticed your traffic from Google dropping off rapidly over the past several months? Your web pages may be getting penalized and could slowly be getting banished to Google’s Supplemental Result index. Google’s Supplemental Result index is a secondary database used to segregate Web site pages that Google feels may not meet its standards. Although the Supplemental Result penalty has been around since the summer of 2003, during the last few Google updates it appears that literally millions of Web sites have been hit with this penalty.
You can check to see if Google has moved any of your site’s pages into the Supplemental Result index by using the “site:” query operator to view the page URLs that Google has indexed for a site. Just enter the following in a Google search box, substituting your site’s domain name in place of mydomainname.com: site:mydomainname.com
This query will display all of a site’s pages that are in Google’s index. The same query works with Yahoo and MSN, but neither of those search engines flags penalized Web pages with an identifier.
Page through all the URL listings that are displayed. Most often, the Supplemental Result pages are listed toward the end. You are very fortunate if you do not find any pages flagged as a Supplemental Result. Many sites have only a few pages flagged, but a growing number of sites have almost every page banished to Supplemental Results hell. If you see the dreaded words Supplemental Result to the right of a web page’s URL, that page has been downgraded to second-class status by Google. These pages will rarely, if ever, show up in Google’s search results.
So what is a Supplemental Result?
The Supplemental Result database contains pages that Google has essentially removed from their main, searchable database, which means they do not show up in normal search results. It is a penalty applied to individual pages. These pages only show up in search results when there aren’t very many good search matches in the primary search database.
Did I do something wrong with my site?
Maybe. We have found a number of issues that appear to trigger this penalty.
- Lack of content on a page. Images do not contain content, and links are not content. Every page in a Web site, especially the home page, needs content: preferably at least several hundred words of informational text. Text embedded in images is unreadable by search engines and therefore is not content. Text delivered via Flash is also not content, because search engine spiders typically cannot read it.
- Duplicate content. Make sure that every page in a web site contains unique content and every page is represented by one and only one URL. If you have duplicated the same pages in a site under different URLs or are repeating pages of content, the Web pages will likely be flagged as a Supplemental Result. Do not copy articles from other sites. It doesn’t matter if the article came from another one of your sites, or a news site, or a free articles site. This is all duplicate content and any page that duplicates the content from another page can be penalized.
- Content from another site in HTML frames or iframes. If your site is pulling in content from another site using these techniques, a search engine can see that the domain name is different and therefore knows that the content does not originate on your site. This is fairly common with real estate sites and turnkey mortgage lending sites. Make sure that the content is unique and physically exists on your web site.
- Affiliate marketing links. If a Web page contains affiliate links that are redirected or run through an affiliate network server, such as Commission Junction, ShareASale, Linkshare or others, the page may get penalized. This Top Rank Solutions web site was banished to Supplemental Results for a brief period because we used to have an Amazon affiliate book store in a directory. More on this later.
- Obsolete pages. Rather than removing page URLs that no longer exist from its database, as it is supposed to do, Google frequently moves these URLs into Supplemental Results for a period of time. Why? Who knows. If a page is gone, its URL is supposed to be removed once a proper status code 404 (Page Not Found) is returned. At the time of the last update to this article, we still had thousands of non-existent web pages showing up in Google’s index. Each of these pages is listed as a Supplemental Result, even though they have been gone from the site for more than six months.
- Link pages. Pages that contain nothing but links to other sites are being flagged. These pages are typically used for reciprocal linking programs, which Google also does not like. If you wish to include a link page or even a site map page in your site, add some text next to each link to break up the set of links. Better yet, get rid of pages that only contain links to other sites, especially if these links are used for reciprocal linking.
- Links to Bad Neighborhoods. Be careful who you link to. Sites that are penalized by Google are considered to be Bad Neighborhoods. Given the chaotic and apparently random nature of Google’s most recent penalties, it can be very difficult to tell if another site has been penalized. Do not engage in link exchange programs with unknown Web sites and avoid linking to sites that are unrelated to the theme or industry that your site represents. The theory behind Google’s linking penalty states that while you do not always have control over who links to your site, you do have control over the sites that you link to. A site therefore inherits part of a penalty when it links to a penalized site. Ideally, you want to have all of the outgoing links on your site point to industry-related sites and have all of the inbound links to your site come from industry-related sites.
- Duplicate HTML Title tags, Description meta tags and Keyword meta tags. Some Web sites use the same HTML title tag on every page. The HTML title tag is supposed to indicate the content theme for that specific page. Spiders use that information to help determine which keywords to use to represent the page in search results. Given that the content on each page is supposed to be unique, each title tag should also be unique. It is more common for site owners to use a standard combination of a description and keyword meta tag throughout a site. It might be a good idea to try to make each of these tags unique wherever possible.
- Old content. There is some speculation that pages with old content that has not changed for several years are being tossed into the Supplemental Result database. That is a possible factor, but appears to be unproven thus far.
Nonetheless, it is a good idea to periodically update the content on each page in a site. This tends to stimulate spidering activity.
- You’ve done nothing wrong. We’ve seen numerous instances where sites with absolutely unique content that does not appear to violate any of Google’s known guidelines or rules are getting tossed into Supplemental Results. Many of these are small e-commerce sites. If the site contains product descriptions that are mere duplicates of the manufacturer’s product descriptions, then a duplicate content penalty is likely the culprit. If the product pages contain little or no content, then that could be the reason for the penalty. But numerous sites that have been penalized have significant amounts of unique content. It is very likely that a bug in Google’s algorithm is to blame.
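Several of the checks in the list above can be automated. The sketch below is a hypothetical example using only Python’s standard library: it flags thin pages by visible word count, flags pages that share the same HTML title tag, and lists removed pages that do not return a proper 404 status. The page inputs, the 300-word threshold, and the function names are illustrative assumptions, not anything Google has published.

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collects a page's <title> text and its visible body text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self._skip = 0          # nesting depth inside <script>/<style>
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif not self._skip:
            self._text.append(data)

    def word_count(self):
        return len(" ".join(self._text).split())

def audit(pages, min_words=300):
    """pages: dict mapping URL -> HTML source. Returns URL -> list of issues."""
    findings, titles = {}, {}
    for url, html in pages.items():
        parser = PageAudit()
        parser.feed(html)
        issues = []
        if parser.word_count() < min_words:
            issues.append("thin content")          # too little visible text
        titles.setdefault(parser.title.strip(), []).append(url)
        findings[url] = issues
    for urls in titles.values():
        if len(urls) > 1:                          # same <title> on several pages
            for url in urls:
                findings[url].append("duplicate title")
    return findings

def check_removed(url_status):
    """url_status: dict mapping removed-page URL -> observed HTTP status code.
    Returns the URLs that should return 404 but do not."""
    return [url for url, code in url_status.items() if code != 404]
```

Running audit() over a crawl of your own pages, and check_removed() over the status codes served for deleted URLs, gives a quick shortlist of pages worth fixing before they drift into the supplemental index.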
Why does Google have a Supplemental Result penalty?
It doesn’t appear that Google views it as a penalty. That is evident from Matt Cutts’ video, which you will find in the resource links at the end of this article. But to a site owner whose Web pages no longer appear in Google’s search results, it’s a penalty. One theory is that Google needed to trim many of the billions of pages in its index and is only keeping the sites and pages that it feels are most relevant. Another theory is that they are trying to drive site owners to Google AdWords advertising. If you are the owner of a small e-commerce site, the latter reason makes sense. In effect, it leaves small site owners with little choice, because top listings in Yahoo and MSN combined do not drive the level of traffic that a top Google ranking does.
So How Do I Get My Pages Out of Supplemental Results?
The GoogleBot spider does periodically crawl pages stuck in Supplemental Results, but it is a different spider and it does not visit these pages very often. It is possible, however, to have pages returned to normal search results.
Most of this Web site was stuck in Supplemental Results for a brief period. It started with an Amazon.com book store that was located in a subdirectory. The store focused on SEO and Web design books and contained affiliate links to Amazon. We noticed that book store pages toward the end of the site query listings were being flagged. Being aware of the issue with affiliate marketing pages, we thought the penalty would just stop with these pages. It did not. The flagging increased weekly and spread like a virus until it began to penalize all of the 100% original content pages found in the main portions of the site. We quickly removed the book store and filed a reinclusion request with Google. Within about a week, almost all of our original content pages were removed from Supplemental Result hell and Google traffic began to flow in once again.
It is difficult to tell if our reinclusion request or the prompt actions to remove the store resolved the issue. Technically speaking, the reinclusion request should only be used if an entire site is removed from Google’s database. I suspect that the prompt action led to the recovery.
At the time of this writing, several thousand pages from the old Amazon store still linger in the supplemental database, as do a few original content pages from the main section of the site. No one outside of Google completely understands the Supplemental Result issue. We do recommend that you take swift action to identify and remove any potential problems as soon as you see any pages in your web site getting flagged.
Resources: Matt Cutts is a Google quality engineer who works as a liaison with the SEO industry. Sometimes his answers and recommendations are very straightforward, but they can also be a bit evasive and cryptic. Nonetheless, it is worthwhile to read what he writes and watch him address issues in his videos. I think he is basically telling us all that he can about an issue without giving away Google trade secrets.