Google’s new patent
Google has published a new patent that reveals more about its ranking factors. The document describes factors that Google may already be using in its algorithm and others that could be implemented in the short or long term.
Until now, the famous sandbox, which acts as a penalty for new websites, has never been officially confirmed, although most SEO specialists have noticed that almost no website can rank well on Google during its first 6 to 12 months, especially in competitive markets. The patent refers to a document inception date, such as the domain registration date, for a site or a single page. Google recently became an accredited domain registrar, so it can easily query ICANN registration records and feed historical data into its algorithm. For example, a site whose domain is registered for only one year could be penalized, because such domains are more likely to be spam attempts. Google probably already uses some historical data as a factor: since the Florida update, most new sites have been unable to rank well in the SERPs for up to a year.
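As a rough illustration of how a registration date could feed into a ranking score, the sketch below computes a simple trust weight that ramps up over a domain's first year. The function, its parameters, and the linear ramp are all hypothetical; the patent does not disclose any formula.

```python
from datetime import date

def domain_age_weight(registration_date: date, today: date, ramp_months: int = 12) -> float:
    """Hypothetical trust weight that grows with domain age.

    A brand-new registration gets a low weight; the weight ramps up
    linearly and is capped at 1.0 once the domain is older than
    `ramp_months`. This only illustrates how a registration date could
    feed a ranking score, not Google's actual formula.
    """
    age_months = (today.year - registration_date.year) * 12 + (today.month - registration_date.month)
    if age_months <= 0:
        return 0.0
    return min(age_months / ramp_months, 1.0)

# Example: a site registered 3 months ago vs. one registered 3 years ago.
print(domain_age_weight(date(2005, 1, 10), date(2005, 4, 10)))  # 0.25
print(domain_age_weight(date(2002, 4, 10), date(2005, 4, 10)))  # 1.0
```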
Another factor is the frequency of document changes over time. New pages, content updates, and the ratio of new to old documents would be given more importance, which probably means a site is expected to develop its content at a reasonable rate from the first time Google crawls it.
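The sketch below shows one way such a signal could be approximated: the ratio of recently added pages to all known pages on a site. The time window, the scoring, and the input format are assumptions made for the example, not anything stated in the patent.

```python
from datetime import date, timedelta

def content_growth_score(page_crawl_dates: list[date], today: date,
                         fresh_window_days: int = 90) -> float:
    """Hypothetical score for how steadily a site adds or updates content.

    Takes the first-crawl date of each page and returns the share of
    pages first seen within the last `fresh_window_days`. A site that
    never changes scores 0.0; a site whose content is entirely brand
    new scores 1.0. Purely illustrative.
    """
    if not page_crawl_dates:
        return 0.0
    cutoff = today - timedelta(days=fresh_window_days)
    fresh = sum(1 for d in page_crawl_dates if d >= cutoff)
    return fresh / len(page_crawl_dates)

pages = [date(2004, 6, 1), date(2004, 11, 15), date(2005, 3, 20), date(2005, 4, 2)]
print(content_growth_score(pages, date(2005, 4, 10)))  # 2 of 4 pages are recent -> 0.5
```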
Historical data also comes into play when determining how fresh the incoming links to a website are and whether the trend is toward more or fewer of them. Over time, the search engine could track changes in anchor text to deliver fresher results that reflect a page's current link popularity.
Static links would be given more weight than links that change constantly: not only the number, relevancy, and quality of links matter, but also whether the links stay in place with the same anchor text. Blog and forum spam would therefore become largely ineffective, since those links keep moving with every new post or thread.
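To make the link-history idea above concrete, here is a hypothetical per-link weight that combines a freshness decay with a penalty for anchor-text changes. The decay curve and the penalty values are invented for illustration only and are not taken from the patent.

```python
from datetime import date

def link_weight(first_seen: date, anchor_changes: int, today: date,
                half_life_days: float = 365.0) -> float:
    """Hypothetical weight for one inbound link.

    Combines two ideas discussed above: a freshness component that
    decays as the link ages, and a stability component that rewards
    links whose anchor text has stayed the same. Both parts are
    made-up examples, not Google's actual weighting.
    """
    age_days = max((today - first_seen).days, 0)
    freshness = 0.5 ** (age_days / half_life_days)   # halves every `half_life_days`
    stability = 1.0 / (1.0 + anchor_changes)         # unchanged anchor text keeps full weight
    return freshness * stability

# A year-old link with stable anchor text vs. a newer link whose anchor
# text has already changed twice (e.g. a forum signature bumped around).
print(link_weight(date(2004, 4, 10), 0, date(2005, 4, 10)))  # 0.5
print(link_weight(date(2005, 2, 10), 2, date(2005, 4, 10)))  # ~0.30
```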
Regardless of exactly how Google integrates these changes, the click-through rate factor is probably the most controversial. A document that attracts a higher click-through rate would be given more weight in the SERPs, a technique Google already uses for its PPC engine. Yet the SERP shows only the page title and the meta description or a snippet of the content, and a click based on that partial information says little about the page's actual relevancy or the searcher's satisfaction.
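For illustration only, the sketch below shows how a click-through rate could be turned into a bounded ranking multiplier. The baseline CTR, the cap, and the whole formula are assumptions; nothing is known about how Google would actually weight clicks.

```python
def ctr_boost(impressions: int, clicks: int, baseline_ctr: float = 0.10,
              max_boost: float = 1.25) -> float:
    """Hypothetical multiplier derived from a result's click-through rate.

    A result clicked more often than the assumed baseline for its
    position gets a boost (capped at `max_boost`); one clicked less
    often gets demoted, floored at 1 / `max_boost`. All values here
    are invented for the example.
    """
    if impressions == 0:
        return 1.0
    ctr = clicks / impressions
    return min(max(ctr / baseline_ctr, 1.0 / max_boost), max_boost)

print(ctr_boost(1000, 150))  # 15% CTR vs. 10% baseline -> capped boost of 1.25
print(ctr_boost(1000, 40))   # 4% CTR -> demoted to the 0.8 floor
```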
Google’s new patent discloses what most search engine optimization and marketing professionals have been speculating about for quite some time. Some of the factors it describes are probably already in use to a certain degree, and the current frequent SERP rollbacks and the long-awaited PageRank update could indicate that Google is implementing some of these new factors in its ranking algorithm.