Last week Google took another step toward personalized search results by launching a feature that allows users to block certain websites from their search engine results pages (SERPs). Here is an overview of how the system works:
The user submits a search through Google and the results show up just as before. If the user clicks on a website, then quickly clicks back to leave the site (“bouncing,” as it is known in the community), an option appears at the bottom of Google’s results page that says: “Block all ‘example.com’ results.”
After a site has been blocked, it will no longer show up in the results of similar future searches. Instead, a message at the very bottom of the page informs the user that a certain number of sites have been blocked, with the option to unblock them from their list. Not surprisingly, sponsored links cannot be blocked.
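The flow above amounts to filtering a result list against a per-user blocklist and reporting how many entries were suppressed. A minimal sketch of that idea (the function and data here are invented for illustration, not Google's actual implementation):

```python
# Hypothetical sketch of the filtering described above: domains on a
# user's blocklist are dropped from the results, and a count of
# suppressed entries is kept, mirroring the "N sites blocked" notice.
from urllib.parse import urlparse

def filter_results(results, blocklist):
    """Return (visible results, number blocked).

    `results` is a list of URLs; `blocklist` is a set of domains
    like {"example.com"}.
    """
    visible, blocked = [], 0
    for url in results:
        domain = urlparse(url).netloc
        # Match the blocked domain itself or any of its subdomains.
        if any(domain == d or domain.endswith("." + d) for d in blocklist):
            blocked += 1
        else:
            visible.append(url)
    return visible, blocked

results = [
    "http://example.com/page",
    "http://spam.example.com/article",
    "http://goodsite.org/post",
]
visible, blocked = filter_results(results, {"example.com"})
print(visible)   # ['http://goodsite.org/post']
print(blocked)   # 2
```

Note the subdomain check: blocking “example.com” also hides “spam.example.com,” which matches the all-results-from-this-site behavior the feature advertises.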
This certainly seems like a good idea, but the real meat lies in what Google will do with its collection of aggregated data on the “block patterns” of its users. On a personal scale this feature just allows users to avoid annoying repeat sites, but on a global scale it allows Google to create a database of what users perceive to be “low quality sites.”
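To make the global-scale point concrete, the aggregated signal could be as simple as counting how many distinct users blocked each domain and flagging the most-blocked ones. This is a toy sketch under my own assumptions, not Google's actual pipeline; every name in it is invented:

```python
# Hypothetical illustration of aggregating "block patterns": count how
# many distinct users blocked each domain. Domains blocked by many
# users become candidate "low quality" sites.
from collections import Counter

def block_pattern_counts(user_blocklists):
    """user_blocklists: iterable of per-user sets of blocked domains."""
    counts = Counter()
    for blocklist in user_blocklists:
        counts.update(blocklist)  # each user counted once per domain
    return counts

users = [
    {"spamfarm.com", "example.com"},
    {"spamfarm.com"},
    {"spamfarm.com", "other.net"},
]
counts = block_pattern_counts(users)
# Flag domains blocked by at least 2 of the 3 users in this toy sample.
flagged = [d for d, n in counts.items() if n >= 2]
print(flagged)  # ['spamfarm.com']
```

Even this crude count shows why the data is valuable: one user's block is just personal preference, but thousands of independent blocks converging on the same domain start to look like a quality judgment.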
At this point, Google states that it will not use the “block” feature to decide site ranking, but it is certainly being considered for the future. Google is all about data mining, after all, so it seems inevitable that blocked sites will make their way into the algorithm once the initial tests conclude.
As of now, Google is in “beta test mode” for this project, since it takes a series of steps to enable it in your search results. This is Google’s way of hiding it from the average user while still allowing people “in the know” to test it, then deciding whether or not to move forward with the system based on the results.
When Google finally starts feeding this new information into its algorithms, it will have a major influence on how search results are shown. It will create an entirely new factor for companies to consider when building their web pages, and it will make it far harder for spammy sites and link farms to drive traffic organically. Quality is definitely the future of SEO, and this new system is just one more step in that direction.
My question is, how much more vulnerable does this make your site to competitor attack? No doubt it’s dirty, but what harm could an unethical competitor do by exploiting this new feature?