(A brief refresher on Google’s most impactful algorithm updates before the release of the new Penguin)
The major reason for Google’s dominance of the search engine market is how uniquely the original version of its algorithm functioned. While many other search engines used algorithms focused on keywords, Google’s own PageRank looked at the links generated by humans. The basic idea was that if a site is linked to by other highly authoritative sites, it is probably worth visiting. So the key factors behind Google’s search results have always been relevance and popularity.
Since its inception, Google’s algorithm has seen massive changes. Businesses around the world need to understand not only the algorithm updates that Google rolls out, but also why it implements these changes so frequently.
From Boston To Penguin…And Still Going On!
The first named update to Google’s algorithm was released in February 2003 and was called Boston, after the city in which it was announced. Although Google had updated its algorithm earlier, this one was notable because it was the first to be given a name. It combined a major index refresh with algorithm changes.
Google didn’t stop there and continued its update process, resulting in the reshaping of many SEO strategies. In November 2003, Google sent an early chill down the spine of the industry with its Florida update, which was disastrous for a large number of websites and turned SEO upside down.
Google continued with Austin, Brandy, Allegra, Bourbon, Gilligan, Jagger, and Big Daddy until Panda happened. The next in the list of Google’s disruptive algorithms, Panda was introduced in February 2011. This update not only targeted spammy links and sites, but also threw out “thin” sites that were of little use to users and of low quality, often containing scraped content. Google itself stated that roughly 12% of search results were affected by this update.
The message was loud and clear to any business trying to run an online marketing campaign: Google was not only eliminating spam and black hat SEO, but also basing its algorithms on the usefulness of a site. Only sites that demonstrated strength and quality would be awarded a place on Google. Rand Fishkin observed that in the current scenario, with Google constantly changing its algorithms, it has become more important to focus on how you build the website in order to earn trust and authority. According to him, this was more important than anchor text, keywords, and links.
Next came Panda 2.5, whose specific changes remain unknown, though some sites reported massive drops. The next major update meant to penalize over-optimization was Penguin, announced on April 24, 2012. It was rolled out as the “Webspam update” and later dubbed Penguin. It affected an estimated 3.1% of queries.
The 20th Panda update, a combination of algorithm and data changes, rolled out and affected 2.4% of queries. By this point, Panda updates were being numbered in order, so this one was called Panda #20. Its impact was slightly higher than that of Panda #21 and #22; the latest, Panda #23, affected 1.3% of English queries.
So far, 2013 has seen only two updates through March: Panda #24 in January, affecting 1.2% of search queries, and Panda #25 in March, which was pre-announced along with the suggestion that Panda would be integrated into the core algorithm.
What is Google up to now?
The latest announcement by Google’s search spam expert, Matt Cutts, indicates that Panda will become softer on sites that do show some quality signals, so they will no longer be hit as hard.
There will also be major changes to Penguin, and the message is clear to all spammers and black hat practitioners: they will not survive past the end of the summer. The new Penguin update is all set to scare the daylights out of many a keyword stuffer and duplicate content creator.
Let’s see what Google has in store for us!