Google, founded in 1998, quickly grew into the world leader in collating and organizing data. But Google soon ran into a problem: ranking websites by their apparent popularity wasn’t enough. Scammers, link farmers and shady marketers were flooding the internet with fake and often fraudulent websites, and users who turned to Google for relevant, topical information were being inundated with junk results.
On February 23, 2011, Google’s engineers launched Panda. The new algorithm was put to work immediately, pruning and re-ranking roughly 12% of all search results. On August 12, 2011, Google decided Panda was ready and unleashed it on its entire index, where it soon had a noticeable impact on close to 9% of all search queries system-wide.
Quality improvements continued throughout 2012, and by September of that year Google announced that Panda was pruning, re-ranking and re-evaluating 2.4% of all queries. Updates and upgrades continued, and on May 19, 2014, Google unveiled Panda 4.0. The retuned algorithm was soon affecting 3.5% of all English-language search queries, a figure that rose to 7.5% by the end of September.
After Panda’s success, Google knew it needed another way to evaluate and assess websites. Its Penguin algorithm was first unveiled on April 24, 2012, with the initial rollout affecting 3.1% of all queries. Google followed up with Penguin 2.0 on May 22, 2013, and released Penguin 3.0 on October 17, 2014; since then it has continued to deliver minor updates and tweaks.
Penguin was specifically designed to address so-called “spam” tactics, whereby shady site operators try to mimic genuine content with computer-generated nonsense. Penguin was also designed to make mincemeat of “black hat SEO” tactics, such as bloating a site with fraudulent keyword stuffing or thousands of manipulative links. Its engine churns through a site’s links and sorts the low-quality ones from the high-quality connections to valuable content.
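Google has never published Penguin’s internals, but the idea of sorting links into low- and high-quality buckets can be illustrated with a toy heuristic. Every signal, name and threshold below is a hypothetical assumption for the sketch, not Google’s actual criteria.

```python
# Toy sketch of link-quality triage. The signals (source reputation,
# topical relevance, spammy anchor text) and the 0.5 threshold are
# illustrative assumptions only -- not Penguin's real model.
from dataclasses import dataclass


@dataclass
class Link:
    anchor_text: str
    source_reputation: float  # 0.0 (unknown/spammy) .. 1.0 (trusted)
    on_topic: bool            # does the linking page share the target's topic?


# Hypothetical examples of manipulative anchor text.
SPAMMY_ANCHORS = {"cheap pills", "buy now", "casino bonus"}


def quality_score(link: Link) -> float:
    """Combine a few hypothetical signals into a 0..1 quality score."""
    score = link.source_reputation
    if link.on_topic:
        score += 0.3
    if link.anchor_text.lower() in SPAMMY_ANCHORS:
        score -= 0.5
    return max(0.0, min(1.0, score))


def triage(links: list[Link], threshold: float = 0.5):
    """Split links into (high_quality, low_quality) buckets."""
    high = [l for l in links if quality_score(l) >= threshold]
    low = [l for l in links if quality_score(l) < threshold]
    return high, low
```

A real system would of course learn such weights from data at massive scale; the sketch only shows the shape of the sorting problem the paragraph describes.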
Google has leveraged its dominant position in the information market to the point that it can now set the terms of what defines quality content on a website. Most importantly, Google believes that all websites should provide value to visitors, with keywords and other important phrases used organically and naturally. Google also believes that people want more from websites than just blog posts or articles: users will increasingly demand active engagement. That’s why Google encourages so-called “social signals”, defined as sharing across social networks, because such sharing suggests that the content matters to readers. Google’s vision for the future of the internet is a network of communities that encourage interaction, which is vital to building brand loyalty.
The race to stamp out fraudulent, low-quality and artificially generated websites is not over. Malicious hackers and site operators are constantly devising new ways to try to defeat Panda and Penguin, and Google’s engineers are constantly testing, tweaking and improving their products to ensure that only the highest-quality, most relevant information is returned for user queries.
The History of Panda and Penguin Infographic
Be sure to check out the infographic (source) below for a moment-by-moment breakdown of Panda and Penguin over the years.