By now, webmasters and SEO professionals across the globe were very aware of Google Panda, the cuddly black & white animal who wasn’t so friendly.
And neither was Google. With this update, we found out that they had also been using signals from Chrome and directly from SERP pages where users had blocked sites. So whilst Google promoted this feature to end users as a useful tool to help them get the results they wanted to see, Google was actually building up a large database of websites which humans thought were poor.
At the time, this was one of the few revelations we were given regarding the Google algorithm – namely, that they used data from the block link in SERPs and the Chrome extension to weed out unsavoury and unwanted websites. Initially, they stated that they only used this data to validate their algorithm, but they later changed this to say they did use the blocking data in the algorithm itself.
As webmasters and SEO professionals have grown to learn, there is never a clear-cut answer from Google.
Google Webmaster Central states
Google's official Webmaster Central blog post leads with the headline “High-quality sites algorithm goes global, incorporates user feedback” and quietly informs the public that Panda has been rolled out worldwide and has affected a further 2% of U.S. queries (the previous release affected 11.8% of U.S. queries). But there was no mention of what percentage of queries worldwide had been affected. So we were led to assume that roughly the same combined figure (11.8% + 2%, approx. 14%) of queries in other English-speaking countries around the world would be affected as a result of the 11th April 2011 Panda update.
What is clear from this post are the multiple references to the Panda penalty being all about site quality. Google's aim was quite clearly to target low-quality websites which were outranking their high-quality counterparts.
Regarding the definition of what counts as high quality, Google refer to an earlier post and state:
“This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
Proof once again that site owners needed to start investing in content creation – and great, unique content at that!
Who was hit the hardest?
With the first Google Panda update, it was mostly large websites that were hit the hardest. This makes sense, as large websites tend to have content which overlaps with other sites or with their own web pages, and that content is not necessarily written to a high standard or presented as in-depth articles. However, Panda 2.0 was noted as going deeper into “long tail” keywords aimed at low-quality websites. As a result, many smaller websites would have noticed a drop in rankings for low-volume, long-tail search phrases if they did not meet the quality standards.
The SEO industry later analysed the various blog posts relating to Panda and deduced the following definition of a low-quality website:
- The algorithm potentially targets all websites – not necessarily just content farms
- Websites which have content copied from other websites
- Sites with shallow content – that is, not enough useful content
- Poorly written content
- Content that is not useful to the end user
Google did confirm that poor, low-quality pages on a website could affect the whole website/domain. They advised website owners who believed they were affected to remove the low-quality pages, move those pages to another domain, or, of course, improve their quality. In doing so, they should recover and notice an improvement in search rankings.
Were there unfortunate victims?
With the list of quality elements above, Google did appear set on sending the B&W furry animal after those with scraped or plagiarised content taken from elsewhere on the web. Though, whilst it had great success in de-ranking those websites which made no sense due to the jumbled-up manner in which they scraped content, there were undoubtedly well-intentioned affiliate websites which suffered as a result. Nowadays, more and more products are online, and even the best creative wordsmith will struggle to promote and write about a “mug” or a “kettle”, for example, in a way that does not appear to mimic or plagiarise the manufacturer's product description.
Before I get shot down: yes, you can create videos about how to fill the kettle, or offer safety information about how to use it correctly. You can even write about uses for an old kettle, such as a plant pot, construct a 101 list of uses for a kettle, offer advice on caring for a kettle for extended product life, list consumer reviews, etc. But even after many of those avenues are exhausted, there are thousands of affiliates/retailers selling the exact same model – so each one writing unique, high-quality content becomes quite a task, or near impossible. The SEO industry suggested that a website owner should raise their brand profile to stand out from the saturated crowd, but if everyone is doing it, then it probably comes down to those with the deepest pockets standing out the most.
As I type this, my own search for a Bosch kettle returns: bosch-home.co.uk, currys.co.uk, amazon.co.uk, argos.co.uk, johnlewis.com, tesco.com, ebay.co.uk and asda.com – in that order. No minnow retailers in that list, so I rest my case. All have big marketing departments, big budgets and big reach. Interestingly, they are all very similar in what is presented back – in each case I get a product category page with very little unique, high-quality content! Has Panda turned a blind eye to the big dominant brands?
I will look at Google Panda 2.1 (and possibly Panda 2.2) next.