If you had any kind of online presence back in February 2011, you will probably have heard of the Google Panda update – or the ‘Farmer update’, as it was first known.
Google introduce new algorithm changes all the time – often more than one a day – so websites’ rankings rise and fall quite regularly, even when those sites haven’t published any new content.
If you think of algorithm updates as the search engine equivalent of earthquakes, though, Panda was a big one, affecting nearly one in eight queries conducted on Google.
Panda was its internal name at Google, but before that was widely known, web users were unofficially calling it the Farmer update – and with good reason…
The Cream of the Crop
If a good website is the cream of its crop, then article databases are the weeds of the web world, spreading rapidly by scraping content from elsewhere online – often without making any changes to it whatsoever, and without observing the relevant copyright laws.
Google have worked with Chilling Effects for some time to remove search listings that are subject to copyright or intellectual property claims, but in many topic areas prior to 2011, article database sites dominated the top search results.
The Farmer update worked to remove these websites from the highest ranks of the results pages – as Google described in a February 2011 blog post.
“This update is designed to reduce rankings for low-quality sites – sites that are low-value-add for users, copy content from other websites, or sites that are just not very useful,” wrote Google fellow Amit Singhal and principal engineer Matt Cutts on the Official Google Blog.
“At the same time, it will provide better rankings for high-quality sites – sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
Coping with Change
For webmasters, the Panda update caused quite a lot of concern – people were worried that their search rankings would vanish entirely overnight. And, in some cases, they did.
Websites that had been crammed with keywords simply to rank highly, with no thought given to how human visitors would benefit from that content, were rightly punished by Google for being effectively useless to the people who clicked through to them.
The article databases that prompted the Panda update were also hit hard – and the changes made remain in place, so it’s important to understand how to produce web content that complies with the criteria effectively set out by Google in their algorithm change.
For instance, it’s not just whole articles that can represent duplicated content – if your site contains an e-commerce catalogue, you’ll probably want unique product descriptions, not just a copied-and-pasted replica of the manufacturer’s own description.
If you have little to no plain-text content at all, you might want to consider starting a blog, news section or online press office.
Whatever industry you’re in, make sure your unique selling point is spelled out on your website in writing, and add a little extra insight in the areas where your experience lies. Do that, and you’re already part-way to producing Panda-friendly pages that will keep your search rankings healthy for a long time to come.
The Google ‘Penguin’ Update
As if the ‘Panda’ update wasn’t enough to think about, Google subsequently released a new ‘Penguin’ update in April 2012, aimed specifically at sites it deemed to have been engaging in ‘link spam’ activities – such as keyword stuffing, buying hundreds of irrelevant inbound links from other sites and directories, and using the same ‘anchor text’ for links in an excessive way.