SEO: Google Panda algorithm updates
Those working in the field of SEO already know about the Google Panda algorithm updates: they have had a marked effect on working practices over the last two years, and their fallout is still felt today.
For those not versed in ‘Google-speak’, here’s what you should know: Google’s ‘Panda’ updates are a series of algorithm changes that are primarily concerned with ‘on-page’ web elements, namely the content you stock your website with. Although Google is usually notoriously tight-lipped about changes to its search functions, on this occasion it has confirmed several key components of Panda that experts had already worked out for themselves.
Upon first release, the then-unnamed algorithm changes were dubbed the ‘farmer update’, as the websites that took the full brunt of the changes tended to be low-quality article directories, or ‘content farms’, that scraped content from other sites, duplicating it and devaluing it for the original producer. As Google announced more updates, they dubbed them ‘Panda’ after a member of their search development team.
As the effects of the update became more apparent, websites showing low-quality on-page ‘signals’, such as duplicate content or excessive adverts ‘above the fold’ (the area of a page you can see without scrolling down), also found that their positions in search results for optimised keywords had dropped, or in extreme cases that their entire site no longer appeared in Google search results at all.
Recently, Google announced that they were incorporating Panda into the normal algorithm the search giant uses, meaning that webmasters will no longer be informed that Google have tweaked their on-page algorithms.
Speaking at a conference in the US, Google’s Matt Cutts confirmed this:
“I don’t expect us to tweet about or confirm current or future Panda updates because they’ll be incorporated into our indexing process and thus be more gradual.”
This means that businesses and webmasters will have to ensure their on-page SEO tactics are up to date and that there is no duplicate content anywhere on their site, even for a brief moment (e.g. check BEFORE publishing!).
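For those who want to build that check into their publishing workflow, here is a minimal sketch of one way to compare a draft against a page that is already live before hitting publish. It is purely illustrative: the draft filename, the URL and the 80% threshold are hypothetical examples, and this is not how Google itself measures duplication.

```python
# Minimal sketch (not Google's method) for flagging near-duplicate copy
# before publishing. The draft file, URL and 0.8 threshold are hypothetical.
import difflib
import re
import urllib.request


def page_text(url: str) -> str:
    """Fetch a live page and crudely strip its HTML tags down to plain text."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    return re.sub(r"<[^>]+>", " ", html)


def similarity(a: str, b: str) -> float:
    """Return a 0..1 ratio of how much two blocks of text overlap, word by word."""
    return difflib.SequenceMatcher(None, a.split(), b.split()).ratio()


if __name__ == "__main__":
    draft = open("draft.txt", encoding="utf-8").read()
    existing = page_text("https://www.example.com/existing-article")
    score = similarity(draft, existing)
    print(f"Overlap with existing page: {score:.0%}")
    if score > 0.8:  # arbitrary threshold for this illustration
        print("Warning: draft looks like duplicate content - rewrite before publishing.")
```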
It may also mean that on-page tactics will have to be tweaked and altered more often: Google’s ‘anti-spam’, Panda-informed crawler (the program that reviews sites for indexing) will undoubtedly be looking at sites more often, so the effects of any newly duplicated content will be felt faster.
A side effect of this may be that some sites get penalised more often. On the other hand, with Panda as part of the regular search algorithm, a site that is penalised and dropped down the search rankings may only have to wait a matter of days after correcting the issues Google flags before the penalty is lifted and its original position is restored. Because of this, Panda being incorporated into the standard search algorithm could be a blessing in disguise for webmasters and web users alike.