The eternal quest for better rankings on Google’s search engine results pages (SERPs) was boosted last week by the release of Google’s “Search Quality Rating Guidelines”. Details were published on the Google Webmaster Central Blog on 19th November, a move indicative of Google’s open approach to helping companies benefit from search.
Though the subject can be challenging, Google has always tried to give companies and webmasters guidelines on how to observe and implement best practice. There is an abundance of resources and people willing to help, with Google’s own team running webinars through ‘Hangouts’ and answering questions on impending changes such as algorithmic updates.
There are also some excellent commentators on SEO who provide valuable feedback and insight. Some of the best, in my humble opinion, are Barry Schwartz (@rustybrick), Danny Sullivan (@dannysullivan), Londoner Rishi Lakhani (@rishil) and the Google master himself, Matt Cutts (@mattcutts).
Who are Search Quality Raters?
With so much data being produced every day, all of which ultimately has to be indexed, Google relies heavily on algorithmic machine learning. Yet even Google, with all its massive computing power, cannot rely solely on robots – enter the human search quality rater. These individuals manually check results alongside the mathematical, formulaic machine. Humans are required because the continually evolving search landscape contains many non-machine factors – quirks in language, usage and more – that only people can judge, ensuring the quality of results.
So What Are The Guidelines?
The Search Quality Rating Guidelines is a 160-page PDF that aims to help those who rate Google’s search quality, with guidance on how best to assess what they are testing. The guide can be found here.
If you require further information or consultation on Search, both PPC and SEO, please contact the team at Digital Clarity to learn more.