Google search has undergone drastic changes over the last year in an effort to improve the overall user experience on its search engine. With the search algorithms severely punishing websites that offer poor or irrelevant content, webmasters are becoming more careful about their sites’ quality and relevance. The Panda algorithm, launched in February 2011, was a unique experiment in machine learning: artificial intelligence was built into the search engine’s ranking techniques to identify relevant, search-specific results. The ranking criteria were first determined by human testers who manually scoured thousands of websites, rating them for relevance on several factors; this learning was then fed into the search engine’s indexing algorithm as a reference for future ranking.
The Panda roll-out re-ranked about 12 per cent of search results, and later updates or “data refreshes” continue to have an impact, though a more limited one. The Panda updates effectively obliterated websites with poor, duplicate or outdated content, while also penalizing excessive advertising above the fold.
Penguin, released this April, is Panda’s successor and already has one “data refresh” to its credit. Although its full impact is yet to surface, this time the hit may be harder, as the prime factors under scrutiny are links and anchor texts. Content is no longer supreme; links have become crucial factors in gauging the quality of results. Penguin attempts to punish over-optimisation of keywords and links, as well as cloaking, by simply ignoring them. If your site has this issue, be sure to contact a reliable SEO expert.
Moving past the ‘black hat’ and ‘grey hat’ techniques used to drive traffic and rankings, Google is now looking beyond keywords and long-tail phrases to focus on answering both implicit and explicit user queries using advanced AI algorithms.
It is semantic search, or the ‘Knowledge Graph’ in Google parlance, that is now in the limelight. The search keyword is interpreted in one or more specific contexts to arrive at the websites that best match that context, rather than simply matching text and keywords. The results draw on inbound links to a given site as well as the user’s search history and location. Google search has now become intelligent enough to pick not only the sites that best answer a user query but also related sites that may not even contain the search term in their descriptions or anchor texts.
The Knowledge Graph (for English), released in May this year, is reported to understand 500 million entities and 3.5 billion attributes and connections, in an effort to translate distinct chunks of data (words) into relevant, context-specific information. Google search now displays all information related to a given search term on the right side of the SERP, allowing the user to specify a context when the results span different domains. For example, searching for a celebrity will display images, biographical data, events, and other frequent related searches for that name. With all related information displayed on the same SERP, the user hardly needs to leave the page to visit other websites.
Google’s Knowledge Graph is intended to power the Google Assistant, to be released later this year on the Android platform, where it will be pitted against Apple’s Siri, an intelligent personal assistant based on semantic search and programmed for context-specific interactions.
Initial reactions expect Google Assistant to surpass Siri, primarily because of the wealth of user data available to Google for building intelligent responses, given that Google has merged the privacy policies of most of its individual offerings to share usage details across applications.
Impact on SEO
While search results become increasingly relevant and user-friendly with the semantic update, it is an arduous exercise for webmasters to revamp their content, especially links and anchor texts, to first tackle Panda and Penguin. Once they have survived these algorithms, webmasters will have to ensure the semantic value of their content. Websites will now have to address specific queries related to the context of a keyword, not simply optimize content for keywords. Take ‘swimming’, for example: what exactly do people look for when they search for this term? Swimming lessons, swimming competitions, swimming pools: there are multiple interpretations of the term, and the content on the website should answer such queries appropriately to earn a prominent position on the SERP.
Natural, user-friendly language and original, rich content are the ideal way to get noticed by semantic search. By adopting a semantic approach, Google has learned to interpret search terms, forming and answering queries all on its own.
Pick SEO Services in Brighton if you want your websites to figure on the SERPs. The key to this is matching your content to the queries Google forms.