Google Hummingbird Update & Why I should blow my own Trumpet

Almost 1.5 years ago, I documented my research on “Structured Data”, and my conclusion was as follows:

“A human can search for the lowest-priced book and can understand “Operating System” when documents say “OS”, but machines cannot accomplish this task without human intervention, because web pages are designed for humans, not for machines. As I said, a semantically structured web is required, one which will enable machines to understand and respond to complex search queries. Google is now actually trying to make happen what Tim Berners-Lee expressed in his initial vision of the Semantic Web:
I have a dream for the Web [in which computers] become capable of analysing all the data on the Web – the content, links, and transactions between people and computers. A “Semantic Web”, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The “intelligent agents” people have touted for ages will finally materialize.
Google is now moving from a traditional “Search Engine” towards being a “Knowledge Engine”; Google recently renamed its Search Team to the Knowledge Team, which further solidifies this point. The Google Knowledge Graph and the carousel list are some more concrete pieces of evidence for the same.”

There was a time when search engines were committed to focusing their efforts on effectively indexing the giant mass of HTML content across the web, but today it's not all about indexing; it's about understanding the information in that HTML content. To understand the information effectively, structured/semantic data comes into the picture. The Google Hummingbird algorithm update is capable of understanding long, question-form queries (yes, conversational search) and can easily identify synonyms and substitution queries. Hummingbird is more of a query-expansion approach that can better understand natural-language queries. The new algorithm is well equipped to identify the user intent in a query; for example, if you do a voice search for “which is the best place to eat pizza in Jaipur”, Google can easily understand that “place” here means “restaurants” and will return results accordingly.

A Google search for “which is the best place to eat pizza in Jaipur”

In this way, Google uses synonyms to provide better search results. Google uses statistical machine translation, in which a query is translated into a different language and then translated back into the original language; if both variations correspond to the same results, the terms can be used as synonyms. Google, in its Hummingbird patent, clearly says the following:

A computer-implemented method comprising: identifying a particular query term of an original search query; identifying a candidate synonym for the particular query term; accessing stored data that specifies, for a pair of terms that includes the particular query term and the candidate synonym of the particular query term, a confidence value for a non-adjacent query term of the original search query that is not adjacent to the particular query term in the original search query; determining that, in the stored data that specifies, for the pair of terms that includes the particular query term and the candidate synonym of the particular query term, the confidence value for the non-adjacent query term satisfies a threshold; and determining to revise the original search query to include the candidate synonym of the particular query term, based on determining that the confidence value for the non-adjacent query term satisfies the threshold.
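A rough sketch of the claim above, in plain JavaScript. This is my own toy model, not Google's code: the pair notation, the confidence table and the 0.8 threshold are all illustrative, chosen only to mirror the patent's wording.

```javascript
// A candidate synonym replaces a query term only when a non-adjacent
// term of the original query supports the substitution with a
// confidence value that satisfies the threshold.
const confidence = {
  // "pizza" (non-adjacent to "place") strongly supports place -> restaurants
  'place|restaurants': { pizza: 0.92, stay: 0.11 },
};

function reviseQuery(terms, term, synonym, threshold) {
  threshold = threshold || 0.8;
  const scores = confidence[term + '|' + synonym] || {};
  const i = terms.indexOf(term);
  // Check only non-adjacent terms, as the claim requires.
  const supported = terms.some(function (t, j) {
    return Math.abs(j - i) > 1 && (scores[t] || 0) >= threshold;
  });
  return supported ? terms.map(t => (t === term ? synonym : t)) : terms;
}
```

Here `reviseQuery(['best', 'place', 'to', 'eat', 'pizza'], 'place', 'restaurants')` would expand the query, while “nice place to stay” would stay untouched, because “stay” gives too little contextual support.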

A Google search for “car repair”

In May 2013, Google rolled out Conversational Search for Chrome, which pushed users towards conversational search queries and long-tail, question-form queries. Then, in September 2013, Google announced that it was using a new algorithm called “Hummingbird”. If we connect these two events, we can conclude that Google first nudged users towards long-tail conversational queries and then launched an algorithm able to handle them effectively.

Today, when I see an article like this, this or this, I feel that if it is a herculean task to criticize yourself, then, by the same school of thought, there is no harm in blowing your own trumpet (at times). What I concluded a year ago is going to be the future of search (needless to mention that Sir Tim Berners-Lee said the same years ago).


Closing Thoughts!!

“Everyone must leave something behind when he dies, my grandfather said. A child or a book or a painting or a house or a wall built or a pair of shoes made. Or a garden planted. Something your hand touched some way so your soul has somewhere to go when you die, and when people look at that tree or that flower you planted, you’re there”

-Ray Bradbury

Closing thoughts while leaving DPFOC…

Hi All,
It is a mixed feeling for me; it feels more like the end of an era than just a new beginning. I still remember the day I joined DPFOC, my very first job. Like everyone who steps into a world of competition and growth, here I was: a naïve young man with a lot of knowledge just waiting to put it into effect. I started my career as a “no one” with a dream and the passion to be “one of the one”. I know that's not a proper English sentence, but it expresses what I really mean. The culture and working environment at DPFOC are responsible for what I am today: Salik Khan, Head of Google Search Marketing. Everyone looks for new horizons and dreams, and the same applies to me too; I do not have a choice but to move on.

To Eddie: every day was a stepping stone to getting better at what I knew I did best, and I can only say thank you, Eddie, for being not just an ideal (read: tough) boss, a teacher and a mentor, but also for teaching me that achieving success doesn't happen overnight. Working for you and DPFOC is an experience and a part of my life that will forever be etched into my memory as some of the best days of my career. All those fights, every argument with you, helped me grow into a mature professional. I feel that my intelligence has grown in leaps and bounds, and the credit goes to you, because you do not accept anything less than the best. I've seen many experienced professionals and business owners, but the level of dignity, strong work ethic, discipline and dedication you possess is very rare; thanks for inculcating the same in me. As it is said, “If we are facing in the right direction, all we have to do is keep on walking”; thanks for showing me the right direction.

To Kalpesh Sir: having you as my immediate boss, guide and counsellor has made it a privilege to work in this environment. You have been a constant source of encouragement, helping me dream bigger, aim higher and achieve greater heights. It's been a period of learning, growing and achieving, and I step out into the world confident and thankful for having had the honour and the pleasure of working for you; thanks for casting your spell on me.

To Nidhi, Purva and the Account Managers: thank you for being a support throughout my tenure. You all have played an important role in my career; you are the backbone of DPFOC. I wish you all the very best in your future endeavours.

To Saroj: I think this is the best time to congratulate you, and I am sure you will take the assignment bus to its destination. Saroj, what I write below is for you:

A pencil-maker told the pencil 5 important lessons.

  1. Everything you  do will always leave a mark.

  2. But you can always correct the mistake you make.

  3. What is important is what is inside you.

  4. In life you will undergo painful sharpening which will make you a better pencil.

  5. And to be the best pencil you can be, you must allow yourself to be held and guided by the hand that holds you.

You are the pencil. Be the one.

To Andy and the onshore staff: it was great working with you guys, and I am really glad that I met all of you. Andy, thanks for your pro tips on girls.

Lastly, to someone special: I know you are not on this email, but thank you for everything that you have done and been for me; you will remain the same for me. Always!! Below is for you:

I will always be indebted to DPFOC for being a solid foundation of my growth and confidence.

I can no other answer make, but, thanks and thanks…

Google is a Verb, While Facebook is a Noun: The Grammar Of Search

It is no secret that, more often than not, we use Google as a verb: “I Googled it” and “Google it and you'll get the answer” are some common usages of Google as a verb. The Oxford Dictionary, one of the most prestigious dictionaries of the English language, defines google as:

 Definition of google
[with object]
search for information about (someone or something) on the Internet using the search engine Google

While Googling is a daily chore like criticizing the local government, bathing and brushing your teeth, the introduction of mobile technology seems to be a catalyst for these activities. According to a report, Android users conduct 2.65 mobile searches per day, which is more than the number of times an individual brushes his or her teeth in a day. Daily searches performed on Google are increasing at a rapid pace, and the reasons are quite obvious: school projects, college assignments, local dentists, nearby pet shops, birthday gifts and ideas, how-tos, movie reviews, recipes. You have the question, Google has the answer; if it isn't on Google, it doesn't exist.

Average number of searches on Google per day (year-wise)

Basic Methodology Of Google:

The initial method of ranking and collecting information was citation (link) based: it analyzed information based on the quality and quantity of links, applying algorithms to provide the best results for a given query. According to the initial research paper submitted at Stanford University, entitled “The Anatomy of a Large-Scale Hypertextual Web Search Engine”:

The Google search engine has two important features that help it produce high precision results. First, it makes use of the link structure of the Web to calculate a quality ranking for each web page. This ranking is called PageRank and is described in detail in [Page 98]. Second, Google utilizes link text to improve search results.

While there have been a number of changes and updates (the Penguin update, for example) to better identify these links and thus provide a better search experience, citations and links are mostly created by publishers, not actual end users, so room for manipulating these links still exists. As per the latest algorithm updates from Google, search quality has improved in recent years, and we should expect regular updates from Google to further improve the overall quality.

After a series of regular updates, Google made some significant changes to the basic methodology of its search engine. PageRank, once one of the major ranking factors, has lost its place in recent years, and social signals came into the picture, because social signals are what a real end user can control; this is also the reason why Google launched Google Plus.

Yes, Facebook Is A Noun: Enter Graph Search

When I first read the news about Facebook Graph Search and its tagline “Search People, Places and Things”, the first thought that came to mind was of my primary English classes, where I was told, and forcefully asked to accept, that “a noun names a person, place or thing”.

So, we already had a verb called “Google” in the searchsphere (yes, that's not a proper English word), and now we have Facebook, a noun.

Facebook has been described as a noun before, but Facebook Graph Search solidifies the aforementioned point. Before we move on, read this article, which outlines a brief history of Graph Search.

 Basic Methodology Of Facebook Graph Search:

As opposed to its most talked-about competitor, Google, or any other modern-day search engine, Facebook Graph Search (FGS) is not crawling-based. FGS looks for the user's intent while performing a search, and looks for semantically related, accurate results by understanding the contextual meaning of search terms using Natural Language Processing (Google has introduced services like the Knowledge Graph, which are based on the same technology). Semantic search is a concept referring to technologies that try to better understand, from what the user enters, what the user actually wants to know. I am a huge fan of semantic search, the reason being that a vast majority of searchers don't actually know how to search; when semantic search is in place, it helps users find what they actually want to know.

Apart from this, Facebook Graph Search builds on the fact that Facebook users are connected to each other, to the things they like, to the places they've been and through various other kinds of connections; the endpoints of these connections are called “nodes”. This is explained in the patent application filed by Facebook, entitled “Search and retrieval of objects in a social networking system”; click here to read more on the patent.

The computer-implemented method of claim 9, wherein the social graph has nodes corresponding to objects and edges corresponding to relationships of the objects, the user being represented as one of the objects.

From the marketing point of view, if you want to be found in Facebook Graph Search results, you need to minimize this distance, i.e. you need to make sure that you are in close connection with your potential customers.

A computer-implemented method comprising: receiving a query from a user; submitting the query to a remote social networking system; and receiving, from the social networking system, a combined result set comprising objects matching the query, the combined result set comprising objects obtained from a plurality of search algorithms performed by the social networking system; wherein at least a plurality of the objects of the combined result set are ordered based at least in part on measures of affinities of the user for the objects, an affinity of the user for an object comprising at least one from a group consisting of: a distance on a social graph between the user and the object, and a similarity between the user and the object.

The above explains two things:

  • Just like in SEO, where fake links and artificial link building are a real problem for Google, with people manipulating search engine rankings using fake links, fake Likes on Facebook will be of no use, the reason being the distances and affinities between nodes.

  • When a user searches for “Dentist in Dublin”, the user is not looking for web pages where “Dentist, Dublin” appears in the title, headings or anchor texts of links pointing to the page; the user wants dentists in Dublin with the closest connections to his profile.
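To make the “distance and affinities between nodes” idea concrete, here is an illustrative sketch (my own toy graph, not Facebook's code): candidate results are ranked by breadth-first-search distance from the searcher on the social graph, so a dentist connected through a friend outranks one with no connection.

```javascript
// Toy social graph: node -> list of directly connected nodes.
const graph = {
  me: ['anna', 'drBrown'],
  anna: ['me', 'drSmith'],
  drBrown: ['me'],
  drSmith: ['anna'],
};

// Breadth-first search: number of edges between two nodes.
function distance(from, to) {
  const seen = new Set([from]);
  let frontier = [from];
  let d = 0;
  while (frontier.length > 0) {
    if (frontier.indexOf(to) !== -1) return d;
    const next = [];
    for (const node of frontier) {
      for (const neighbour of graph[node] || []) {
        if (!seen.has(neighbour)) { seen.add(neighbour); next.push(neighbour); }
      }
    }
    frontier = next;
    d += 1;
  }
  return Infinity;
}

// Rank candidate dentists for "me": the closer the node, the higher it ranks.
const ranked = ['drSmith', 'drBrown'].sort(
  (a, b) => distance('me', a) - distance('me', b)
);
// ranked[0] is 'drBrown', a direct connection, ahead of 'drSmith'.
```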

As explained above, semantic search looks at the nodes, connections and edges between the searcher and the search results. Facebook uses the Unicorn framework as the building block of its search; this framework handles the ranking and indexing of the search results. Facebook is extending Unicorn into a search engine: Unicorn is an inverted-index framework and includes capabilities to build indices and retrieve data from the index. Sriram Sankar, a Facebook engineer, posted on the blog that “Our goal is to maximize searcher happiness, which we do our best to quantify through metrics like click through rate (CTR), NDCG, engagement, etc. We have to measure the impact of ranking changes on all of these metrics to maintain a proper balance”.

The Unicorn framework is used to run ranking experiments in which the happiness metrics of various experiments are compared against their controls; read about the complete working of the Unicorn framework in Facebook's Notes.
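Since Unicorn is described as an inverted-index framework, a minimal sketch of that data structure may help. This is illustrative only; the documents and the tokenization are my own, not Facebook's.

```javascript
// Build an inverted index: each term maps to the list of document IDs
// containing it.
function buildIndex(docs) {
  const index = {};
  docs.forEach(function (text, id) {
    new Set(text.toLowerCase().split(/\W+/)).forEach(function (term) {
      (index[term] = index[term] || []).push(id);
    });
  });
  return index;
}

// Retrieve documents containing every query term by intersecting
// the posting lists.
function search(index, query) {
  return query
    .toLowerCase()
    .split(/\W+/)
    .map(t => index[t] || [])
    .reduce((a, b) => a.filter(id => b.indexOf(id) !== -1));
}

const index = buildIndex([
  'dentist in dublin',      // doc 0
  'pizza places in dublin', // doc 1
  'dentist in jaipur',      // doc 2
]);
// search(index, 'dentist dublin') returns [0]
```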

I believe that Facebook and Google are too good to defeat each other; yes, they both can bring something really new. In this case, Facebook came up with something that might look like a competitor to Google, but in reality it's not. Google, on the other hand, recently applied for a patent entitled “Query based user groups in social networks”; Bill Slawski from SEO by the Sea covered this in his blog post:

Are Google's query-based social circles the answer to Facebook's Graph Search? Not too long ago, Facebook launched its Graph Search, which enables people to search for things like “My Friends who live in San Francisco,” “My Friends who like Surfing,” and “Places my Friends like.”

 Imagine if Google Plus allowed you to perform searches such as, “People who take the same bus as me into the city,” or “People who like to eat at the Red Truck Bakery,” or “People attending the Dave Matthews Band Concert next Friday,” and creates in response a social network circle that other people might be invited to join, even temporarily, or who could join anonymously. Or Google Plus may dynamically create such a query-based social circle which it may recommend that you share through as you create a post about a music festival you’re going to, or a meal you’re reviewing from a local hotel.

There are still a few things Facebook needs to work on in order to make Facebook Graph Search a better experience. With the Graph Search launch, I was expecting Facebook's Open Graph tags to mark their importance, and internet marketing ninjas to include OGP tags in their technical checklists; I am not sure why Facebook didn't emphasize the usage of OGP. Apart from this, since Likes and connections are the key in Graph Search, SMBs will find it very difficult, as Danny Sullivan said:

“Consider me. Not only have I not liked my electrician, my plumber, my dentist, my doctor or my tax person on Facebook, but I don't even know if they have Facebook pages. I have nothing to offer to my Facebook friends in this regard.”

Users' privacy in Graph Search is another area Facebook needs to work on; privacy concerns have been reported since its launch. Read this Tumblr post to see what I mean here by privacy.

 Closing Thoughts:

In my opinion, Facebook's Graph Search will give us a new way to perform searches on the internet. While Google will continue its dominance in the traditional search market with new features in its search technologies, Facebook will emerge as a new dimension in the search market.

You can upvote this article here.

Does Your Website Hosting Company Affect Your Rankings? Matt Cutts In His New Video Answers

If you are hosting your website on a host that also hosts spam websites, will it affect your website's reputation and rankings? Matt Cutts answered this in the latest GoogleWebmasterHelp video uploaded to YouTube. Though Matt's answer was not a straightforward no, you really need not worry in most cases. Matt said that legitimate websites should not suffer because of spam hosted on the same server, so the answer to the question in the title is: typically, no. Below is the video:

Matt did, however, recall an old and exceptional case in which Google found more than 25k spam websites hosted on a server with only 2-3 legitimate websites; because the ratio was so high, Google took action accordingly. That being said, if you are hosting your website on a free host, you should think again, because most free hosting servers are full of spam and thus the ratio of spam is higher. Also, there is no harm in investing a little time in analyzing a few of the websites hosted on the same server as your website; a quick reverse IP lookup for your domain is very easy, and you can use this tool:

Apart from this, site speed, as we all know, is a ranking signal; it has been covered in many articles on site speed, and Matt Cutts confirmed the same on his blog. This article explains how to choose a hosting server.

E Commerce Optimization Tips For SEO

Optimizing an e-commerce website is quite different from optimizing usual websites, and thus most marketers and SEOs find it very difficult to deal with such huge e-commerce sites. Unlike service businesses or simple static websites, which will typically have only a few pages to optimize, e-commerce websites provide ample learning experience and opportunities for optimization. I personally love working on e-commerce optimization, because this is where you can actually work with some advanced methods: Google Analytics, semantic markup and Google Base, for example. E-commerce websites come with some inherent qualities (dynamic URLs resulting in duplicate content, duplicate meta tags and a large product catalog) that make it challenging to achieve a high Google ranking. So, if you think that optimizing an e-commerce website is akin to shooting fish in a barrel, there is no need to read any further; you should probably read something relatively simpler, maybe “What is Google Authorship”. For the rest, some highly recommended optimization tips for e-commerce websites are explained below:

  • Tip One: Keyword Research: Don't Step Off On The Wrong Foot
  • Tip Two: Duplicate Content: Enter Panda Update
  • Tip Three: Landing Page Optimization: If Done Once, It's Lame; Keep It Going
  • Tip Four: Social Media: Sharing Is Caring
  • Tip Five: Semantic Markup: Enter The Structured Web
  • Tip Six: Google Analytics: Statistical Analysis
  • Tip Seven: Bonus Tip

Note: This is an excerpt of my article; please visit the original post to read the complete article. You can also download the PDF version of this guide; click here to download the PDF.

Do take some time out to share it on your social media profiles.

Bounce Rate And SEO

SEO (Ex)pert: We have analyzed your website's stats; some of your pages are of poor quality, and they won't be able to rank well.

You: Ohh!! But these pages are ranking quite well!!!
SEO (Ex)pert: You seem to be lucky; it seems like Google hasn't seen your web pages' bounce rate yet. Bounce rate is an important ranking signal, you know.
You: What is bounce rate?
SEO (Ex)pert: Bounce rate is the percentage of users who visited only a single page of your website and didn't find the content worthy, so they didn't visit any other page to dig into your services in detail.

Poor You: Ohh!! I never knew that. Can you please help me sort this out?

SEO (Ex)pert: Yeah, for sure. We charge $250 per month, and we will sort this issue out.

Many $250 later, Poor You again: Hey, I can't see any tangible effect on my sales from your services…

The above conversation is quite common these days; I am sure you must have been “Poor You” in many such instances. Bounce rate, one of the most talked-about metrics and, as I said, a widely misunderstood concept, gained its popularity after Google's Panda update, which is essentially related to thin content; that makes it easy to connect with bounce rate and, yes, easy to fool people with. The Panda update and its widely misunderstood concepts are also prevalent in the SEO community, but that's another article for another day. Bounce rate, by its core definition, is the percentage of users who visited only a single page of your website, and I think this definition leads to such confusion. Alright, a visitor didn't click any other page of my website; that's not a crime!! We need to understand user sentiment here, so let me explain this with an example:

I did a Google search for “how to install Windows” and landed on a blog post explaining exactly that. I got complete A-to-Z information on installing Windows, thanked the guy who took the pain of writing the post, and left the website. I never felt like clicking on any other page of the website, so I will be counted as a bounce in GA, but I got precise information on my topic, and that's what Google wants: the best results for users according to their search queries. The same applies to Q&A websites; they provide to-the-point information about a given topic, with no need to dig further into the website.
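For clarity, this is all that bounce rate actually measures. A sketch of the standard definition, with made-up session data:

```javascript
// Bounce rate: the percentage of sessions that viewed exactly one page,
// regardless of whether the visitor found what they wanted.
function bounceRate(sessions) {
  const bounces = sessions.filter(s => s.pagesViewed === 1).length;
  return (bounces / sessions.length) * 100;
}

const sessions = [
  { pagesViewed: 1 }, // read the how-to, left satisfied: still a "bounce"
  { pagesViewed: 1 },
  { pagesViewed: 4 },
  { pagesViewed: 2 },
];
// bounceRate(sessions) === 50
```

Note that the satisfied reader and the disappointed one are indistinguishable here; the metric counts pages viewed, not user happiness.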

Why doesn't Google use GA data in rankings?

  1. Not all websites use Google Analytics for tracking. Nowadays most websites do use GA, but not all; don't forget that Omniture, Web Analytics and many other enterprise tools are also available to track website traffic and other data, so it would be unfair to use GA data for rankings.
  2. Even on sites that use GA, tracking scripts are often placed incorrectly, which further prevents the data pool from being accurate.
  3. There are any number of JavaScript experts who can easily game the GA tracking code to manipulate GA data, and Google has no control over it. For example, the piece of code below will adjust your bounce rate for you:
<script type="text/javascript">
 var _gaq = _gaq || [];
 _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
 _gaq.push(['_trackPageview']);
 // Fire an event after 15 seconds, so visitors who stay on the page
 // that long are no longer counted as bounces.
 setTimeout(function () {
   _gaq.push(['_trackEvent', '15_seconds', 'read']);
 }, 15000);
 (function() {
   var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
   ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '';
   var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
 })();
</script>
  4. If your examiner specifically asked you not to prepare a particular topic for the exam, because the question paper will not contain any question from that topic, would you still waste your time studying it? If yes is what you are thinking, then close this blog post, have a cup of coffee, and I wish you all the success. If no, then why are we not listening to Matt Cutts, who clearly said here, here and here that Google does not use GA data in rankings? Danny Sullivan confirmed this on Twitter:

Instead of ranting about bounce rate, you should focus on a metric that is often overlooked and dies a natural death in your statistical analysis. Welcome, click-through rate (CTR). CTR from the SERPs themselves is an easy-to-use metric: whether or not a result gets clicked is one of Google's first clues about whether a given result is a good match for a query. We know Google has this data, because they report it directly to us. In Google Webmaster Tools, you can find CTR data under “Your site on the web” > “Search queries”. It looks something like this:

CTR plays a vital role in AdWords and thus in your Quality Score (PPC). Though the AdWords algorithm is very different from organic search, CTR doesn't need any analytics; a search engine can easily calculate CTR itself. The logic is very simple: relevant results drive more clicks.

Can I Manipulate CTR?

Yes, you can. CTR by itself can easily be manipulated: you can drive up clicks with misleading titles and meta descriptions that have little relevance to your landing page. Though CTR is easy to manipulate, search engines can calculate the time between the click on a SERP result and the moment the user hits the browser's back button to return to the SERP. The lower the time between a SERP click and the back-button hit, the stronger the suggestion that the content of the page was not good and the user decided to look for an alternative. Bing calls this time “dwell time”, and the combination of CTR and dwell time is a strong metric for judging the quality of a SERP result. Manipulating CTR will naturally lead to low dwell time, though: if you artificially drive up CTR and your site isn't as good as it appears in the SERP, people will go back to the SERPs. Unfortunately, Google hasn't made this “dwell time” concept public, but Bing's Duane Forrester wrote a post on “quality content” in which he talks about dwell time. I do believe that Google uses this time to calculate something like dwell time, perhaps under another name; there is one piece of evidence that strongly suggests to me that they use dwell time (or something very similar). Last year, Google tested a feature where, if you clicked a listing and then quickly came back to the SERP (i.e. your dwell time was very low), you would get the option to block that site:


Google has since rolled back this feature to make room for social results in SERPs. The main objective of the feature was to let users block a site that did not provide the useful information its SERP snippet suggested.
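Putting the two signals together, a search engine could flag snippet bait roughly like this. This is a hedged sketch: the 20% CTR threshold and the 10-second dwell cutoff are numbers I made up for illustration, not Google's or Bing's actual formula.

```javascript
function ctr(clicks, impressions) {
  return clicks / impressions;
}

// High CTR combined with very low dwell time suggests a misleading
// snippet: users click, find the page poor, and bounce back to the SERP.
function looksManipulated(result) {
  return ctr(result.clicks, result.impressions) > 0.2 &&
         result.avgDwellSeconds < 10;
}

const honest = { clicks: 30, impressions: 100, avgDwellSeconds: 95 };
const baited = { clicks: 30, impressions: 100, avgDwellSeconds: 4 };
// looksManipulated(honest) -> false; looksManipulated(baited) -> true
```

Both results have the same 30% CTR; only the dwell time separates the genuinely relevant page from the one with a baited snippet.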

I hope the above information explains how bounce rate can mislead you. Instead of focusing on bounce rate, you should care about your snippets in the SERP. If your CTR is too low, then surely something is wrong: maybe the descriptions or the titles are misleading, and you need to fix them ASAP.

How To Generate Rich Snippets In SERPs For Videos Hosted On YouTube

We can use semantic markup for videos to generate rich snippets easily if the video is hosted on our own server, but the same is not possible if you are using videos from video-hosting websites. However, if the videos on your website are hosted on YouTube, you have to embed them using the code provided by YouTube; make sure not to use the iframe embed code, use the traditional one.

Here is a sample embed code provided by YouTube.


Embedded Code (Traditional)

We need to find the following properties in this code:

media:video -> the URL of the video you wish to be played when the user clicks the “play” button.

media:thumbnail -> the URL of a preview thumbnail, which must be a GIF, PNG or JPG image.

Also, the preview thumbnail must be hosted on the same domain as the video, and YouTube does not provide an option to upload a preview thumbnail, as it creates one automatically. Here is the trick: to find the path to the image thumbnail, follow the steps below.

1. The thumbnail URL is built with the ID of your video.

2. You just need to enter the ID of your video here. You can find the ID of the video in the embed src section of the above code; the embed URL is;hl=en_US.

3. The / or = after the “v” is always the first delimiter, and the “?version=3&” is always the second delimiter. Everything in between is the video ID. In other words,;hl=en_US&rel=0

4. In the above URL, your ID is 6mZShors3o0, which is marked in red in the code above. So the thumbnail URL would be

Note: This thumbnail URL will also be useful when creating video sitemaps for such videos. You can use

The media:video URL will be the URL that contains the video ID, i.e.;hl=en_US

Now that we have the media:video URL and the thumbnail URL, we can create the RDFa markup code (you can use Microdata as well); the equivalent RDFa code for the video would look like this:


Marked-up code

And that's all. With the above code, along with a video sitemap, you can generate rich snippets for videos hosted on YouTube.