
How has Google changed after Hummingbird?


Let’s start with a different question: how has search syntax changed since voice search became more popular? Matt Cutts answered this in a video back in July this year. Does that ring a bell?

For the last couple of months, all of us in the SEO world have been running from pillar to post trying to figure out what actually happened with the Hummingbird launch and what it leaves us to deal with. One fine morning in late September, Google announced that, about a month earlier, it had quietly rolled out a new search algorithm. Mind you, this is not a version update like Penguin 2.0; it is one of the biggest changes to the algorithm since 2001. And we are left wondering how something this substantial happened without users or webmasters noticing a thing!

It is therefore crucial to understand that Hummingbird is a change to Google’s search infrastructure more than an algorithm tweak that affects search results directly. With Hummingbird, Google has revised how it processes text, weighs the significance of a web page’s content, and deciphers the concept and purpose behind that content. Previous releases like Penguin and Panda were updates layered onto the existing algorithm. Once this is understood, we can look at what it means for webmasters. Let’s dig in.

A quick thought that should set expectations right: there is a reason Google chose this name over millions of other species. The hummingbird stands for speed, a small stature (read: compact, in our context), precision, and, let’s admit it, a cool look. For the top search results we actually care to browse through, we are looking for websites with those same qualities!

Google’s “(not provided)” is a by-product of its move towards secure search. Google has ceased to share much of the user query and keyword data that many webmasters and SEO firms depended on to shape their strategies, whether for targeting the right audience, doubling down on what was working, or shoring up weak areas. The withdrawal of this information is probably another hint that there are better, more hygienic ways to attract traffic.

Conversational search – with the growing number of queries from mobile users on the go, we quite naturally see a gradual increase in long-tail keywords; they are a natural by-product of voice search. So how does that affect our SEO strategies? Valid question! So far, there is no suggestion that webmasters need to amend the white-hat SEO practices we have been following for years. What we might now experience is a surge in traffic from these long-tail queries entered by mobile users (quite rewarding for the fair webmasters!). The amusing part is that very few of us ever optimized our websites for these long-tail keywords, even though we did work on their shorter versions. The reason it still works is that, with this update, Google is better equipped to decipher the context of a search by going beyond the face value of the keywords used and returning the result that best answers the query. That’s semantic search for you!

There is a second part to this. Websites that depend solely on keyword targeting, without an equal focus on regularly generating fresh content, may not rank for long, because they fail to satisfy the user’s search intent. That holds despite intense work on social signals – social media inputs, brand signals, link signals, usage signals, and so on – since they do not have sufficient content to substantiate the search query. On the other hand, a website built primarily on strong content but with inadequate attention to keywords (for example, in the title, meta description, or image alt text – the on-page spots sketched below) still has a good chance of appearing in search results, owing to the strength of its text and the context it establishes. Essentially, websites that build content around answering the most popular questions related to their subject matter will have a higher chance of appearing on top.
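For reference, here is a minimal sketch of the on-page spots mentioned above where keywords traditionally live; the product, file names and wording are hypothetical placeholders, not recommendations:

```html
<!-- Hypothetical product page: the classic on-page keyword locations -->
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Title: the strongest on-page statement of what the page is about -->
  <title>Handmade Leather Wallets | Example Craft Co.</title>
  <!-- Meta description: shown in the result snippet, so write it as an answer to the query -->
  <meta name="description" content="How our handmade leather wallets are cut, stitched and finished, and how to care for them.">
</head>
<body>
  <h1>How We Make Our Handmade Leather Wallets</h1>
  <!-- Image alt text: a plain-language description of the image, not a keyword list -->
  <img src="wallet-stitching.jpg" alt="Hand-stitching the edge of a leather wallet">
  <p>Each wallet is cut from a single hide and saddle-stitched by hand…</p>
</body>
</html>
```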

That is good reason to cheer for webmasters who focus primarily on the user experience rather than the search engine’s. This shift is an attempt to take yet another manipulable lever – keyword stuffing – out of the optimization game, and it is a great move by Google. The more such initiatives are taken, the lower the chances of manipulation by black-hat tactics. It boils down to a rather simple bottom line for the Hummingbird release: content that clearly establishes a concept will rule (I can’t stress this point enough!).

So here’s how to cope with Hummingbird. When every aspect of your website is well designed and aimed at explaining your product, service or research, when it establishes context and references appropriate links, and when every point is sufficiently described and properly titled, you should not have to worry about your website’s quality. All you need to do is market it appropriately.

In terms of internet marketing, this is a clear signal of the rise of content marketing. Content in all its forms – articles, blog posts, press releases, white papers, FAQ or Q&A sections, discussion boards, newsletters, case studies – gains prominence like never before. Introduce sections on your website that answer the relevant ‘Why does…’, ‘How to…’ and ‘What does…’ questions in your field; remember, they form a huge fraction of search queries (a simple sketch of such a section follows below). Take interviews, let experts clarify myths, share experiences, express ideas – pen down every aspect of your business in a meaningful manner, and don’t forget to link it all together in a structured design.
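As a rough illustration of such a question-and-answer section, here is a minimal sketch; the questions, anchors and internal links are hypothetical placeholders:

```html
<!-- Hypothetical FAQ section: each entry answers a question users actually search for -->
<section id="faq">
  <h2>Frequently Asked Questions</h2>

  <h3 id="how-to-clean">How do I clean a leather wallet?</h3>
  <p>Wipe it with a dry cloth, then work in a small amount of leather conditioner.
     See the <a href="/blog/leather-care">full leather care guide</a> for details.</p>

  <h3 id="why-full-grain">Why does full-grain leather cost more?</h3>
  <p>Full-grain hides keep the strongest outer layer of the skin, so they last longer
     and age into a patina. More in the <a href="/blog/leather-grades">leather grades post</a>.</p>
</section>
```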

A word of advice for webmasters who have been practicing fair SEO without resorting to black-hat techniques: I recommend you do not make any immediate changes to your website or its content. There is no reason to believe that everything you have published so far is irrelevant or will fail to answer queries. The way I look at it, future posts and content should simply be created with these changes in mind, and that should take care of it. You should also prioritize promoting rich, genuinely useful content over promoting the website itself; if the former is achieved, the latter follows suit!

What else can a webmaster do to stay ahead of the competition? Well, if you have not yet been making use of Google Authorship, you had better start immediately, or you will soon be left behind. They say Google loves Google – you bet! Almost any content you publish becomes more visible when it carries the author’s identity and links back to the author’s Google+ profile (a markup sketch follows below). It all makes sense, doesn’t it?
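For those setting it up, authorship is typically established with rel="author" markup along the lines of the sketch below; the profile ID and byline are hypothetical placeholders, and the Google+ profile’s “Contributor to” section should link back to the site for the loop to close:

```html
<!-- Hypothetical article byline linking back to the author's Google+ profile -->
<article>
  <h1>How Has Google Changed After Hummingbird?</h1>
  <p>By <a href="https://plus.google.com/112233445566778899000?rel=author">Jane Author</a></p>

  <!-- Alternative: a single site-wide link placed in the page <head> -->
  <!-- <link rel="author" href="https://plus.google.com/112233445566778899000"> -->

  <p>Article body…</p>
</article>
```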

SEO is about marketing your website. There are no shortcuts (and the ones that exist work for a very short time before turning disastrous). So while half a galaxy of people are busy trying to prove that this is a Google scheme to maintain its monopoly, it is wiser to accept its power and find answers to challenges like Hummingbird, or whatever comes next. The discussion above should help webmasters understand where they stand and how to move ahead in the months to come.
