Clients often ask me how to tell whether a website has been hit by a Google penalty or algorithm update, and many others are searching for an answer to the same question. Let’s take a look at this issue.
Why would this question even cross your mind? Probably because you noticed a sudden drop in your visitors, or in the call-to-action you measure your conversions with. Or you have long been waiting for that elusive upward shift in your page rank that never seems to happen. And you think there is a chance Google’s latest algorithm change could have affected your website.
For one, a drop in your visitor base, in the number of page views, or in your page rank (PR) is an alert that something is wrong. It does not necessarily mean the webmaster is at fault, but it is a sure hint that some investigation is needed to find what led to the downward change.
What are some ways to detect problems in your website?
1. Check the webmaster console for malware.
The Webmaster console can tell you if your site has been attacked by malware or a virus, which can cause a sudden drop in visitors. If so, your first concern is not the numbers but preventing further damage and taking control of the situation immediately.
2. Check the health of the external links to your site or your back-links.
We often forget to check how our backlinks are performing. If the quality of your backlinks deteriorates, or a large number of links built in the past have lost their authority and are now treated as spam, it is time to dissociate your site from them. Those links could well be taking a toll on your website.
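One concrete way to dissociate your site from toxic backlinks is Google’s disavow file, a plain-text list you upload through the disavow-links tool in the Webmaster console. The domains and URLs below are hypothetical examples, not real spam sources:

```text
# Disavow file sketch (one entry per line; lines starting with # are comments)
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a single linking page:
http://link-farm.example/page-linking-to-us.html
```

Use this carefully: disavowing good links can hurt you, so only list links you are confident are spam.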
3. Check ‘site:domain.com’ on Google.
This query shows how your website appears in Google’s index: whether the site is entirely invisible on Google, or only certain pages are missing. The cause could be a robots.txt rule or a noindex tag you added some time ago that is inadvertently preventing Google from indexing the page(s). If your website is fine and the content is relevant and up to date, you can submit the URLs in the Webmaster console for Google to recrawl and index the pages.
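If you suspect robots.txt is the culprit, you can check your own rules locally with Python’s standard-library robots.txt parser. The rules and URLs below are hypothetical examples, assuming a site that blocks a /private/ section:

```python
# Sketch: check whether a robots.txt rule would block Google from a page.
import urllib.robotparser

# Hypothetical robots.txt content; substitute your site's actual rules.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot falls under the "*" group here, so /private/ pages are blocked.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # True
```

Note that robots.txt only blocks crawling; to keep an already-indexed page out of results, you would also need a noindex tag, which Google must be able to crawl the page to see.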
4. Try ‘Fetch as Google bot’.
This can be an eye-opener, because you will see a page exactly as Googlebot sees it. You will know how your site appears, how your content renders, and whether there is malware or anything else amiss.
5. Audit recent changes to your site.
Any major change can affect your site’s rankings if Google fails to crawl the affected pages: a redesign, changes to overall functionality, old pages removed or new pages added, a move to new hosting, or a DNS change.
6. Upgrade your content.
Ask yourself: is your content scraped, duplicated, or simply not useful or relevant to what your audience is looking for? Do your competitors answer those questions better? If so, it is time to revisit your content, because your competitors are walking away with all the visitor attention.
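A rough first pass at finding duplicate content is to compare a normalized fingerprint of each page’s body text. This is only a sketch with hypothetical page texts; real audits would also catch near-duplicates, which exact hashing misses:

```python
# Sketch: flag pages whose normalized body text is identical.
import hashlib

def fingerprint(text: str) -> str:
    # Normalize whitespace and case so trivial differences don't hide duplicates.
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical URL-to-text mapping; in practice, extract text from your pages.
pages = {
    "/services": "We offer   SEO audits and content reviews.",
    "/services-old": "we offer seo audits and content reviews.",
    "/blog/welcome": "Welcome to our blog!",
}

seen = {}
duplicates = []
for url, text in pages.items():
    h = fingerprint(text)
    if h in seen:
        duplicates.append((url, seen[h]))
    else:
        seen[h] = url

print(duplicates)  # [('/services-old', '/services')]
```

Pages flagged this way are candidates for consolidation, rewriting, or a canonical tag pointing at the preferred version.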
For additional help, I would recommend reading the post by Amit Singhal, who shared some great insights into what is considered quality content. It is a valuable read.
By now you should have a fair idea whether all is well with your website, or whether it needs some amendments. Any policy violation or breach of Google’s guidelines should be addressed and rectified immediately. Then consider submitting your website’s URLs for indexing. Google processes reconsideration requests fairly promptly, and you have a fair chance of seeing results within days to a few weeks. If you still end up with a poor show, seek professional help and have your website diagnosed by experts.