Google Search Console Tutorial: Google Index

The tools we use for content writing and publishing, link building, or any other part of an SEO campaign are only as good as our understanding of them. If we do not understand a tool thoroughly, we end up using it at only the most basic level. One tool that rewards thorough understanding is Google Search Console and its Google Index feature.

With Google enforcing the move to HTTPS, we need to know the tools that can help us with the adjustment. That is why I have started an article series detailing how to use the main features of Google Search Console – in my opinion, one of the best tools available to SEOs. This is the third part of the series; the first part about Search Appearance and the second article about Search Traffic have already been published. Let’s get started.

Google Index

The Google Index section of Google Search Console can help SEOs in a variety of ways. Whether you are hunting down instances of index bloat, finding out if your CSS is blocked, or simply removing URLs, the Google Index section is definitely a gold mine.

When you’re struggling with Panda-related penalties, such as those caused by thin content or low organic traffic, Google Index provides the data that tells you how your content is performing in the SERPs (Search Engine Results Pages).

Index Status

Basically, the Index Status sub-section shows you a report on how many of your website’s URLs Google has indexed over a specific period of time, usually the past year.

The best use of the Index Status feature is to find out whether your website is experiencing index bloat. You can find out by comparing the data you receive from Index Status with the data from Google Analytics.

What you’ll do when comparing them is:

  • Look at the number of pages in the Index Status data
  • Match it with the number of landing pages that receive organic traffic in your Google Analytics data

If they do not match, it probably means that only a small number of your indexed pages are receiving organic traffic.
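To make the comparison concrete, here is a rough Python sketch. It assumes you have exported your indexed URLs (say, from a site: search or a crawl) and your Google Analytics organic landing pages as CSV files – the file and column names here are just placeholders for illustration, not anything Google provides out of the box.

    import csv

    # Hypothetical exports: indexed_urls.csv (from a site: search or crawl)
    # and ga_landing_pages.csv (from a Google Analytics organic-traffic report).
    def load_urls(path, column):
        with open(path, newline="") as f:
            return {row[column].strip().rstrip("/") for row in csv.DictReader(f)}

    indexed = load_urls("indexed_urls.csv", "url")
    landing = load_urls("ga_landing_pages.csv", "landing_page")

    # Indexed pages that never receive organic traffic are your index bloat candidates.
    bloat = indexed - landing
    print(f"Indexed: {len(indexed)} | Receiving organic traffic: {len(landing)}")
    print(f"Possible index bloat: {len(bloat)} pages")
    for url in sorted(bloat):
        print(url)

If the two numbers are far apart, the printed URLs are the first pages to inspect.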

To identify index bloat, here’s what you should do:

  • Go to Google Index → Index Status
  • Head to Google and perform a site:[website URL] search
  • Inspect each page shown in the search results to detect patterns in the pages’ parameters
  • Find out if there are indexed pages that should not be indexed
  • Add the noindex tag to those pages (see the sketch below)
  • Disallow them in the robots.txt (also shown below)
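For the last two steps, here is what the tags look like in practice – the /tag/ path is purely hypothetical. One caveat worth noting: give Googlebot time to recrawl the pages and see the noindex tag before you disallow them, because a page blocked by robots.txt can no longer be crawled for its meta tags.

    <!-- Inside the <head> of each page that should not be indexed -->
    <meta name="robots" content="noindex">

    # robots.txt – added after Google has dropped the pages from its index
    User-agent: *
    Disallow: /tag/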

Blocked Resources

The Blocked Resources section lets you know what is blocked by your site’s robots.txt directives. Two of the most common items that get blocked unexpectedly are your site’s JavaScript and CSS. If, for example, you find out that your AJAX resources are blocked, it means that Googlebot cannot render the pages that depend on JavaScript, which can definitely affect your rankings.

Unblocking URLs is simple, and here’s how to do it:

  • Remove the URLs from your robots.txt’s disallow section (see the sketch after this list)
  • Use Google Search Console’s robots.txt Tester tool to see if your updated robots.txt file works
  • Inspect your pages to make sure that there are no instances of noindex or nofollow tags
  • Input the URLs into the Fetch as Google tool to see if they are being properly rendered
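As a quick sketch of the first step, suppose your robots.txt had a blanket rule catching your scripts and stylesheets – the /assets/ paths here are hypothetical:

    # Before: this blocks Googlebot from your JavaScript and CSS
    User-agent: *
    Disallow: /assets/

    # After: the blanket rule is gone, and the resources are explicitly allowed
    User-agent: *
    Allow: /assets/js/
    Allow: /assets/css/

Once the file is updated, the robots.txt Tester and Fetch as Google will confirm whether Googlebot can now render the pages.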

Remove URLs

Using the Remove URLs feature of Google Search Console can be complicated, but it can be immensely useful for SEOs. It is also common for SEOs to find that a website they are handling has thin or duplicate content.

I’ve written an article about permanent solutions to duplicate or similar content, but you can use the Remove URLs feature to temporarily hide such pages from Google. Just add the URL to the tool in Google Search Console. The URL will be removed temporarily for about 90 days, and the processing takes a day or two.

The main thing I really like about this feature is that I can use it to organize my content before the next Panda update rolls in, and to clean up URLs that have case sensitivity issues.

So, before you input pages into Google Search Console, you should:

  • Add noindex meta tags to each and every page (see the sketch below)
  • Insert the rel=canonical tag on each page
  • Disallow them in the robots.txt file
  • Submit them to the Remove URLs feature
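As a rough sketch, here is what the first two items look like in a page’s head section – the example.com URLs are placeholders, with the lowercase URL standing in as the preferred version of a case-sensitive duplicate:

    <head>
      <!-- Keep this duplicate out of the index -->
      <meta name="robots" content="noindex">
      <!-- Point Google to the preferred, lowercase version of the URL -->
      <link rel="canonical" href="https://example.com/shoes">
    </head>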

Key Takeaway

I’ve said it before, and I’ll say it again: knowing and understanding Google Search Console is a must for all SEO professionals. It can help you with every aspect of a website – from content to the technical side, Google Search Console has it all.

This is the second to the last part of the article series, so read up and learn. If you have questions or suggestions, comment down below.


About Sean

Sean Si is a Filipino motivational speaker and a Leadership Speaker in the Philippines. He is the head honcho and editor-in-chief of SEO Hacker. He does SEO services for companies in the Philippines and abroad. Connect with him on Facebook, LinkedIn, or Twitter. Check out his new project, Aquascape Philippines.