Last updated on July 27th, 2018 at 05:02 pm
A great, competent, and successful SEO campaign can only be achieved if the SEO professional has the skills to make it happen. Unfortunately, many of today’s SEOs have only a shallow understanding of the entire process of making a website rank.
That is why I started this article series: to help SEOs gain a deeper understanding of one of the most useful tools available to us, Google Search Console. Part 1, part 2, and part 3 are already available, and this will be the last part of the tutorial series. So read up, learn, and take your campaign to new heights. Let’s start.
The move to a bigger, better, and more mobile-friendly website is a must – now more than ever – for all SEO professionals. We can no longer skim through our website’s data; we have to attentively scrutinize every bit of information in order to fully understand the issues that affect our site.
The Crawl subsection can do a lot of things. From incorporating 301 redirects to viewing your website the same way Google does, the Crawl subsection will definitely help your SEO campaign.
Basically, the Crawl Errors feature shows you the broken pages on your website. It divides errors into two sections:
- Site errors
- URL errors
These two show you a compilation of your website’s errors, from HTTP response codes to common website errors – you can check them all here.
Site errors cover issues that affect your entire website, while URL errors are specific to individual pages crawled on desktop and mobile.
If you experience any of these errors, here are the steps you should take:
- DNS error: This means there is an issue with your site’s DNS or servers, so it is best to contact your hosting provider (GoDaddy, BlueHost, etc.).
- Server error: This usually happens when your server is overloaded, for example by too much traffic. To fix this, inspect your pages and find out whether there are connection or timeout issues.
- Soft 404 errors: This happens when a page that no longer exists returns a 200 OK instead of a 404 in its HTTP response header. Implement a 301 redirect if the page is no longer active, or check whether the page has duplicate or thin content.
- 404 error: One of the most common errors SEOs and webmasters experience, 404 errors usually happen whenever you unpublish or delete a page. It’s important to note that this does not directly affect your rankings; however, any backlinks pointing to the dead page lose their value. So, 301 redirect that page immediately to preserve your backlinks.
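To see how this triage works in practice, here is a minimal Python sketch of sorting crawled URLs by their HTTP status code. Note that `classify_crawl_result` is a hypothetical helper, not part of any Search Console API, and the 50-character soft-404 threshold is an arbitrary assumption for illustration.

```python
def classify_crawl_result(status_code, body_text=""):
    """Map an HTTP status code (and page body) to a rough crawl-error category."""
    if status_code >= 500:
        return "server error"        # overloaded host, connection, or timeout issue
    if status_code == 404:
        return "404"                 # page was deleted or unpublished
    if status_code == 200 and len(body_text.strip()) < 50:
        return "possible soft 404"   # 200 OK but little or no real content
    if 300 <= status_code < 400:
        return "redirect"            # e.g. a 301 already in place
    return "ok"
```

For example, a 503 response would be triaged as a server error, while a 200 response with an almost empty body would be flagged as a possible soft 404 worth checking by hand.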
The best way to keep your website healthy is to clean up all your crawl errors and make sure to avoid new ones in the long run.
The Crawl Stats feature may intimidate you at first, but expect it to give you a detailed report on your website’s crawl rate. Basically, it tells you how often Google crawls your site, and when.
Always remember that the faster Google crawls your site, the better it is for indexing. Additionally, if your site’s crawl rate continuously goes up, you’ll probably notice your rankings improving. Conversely, if you see the lines in the graph going down, it might be because of an issue on your site.
Crawl Stats has three sections:
- Pages crawled per day: This displays the crawl rate of both your good and bad pages. If you notice the line going down, check your website for issues or errors. If the line suddenly spikes up, it might be because of a recent addition to the site, or because you unblocked a certain section in your robots.txt file.
- Kilobytes downloaded per day
- Time spent downloading a page
Examine the graphs of the last two sections. If they are consistently high, it may be because your pages trigger too many HTTP requests – ideally no more than 20 per page. You can use the SEO Site Checkup tool to see how many HTTP requests each page makes.
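As a rough illustration of counting the requests a page triggers, the sketch below uses Python’s standard `html.parser` to tally tags that typically cause extra HTTP requests (images, scripts, stylesheets). This is a simplification: real pages also load fonts, AJAX calls, and assets referenced from CSS that this simple count misses.

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Count tags that typically trigger additional HTTP requests."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script", "iframe") and attrs.get("src"):
            self.count += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.count += 1

def count_page_requests(html):
    """Return a rough count of extra requests a page's HTML would trigger."""
    counter = RequestCounter()
    counter.feed(html)
    return counter.count
```

Run `count_page_requests` on a page’s HTML source; if the result is well above 20, that page is a good candidate for combining scripts and stylesheets or lazy-loading images.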
Fetch as Google
This might be the best feature of Google Search Console because it lets you know how Google sees your website, and also how Google renders it. The Fetch button displays the HTTP response that Google receives, while the Fetch and Render button lets you see how a browser would display your website.
Also, the Fetch as Google feature is great because it lets you find out whether there is hidden content that is dynamically generated.
Other notable uses of Fetch as Google are:
- Launch a new segment of your site
- Mobile design introduction
- Robots.txt file updates
- Rel=canonical tags implementation
- Updating old web pages
- Moving from HTTP to HTTPS – which Google now strongly encourages
The dread of most SEOs is a client who cannot understand why their website receives little to no traffic, and sometimes the cause is an improperly implemented robots.txt file. Fortunately, Google Search Console’s Robots.txt Tester tool lets us know whether Googlebot is blocked from the URLs of our website.
Finding out if there are blocked pages in your website is easy enough. Just follow these steps:
Go to the Robots.txt Tester → Input a main page’s URL in the field → Click the “Test” button
Immediately after, you will see whether the page is blocked: the word “BLOCKED” will appear highlighted in red.
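You can reproduce the same check offline with Python’s standard `urllib.robotparser` module; the robots.txt content and URLs below are made-up examples.

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt that blocks one directory for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Is Googlebot allowed to fetch these URLs?
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

This is handy for bulk-checking hundreds of URLs against your live robots.txt before you push a change, rather than testing them one by one in the console.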
Sitemaps can be hard to understand at times, but they are and always will be an integral part of Google Search Console. From the exclusion of tags to the removal of categories, much of what is included in the Sitemaps subsection can affect your website.
Always pay attention to everything you see in the Sitemaps reports, because they warn you about issues on your website.
They also give you valuable insights into your website. Obviously, these insights mostly concern errors; however, if you really investigate them, you’ll probably discover things about your website that you did not know before.
The best way to fix the errors shown in your Sitemaps report is to investigate the root cause of the issue. Even if your website has more than 90,000 pages spread across multiple sitemaps, a single miscased letter in one entry can break thousands of your site’s URLs.
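A quick way to hunt for that kind of miscased entry is to scan the sitemap yourself. The Python sketch below (`find_suspect_urls` is a made-up helper) flags URLs whose path contains uppercase letters, which commonly turn into 404s on case-sensitive servers.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Standard sitemap XML namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def find_suspect_urls(sitemap_xml):
    """Flag sitemap entries whose URL path contains uppercase letters."""
    root = ET.fromstring(sitemap_xml)
    suspects = []
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        path = urlparse(url).path
        if path != path.lower():
            suspects.append(url)
    return suspects
```

Feed it each sitemap file and review the flagged entries against the URLs your server actually serves; the same loop is easy to extend with other checks, such as non-canonical hosts or HTTP entries in an HTTPS sitemap.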
One of the most inconvenient things that an SEO could ever experience is duplicate content. It always comes up during the most unexpected moments – events, promotions, or new releases. And the best way to tackle this issue is through the URL Parameters feature of Google Search Console.
Basically, this tool lets you see all the parameters used on your site, and you can set guidelines for how Google treats them. Although canonical tags are a big help for duplicate or similar content, Google still has to crawl every duplicate URL, which wastes your crawl budget and can lead to index bloat.
The URL Parameter tool will help you take care of duplicate content while managing your crawl budget.
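Outside of Search Console, you can apply the same idea in your own crawl scripts by normalizing parameters before comparing URLs. The sketch below strips a made-up set of ignorable parameters; the actual list you tell Google to ignore would be specific to your site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that do not change page content on this site.
IGNORABLE = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonicalize(url):
    """Drop ignorable query parameters so duplicate URLs compare equal."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORABLE]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))
```

With this, `https://example.com/shoes?utm_source=nl&color=red` and `https://example.com/shoes?color=red` canonicalize to the same URL, so your own reports count them as one page instead of two.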
This is the last part of the article series, and by now you know how important Google Search Console is for SEO professionals. You can achieve things with it that would otherwise have taken a lot of time and a variety of tools, but with Google Search Console, they are all in one place.
Also, a set of screenshots has leaked on the internet showing how the new version of Google Search Console will look, along with some of the new features it includes. Based on the screenshots, the new features include:
- Better organization with a focus on the mobile platform
- New index coverage reports for better insights
- New AMP tool
- Better notification system
That wraps up the Google Search Console tutorial series. If you have questions, feel free to comment down below.