I still can’t believe it’s already 2017 when it feels like only a few days ago that it was 2016. *Ba-dum-tss* okay, no more puns, I promise. Anyway, SEO is something that I’ve loved doing for a long time now. We’ve discussed many topics over the past couple of years and so, I’d like to take a step back and talk about something a little more basic: Mistakes.
I know what you’re thinking: Why the heck should we cover mistakes when we’re already so far ahead? My friends, that’s exactly why we should take a step back, breathe, admire the work that we’ve done so far, and look at the possible mistakes that you may be making. It will only take a couple of hours or so to check but believe me, this is so much better than ranking lower or even making Google mad at you!
I know, it’s hard to constantly have to update your website and make sure that it’s in top condition all the time! But nobody ever said that SEO was easy. In reality, SEO is easy to get into but actually putting it into practice is another story entirely. The beautiful thing about SEO is that while it does take a lot of hard work to stay at the top, when you do get there, I can assure you that it’s one of the most rewarding feelings in the world.
See? Thinking about it like that makes looking for mistakes a little less bothersome right? It’s all about your mindset, really. Let’s put what you’ve learned so far to use and make your website even better.
We’ll be covering minor SEO mistakes that you may be making and believe me, these minor mistakes could actually be doing you more harm than you think. Without further ado, let’s get started:
#1 That awkward moment when you don’t allow search engines to index your site through your CMS
Some of the most widely used content management systems in the world, such as Joomla and WordPress, have so much to offer that they can actually hinder you from growing your website! Many SEO plugins and built-in settings on these CMS platforms include an option that tells search engines to ignore your website entirely and not index it at all. That’s a funny idea but it won’t be so funny when it happens to you – even if it was by accident!
What you can do to avoid making this mistake is check your settings. In WordPress, go to Settings > Reading and find the option labeled “Discourage search engines from indexing this site” – MAKE SURE THE BOX ISN’T TICKED. Most of these platforms have an equivalent setting. Ticking the box will instruct Google not to index your site, which will negatively affect your rankings.
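If you want to double-check from the outside, you can look at your homepage’s HTML for a robots meta tag carrying “noindex” – that’s roughly what the WordPress setting injects. Here’s a minimal Python sketch (the tag format is an assumption based on typical WordPress output; adjust for your own platform):

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page carries a robots meta tag with 'noindex'."""
    for m in re.finditer(r"<meta[^>]+>", html, re.IGNORECASE):
        tag = m.group(0)
        # Only a tag whose name is "robots" AND whose content mentions
        # "noindex" tells search engines to skip the page.
        if re.search(r"name=['\"]robots['\"]", tag, re.IGNORECASE) and \
           re.search(r"content=['\"][^'\"]*noindex", tag, re.IGNORECASE):
            return True
    return False
```

Fetch your homepage (with your browser’s “view source” or any HTTP client), pass the HTML in, and if this returns True while the box is supposedly unticked, something else on your site is adding the tag.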
#2 You know the feeling when you don’t check your code in the validator? Yeah, that sucks.
One of the most important things that you have to keep in mind is that your website, along with everything on the internet, is made up of code. Code is essentially the digital counterpart to what we are made out of, DNA. Anyway, if you have a talented web development team then chances are that your code is good. That’s a good thing because the better your code is, the higher your website will rank in the SERPs. The reason for this is that cleaner code helps search engine crawlers scan and index your website more efficiently.
Whatever the reason or occasion, whenever a new project is given to you or your team, always make sure to double-check the code! It doesn’t take an expert to make sure that your code is doing well. What you can do is copy your website’s address or URL, put it into the W3C Markup Validation Service and have it check the page for you. Whether the results are good or bad, always present your findings to your web development expert so that they know if they’re doing things correctly or if there are some errors that they can fix.
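If you’d rather script the check than use the web form, the W3C’s Nu HTML Checker also has a JSON endpoint you can call. A small Python sketch for building the request URL (assuming the checker’s `doc` and `out=json` query parameters, which is how the public service at validator.w3.org/nu/ is commonly queried):

```python
from urllib.parse import urlencode

def w3c_check_url(page_url: str) -> str:
    """Build a request URL for the W3C Nu HTML Checker's JSON output.

    Fetching the returned URL (with any HTTP client) yields a JSON
    object whose "messages" list holds the errors and warnings found.
    """
    return "https://validator.w3.org/nu/?" + urlencode({"doc": page_url, "out": "json"})
```

For example, `w3c_check_url("https://example.com/")` gives you a URL you can fetch on a schedule and alert on whenever the “messages” list is non-empty.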
#3 When you forget to add “nofollow” to your outbound links. Ouch.
The majority, if not all, SEO specialists know and respect the fact that links are one of the most important ranking factors. What many don’t know is that forgetting to do one simple thing will let other websites take advantage of your link power – and that’s not good. That’s right, it’s the simple act of adding a “nofollow” tag to your outbound links.
Your main strategy should always be to drive quality backlinks to your website but remember to keep as much of your link power to yourself. Keep in mind that this isn’t being selfish, it’s simply you doing what you can to protect your website. It is never your obligation to let people take advantage of your links but it is a privilege you can give to people who actually do something positive for your website.
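In practice this is just one attribute on the anchor tag: `<a href="https://other-site.com" rel="nofollow">`. If you have a lot of existing content, a rough Python helper like the one below can tag outbound links in bulk. This is a sketch using a simple regex – it won’t handle every HTML quirk, and `mysite.com` stands in for your own domain:

```python
import re

def add_nofollow(html: str, own_domain: str) -> str:
    """Add rel="nofollow" to <a> tags that point away from own_domain.

    Internal links and anchors that already carry a rel attribute
    are left untouched.
    """
    def repl(m: "re.Match") -> str:
        tag = m.group(0)
        href = m.group(1)
        if own_domain in href or "rel=" in tag:
            return tag
        # Drop the closing '>' and append the nofollow attribute.
        return tag[:-1] + ' rel="nofollow">'

    return re.sub(r'<a\s+[^>]*href="([^"]+)"[^>]*>', repl, html)
```

For anything production-grade you’d want a real HTML parser, but for a quick audit of a handful of posts this gets the idea across.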
#4 When you leave your robots.txt open for crawling. Feelsbadman.
While there are other “minor” errors written here on our list, this one is easily the worst offender. Remember: NEVER LEAVE YOUR ROBOTS.TXT FILE WIDE OPEN. A wide-open file tells every crawler – well-behaved or not – that it’s free to crawl each and every corner of your site, and a robots.txt that carelessly reveals sensitive paths can even serve as a roadmap for attackers probing your website for a way in. Keep in mind too that robots.txt is only a polite request, not a security mechanism – truly private pages need real access controls as well.
It doesn’t matter if you’ve been an SEO specialist for 10 years or someone who started yesterday. Please, take the time to learn whatever you can, as much as you can, about taking care of your robots.txt file. If you check your robots.txt file and see this:
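A wide-open file looks something like this – a wildcard user-agent with an empty Disallow line, which permits everything:

```
User-agent: *
Disallow:
```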
That means your website is wide open and robots will be able to access each and every corner of it – yes, even sensitive areas such as the admin page and any pages exposing your customers’ private data! Simply put, making sure that the pages that should be locked (such as the examples listed above) stay locked will do you more good than you think. Make sure to allow robots only into the places where they should be, such as the areas that should be indexed.
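Once you’ve tightened your rules, you can sanity-check them locally with Python’s standard-library robots.txt parser before uploading anything. A quick sketch – the disallowed paths here are just examples, so substitute your own sensitive areas:

```python
from urllib.robotparser import RobotFileParser

# A locked-down robots.txt: private areas blocked, everything else crawlable.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /account/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Well-behaved crawlers will skip the private areas but still reach content:
print(rp.can_fetch("*", "https://example.com/admin/login"))   # False
print(rp.can_fetch("*", "https://example.com/blog/my-post"))  # True
```

This confirms the directives do what you intended before a single crawler ever reads them.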
#5 Ever disallow your website from being indexed in .htaccess? Yikes.
I know that you know what .htaccess is. It’s the Apache configuration file used to hold directives that allow or block access to a website’s directories. This seemingly basic tool is actually extremely crucial to your SEO process because it can be used to fine-tune how your website gets crawled and indexed and, by extension, help you earn better rankings in the SERPs.
The main problem with setting up your .htaccess correctly is that it really needs to be done by a professional. Maintaining your website’s .htaccess matters because it keeps letting bots into the places where they’re allowed; get it wrong and they will ignore your website and never crawl or index it. That can be an extremely terrifying thing to experience, especially if you’re trying to get your website to rank higher only to find out that this simple mistake screwed you over.
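As a concrete illustration, a hypothetical Apache mod_rewrite snippet like the one below – lurking anywhere in your .htaccess – would silently turn Google’s crawler away with a 403 Forbidden, so your pages never get crawled at all:

```
RewriteEngine On
# Returns 403 Forbidden to any client identifying as Googlebot -- disastrous for indexing
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule .* - [F]
```

If you spot anything matching on bot user agents and forbidding them, flag it to your developer before it costs you rankings.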
So there we have it. You see how simple these errors are? You could be doing them and truth be told I could be doing them too. The good thing about them, however, is that they are easily fixable as long as they haven’t done any lasting damage to your website. Just remember to keep checking what needs to be checked and you should be fine!
Have you ever had anything happen to your website because of a small mistake? I know I have. Let’s talk about them in the comments section. Who knows, your experience may actually help save someone’s website!