Basic and Advanced SEO Tutorials and News – SEO Hacker Blog
https://seo-hacker.com – SEO Hacker is an SEO Services Company and SEO Blog in the Philippines

10 Crucial SEO Mistakes You Might Be Doing
https://seo-hacker.com/10-crucial-seo-mistakes/ – Tue, 18 Jun 2019

Cover Photo - 10 Crucial SEO Mistakes You Might Be doing

With hundreds of ranking factors being used by Google to identify websites that deserve to be on the first page of the search results, we SEOs are going crazy with the number of things to think about and tasks to do.

SEO is competitive. Once you reach the top page, you can't afford mistakes that gravely affect your website. There is no perfect SEO campaign, but these are SEO mistakes you should watch out for.

Unoptimized Page Titles and Meta Descriptions

Page titles remain one of the most important on-page SEO factors. They matter to both search engines and users. I've seen a lot of websites use only their brand name as the page title of their homepage, and that alone causes them to miss a lot of ranking opportunities.

Page titles alone can increase organic rankings by a huge margin. This is one of the small things you can A/B test to see what works for you.

On the other hand, there were a lot of violent reactions to Google's SEO Mythbusters episode, where Google Webmaster Trends Analyst Martin Splitt said that meta descriptions are one of the most important things for SEO. While there has been a lot of criticism of this statement, I actually agree with him.

Yes, meta descriptions are no longer a direct ranking factor, but just like page titles, they are among the first things users see before entering your website.

Even if you produce high-quality content regularly, if you write poor titles and descriptions, searchers will click on your website less, and those clicks will go to your competitors.
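If your pages are in a spreadsheet or crawl export, a quick script can flag obvious title and description problems. This is a minimal sketch, not an official tool; the 60/155 character limits are common rules of thumb (Google actually truncates by pixel width), and the `pages` structure is a made-up example.

```python
# Rough audit of page titles and meta descriptions against commonly
# cited length limits (~60 chars for titles, ~155 for descriptions).
# These cutoffs are approximations, not official Google limits.
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 155

def audit_snippets(pages):
    """Flag pages whose title or meta description is missing or likely
    to be truncated. `pages` is a list of dicts from a crawl export."""
    issues = []
    for page in pages:
        title = (page.get("title") or "").strip()
        desc = (page.get("description") or "").strip()
        if not title:
            issues.append((page["url"], "missing title"))
        elif len(title) > TITLE_LIMIT:
            issues.append((page["url"], "title may be truncated"))
        if not desc:
            issues.append((page["url"], "missing meta description"))
        elif len(desc) > DESCRIPTION_LIMIT:
            issues.append((page["url"], "description may be truncated"))
    return issues

pages = [
    {"url": "/", "title": "Acme", "description": ""},
    {"url": "/blog", "title": "A" * 70, "description": "Short and sweet."},
]
print(audit_snippets(pages))
```

Running this over a full crawl gives you a shortlist of pages to rewrite before worrying about anything fancier.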

Not Doing a Regular Audit of Indexed Pages

An SEO Audit is something all SEOs should do regularly. While there are a lot of SEO Audit checklists that can be downloaded around the internet, the most important audit should be in your Google Search Console data.

The Coverage report will show you all pages being indexed by Google. If you’re seeing an absurd number of pages being indexed by Google, then you might need to de-index some pages to save crawl budget. At the same time, there might be some important pages not being indexed by Google because of a crawl error.

Fixing errors detected in Search Console is also a must. These could be indexing problems that negatively affect your rankings.

Not Scouting Your Competitors

Competitor research should be a part of any SEO campaign. If you're not doing this, you're missing out on a lot.

Scouting your competitors will give you an idea of what type of content they are publishing and what keywords they are targeting. Use this data to gather keyword suggestions, create better content, and dominate them in the search results.

You also miss out on link building opportunities if you're not spying on your competitors. Checking their backlinks and reaching out to those websites is a great way of gathering links for yourself.

Not Optimizing for Long-Tail Keywords

Long-tail keywords are often overlooked because they bring in less traffic, but they are great opportunities. Long-tail keywords usually have low difficulty, so they are easier to rank for and can increase a website's SEO value. At the same time, they have a higher chance of converting because they cater to searchers with specific needs.
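A crude but useful first pass at finding long-tail candidates in a keyword list is filtering by word count, since longer queries tend to be more specific. This is only a sketch with an invented keyword list; real long-tail research should also look at search volume and difficulty.

```python
def long_tail(keywords, min_words=4):
    """Filter a keyword list down to long-tail candidates, using word
    count as a rough proxy for how specific the query is."""
    return [kw for kw in keywords if len(kw.split()) >= min_words]

keywords = [
    "seo",
    "seo services",
    "affordable seo services for small business",
    "how to do an seo audit",
]
print(long_tail(keywords))
```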

Not Matching Content with Search Intent

Yes, content is king. We’ve heard it time and time again. But that doesn’t mean you just put out any content that you like.

Search intent, in my opinion, is one of the most important things SEOs should think about when producing content. Google always sides with the user, and the most important thing to them is serving the right information to searchers.

There are 4 types of search intent:

  • Informational
  • Navigational
  • Transactional
  • Commercial

Put yourself in your audience's shoes and look at the list of keywords you have gathered. Think about what they have in mind when they search these terms. Or better yet, search the terms on Google yourself and see what results you're served.

Building Links from Irrelevant Websites

You could build hundreds or thousands of links, but if they are totally irrelevant to your website's niche, Google will ignore them and your link building campaign will fail.

Sure, it is tempting to get links from a high authority website but link relevance is just as important as link authority.

Not Optimizing for Mobile

With the recent announcement that Google will now crawl websites using mobile-first indexing by default, there is no reason not to optimize your website for mobile. With more and more users browsing the web on mobile, Google will always favor websites that load faster and perform better on mobile than those that don't.

To test how your website performs on mobile, you can use Google's Mobile-Friendly Test or PageSpeed Insights. There are also other free testing tools like GTmetrix. These tools will give you valuable information that you can use to optimize your website's speed.

One of the easiest things you can do if you're using WordPress is install a free plugin to create AMP pages for your website. Once Google indexes your AMP pages, users will most likely land on them and see a lightweight version of your website.

Not Optimizing Anchor Texts

Anchor texts are important for both inbound and outbound links. Using tools like Ahrefs, you can see the top anchors for your website. Having 5,000 links means nothing if your top anchor texts are terms that are totally irrelevant to your website.

The same goes for internal links. Don't link to other pages of your website just for the sake of linking and inserting your keywords. Make it look natural and don't force it.
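If you export your backlinks from a tool like Ahrefs, tallying the anchor texts takes only a few lines. This is a hedged sketch with made-up data, not a specific tool's export format; the idea is simply to surface your most common anchors so you can spot irrelevant or over-optimized terms.

```python
from collections import Counter

def top_anchors(backlinks, n=5):
    """Return the most common anchor texts in a backlink export.
    `backlinks` is a list of (anchor_text, source_url) pairs."""
    counts = Counter(anchor.lower().strip() for anchor, _ in backlinks)
    return counts.most_common(n)

backlinks = [
    ("seo tips", "https://example.com/a"),
    ("SEO tips", "https://example.org/b"),
    ("click here", "https://example.net/c"),
]
print(top_anchors(backlinks))
```

If the top of this list is dominated by terms unrelated to your niche, that's the anchor-text problem described above.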

Keyword Cannibalization

Keyword cannibalization happens when multiple pages of a website compete for a single keyword. Not only is this a waste of effort, it can also be detrimental to your rankings.

This problem often goes unnoticed and has caused headaches across the SEO community. If search engines see two blog posts eligible to rank for a single keyword, it confuses them, resulting in both articles ranking low.

Keyword cannibalization has a lot of grey areas and you should be careful before you decide to scrap an article or combine articles together.

To spot keyword cannibalization using Google Search Console, go to the Performance report and filter by a keyword you are targeting under Query; it will show you all pages that have impressions for that keyword. You could also use Ahrefs to diagnose keyword cannibalization.
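The same check can be automated over an exported Search Console report. Below is a minimal sketch (the row format and data are invented for illustration): group rows by query and flag queries where two or more pages are getting impressions.

```python
from collections import defaultdict

def find_cannibalization(rows, min_pages=2):
    """Group Search Console rows (query, page, impressions) by query
    and report queries where several pages compete for the same term."""
    by_query = defaultdict(set)
    for query, page, impressions in rows:
        if impressions > 0:
            by_query[query].add(page)
    return {q: sorted(pages) for q, pages in by_query.items()
            if len(pages) >= min_pages}

rows = [
    ("seo audit", "/seo-audit-guide", 120),
    ("seo audit", "/what-is-an-seo-audit", 80),
    ("link building", "/link-building", 200),
]
print(find_cannibalization(rows))
```

Each flagged query is a candidate for consolidating or differentiating the competing pages, keeping in mind the grey areas mentioned above.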

Not Having an SSL Certificate

Have you ever visited a website and seen a message that your connection is not private? Or a website labeled "not secure" in the URL bar? Ugly, right?

A few years ago, Google made an announcement that HTTPS is a ranking factor. This is one of the basics when setting up a website and yet I see a lot of websites still running on HTTP.

This is something not only Google but also users are concerned about. Users will most likely not continue to your website, or will click away, when they see this warning. There are a lot of websites selling SSL certificates, but there are also those that offer them for free.

Key Takeaway

SEO mistakes happen. With hundreds of ranking factors and countless details, it's almost impossible to run a perfect SEO campaign.

But one of the things I love most about SEO is that no two campaigns are the same. It's a mix and match. Try to find what works for you and set goals to guide you every step of the way, so you make as few mistakes as possible.

Have you made an SEO mistake that affected your website badly? Share it in the comments down below!

Google – Yoast Team Proposes New API for WordPress Sitemap
https://seo-hacker.com/wordpress-sitemap-proposal/ – Thu, 13 Jun 2019

Cover Photo - Google - Yoast Team Up for Proposal on XML Sitemap WordPress Core Feature

Devs from Google and from Yoast have collaborated on a new project proposal that will automatically generate XML WordPress Sitemaps by default. The proposal for the integration of XML Sitemaps to the WordPress Core has been gaining mixed reactions across the SEO community and I have been observing each one of them.

Now, here’s my take on this new feature proposal. Since most of my clients use WordPress, it would be a beneficial move to stay in the loop of things and I think you should be updated as well. Here’s what you need to know about the XML Sitemap WordPress Core Feature proposal.

What are Sitemaps?

Before I get ahead of myself, let's backtrack with a brief section about sitemaps. A sitemap is a list of the web pages accessible to users. Your sitemap can help you separate good-quality pages from pages that are not worth indexing. Webmasters can use the sitemap to ping Google and signal that the pages on this list are more important than others. Sitemaps are also valuable for keeping your site relevant, because they can tell search engines how frequently you update your website.

You can see your sitemap index by appending /sitemap.xml to your homepage URL:

seo hacker sitemap

There is a common misconception that simply having a sitemap automatically counts as a win for a ranking factor. Google does not give out favors to those who have sitemaps just because they were diligent enough to create one and submit it to the search engine; a sitemap does not directly affect search rankings. However, it allows search engines to crawl your website better and more easily find your most important content.
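To make the format concrete, here is a minimal sketch of what a standard XML sitemap looks like and how its URLs can be extracted with Python's standard library. The sample document is invented; in practice you would fetch your site's real /sitemap.xml first.

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace defined by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the <loc> entries from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""
print(sitemap_urls(sample))
```

Comparing this URL list with your indexed pages in Search Console is a quick way to check that the content you care about is actually being surfaced to crawlers.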

The integration of XML Sitemaps to WordPress Core as a feature project

WordPress has long established itself as a great foundation for SEO because of its handy features. For one, you can customize metadata pretty easily on the platform. Now, with the proposal to integrate sitemaps into its core system, the platform would become an even stronger base for SEO efforts.

Sitemaps greatly supplement crawling because they improve site discoverability and accessibility. Search engines get a better idea of which URLs are relevant to your site and what their purpose is, thanks to the associated metadata.

WordPress does not generate XML sitemaps by default, which is why a team of developers from Google and Yoast proposes that WordPress Core include its own implementation of XML sitemaps. According to them and those who echo support for this initiative, there is a universal “need for this feature and a great potential to join forces.”

Google and WordPress came up with the proposed solution of integrating basic XML sitemaps in WordPress Core through the introduction of an XML Sitemaps API to automatically enable sitemaps by default.

The enablement of the XML sitemaps by default will make the following content types indexable:

    • Homepage
    • Posts Page
    • Core Post Types (Pages and Posts)
    • Core Taxonomies (Tags and Categories)
    • Custom Post Types
    • Custom Taxonomies
    • Users (Authors)

You can further digest this information clearly through this diagram from wordpress.org:

XML Sitemap Proposal WordPress Core

It is also important to note that the WordPress-generated robots.txt file will reference the sitemap index. The Sitemaps API aims to be extensible; according to the developers, these are the ways the XML sitemaps can be customized via the proposed API:

    • Add extra sitemaps and sitemap entries
    • Add extra attributes to sitemap entries
    • Provide a custom XML Stylesheet
    • Exclude specific post types from the sitemap
    • Exclude a specific post from the sitemap
    • Exclude a specific taxonomy from the sitemap
    • Exclude a specific term from the sitemap
    • Exclude a specific author from the sitemap
    • Exclude a specific author with a specific role from the sitemap

Once the proposal has been fully rolled out, it will cover most WordPress content types and help webmasters fulfill the minimum requirements for being indexed by search engines. However, there are some items the developers are not prioritizing for the initial integration:

  • Image sitemaps
  • Video sitemaps
  • News sitemaps
  • User-facing changes like UI controls in order to exclude individual pages from the sitemap
  • XML Sitemaps caching mechanisms
  • I18n

How can the API help you maximize your sitemap?

Sitemaps are an important factor, especially if you want to stay ahead of your competition in the SERPs. Pay special attention and care to them, because they can make the difference between a well-performing site and stagnant growth.

In addition, the proposal highlighted that the API could leverage the standard internationalization functionality provided by WordPress Core, which would help sites compete with localized content. The sitemap feature is a strong endorsement of web development best practices for SEO, and knowing Yoast, the API could meaningfully change how people optimize their sitemaps.

Although a sitemap is not a direct ranking factor, keep in mind that without one, your most important content may struggle to be discovered and ranked. This is why integrating XML sitemaps into WordPress by default is a step above the usual SEO practices. Additionally, the feature belongs in WordPress Core because sitemaps can benefit from core-level caching, which can in turn improve your site's speed and performance.

Key Takeaway

The team is still in the middle of crafting this API. Thierry Muller, Developer Relations Program Manager at Google and former Engineering Program Director at WordPress, closed the proposal with these parting words:

Your thoughts on this proposal would be greatly valued. Please share your feedback, questions or interest in collaboration by commenting on this post.

There is a wide audience of webmasters interested in working on the project, as evidenced by the comment section on the make.wordpress.org post and the buzz in the Twitter community, but we cannot properly evaluate the proposal until it is fully integrated into WordPress.

It would be nifty to have sitemaps enabled by default, but you also have to consider the cons, one of which is that this API might clash with a plugin you have already installed to generate a sitemap. The question here is: "Would this API override those plugin features once it is fully implemented in WordPress?"

Since this is still in the works, let’s just stay tuned for more updates.

A Week Into Google’s June 2019 Broad Core Update
https://seo-hacker.com/google-2019-broad-core-update/ – Tue, 11 Jun 2019

A Week Into Google’s June 2019 Broad Core Update

It has been a week since Google rolled out their pre-announced June 2019 broad core update. Leading SEO news websites such as Search Engine Journal and many others have already published articles regarding some wins and losses. This is one of the rare instances where Google has let us know ahead of time that they’re going to be updating their algorithm instead of just releasing it and letting us SEOs find out the hard way.

They officially rolled out the update last June 4, and it's been exactly a week since. I wanted to share some findings and help you improve your understanding of the sudden algorithm changes that happen throughout the year.

Effects of the June 2019 Broad Core Update

I started checking the effects of the broad core update yesterday, since it's safe to assume the rollout completed a couple of days after its official release. Here's what I found:

Analytics Client Data

 

Analytics Client Data graph

 

Analytics big Client Data

The images above show the traffic graphs for three of our biggest and oldest clients and, as you can see, there are no notable changes to their traffic. It's consistent and steady, and the trend hasn't changed. One of them experienced a very small drop in traffic on the day the broad core algorithm rolled out, but it can mostly be disregarded since traffic is steadily climbing back up.

This is different from what Danny Sullivan, Google's resident Search Liaison, said in his tweet:

Danny Sullivan Tweet Screenshot

He explicitly stated that core updates are definitely noticeable. But maybe we’re one of the lucky ones that did not get affected by the broad core algorithm.

However, I can't say the same for others. CCN, one of the leading cryptocurrency news websites, was massively affected by the broad core update, leading them to shut down because the loss of revenue was too great to support their team. This is what they said:

“Google’s June 2019 Core Update rolled out on June 3rd, 2019 and CCN’s traffic from Google searches dropped more than 71% on mobile overnight.

Our daily revenue is down by more than 90%.”

This is the most extreme outcome we can experience: having to close down a profitable website because of a single Google algorithm update. You can attribute it to a variety of factors, but only Google can really answer what caused the traffic loss.

SEO Tools are not Perfect

Although three of our biggest clients were not majorly affected, one of our oldest partners showed an alarming drop when we checked. Here's a screenshot:

The date our traffic dropped coincides with the release date of the broad core update, so it was alarming for us. We ran diagnostics on the website and checked whether there was an attack or anything that violated Google's guidelines. So far, we haven't found anything that might have caused the drop. I decided to check other tools to confirm whether the drop was real. Here are some screenshots:

Google Search Console Traffic Data

The red arrow points to when the update was rolled out and, as you can see, no drop happened there. I needed to be sure, so I checked another tool:

Ahrefs Traffic Data

After finding this out, I was sure there wasn't anything wrong with our client's traffic. We just needed to fix something in the implementation of our Google Analytics tracking code.

Never rely on just one tool. It's always best for us SEOs to have a variety of tools at our disposal, because no single tool is perfectly accurate. Having several tools to cross-check your site's data is the safest way to conduct site checks. If you rely heavily on a single tool, you might be looking at inaccurate or downright wrong data, which you'll then show to your client in your regular reports.

Key Takeaway

SEOs always have to be prepared to adapt and change strategies, since Google updates its algorithm regularly. To do that, we need accurate data and effective tools for our campaigns. Google's June broad core update is just one among the many algorithm updates they've released, and I'm sure it won't be the last.

Lastly, if you know that whatever you’re doing is in line with Google’s guidelines and all the strategies that you do are white-hat, you don’t have to worry about Google’s regular updates. Although we can never predict what Google will do next, it’s better to be safe than sorry. So far, the June Broad Core Update has had minimal effects on us and our clientele. What about you? Tell me about it in the comments below!

8 Reasons Why Your Link Building Campaign is Not Working
https://seo-hacker.com/8-reasons-link-building-campaign-working/ – Thu, 06 Jun 2019

Link building continues to be one of the most important SEO strategies. There is still a strong correlation between the number of links a site has and high organic rankings, which is why no SEO strategy is complete without link building.

Search engines, especially Google, have made link building a lot more difficult. This makes link building campaigns a lot more complicated. It requires careful planning and takes a lot of time and patience to execute.

It can be quite frustrating to run a link building campaign for months and see no increase in your website's traffic or rankings. One of the following might be the reason your link building efforts are not working.

Your Links are from Irrelevant Websites

Just like with content, relevance is important for any link building campaign. If your website's niche is business and finance, it won't make sense to search engines if you get links from a medical website.

Link relevance is just as important as authority. Both of these are strong link-related ranking factors.

If you find irrelevant links in your Search Console link report, don't disavow them just yet! If Google thinks a link is irrelevant, it is smart enough to devalue it. So unless you receive a manual action, don't disavow immediately.

If you have guest blogging on the list of your strategies, plan it out and don’t just send emails to random webmasters. Create a list of websites in your niche that are not direct competitors where you could contribute articles that will provide value.

You’re Not Getting Referral Traffic

If you’re building links for the sake of just getting a backlink, then you’re stuck in the old ways and you need to catch up. Link building today goes hand in hand with content marketing. You create great content to get people to go to your website and read your content.

Before you build a link on a website, ask yourself: "Would people click on my link?" The more likely a person is to click on your link, the more the link is valued, regardless of whether it's nofollow or dofollow.

If you’re doing a link building campaign, check your Google Analytics account and go to your Acquisition report. Monitor and identify the websites you are getting clicks from to better plan your strategy.

Imbalance Between Links Built and Links Earned

Do you want to know the best way to build links? Earn them. The reality is, Google frowns on link building because it can be manipulative, which is why they have taken steps to make it more difficult. If all of your links are built and none of them are organic, you have a bad link profile.

While a lot of common link building strategies still work, to Google, nothing beats links acquired through great content. That is why the most successful link building campaigns involve great blogs.

There is no doubt that to Google, content is king. Building links to your website is good, but you need to strike a balance. Create good content that people will love to read and link to, and links will come without forcing it.

Not Enough Referring Domains

A backlink profile with thousands of referring pages might look great, but what matters more to Google is the number of referring domains. Even if you get 10 links from contributing 10 guest articles to a single website, it will only count as one referring domain.

This was abused in the past, which is why link directories and link farms became a business in the 2000s: you placed a link on a website, and that website scattered your link across different internal pages, giving you hundreds of referring pages from just one website.

You can use Google Search Console's link report to check whether you have a good ratio between incoming links and linking websites/referring domains.
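Collapsing a backlink export into unique referring domains is a simple transformation. Here is a sketch with invented URLs; it treats `www.` and bare hostnames as the same domain, which is a simplification (it does not handle multi-level subdomains or public-suffix rules).

```python
from urllib.parse import urlsplit

def referring_domains(backlink_urls):
    """Collapse a list of linking page URLs into unique referring
    domains, ignoring a leading 'www.'."""
    return {urlsplit(url).netloc.removeprefix("www.") for url in backlink_urls}

links = [
    "https://www.blog-a.com/post-1",
    "https://blog-a.com/post-2",
    "https://blog-b.net/roundup",
]
domains = referring_domains(links)
print(len(links), "links from", len(domains), "domains")
```

A large gap between the two numbers is the warning sign described above: many pages, few domains.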

If you're building links, make sure you don't keep getting links from the same websites just because it's easier. Focus on building relationships, connect with different webmasters, and refrain from buying links from link farms or PBNs, because they just don't work anymore!

Slow Link Velocity

Acquiring a link in a day is one thing; continuing to acquire links over time is another. Link velocity describes how slowly or quickly your website acquires backlinks. If you're losing more links than you're gaining, it might be a sign that people are losing trust in your website. At the same time, if you're gaining links too fast, it might signal to Google that you're doing something fishy.

You can use Ahrefs to see what the growth of your links looks like. Here's what a healthy link velocity looks like:

Make sure you build links as naturally as possible. You don't have to build a thousand links in one month just to relax the next. Never sacrifice quality for quantity: it is better to build 5 to 10 high-quality links in a month than a thousand random ones.
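Link velocity is just the month-over-month change in your link counts. This sketch assumes you have cumulative referring-domain totals (the numbers here are invented) pulled from a tool like Ahrefs:

```python
def link_velocity(monthly_totals):
    """Net change in referring domains per month, given a list of
    cumulative monthly totals."""
    return [b - a for a, b in zip(monthly_totals, monthly_totals[1:])]

totals = [120, 128, 135, 240, 244]  # cumulative referring domains per month
velocity = link_velocity(totals)
print(velocity)
```

A steady series of small positive numbers is the natural-looking growth described above; a sudden spike (here the jump of 105) or a run of negative values is worth investigating.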

Lost Backlinks

Losing backlinks can be caused by different things: the website closed for good, the page was deleted, or the webmaster decided to remove your links from a post. If you want to check for lost backlinks, you can use Ahrefs.

You can check either Lost Backlinks or Lost Referring Pages. I usually prefer the Lost Backlinks report because it specifically says which page linked to me before.

A loss is a loss, but you can still try to get those links back. If you lost a backlink from a blog post that cited you as a source, reach out to the webmaster and ask why your link was removed. It might be because your page was outdated or returned a 404 error. This can be an opportunity to update your content or fix errors, then ask webmasters to link back to you again.

Competitors Have Better Links

If you've been building links for months and still haven't overtaken the competitor in the top spot, they most likely have better, higher-quality links.

Take note of "higher-quality links". Even if you check Ahrefs or other backlink checker tools and see that you have more links than your competitor, quality will always count for more than quantity.

Do research on what your competition is doing. Competitive link building is one of the best ways to build links. Find out who is linking to your competitor and try to get them to link to you.

Other SEO Factors are Not Optimized

Link building is great, but it isn't everything. Google uses more than 200 ranking factors, and links are just one of them. No website becomes successful by relying on links alone, so don't give yourself too much of a headache.

There is always room for optimization. Aside from acquiring links on a regular basis, you should also focus on regularly producing high-quality content, making your website mobile-friendly, improving site speed, and many more.

Always remember that SEO should be holistic. Try to find a balance in everything. While you can't optimize every ranking factor perfectly, find what works for you.

Key Takeaway

Link building has evolved and it continues to evolve. In my opinion, links will continue to be relevant. It might not be as powerful as before but to rank high organically will always involve high-quality links.

A great link building campaign is one that follows Google's guidelines and goes hand in hand with other SEO strategies. If your link building campaign isn't working, step back, investigate, and re-strategize.

Google Announces Broad Core Algorithm: June 2019 Update
https://seo-hacker.com/google-june-core-update/ – Tue, 04 Jun 2019

Google Announces Broad Core Algorithm June 2019 Core Update

In a rare move, Google pre-announced a broad core algorithm update over the weekend via their Google SearchLiaison Twitter account. This is a welcome change for SEO specialists, who are usually kept in the dark about these matters, only seeing the effects once an update has hit them, for better or worse.

The second update of the year has intrigued the industry, since it was the first to be announced in advance by the search engine. As with any update, there are bound to be winners and losers. Let's take a look at the June 2019 Core Update and what it means for SEO.

June 2019 Core Update

Google announced the core algorithm update on June 2 EST. Google SearchLiaison paired it with a previous tweet they posted in October 2018. That tweet is an update about updates, leaving much for the reader to interpret. At heart, the search engine is telling us to stay calm and avoid making changes meant to "fix" the effects of their updates.

Google June 2019 Core Update

This was affirmed by Danny Sullivan, the face of the Search Liaison account, when he also linked to that particular tweet. As in many situations, this is a prime example of prevention being better than cure: good content can keep you from getting hit when updates roll out. Quality content is still your best bet to dominate the SERPs, so never lose sight of this.

danny sullivan

As of today, the June 2019 Core Update is showing no signs of alarm, with SERP volatility forecasts remaining relatively calm.

roll out update

Using Mozcast, SEMrush Sensor, and Accuranker's Grump, I saw no drastic changes. However, it would be best to keep monitoring your site's performance in the weeks to come.

mozcast

semrush sensor

grump accuranker

Relevance and Content for SEO

Staying relevant through quality content and cutting-edge site optimization will help you rise above algorithm updates. When you notice your site no longer ranking as well as it should, look at your site's relevance and put yourself in Google's perspective. View your content as Google sees it; the best way to do this is to digest the Search Quality Rater Guidelines. Sometimes it pays to go by the book, so you have more room to experiment without getting penalized.

The search algorithm is a fair-weather friend, and the best way to deal with its many moods is to peruse the Google Search Quality Rater Guidelines. Many SEOs neglect to read them before diving into the craft of search; frankly, everyone should. I know I did.

Users need content, no matter what trends are playing out on the digital field. Your SEO will flourish if you serve Google's purpose, which is to answer user queries. Be relevant; you won't help anyone with outdated content. This is possibly the best way to ride out algorithm updates. It may be an obvious reminder, but it's something many webmasters forget.

Be productive and avoid fixing things that do not need to be fixed. Keep optimizing for relevance. Compete with those who are ranking better through content and quality links.

Favicons in Search Bar

Although it is not a confirmed part of the algorithm update, you may notice that Google has taken it upon itself to include favicons in the search bar, at least for famous personalities and pop culture references. This is a recent development from the plain text previously displayed in the Google Chrome browser search bar when searching for a person or entity. Here's a sample screenshot:

Favicon

This is not reflected in any local listings, at least from our end. It could mean that Google is just overhauling its display interface. But does it signal that favicons could be a ranking factor in the future? Who knows? It may be a heads-up that Google will be more vigilant in monitoring authority in the SERPs, which leaves us thinking about how to establish authority in our particular niches.

stranger things

There has been no announcement about this yet; it is merely an observation, but it leaves room for discussion. Which leads me to ask: have you optimized your site's favicon recently?

What to Do Now?

According to Google, you should probably just calm down. Broad core updates do not generally mean you have to fix something on your site. Webmasters should keep an eye on their progress reports, tracking the effects of updates along the way. These updates teach us that investing in content goes a long way, because you can be practically invincible when Google starts sniffing out sites that don't pass a suitable quality rating.

Danny Sullivan Broad Core

However, it's still too early to prop your feet up, relieved that the update didn't affect you. Google updates are known to keep rolling out and taking effect for a week to ten days, so it is advisable to monitor your sites accordingly. Remember that this is a "broad" core update: it does not target a particular niche, but covers a wide scope beyond just the industry your site is in.

It's natural to want to stay on top of things and find a solution as soon as you see a problem, but you'd be surprised how much better you can handle an update's effects as a level-headed webmaster.

Key Takeaway

The June 2019 Core Update came with an early announcement from the search engine, and we should expect more of these. Now more than ever, this is a call to ramp up our SEO efforts and stay in the winner's circle. Be mindful of errors and low-quality content; this will help you keep your head in the game.

Relevance through quality content and authority will be the best equipment you can have to stay winning. Keep that in mind.
