Takeaways from Google’s Webspam Report 2019
Earlier this month, Google released its annual Webspam Report for 2019 on the Official Google Webmaster Blog, discussing its fight against webspam. Webspam refers to webpages that try to manipulate Google’s algorithm or use techniques that deceive users.
As someone devoted to white-hat SEO, I personally look forward to this report every year. It is not only a good way to stay informed about what Google is doing to make search safer for users, but also a great way to keep your own website out of trouble.
Google Webspam 2019 Highlights:
- Google discovers 25 billion spammy and manipulative pages every day.
- According to Google, 99% of traffic from search results leads to a spam-free experience.
- Google further suppressed link spam; its systems caught 90% of link spam techniques.
- Manipulative link building techniques such as paid links and link exchanges have been made less effective.
- Hacked spam is still a challenge, and Google is looking for better ways to alert website owners that their sites have been hacked and to help them recover.
- Google is continuously improving its machine learning systems and combining them with its ‘manual enforcement capabilities’ to identify spam and keep it out of the search results.
- An increasing number of websites used auto-generated and scraped content, but Google reduced the impact of these sites on users by 60% compared to 2018.
- Google received 230,000 manual webspam reports from users and resolved 83% of them.
- Google sent out 90 million messages to website owners via Google Search Console; 4.3 million of these were about manual actions.
My Thoughts on this year’s report
Google has come a long way in improving its webspam-detection systems. It is catching and penalizing more and more websites, but unfortunately, a lot of people are still spamming. Google has a lot on its plate. 25 billion spammy webpages every day is mind-blowing, but it’s really great that those pages no longer reach users.
It is also obvious that Google has doubled down on link spam and user-generated spam, and it’s safe to say this is related to the new link attributes. For us SEOs, it gives us more reason to use the rel="sponsored" and rel="ugc" attributes to avoid getting penalized.
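As a quick illustration of those attributes (the URLs below are placeholders, not real links), marking up paid links and user-submitted links might look like this:

```html
<!-- A paid or affiliate link: disclose the relationship with rel="sponsored" -->
<a href="https://example.com/product" rel="sponsored">Partner product</a>

<!-- A link inside user-generated content, such as a blog comment -->
<a href="https://example.com/blog" rel="ugc">Commenter's site</a>

<!-- The values can be combined when more than one applies -->
<a href="https://example.com/deal" rel="ugc sponsored">User-posted paid link</a>
```

Using the attribute that matches how the link got on your page is a simple way to signal to Google that you are not trying to pass manipulative link equity.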
Website hacking is a persistent problem, and it is definitely a huge one. It is good news that it was more stable over the past year and that Google is finding more ways to detect hacked sites and alert website owners. But I think it is up to the website owners to help solve this problem. Investing in your website’s security is crucial; you should not leave it to Google to protect your website.
How can you help as an SEO?
As SEOs, let us be more responsible and answer Google’s call to help combat webspam. We have a better understanding of the world wide web, and we can easily spot webspam. I believe that what Google is doing is for the good of the internet, and I believe we play a huge role in the fight.
The 230,000 webspam reports from the previous year is a lower number than I expected. But that may also be a good sign, since it suggests Google is doing a great job of preventing webspam from appearing in search results in the first place.
If you find pages in the search results that are spammy, deceptive, or manipulative, you can use the Report Webspam tool in Google Search Console to report them. There is also a Chrome extension, available from the Chrome Web Store, that lets you report webspam easily. Make sure to use these tools responsibly and only when necessary.