Google Versus Black Hat Since 1999

Black Hat SEO

The recent Google algorithm updates of Panda and Penguin were the latest interventions in a long history of combating unethical link building and spam content sites.

Webmaster’s Note: This is a Guest Post by Thomas Bagshaw

The early Google algorithm was a less sophisticated creature. Following the search engine’s birth in September 1998 and the rise of search keywords and pay-per-click advertising by the year 2000, a black hat SEO industry soon followed.

The following are brief examples of some of the black-hat tactics against which Google rolled out named algorithm updates over the last ten years:

Hidden Text

An early tactic that gained popularity with black-hat practitioners was the hiding of text or links – especially overused keywords – intended only to be picked up by Google’s bots and not visible to human readers.

Methods included concealing colored text against a background of identical color (usually white) or behind an image, using CSS to hide text, creating tiny hyperlinks (such as a single hyphen), or even setting the font size to zero.
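As an illustration, the inline-style tricks described above leave detectable fingerprints. The following is a minimal Python sketch of a hidden-text detector; the function name, default background colour, and regexes are illustrative assumptions, not anything Google itself is known to use:

```python
import re

def find_hidden_text(html: str, background: str = "#ffffff") -> list:
    """Flag inline styles that commonly indicate hidden text:
    text coloured the same as the page background, or a zero font size."""
    flagged = []
    for style in re.findall(r'style="([^"]*)"', html, re.IGNORECASE):
        # Tactic 1: font-size set to zero.
        if re.search(r'font-size\s*:\s*0', style, re.IGNORECASE):
            flagged.append(style)
            continue
        # Tactic 2: text colour identical to the background colour.
        m = re.search(r'(?:^|;)\s*color\s*:\s*([^;]+)', style, re.IGNORECASE)
        if m and m.group(1).strip().lower() == background.lower():
            flagged.append(style)
    return flagged
```

A real crawler would also need to resolve external stylesheets and inherited colours, but the principle is the same: compare the rendered text colour to what sits behind it.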

In January 2004, Google rolled out the ‘Austin’ update, which aimed to penalize the use of hidden text as a vehicle for keyword stuffing, not only in the body of page text (where keywords made up as much as 50 per cent in some instances) but also within meta tags.

The main casualties were sites weighted down with on-page keywords and those exchanging links with unrelated content sites. Some of the larger brand retailer sites and directories appeared to emerge unscathed.

Link Spam

Link spam, whereby keywords are inserted into a page of text completely unrelated to the subject of the page or the site itself, is of course a major and enduring plank of black-hat technique. The creation of large-scale link farms, and the consequent volume of spam backlinks pointing to suspect sites of questionable quality and ranking, have come to dominate Google’s increasing attention in recent years.

Between September and November 2005, Google released a series of three updates, known collectively as ‘Jagger’, which were mostly aimed at dealing with the growing problem of low-quality links and the use of paid links, reciprocal links and link farms in black-hat practice.

However, as seems to be the way with Google algorithm changes, site owners and webmasters were left confused, because the reasons why sites are penalized are not always clearly defined, universal, or uniform.

Observations included the removal of duplicate content from same-owner sites with identical subjects or themes, the disappearance of main revenue-earning keywords, and PageRank reduced to zero. Elsewhere, site owners decided they should remove affiliate page content supplied entirely by affiliate scheme vendors before making a ‘re-inclusion’ request with Google. The problem of link spam, however, has endured on the web to this day.

In February and March 2006, Google rolled out “Big Daddy” – a change to Google’s data center infrastructure containing new code to increase its capacity to evaluate and index web pages. By handling potential spam problems such as 302 redirects and canonical URLs more efficiently, it could target the abuse of ‘redirects’, which also involved ‘doorway pages’ and the black-hat practice of ‘cloaking’:

Redirects

This tactic displays ‘keyword-stuffed’ landing pages that quickly ‘redirect’ to the actual intended page. Such pages invariably contain no relevant content and exist for the sole purpose of gaining a high position in search engine results pages.

Generally set up in groups targeting similar or related keywords and phrases, the links on these pages connect to other pages in the same group, creating a set of false linking relationships. The redirect can be triggered by moving the mouse while on the redirect page, by a command, or even automatically.
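The automatic variant of this tactic classically relies on a near-zero meta refresh: the keyword-stuffed page is shown to the crawler, and the visitor is whisked off to the real destination. A hedged Python sketch of a detector follows; the regex, threshold, and function name are illustrative assumptions:

```python
import re

# A meta refresh with a near-zero delay is the classic sneaky-redirect
# fingerprint on an otherwise keyword-stuffed landing page.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']refresh["\'][^>]+'
    r'content=["\'](\d+)\s*;\s*url=([^"\']+)',
    re.IGNORECASE,
)

def detect_sneaky_redirect(html: str, max_delay: int = 2):
    """Return the redirect target if the page auto-redirects within
    max_delay seconds, otherwise None."""
    m = META_REFRESH.search(html)
    if m and int(m.group(1)) <= max_delay:
        return m.group(2)  # where the visitor is actually sent
    return None
```

A fuller check would also follow chains of 302 responses and JavaScript `location` assignments, which served the same purpose.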

Doorway Pages

Another popular black-hat tactic, in which site pages are created – sometimes by using software to generate ‘orphaned’ pages, i.e. pages not belonging to the site’s regular navigation – with most of their content duplicated from other pages on the site.
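The ‘orphaned’ quality described above is itself detectable: a doorway page exists on the server but is never linked from the site’s regular navigation. A minimal Python sketch, with hypothetical function names and a deliberately naive href regex, might look like this:

```python
import re

def linked_pages(nav_html: str) -> set:
    """Collect every internal href present in a page's markup."""
    return set(re.findall(r'href="(/[^"]*)"', nav_html))

def find_orphans(all_pages: set, nav_html: str) -> set:
    """Doorway pages are often 'orphaned': present on the server but
    absent from the site's regular navigation."""
    return all_pages - linked_pages(nav_html)
```

In practice `all_pages` would come from a sitemap or full crawl, and navigation links would be gathered across the whole site rather than one page, but the set difference is the core idea.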

Cloaking

A widely used black-hat tactic of creating web pages that display a completely different set of content to a human reader than they show to a search engine. The aim is to deceive the search engines into displaying the page. Inevitably, the ‘cloaked’ or concealed page is ‘stuffed’ with keywords for the purpose of obtaining a high ranking.
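Mechanically, cloaking is simply a server that branches on who is asking. The following is a deliberately simplified Python sketch of the tactic itself (the function name and page contents are invented for illustration; real cloakers also matched crawler IP ranges, not just the User-Agent header):

```python
def render_page(user_agent: str) -> str:
    """Cloaking in miniature: the response depends on who is asking.
    Crawlers get a keyword-stuffed page; humans get something else entirely."""
    if "Googlebot" in user_agent:
        # Served only to the crawler, never seen by a human visitor.
        return "<h1>cheap widgets cheap widgets best cheap widgets</h1>"
    # Served to ordinary browsers.
    return "<h1>Welcome!</h1><p>Totally unrelated content.</p>"
```

This is precisely why Google’s guidelines ask that crawlers be served the same content as users, and why Googlebot periodically fetches pages with a non-bot User-Agent to catch the discrepancy.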

The Big Daddy structural overhaul once again hit sites employing black-hat tactics, penalizing the unscrupulous use of another site’s content via 302 redirects and removing spam sites and link farms stuffed with purchased keywords and phrases.

Shift towards content quality…

Google now modifies its search algorithm over 500 times per year, and updates, which were infrequent before 2007, have become far more apparent. According to Google, updates now happen on average “more than once per day”.

From 2011 onwards, the Panda and then the Penguin updates have stepped up Google’s determination to deal with unnatural links and unrelated or ‘thin’ content.

In doing so, Google has signalled a crucial shift away from user-interface changes and user-experience improvements towards content search quality. The algorithm’s former reliance on indexing ‘meta keyword and description tags’ is now considered almost irrelevant compared to the quality of page content and its integral keywords.

The growth of social networking sites and the imperatives of fresh, credible and human-based content are the prevailing indicators of site page relevance and site authority.

While Google is still in the business of evaluating traffic to determine ranking, the simplistic mechanics of optimization have been recalibrated towards the more relational semantic indexing and social signals that govern today’s site authority and web presence.


Comments

  1. John Morinaga says

    Hi Thomas,

    Great insight into the history of Google and how they interpret sites, links and other “factors” that they used in the past to determine site relevancy. It truly amazes me that for so long, Google was fooled by these tactics and black hat SEO strategies. Although Google doesn’t have it all right as of yet, they have greatly done away with many black hat practices (which amazingly some SEO’s continue to use).

    I always knew that these practices would catch up to many black hat SEO’s. Remarkable content is now being rewarded to those who produce it and that makes me happy. Too many websites were making too much money from black hat means and methods. I am glad that your article highlights the importance of producing great content as a priority. Great content is what separates the good from the bad. Making it known that remarkable content is what is really rewarded on search engines makes the internet a better, more useful place.

    Thank you again!
    John Morinaga

  2. PH says

    SEO is clearly getting social. I don’t think the current SEO techniques will stand very long, and black hats will have a very complex task managing hundreds or thousands of fake Twitter, FB or G+ accounts to make websites rank.

  3. Kristie says

    It seems like quite a while ago that those tactics were so prevalent. Thanks to the actions implemented by Google, they are no longer pestering us.
