Google Algorithm Update History, Explained (Infographic)


Over time, Google has consistently changed and updated its algorithm, intending to provide faster, more accurate, and more relevant search results to its users.

But what is the Google algorithm? And why is it important to keep up with algorithm changes for your website?

This post explains what Google algorithms are, why Google continuously updates and changes its algorithms, and what you can do to keep your website up to date with all Google’s updates.

What is the Google algorithm?

To better understand how Google’s algorithms work, let’s look at the definition of an algorithm first.

Essentially, an algorithm is a specific set of directions or processes to follow to solve a problem or accomplish a task.

One practical example of an algorithm is a recipe. When preparing a dish, you can’t mix the ingredients all at the same time.

To know which ingredient comes first, and which ones are added at a later time, the recipe has specific instructions on how the dish is prepared.

The same goes for Google’s algorithms. Google Search’s complex algorithmic system retrieves information from its search index, then assembles and delivers the best possible, most relevant results for your query in an instant.

For instance, when you type “coffee shops in Manila” or “chewy chocolate chip cookies” in the search bar, Google will provide millions of results for you to choose from. Now, how did Google come up with search results for you? How did Google find and choose which result to show you?

A Google algorithm will look for, rank, and return the most relevant pages for your search query.

To get a clear picture of how Google search works, watch this YouTube video from Google.

thumbnail screenshot of Google's YouTube video

Why Google keeps updating its algorithms

Since Google began, they have known the importance of providing better, more accurate results for search users. In the early years, Google updated its algorithms only a handful of times.

Further Reading: Unconfirmed Google Algorithm Updates: Should you be Worried About Them?

As time went on, updates became more frequent and often unannounced.

This is because Google wanted to provide higher quality, relevant search results and prevent users from manipulating the system.

Google algorithm updates

As mentioned, Google makes hundreds of algorithm updates and changes every year.

Some minor updates go unannounced, but the majority of core algorithm changes are rolled out in a way that webmasters and SEO experts notice significantly, especially in their site audits.

Here’s our list of noteworthy Google algorithm updates over the last decade, including a summary of what each update was for.

We have also included an infographic of Google’s algorithm updates and their history for your quick reference:

an infographic showing Google's algorithm updates


Vince

Launched in February 2009, the Vince update was named after one of Google’s engineers in recognition of his work on it.

It focuses on the trustworthiness of websites.

Big brands and government sites have largely benefited from this update as Google favors and ranks them first in SERP (Search Engine Results Page).

What this means is that bigger brands no longer needed to rely on the number or quality of their links; Vince let them achieve high rankings for the big terms and keywords related to their business.

While Google claims Vince wasn’t a major algorithm update, it impacted small and medium businesses.

Small businesses that had put effort into creating content with quality backlinks, and had ranked well for “big” keywords, saw a far steeper decline in rankings than their bigger counterparts.

This isn’t to say that small businesses stand no chance of outranking big brands.

The update stresses the importance of building your website as an authority on a product or service you specialize in.


Caffeine

Google initially released a preview of Caffeine in August 2009, but it wasn’t fully rolled out until June 2010.

This update was made in response to the growing amount of content on the web, including news, images, and videos.

It boosted Google’s speed, accuracy, and index size, delivering what Google claimed were 50% fresher web search results.

Google Caffeine analyzes the web in small portions, which lets Google update its search index continuously rather than in large batches.

So every time Google sees new content, it can add it to the index right away.

This consequently means that we become happy searchers because we are now able to find fresher content than ever before!

The old Google engine indexed web pages and stored them in layers, some of which were refreshed faster than others.

If you were in the upper layer, you got indexed more often by Google, probably because you updated content frequently and your site authority was higher.

To refresh a layer of the old index, Google had to analyze the entire web, which of course meant a significant delay between when Google found a new page or new content and when it became available in the SERP.
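To make the contrast concrete, here is a toy Python sketch (my own illustration, not Google’s actual code) of the core idea behind Caffeine: an inverted index that absorbs new pages one at a time, so a freshly crawled page is searchable immediately instead of waiting for a batch “layer” rebuild.

```python
from collections import defaultdict

class IncrementalIndex:
    """Toy inverted index that, like Caffeine, folds in new
    documents one at a time instead of rebuilding in batch layers."""

    def __init__(self):
        # term -> set of document ids containing that term
        self.postings = defaultdict(set)

    def add_document(self, doc_id, text):
        # As soon as a new page is crawled, add its terms to the index.
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, term):
        return sorted(self.postings.get(term.lower(), set()))

index = IncrementalIndex()
index.add_document("page1", "chewy chocolate chip cookies")
index.add_document("page2", "coffee shops in Manila")

# A page indexed seconds ago is already searchable:
index.add_document("page3", "fresh coffee roasters")
print(index.search("coffee"))  # -> ['page2', 'page3']
```

The point of the sketch is the `add_document` call: nothing else has to be recomputed for new content to show up in results, which is exactly the freshness win Caffeine delivered.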


Panda

In February 2011, Google launched Panda (a.k.a. Farmer) as its first core algorithm update.

It works as a search filter, weeding out websites with “content farm” business models and an excessive ad-to-content ratio.

A decade ago, there was a proliferation of “content farm” websites.

A “content farm” or “content mill” is a website or company that mass-produces tons of content, compromising quality as a result.

Remember the Caffeine update? When it dramatically increased Google’s ability to index content, it may have indexed “low-quality” content from “content mill” sites as well.

Hence, the Panda update.

It was developed by comparing the judgments of Google’s human quality raters against Google’s actual ranking signals.

The update was implemented to demote “low-quality” content and pages with excessive ad-to-content ratios.

It also weeds out websites with “content farm” or “content mill” business models trying to rank in the search engines.

While Google didn’t specifically make its actual ranking signals known to the public, it provided some guiding questions to assess the quality of the sites.

To learn more about how Google assesses a website’s overall quality (including the guide questions), see the Google Search Central Blog.

What Google actually wants publishers to do is to always keep in mind that people want real, useful, unique, and quality content.

Further Reading: Optimize SEO: Almost a Decade Into the Panda Update

Google has continued to integrate and roll out updates to the Panda algorithm. Moz tracked 28 Panda updates in total between 2011 and 2015.

In February 2016, Google confirmed Panda’s consolidation into its core ranking algorithm. Even so, the integration was meant to encourage publishers to create high-quality content, not to focus on quantity.


Venice

Before this update went online, the way to track your local search listings was through Google Places.

While this was a handy way of going through local search results, there were occasions where search results included local listings even though the search terms didn’t call for them.

This changed in February 2012, with the new Google Venice update.

Venice allows your search results to include local listings based on your IP address or even your physical location.

This lets you look for the closest businesses and establishments near your current location, which comes in handy when you’re traveling or choosing a place for a meeting or an event.

This has helped local SEO in a big way, as local businesses are now able to gain more traffic and visibility, which helps them compete with larger and more established businesses.

Google Venice also made using Google My Business much more important, as local businesses are able to track their local SEO and SEM through this effective free tool.

Small local brands and businesses have largely benefited from this update, as they can now rank for short-tail keywords.

At the same time, small businesses with local intent can compete with bigger brands using keywords with high search volumes.


Penguin

Google loves to reward high-quality sites and penalize “black-hat webspam” sites at the same time.

When Google rolled out Penguin in April 2012, its goal was to downrank websites that do keyword stuffing and buy spammy links.

This update amplified Panda’s effort to take down low-quality sites.

When Panda launched, some websites with low-quality content were penalized. However, many spammy sites still went unpunished.

In a desperate attempt to manipulate search results, some webmasters and SEO specialists turned to “black-hat” spamming techniques in the form of link building, even using it to downrank their competitors.

One unethical “black-hat” technique that surfaced around the time Penguin rolled out was negative SEO.

In essence, negative SEO lowers the search ranking of a competitor’s site by sabotaging it. This SEO Sabotage can come in the form of:

  • Sending fake backlink removal requests to the webmaster
  • Posting fake negative reviews of a website
  • Hacking a website
  • Buying low-quality or spammy links for a competitor’s website
  • Scraping content

While these negative SEO tactics can indeed be powerful SERP rank killers, they are unethical, and their practitioners aren’t doing the SEO industry any favors.

In addition, Penguin didn’t just hit websites that employed “black-hat” strategies; it hit over-optimized websites as well.

Over-optimization, put simply, is optimizing your website for search engines rather than for people.

So, over-optimizing your website doesn’t necessarily mean you’re employing black-hat strategies.

However, if your content doesn’t live up to your site’s level of SEO and is stuffed with too many SEO techniques, you can still get penalized.

As always, Google aims to provide an excellent experience and fulfill the in-depth, accurate information needs of its users.

It was reported that this update affected an estimated 3.1% of English searches.

Similar to Panda, there were a number of updates and refreshes to Penguin after its rollout. Penguin received its final update and became part of the core algorithm in September 2016.


Pirate

After Google was challenged by production companies and studios in the entertainment industry for failing to sanction or censure online copyright piracy, it released the DMCA (Digital Millennium Copyright Act) or Pirate update in August 2012.

Google acknowledges that copyright infringement is a serious issue, and so sites with pirated content like videos, music, and movies with several copyright violation notices began to appear lower in search results.

With this update, anyone can submit removal notices for websites with pirated content.

When Pirate was initially launched, Google received over 57 million web page “takedown” requests from copyright owners and their agents.

Google thoroughly evaluated each of these requests; once a request was deemed valid, the content in question, especially content that violated copyright guidelines, was taken down.

To further Google’s continuous effort in taking down new online piracy or copyright violators, they officially updated and refined their Pirate signal in 2014.

In addition, Google has also introduced a new ad format where they show legitimate content results to users looking to download free content.

Jumping forward to February 2022, Search Engine Land wrote about Google releasing a document to the US Copyright Office.

The document states that Google has developed a demotion signal for its search engine that causes offending sites to rank much lower in search results.

The document also states that “when a site gets demoted [by the Pirate signal], the traffic Google search sends it drops, on average, by 89%.”

You can read more about the document shared by Torrentfreak here.


Hummingbird

Hummingbird was not just a simple core update. In fact, it was the most complete overhaul of Google’s algorithm since 2001.

Matt Cutts explained that Hummingbird was a rewrite and not just a part of the core algorithm.

What Cutts essentially means is that Hummingbird does “a better job of matching the users’ queries with documents, especially for natural language queries, you know the queries get longer, they have more words in them and sometimes those words matter and sometimes they don’t.”

This move was intended to refine Google’s search results as users began using voice search.

What Google actually noticed is that users who do voice searches on mobile devices ask long conversational search queries.

With the old algorithm, Google would match a user’s typed search query with a webpage containing every word in that query.

But with Hummingbird, Google zeroed in on what a search query means instead of just matching keywords with pages.

Hummingbird uses semantic search to do the heavy lifting of understanding user query intent.

It was initially rolled out in August 2013 and affected nearly 90% of searches.

As far back as September 2013, I asked Bill Slawski of SEO by the Sea for his assessment of how Hummingbird differs from the old Google engine.

To read more about his approach to the topic, click here.


Pigeon

Launched in August 2014, the Pigeon update (named by Search Engine Land) was widely known as the most impactful local algorithm update since Venice in 2012.

It aims to strengthen the connection between the local and core algorithms. Google improved the accuracy of its local search results by applying more of its conventional site ranking signals to them.

There was a significant difference in search results and the user experience between Google search and Google Maps prior to the update.

After Pigeon was released, the overall appearance and search results of Google Search and Google Maps became more cohesive.

Another thing that stood out from this update is the local pack.

Pre-Pigeon, the local pack included 7 to 10 businesses, giving brick-and-mortar businesses more opportunity to be found, rank well, and show up on the first page of search results.

However, when Pigeon was launched, it was significantly reduced to a 3-pack.

Another Pigeon highlight worth noting is the role that E-A-T (Expertise, Authoritativeness, and Trustworthiness) plays for small and local businesses.

With this update, Google aimed to make local search behave as much as possible like traditional organic search.

Further Reading: How To Boost Your Author Reputation for Google’s E-A-T Standards

One interesting highlight prior to the Pigeon update was when customer-driven local directory and review sites like Yelp were penalized and outranked by websites that Google controls.

To strengthen its argument, Yelp leaked and published a presentation alleging Google’s mistreatment of the major review site.

Search Engine Journal highlighted some of the details about Yelp’s internal presentation here.

Google denied Yelp’s accusations, but the Pigeon update suggested otherwise.

Google began treating Yelp and other user-driven content and customer review sites more favorably by ranking them high in search results.


Mobilegeddon

The mobile-friendly update, or “Mobilegeddon,” was rolled out in April 2015 as Google aimed to reward mobile-friendly websites.

This update was Google’s response to changing consumer behavior, not just organic search, and adapting to that behavior was a great move by Google.

What is considered the most significant (and most commendable) change, however, is that Google announced this update ahead of time, contrary to its earliest rollouts.

They announced the update as early as February 2015 to give webmasters and publishers enough time to improve the mobile-friendliness of their webpages.

Because Google acknowledged the growing share of mobile users in many countries, it released in-depth information and guidelines about this update.

Some experts found this update to be nothing that would keep them up at night.

But having said that, Mobilegeddon has caused SEO experts to recognize that site speed and load times will become crucial ranking factors in the not too distant future.


RankBrain

When RankBrain was rolled out in October 2015, Google announced that it was their third most important ranking signal, alongside the two most important: content and links.

Rankbrain is Google’s machine learning algorithm that provides more relevant search results by helping Google process and understand what the users’ search query means.

For instance, say you typed “glamping site overlooking” in the search bar. Before RankBrain, Google would scan pages to determine whether they contained the exact keyword someone searched for.

With RankBrain, Google tries to figure out what your search query means. It matches brand-new keywords to keywords it has encountered before.

RankBrain was rolled out to solve one huge problem in understanding query intent:

15% of the search queries users typed had never been seen by Google before.

There was no way Google could tell the context of these queries, and no past analytics to signal whether a result was good or satisfied the user’s intent.

Hence, RankBrain’s introduction.

RankBrain displays search results that it thinks you’ll like.

A page may be given a ranking boost if many people respond well to it in the search results.
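As a rough illustration only (RankBrain’s real machine-learned models are far more sophisticated and unpublished), here is a Python sketch of the core idea: when a never-before-seen query arrives, interpret it via the most similar query already seen, here using simple word overlap as a stand-in for learned similarity.

```python
def similarity(query_a, query_b):
    """Jaccard overlap between the word sets of two queries
    (a crude stand-in for RankBrain's learned similarity)."""
    a, b = set(query_a.lower().split()), set(query_b.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def nearest_known_query(new_query, known_queries):
    """Map an unseen query to the closest previously seen query."""
    return max(known_queries, key=lambda q: similarity(new_query, q))

known = [
    "best coffee shops manila",
    "chewy chocolate chip cookie recipe",
]

# A brand-new query gets interpreted via its closest known neighbor:
print(nearest_known_query("coffee shops in manila", known))
# -> best coffee shops manila
```

In the real system the similarity is computed over learned vector representations of queries rather than raw word overlap, but the principle — borrowing intent signals from queries Google has already seen — is the same.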


Possum

Possum was one of the unconfirmed Google algorithm updates. However, when it launched in September 2016, the local SEO community noticed a change in local pack results.

According to Search Engine Land, Possum has particularly impacted ranking in the 3-pack and local results or Google Maps results. As a result, many businesses have seen a significant increase in their local ranking.


Fred

Another unconfirmed but major Google algorithm update, Fred was rolled out in March 2017. It was aimed at sites with low-quality content that focus on revenue instead of their users.

The majority of marketers noticed its impact on sites with aggressive ad placements, spammy content, negative user experiences, or shady black-hat tactics. Google also targeted other websites that violated its Webmaster Quality Guidelines.


Project Owl

Google launched Project Owl to target fake news, offensive or misleading content, and upsetting search suggestions. Launched in April 2017, this Google algorithm update aims to improve search quality by emphasizing authoritative content.

To address problematic content, Google utilized data from “quality raters” who responded to its feedback forms to better identify offensive and inaccurate results.


Medic

Another unannounced Google algorithm update with massive impact, Medic was rolled out in August 2018. It significantly affected health, wellness, and medical websites, along with other YMYL (Your Money or Your Life) pages.

A survey by Barry Schwartz found that Google penalized pages and websites in the medical, health, and fitness spaces that made medical claims or gave health advice without authority, expertise, and trust.

Google emphasized that there was no “fix” for pages or sites that were penalized, other than to continue creating high-value content.


BERT

BERT is another major Google algorithm update, this one focused on NLP (Natural Language Processing). BERT (Bidirectional Encoder Representations from Transformers) was launched in October 2019, and its focus is to better understand users who make longer or more conversational queries. It was rolled out internationally in December 2019.

Google announced that BERT is the “biggest change of the last five years”, one that would “impact one in ten searches.” It officially impacts featured snippets and 10% of search queries.


MUM

Similar to the BERT model, MUM (Multitask Unified Model) aims to handle complex tasks in user queries. Rolled out in May 2021, MUM lets you get comprehensive search results because it can read, understand, and learn across languages.

When you search, Google will provide results that aren’t limited to text and images, but also include video and audio relevant to your query. John Mueller said it best: “MUM can understand things with a fine-grained level of detail.”

Page Experience Update

Launched in June 2021, this update focuses on the user’s browsing experience. Google’s Page Experience report provides accurate data about your website, checking its mobile usability, security issues, HTTPS usage, ad experience, and Core Web Vitals.

With this update, Google rewards sites with better UX/UI design with higher positions in the SERPs (search engine results pages).
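If you want to check these signals programmatically, Google’s PageSpeed Insights API exposes Core Web Vitals data. The sketch below uses the real v5 endpoint, but the response fields shown are from memory and should be verified against Google’s API documentation; the sample payload is invented for illustration.

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile"):
    """Build a PageSpeed Insights v5 request URL for a page."""
    return f"{PSI_ENDPOINT}?{urlencode({'url': page_url, 'strategy': strategy})}"

def extract_lcp_ms(response_json):
    """Pull the Largest Contentful Paint percentile (in ms) from the
    field-data section of a PSI response, if present."""
    metrics = response_json.get("loadingExperience", {}).get("metrics", {})
    lcp = metrics.get("LARGEST_CONTENTFUL_PAINT_MS")
    return lcp["percentile"] if lcp else None

# Example with a trimmed-down, made-up response payload:
sample = {
    "loadingExperience": {
        "metrics": {"LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2400}}
    }
}
print(build_psi_url("https://example.com"))
print(extract_lcp_ms(sample))  # -> 2400
```

Fetching the URL returned by `build_psi_url` (with any HTTP client) and passing the parsed JSON to `extract_lcp_ms` would give you one of the Core Web Vitals this update rewards.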

Key takeaway

With Google’s constant algorithm updates, it’s a challenge to keep up with them all. There are more minor and unannounced updates that I have not included in this list. Instead, I’ve listed the algorithm updates and changes that have impacted us and our clients since 2010.

You don’t have to keep abreast of every algorithm update Google has. This post is meant to make you aware of how the Google search system is changing, and what you can do to stay in the loop. 

By doing this, you will know how to optimize your website and pages. At the same time, you will have a better understanding of how to decide on the most effective strategies for your business.

Sean Si

About Sean

Sean Si is a Filipino motivational speaker and a Leadership Speaker in the Philippines. He is the head honcho and editor-in-chief of SEO Hacker. He does SEO Services for companies in the Philippines and abroad. Connect with him on Facebook, LinkedIn, or Twitter. Check out his new project, Aquascape Philippines.