Major Google Algorithm Updates: A Brief History

Ryan Bednar

02 / 17 / 2022


Google is no stranger to change. In a given year, the search engine makes hundreds of changes to its algorithm, some of which go unnoticed and others of which spark outrage (or are met with fanfare). 

As a site owner, it’s imperative to know why and how Google’s algorithm has evolved over the years. To help you out, we’ve compiled this brief overview of the largest, most important algorithm updates of the last decade. Keep reading for a better understanding of how Google’s ranking capabilities have improved, and what that means for your SEO strategy.

Panda (launched 2011) 

Panda was one of the earliest initiatives to combat black hat SEO and to bring quality and user experience into focus. At the time of Panda’s launch, content farms had appeared in droves: site owners would commission dozens upon dozens of writers to produce a large volume of content, aimed at vaulting the site into the upper echelons of search results.

As reported by ReadWriteWeb:

“Demand Media [of eHow was] the epitome of a content farm and by far the largest example of one, pumping out 7,000 pieces of content per day… The company operates based on a simple formula: create a ton of niche, mostly uninspired content targeted to search engines, then make it viral through social software and make lots of money through ads.”

The Panda algorithm impacted an estimated 11.8% of queries, wreaking havoc among major players that had earned their rankings through frequently published, low-quality content. The algorithm has since been integrated into Google’s core algorithm, where it helps identify and reward trustworthy webpages.

Penguin (2012)

The Penguin update (initially known as the “webspam algorithm update”) was launched specifically to tackle manipulative link-building tactics. It came on the heels of Panda, which hadn’t fully stamped out webspam from search results. Sites that publish large volumes of programmatic content, in particular, need to make sure they’re showing users 100% unique content.

Previously, websites could elevate their search engine visibility by buying backlinks, ensuring that a large number of sites (regardless of quality) pointed to their content. Following the Penguin update, around 3% of search queries were immediately affected, and Google began cracking down on link schemes.

Today, Penguin is part of Google’s core algorithm. Backlinks remain a major ranking factor; however, you’re expected to earn links naturally through high-quality, shareable content. If Google finds that you’re building backlinks at an unnaturally fast pace, exchanging goods for links, or using automated programs to create links, then it’ll just as quickly put you in the dog house.

Hummingbird (2013)

Hummingbird is widely considered a complete overhaul of Google’s core algorithm. It introduced a new, more advanced way of interpreting search queries. As the name suggests, Hummingbird ensured that Google would work even faster and more accurately than before, thanks to a greater emphasis on semantic search and the Knowledge Graph.

The new algorithm sought to understand the implicit intent of a user’s query by considering the larger context of the words, not just their meaning in isolation. For example, if you search “garlic chicken wings” today, Google can infer that you’re looking for a recipe or nutrition facts, not information on a literal wing of a chicken.

Moreover, the Knowledge Graph, which was released a year prior to Hummingbird, became much more accurate with the help of semantic search. Hummingbird has also served as the vehicle for a slew of later developments that rely on natural language processing: voice search and local search, to name a few.

Pigeon (2014)

Pigeon is, by many accounts, the biggest update ever made to local search results. It democratized local SEO, allowing businesses with physical stores in a user’s area to compete alongside large retailers and brands.

Behind the scenes, Google tweaked hundreds of ranking signals for Google Maps and Search. While the update required some debugging after its release, Pigeon substantially improved local rankings, using distance, location, and relevance as key factors.

It’s safe to say that Pigeon pushed many brick-and-mortar stores to invest in a strong SEO strategy to increase both their online visibility and their physical foot traffic.

Mobilegeddon (2015)

By now, we’ve all heard that Google favors mobile-friendly sites. Perhaps the earliest indication of this was Mobilegeddon, which handsomely rewarded web pages that performed well on smaller screens.

While search results on desktop went unaffected, mobile results saw big changes. In the words of Google (2015): “Now searchers can more easily find high-quality and relevant results where text is readable without tapping or zooming, tap targets are spaced appropriately, and the page avoids unplayable content or horizontal scrolling.”

[Illustration of a mobile-friendly Google search result. Source: Google]


Mobilegeddon set the gold standard for page ranking. Today, any SEO audit or game plan must factor in mobile-friendliness.

RankBrain (2015)

RankBrain expanded the use of machine learning within the core algorithm. Rather than checking queries against fixed criteria, Google would now use a machine-learned interpretation model to determine the relevance of a search result.

More specifically, the algorithm considers factors such as user intent, location, and search history. If this sounds related to Hummingbird, it is: RankBrain expands on the logic Hummingbird introduced, namely that queries should not just be read literally, but treated as a whole and put into context.

But at the time, 15% of daily Google searches were brand new, offering no history for Google to draw on. RankBrain aimed to address this: using machine learning and data from various sources, it could extrapolate meaning and continue teaching itself over time.

This is how Google can tell that when you search for “Super Bowl,” you’re looking for the latest scores and not driving directions, or tickets, to the stadium. Or that when you search for “U.S. President,” you’re looking for information on the current president, not a past one.

Fred (2017)

Unofficially dubbed “Fred” by Google webmaster trends analyst Gary Illyes, this update was once again created to combat low-quality content.

It suppressed sites that seemed to care more about monetization (read: sites that bombarded visitors with ads) than user experience. Many of these sites also tended to have thin content and poor-quality backlinks.

Medic (2018)

The Medic update earned its name from having heavily impacted sites related to health, medicine, fitness, finance, law, and more. These sites are referred to as “Your Money or Your Life” (YMYL) sites because they can affect a person’s wellbeing.

With Medic, Google could better vet content for helpfulness, accuracy, and uniqueness. This brought E-A-T signals to the forefront: expertise, authoritativeness, and trustworthiness. Site owners needed to offer proof of credibility, consistent content, and other clear signs of authority to win the top spots on SERPs.

BERT (2019)

BERT is often considered the biggest change since RankBrain, which was released four years earlier. It complements RankBrain, using natural language processing (NLP) to further dissect search intent.

The acronym stands for Bidirectional Encoder Representations from Transformers, a fancy name for the AI-powered language model behind BERT, which analyzes the sentiment and context of a search. This tech also helps Google understand synonyms and misspellings.

Consider the query “traveling from U.S. to Europe.” Google takes into account the use of “to” and “from” and will only show information relevant to someone traveling to Europe from the U.S., not the other way around.
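To see why word order matters here, compare an order-blind “bag of words” view of the two directions of that query with one that keeps the sequence. This is a toy illustration of the underlying idea, not how BERT itself works:

```python
from collections import Counter

def bag_of_words(query):
    # Order-blind view: just count the words, roughly how
    # keyword-matching treated queries before context-aware models.
    return Counter(query.lower().split())

q1 = "traveling from U.S. to Europe"
q2 = "traveling from Europe to U.S."

# A bag-of-words view cannot tell the two directions apart...
print(bag_of_words(q1) == bag_of_words(q2))  # True

# ...while a representation that keeps word order trivially can,
# which is the kind of context BERT reads (in both directions).
print(q1.lower().split() == q2.lower().split())  # False
```

Both queries contain exactly the same words, so only a model that reads them in sequence can tell who is traveling where.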

The general guidance following BERT: there’s nothing specific to optimize for. But those who build content around real user intent are in the best shape.

There’s more to come in the future

While Google has undergone some serious shifts over the last decade, its algorithm is far from finished. The algorithm, and with it SEO best practices, is constantly changing.


If you need help keeping up, consider outsourcing to a team like RankScience. Our SEO practitioners are always on the lookout for new Google updates, helping brands pivot accordingly to maintain (and improve) their rankings. Contact us today.

