Google’s search engine algorithm is a complex and highly secretive system that no one outside the company’s confines can genuinely claim to understand.
But the secrecy behind the algorithm doesn’t mean changes go unnoticed.
The evolution of Google’s search algorithm means it is now unrecognizable from its first iteration decades ago. Its approach to making changes is also different: Google updates used to be a big event, but now, thousands of changes occur every single year.
To understand what it means for your business, it’s helpful to explore how we got to this point.
A Short History of the Evolution of Google Search
Google’s updates began slowly and then started to happen all at once. To get a feel for the general trajectory of the algorithm updates, let’s look at the first three significant updates that shaped both business and user experience.
Let’s hop in the time machine and go back to what wasn’t really a simpler time: 2003.
Google’s first big update became known as “Florida,” and it marked the first time the search giant acknowledged that although it deals in software, the people on the other side of the screen are humans. Florida laid the groundwork for modern SEO. When the update arrived, it erased the rankings of websites that had earned their top spots through keyword stuffing, hidden links, and invisible text.
Retailers hated Florida. Their ire mostly stemmed from the fact that Florida knocked them out of the top search spots right before the lucrative holiday season.
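To make the tactic concrete: keyword stuffing means repeating a target phrase far more often than natural writing ever would. As a toy illustration only (Google has never published its actual detection methods), a naive density check like the sketch below is enough to flag the most blatant cases:

```python
import string

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword`, ignoring
    case and punctuation. A crude heuristic, not Google's method."""
    words = [w.strip(string.punctuation) for w in text.lower().split()]
    return words.count(keyword.lower()) / len(words) if words else 0.0

natural = ("We sell handmade soap, crafted in small batches from "
           "olive oil, shea butter, and essential oils.")
stuffed = "Cheap soap! Buy soap now. Best soap. Soap soap soap deals."

print(keyword_density(natural, "soap"))  # well under 10 percent
print(keyword_density(stuffed, "soap"))  # over half the words
```

Even this crude measure separates natural copy from stuffed copy by a wide margin, which hints at why the tactic was so easy for Florida to punish.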
Jagger Updates (1, 2, and 3)
Jump forward two years to 2005, and the Jagger update arrived in three phases. Jagger introduced a new focus on backlink quality, targeting anyone using black-hat methods and spam to get ahead. The Big Daddy update that rolled out at the end of the year further punished unnatural linking and kept some websites out of the new data centers.
By now, you can see where this is going. Google’s updates increasingly rewarded merit over SEO tricks, helping searchers find what they were actually looking for – not what SEO-savvy early adopters wanted them to see.
The same thing happened with the Vince update in 2009. Vince targeted keywords, and all of a sudden, previously top-ranked sites (typically those that had achieved their status through hardcore SEO efforts) slid down the rankings in favor of big-brand domains. This was the year when, if you sold soap, you’d never again outrank Unilever, because Google assumed that a searcher who wants soap wants one of the big brands.
The year also turned out to be a watershed moment for Google because updates became swifter. Several huge updates rolled out from 2009 onward, including:
- Caffeine (2009) (new indexing system)
- MayDay (2010) (long-tail queries)
- Panda, Panda 2.0, Panda 2.1, Panda 2.2, Panda 2.3, Panda 2.4, Panda 2.5, Panda 3.0 (2011) (targeted content farms)
- Freshness (2011) (updated search for ‘fresher results’)
- Panda Update 3.1 and Panda Update 3.2 (2011 and 2012)
The stream of updates continued until the long-anticipated Penguin update arrived in 2012. Panda, in particular, kept receiving refreshes for years.
SEO As We Know It Started with Penguin and Took Off with Hummingbird
Even a brief look at the earliest Google updates shows that Google had begun to focus on genuine search intent and to prioritize it over the SEO game.
Penguin is perhaps the epitome of this sentiment.
The Penguin algorithm finally gave Google a way to down-rank websites that disregarded its Webmaster Guidelines in favor of spammy tactics. It was a punishment of sorts, but Google didn’t intend it to be permanent: if you lost your rankings during the first update, you had a chance to clear out the spam and recover.
In 2013, Google launched Penguin 2.0. Again, it was a chance for sites to improve. The first Penguin looked at surface-level link spam, but the new version dove deeper to find less apparent examples and ferret out those who had taken their tactics underground.
Things really changed with the launch of Hummingbird in August 2013. Hummingbird perhaps represented the most sophisticated version of the Google algorithm that had yet existed. It focused on working through complex queries and returning the most relevant and freshest results. And unlike other updates, it impacted almost everyone: it hit 90 percent of searches globally.
Then We Went Local and Mobile
By 2014, we were all fully addicted to our phones, and mobile data networks were better than ever. There were 257 million 4G LTE devices in use, and mobile search mattered more than ever.
Naturally, along came Pigeon.
Pigeon tied Google’s local algorithm into its core algorithm to provide better local results. It weighed both on-page and off-page SEO heavily – and you needed a local listing to appear in the results.
The mobile-friendly update (dubbed “Mobilegeddon”) arrived a year later, in 2015. It prioritized mobile-friendly sites and pages in mobile searches. Websites that still hadn’t adopted a mobile format were either severely down-ranked or filtered out entirely.
Possum launched in 2016, and it again refined local search. For the first time, search results were narrowly targeted to the searcher’s physical location.
What’s Next for Google’s Updates?
There are two significant updates we skipped over earlier. The first is RankBrain, which Google added to Hummingbird in 2015. It marked the first time Google attempted to learn the meaning behind queries, which allowed the algorithm to serve better results. It was also the first time content needed to be hyper-relevant and comprehensive.
The latest of the confirmed updates is Fred, which went live in March of 2017.
Fred differs from RankBrain but complements it. The update targeted webmaster-guideline violations, but mostly, it chewed up and spat out sites built on low-quality posts designed for affiliate gain.
In other words, Google started reading – really reading – your content and deciding whether you were trying to be helpful or trying to sell. It didn’t knock out affiliate sites completely (obviously), but it did set a higher standard.
Fred also brings us to the newest update, rolled out in late 2019: BERT.
What Is BERT?
BERT is the most sweeping change since RankBrain, and Google says it impacts 1 in 10 queries.
Like RankBrain, it uses machine learning and AI to understand the searcher’s terms and intent better. It wants to understand the nuances of natural language to provide a better result.
It ties in well to Google’s past efforts to improve both mobile search and voice search, whose dominance only continues to grow. Remember that 27 percent of internet users already use voice search. Because we’re more likely to use natural language when speaking our queries, this update is vital to the health of Google’s algorithm.
Here’s a helpful example.
The word “pool” has several meanings, all depending on context. BERT helps the algorithm handle this by learning more about natural language patterns.
For example, if a searcher says “in the pool,” they could mean a swimming pool or a sports league. Location can then help narrow the meaning down. Paying attention to nuances like prepositions (in, on, to, etc.) is the program’s raison d’être.
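A toy sketch (hypothetical and stdlib-only – nothing like Google’s actual models) shows why those small words matter: a keyword-only matcher that discards prepositions as “stop words” literally cannot tell two opposite queries apart, which is exactly the gap BERT-style language understanding closes.

```python
from collections import Counter

# Assumed stop-word list for illustration only.
STOP_WORDS = {"to", "from", "in", "on", "at", "the", "a"}

def keywords_only(query: str) -> Counter:
    """Pre-BERT-style matching: drop prepositions and articles,
    then count whatever keywords remain."""
    return Counter(w for w in query.lower().split() if w not in STOP_WORDS)

# Two queries with opposite meanings...
q1 = "directions to the pool"
q2 = "directions from the pool"

# ...collapse to the same keyword bag once prepositions are discarded:
print(keywords_only(q1) == keywords_only(q2))  # True
```

Both queries reduce to `{directions, pool}`, so a keyword-only engine would serve the same results for each – and one searcher is heading the wrong way.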
As AI becomes more sophisticated and the consumer market focuses more on the natural integrations of search into everyday life (through products like Alexa), you can expect more search updates to reflect the need to translate the human experience into code. What that means for websites, however, remains to be seen.
The Key Takeaway for Every Google Update
The evolution of Google search algorithms includes a wide range of updates that tackle different issues in both mobile and desktop-based search.
But two things bind all these updates together:
- Stop pretending you are above Google’s webmaster guidelines
- Create a site (and SEO) for humans
At the end of the day, Google succeeds only as long as its search results are relevant to users. And to provide relevant results, it needs to surface human-friendly websites.
It’s easy to get hung up on the technical details of SEO and Google algorithms. And these details are essential for stellar execution. However, what matters most is that your site is both useful and unique for your user base.
If you can put your customers at the center of your website, you are already halfway towards meeting your SEO goals for both the current algorithm and future updates.
Google Offers Lessons in Quality
The evolution of Google’s algorithms has followed a pretty compelling trajectory: as much as Google shapes the internet, it is equally reactive to the human experience of search.
Although the press makes a fuss over each new update, Google focuses on quality and the person-centered experience above all else. That’s why it punishes sites that attempt to thwart its algorithms by buying or spamming their way to the top of the rankings. Not only do those sites not help Google’s core algorithm do its job, but they offer little-to-no value to humans.
It’s a cycle of internet doom and gloom that Google wants to eradicate.
The constant flux of Google’s algorithm presents a challenge, but it’s one your business can meet. Get in touch to learn how we can transform your SEO and help future-proof your site in preparation for the next Google update.