All About Google: The Algorithm

The Professional's Guide to SEO: Understand how Google thinks

We talk about search engine optimization as if there’s more than one engine, but when Google controls more than 90% of the global search market, any discussion of modern SEO has to start in Mountain View.

By now, you probably know the basics of crawling, indexing, and ranking. You wouldn’t have come this far if you couldn’t rank a site on Google. But to be a professional SEO, it’s not enough to know how Google works. You also have to know how Google thinks.

The Google algorithm

If we SEOs seem obsessed with the engine that drives Google’s results, it’s for good reason — that chunk of code is our judge, jury, and, sometimes, executioner. What is the algorithm, though, and why should we care about how it works? If we just follow the rules and do good SEO™, won’t everything be fine?

What is “The Algorithm”?

An algorithm, according to Merriam-Webster, is the set of rules a machine follows to achieve a particular goal. The Algorithm is millions (possibly billions) of lines of code running across hundreds of thousands of servers in over 20 data centers that are the equivalent of small cities. It is a living, breathing web of heuristics and machine learning, constantly adapting to the behavior of billions of searchers.

The link wars (1998–?)

The Algorithm is also adapting to SEO tactics. Early search engines relied on what we now call “on-page” factors — essentially, the meta data and content of your site. This was relatively easy to manipulate, and led to a cat-and-mouse game between search engines and SEOs.

When Larry Page and Sergey Brin developed PageRank, they added a new world of “off-page” factors based on the building block of the World Wide Web itself — links. While links were harder to manipulate, being beyond the control of any one site, it was only a matter of time before SEOs began to build, buy, and barter links, and even construct elaborate link networks.

So, Google had to adapt. In 2011, Eric Schmidt revealed that Google had made an astounding 516 changes to the algorithm based on over 8,000 experiments. By 2020, that number had increased to 4,050 improvements based on 600,000 experiments.

At the pace of more than 11 changes per day, how can we possibly keep up with The Algorithm, and should we even try?

Was it me, or Google?

You can’t chase every change, but it’s worth being aware of major Google algorithm updates to answer one question: “Was it me, or was it Google?” SEO is a moving target: you have to assess whether your changes are having a positive (or negative) impact while competitors, searchers, and Google itself all make changes of their own. Being aware of Google’s changes (big and small) is one critical piece of understanding your own impact as a professional SEO.

Just do good SEO!

You follow the rules, though — you read the Google guidelines, you wear a white hat, you shine your badge, and you take good care of your horse. That’s great, and we should all be aware of the rules. As a more advanced SEO, though, you need to know that the rules can change. You could wake up one morning to find (as some unfortunate SEOs did in 2014) that your lyrics site has been pushed down the page by a new lyrics box, or that Google has extracted the answer to a question from your site and now no one needs to bother clicking on your URL.

Following the rules is fine for beginners, but it’s not enough for SEOs looking to level up.

A brief non-history

You’ve likely come across many articles about major algorithm updates, but those updates aren’t just historical artifacts — they hold a deeper key to Google’s intent and future direction. Here’s a brief “non-history” of what a few of Google’s major updates can tell us about where SEO is headed.

Content quality (Panda)

Starting in 2011 and eventually covering 28 confirmed updates, the original Panda update (aka “Farmer”) was the start of a serious push to address content quality issues in search results. While Panda targeted content farms at first, it eventually expanded into duplicate content, low-quality user-generated content, and “thin” content created as placeholders for ads. Panda was baked into the core algorithm around 2015, carrying with it the ongoing philosophy that more is not necessarily better. Content should serve a purpose.

Link manipulation (Penguin)

The original Penguin update was a bit more hostile than its namesake and was built to specifically punish sites that were manipulating search results. Many of the subsequent Penguin updates seemed to target manipulative link building and link schemes, severely penalizing some sites in a way that made recovery very difficult. The impact was so severe that, in 2016, Google reversed many Penguin penalties and introduced a new algorithm update that devalued most low-quality links instead. Links still matter, but any effort put into building low-quality links or constructing obvious link networks is time and money poorly spent.

Local/mobile intent (Pigeon)

Local updates often get overlooked by even professional organic SEOs, but the Pigeon update blurred the lines between local and organic, with more organic quality factors being used in local packs and more localized results in organic search. Over time (and Pigeon is only a part of this), we’ve seen increasing localization outside of the local pack, a trend that’s gone hand-in-hand with the proliferation of mobile devices. If your business is inherently local or competes with brick-and-mortar businesses, then all of your SEO is local SEO.

Machine learning (RankBrain)

Even years later, we don’t know exactly how RankBrain works, but its launch was Google’s first major admission that machine learning (ML) was part of the organic search algorithm. We’ll discuss the role of ML in SEO later in this chapter, but one lasting impact of turning over the algorithm to the machines is that we’ve seen more differentiation between various industries and verticals. There is no longer one set of rules that governs all SERPs, and it’s critical to understand your own competitive landscape.

Natural language (Hummingbird)

We tend to overlook infrastructure updates like Caffeine and Hummingbird. With the release of Hummingbird in late 2013, Google essentially rewrote the core architecture of search, powering changes and innovations for years to come (including advances in Natural Language Processing). While it’s hard to measure the full impact of Hummingbird, the implications for SEO are profound, and we’ll cover some of them in the next section of this chapter.

Natural language search

If you walked into a store and shouted “Buy computer!” or “Laptop price!” you might end up being escorted off the premises. If you’re over 25, though, this is probably how you learned to search. Only in the past few years has Google begun to effectively handle natural language and respond to searches like “How long does a mid-range laptop typically last?”:

[Image: Google search for "how long should a laptop last?"]

While these advances are great for search users, they can be daunting for SEOs. How can we possibly target dozens, or even hundreds, of variations of the same phrase?

It’s much more than voice

We often conflate natural language search with voice search and voice appliances. While voice appliances have accelerated the use of natural language, the changes Google has made apply to the entire algorithm, across all devices.

The upshot of all of this for SEOs is that you can’t ignore natural language search just because you don’t optimize for voice search. Google’s quest to serve natural language queries has gradually rebuilt the entire algorithm and impacts all of organic search.

ML, NLP, & other acronyms

Google is spending millions of dollars on machine learning (ML) and natural language processing (NLP), so how can we possibly keep up with the pace of advancement? In the past couple of years, you’ve undoubtedly heard terms like RankBrain, BERT, and GPT-3 (to name just a few). Are we going to need degrees in machine learning to be professional SEOs?

While I do believe that a broad understanding of machine learning concepts can be very useful, the important part is to understand what Google is trying to accomplish. You may not know what bidirectional encoder representations from transformers (BERT) are, but you can understand that BERT helps Google better interpret critical words in the context of long queries.

From keywords to concepts

All of this NLP talk may not feel very actionable, so let’s look at a real-world example. Let’s say your company is targeting the keyword phrase “high-end vehicles” and you’re putting together a keyword research plan. You run the search in Google and spot this result at #2:

[Image: SERP result for a luxury car brand]

Note the highlighted words: luxury car. Digging a bit deeper, this page doesn’t contain the phrase “high-end vehicles” at all, or “high-end” in any context (it does mention vehicles). While some simple synonym matching predates BERT and probably even RankBrain, it illustrates the impact of natural language and the good news and bad news for SEOs.

The good news is that we no longer have to target exact-match phrases and laser-focus content on one phrase or a very small number of phrases. Take, for example, Moz’s popular Beginner’s Guide to SEO. According to Keyword Explorer, this page ranks for almost 2,000 phrases (and that’s just the tip of the iceberg, when you dive into natural language queries), including:

  • “beginners guide to SEO”

  • “SEO for beginners”

  • “search engine optimization basics”

  • “learning search engine optimization”

  • “search engine optimization fundamentals”

Did we specifically target all of these 2,000 phrases (or even the five phrases above) when writing this content? No, of course not. We did our research and were aware of important variations (like including both “SEO” and “search engine optimization”), but then we wrote like humans. That is to say, we used natural language to talk about the concept of learning SEO, and this created a rich framework of synonyms and variations.

The kind of laser-targeting you might’ve done in SEO in the early 2000s could even be working against you in the natural language era. By restricting your use of language and over-emphasizing specific keyword phrases, you’re potentially ranking for fewer terms.
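To make the idea of keyword “concepts” concrete, here is a deliberately simplistic sketch — not anything Google actually runs — of folding keyword variations into a shared concept by normalizing synonyms and dropping stopwords. The synonym map and stopword list are invented for this example; real NLP systems use far richer models.

```python
# Toy illustration of grouping keyword variations into a concept.
# The synonym map and stopword list are made up for this example.
SYNONYMS = {"search engine optimization": "seo"}
STOPWORDS = {"to", "for", "the", "a", "of"}

def concept_key(phrase: str) -> frozenset:
    """Normalize a phrase to an order-independent set of core tokens."""
    phrase = phrase.lower()
    for longform, shortform in SYNONYMS.items():
        phrase = phrase.replace(longform, shortform)
    tokens = {t for t in phrase.split() if t not in STOPWORDS}
    return frozenset(tokens)

phrases = [
    "beginners guide to SEO",
    "SEO for beginners",
    "search engine optimization basics",
    "learning search engine optimization",
]
for p in phrases:
    print(p, "->", sorted(concept_key(p)))
```

Every variation collapses to an overlapping token set centered on “seo” — a crude stand-in for how synonym handling lets one naturally written page match many different queries.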

Now, the bad news (depending on your point of view). This change also means that we face entirely new competition on many terms. That competitor content focused on luxury cars might not have even been on our radar when writing a page about high-end vehicles.

Fortunately, there’s one great and always up-to-date tool for understanding how Google processes natural language and keyword concepts — the SERPs themselves. Let’s go back to that search for “high-end vehicles”. Note the related searches at the bottom of the SERP:

[Image: "Related searches" on the SERP]

Immediately, you can spot not only some highly related phrases, but synonyms like “luxury”, “luxurious”, and “expensive” that might be part of your concept (or even better targets). While the data and tools to create strong keyword concepts are still evolving in the SEO marketplace, anyone can use the SERPs to see how Google is currently understanding natural language.

Search engine or answer engine?

Does all of this mean – as some have suggested – that Google is gradually becoming an answer engine? Let us repeat the sacred mantra of all professional SEOs: it depends. The answer is complicated because it really does depend on the query space. Some queries, like “What time is it in London?” have succinct, factual answers that can be easily summarized:

[Image: Google answer box for "What time is it in London?"]

This answer comes directly from the Knowledge Graph and represents little or no SEO opportunity. However, consider a question like “How did we create the time zones?” and the corresponding Featured Snippet that Google currently returns:

[Image: Featured Snippet about railroads and the creation of time zones]

This isn’t a great answer, and that’s not necessarily HISTORY’s fault, or even the algorithm’s fault – this is a complicated question that ultimately requires rich content. No two-sentence summary is ever going to adequately address this sort of query.

That said, there is real value in understanding the kinds of questions that searchers ask and not thinking in terms of telegraphic one-word and two-word phrases. Just like you wouldn’t walk into a store and shout “Buy computer!”, search is evolving into questions like these:

[Image: Google "People also ask" results]

This is how real people talk, and as a professional SEO, understanding the questions people are likely to ask is an essential part of crafting search-worthy content. Google may never completely become an answer engine in the literal sense, but every search is a question.

The human factor

As an SEO, it’s easy to think of ranking as an end in and of itself, but the value of SEO ultimately lies in the human that sees that ranking, clicks on the link, and engages with your site or brand. From Google’s perspective, better search quality ultimately leads to return visitors and more advertising revenue.

There are some very specific ways Google factors humans into the algorithmic equation, and understanding the logic behind them is crucial to leveling up your SEO.

CTR & other user metrics

If you’ve been in SEO long enough, you’ve probably heard, been in, or started an argument about whether or not click-through rate (CTR) is a ranking factor. Google says “no,” often emphatically (but very, very specifically), while SEOs point to case studies and experiments that seem to suggest otherwise. I can’t solve that debate in this chapter, but I do believe that the core topic is important and a key aspect of advanced SEO.

First, whether or not CTR or other engagement metrics (bounce rate, dwell time, etc.) are Capital-R Capital-S Ranking Signals, Google certainly cares about them. Consider this example. Let’s say I search for “How do I do SEO?” and I come across the following result:

Once I hit the page, I quickly realize that this is an official Google resource, and I decide that I’d really rather read something from the SEO industry itself. So, I click back. Upon returning to the SERP, I see something like this:

[Image: "People also search for" box after returning to the search results]

See that new box of alternative links? Google understands that I wasn’t happy with what I found, and they’re trying to help me refine my search. They know my dwell time (the amount of time I spent on the site before returning to search) was low and they reacted to that fact.

There are many examples of how Google measures, surfaces, and clearly cares about engagement metrics, but there’s a simpler argument — engagement metrics drive action. None of us are on the web to be a hood ornament on page one of Google. We’re hoping to drive searchers to some goal. That requires clicks and conversions.

CTR itself, while only one step in the journey, can at least help us understand whether our search results (our titles and meta data) and content match searcher expectations and where we can find room for improvement, particularly for non-brand searches.
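One way to act on this is to export query-level data (Google Search Console reports impressions and clicks per query) and flag queries where CTR lags despite plenty of impressions. The rows, numbers, and thresholds below are invented purely for illustration:

```python
# Illustrative sketch: flag high-impression, low-CTR queries as candidates
# for title/meta improvements. The rows mimic a Search Console export;
# the sample values and both thresholds are made up for this example.
rows = [
    # (query, impressions, clicks)
    ("beginner seo guide", 12000, 480),
    ("what is seo", 30000, 150),
    ("seo tools comparison", 800, 56),
]

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions if impressions else 0.0

CTR_FLOOR = 0.01        # flag anything under 1% CTR...
MIN_IMPRESSIONS = 5000  # ...but only where the sample is large enough

opportunities = [
    (query, round(ctr(clicks, imps), 4))
    for query, imps, clicks in rows
    if imps >= MIN_IMPRESSIONS and ctr(clicks, imps) < CTR_FLOOR
]
print(opportunities)  # [('what is seo', 0.005)]
```

A query like this — lots of impressions, few clicks — is exactly where rewriting the title and meta description to better match searcher expectations tends to pay off.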

Put simply, human-centered SEO is good for business, regardless of which specific factors Google uses to measure search quality or whether or not those factors are part of the algorithm. This is a theme I’m going to repeat in the next two sections.

Expertise, Authority, & Trust

If you’ve so much as mumbled the word “EAT” at an SEO event, even if it was “I’d sure like to EAT this sandwich,” you’ve probably started an argument. You probably know that E-A-T stands for Expertise, Authority, and Trust. These sound like good things, but are they ranking factors, and, if so, how does Google even measure them? Here’s Google’s official take:

[Image: Google's note about whether E-A-T is a ranking factor]

So, E-A-T is a good thing, and you should care about it, but Google may or may not use it, and if they do, they aren’t going to tell us how. On that same page, they link to a number of articles by respectable SEOs on the topic, but then specifically say they’re not endorsing those articles.

The best advice I can give is probably the most obvious — expertise, authority, and trust (and the perception that you possess any or all of the three) are good for SEO and good for business. This is especially critical in so-called YMYL (Your Money or Your Life) industries, including healthcare and financial services, where consumer risks are high and trust is critical.

Speed & Core Web Vitals

Let’s cut to the chase — fast sites are good. Nobody likes to wait, and patience has never been a virtue of the internet. Speed sounds like a simple idea, but measuring “speed” turns out to be quite tricky. In June of 2021, Google finally rolled out the Page Experience Update and attempted to codify speed in the form of “Core Web Vitals” (CWV). CWV currently consists of LCP, FID, and CLS, though it is ever-evolving and may soon include more metrics.

Speed problem solved, right? Let’s do a quick dive into each of the current metrics:

Largest Contentful Paint (LCP)

LCP measures the time it takes to display (aka “paint”) your main, above-the-fold content. It is essentially a measurement of perceived speed. Does your site seem fast?

First Input Delay (FID)

FID measures the delay between a user’s first interaction with your page and the moment the browser can begin responding to it. “Interaction” is broadly defined, including clicking a link or button. Do you keep people waiting?

Cumulative Layout Shift (CLS)

CLS measures your content stability. Sites with aggressive advertising, for example, might have content that keeps moving as ads are loaded. Do you annoy people trying to read your site?

On the one hand, it’s great that Google is being specific and has provided tools (such as Lighthouse and Google Search Console) for measuring Core Web Vitals. On the other hand, it can be frustrating when an otherwise fast site fails on some dimension or scores in a way that doesn’t quite match reality. Ultimately, CWV is a work in progress. It’s useful to track these metrics and improve them, but what’s more important is recognizing Google’s overall commitment to improving the search user experience and building for that experience.
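Google does publish concrete thresholds for these metrics: “good” is an LCP of 2.5 seconds or less, an FID of 100 milliseconds or less, and a CLS of 0.1 or less, with a “needs improvement” band above each before a measurement is rated “poor.” Here is a small sketch that classifies field measurements against those published thresholds (the sample measurements themselves are invented):

```python
# Classify Core Web Vitals field data against Google's published
# "good" / "needs improvement" thresholds. Sample values are invented.
THRESHOLDS = {
    # metric: (good_max, needs_improvement_max)
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Return Google's rating band for a single CWV measurement."""
    good_max, ni_max = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value <= ni_max:
        return "needs improvement"
    return "poor"

measurements = {"LCP": 2.1, "FID": 180, "CLS": 0.32}
report = {metric: rate(metric, value) for metric, value in measurements.items()}
print(report)  # {'LCP': 'good', 'FID': 'needs improvement', 'CLS': 'poor'}
```

Note that a page can score “good” on one dimension and “poor” on another — which is exactly why an otherwise fast site can still fail the overall assessment.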


Future-proof SEO

If you understand how Google thinks (and not just how Google works), you can begin to see where search is headed and, ultimately, help future-proof your SEO. Let me be clear – future-proof does not mean you can just “SEO your site” and be done with it. Even if the algorithm weren’t changing every day, even if your competitors weren’t working every day, the world is still changing. For example, one of the most disruptive search events in history was the COVID-19 global pandemic, because it fundamentally altered searcher behavior.

Future-proofing your SEO means that you can anticipate trends and make educated guesses about the nature of future Google updates. It means that you’re less likely to make costly mistakes and more likely to build toward what Google wants.

Listen closely to Google’s intent

Over time, algorithm updates and changes to SERPs tell us a story about what Google values (and what they wish we’d stop doing). For example, you don’t necessarily have to get an A+ on every factor of Core Web Vitals, but CWV and its components send a clear message that Google values site speed and the user experience, and is looking for ways to better quantify that experience. CWV itself will evolve, but the intent behind it is here to stay.

One warning: listening to Google’s intent doesn’t mean freaking out over every PR message. For example, in 2014, Google very publicly announced the decision to use HTTPS as a ranking signal. The eventual update had relatively little impact, and it took years before we saw major changes in the SERPs around secure sites. These PR campaigns do tell you where Google is headed, but you still have to read between the lines and prioritize your SEO work.

Write for humans (aka customers)

Finally, while recent advancements in machine learning (ML) and natural language processing (NLP) can feel overwhelming, I believe that they’re generally good news for SEO. As search marketers and content marketers, they free us up to write for our human audience, knowing that – over time – Google will do a better job of approximating the way humans communicate.

If you understand not just what words your customers are using, but the questions they’re asking, and you serve those answers, you are going to be more and more likely to succeed in Google search results. The SEO journey begins at the search box, and the keywords people use are critical, but it ends at your site and with customer engagement.

Next up: All about Google: The SERPs

The changing landscape of search results.

Written by Dr. Pete Meyers, Moz's very own search scientist and MozCast wrangler.