How Google Brings You Results
By now, most of us have heard about the Google Algorithm.
For years, it was the arch-nemesis of every SEO nerd. Each time there was an algorithm update, SEO experts across the globe would check their results, analyze, complain and post about what the latest changes did to their search results.
But, do we really understand how it all works?
Google has its search engine algorithm engineers under tight non-disclosure agreements, I’m sure. (Has anyone ever actually met one of these mysterious people?)
But, we do understand the basic mechanics of how Google’s search engine works. When it comes to ranking websites, there is actually more than one algorithm at work.
Eli Schwartz, a well-respected SEO expert and author of “Product-Led SEO,” describes at least five algorithms at play. Here’s how he breaks them down:
The Discovery Algorithm
For years, we’ve talked about the bots Google uses to crawl the internet looking for new pages and websites that have never been crawled before. Think of this as one algorithm that is always on the hunt for new things to take in.
This component is simply that: a hunter. It isn’t designed to assess anything other than the existence of newly formed URLs and websites that have never been crawled. Once this information has been taken in, the URLs are matched to existing websites.
Once the discovery algorithm finds new pages and sites, those URLs are placed in a queue to be crawled later.
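Google hasn’t published how this pipeline actually works, but the discover-then-queue idea can be sketched in a few lines. Everything here (the function names, the regex-based link extraction, the simple set-plus-queue bookkeeping) is purely illustrative:

```python
import re
from collections import deque

def discover_urls(page_html: str) -> list[str]:
    """Toy link extraction: pull href targets out of raw HTML.
    (Real crawlers use a proper HTML parser, not a regex.)"""
    return re.findall(r'href="(https?://[^"]+)"', page_html)

seen = set()            # every URL ever discovered
crawl_queue = deque()   # URLs waiting to be crawled later

def enqueue_new(urls):
    """Discovery only records that a URL exists; the actual crawl
    happens later, when the URL reaches the front of the queue."""
    for url in urls:
        if url not in seen:        # only never-before-seen URLs are queued
            seen.add(url)
            crawl_queue.append(url)

enqueue_new(discover_urls('<a href="https://example.com/new-page">new</a>'))
```

The key point the sketch captures is the separation of concerns: discovery makes no judgment about a page, it just notices that the page exists and parks it for later.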
The Crawling Algorithm
Google’s crawling algorithm doesn’t just fetch newly discovered URLs; it’s also used to better understand the entire web.
Given that roughly 65% of all web searches go through Google, you could say it has the largest pool of information from which to understand the internet.
If we take Google at its word that it only wants to serve the best possible search results, then the crawling algorithm helps it decide whether or not to crawl the new URLs it has discovered.
How the decision to crawl or not crawl is made is still somewhat of a mystery, but you can safely guess that things like the website’s domain authority and the number of links pointing to the URL are factors.
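Since the real decision logic is a mystery, here is only a guess at its shape: a toy priority score built from the two factors mentioned above. The weights, the log scaling, and the cutoff threshold are all invented for illustration:

```python
import math

def crawl_priority(domain_authority: float, inbound_links: int) -> float:
    """Toy priority score. The real weighting is unknown, so the
    0.7 / 0.3 split here is purely illustrative."""
    # Authority on a 0-100 scale; diminishing returns on link count.
    return 0.7 * (domain_authority / 100) + 0.3 * math.log1p(inbound_links) / 10

def should_crawl(priority: float, threshold: float = 0.1) -> bool:
    """Hypothetical cutoff below which a discovered URL just waits."""
    return priority >= threshold

established = crawl_priority(domain_authority=80, inbound_links=500)
brand_new = crawl_priority(domain_authority=5, inbound_links=0)
```

Under this sketch, an established, well-linked site clears the bar easily while a brand-new site with no links may sit in the queue, which matches the common SEO experience that new sites get crawled slowly.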
This algorithm also makes no assessments about the quality of the content in any way.
The Indexing Algorithm
The third algorithm at play when we’re trying to rank a website has to do with indexing the URLs that have been discovered and crawled.
This algorithm, by definition, has to be the most complex of the three so far, simply because a decision has to be made at this point: whether or not to index the URL, which is what makes it eligible to appear in search results.
One factor that goes into this is the uniqueness of the content. With all the weight loss products on the market today, if you type in “weight loss products” and start wading through the results, you’ll quickly find content that becomes redundant.
This is where the indexing algorithm comes in. Google has to decide which content is genuinely unique and which is duplicated elsewhere. Clearly, the more unique the content, the more likely it is to be indexed.
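Google doesn’t say how it measures duplication, but a classic textbook technique for near-duplicate detection is w-shingling with Jaccard similarity: chop each page into overlapping word sequences and measure how many the pages share. This toy version (not Google’s actual method) shows the idea:

```python
def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word sequences ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity: shared shingles / total distinct shingles."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "our weight loss product burns fat fast with no exercise required"
page_b = "our weight loss product burns fat fast with no dieting required"
page_c = "a guide to sustainable nutrition and strength training habits"

# Near-duplicates score high; genuinely unique content scores near zero.
```

Here `page_a` and `page_b` differ by a single word and score as near-duplicates, while `page_c` shares no phrasing at all, exactly the kind of distinction an indexing step would need to make.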
Keep in mind, the indexing algorithm also has to determine what database to assign the discovered and crawled URL to. For example, information about politics has to be assigned to news, satire, or something entirely different.
Based on this, you can see the complexity of this algorithm.
The Ranking Algorithm
This algorithm determines where a URL that has been crawled and indexed should be placed in the search results.
Most people don’t realize that a URL can be crawled but not indexed, or crawled and indexed but not ranked.
Now, you’re starting to see why SEO nerds tend to be on the outskirts of sanity.
Just kidding. Sort of.
Each URL is essentially given a score to help determine whether or not to rank it. The factors that drive this score are:
Intention: how well the intent behind the initial search query matches the intent of the content at the URL. This is far more than just matching keywords like the old days. The pairing process matches the intention of the entire page to the intention of the searcher.
Google has gotten much better at understanding the specific meaning behind words, and it’s likely these improvements will continue.
Quality: You hear it again and again, but this is about the quality of your content: everything from correct grammar and readability to how the information is presented and laid out, links, and more.
Usability: This is a very important factor. If your site has a lot of ads and pop-ups, isn’t mobile-friendly, or has long page load times, it could be de-indexed altogether.
Context: If a search can be interpreted as local, Google has gotten more sophisticated about presenting local results based on a variety of signals, like the searcher’s device and location, even if a location wasn’t included in the query.
BERT
Bert is more than just Ernie’s best buddy on Sesame Street. To Google, BERT stands for Bidirectional Encoder Representations from Transformers. (We can’t make this stuff up.)
Rolled out to Google Search in 2019, BERT is a machine-learning model for natural language processing that helps machines better understand the words typed into the search box.
The idea here is that the better Google understands the words you use in your search, the better it can match results to that search. Before BERT, Google would all but ignore small connecting words like prepositions, but after its implementation, nearly every word is used as a signal to determine intention.
The common example in SEO circles comes from New England, where the word “cow” means a large striped bass in the context of fishing. Type “how to catch a cow fishing” into pre-BERT Google and you’d get a bunch of moo-cow articles and pictures. Post-BERT, you’ll get fishing information.
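BERT itself is a deep neural network, so nothing simple can reproduce it, but the principle that surrounding words shift the meaning of an ambiguous one can be illustrated with a toy word-overlap disambiguator. The sense dictionaries here are made up for the example:

```python
# Hypothetical sense dictionaries: words associated with each
# meaning of "cow" (illustrative only, not how BERT works).
SENSES = {
    "striped bass": {"fishing", "catch", "bait", "rod", "tide"},
    "farm animal": {"milk", "farm", "dairy", "pasture", "graze"},
}

def disambiguate(query: str) -> str:
    """Pick the sense whose associated words overlap most with the
    rest of the query. BERT does this with learned context vectors
    rather than hand-built word lists."""
    context = set(query.lower().split()) - {"cow"}
    return max(SENSES, key=lambda sense: len(SENSES[sense] & context))
```

Feed it “how to catch a cow fishing” and the words “catch” and “fishing” pull the query toward the striped-bass sense, which is the same kind of context-driven shift the cow example describes.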
While the BERT algorithm doesn’t do anything to help or hurt search rankings directly, you can see how it greatly contributes to the ranking algorithm.
And this is the big point: there isn’t just one algorithm. There are multiple algorithms at work when you type words into Google. Each one complements and assists the others to produce the best possible results Google can give you.
Does Google Hate SEO?
Hate is a strong word. It probably doesn’t fit well, but there’s definitely evidence that the nerds at Google and SEO nerds used to not play well with each other.
In recent years, it’s probably fair to say things have moved from not speaking, to détente, to a congenial relationship. Google’s Webmaster Trends Analyst, John Mueller, regularly holds open calls with SEO professionals where he shares insights about SEO. The simple fact that these interactions exist is a step forward.
But it’s hard to ignore the money. Google still makes the lion’s share of its income through Google Ads (formerly AdWords), so the old suspicion persists that Google would rather have everyone paying for ads than improving their organic listings.
Either way, Google changes aspects of its algorithms hundreds of times (if not thousands) throughout the year. Why? Simple: to improve the delivery of results.
Our computers, phones, website plugins, and so on are updated all the time, so why shouldn’t Google’s algorithms be? You could argue that the sheer frequency of updates is a sign that Google is always striving to improve, and that’s a good thing.
Why You Should Care
Does the average business owner need to know all about the algorithms Google uses to bring their results? Not really.
But knowing a little about how it all works does help in better understanding the need for well-performing websites, solid content, and good technical SEO.
If you would like a free initial consultation about any aspect of your digital marketing, call the staff at Make It Loud today. With web design that makes people say “Wow!”, SEO, social media marketing, and more, we can help you get better results.