How To Win At SEO: A Simple Explanation Of The Elusive Google Algorithm

Is your website not showing up on Google no matter what you do? There is plenty of content about the Google Search algorithm out there, but it rarely offers a simple explanation.

I have been there, scratching my head bald, trying to figure out the Whys.

I read, and read, and read some more. Then, I scoured YouTube some, perused Medium, and then got drunk on a lot of rum. But I did not find the answers.

It took me a while to understand that they were in the Hows, not the Whys.

So, how does Google rank your website?

A simple explanation of Google Search Algorithm

Google functions a lot like a library. Yes, the traditional brick-and-mortar business that abhors decibels. This analogy is the best way to understand the Google perspective.

Suppose you are planning to start a library in your town, and you want it to be the best one ever. What are the initial steps you take? 

Here’s a comprehensive list, ignoring the financials:

  1. Lease a big enough space and install some book racks. (Google data centers)
  2. Find relevant books for each topic. (Crawling)
  3. Organize the books alphabetically. (Indexing)
  4. Have a system to locate the book your reader would find relevant. (Serving)

In addition, you have a few sub-tasks:

  • Replace outdated book versions with the latest ones
  • Ensure all books are in proper condition and theft-proof
  • Highlight books your readers could find value in, based on their interests
  • Block out the books with original covers but fake content

Google follows a similar MO. The parentheses above name each step’s Google equivalent, and the sections below cover them one by one.

Google Data Centers

Here’s what happens when I look up the search query – can Usain Bolt do ballet? There are more than 900K search results! That’s just for one uncommon search term.

[Screenshot: Google search results for “can Usain Bolt do ballet”]

There are 3.5 billion searches per day. Imagine the number of web pages served for each search session! Google stores this mind-boggling load of data in multiple Data Centers spread across continents.

[Image: A Google data center]

The image above shows Google’s version of book racks: highly secure servers that store the data.

As a webmaster, you want your website data to snuggle into a microchip in one of those servers and be served when someone searches with a relevant keyword.

How do you get it up there? Here are the options:

  A. Sneak in and plant it there like Ethan Hunt
  B. Hire a hacker who puts your data there, above your competitors’
  C. Bribe Google
  D. Let Google do it for you, for free

No surprises, the correct option is A. Not really.

I trust Tom Cruise. But the right option is D: you let Google do it for you, for free. I wish option C were correct, but Google doesn’t accept payments for ranking.

But how does Google discover you before placing your website in its database? Through Crawlers.

Crawlers/ Googlebots

Google developed a web crawler software called Googlebot, tasked with discovering the web.

To elaborate on how these bots function, let me paint a picture. Imagine a line of ants foraging through a forest and reporting the status of resources back to base; that’s what the bots do. “Crawling” is akin to the ants’ recce mission.

The job of bots is two-fold:

  1. Analyze edits on known (pre-crawled) pages.
  2. Discover new pages.

They analyze the textual and non-textual elements on a domain and upload the discovered data to the servers. And that way, Google knows that your website is about dog food and not about home baking techniques.

Besides the content, the Googlebots analyze the user experience as well. Over time, Google has fine-tuned these bots to mimic human beings surfing the web.

And human beings surf the web from different device types and platforms: Some prefer desktop, others find mobile phones handy. The users have varied choices of media consumption: Some like to read, some listen, others prefer to watch a video. Still, some prefer accessing a service through an App, while others may stick to the browser. And to accommodate all the permutations and combinations, Google has designed different types of crawlers. You can find the whole list here.

You want to make it easy for Googlebots to crawl your website so it makes it into the Google database. And this is how you do it:

How to improve website crawlability

Google gathers information about your website from many sources:

  1. Your website.
  2. Pre-crawled websites.
  3. User-submitted content (Google My Business, blog comments, etc.)
  4. Public databases and directories.

You need to influence each of them to ensure Google learns about your existence.

Optimizing your website

Internal Linking

It’s paramount to have an SEO-first approach to website design. That also includes appropriate internal linking to promote easy navigation for bots and humans alike.

Aim for a tree-like site structure with a smooth flow: any page can be reached by crawlers from another page without using the navigation menu or sitemap.
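To make this concrete, an internal link is just a plain anchor tag pointing at another page on the same domain, worked naturally into the copy. A minimal sketch (the URL and anchor text are made-up placeholders):

```html
<!-- A contextual internal link inside page copy; URL and text are placeholders -->
<p>
  Before switching brands, read our
  <a href="/blog/how-to-choose-dog-food/">guide to choosing dog food</a>
  for a full ingredient checklist.
</p>
```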

Sitemap Submission

There are two methods of submitting this structure to Google. For both, you need to set up your Google Search Console account.

Method 1 (for websites with fewer than 1,000 pages)

Submit the homepage using the URL Inspection Tool built into Google Search Console.

If your website has solid internal linking, Googlebots will have no problem navigating all the pages, starting from your homepage.

Method 2 (for websites with more than 1,000 pages)

Submit the sitemap through Google Search Console.

If you are using the WordPress CMS, use the Yoast SEO plugin to create the sitemap. You can also choose the elements to include in your sitemap. For instance, you may choose to omit ‘Tags’ and ‘Categories’ for your blog.
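For reference, here is what a minimal XML sitemap looks like; plugins like Yoast generate this file for you, and the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- URLs and dates below are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/dog-food-guide/</loc>
    <lastmod>2021-05-20</lastmod>
  </url>
</urlset>
```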

Head over to the Search Appearance options panel in Yoast, as shown below. Its different tabs help you customize the settings for your search appearance.

[Screenshot: Yoast SEO Search Appearance settings]

Leave a mark on pre-crawled websites

Domain Authority

Here’s a question: Would you heed my dine-out advice or your friend’s?

There’s a good chance you’ll listen to your friend. Why? Because you know your friend better than you know me. There’s a factor of familiarity and trust.

Google is no different than you. It trusts the known: the websites it has crawled and authenticated before.

Domain Authority (DA) is a score, developed by the SEO company Moz (more on them below), that estimates the level of trust a website has earned. The greater a website’s DA, the more authoritative it is considered.

Google crawls known websites regularly to scan for fresh content. You can improve your website’s crawlability if a high-DA website publishes a link to your website on at least one of its pages.

This strategy is called backlinking.

Generating quality backlinks

There are many strategies to get that coveted backlink from high-DA websites. We covered a few in detail here.

Next, we will talk about a free tool that will help you find the DA of a website in 2 clicks!

Setting up the MozBar

Moz is a company that builds SEO tools.

They own a Google Chrome extension, MozBar, that measures the DA of a website. Plus, it helps you determine which page is likely to rank higher on Google by analyzing the website’s vital signs.

After you download and set up the extension on Chrome, you can locate it at the top right of the browser window.

If you don’t see it yet, don’t worry; click on the Extensions icon and pin the MozBar to the toolbar.

Next, click on it; the icon will turn from grey to blue, indicating that it is now active. If you are on a Google SERP, you’ll see the screen below.

[Screenshot: A Google SERP with the MozBar enabled, showing PA, DA, and backlink counts under each result]

The MozBar is now enabled. DA is the Domain Authority; PA is Page Authority, a similar score limited to the individual page rather than the whole domain. The number in the middle is the count of backlinks to that page.

When a web page is open, the MozBar displays the website’s spam score too.

[Screenshot: The MozBar displaying a website’s spam score]

User-submitted content

Any content submitted by visitors to your website or to your external listings is user-submitted content. For instance, a visitor might:

  • Comment on your blog post
  • Write a review on Google Reviews
  • Write a testimonial on your website
  • List their business on a page hosted on your website
  • Write a review on a directory page linking to your website

All of these fall under this category. Google takes this content into account to gauge the quality of your website.

Remember, Google is constantly collecting cues regarding the customer experience you provide through your website. The list includes: how well you provide your services, response rate, and more. Make sure you provide a top-notch customer experience.

Public databases and directories

Depending on the business category and location, consider submitting your website to online databases and local directories.

These published submissions are called citations.

Google accounts for your citation sources and consistency as a ranking factor. Citations are of paramount importance when ranking on Google Maps.

Other methods

Leverage Robots.txt

Every website is assigned a crawl budget: the amount of attention Google will give it. You want to optimize your crawl budget by identifying the pages you want Google to crawl. Robots.txt is a text file that lets you do that.

It is a simple file that instructs the bots on which pages to crawl and which to skip. This tactic avoids overloading your server with requests.

In addition, this file helps optimize your crawl: the less content Google has to fetch and parse, the faster it can work through your site.
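As an illustration, a minimal robots.txt (placed at the root of your domain) might look like this; the blocked paths are placeholders you would swap for your own private sections:

```
# Allow all crawlers everywhere except a few private sections
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```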

Avoid 404 errors

When the bots encounter a 404 error page, they hit a dead end and need to restart the crawl. This rerun is a painful waste of your crawl budget.

To avoid that, use a 301 redirect for all your 404 error pages. A 301 redirect signals the permanent migration of a page to another URL. It sends the bots to the new webpage, avoiding a dead-end path.
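If your site runs on an Apache server, for example, a 301 redirect can be declared in a single line in the .htaccess file; the paths below are placeholders:

```apache
# .htaccess (Apache): permanently redirect a removed page to its replacement
Redirect 301 /old-dog-food-guide/ https://www.example.com/dog-food-guide/
```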

Avoid orphan pages

Web pages that are not linked from anywhere else on your website are called orphan pages (they may themselves link out to other pages on the site). Since there is no way to reach orphan pages by crawling your website, the bots need to make a special visit, impacting your crawl budget.

Ensure your internal link building is robust. Any page of your website should have a path to every other page.

Indexing

Once Google crawls your website, it sorts the information on your website before uploading it to the server. In other words, your website gets indexed by Google. It then becomes available in a repository, from which Google draws data for a relevant search term.

Google Search is a “Google” search and not a web search. In the words of Matt Cutts in this video:

“When you do a Google search, you aren’t actually searching the web. You’re searching Google’s index of the web”.

Matt Cutts

So, how does Google do that? They follow a sorting order.

Google’s methodology is akin to admitting a fresh batch of toddlers into a school: They are first split into different classrooms, organized alphabetically by name, and given an enrollment code.

After a crawl, Google takes note of all the words on your website, including the alt-text.

Next, it sorts by “word order” and organizes documents, or web pages, around that order. That makes sense, as text is the trigger for every search. Let’s elaborate on word order in the upcoming section.

What happens after I hit search?

When I type coffee shop in the Google search bar and hit enter, here’s what Google does:

  1. It searches all the documents and collates those that have the word coffee. Assume documents – 15, 93, 122, 5001, 9002, 1000123 have the word – coffee.
  2. Next, it searches all the documents that have the word shop. Assume documents – 83, 93, 5001, 9002, 1210012 have the word – shop.
  3. It finds the documents which have both the words – coffee and shop. That is, documents 93, 5001, and 9002.
  4. Next, it checks which of these documents has the words occurring in the order the user typed them, how many times the combination occurs, and thousands of other factors before determining the document that should rank at the top.
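Here is a toy sketch of steps 1–3 in Python. The inverted index and document IDs mirror the made-up numbers above; a real search engine layers thousands of ranking signals on top of this basic lookup:

```python
# Toy inverted index: word -> set of document IDs containing that word.
# The IDs mirror the made-up example above; a real index is vastly larger.
inverted_index = {
    "coffee": {15, 93, 122, 5001, 9002, 1000123},
    "shop": {83, 93, 5001, 9002, 1210012},
}

def candidate_documents(query: str) -> set:
    """Return the IDs of documents that contain every word in the query."""
    doc_sets = [inverted_index.get(word, set()) for word in query.lower().split()]
    if not doc_sets:
        return set()
    # Intersect the per-word sets to keep only documents with all the words.
    result = set(doc_sets[0])
    for s in doc_sets[1:]:
        result &= s
    return result

print(candidate_documents("coffee shop"))  # {9002, 93, 5001} (set order varies)
```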

Since there are hundreds of millions of pages with the words coffee shop in that order, we have tens of thousands of SERPs for those keywords. But only 10 results make it to page 1 and get relevant traction.

How do you get to be one among those 10? By improving your website indexability.

How to improve your indexability

By now, we know how Google stores data. But how do you ensure that Google gets your data in the best possible pattern, making it easy to store and retrieve when needed? Here are the steps:

  1. Use the keywords in page titles, meta descriptions, headings, URLs, and image alt-texts.
  2. Remove duplicate content or use canonical tags.
  3. Ensure a robust redirect setup.

Keyword Placement

Keyword research is how you discover what your potential audience types into Google. Getting the right keywords in your content gets you into the correct index of Google. However, this post will cover only the technical aspect. Moving forward, we will assume you got your keywords right.

While crawling your website, the bots analyze textual content. Google stores every word present on your website.

You need to insert the keywords in key sections like the title, meta description, URL, heading tags, and alt-text. You also need to place the keywords strategically in the webpage copy. If you overload the content with keywords, the spam algorithm triggers due to keyword stuffing. If you use too few, you may not appear on search pages.

Ideally, 1-2 mentions of keywords for every 100 words should do the trick. In addition, keyword usage should be natural and contextual. Forced placement looks awkward for human visitors and bots alike.
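To make this concrete, here is a sketch of how a keyword like dog food might appear in those key sections; every name, URL, and value below is a placeholder:

```html
<head>
  <!-- Page URL: https://www.example.com/organic-dog-food/ -->
  <title>Organic Dog Food | Example Pet Store</title>
  <meta name="description"
        content="Shop organic dog food made from locally sourced ingredients.">
</head>
<body>
  <h1>Organic Dog Food</h1>
  <img src="dog-food-bag.jpg" alt="25 lb bag of organic dog food">
</body>
```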

Remove Redundancy

Google doesn’t penalize duplicate content: it cannot reliably determine the source of the original content.

Nevertheless, similar pages compete with each other for ranking. It’s essential to minimize similar text on multiple pages.

However, at times you need to have similar pages. For example, a product description may appear on multiple URLs. In that case, you need to let Google know which of those is original. And that is what a canonical tag does for you.

When you declare a canonical URL for a webpage or blog post, you indicate which version is the original and should rank.
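In practice, the canonical tag is a single line in the <head> of each duplicate page, pointing at the preferred URL; the href below is a placeholder:

```html
<!-- In the <head> of the duplicate page; the href is a placeholder -->
<link rel="canonical" href="https://www.example.com/products/dog-food/">
```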

Set Up Redirects

Google takes no time to de-index broken pages.

To avoid that, you need to let Google know that your webpage has permanently (301 redirects) or temporarily (302 redirects) moved to another URL. In that case, Google redirects the visitor to the new URL.
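Continuing the Apache example from the crawl-budget section, a temporary move uses a 302 status instead of a 301; the paths are again placeholders:

```apache
# .htaccess (Apache): temporary redirect; the original URL will come back
Redirect 302 /summer-sale/ https://www.example.com/coming-soon/
```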

However, too many redirects will result in a higher page load time, impacting page rank.

Alright, Google has taken note of your website and put it in an index. It’s time to serve your website to relevant visitors.

Serving

Okay, your library is open now, and you have a visitor. It’s time to locate the books for them.

But before that, you need to know their search intent. For instance, individuals looking for a book on dogs may be interested in different topics: Are they looking for books on the evolution of dogs from wolves? Or dog food? Or the body language of dogs? You must direct them to the correct rack based on their requirements. In online search parlance, this underlying need is termed search intent.

Search Intent

Over the years, Google Search has become highly attuned to search intent. The Google algorithm has constantly evolved to provide top-quality, easy-to-explore results based on the search query.

Now we have different results based on the search intent. If someone types in salon near me, Google will show a map populated with businesses that provide salon services near the user, instead of a blog post containing the keyword salon near me.

The above is an example of rich results.

You want to be visible in all types of search results, not just the organic listing. And here’s how you can do that:

How to improve your servability

Local SEO

Google takes into account the location of the user before serving the results. It makes sense, doesn’t it? If you’re a New Yorker looking for a nearby coffee shop on Google, you won’t want to see results from Okinawa. Local SEO helps you leverage the location aspect of SEO.

It consists of two aspects – your location and the service you provide. You need to highlight both of them to Google. And this is how you do it.

Once Local SEO efforts come to fruition, you will see your business in the Google Map Pack for a particular location for the services you provide. This listing means you secured an additional slot on the first page of Google other than your organic ranking!
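As one illustration (my addition, not part of the steps linked above), many local businesses embed LocalBusiness structured data in their pages so Google can read the location and service directly; every value below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Shop",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10001",
    "addressCountry": "US"
  }
}
</script>
```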

Multilingual and Multi-Regional businesses

If your business spreads across different regions and languages, it’s imperative to inform Google about it.
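One standard way to inform Google is hreflang annotations in each page’s <head>, declaring every language/region variant of that page; the URLs below are placeholders:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```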

Page Speed Optimization

Google has a tool, Google PageSpeed Insights, that will help you find out how fast (or slow) your website is. It lets you see the results for the desktop and mobile versions of your page.

In addition, it diagnoses the shortcomings that slow your pages down and gives you a list of suggestions to improve page speed.

This blog by Moz covers in detail the techniques to increase page speed. However, it does not cover Accelerated Mobile Pages (AMP), a feature that reduces mobile load time and makes your articles eligible for the Top Stories carousel rich results in SERPs.

However, it comes with its challenges. This blog will help you figure out if AMP is the way for you.

Keep your content fresh

Fresher content is more relevant.

Breaking news is searched on Google first. Thus, Google ranks the latest content higher.

You need to keep creating new content for your visitors or update the existing content to keep it fresh.

The single takeaway

There you have it! You have a checklist; tick all boxes to maximize the chances of Google considering your web pages for the relevant search queries.

If you want to take just one lesson on how to win at SEO, remember to create content for the users, not for Google Bots. Also, create different types of digital assets – Videos, Blogs, Downloadables, Podcasts, etc. The ranking will improve with time.

I wanted to write a simple explanation of the Google Search algorithm. This is an ever-evolving guide; please suggest any aspect I may have missed.

Good luck.

Anurag Surya

B2B Saas | Content Marketer | SEO Specialist
My SEO strategies prioritize building human-to-human connections, resulting in longer-lasting client relationships.
