Google’s ranking algorithm is a complicated thing. First, it relies on more than 200 factors under the hood. Second, SEOs have spent years figuring out which factors matter more than others. Finally, Google may be hiding ranking factors that SEOs don’t even think about, or weigh less than they actually should.

But SEOs are not simple either, just like the Google algorithm. They run experiments and research to uncover the truth about Google’s ranking mechanism, and it seems they have succeeded. In this post, I’m going to list the key findings on Google’s ranking pillars. I hope this will make the life of SEOs a little bit easier.


PageRank

Honestly, the only thing left of the original PageRank is its name: the original formula hasn’t been used since 2006. But as a ranking factor, the concept of PageRank is still successfully applied by Google, as links are still here. And, as you know, links are what PageRank is based on.
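To make the concept concrete, here is a minimal sketch of the classic PageRank iteration over a toy link graph. The graph and the damping factor of 0.85 are illustrative; this is the textbook formula, not whatever Google runs today.

```python
# Minimal PageRank iteration over a toy internal link graph.
# graph maps each page to the list of pages it links to.
def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}  # start with rank spread evenly
    for _ in range(iterations):
        new_ranks = {page: (1 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            if outlinks:
                share = ranks[page] / len(outlinks)  # each outlink gets an equal share
                for target in outlinks:
                    new_ranks[target] += damping * share
            else:
                # Dangling page: spread its rank across all pages
                for target in graph:
                    new_ranks[target] += damping * ranks[page] / n
        ranks = new_ranks
    return ranks

graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
print(pagerank(graph))  # "home" ends up with the largest share
```

Note how rank flows along links: a page with many strong inbound links accumulates more of the total, which is exactly why backlinks and internal linking matter below.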

So what influences your pages’ PageRank and its allocation?

Backlinks. Backlinks bring link juice to your page from pages on other websites. The more credible and authoritative a linking page is, the more link juice it will pass to your page. Note that it’s the page, not the domain, that matters. Thus, a backlink from a strong page on a weak website will do you more good than a backlink from a weak page on a generally strong website.

Internal linking. Internal linking helps you spread link juice wisely across the pages of your website. What counts as wisely? It depends on your marketing strategy. If you have many target (i.e. converting) pages, interlink them all and empower them equally. If you have only a few conversion pages, or even one, pass more link juice there to make those pages the strongest.

Link placement. The position of a link on a page influences its ability to pass link juice. Links within the main content pass PageRank better; navigation and footer links pass it worse.

Link type. Everything here seems quite logical: dofollow links pass PageRank, nofollow links don’t. But the reality is more complicated. The experience of many SEOs shows that nofollow links from powerful resources like Wikipedia benefit the authority of the page linked to. It’s not clear whether they pass PageRank or some other kind of weight, but they definitely pass something. What’s more, Google’s recent introduction of new link attributes, rel="sponsored" and rel="ugc", may signal that Google is going to treat them differently.

For now, they are not mandatory and coexist with the good old nofollow, but nobody knows what Google will come up with in the future.

What can “kill” your PageRank?

Orphan pages. Orphan pages are not linked to from any other page on your website, so Google cannot see them and doesn’t even know they exist. They just sit idle and don’t receive any link juice.
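Finding orphans is a simple set difference. Here is a sketch, assuming you already have the full list of pages (say, from the sitemap) and the internal link graph discovered by a crawl; the page names are made up for illustration:

```python
# Orphan pages = known pages that no crawled page links to
# (excluding the start page, which needs no inbound link).
def find_orphans(all_pages, link_graph, start="home"):
    linked_to = {target for outlinks in link_graph.values() for target in outlinks}
    return sorted(set(all_pages) - linked_to - {start})

all_pages = ["home", "about", "blog", "old-promo"]
link_graph = {"home": ["about", "blog"], "about": ["home"], "blog": ["home"]}
print(find_orphans(all_pages, link_graph))  # ['old-promo']
```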

Redirect chains. Although Google now says that redirects pass 100% of PageRank, I’d still recommend avoiding long redirect chains. They eat up your crawl budget, and, as noted above, you cannot blindly believe every word Google says.
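One way to audit chains offline is to resolve them against a redirect map (for example, exported from your server config or a crawl). This is a sketch with hypothetical URLs; the 5-hop threshold is a common rule of thumb, not a Google-stated limit:

```python
# Follow a redirect chain through a source -> target mapping,
# flagging loops and chains longer than max_hops.
def resolve_chain(url, redirects, max_hops=5):
    chain = [url]
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        if nxt in chain:
            raise ValueError(f"Redirect loop: {' -> '.join(chain + [nxt])}")
        chain.append(nxt)
        if len(chain) - 1 > max_hops:
            raise ValueError(f"Chain longer than {max_hops} hops: {chain}")
    return chain

redirects = {"/old": "/older", "/older": "/new"}
print(resolve_chain("/old", redirects))  # ['/old', '/older', '/new'] — 2 hops
```

Whenever a chain turns up, the fix is the same: point the original link straight at the final URL.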

Links in unparseable JavaScript. Google cannot read them, so they don’t pass PageRank.

404 links. 404 links lead to nowhere, so your precious PageRank goes nowhere, too.

Links to unimportant pages. You cannot leave any page with no links at all (remember orphan pages), but pages are not equal. It’s not rational to super-optimize the link profile of a page that isn’t important.

Too distant pages. Google may fail to find and index a page that sits too many clicks away from the homepage, so such a page is unlikely to receive much PageRank, if any at all.
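Click depth is just the shortest path from the homepage, so a breadth-first search over the internal link graph finds distant pages. A minimal sketch with a toy graph:

```python
from collections import deque

# Compute each page's click depth (shortest number of clicks
# from the homepage) via breadth-first search.
def click_depths(link_graph, start="home"):
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

link_graph = {"home": ["blog"], "blog": ["post"], "post": ["archive"], "archive": []}
print(click_depths(link_graph))  # archive sits 3 clicks deep
```

Pages that come out deeper than three or four clicks (a common rule of thumb, not an official threshold) are good candidates for extra internal links.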

To find and fix in time any issues that can prevent your pages from receiving PageRank, I suggest auditing your website regularly with dedicated SEO tools. Also, keep an eye on your link profile!


Content

Google loves repeating that content is king. And Google doesn’t lie: content is one of Google’s two key ranking factors (the second one was just described above).

So what can we say about content?

First, it’s keywords. It would not be wise to expect Google to rank a page about, say, lava lamps if that page doesn’t contain the keyword lava lamp even once.

Second, it’s semantics and common sense. Google is clever. It can read between the lines and looks not only at the keywords themselves but also at the context surrounding them. And, considering Google’s recent announcement of MUM and its capabilities (the ability to analyze video and images and perform multimodal search), Google will soon become much cleverer than we may assume.

What are the things you have to avoid?

Keyword stuffing. If you overpack your page with keywords, the content may start sounding unnatural, and Google will most likely consider the page spammy. This will result in a drop in rankings or even deindexing.

No or thin content. If a page is too minimalistic, Google may fail to understand its content at all and consider the page empty, thus marking it as spammy, too.

Plagiarism. Everything is clear here: if you don’t bring value to users and simply copy someone else’s content, you’ll never make it to Google’s SERP.

Machine-generated content. Let’s clarify: machine-generated content of poor quality. This means plain nonsense, repeating one phrase over and over, poor-quality rewriting, compilation, machine translation, etc. If the content’s quality is good, and the content brings value to users, it’s OK to use some AI assistance. The truth is that there’s no AI yet that can produce content needing no human revision.

Page Experience

The importance of page experience as a ranking factor rose greatly after the Core Web Vitals release and the switch to mobile-first indexing. I think you all remember the SERP volatility in summer 2021. Of course, these were not the only updates that caused volatility in 2021, but they were definitely among the most notable.

What do we have here?

Loading speed. The faster a page loads, the better. Both Google and users appreciate fast pages. Do you keep waiting for a slow page to load, or do you quit and go to a faster one? That’s exactly my point.

Visual stability. A page’s visual elements must remain where they are meant to be, not shift chaotically around the screen. Say, if you go to press one button, but it moves and you end up pressing another one you did not intend to, that’s bad. Very bad.

Mobile-friendliness. With mobile-first indexing, Google started paying much more attention to the mobile versions of websites: whether they are visually stable, fast, and properly designed. Considering that the number of mobile queries surpassed desktop queries back in 2016 and keeps growing, a poor mobile website will cost you rankings and reputation. So it’s high time to make your website mobile-friendly if you haven’t done so yet for some mysterious reason.

What will make your page experience worse?

Excessive and messy CSS and JS. You may not need many of these elements, and all they do is eat into your page’s loading speed. In addition, the code may be messy (extra spaces, empty lines, and so on), which altogether costs you precious loading time. Look carefully through the CSS and JS of your website and get rid of what you don’t need. If you use a CMS like WordPress, consider removing non-vital plugins, as they make pages load slower, too.

Non-responsive design. Desktop designs don’t fit mobile screens: users would rather close a page than keep trying to zoom in on tiny text and distorted images. Make your layouts responsive. Most CMSs offer a great variety of customizable responsive themes, so all you need to do is choose the one you like most.

Oversized images and videos. The bigger an element is, the more time it takes to load. Compress images before uploading them, and consider not placing heavy videos at the top of the page.

Excessive pop-ups. Pop-ups prevent users and Googlebot from seeing the main content of the page, especially if a pop-up appears the moment the page opens.

Slow server speed. This is common with shared hosting, where many different websites share the same server and its capacity is split between all of them. The only option here is to move to a faster server, or consider dedicated hosting (one website = one server).

To check for page experience issues, you can use Google Search Console, the PageSpeed Insights tool, and the Google Mobile-Friendly Test, which can help you even if you don’t have a GSC account (which would be super weird, by the way). If you need to audit your pages comprehensively and in bulk, consider using WebSite Auditor and its Page Speed module: it’s detailed and provides you with a list of the URLs affected by each detected issue.

On-page optimization

On-page SEO deals with the content of your website, but in a more technical sense. On-page optimization includes working with titles, headings, meta descriptions, alt texts, URLs, etc. In other words, everything Google looks at first when discovering your page, and then shows to users.

Let’s have a closer look.

Titles. The title is one of the most important elements on a page: it gives Google a hint about the page’s content. So a title has to contain keywords (the closer to the beginning, the better), clearly reflect the idea of the page’s content, and be fairly short (around 50 characters).

Meta descriptions. A meta description is a short (145 characters) description of the page’s content, or, better said, a teaser. A meta description has to contain keywords and be attractive to both users and Google, as it will be featured in your SERP snippet and thus become the first impression of your page.
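A rough illustration of these length checks, using only Python’s standard library HTML parser. The thresholds below are common rules of thumb (titles truncate around 60 characters on the SERP, descriptions around 160), not official Google limits:

```python
from html.parser import HTMLParser

# Extract <title> and the meta description from raw HTML and
# flag lengths outside commonly recommended ranges.
class SnippetAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    parser = SnippetAudit()
    parser.feed(html)
    issues = []
    if not parser.title:
        issues.append("missing title")
    elif len(parser.title) > 60:
        issues.append("title longer than ~60 characters")
    if not parser.description:
        issues.append("missing meta description")
    elif len(parser.description) > 160:
        issues.append("meta description longer than ~160 characters")
    return issues

html = "<html><head><title>Lava Lamps: A Buyer's Guide</title></head><body></body></html>"
print(audit(html))  # ['missing meta description']
```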

Structured data. Schema markup is the best way to quickly show Google the most important elements on your page and earn a rich snippet, as the data from the page will appear right on the SERP.

There are different schemas for any type of page — you can add special ones for product pages, recipes, small businesses, etc.
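As a sketch, here is what Product markup could look like as JSON-LD, built with Python’s json module. The product values are invented for illustration; the output would go inside a `<script type="application/ld+json">` tag on the page:

```python
import json

# Schema.org Product markup as JSON-LD (illustrative values).
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Classic Lava Lamp",
    "image": "https://example.com/images/lava-lamp.jpg",
    "description": "A 14-inch classic lava lamp with purple wax.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "24.99",
        "availability": "https://schema.org/InStock",
    },
}
print(json.dumps(product, indent=2))
```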

Alt texts for images. Alt texts are descriptions of the images on your page; they help Google better understand what is in an image. Maybe Google won’t need alt texts once MUM fully rolls out, but today they are important. Describe your images clearly, but don’t get carried away. Say, if there’s an image of a tiger in the woods, write something like tiger in the woods, not 3 year old asian tiger sitting on an old rotten bamboo tree in the woods of ancient spirits.
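Missing alt attributes are easy to catch programmatically. A minimal sketch with Python’s standard library parser (the sample markup is invented):

```python
from html.parser import HTMLParser

# Collect <img> tags that have no alt attribute (or an empty one).
class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

checker = AltChecker()
checker.feed('<img src="tiger.jpg" alt="tiger in the woods"><img src="logo.png">')
print(checker.missing_alt)  # ['logo.png']
```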

URLs. Your URLs have to be user-friendly, so that users can easily understand what the page is about. Just like users, Google will better understand a URL that is short, simple, and includes keywords.
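A common way to produce such URLs is a slug function: lowercase the page title, drop special characters, and join the words with hyphens. A minimal sketch:

```python
import re

# Turn a page title into a short, readable, keyword-bearing URL slug.
def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # drop special characters
    slug = re.sub(r"[\s-]+", "-", slug).strip("-")  # collapse whitespace into hyphens
    return slug

print(slugify("Lava Lamps: A Buyer's Guide (2024)"))  # lava-lamps-a-buyers-guide-2024
```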

What are the don’ts of on-page SEO?

Empty or too long titles and meta descriptions. In this case, Google can create titles and descriptions on its own, but it may interpret the content differently from what you meant, so the result can be misleading. Moreover, Google does look at the original titles and descriptions when ranking a page, so you can guess what your positions will be if they are empty.

Empty alt texts. It will be harder for Google to understand what’s in an image, and your images are unlikely to appear in Google Images.

Messy URLs. If your URLs are too long and contain odd symbols, your pages can end up not being indexed at all.

Keyword stuffing. If your titles, meta descriptions, and alt texts are stuffed with keywords, Google can consider your page spammy.

Most site auditing tools are pretty good at finding these common on-page issues.


E-A-T

E-A-T stands for Expertise, Authoritativeness, and Trustworthiness. E-A-T is not a direct ranking factor, but it carries a lot of weight, especially for pages that deal with people’s happiness, health, financial stability, or safety. Google calls such pages “Your Money or Your Life” pages, or YMYL. As for the rest of the internet, E-A-T signals are important, too, so it would not be right to leave this factor out.

What signals to Google that your content is expert, authoritative, and trustworthy?

Author’s bio and background. In the case of YMYL pages, the author of the content needs a professional background in the related field. Medical texts are more valuable if written by a doctor, legal texts if written by a lawyer, and so on. An author’s bio has to reflect the author’s expertise, and the author should have further publications on trusted resources that are frequently linked to within the professional community. All of this tells Google that the content is expert and credible.

Outgoing links to credible sources. Linking out to your sources is one more way to show Google that your content is not meh but relies on data from authoritative sources.

Authority of a website. This factor is especially important for news websites, which cannot acquire outgoing links or other credibility marks quickly. The website where the news is published has to be credible, established, and trusted by both users and Google. A website also has to demonstrate its relation to the field it covers. If it’s an organization’s website, it has to list the address, phone numbers, and the names of the people behind it: in other words, everything that proves the organization exists, is active, has real expertise behind it, and so on.

To sum it up

I didn’t mention technical soundness; it goes without saying. If Google cannot see and index your pages, for whatever reason, it’s clear that Google will not see anything else mentioned above, either.

All in all, the five factors listed above cover many of those 200+ factors that SEOs like to argue about. Which factors do you consider the pillars of Google’s ranking mechanism?
