Read on and learn how pagination combines with click depth, crawl budget and internal linking to impact your Google rankings. You will also find some handy guidance on how to approach the issue.

The importance of click depth

Definition: Click depth is the number of clicks needed to get from the homepage to another page - whether for users or for the robotic crawlers deployed by search engines. A page linked directly from your homepage, for example, has a click depth of one.

Why are we framing this examination of pagination around click depth (also known as page depth)? Because click depth is among the most important factors in how Google understands your pages.

According to Google’s John Mueller, click depth is more important for your rankings than your site’s URL structure:

“What does matter for us ... is how easy it is to actually find the content. So especially if your homepage is generally the strongest page on your website, and from the homepage it takes multiple clicks to actually get to one of these stores, then that makes it a lot harder for us to understand that these stores are actually pretty important.”

“On the other hand, if it’s one click from the home page to one of these stores then that tells us that these stores are probably pretty relevant, and that probably we should be giving them a little bit of weight in the search results as well. So it’s more a matter of how many links you have to click through to actually get to that content rather than what the URL structure itself looks like.”

That’s important stuff. In essence: Google gives more weight and authority in search results to a page that takes only one click to reach from your homepage. A page that takes several clicks to navigate to will tend to underperform.

Understanding crawl budget: how Googlebot crawls your site

Google’s robots (a.k.a. ‘crawlers’ or ‘spiders’) pay close attention to click depth. This is down to the way they are designed.

Web crawlers operate by following hyperlinks. They typically find new pages via links from other, previously crawled pages. There are a few exceptions, with discovery via XML sitemaps being one example.

Every link that Googlebot finds on the web is added to a queue of links to explore in future. Googlebot regularly takes a link from this backlog, crawls the page and decides whether to index its content for use in SERPs. At the same time, it adds all the links from that newly crawled page to its to-do list.
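
To picture that loop, here is a minimal sketch in Python. Everything in it is a made-up assumption for illustration - the link graph, the URLs and the crawl function - since a real crawler fetches pages over HTTP and parses links out of the HTML, but the queue-driven discovery is the essence of how link-following works.

    from collections import deque

    # Hypothetical in-memory link graph: each page maps to the pages it
    # links to. A real crawler would fetch URLs and parse links from HTML.
    LINKS = {
        "/": ["/blog/", "/products/"],
        "/blog/": ["/blog/page/2/", "/blog/post-a/"],
        "/blog/page/2/": ["/blog/page/3/", "/blog/post-b/"],
        "/blog/page/3/": [],
        "/blog/post-a/": ["/"],
        "/blog/post-b/": ["/"],
        "/products/": ["/products/widget/"],
        "/products/widget/": [],
    }

    def crawl(start="/"):
        frontier = deque([start])  # the to-do list of links to explore
        seen = {start}             # don't queue the same URL twice
        while frontier:
            url = frontier.popleft()
            print("Crawling and evaluating", url)
            for link in LINKS.get(url, []):  # newly discovered links
                if link not in seen:
                    seen.add(link)
                    frontier.append(link)

    crawl()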

The number of links that Googlebot explores differs from site to site. How deep it will crawl depends on whether the site is seen as authoritative, among other factors (such as how long the site takes to crawl). In the majority of cases, though, Googlebot will only travel so far.

The relationship between click depth & crawl budget

The robots used by Google and other search engines to crawl your website will always take click depth into consideration.

Importantly: pages which are more than three clicks from the homepage are less likely to be crawled at all. Exceptions may occur if your site has extremely high authority in Google’s eyes.

Critically: pages which aren’t crawled have massively limited potential in terms of search visibility.

Note about ‘spider traps’ - having the opposite problem

Sites which are considered authoritative by Google can encounter the opposite problem. Namely: being too crawlable.

In certain pagination schemes, a blog or news archive may provide ‘Previous’ and ‘Next’ links into infinity. If a site featuring this scheme is particularly authoritative, Googlebot might go deeper than usual.

A “spider trap” occurs when Googlebot spirals down this endless list of links, potentially reaching depths of over 100,000. In these instances, search performance can suffer because Googlebot’s crawls become so inefficient. The bigger your site is, the harder these issues can be to spot.
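
As a purely hypothetical illustration (not Googlebot’s real logic), imagine an archive whose “Next” link never ends and a crawler with a fixed request budget. The budget figure and the next_page helper are invented for the sketch; the point is simply that the entire budget can disappear down the pagination chain.

    # Hypothetical endlessly paginated archive: page n always links to n + 1.
    def next_page(n):
        return n + 1

    CRAWL_BUDGET = 1000  # assumed number of requests the crawler will spend

    page, requests = 1, 0
    while requests < CRAWL_BUDGET:
        requests += 1          # fetching each archive page spends budget
        page = next_page(page)

    print(f"All {CRAWL_BUDGET} requests went on archive pages, reaching a "
          f"depth of {page - 1} and leaving no budget for the rest of the site.")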

These aren’t things that small websites should have to worry about. However, it’s evident that the bigger a website gets, the more planning is needed to implement a clear, crawlable site architecture.

Why pagination affects rankings

The way you deploy pagination on your website has a significant effect on how robots will crawl and understand your content. This is because pagination is fundamentally about the architecture of your site's links.

The wrong implementation can significantly increase the depth of your key pages. In turn, this can limit your search performance by reducing the likelihood that Googlebot will fully navigate your site.

Pagination schemes widely considered to be “best practice” commonly include:

  • Always including “Previous” and “Next” links
  • Always including a “First” option, linking to the first paginated page
  • Always including a link to the first paginated page from sub-pages (e.g. blog posts)

Imagine a pagination scheme which offers a basic list of numerical options, for instance. It offers users links to the next 3-5 pages, but no “Last”, “First” or “Previous” links. If this hypothetical website had 50 pages to get through, we would be looking at click depths of at least 10 to 15, if not higher.
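
A rough back-of-the-envelope check of those numbers, using the hypothetical 50-page archive above: if each page links only to the next few pages, one click can only move you that many pages forward, so the minimum click depth of the final page is roughly the page count divided by the size of that link window.

    import math

    TOTAL_PAGES = 50

    # Each page links only to the next `window` pages, so one click moves
    # at most `window` pages forward through the archive.
    for window in (5, 3):
        depth = math.ceil((TOTAL_PAGES - 1) / window)
        print(f"Linking to the next {window} pages: page {TOTAL_PAGES} sits "
              f"at least {depth} clicks from page 1.")

And that is measured from the first paginated page; anything listed on page 50 sits deeper still once you count the clicks from the homepage into the archive and then into the item itself.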

This depth of pagination is bad news for Googlebot, which is likely to give up before exploring that extensively.

Notably, pagination isn’t just for robots - it’s for humans, too. Pagination is therefore a fine balancing act. Developers need to provide clear navigation for users, with good presentation, while ensuring all the technical fundamentals are covered.

Fortunately, there have been many studies into the effects different pagination schemes can have. Favourite examples include guides by Portent and Audisto, which clearly demonstrate how pagination architecture can reduce or increase click depth.

Fixing click depth issues: internal linking

Clearly, having a high click depth across your website is bad for crawlability and search rankings. Conversely, a low click depth corresponds with improved SEO results.

Whether your page depth issues are caused by poor pagination or wider site structure, the best place to start is with your internal links. A lack of appropriate links from pages nearer the top of the funnel is a common cause of click depth issues.

We know that hyperlinks are one of Google’s top signals for understanding what pages are about and how they relate to other pages. We also know that links provide the “map” through which Googlebot navigates and understands a website as a whole. Furthermore, the more links a page has, the more often it is crawled.

A healthy internal linking strategy will therefore help to keep click depth to a minimum - ideally a maximum of three clicks. At the same time, it will ensure that users have clear routes to the content you want them to see.
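
One practical way to check this is a breadth-first pass over your site’s link graph, which gives the minimum click depth of every page. The graph below is a hypothetical stand-in - in practice you would build it from a crawl of your own site - but the depth calculation itself is a standard breadth-first search.

    from collections import deque

    # Hypothetical link graph of your own site (page -> pages it links to).
    SITE_LINKS = {
        "/": ["/blog/", "/products/"],
        "/blog/": ["/blog/page/2/"],
        "/blog/page/2/": ["/blog/page/3/"],
        "/blog/page/3/": ["/blog/old-post/"],
        "/blog/old-post/": [],
        "/products/": ["/products/widget/"],
        "/products/widget/": [],
    }

    def click_depths(start="/"):
        """Breadth-first search: minimum click depth of each reachable page."""
        depths = {start: 0}
        queue = deque([start])
        while queue:
            url = queue.popleft()
            for link in SITE_LINKS.get(url, []):
                if link not in depths:
                    depths[link] = depths[url] + 1
                    queue.append(link)
        return depths

    for url, depth in sorted(click_depths().items(), key=lambda item: item[1]):
        flag = "  <-- deeper than three clicks" if depth > 3 else ""
        print(f"{depth}  {url}{flag}")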

You can maximise your return on Google’s crawl budget through a range of methods, including:

  • Ensure pages with useful content are linked from prominent, low-depth pages
  • Add featured links to ‘related’ or recommended products, posts and pages within the relevant category
  • Increase the number of high-level categories, adding links from a page depth closer to the homepage
  • Reduce the number of items or pages beneath a category (e.g. reduce total pagination from ten to five)
  • Create topical ‘pillar’ pages, strengthening links on key subjects - read our guide to Content Clusters for more detail
  • Look for opportunities to link top-level pages to the very deepest pages
  • Ensure site speed is optimised to reduce robots’ crawl time
  • Utilise auto-generated XML sitemaps to help inform robots of where content lives (see the sketch after this list)
  • Fix 404 errors and redirect chains: these issues cause Googlebot to slow down or crawl several times, so keep navigation across the site as smooth as possible
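
On the sitemap point above, here is a minimal sketch of generating one with Python’s standard library. The URL list is a placeholder assumption - a real site would pull its canonical URLs from the CMS or database - but the output follows the standard sitemaps.org format, ready to reference in robots.txt or submit via Google Search Console.

    from xml.etree import ElementTree as ET

    # Placeholder list of canonical URLs; generate this from your CMS in practice.
    URLS = [
        "https://www.example.com/",
        "https://www.example.com/blog/",
        "https://www.example.com/blog/deep-but-important-post/",
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in URLS:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url

    # Writes sitemap.xml to the current directory.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)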

In summary: page depth should be a key consideration for websites which rely on large structures and heavy pagination. With the right approach, it’s possible to get more out of Googlebot’s crawl budget and ensure all the content on your site is visible to robots and users.

Worried that your internal linking and pagination strategies are hurting your search performance? Get in touch with Selesti today to see how our Digital Marketing team can help.