
Google Clarifies Impact of Blocking Crawling on Linking Power

Felix Rose-Collins

Intro

In a recent SEO Office Hours podcast, Google's John Mueller addressed a question about whether blocking Google from crawling a webpage negates the "linking power" of the internal and external links on that page. His answer offered a user-centric perspective on how Google perceives and handles links, a technical aspect of SEO that is often misunderstood.

Links, both internal and external, are fundamental to SEO because they convey the importance and relevance of web pages. Internal links help Google understand the structure of a website and prioritize its pages, while external links (backlinks) are traditionally seen as endorsements that can boost a site's authority and ranking potential. However, much of what the industry believes about how Google assesses these links, especially external ones, is speculative and rests on older, possibly outdated information.

The question posed to Mueller was whether blocking Google from crawling a page would negate the linking power of links on that page. Mueller's response was framed from the perspective of a user, suggesting that if a page isn't accessible, users—and by extension, search engines—cannot interact with or follow the links it contains. Therefore, blocking a page from being crawled effectively nullifies the influence of its links, as they are invisible to Google.
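To make the mechanics concrete, here is a minimal, hypothetical robots.txt rule of the kind being discussed (the path is invented for illustration):

    User-agent: Googlebot
    Disallow: /private-section/

Under a rule like this, Googlebot never fetches pages beneath /private-section/, so any internal or external links those pages contain are never discovered and can pass no signals.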

Mueller emphasized a user-centric approach to understanding the impact of links. This perspective contrasts with older, more mechanical views, such as the misguided belief that simply placing keywords or links in specific contexts (like the old practice of linking from a page about "octopus ink" to a printer ink site) would influence rankings. Instead, Mueller suggests considering whether the links provide value or relevance from a user's standpoint.

This shift to a user-based perspective aligns with broader SEO trends that prioritize user experience and relevance over purely technical manipulations.

To ensure that important pages are discoverable, Mueller advised making sure they are linked from indexable and relevant pages within the website. Blocking critical pages from crawling or indexing can make it difficult for search engines to discover and rank them, potentially undermining SEO efforts.
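As a simple sketch, surfacing an important page is just a matter of an ordinary, crawlable link from a page that is itself indexable (the URLs here are hypothetical):

    <!-- On an indexable hub page such as /blog/ -->
    <a href="/guides/technical-seo/">Read our technical SEO guide</a>

Because the hub page can be crawled and indexed, Google can discover the linked guide and credit the link.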

Misconceptions About Robots Meta Tags

Mueller also highlighted a common misconception about the robots meta tag. Some website owners use the noindex, follow directive in the belief that it will keep a page out of the index while still allowing search engines to follow its links. In practice, Google has explained that a page that remains noindexed is crawled less and less often over time, and its links eventually stop being followed, making noindex, follow effectively equivalent to noindex, nofollow in the long run. It is also worth noting that follow is simply the default behavior: while nofollow asks crawlers not to follow links, there is no directive that can force search engines to follow them.
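For reference, the combination in question looks like this in a page's head section (shown as a sketch, not a recommendation):

    <!-- Intended: keep the page out of the index but pass link signals -->
    <!-- In practice: a long-noindexed page is crawled less often, and
         its links eventually stop being followed -->
    <meta name="robots" content="noindex, follow">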

Conclusion

Mueller's insights underscore the importance of a user-centric approach in SEO, particularly in the context of link handling and crawl management. Ensuring that critical pages are accessible and linked from relevant, indexable content is crucial for maintaining the linking power and overall discoverability of a site. As SEO evolves, focusing on genuine user value and experience continues to be key in aligning with search engine algorithms and achieving sustainable results.

Felix Rose-Collins

Ranktracker's CEO/CMO & Co-founder

Felix Rose-Collins is the Co-founder and CEO/CMO of Ranktracker. With over 15 years of SEO experience, he has single-handedly scaled the Ranktracker site to over 500,000 monthly visits, with 390,000 of these stemming from organic searches each month.
