Googlebot Optimization: Going a Step Beyond SEO
The difference between Googlebot optimization and regular search engine optimization is that Googlebot optimization focuses on how Google’s crawler accesses your site. A site’s crawlability is the essential first step to ensuring its searchability.
What is Googlebot?
Googlebot is Google’s search bot (also known as a spider) that crawls the web and creates an index. The amount of time that Googlebot gives to your site is called “crawl budget.” The greater a page’s authority, the more crawl budget it receives.
Google’s Googlebot article says that “Googlebot shouldn’t access your site more than once every few seconds on average.” In other words, your site is always being crawled, provided it is correctly set up to accept crawlers.
Googlebot first accesses a site’s robots.txt file to find out the rules for crawling the site. Any pages that are disallowed will not be crawled or indexed. A robots.txt file is essential because it serves as a directive to Googlebot. Left to itself, Googlebot will spend its crawl budget on any pages it finds on your site, so you need to tell it where it should and shouldn’t expend that budget. If there are any pages or silos of your site that should not be crawled, modify your robots.txt accordingly.
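For example, a minimal robots.txt along these lines keeps Googlebot out of low-value sections while pointing it at your sitemap (the paths and domain below are hypothetical placeholders, not a recommendation for any particular site):

User-agent: Googlebot
Disallow: /search/
Disallow: /cart/

User-agent: *
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml

Everything not explicitly disallowed remains crawlable, so the file only needs to list the sections you want Googlebot to skip.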
The less time Googlebot spends on unnecessary sections of your site, the more time it can devote to crawling and returning the more important sections of your site.
Content that is crawled more frequently is more likely to gain more traffic. Although PageRank is probably the main factor in crawl frequency, it’s likely that PageRank matters less than freshness when Googlebot is choosing between similarly ranked pages.
For Googlebot optimization, it’s especially crucial to get your lower-ranked pages crawled as often as possible. By using internal linking, you can also provide a map for Googlebot to follow as it crawls your site.
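As a rough sketch (the URLs here are invented for illustration), internal links placed on a frequently crawled page, such as your homepage or a popular post, hand Googlebot a direct path to deeper pages:

<a href="/guides/googlebot-optimization/">Googlebot optimization guide</a>
<a href="/guides/crawl-budget/">What is crawl budget?</a>

Each crawl of the linking page then gives Googlebot a fresh opportunity to discover and re-crawl the pages it points to.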
Read more about Googlebot Optimization on KISSmetrics’ blog.