# More control of Googlebot's crawl rate
Thursday, December 04, 2008
We've upgraded the crawl rate setting in
[Webmaster Tools](https://search.google.com/search-console)
so that webmasters experiencing problems with Googlebot can now provide us with more specific
information. The crawl rate for your site determines how much time Googlebot spends crawling it on
each visit. Our goal is to thoroughly crawl your site (so your pages can be indexed and returned
in search results!) without creating a noticeable impact on your server's bandwidth. While most
webmasters are fine with the default crawl setting (that is, no changes needed; more on that
below), some webmasters have more specific needs.
[Googlebot](/search/docs/crawling-indexing/googlebot) employs sophisticated
algorithms that determine how much to crawl each site it visits. For the vast majority of sites,
it's best to choose the "Let Google determine my crawl rate" option, which is the
default. However, if you're an advanced user or you're facing bandwidth issues with your
server, you can customize the crawl rate to the speed that works best for your web server(s). The
custom crawl rate option lets you tell Googlebot the maximum number of
requests per second and the number of seconds between requests that you feel are best for your
environment.
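The two custom values are just reciprocal views of the same rate. As a minimal sketch of the arithmetic (illustrative numbers only; the actual values you can enter come from Webmaster Tools):

```python
# Illustrative arithmetic only: the relationship between the two values
# shown in the custom crawl rate setting.

def seconds_between_requests(requests_per_second: float) -> float:
    """Delay between requests implied by a requests-per-second rate."""
    return 1.0 / requests_per_second

# Allowing 2 requests per second implies a 0.5-second gap between requests,
# and allowing 0.5 requests per second implies a 2-second gap.
print(seconds_between_requests(2.0))  # 0.5
print(seconds_between_requests(0.5))  # 2.0
```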
Googlebot determines the range of crawl rate values available to you in Webmaster Tools, based on
our understanding of your server's capabilities. This range may vary from one site to another and
over time, based on several factors. Setting the crawl rate to a lower-than-default value may
affect the coverage and freshness of your site in Google's search results; setting it to a higher
value than the default won't improve your coverage or ranking. If you do set a custom crawl rate,
the new rate stays in effect for 90 days, after which it resets to Google's recommended value.
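The 90-day window is simple calendar arithmetic. A small sketch, using this post's publication date as a hypothetical example of when a custom rate was set:

```python
from datetime import date, timedelta

# Hypothetical helper: the date on which a custom crawl rate set on
# `set_on` would revert to Google's recommended value (90 days later).
def reset_date(set_on: date) -> date:
    return set_on + timedelta(days=90)

# A custom rate set on December 4, 2008 would revert on March 4, 2009.
print(reset_date(date(2008, 12, 4)))  # 2009-03-04
```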
You can use this setting only for root-level sites and sites not hosted on a large domain like
blogspot.com (we have special settings assigned for those). To check the crawl rate setting, sign
in to
[Webmaster Tools](https://search.google.com/search-console)
and visit the Settings tab. If you have additional questions, visit the
[Webmaster Help Center](https://support.google.com/webmasters)
to learn more about how Google crawls your site, or post your questions in the
[Webmaster Help Forum](https://support.google.com/webmasters/community).
Written By Pooja Shah, Software Engineer, Webmaster Tools Team