Updating our technical Webmaster Guidelines
Monday, October 27, 2014
We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on. Today, we're updating one of our technical Webmaster Guidelines in light of this announcement.
For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. Disallowing crawling of JavaScript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content, and can result in suboptimal rankings.
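To make the guideline concrete, here is a hypothetical robots.txt fragment (the paths are purely illustrative) showing the harmful pattern:

```text
# Harmful: blocks the scripts and stylesheets our renderer needs
User-agent: *
Disallow: /assets/js/
Disallow: /assets/css/
```

The safe alternative is simply to omit those Disallow lines, or, if a broader section is blocked, to allow the resource paths explicitly:

```text
# Safe: Googlebot can fetch the files your pages depend on
User-agent: *
Allow: /assets/js/
Allow: /assets/css/
```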
Updated advice for optimal indexing
Historically, our indexing systems resembled old text-only browsers, such as Lynx, and that's what our Webmaster Guidelines described. Now, with indexing based on page rendering, it's no longer accurate to think of our indexing system as a text-only browser; a modern web browser is the closer approximation. With that new perspective, keep the following in mind:
Just like modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement, as this helps our systems (and a wider range of browsers) see usable content and basic functionality when certain web design features are not yet supported.
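As a minimal sketch of that principle (the function and class names here are illustrative, not from the guidelines): the page ships a usable baseline in plain HTML, and script only layers extras on top when the features it needs actually exist in the current environment.

```javascript
// Progressive enhancement, sketched with illustrative names: the plain
// HTML navigation works everywhere; script only upgrades it when the
// APIs it depends on are present.
function enhanceNavigation(doc) {
  const nav = doc.querySelector('nav');
  // No matching element, or scripts not running at all: the plain
  // HTML links remain fully usable, so there is nothing to do.
  if (!nav) return 'baseline';
  // Only attach the collapsible-menu behavior when the APIs it
  // relies on exist in this environment.
  if (nav.classList && typeof nav.addEventListener === 'function') {
    nav.classList.add('js-enhanced');
    return 'enhanced';
  }
  return 'baseline';
}
```

You can verify the effect by loading the page with scripts disabled: the navigation should still be a usable list of links.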
Pages that render quickly not only help users get to your content more easily, but also make indexing of those pages more efficient. We advise you to follow the best practices for page performance optimization, specifically:
Eliminate unnecessary downloads
Optimize the serving of your CSS and JavaScript files by concatenating (merging) your separate CSS and JavaScript files, minifying the concatenated files, and configuring your web server to serve them compressed (usually with gzip compression)
Make sure your server can handle the additional load of serving JavaScript and CSS files to Googlebot.
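As one hedged example of the serving step above, an nginx configuration fragment along these lines (the directive values are illustrative, not a recommendation) serves already-concatenated, minified bundles compressed and cacheable:

```nginx
# Illustrative nginx fragment: compress text assets on the way out.
gzip            on;
gzip_types      text/css application/javascript;
gzip_comp_level 5;

# Cache lifetimes for the concatenated, minified bundles; tune to
# how often your assets change.
location ~* \.(?:css|js)$ {
    expires 7d;
    add_header Cache-Control "public";
}
```

Equivalent settings exist for Apache (mod_deflate and mod_expires); the point is the same: fewer, smaller, cacheable files.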
Testing and troubleshooting
In conjunction with the launch of our rendering-based indexing, we also updated the Fetch and Render as Google feature in Webmaster Tools so that webmasters can see how our systems render a page. With it, you can identify a number of indexing issues: improper robots.txt restrictions, redirects that Googlebot cannot follow, and more.
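To sketch what the robots.txt side of such a check involves, here is a deliberately simplified model (illustration only: real robots.txt matching also handles wildcards, Allow rules, and picking the most specific user-agent group):

```javascript
// Simplified robots.txt check: does any Disallow rule in a group that
// applies to Googlebot match the given path as a prefix?
function isBlockedForGooglebot(robotsTxt, path) {
  let groupApplies = false;
  const disallowed = [];
  for (const rawLine of robotsTxt.split('\n')) {
    const line = rawLine.split('#')[0].trim(); // strip comments
    const sep = line.indexOf(':');
    if (sep === -1) continue;
    const field = line.slice(0, sep).trim().toLowerCase();
    const value = line.slice(sep + 1).trim();
    if (field === 'user-agent') {
      // A group applies if it names Googlebot or uses the wildcard.
      groupApplies = value === '*' || value.toLowerCase().includes('googlebot');
    } else if (groupApplies && field === 'disallow' && value) {
      disallowed.push(value);
    }
  }
  return disallowed.some((prefix) => path.startsWith(prefix));
}
```

A tool like Fetch and Render does this (and much more) for you; the sketch just shows why a single `Disallow: /assets/js/` line is enough to hide your scripts from the renderer.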
And, as always, if you have any comments or questions, please ask in our Webmaster Help forum.
Posted by Pierre Far, Webmaster Trends Analyst