Last updated (UTC): 2015-12-01.
# Deprecating our AJAX crawling scheme

Wednesday, October 14, 2015

In short: We are no longer recommending the [AJAX crawling](/search/docs/ajax-crawling) proposal we made [back in 2009](/search/blog/2009/10/proposal-for-making-ajax-crawlable).

In 2009, we made a [proposal to make AJAX pages crawlable](/search/blog/2009/10/proposal-for-making-ajax-crawlable). Back then, our systems were not able to render and understand pages that use JavaScript to present content to users. Because "[crawlers are not able to see any content created dynamically](/search/docs/ajax-crawling/learn-more)", we proposed a set of practices that webmasters could follow to ensure that their AJAX-based applications were indexed by search engines.

Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to [render and understand your web pages like modern browsers](/search/blog/2014/05/understanding-web-pages-better). To reflect this improvement, we recently [updated our technical Webmaster Guidelines](/search/blog/2014/10/updating-our-technical-webmaster) to recommend against disallowing Googlebot from crawling your site's CSS or JS files.

Since the assumptions behind our 2009 proposal are no longer valid, we recommend following the principles of [progressive enhancement](https://en.wikipedia.org/wiki/Progressive_enhancement). For example, you can use the History API's [`pushState()`](https://developer.mozilla.org/en-US/docs/Web/API/History_API) to ensure accessibility for a wider range of browsers (and our systems).

Questions and answers
---------------------

**Q: My site currently follows your recommendation and supports `_escaped_fragment_`. Would my site stop getting indexed now that you've deprecated your recommendation?**

A: No, the site would still be indexed.
In general, however, we recommend that you implement industry best practices when you next update your site. Instead of `_escaped_fragment_` URLs, we'll generally crawl, render, and index the `#!` URLs.

**Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?**

A: If your current setup is working fine, you don't have to change anything immediately. If you're building a new site or restructuring an existing site, simply avoid introducing `_escaped_fragment_` URLs.

**Q: I use a JavaScript framework and my web server serves a pre-rendered page. Is that still OK?**

A: In general, websites shouldn't pre-render pages only for Google. We expect that you might pre-render pages for performance benefits for users, and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user's experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.

If you have any questions, you can post them here, or [in the webmaster help forum](https://support.google.com/webmasters/go/community).

Posted by [Kazushi Nagayama](https://plus.google.com/+KazushiNagayama/), Search Quality Analyst
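For readers wondering what the `pushState()` approach mentioned above looks like in practice, here is a minimal sketch of progressive enhancement with the History API. The `data-enhance` attribute, the `stateForHref()` helper, and the `renderProduct()` renderer are illustrative names of our own, not part of the post; the key idea is that links stay plain, crawlable URLs, and script upgrades them only when the browser supports it:

```javascript
// Sketch: progressive-enhancement routing with the History API.
// Links such as <a data-enhance href="/products/42"> work as normal
// navigation for crawlers and older browsers; capable browsers
// intercept the click and update content in place instead.

// Pure helper: derive the state object to push for a product link.
function stateForHref(href) {
  const id = href.split("/").pop();
  return { page: "product", id: id, url: href };
}

// Only enhance when running in a browser with the History API, so
// the plain links keep working everywhere else.
if (typeof document !== "undefined" && window.history && history.pushState) {
  document.addEventListener("click", function (event) {
    const link = event.target.closest("a[data-enhance]");
    if (!link) return;
    event.preventDefault();
    const state = stateForHref(link.getAttribute("href"));
    history.pushState(state, "", state.url); // clean, crawlable URL; no #!
    renderProduct(state.id);                 // hypothetical in-page renderer
  });

  // Re-render when the user navigates back/forward.
  window.addEventListener("popstate", function (event) {
    if (event.state) renderProduct(event.state.id);
  });
}
```

Because the `href` values are ordinary URLs rather than `#!` fragments, the same pages can be crawled, rendered, and indexed directly, with no `_escaped_fragment_` machinery involved.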