Official Google Webmaster Central Blog
Updating our technical Webmaster Guidelines

Posted: 27 Oct 2014 02:56 AM PDT

Webmaster level: All

We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on. Today, we're updating one of our technical Webmaster Guidelines in light of this announcement.

For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. Disallowing crawling of JavaScript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content, and can result in suboptimal rankings.

Updated advice for optimal indexing

Historically, Google's indexing systems resembled old text-only browsers, such as Lynx, and that's what our Webmaster Guidelines said. Now, with indexing based on page rendering, it's no longer accurate to see our indexing systems as a text-only browser; a more accurate approximation is a modern web browser. With that new perspective, keep the following in mind:

- Just like all modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement, so our systems (and a wider range of browsers) can see usable content and basic functionality even when certain features are not supported.
- Pages that render quickly not only help users get to your content more easily, but also make indexing of those pages more efficient. Follow the best practices for page performance optimization: eliminate unnecessary downloads, concatenate and minify your separate CSS and JavaScript files, and configure your server to serve them compressed.
- Make sure your server can handle the additional load of serving JavaScript and CSS files to Googlebot.
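For example, robots.txt rules like these (the directory names are only illustrative, not a recommended layout) would block Googlebot from the very files it needs to render your pages, and should be removed or loosened:

```text
# Avoid rules like these -- they hide rendering resources from Googlebot:
User-agent: Googlebot
Disallow: /css/
Disallow: /js/
Disallow: /images/
```

With no Disallow rules covering those paths, Googlebot can fetch the CSS, JavaScript, and images your pages reference, and render them the way a browser would.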
Testing and troubleshooting

In conjunction with the launch of our rendering-based indexing, we also updated the Fetch and Render as Google feature in Webmaster Tools so that webmasters can see how our systems render their pages. With it, you'll be able to identify a number of indexing issues: improper robots.txt restrictions, redirects that Googlebot cannot follow, and more. And, as always, if you have any comments or questions, please ask in our Webmaster Help forum.
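As a quick local sanity check alongside Fetch and Render, Python's standard urllib.robotparser can evaluate whether a given set of robots.txt rules would let Googlebot fetch a resource. The rules, domain, and file paths below are illustrative, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: JS is blocked, CSS is explicitly allowed.
robots_txt = """\
User-agent: Googlebot
Disallow: /assets/js/
Allow: /assets/css/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The CSS file is fetchable, but the JavaScript file is blocked --
# the kind of restriction Fetch and Render would surface as a rendering issue.
print(rp.can_fetch("Googlebot", "https://example.com/assets/css/site.css"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))     # False
```

Running this against your real robots.txt (via RobotFileParser's set_url and read methods) is a simple way to spot Disallow rules that hide CSS or JavaScript from Googlebot before they hurt indexing.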