Tips on How to Index JavaScript Sites in Google

On his Google+ page, John Mueller has posted an update on how Google indexes JavaScript sites and progressive web apps. Here are his recommendations:

1.- Don't hide anything from the Google robot

Use feature detection and progressive enhancement techniques to make your content available to all users.

Avoid redirecting users to an “unsupported browser” page.

Features Googlebot does not currently support include Service Workers, the Fetch API, and requestAnimationFrame.
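
As a rough illustration of this idea, the sketch below renders baseline content unconditionally and only layers extra behaviour on top when the required APIs exist. The helper names (renderStaticList, enhanceWithLiveUpdates) and the list items are placeholders, not part of Mueller's recommendations.

```typescript
// Feature-detection sketch: baseline content is rendered for every client,
// enhancements only run when the browser actually supports the needed APIs.
// renderStaticList and enhanceWithLiveUpdates are illustrative placeholders.
function renderStaticList(items: string[]): void {
  const list = document.createElement("ul");
  for (const item of items) {
    const entry = document.createElement("li");
    entry.textContent = item;
    list.appendChild(entry);
  }
  document.body.appendChild(list);
}

function enhanceWithLiveUpdates(): void {
  window.requestAnimationFrame(() => {
    // Smooth, client-side-only updates would go here.
  });
}

// Always render the crawlable baseline first...
renderStaticList(["Article 1", "Article 2", "Article 3"]);

// ...then add optional behaviour only if the features are available.
if ("fetch" in window && "requestAnimationFrame" in window) {
  enhanceWithLiveUpdates();
}
```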

2.- Use the rel=canonical attribute

Use the rel=canonical attribute when you need to serve the same content from multiple URLs. You can find more information about the attribute here (https://www.seoprofiler.com/training/faq?c=audit&c1=audit7&p=training/audit7.html).
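
If a page is rendered entirely on the client, the canonical link can also be injected from JavaScript, although serving it in the initial HTML response is generally more reliable. The sketch below is only an illustration; the URL is a placeholder.

```typescript
// Sketch: declare one preferred URL for content that is reachable under
// several URLs (tracking parameters, session IDs, etc.).
// The href below is a placeholder; use the preferred version of your own page.
const canonical = document.createElement("link");
canonical.rel = "canonical";
canonical.href = "https://www.example.com/products/blue-widget";
document.head.appendChild(canonical);
```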

3.- Avoid the AJAX crawling scheme on new sites

Consider migrating old sites that use this scheme soon. Remember to remove the “meta fragment” tags when migrating. Do not use a “meta fragment” tag if the “escaped fragment” URL does not serve fully rendered content.
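
As a small aid when migrating, the snippet below (not from Mueller's post) looks for the deprecated AJAX-crawling meta tag so you can confirm it has been removed.

```typescript
// Sketch: flag the deprecated AJAX-crawling "meta fragment" tag during a migration audit.
// Can be run in the browser console on any page of the old site.
const fragmentTag = document.querySelector('meta[name="fragment"][content="!"]');
if (fragmentTag) {
  console.warn("Deprecated meta fragment tag found; remove it when migrating.", fragmentTag);
} else {
  console.log("No meta fragment tag found.");
}
```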

4.- Avoid using “#” in URLs (outside of “#!”)

Googlebot rarely indexes URLs with “#” in them. Use “normal” URLs with path/filename/query parameters instead, and consider using the History API for navigation.
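
A rough sketch of History API navigation, assuming a hypothetical loadSection function that fetches and renders the content for a given path:

```typescript
// Sketch: navigate with "normal" paths via the History API instead of "#" fragments.
// loadSection is a hypothetical stand-in for your own content-loading code.
function loadSection(path: string): void {
  // Fetch and render the content for this path here.
}

document.addEventListener("click", (event) => {
  const anchor = (event.target as HTMLElement).closest("a[data-internal]");
  if (!anchor) return;
  event.preventDefault();
  const path = anchor.getAttribute("href")!; // e.g. "/shoes/running"
  loadSection(path);
  history.pushState({}, "", path); // crawlable URL, no "#"
});

// Keep back/forward buttons working for the same paths.
window.addEventListener("popstate", () => {
  loadSection(location.pathname);
});
```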

5.- Check your web pages

Use Search Console's Fetch and Render tool to test how Googlebot sees your pages. Note that this tool does not support “#!” or “#” URLs.

6.- Check the robots.txt file

Make sure that all required resources (including JavaScript files and frameworks, server responses, third-party APIs, etc.) are not blocked by robots.txt.

The Fetch and Render tool will list the blocked resources it discovers. If resources are blocked by robots.txt (e.g. a third-party API) or are otherwise temporarily unavailable, make sure your client-side code fails gracefully.
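
As an illustration of failing gracefully, the sketch below falls back to empty data when a third-party call cannot complete; the function name and endpoint URL are placeholders.

```typescript
// Sketch: degrade gracefully when a third-party resource is blocked or unavailable,
// so the core content still renders for users and for Googlebot.
// The endpoint URL and function name are illustrative placeholders.
async function fetchRecommendations(): Promise<string[]> {
  try {
    const response = await fetch("https://api.example-partner.com/recommendations");
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    return (await response.json()) as string[];
  } catch {
    // Blocked by robots.txt for the crawler, offline, or timed out:
    // return a safe fallback instead of breaking the whole page render.
    return [];
  }
}
```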

7.- Do not use too many embedded resources

Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts and the page being rendered without these resources being available (for example, some JavaScript files might not be loaded). Use reasonable HTTP caching policies.
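
As one way to apply such a caching policy, the minimal Node.js sketch below serves a bundled script with a long-lived Cache-Control header; the port and file path are examples only.

```typescript
// Minimal Node.js sketch: serve the main JavaScript bundle with a long-lived
// Cache-Control header so it can be reused instead of re-fetched on every render.
// "./dist/app.js" and port 8080 are example values.
import { createServer } from "node:http";
import { readFileSync } from "node:fs";

const bundle = readFileSync("./dist/app.js");

createServer((req, res) => {
  if (req.url === "/app.js") {
    res.writeHead(200, {
      "Content-Type": "application/javascript",
      // A year-long, immutable policy is safe when file names change per release.
      "Cache-Control": "public, max-age=31536000, immutable",
    });
    res.end(bundle);
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(8080);
```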

8.- Google supports JavaScript to some extent

Google supports the use of JavaScript to provide titles, descriptions, robots meta tags, structured data, and other meta-data. When using AMP, the AMP HTML page must be static as required by the specification, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct “lastmod” dates to signal changes on your website.
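
A rough sketch of providing this metadata from client-side code; all titles, descriptions, and structured-data values below are placeholders.

```typescript
// Sketch: set the title, meta description, and JSON-LD structured data from
// client-side JavaScript. All values are placeholders for your own content.
document.title = "Blue Widget – Example Store";

const description = document.createElement("meta");
description.name = "description";
description.content = "Hand-made blue widgets, shipped worldwide.";
document.head.appendChild(description);

const structuredData = document.createElement("script");
structuredData.type = "application/ld+json";
structuredData.textContent = JSON.stringify({
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Blue Widget",
});
document.head.appendChild(structuredData);
```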

9.- Other search engines might not support JavaScript at all

Finally, keep in mind that other search engines and web services that access your content might not support JavaScript at all, or might support only a different subset.

In general, critical content on a web page should not be hidden behind JavaScript. Google might be able to index JavaScript content to some extent, but other search engines may still have difficulty with it.

Source: Free SEO News