Google has announced that it will no longer support its original AJAX crawling scheme, which dates back to 2009. Starting in the second quarter of 2018, Google said, it will “no longer be using the AJAX crawling scheme.”
This should come as no surprise, because Google said years ago that it no longer officially and fully supported this method of AJAX crawling.
Now, Google is saying it will crawl and render your AJAX-based sites as is. John Mueller of Google wrote in the new blog post that “Googlebot will render the #! URL directly, making it unnecessary for the website owner to provide a rendered version of the page.” That means Google will continue to support these URLs in its search results.
Based on some of its tests, Google said it expects “AJAX-crawling websites won’t see significant changes with this update.” Of course, when the switch happens, we will see how many people complain. To prepare, Google shared these tips with webmasters:
- Verify ownership of the website in Google Search Console to gain access to the tools there, and to allow Google to notify you of any issues that might be found.
- Test with Search Console’s Fetch & Render. Compare the results of the #! URL and the escaped fragment URL to see any differences. Do this for any significantly different part of the website. Check Google’s developer documentation for more information on supported APIs, and see its debugging guide when needed.
- Use Chrome’s Inspect Element to confirm that links use “a” HTML elements and include rel=nofollow where appropriate (for example, in user-generated content).
- Use Chrome’s Inspect Element to check the page’s title and description meta tag, any robots meta tag, and other metadata. Also, check that any structured data is available on the rendered page.
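For the Fetch & Render comparison in the second tip, every #! URL has a corresponding escaped fragment URL under the old scheme. Below is a minimal sketch of that mapping, based on Google's deprecated 2009 AJAX crawling specification; the function name and example URLs are illustrative, not part of any announcement:

```python
def escaped_fragment_url(hashbang_url: str) -> str:
    """Map a #! URL to its _escaped_fragment_ equivalent, per the
    now-deprecated 2009 AJAX crawling scheme."""
    if "#!" not in hashbang_url:
        return hashbang_url  # not an AJAX-crawling URL; nothing to rewrite
    base, fragment = hashbang_url.split("#!", 1)
    # The scheme requires percent-encoding of these characters in the
    # fragment ('%' first, so earlier replacements aren't re-encoded).
    for ch in ("%", "#", "&", "+"):
        fragment = fragment.replace(ch, f"%{ord(ch):02X}")
    # Append the encoded fragment as the _escaped_fragment_ query parameter.
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={fragment}"

print(escaped_fragment_url("https://example.com/page#!state=home"))
# → https://example.com/page?_escaped_fragment_=state=home
```

Fetching both forms of a URL and diffing the rendered output is a quick way to spot pages where the pre-rendered snapshot and the live #! page have drifted apart.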