Content Crawling Issues with Client-Side Rendering (CSR) JavaScript
Hello,

We have a website that uses Client-Side Rendering (CSR). Initially, we had problems with pages not being crawled and indexed; specifically, we were getting a “Duplicate Without User-Selected Canonical” error. We suspected this was caused by JS-related, and more specifically CSR-related, issues where Google could not see the page content because it was not executing JavaScript on enough pages.

As a workaround, we integrated Prerender.io. The “Duplicate Without User-Selected Canonical” error has now been replaced by “Crawled - Currently Not Indexed”, and we have over 10,000 such URLs.

We are now considering whether moving to SSR with Next.js could help our situation. It would be a labor-intensive transition, and we don't know whether the results would be worth the effort.

Daniel Cheung, one of the technical SEO experts I follow, made the following comment about the “Crawled - Currently Not Indexed” error, and we share the same suspicion: "Google wants to know what content it is committing to its index. Therefore, it needs to see what the webpage looks like and it does so with mobile-first indexing. If the page requires extensive JavaScript or critical CSS paths have been blocked in the robots.txt file, Google cannot see the webpage and therefore cannot gauge the relevance and quality of the content. This is one of the most common reasons why modern JS frameworks experience indexing issues."

We look forward to your support on this issue.
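For reference, the kind of per-page change we're weighing with Next.js looks roughly like the sketch below. This is only a minimal illustration assuming the Pages Router and getServerSideProps; the route, API endpoint, and field names are placeholders, not our actual code. The idea is that the full HTML is rendered on the server for every request, so the crawler receives the content without having to execute JavaScript.

```tsx
// pages/products/[slug].tsx — hypothetical route used only for illustration.
import type { GetServerSideProps } from 'next';

interface ProductPageProps {
  title: string;
  description: string;
}

// Runs on the server for each request: fetches the data and returns it as props,
// so the page is delivered as fully rendered HTML rather than an empty CSR shell.
export const getServerSideProps: GetServerSideProps<ProductPageProps> = async ({ params }) => {
  const slug = params?.slug as string;
  const res = await fetch(`https://api.example.com/products/${slug}`); // placeholder API
  if (!res.ok) {
    // Returning a real 404 keeps thin or missing pages out of the index.
    return { notFound: true };
  }
  const product = await res.json();
  return {
    props: {
      title: product.title,
      description: product.description,
    },
  };
};

export default function ProductPage({ title, description }: ProductPageProps) {
  return (
    <main>
      <h1>{title}</h1>
      <p>{description}</p>
    </main>
  );
}
```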