Using JavaScript and hashed URLs (for example, example.com/#/products) to load elements dynamically is not a good practice and causes problems with your website's visibility in SERPs, because Google generally ignores the part of a URL after the hash.

Robots.txt

Make sure your robots.txt file does not block JavaScript files. If Google cannot access your JavaScript files, it will not be able to render your site's content correctly. Therefore, check that none of the folders you disallow contain JS files your pages depend on; a sketch of a robots.txt file blocking a JS file folder is shown below. You can also use Screaming Frog to find blocked resources.
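As a rough illustration (the /assets/js/ folder name is a hypothetical example, not taken from a real site), a rule like this blocks an entire JavaScript folder:

User-agent: *
Disallow: /assets/js/

With such a rule in place, Googlebot cannot fetch the scripts in /assets/js/, so any content they render never reaches the rendered page. Removing the Disallow line, or adding an explicit Allow rule for the JS files Google needs, resolves the problem.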
We want to make sure Google can render your site correctly, and to do this we need to ensure that no important JS files are blocked in the robots.txt file.

GSC Crawl Stats Report

A good place to see how many JavaScript files Google crawls on your site is the Crawl Stats report in Google Search Console. You can find it in the settings. When you open the report, you can see the crawl statistics broken down by file type, so it is advisable to check it periodically.

Server-Side Rendering

You have probably come across the terms server-side and client-side rendering.

Let's briefly recap:

Client-side rendering: the page content is generated in the browser, not on the server. If you are using a JavaScript framework, client-side rendering is usually the default.

Server-side rendering: the JavaScript is executed on the server, and the browser only displays the final result.

Why is client-side rendering not good for SEO? Relying on JavaScript for client-side rendering in practice means extra seconds of page loading, which is bad for user experience. It also tends to mean worse results in SERPs: JavaScript problems can prevent your website, or some of its content, from appearing in the search engine.
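To make the difference concrete, here is a minimal server-side rendering sketch in Node.js; Express, React and react-dom are assumed to be installed, and the App component, route and port are hypothetical placeholders rather than anything from this article.

const express = require("express");
const React = require("react");
const { renderToString } = require("react-dom/server");

// Hypothetical page component.
function App() {
  return React.createElement("h1", null, "Hello from the server");
}

const server = express();

server.get("/", (req, res) => {
  // The HTML is generated on the server...
  const html = renderToString(React.createElement(App));
  // ...so the browser (and Googlebot) receives ready-made content
  // instead of an empty div that JavaScript must fill in later.
  res.send(`<!DOCTYPE html><html><body><div id="root">${html}</div></body></html>`);
});

server.listen(3000);

With client-side rendering, the same App component would instead be shipped to the browser as a script bundle, so the initial HTML Google fetches contains little or no content until that script runs.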