Does JavaScript Rendering Impact Your SEO?

Felix Everett | 19 August 2025

In the volatile age of AI search, the impact of JavaScript rendering on search ranking has again found itself under the spotlight. JavaScript and the rendering pipeline are often seen as a black box by those outside web development, but with a growing number of AI web crawlers scraping the Internet for content to feed hungry LLMs, it’s more important than ever to understand JavaScript SEO best practices and how rendering affects your ability to rank in AI search.

What Is JavaScript Rendering?

Rendering in the context of websites is the broad term used to describe the process that turns HTML, CSS, and JavaScript into the pixels on your screen. JavaScript rendering is simply the involvement of JavaScript in that process, often to dynamically add new content to existing HTML.
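
As a minimal sketch (the element ID and data here are purely illustrative), JavaScript rendering can be as simple as a script injecting content into an otherwise empty element:

    <div id="product-list"></div>
    <script>
      // The HTML above arrives empty; JavaScript "renders" the content into it.
      const products = ['Gnocchi', 'Penne', 'Rigatoni']; // hypothetical data
      const list = document.getElementById('product-list');
      list.innerHTML = products.map(p => '<p>' + p + '</p>').join('');
    </script>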


To understand how this affects SEO, it’s key to know the steps that happen between downloading a webpage from the Internet and seeing it on your screen.


Firstly, when a webpage arrives in your browser, the browser begins to parse the page’s HTML into a structural representation called the DOM (Document Object Model) tree. Nothing is visible to the user at this stage, but it is this layer that JavaScript modifies. There is also a complementary CSS Object Model (CSSOM), which combines with the DOM to apply styling.


As your browser parses the HTML, if it encounters a <script> tag it will pause parsing to download and execute the script (known as blocking) before resuming. For page-critical JavaScript, render-blocking behaviour is essential to ensure a page loads predictably, but for non-essential JavaScript, the defer and async attributes can be added to the tag to allow the HTML to finish parsing first. You can read the full documentation on MDN web docs.
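
For illustration (the filenames are placeholders), the attributes sit directly on the tag:

    <!-- Blocks parsing: downloaded and executed immediately -->
    <script src="critical.js"></script>

    <!-- defer: downloaded in parallel, executed in document order once parsing completes -->
    <script src="analytics.js" defer></script>

    <!-- async: downloaded in parallel, executed as soon as it arrives (order not guaranteed) -->
    <script src="widget.js" async></script>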


However, for large scripts, render-blocking can negatively impact your website’s SEO through its effect on Core Web Vitals scores.


The way a website renders JavaScript can also play a role in site optimisation. Using JavaScript to build the page in the client’s browser, for instance, is known as client-side rendering (CSR). It is a popular approach to web development thanks to its flexibility and dynamic feel, but it can have serious SEO consequences if not handled carefully.


Lastly, as AI pushes its way into all corners of tech, the implications of JavaScript rendering for web crawlers, used by search indexing and AI models alike, are ever more important.


Can (AI) Crawlers Render JavaScript?

One of the most important considerations for JavaScript SEO is the impact of JavaScript on crawling and indexation.


The largest web crawlers in the game - Googlebot and Bingbot - are both capable of rendering JavaScript and indexing rendered pages. In Google’s case, pages are sent through a multi-step process: once a URL is in the crawl queue, its HTML is parsed and added to a render queue. The HTML is then rendered and sent back to be evaluated for indexing in Google Search.


In short, rendering content with JavaScript does not prevent indexation on Google.


In a talk on rendering at BrightonSEO earlier this year, Google’s Martin Splitt covered the indexing pipeline, importantly emphasising that “the render queue time is not your biggest enemy” in getting your webpages indexed, and that the median time in the render queue is only five seconds. In short, using JavaScript for rendering does not, on its own, affect your webpage’s chance of being indexed.


But do other web crawlers render JavaScript, such as those used by AI? As the presence of AI in Search continues to increase, all attention is on how AI reads the Internet.


Most crawlers that collect data for AI training - like OpenAI’s GPTBot and Anthropic’s ClaudeBot - only fetch static HTML and skip JavaScript entirely, a finding backed up by a study from Vercel (Google’s Gemini, which uses the same rendering service as Googlebot, is a notable exception). For these crawlers, any page content hidden behind JavaScript will never be used as training data. For example, if a website about gnocchi were to place its content behind JavaScript rendering, and it were the only website about gnocchi in the world, the AI model would not have that information to train on, and would not be able to tell users about gnocchi. Apply this to an industry like ecommerce and it’s immediately clear why you may not want to hide your pages behind JavaScript.
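
To make that concrete, here is a simplified, hypothetical sketch of what a non-rendering crawler receives from a client-rendered page, versus what a browser (or rendering crawler) ends up with:

    <!-- What a GPTBot-style crawler fetches: the raw HTML, content missing -->
    <body>
      <div id="app"></div>
      <script src="bundle.js"></script> <!-- never executed by the crawler -->
    </body>

    <!-- What a browser or rendering crawler sees after the script runs -->
    <body>
      <div id="app">
        <h1>All About Gnocchi</h1>
        <p>Gnocchi are dumplings made from potatoes.</p>
      </div>
    </body>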


Let’s briefly touch on how AI training works. When a large language model (LLM) trains on a dataset, it doesn’t simply memorise a collection of “facts”. Instead, it learns patterns and associations from enormous sets of information. If “gnocchi” and “made from potatoes” frequently appear together, a model trained on that association can accurately answer the question “what is gnocchi made from?”.


More recently, AI chatbots can now perform searches as part of their user experience and cite links to web sources. ChatGPT and Gemini use Bing’s and Google’s indexes respectively, both of which index rendered web pages, so placing content behind JavaScript will not prevent it from appearing in AI search results.


So, JavaScript rendering may prevent content from being scraped for AI training datasets, but for AI search tools like ChatGPT and Gemini this content is still very much reachable.

Does Client-Side Rendering Affect SEO?

The way in which a website handles its rendering also affects SEO. There are three main approaches to rendering content on websites: client-side rendering, server-side rendering, and static site generation.

Client-Side Rendering often poses challenges for SEO

Websites that use client-side rendering (CSR) defer the responsibility of page rendering to the user’s own browser. This is great for dynamic webpages like dashboards, feeds, and single-page applications, and is often found in ecommerce.


However, sites that deploy pages using CSR need to fetch additional resources to add to the DOM as the HTML is parsed, which can drive up page load times. If those resources are not carefully cached and served close to the user, this can become a significant problem.
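
A typical CSR page boots from a near-empty HTML shell and fetches its content over the network, roughly along these lines (the endpoint and markup are hypothetical):

    <div id="app">Loading...</div>
    <script>
      // These round trips only start after the HTML and script have arrived,
      // delaying the point at which real content appears on screen.
      fetch('/api/page-content') // hypothetical endpoint
        .then(response => response.json())
        .then(data => {
          document.getElementById('app').innerHTML =
            '<h1>' + data.title + '</h1><p>' + data.body + '</p>';
        });
    </script>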


Slow load times can impact UX, and each additional script executed on the page can also delay the page from reaching a usable state, impacting SEO through poor Core Web Vitals scores. Beyond SEO, the negative environmental impact of slow webpages should not be underestimated, either.


Another consideration calls back to web crawling and the risks of hiding content behind JavaScript. While the major search crawlers render JavaScript, many AI crawlers do not, so it is still advisable to serve content without relying on JavaScript where possible.


In short, the use of client-side rendering often poses some serious SEO challenges.

Server-Side Rendering and Static Site Generation are best practice for SEO

Unlike CSR, sites that deploy with server-side rendering (SSR) execute any JavaScript rendering on the server instead. This presents some advantages - HTML is available immediately rather than hidden behind rendering, and the reduction in resources the client must fetch brings down load times - but SSR also adds extra resource costs on the server side.
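
As a rough sketch of the idea (a toy server, not a production setup), an SSR server assembles the finished HTML on every request, so the response already contains the content:

    // server.js - a minimal SSR sketch using Node's built-in http module
    const http = require('http');

    // Hypothetical data source; in practice this might be a database or CMS
    const getPageData = () => ({
      title: 'All About Gnocchi',
      body: 'Gnocchi are dumplings made from potatoes.',
    });

    http.createServer((req, res) => {
      const { title, body } = getPageData();
      // The content is baked into the HTML before it leaves the server,
      // so no client-side JavaScript is needed to display it.
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end('<!doctype html><html><head><title>' + title + '</title></head>' +
              '<body><h1>' + title + '</h1><p>' + body + '</p></body></html>');
    }).listen(3000);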


As with CSR, there is still flexibility to dynamically build pages for frequently-changing content by regularly rendering pages on the server, but re-rendering for every new session can put strain on the server infrastructure.


A variant of SSR, static site generation (SSG), reduces the infrastructure costs associated with server rendering by pre-building pages at deploy time. These pre-rendered pages can then be served instantly via CDNs, enabling extremely fast page loads, especially for content that doesn’t change frequently.
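
The SSG equivalent moves that work to deploy time: a build script writes out static HTML files once, and a CDN serves them as-is (the paths and content here are hypothetical):

    // build.js - run once at deploy time, never per request
    const fs = require('fs');

    const pages = [ // hypothetical content, e.g. pulled from a CMS at build time
      { slug: 'gnocchi', title: 'All About Gnocchi', body: 'Gnocchi are dumplings made from potatoes.' },
    ];

    fs.mkdirSync('dist', { recursive: true });
    for (const page of pages) {
      const html = '<!doctype html><html><head><title>' + page.title + '</title></head>' +
                   '<body><h1>' + page.title + '</h1><p>' + page.body + '</p></body></html>';
      fs.writeFileSync('dist/' + page.slug + '.html', html); // served directly by a CDN
    }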


Because both SSR and SSG deliver HTML without relying on client-side JavaScript, they offer a much stronger foundation for SEO. Search engines and AI crawlers can access the content directly, leading to better indexing and visibility.


Combined with proper caching strategies, SSR and SSG are typically considered best practice for SEO performance, although CSR-heavy sites can still rank well with proper technical oversight.

JavaScript Rendering directly impacts Core Web Vitals

JavaScript rendering also affects your SEO through its impact on Core Web Vitals (CWV).


Websites with client-side rendering often require far more script and file requests to render a page than their statically-generated or server-rendered counterparts. If a webpage relies on render-blocking requests before it can be displayed to the client, this can greatly increase load times, especially on the slower mobile connections that Google uses to calculate CWV scores. While all three scores are affected by JavaScript, LCP and CLS are particularly susceptible to JavaScript rendering.


First of all, the Largest Contentful Paint (LCP), which measures the time it takes for the largest “contentful” element of a page to finish rendering, is commonly delayed by render-blocking JavaScript that is fetched and executed before the LCP element can paint. The effect is exaggerated on pages with a high number of scripts, which is especially common with client-side rendering.
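
Two common mitigations, sketched below with placeholder filenames, are to keep non-critical scripts off the critical path and to hint the browser to fetch the likely LCP element early:

    <head>
      <!-- Hint the browser to fetch the likely LCP image early -->
      <link rel="preload" as="image" href="/hero.jpg">
      <!-- Keep non-critical scripts off the critical rendering path -->
      <script src="carousel.js" defer></script>
    </head>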


Secondly, Cumulative Layout Shift (CLS), a measure of the extent to which page elements “snap about” as a page loads, also falls victim to JavaScript rendering. Render-blocking resources can prevent elements from being added to the DOM, causing layout shifts when those elements finally render. Because scripts tend to delay the LCP, a delayed LCP often corresponds with a major layout shift.
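
A common defence, sketched here with hypothetical dimensions and selectors, is to reserve space up front for anything that renders late:

    <!-- Without dimensions, this image shoves content down when it loads -->
    <img src="/hero.jpg" alt="Plate of gnocchi">

    <!-- Reserving the space up front keeps the layout stable -->
    <img src="/hero.jpg" alt="Plate of gnocchi" width="1200" height="600">

    <style>
      /* Reserve room for a JavaScript-injected widget (hypothetical selector) */
      #reviews-widget { min-height: 320px; }
    </style>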

Does JavaScript rendering impact SEO?

In short, JavaScript rendering isn’t inherently bad for SEO, but the way it’s implemented can have a huge impact on your website’s visibility. JS rendering plays a substantial role in how search engines and crawlers interact with your site, and knowing how to optimise it is fundamental to a long-term SEO strategy. By ensuring your core content is accessible to crawlers, minimising render-blocking behaviour to improve Core Web Vitals, and choosing an SEO-friendly render strategy like SSR, you can strengthen your site’s visibility in both traditional and AI search.

JavaScript SEO FAQs

  • How Can I Make My JavaScript SEO Friendly?

    How your JavaScript and rendering is configured on your site can make a massive difference to your website’s SEO. Switching to server-side rendering or static site generation can help speed up page load times and reduce core web vitals issues, and deferring non-critical JavaScript files can help your webpages become usable faster.

  • Does JavaScript Affect SEO?

    Yes, JavaScript can have a substantial impact on SEO, depending on how it's implemented on your website, and can affect everything from crawling and indexation to core web vitals and UX. Our technical SEO and web performance audits can help you optimise your sites for Search.

  • Is Client-Side Rendering Bad for SEO?

    Yes, client-side rendering can have a negative impact on SEO. Client-Side Rendering (CSR) is a popular deployment method that is great for dynamically loading content, but its heavier reliance on fetching resources makes it susceptible to slower load times and Core Web Vitals issues. Placing content behind client-side rendering can also prevent many AI crawlers from accessing, and ultimately surfacing, your content.

Want to have a chat? 

Chat through our services with our team today and find out how we can help.

Contact us