
The Ultimate Guide to Dynamic Rendering

Ever since Google announced dynamic rendering, many JavaScript-heavy websites have been considering it. But what exactly is it?

Before we can explain what it is, we first need to fully understand JavaScript websites. Below is an example of a JavaScript-heavy website, where all the content becomes invisible as soon as you disable JavaScript.

[Image: the same JavaScript-heavy website with JavaScript disabled (blank page) and enabled (full content)]

The simplest way to understand JavaScript is by thinking of food, yes food! JavaScript off is like purchasing a ready-made meal that only needs 10 minutes in the oven, but your oven is broken, so you are unable to cook it. With JavaScript on, your oven works and you can cook your meal.

JavaScript websites are rendered on your computer; they are not sent fully built by the server. Your browser does all the hard work of constructing the page. Therefore, if you disable JavaScript, the primary tool for building the page, the browser cannot build it and you are left with a blank website.

Search engines such as Google can and do render JavaScript websites, but they often choose not to. Why? Rendering a website requires 1,000 times, if not more, the resources of crawling a plain HTML website. To put that into perspective, if the internet had only 100,000,000,000 pages and Google had to render every single one of them, it would be the equivalent of crawling 100,000,000,000,000 pages. This isn't practical for search engines, so they don't do it, which is why SEO advice has always been to avoid relying on JavaScript.

Although, as mentioned before, Google can render JavaScript websites, it doesn't do so frequently enough. New content can take a while to be indexed, and large websites will often find that not all of their pages are ever indexed. This is because Google only renders JavaScript websites when it has spare resources, which means Googlebot might not see today's news until next week, significantly impacting time to market.

So how can you have a JavaScript-heavy website that's 100% Google friendly? Dynamic rendering.

How does dynamic rendering work?

Dynamic rendering means serving users and Google different versions of the same page in order to satisfy both requirements. A user needs the JavaScript website in order to interact with it. Google, on the other hand, doesn't interact with websites, so a static HTML page with all the content visible without executing JavaScript is all it requires.
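The core of that split is a simple decision on the server. Here is a minimal sketch, with hypothetical helper names and a deliberately short list of crawler user-agents: known bots get the pre-rendered static HTML, everyone else gets the normal JavaScript app shell.

```javascript
// Hypothetical dynamic-rendering switch. A production setup would use
// a maintained list of crawler user-agents; these three are examples.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /duckduckbot/i];

function isBot(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ''));
}

function handleRequest(userAgent, staticHtml, appShellHtml) {
  // Crawlers need the content up front, without executing JavaScript.
  return isBot(userAgent) ? staticHtml : appShellHtml;
}

const staticHtml = '<h1>Full content, no JavaScript required</h1>';
const appShell = '<div id="app"></div><script src="/app.js"></script>';

console.log(handleRequest(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
  staticHtml,
  appShell
));
```

In practice this check usually lives in a middleware or reverse proxy in front of the application, so the JavaScript app itself never needs to know it exists.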

This process is broken down in the diagram below.

 

The static website contains exactly the same content as the dynamic JavaScript website. The only difference is that you do not need JavaScript enabled to see it. Going back to our previous example, this is like ordering food to be delivered to your house: it doesn't matter if your oven is broken, you are still able to eat, which is the main objective.

Technical implementation of dynamic rendering

In order to implement dynamic rendering on your website, there are some key elements you must take into consideration.

Pre-rendering a JavaScript website on the fly and serving it to Google as a static page takes a long time, anything from 5 to 10 seconds of time to first byte, which makes your website almost uncrawlable for bots. Therefore this option is NOT recommended.

To solve this, we must introduce a caching layer within our dynamic rendering infrastructure. See the diagram below to help illustrate.

 

Pre-rendered pages (static HTML) should be cached. This means the page has already been built in the background and does not need to be re-rendered on the fly when Google requests it.

Cached pages can be served to Google in under 200ms, whereas doing it on the fly will take anything from 5 seconds at best to 10+ seconds.
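As a sketch of that caching layer, assuming a simple in-memory Map standing in for whatever cache store you actually use, and a stub in place of the real headless-browser render: serve the stored copy instantly when it exists, and only pay the expensive render cost once.

```javascript
// Hypothetical caching layer for pre-rendered pages. A real setup
// would render in a headless browser and use a shared cache store;
// a Map and a stub renderer stand in for both here.
const cache = new Map();

function slowPrerender(url) {
  // Stand-in for the expensive on-the-fly render (5-10+ seconds in practice).
  return '<html><body>Rendered content for ' + url + '</body></html>';
}

function servePrerendered(url) {
  if (cache.has(url)) {
    // Cache hit: this path is what serves Google in under 200ms.
    return { html: cache.get(url), cached: true };
  }
  // Cache miss: render once, store it so later requests are fast.
  const html = slowPrerender(url);
  cache.set(url, html);
  return { html, cached: false };
}

const first = servePrerendered('/products/42');  // miss: renders and stores
const second = servePrerendered('/products/42'); // hit: served from cache
console.log(first.cached, second.cached); // false true
```

A production version would also refresh cached pages in the background on a schedule, so that the slow render path never sits between Googlebot and a response.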

Note: we have seen large websites lose over 50% of their organic traffic due to pre-rendering on the fly, which left Google unable to access and crawl the website properly. We can't stress enough how important it is not to do this! Remember, a high time to first byte signals to Google that your servers cannot handle the traffic, which causes Google to slow its crawling. Imagine a time to first byte that's 1,000%+ over Google's recommended 200ms limit!

I will not go into the details of caching and how it all works in the background here. However, if it is something you are interested in, please do contact us; if there is enough interest, we can cover it in comprehensive detail.
