How to Make React JavaScript SEO Friendly

So, you want an amazing website with mixed media content, the option to embed custom components and more. You know how to build this using front-end web development tools, but then you find that your site is painfully slow and your search engine rankings drop, because your technical SEO suffers.

Why does this happen? It is because you need middleware to bridge the gap: a layer that lets you have everything you desire on your website and still enjoy speed and ease of navigation. ReactJS itself is a JavaScript library for building rich interfaces; paired with the right middleware, the content you create with it can be published in a way that is still search-friendly.

Let’s slow down a little and take you step by step through the problems you face and how middleware is the solution you are looking for.

Some things to mention first…

SEO professionals need to understand the DOM, as this is what search engines use to analyse and understand your webpage. The DOM (Document Object Model) is what you see when you "inspect element" in a browser: the structure the browser builds from the HTML document, including anything JavaScript has added after the page loaded. It is potentially what the web crawlers see too.
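To make the distinction concrete, here is a minimal sketch (the file name and text are purely illustrative). The raw HTML source contains only an empty container, but the DOM that "inspect element" shows includes the heading the script created:

    <!-- index.html: what "view source" shows -->
    <!DOCTYPE html>
    <html>
      <body>
        <div id="root"></div>
        <script>
          // After this runs, the DOM contains an <h1>
          // that the raw HTML source never did.
          const heading = document.createElement('h1');
          heading.textContent = 'Rendered by JavaScript';
          document.getElementById('root').appendChild(heading);
        </script>
      </body>
    </html>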

Headless browsing is the act of fetching and rendering a webpage without a visible user interface. It is one way of judging what a user (or a bot) actually experiences of the web content. It also allows for rendering static HTML snapshots, which give a quick glimpse of the page without waiting for the full website to render. This makes the website seem much quicker, which is great for SEO.
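As a sketch of how such a snapshot might be generated, here is a minimal example using Puppeteer, a popular headless Chrome library. Puppeteer is not mentioned in this article and is just one way to do this; the URL is a placeholder:

    // npm install puppeteer
    const puppeteer = require('puppeteer');

    async function snapshot(url) {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      // Wait until network activity settles so client-side
      // rendering has finished before we capture the HTML.
      await page.goto(url, { waitUntil: 'networkidle0' });
      const html = await page.content(); // fully rendered DOM as HTML
      await browser.close();
      return html;
    }

    snapshot('https://example.com').then(html => console.log(html));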

The headache of web development and SEO

Developers want to build web applications using only JavaScript because it is the most widely supported language across web browsers. It offers greater power and more flexibility, and reduces development time considerably. However, if a Google web crawler hits your JavaScript page before it is rendered, it may see only your script tags and not the loaded content, which damages your search engine optimisation rankings. Google claims to be getting better at rendering JavaScript, but it is still far from perfect and potentially too much of a risk for web developers.
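This is what that risk looks like in practice. A typical React single-page app serves HTML like the sketch below (file and bundle names are illustrative); everything the user actually reads is injected by the bundle at runtime, so a crawler that does not execute JavaScript sees an empty page:

    <!-- What the crawler fetches from the server -->
    <!DOCTYPE html>
    <html>
      <head><title>My React App</title></head>
      <body>
        <!-- No content here: React fills this in at runtime -->
        <div id="root"></div>
        <script src="/static/js/bundle.js"></script>
      </body>
    </html>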

Using ReactJS could lead to problems of crawlability, obtainability and perceived latency. Crawlability is the ability of search engine bots to crawl and understand your site's architecture; it is possible to block search engines from crawling your JavaScript, even by accident. Google claims to employ headless browsing to render the DOM; however, Google can only process some of the JavaScript, which means not all of the page is obtainable to the search engine. Finally, there are issues of site latency: before a browser can build the DOM, it has to download and execute the JavaScript that produces it, and with a large bundle this takes a long time.
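Blocking by accident is easier than it sounds. A robots.txt like the hypothetical sketch below, written to hide internal assets, also stops Googlebot fetching the very scripts it needs in order to render the page:

    # robots.txt: these Disallow rules, meant for internal assets,
    # also block the JavaScript bundles Googlebot needs to render the page.
    User-agent: *
    Disallow: /static/
    Disallow: /assets/js/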

In short, you may produce an amazing website using ReactJS but it may be completely unsuitable for successful SEO practices.

The best solution?

One argument is that if you want an SEO-friendly website using ReactJS you should use server-side pre-rendering (SSR) services such as prerender.io. What is this? Normally, when a website is first opened, all the rendering operations are carried out in the visitor's browser. Instead of this, you have the page rendered in advance and sent as a static HTML page that can be easily read. The web crawlers receive a frozen snapshot of the page quickly, rather than waiting for the whole page to load. Most advisors suggest that going down the SSR route increases your chances of being indexed by a search engine, and you never have to worry about whether your JS works for the bot or not.
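As a sketch of how such middleware plugs in, here is how prerender.io's prerender-node package is typically wired into an Express server. Express is assumed here for illustration, and the token is a placeholder:

    // npm install express prerender-node
    const express = require('express');
    const prerender = require('prerender-node');

    const app = express();

    // When a known crawler user-agent requests a page, this middleware
    // serves a pre-rendered static HTML snapshot from prerender.io
    // instead of the empty JavaScript shell.
    app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

    app.use(express.static('build')); // the built React app
    app.listen(3000);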

Google offers an alternative argument. It suggests that improvements in its service mean the Googlebot should have no problems crawling your JavaScript as long as you are not blocking it. Google claims it can render and understand web pages like a modern browser. This would suggest you don't need to make any special adaptations to ReactJS to make it SEO friendly. As long as the app is built semantically and renders quickly, it should rank.
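Building semantically still means giving each route its own title and description. One common way to do that in a plain React app is the react-helmet library, sketched below; the article itself does not prescribe a library, and the component and text are illustrative:

    // npm install react react-helmet
    import React from 'react';
    import { Helmet } from 'react-helmet';

    // Each route sets its own title and meta description, so a
    // crawler that does render JavaScript sees meaningful tags.
    function ProductPage() {
      return (
        <article>
          <Helmet>
            <title>Blue Widget | Example Shop</title>
            <meta name="description" content="Hand-made blue widgets." />
          </Helmet>
          <h1>Blue Widget</h1>
          <p>Hand-made blue widgets, shipped worldwide.</p>
        </article>
      );
    }

    export default ProductPage;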

If that is true, you can build a normal React app without worrying about SEO, with none of the added worry or complexity of a middleware app. This makes building your app as easy as it can be, but you have to trust that your JavaScript will compile and run correctly whenever the bot asks it to.

Should you trust this to be the case when you can accidentally block Googlebot without knowing? Probably not. Server-side rendering may be difficult to get working, but it might be worth the complexity to protect your ranking.

In conclusion

You can search for all sorts of solutions to rendering issues with ReactJS. Some say the only option is SSR via a service such as prerender.io, which presents a static page to the bot whilst still presenting a full user experience to a human. Others would argue that this adds a complexity to your web development that is not worth the headache. Google will say that it has solved crawlability issues with ReactJS to the same level as modern browsers, and that therefore you need do nothing other than make sure your page loads quickly to rank on technical SEO.

There is a middle ground that may be worth exploring in detail: detecting crawlers and serving them a static snapshot, while ordinary visitors get the normal client-side rendered app. This gives the bot access without the full complexity of SSR. However, there are plenty of differences of opinion on how to code this into your JS.
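A rough sketch of that middle ground, hand-rolled with Express; the bot list and snapshot directory are illustrative assumptions:

    // npm install express
    const express = require('express');
    const app = express();

    const BOTS = /googlebot|bingbot|baiduspider|twitterbot/i;

    // Crawlers get a pre-generated static snapshot; everyone else
    // gets the normal client-side rendered React app.
    app.get('*', (req, res, next) => {
      if (BOTS.test(req.get('user-agent') || '')) {
        return res.sendFile(`${__dirname}/snapshots/index.html`);
      }
      next();
    });

    app.use(express.static('build'));
    app.listen(3000);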

So, in short, the best solution could be a) do nothing, b) employ middleware, or c) adapt your coding to make sure bots can crawl your site as it is being rendered. What is clear is that making your site navigable and speedy is crucial to keeping it SEO friendly.
