Using React Helmet to Manage <head> Elements?

I’m using React and React Router on the front end. I just spoke to an SEO expert and I need to put different meta info (title, description, etc.) on every page. I don’t think I can use JavaScript (e.g. document.title = 'page title') for this because search engines may not run the JavaScript.

It looks like the best-practice way to do this is via React Helmet. Does that sound about right?

I found something interesting here – it looks like Google does index head content created via JavaScript.

I added in React Helmet. It was easy to use, it worked, and I saw no degradation in my Lighthouse scores.
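For reference, the per-page usage is roughly this (a minimal sketch; the page component and copy are placeholders):

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Hypothetical "About" page: each route component declares its own head tags,
// and React Helmet swaps them in as the user navigates.
const AboutPage = () => (
  <div>
    <Helmet>
      <title>About Us | Example Site</title>
      <meta name="description" content="What we do and why we do it." />
    </Helmet>
    <h1>About Us</h1>
  </div>
);

export default AboutPage;
```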

Google recently updated its crawler engine to parse and execute JavaScript. I think it was brought up to roughly Chrome 70+ last year and is now regularly updated. So yes, Googlebot can crawl and execute JavaScript.

The challenge is how long it takes for the index to be updated with JavaScript-generated content, because JS parsing and execution take a lot more time and resources than parsing plain HTML. Many big JS-driven sites have reported poor index coverage (some less than 1/5th of their pages).

SSR + React Helmet can help solve this challenge. Without SSR, you have to wait for if and when Google parses and executes the content it crawled from your website.
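For anyone wanting to try that combination, the server half looks roughly like this in a Meteor app (just a sketch using meteor/server-render plus Helmet.renderStatic(); the App component, file paths, and element id are placeholders):

```jsx
// server/main.js — minimal SSR sketch; assumes an <App /> root component and a
// <div id="app"> in the HTML body, with routing setup omitted.
import React from 'react';
import { renderToString } from 'react-dom/server';
import { onPageLoad } from 'meteor/server-render';
import { Helmet } from 'react-helmet';
import { App } from '/imports/ui/App';

onPageLoad((sink) => {
  // Render the React tree to an HTML string on the server.
  sink.renderIntoElementById('app', renderToString(<App />));

  // Collect whatever <Helmet> tags the rendered page declared...
  const helmet = Helmet.renderStatic();

  // ...and write them into the <head> of the response, so crawlers get the
  // title and meta tags without executing any JavaScript.
  sink.appendToHead(helmet.title.toString());
  sink.appendToHead(helmet.meta.toString());
});
```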

p.s.
We have experienced cases wherein Googlebot crawls our site but only executes the JS days later. By then, the client-side code it crawled is sometimes no longer compatible with the current server-side code, e.g. a data mismatch from a removed, added, or changed key in the database.


Although Google says it can execute JavaScript, the results are very mixed and not reliable. Also, one should know that there are a lot of other crawlers, most famously all the crawlers on social media sites that ask your page for og-tags when someone shares a URL.

Additionally, if your app loads data after being loaded, the Google bot might render your app in a loading state.
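For reference, those og-tags can be declared with React Helmet like any other meta tag (a sketch; the listing fields and URL are placeholders), but the social crawlers will only see them if the page is server-rendered or prerendered:

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Hypothetical listing page: Open Graph tags that social crawlers read to build
// link previews. Without SSR/prerendering, most of these bots never see them.
const ListingPage = ({ listing }) => (
  <Helmet>
    <title>{listing.title}</title>
    <meta property="og:title" content={listing.title} />
    <meta property="og:description" content={listing.summary} />
    <meta property="og:image" content={listing.photoUrl} />
    <meta property="og:url" content={`https://example.com/listings/${listing.slug}`} />
  </Helmet>
);

export default ListingPage;
```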

I hear that. But can they at least read the title and meta tags in the head section, if they’re put there via JavaScript? I’m focusing on SEO right now, so if they can read those things, that’s a start.

The naïve approach of just letting Googlebot crawl a JS site should work by now, allegedly; however, Google claimed as early as around 2016 that it could crawl such sites, and back then it did so only miserably, if at all. But being able to crawl the content (head + body) is just a small piece of this puzzle.

As @rjdavid noted, load + parsing + execution time are of crucial importance SEO-wise. Quite sadly, Googlebot doesn’t seem to cache the app as a browser would. Consequently, it will appear to them that every single page loads painfully slowly, and this alone will inevitably result in a catastrophic page ranking.

There are just two ways out: SSR and prerendering. And both are very problematic to implement correctly.

This is a very important topic. SEO has some interactions with the topics of SSR and React, so it may be helpful to discuss a few more thoughts about it. And of course the topic is relevant to all Meteor apps, regardless of what tech is being used for the front-end layout engine. For discussion purposes:

SSR
I do have SSR on Galaxy for my home page, via the included prerender.io.

But every page needs to have unique SEO tags for title, description, etc. For pages after the home page, would SSR be a net benefit, given that they are dynamically generated based on client selections? (Is there even SSR for pages after the home page?) There are hundreds of different pages, and my understanding is that one of the benefits of a single-page web app is accelerated performance vs loading the full-page HTML, CSS, etc. from the server.

Many articles today express satisfaction with React Helmet as an SEO solution. From an article titled “Improving SEO with React Helmet:”

React Helmet is a tremendously popular library that helps us improve our SEO by “tailoring” our pages’ metadata to each page content, in a dynamic and efficient way.

react-helmet currently has 1.2M weekly downloads.
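Its per-page, data-driven usage would look something like this (a sketch; the component, fields, and loading flag are placeholders):

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Hypothetical detail page: the head tags are derived from the page's own data,
// so each of the hundreds of dynamically generated pages gets unique metadata.
const ProductPage = ({ product, isLoading }) => {
  if (isLoading) {
    // Crawlers that snapshot the page too early would only see this state.
    return <div>Loading…</div>;
  }

  return (
    <div>
      <Helmet>
        <title>{`${product.name} | Example Store`}</title>
        <meta name="description" content={product.shortDescription} />
      </Helmet>
      <h1>{product.name}</h1>
    </div>
  );
};

export default ProductPage;
```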

GOOGLEBOT
Per this article, “The Ultimate Guide to JavaScript SEO” from an SEO agency, Googlebot has come a long way in the past few years in terms of its ability to deal with JavaScript.

Googlebot is based on the newest version of Chrome. That means that Googlebot is using the current version of the browser for rendering pages. But it’s not exactly the same.

Googlebot visits web pages just like a user would when using a browser. However, Googlebot is not a typical Chrome browser.

  • Googlebot declines user permission requests (i.e. Googlebot will deny video auto-play requests).
  • Cookies, local, and session storage are cleared across page loads. If your content relies on cookies or other stored data, Google won’t pick it up.
  • Browsers always download all the resources – Googlebot may choose not to.

That sounds like JavaScript SEO is possible, at least in terms of the <title> and <meta...> elements in the <head> section. However, to learn more about how a particular site performs in terms of SEO, the article recommends loading it in the Google Search Console. I currently have the NoIndex flag set on my site, and Google Search Console won’t let me preview the site until that flag is removed, so I’ll have to wait a bit longer to test this.

In the meantime, Lighthouse is giving my site a Performance score of 100.

The deduction on the SEO score is because I have the NoIndex flag set.

If you are using Lighthouse, you need to throttle it to something similar to the average device and bandwidth speed of your site’s users. Otherwise you are not getting the real performance of your site.

If you have no idea what that average speed is, use Google’s public PageSpeed analyzer. It combines Chrome user-experience data from all your users with the lab-tested speed.
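If you want to script it, Lighthouse can also be run from Node with custom throttling (a sketch using the lighthouse and chrome-launcher npm packages; the numbers are placeholders you would replace with your users’ averages):

```js
// Minimal sketch: run Lighthouse programmatically with simulated throttling.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

const config = {
  extends: 'lighthouse:default',
  settings: {
    onlyCategories: ['performance'],
    throttlingMethod: 'simulate',
    throttling: {
      rttMs: 150,               // round-trip time, ms (placeholder)
      throughputKbps: 1600,     // network throughput (placeholder)
      cpuSlowdownMultiplier: 4, // simulate a slower mobile CPU (placeholder)
    },
  },
};

(async () => {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse('https://example.com', { port: chrome.port }, config);
  console.log('Performance score:', result.lhr.categories.performance.score);
  await chrome.kill();
})();
```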

Yes, it has tons of benefits. In our country, when you search for “buenconsejo mandaluyong” (the name and location of this townhouse), the first result is a non-homepage, user-generated content page on our site.

SEO-wise, the ranking page on mobile is actually an AMP page, which loads in 20 ms according to Google Search Console data.

Here is the link to the Google-cached AMP page:
https://www.google.com/amp/s/onepropertee.com/amp/listings/buenconsejo-townhouse-in-mandaluyong/Tyy7xpgKZQJaRKsh6

Lighthouse reports that it is using throttling:

That’s very interesting! So if someone comes to that page via search, they get an SSR version, but if they get there from within your app, they get the single-page-web-app version. Cool!

If that is the average internet speed of your users, then good, since Google uses the Chrome experience data of your actual users to rate the performance of your page.