Meteor filtering out robots, preventing crawling

I’ve been battling an uncrawlable site. I have prerender working and pages are caching, on a Heroku-hosted Meteor React app. There’s a sitemap.xml, I have metadata for the usual suspects (description, OG tags, Twitter Cards), and there’s no robots.txt or noindex tags.

It’s on an older version, but it seems fine in every other regard.

Try running it locally and making a curl request that’s identical to a bot’s.
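Something like this, for example — the port and path are assumptions (adjust to wherever your app runs locally); the point is to send the exact User-Agent string a crawler would send and inspect what HTML comes back:

```shell
# Googlebot's real User-Agent string
BOT_UA='Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'

# Fetch the page as the bot would see it (assumes the app is on localhost:3000)
# and check whether the response contains rendered markup rather than an
# empty JS shell.
curl -sS -A "$BOT_UA" http://localhost:3000/ | grep -i -E '<title>|<meta'
```

If the bot request returns an empty `<div id="react-target">` with no rendered content or meta tags, the prerender layer isn’t being hit for bot traffic.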

Either your prerender setup is misconfigured or your HTTPS is.

In order to use prerender properly, you need to tell your webserver to route requests to the prerender server instead of your web app when they come from a bot. I based my logic around this NGINX config file.
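The core of that routing logic looks roughly like the sketch below, based on prerender.io’s sample NGINX middleware config. The upstream addresses (`service.prerender.io`, the app on `127.0.0.1:3000`) and the exact bot list are assumptions — substitute your own prerender service and app port:

```nginx
location / {
    set $prerender 0;

    # Flag requests whose User-Agent matches a known crawler
    if ($http_user_agent ~* "googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit|linkedinbot|slackbot") {
        set $prerender 1;
    }

    # Never prerender the prerender service's own requests (avoids loops)
    if ($http_user_agent ~ "Prerender") {
        set $prerender 0;
    }

    # Static assets should be served directly, even to bots
    if ($uri ~* "\.(js|css|png|jpg|jpeg|gif|ico|svg|xml|txt|woff2?)$") {
        set $prerender 0;
    }

    # Bots get the prerendered page; everyone else gets the Meteor app
    if ($prerender = 1) {
        rewrite .* /$scheme://$host$request_uri? break;
        proxy_pass http://service.prerender.io;
    }
    if ($prerender = 0) {
        proxy_pass http://127.0.0.1:3000;
    }
}
```

The key idea is that the User-Agent check happens at the proxy layer, before the request ever reaches Meteor, so the app itself needs no bot-detection logic.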