Meteor filtering out robots, preventing crawling

#1

I’ve been battling an uncrawlable site. I have prerender.io working, and pages are caching on a Heroku-hosted Meteor React app. There’s a sitemap.xml, I have metadata for the usual suspects like description, OG tags, and Twitter Cards, and there’s no robots.txt and no noindex tags.

It’s on Meteor 1.4.4.2, a bit old, but it seems fine in every other regard.

#2

Try running it locally and making a curl request identical to a bot’s.
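Something like this, assuming the Meteor app is running locally on port 3000 (swap in your own URL). The first request impersonates Googlebot’s user agent; the second hits the `_escaped_fragment_` query some prerender setups key off:

```shell
# Fetch the page the way Googlebot would (bot user agent, no JS execution)
curl -sA "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
  http://localhost:3000/ | head -n 40

# Check the _escaped_fragment_ variant as well
curl -s "http://localhost:3000/?_escaped_fragment_=" | head -n 40
```

If the response is the empty app shell rather than prerendered HTML, the bot traffic isn’t reaching prerender.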

Either prerender.io is misconfigured, or your HTTPS setup is.

#3

To use prerender properly, you need to tell your web server to route requests to the prerender server instead of your web app when the requester is a bot. I based my logic on this NGINX config file.
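The core of that routing logic looks roughly like this. This is a simplified sketch modeled on prerender.io’s sample NGINX config, not a drop-in file; `YOUR_PRERENDER_TOKEN`, the server name, and the upstream port are placeholders you’d replace with your own values:

```nginx
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_set_header X-Prerender-Token YOUR_PRERENDER_TOKEN;

        set $prerender 0;

        # Route known crawlers to prerender
        if ($http_user_agent ~* "googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit|linkedinbot|slackbot") {
            set $prerender 1;
        }

        # Legacy AJAX-crawling scheme
        if ($args ~ "_escaped_fragment_") {
            set $prerender 1;
        }

        # Never prerender static assets
        if ($uri ~* "\.(js|css|xml|json|png|jpg|jpeg|gif|svg|ico|woff|woff2|ttf)$") {
            set $prerender 0;
        }

        if ($prerender = 1) {
            rewrite .* /$scheme://$host$request_uri? break;
            proxy_pass http://service.prerender.io;
        }
        if ($prerender = 0) {
            # Fall through to the Meteor app
            proxy_pass http://localhost:3000;
        }
    }
}
```

The key idea is the user-agent check: humans hit the Meteor app directly, while bots get proxied to prerender, which serves the cached static HTML.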