Galaxy + CloudFront + Googlebot inconsistencies

Short version: A recent update of our site added a few new pages and a few new 301 redirects. Googlebot is failing to retrieve one of the new pages, and is not following the new 301 redirects. The console shows that it's trying to retrieve an old Galaxy version of the page in question, despite an invalidation of our CloudFront distribution.

Long version:

I recently deployed an update of our site that includes three new pages, as well as a few new 301 redirects from now-unused pages to these new ones. As always, I invalidated the CloudFront distribution, and everything works fine in multiple browsers.
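For reference, the invalidation I run after each deploy is equivalent to something like the following (a sketch using boto3; the distribution ID shown is a placeholder, not our real one):

```python
# Sketch of a full-distribution CloudFront invalidation.
# DISTRIBUTION_ID is a placeholder -- substitute your own.
import time

DISTRIBUTION_ID = "EXXXXXXXXXXXXX"  # placeholder

def invalidation_params(paths):
    """Build the request body for CloudFront's CreateInvalidation API."""
    return {
        "DistributionId": DISTRIBUTION_ID,
        "InvalidationBatch": {
            "Paths": {"Quantity": len(paths), "Items": list(paths)},
            "CallerReference": str(time.time()),  # must be unique per request
        },
    }

# With boto3 installed and AWS credentials configured, you would run:
#   import boto3
#   boto3.client("cloudfront").create_invalidation(**invalidation_params(["/*"]))
```

Invalidating "/*" wipes the whole distribution's cache; you can also pass specific paths like "/buildingblocks".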

However, when requesting that Google index these new pages, indexing kept failing for one of them (www.audiblegenius.com/buildingblocks). So I changed Chrome's user agent to Googlebot and re-tested the page in question, and I get the following errors in the console:
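The same user-agent spoof can be done outside the browser with a short script (a standard-library sketch; the actual fetch is commented out since it needs network access):

```python
# Build a request that presents Googlebot's desktop user-agent string,
# mirroring the Chrome UA-override test described above.
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(
    "https://www.audiblegenius.com/buildingblocks",
    headers={"User-Agent": GOOGLEBOT_UA},
)

# To actually fetch (requires network access):
#   with urllib.request.urlopen(req) as resp:
#       html = resp.read().decode()
```

Note that some CDNs and app servers vary their responses on User-Agent, which is exactly the kind of split behavior being debugged here.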

Resource interpreted as Stylesheet but transferred with MIME type text/html: "https://dg8vmh2d2xmjv.cloudfront.net/0b91fe997334fa074b964fc4ac584b641e31e448.css?meteor_css_resource=true&_g_app_v_=127".
buildingblocks:2 Cross-Origin Read Blocking (CORB) blocked cross-origin response https://dg8vmh2d2xmjv.cloudfront.net/0b91fe997334fa074b964fc4ac584b641e31e448.css?meteor_css_resource=true&_g_app_v_=127 with MIME type text/html. See https://www.chromestatus.com/feature/5629709824032768 for more details.

Notice that the Galaxy version in those errors is 127, while the actual latest version of the site is 129. This error often comes up after I've deployed but before I've invalidated CloudFront, and invalidating usually fixes it.
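The version mismatch can be spotted mechanically by pulling `_g_app_v_` out of the asset URL and comparing it to the expected version (a minimal sketch; `galaxy_app_version` is just an illustrative helper, not part of Galaxy):

```python
# Extract the Galaxy app version (_g_app_v_) from an asset URL, so a
# stale cached bundle can be detected programmatically.
from urllib.parse import urlparse, parse_qs

def galaxy_app_version(asset_url):
    qs = parse_qs(urlparse(asset_url).query)
    versions = qs.get("_g_app_v_")
    return versions[0] if versions else None

css = ("https://dg8vmh2d2xmjv.cloudfront.net/"
       "0b91fe997334fa074b964fc4ac584b641e31e448.css"
       "?meteor_css_resource=true&_g_app_v_=127")
print(galaxy_app_version(css))  # -> 127
```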

Weirder still, if I load the page while specifying any of our three Galaxy containers (for example https://www.audiblegenius.com/buildingblocks?_g_container_=3w3oRNN9hRwm46wrG-76p6n&_g_debug_=true), the Googlebot user agent loads it fine. I'm also able to request indexing for that specific container URL, though I worry that will be a problem down the road whenever that container gets swapped out.

I’ve tried invalidating the entire site again, as well as the specific URL, with no luck.

Update: this resolved itself after five days. Maybe Googlebot was working from a cached version of some pages? In any case, done deal.