Node.js framework shoot-out (Meteor alternative)

@ffxsam it’s been months now, what tech stack did you end up going with? How has it been working out for you? Is your solution still reactive? Are you using MongoDB?

@aadams Hey! One of my projects is still on Galaxy & Meteor + React for the time being, until I have time to rearchitect it.

Another project is in the process of being ported over to a new stack. React & Apollo on the front-end (on a static host, e.g. S3), connected via API Gateway to a GraphQL serverless backend running as an AWS Lambda function (Apollo server on Node.js 6.x). We’re using AWS CodePipeline + CodeBuild for CI/CD, which deploys updates to our stack, managed via CloudFormation. Still a bit of a work in progress, but it’s shaping up nicely. Apollo is secured via AWS Cognito, so users have to authenticate first, and Apollo then sends along the proper Authorization header to API Gateway.
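
To illustrate the auth wiring, here’s a minimal sketch of the idea on the client side, using the current Apollo Client API (not necessarily our exact setup). `getCognitoIdToken()` and the API Gateway URL are hypothetical placeholders:

```js
// Attach a Cognito JWT to every GraphQL request sent to API Gateway.
// getCognitoIdToken() is a hypothetical helper that returns the signed-in
// user's ID token (e.g. via amazon-cognito-identity-js or Amplify).
import { ApolloClient, InMemoryCache, HttpLink } from '@apollo/client';
import { setContext } from '@apollo/client/link/context';

const httpLink = new HttpLink({
  uri: 'https://<api-id>.execute-api.us-east-1.amazonaws.com/prod/graphql',
});

const authLink = setContext(async (_, { headers }) => {
  const token = await getCognitoIdToken(); // hypothetical helper
  return {
    headers: {
      ...headers,
      Authorization: token, // validated by the Cognito authorizer on API Gateway
    },
  };
});

export const client = new ApolloClient({
  link: authLink.concat(httpLink),
  cache: new InMemoryCache(),
});
```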

We’re still building out the front-end. We’ll be reactive to the extent that Apollo allows, which is fine because most of our app doesn’t need to be reactive. In the few places that do, I think we can use GraphQL subscriptions or Apollo polling (not 100% sure how that works yet). And yes, we’re still using MongoDB (hosted on Atlas). The GraphQL backend uses Mongoose to interface with our Mongo database. If I had more time, I would’ve spent it learning more about AWS DynamoDB, but for now we’re sticking with MongoDB.
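
For the polling option, here’s a rough sketch of how it could look on the React side, assuming the Apollo hooks API. The query and component names (`GET_ORDERS`, `OrdersList`) are made up for illustration:

```js
// Apollo polling: re-fetch the query on an interval instead of a true
// reactive subscription. Good enough for slowly changing data.
import React from 'react';
import { gql, useQuery } from '@apollo/client';

const GET_ORDERS = gql`
  query GetOrders {
    orders {
      id
      status
    }
  }
`;

function OrdersList() {
  // pollInterval makes Apollo re-run the query every 5 seconds.
  const { data, loading } = useQuery(GET_ORDERS, { pollInterval: 5000 });
  if (loading) return <p>Loading…</p>;
  return (
    <ul>
      {data.orders.map((o) => (
        <li key={o.id}>{o.status}</li>
      ))}
    </ul>
  );
}
```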

Feel free to hit me up privately if you have any detailed questions!


I do really like Meteor, although I’m also losing time on stream handling. For example, I’d like to stream a big CSV file into Mongo records. According to the error messages, Meteor.bindEnvironment needs to be used, but how that works is very unclear to me, and once I finally got it working it still fails on big files. I presume the Express approach, with a CSV stream inserting via Mongoose, will be more stable (see the sketch below). In general, handling streams in combination with MongoDB keeps throwing new errors/challenges at me. I want to avoid in-memory loops because of the size of the CSV files, although looping through the files line by line (instead of using streams) does give me a stable result.
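
Here’s a standalone Node.js sketch of that idea (outside Meteor), assuming the csv-parse package, a hypothetical `Record` model, and a placeholder connection string. Rows are inserted in batches so the whole file never has to sit in memory:

```js
// Stream a large CSV into MongoDB via Mongoose, in batches.
const fs = require('fs');
const { parse } = require('csv-parse'); // csv-parse v5+
const mongoose = require('mongoose');

// Hypothetical schemaless model just for the import.
const Record = mongoose.model('Record', new mongoose.Schema({}, { strict: false }));

async function importCsv(path) {
  await mongoose.connect('mongodb://localhost:27017/mydb'); // adjust as needed

  const parser = fs.createReadStream(path).pipe(
    parse({ columns: true, skip_empty_lines: true })
  );

  let batch = [];
  // for await...of respects backpressure: the file is only read as rows are consumed.
  for await (const row of parser) {
    batch.push(row);
    if (batch.length >= 1000) {
      await Record.insertMany(batch, { ordered: false });
      batch = [];
    }
  }
  if (batch.length) await Record.insertMany(batch, { ordered: false });

  await mongoose.disconnect();
}

importCsv(process.argv[2]).catch((err) => {
  console.error(err);
  process.exit(1);
});
```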