It seems our company is headed towards ~40K concurrent users in a few months - and they will all come at once.
We’re wondering if anyone has any advice for us on how to prepare, how to stress test our app, and how to handle the deluge of users we’re expecting.
Thus far we’ve only handled ~500 concurrent users at most. That was hard, but redis-oplog by @diaconutheodor really saved our asses.
Any and all ideas are welcome.
Do you have a walkthrough script yet?
Any third-party stress/load tester will do the job of pushing Meteor to its limit, so you can evaluate how well it scales. In fact, at that scale, either Rust's minihttp or Go's fasthttp could be worth considering; I found minihttp was the #2 fastest in TechEmpower benchmark Round 14.
What are your hardware specifications, and what's the story behind your current setup?
@nadeemjq Oh, very nice. I know the performance improvement from redis-oplog was very dramatic for you, but you didn't give many details.
Did you manage to integrate the synthetic mutations, by the way?
Also, what I would do is definitely create an elastic load balancer and use Docker for deployment; let AWS spin up more instances as you need them.