Running Meteor Methods in parallel

Hi!

Let’s say I have a collection of documents that need to be updated from an external API. On the client, I fetch these docs and then delegate the actual API call to the server via a Meteor method.

I.e.

//on client
const docs = Docs.find().fetch();
for (const doc of docs) {
  // Delegate to the server to handle the actual request to the external API and the DB update
  Meteor.call('updateFromExternalAPI', doc);
}

I’m aware this is not efficient and I’d like to run these requests in parallel. I’ve used Promise.all() before and understand there’s probably a myriad of ways to do this in standard JS and other frameworks. But what’s the preferred pattern for doing things like this in Meteor? Will I have to rely on other npm packages?

Any contemporary blogs, tutorials would be greatly appreciated.

Thanks!

First, I’m not seeing why you need to do the fetch() on the client. Put that into a server method and call that.
Second, you can then look server-side at what optimizations are possible.

Basically, only let the client trigger the process through a call and then let the server do the actual data processing.


I agree with @rhywden. It might be better to do it like this on the client:

Meteor.call('batchUpdateFromExternalAPI');

And on the server:

Meteor.methods({
    batchUpdateFromExternalAPI() {
        const docs = Docs.find().fetch();
        // Do update stuff on external API
    }
})

So the client has access to the cursor of my documents in Minimongo. You’re saying to pass that query to the server method and fetch from there? Is fetch() really that expensive? I thought all it did was convert the cursor to an array that can be iterated with normal JS functions rather than through the Mongo lib.

I appreciate the insights on that. But do you know of a way to actually send out simultaneous HTTP requests from the Meteor Server?

I’ve used a promise queue for this; something like the npm package p-queue should work well. With it you can load up all the requests, set the concurrency, and retry any requests that fail. You can also await a promise that resolves when the queue is empty if you need to know when it’s done.
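If you want to see the shape of that idea without the dependency, here’s a minimal sketch of a fixed-concurrency runner in the same spirit as p-queue (the delay tasks and the concurrency value are just placeholders for your real requests):

```javascript
// Run async tasks with a fixed concurrency limit, similar in spirit to p-queue.
// Returns results in the same order as the input tasks.
async function runWithConcurrency(tasks, concurrency) {
  const results = [];
  let next = 0;
  // Start `concurrency` workers; each one pulls the next task off the list
  // as soon as it finishes its current one.
  const workers = Array.from({ length: concurrency }, async () => {
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  });
  await Promise.all(workers);
  return results;
}

// Example: three fake "requests" resolving after short delays
const delay = (ms, v) => new Promise((res) => setTimeout(() => res(v), ms));
runWithConcurrency(
  [() => delay(10, 'a'), () => delay(5, 'b'), () => delay(1, 'c')],
  2
).then((out) => console.log(out)); // logs ['a', 'b', 'c'] (input order preserved)
```

p-queue gives you retries, priorities, and `onIdle()` on top of this, so in practice the package is the easier route; the sketch just shows what the concurrency cap is doing under the hood.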


It’s not that the fetch method is very expensive, but what you’re actually doing is subscribing to documents from a server-side publication based on user info and parameters. That publication sends those documents to Minimongo on the client. Then, in your setup, the client sends multiple method calls back to the server based on the very same documents that the publication, and therefore the server, already had.

That’s a lot of extra data going over the wire. In short, you are doing this:

  • subscribe to server
  • server returns docs into minimongo client
  • client sends docs back to server via call

In my proposed setup you don’t send the docs over the wire to the client and back to the server; you send only the parameters the server needs to fetch the entire batch of docs from Mongo itself. Then, in the server-side method, you fetch those docs and send the API calls in parallel. That saves you quite a few round trips and reduces load on both the server and the client.

As for the parallel requests: if they are API requests, I would prefer the axios npm library with Promise.all. Meteor has a nice HTTP library of its own, but imho it’s a bit trickier to use. It also goes to show that you can use any of your favorite npm libraries with Meteor.
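The core of that pattern is just mapping the docs to requests and awaiting Promise.all. A dependency-free sketch (here `fetchFromAPI` is a stand-in for the real axios.get call, and the docs are dummies):

```javascript
// Stand-in for an axios call, e.g. axios.get(url); it resolves after a
// short delay to simulate a network request.
const fetchFromAPI = (doc) =>
  new Promise((resolve) =>
    setTimeout(() => resolve({ _id: doc._id, data: `result-${doc._id}` }), 5)
  );

async function updateAllInParallel(docs) {
  // All requests start immediately; Promise.all resolves once every one has.
  const results = await Promise.all(docs.map((doc) => fetchFromAPI(doc)));
  // In a real Meteor method you would now write each result back to the collection.
  return results;
}

updateAllInParallel([{ _id: 1 }, { _id: 2 }]).then((results) =>
  console.log(results.map((r) => r.data))
); // logs ['result-1', 'result-2']
```

One thing to keep in mind: Promise.all rejects as soon as any single request fails. If you want the remaining requests to keep going regardless, Promise.allSettled is the more forgiving choice.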

If you do want to use Meteor’s own HTTP library, then this guide may be a nice one:

Gl 🙂
