Running long background tasks

Hi,

Not sure how to go about this…

I have a background task that needs to run every 12 hours and takes 30 minutes to an hour to complete. It loops over a number of API calls (about 500 right now, though the count varies depending on the API response), retrieves each response, processes it, and then either overwrites an existing document or inserts a new one into Mongo.
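For reference, a minimal sketch of that loop, assuming hypothetical `fetchPage()` and `upsertDoc()` helpers (the real API client and Mongo calls would go there). Processing each response before fetching the next keeps only one response in memory at a time, which matters for the out-of-memory question below:

```javascript
// A sketch of the described task, with hypothetical fetchPage()/upsertDoc()
// helpers standing in for the real API client and Mongo upsert.
async function runSync(fetchPage, upsertDoc) {
  let page = 0;
  let totalUpserts = 0;
  // Keep fetching until the API signals there is nothing left.
  for (;;) {
    const docs = await fetchPage(page);
    if (!docs || docs.length === 0) break;
    for (const doc of docs) {
      // Overwrite existing or insert new (an upsert), as in the original post.
      await upsertDoc(doc);
      totalUpserts += 1;
    }
    page += 1;
  }
  return totalUpserts;
}
```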

Currently I have a cron set up with percolate:synced-cron, which works fine on my local machine. However, when I deploy to Galaxy hosting, the same cron job stops with this error:
Application exited with signal: killed (out of memory)

So firstly, how can I avoid the out of memory issue?
And secondly, would it be better to run a separate application whose only purpose is to update the data in Mongo? I'm not sure how to do this either, as it would just be a background task with no frontend… just a plain Node.js server without Meteor?

Any help appreciated

I had a similar task and I used Lambda functions. You can start them with a call from your app or using CloudWatch.

These functions can run in parallel, so the task that took almost an hour now takes around 5 to 10 minutes (we also made some performance improvements).
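The fan-out idea can be shown without AWS at all. Here's a generic sketch, assuming a hypothetical `callApi()`; splitting the ~500 calls into chunks and running each chunk concurrently is roughly what the Lambda approach does with one function invocation per chunk:

```javascript
// Generic fan-out sketch: callApi() is a hypothetical async API call.
// Calls within a chunk run concurrently; chunks run one after another,
// so the number of in-flight requests stays bounded.
async function runInChunks(ids, callApi, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < ids.length; i += chunkSize) {
    const chunk = ids.slice(i, i + chunkSize);
    results.push(...await Promise.all(chunk.map(callApi)));
  }
  return results;
}
```

Tuning `chunkSize` trades speed against memory and API rate limits.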

Hope that helps.

@mnts it sounds like you might have a memory leak issue that causes the memory utilization to grow, unless the task itself is really too demanding. Have you checked that out? For instance on your local environment, does the memory usage grow while that task runs? Does it drop back to pre-task figures or remain the same after the run?
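One easy way to run that check is Node's built-in `process.memoryUsage()`. A small sketch that logs heap usage before and after a task (the `withMemoryLog` wrapper and its label are mine, not from the thread):

```javascript
// Log heap usage around a task to spot a leak, using Node's built-in
// process.memoryUsage(). No external dependencies.
function heapUsedMB() {
  return Math.round(process.memoryUsage().heapUsed / (1024 * 1024));
}

async function withMemoryLog(label, task) {
  const before = heapUsedMB();
  const result = await task();
  // If "after" keeps climbing across runs and never drops back toward
  // "before", the task is probably leaking rather than just demanding.
  console.log(`${label}: ${before} MB -> ${heapUsedMB()} MB heap used`);
  return result;
}
```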
