HTTP.get runs out of memory

My app uses HTTP.get to fetch certain web pages.
However, it fails with an “Evacuation allocation failed” error, which I believe means it is running out of memory.
I assume this is because the web page that it tries to fetch is extremely large.
Is there a way to limit how much of the web page is fetched, or otherwise avoid this problem?

There seems to be an issue tracked here describing the same problem, and it is indeed memory-related. I need answers to a few questions:

  1. Exactly how large is the body of the web page you’re trying to fetch? (use Chrome DevTools to find out)
  2. How much RAM is available when you run this?
  3. What Content-Type does the page you’re fetching have? application/json? text/plain?

The only workaround I can think of is to bypass Meteor, drop down to low-level Node.js, and work with streams: buffer the data to disk, then handle it in smaller portions. This becomes a real hassle with JSON, but you could use JSONStream to parse the stream.
Streams are really nice for exactly this reason — your process never needs to hold the whole payload in memory at once.

I have to apologize here, Svenskunganka, as this was based on a misunderstanding.

It turns out the error wasn’t produced by HTTP.get but by console.log: I was logging the entire web page to the console, and that is what triggered the memory error.
I run the app on a Windows machine, where console.log on a very long string apparently causes this.
I removed the console.log and it works well now.