I’ve got code in a Meteor method that reads a giant CSV file line by line, parses each line, and upserts the result into the database. I’ve wrapped the per-line callback in Meteor.bindEnvironment, and that sort of works, but it rapidly chews up all my RAM; once Node starts garbage collecting, the whole process slows to a crawl. The callbacks are also delayed and don’t run in order. On top of that, it’s slow from the start because a new bound function seems to be created on each call, which appears to be very inefficient.
The code looks vaguely like this…
import { Meteor } from 'meteor/meteor';
import { CSV } from 'meteor/evaisse:csv';
...
CSV.readCsvFileLineByLine(
  filename,
  { headers: true },
  Meteor.bindEnvironment(function (data) {
    Collection.upsert(
      { name: data.name },
      { $set: data }
    );
  })
);
I’ve also tried defining a function wrapped with Meteor.bindEnvironment a single time, outside the call, and passing that in instead, but the behaviour is the same.
The file is way too big to fit into memory, unfortunately. Is there a way to release the memory used on each bindEnvironment call, or does anyone know of a more efficient, fibers-aware way of reading a file line by line in Meteor? I’ve tried half a dozen ways of reading one line at a time, and all of them seem to have the same problem.
This is on Meteor 1.4.2, and I saw the same on earlier versions too (I’ve been struggling with it for a while).