[SOLVED] Working with CSV File Import with Field Mapping

Hi guys, I’m trying to work with CSV file import with field mapping. I want to read a CSV file, rename or remove some of the fields, and then import the result into a new collection. Do you have any idea how to implement this?

1 Like

Hi @ivanfermanejo!

We also import CSV files to initialize the DB in some instances. We’ve done it two ways, one far easier (but with stricter requirements).

The quick-and-dirty way is to write a shell script to seed the database, like this:

#!/bin/bash

mongo localhost:27017/<<dbname>> --eval 'db.<<Collection>>.drop()'
# Remove "headerline" if your csv file does not have one ;)
mongoimport --verbose --host localhost:27017 --db <<dbname>> -c <<Collection>> --type csv --file <<filename>>.csv --headerline
# Any other cleanup done here. For example, we have a boolean field, but it gets imported as text, so we have lines like:
mongo localhost:27017/<<dbname>> --eval 'db.<<Collection>>.update({truth: "TRUE"}, {$set: {truth: true}}, {multi: true})'
mongo localhost:27017/<<dbname>> --eval 'db.<<Collection>>.update({truth: "FALSE"}, {$set: {truth: false}}, {multi: true})'

The requirement for that, though, is that the headerline maps directly to your field names, and you said you need to rename or remove things. You could hack something together with a bunch of eval lines, like we did above, but there’s another solution which gives you a lot more control: read the file in, parse it, and put records into the Collection exactly as you want them. Here’s some more pseudocode:

var fileSystem = Npm.require('fs');
// Splits on commas that are not inside double-quoted fields
var csvRowSplitter = new RegExp(',(?=(?:[^"]*"[^"]*")*(?![^"]*"))');

var importCSV = function () {
    var CSV = Assets.getText('<<filename>>.csv');
    CSV = CSV.replace(/(\r\n|\r|\n)/g, '\n');
    var fileData = CSV.split('\n');
    fileData.shift(); // drop the headerline
    fileData.forEach(function (rowAsString) {
        var row = rowAsString.split(csvRowSplitter);
        if (row.length < <<NumOfFieldsYouExpect>>) {
            Log.error('ERROR: CSV ENTRY MISSING FIELDS');
            return;
        }
        /* Insert your processing here, indexing into the array "row" (e.g. "row[0]") to access fields */
        /* insert, update, remove, etc. as you like... */
        /* For example: */
        var rowAsObject = {
            field1: someLocalProcessedVariable,
            field2: row[1],
            field3: row[2] === 'TRUE',
            field4: row[3] === 'TRUE',
            someCounter: 0
        };
        <<CollectionObject>>.insert(rowAsObject);
    });
};
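As a side note, that csvRowSplitter regex only splits on commas that sit outside double-quoted fields; it leaves the quotes themselves in place, so you may still want to strip them during processing. A quick standalone check of its behavior:

```javascript
// Standalone check of the quote-aware comma splitter used above.
const csvRowSplitter = new RegExp(',(?=(?:[^"]*"[^"]*")*(?![^"]*"))');

// The comma inside the quoted field is preserved; the quotes are not stripped.
const row = 'Alice,"Smith, Jr.",TRUE'.split(csvRowSplitter);
console.log(row); // → [ 'Alice', '"Smith, Jr."', 'TRUE' ]
```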

Check out https://atmospherejs.com/dsyko/node-csv-npm

Thanks a lot, Gamemaker! I’ll try this and let you know if it works.

1 Like

Curious how this worked out. I’m trying to do something similar (upload CSV, then match CSV columns to collection fields, and then import) and while I can get the columns from the CSV, I’m just bouncing around trying to figure out how to a) render the collection fields and b) send the array of matches to the server.

1 - Parse the CSV into an array of arrays.
2 - Insert each row:

array_of_CSV.forEach(item => {
    TheCollection.insert({
        column0: item[0],
        column1: item[1],
        column2: item[2],
    })
})

something like this??
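Filling that sketch out so it actually runs (with a hypothetical in-memory object standing in for a real Meteor collection, only mimicking .insert()):

```javascript
// Hypothetical stand-in for a Meteor collection, just to keep this runnable.
const TheCollection = {
  docs: [],
  insert(doc) { this.docs.push(doc); }
};

const csv = 'name,age,active\nAlice,30,TRUE\nBob,25,FALSE';

// Naive parse (no quoted-field handling): drop the header, one array per line.
const arrayOfCSV = csv.split('\n').slice(1).map(line => line.split(','));

arrayOfCSV.forEach(item => {
  TheCollection.insert({
    column0: item[0],
    column1: item[1],
    column2: item[2] === 'TRUE'  // coerce the boolean column while we're at it
  });
});

console.log(TheCollection.docs.length); // → 2
```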

Hi @bysabi,

Yep, that’s what I was thinking. Just trying to work out how to let the user match CSV columns to collection fields. So I’d give them an interface built from the Papa Parse results.meta.fields, where they can match each CSV column against a list of existing collection fields, which would then output the mapping used for the actual insert.

So my question would be, how can you give the user a way to say:

column0 = item[1],
column1 = item[7],
column2 = item[3],
columnn = item[n]

Do you have any advice on that?
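One way I could see this working (hypothetical names throughout, just a sketch): have the matching UI produce a plain object mapping each collection field name to the CSV column index the user picked, send that object to the server alongside the parsed rows, and build each document from it:

```javascript
// Hypothetical mapping produced by the matching UI:
// collection field name -> CSV column index chosen by the user.
const fieldMap = { column0: 1, column1: 7, column2: 3 };

// Build one document from a parsed CSV row using that mapping.
function mapRow(row, mapping) {
  const doc = {};
  for (const [field, index] of Object.entries(mapping)) {
    doc[field] = row[index];
  }
  return doc;
}

const row = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'];
console.log(mapRow(row, fieldMap)); // → { column0: 'b', column1: 'h', column2: 'd' }
```

The nice thing about a plain object is that it serializes cleanly over a Meteor method call, so the server can apply the same mapRow to every row.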

Hmm, relying on the user to choose the correct column/db_field match is not reliable, because you can never validate that the user matched things correctly. It may be better to enforce a correct CSV column format through documentation; that is also easier to code and maintain.
In my case I only parse CSVs on the server side, with CSV/Excel files I provide myself.
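Enforcing the documented column format could be as simple as rejecting the file up front when the headerline doesn’t match what the importer expects. A sketch, with hypothetical field names:

```javascript
// Reject a CSV up front if its headerline doesn't match the documented format.
const EXPECTED_HEADER = ['name', 'age', 'active'];  // hypothetical documented columns

function validateHeader(csvText) {
  const header = csvText.split('\n')[0].split(',').map(h => h.trim());
  return header.length === EXPECTED_HEADER.length &&
         EXPECTED_HEADER.every((field, i) => field === header[i]);
}

console.log(validateHeader('name,age,active\nAlice,30,TRUE'));  // → true
console.log(validateHeader('age,name,active\nAlice,30,TRUE'));  // → false
```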