Someday you will need to import a large file into Mongo or some other DBMS. You start writing code to do this in Node.js, but when you run it, memory usage keeps climbing and one core of your CPU sits at 100%. Your first attempt probably looks something like this:
```js
var fs = require('fs'),
    readline = require('readline'),
    instream = fs.createReadStream("SomeFile"),
    rl = readline.createInterface({ input: instream }), // the original snippet never created `rl`
    k, buffer = [], c = 0, sCount = 0, lines = 0;

console.log("Importing...");

rl.on('line', function (line) {
    ++lines;
    k = line.split(",");
    buffer.push({ p: k[0], q: k[1] });
    c++;
    if (c == 100000) {
        rl.pause(); // Pause the reader. This won't fully help: some lines are already in flight before the pause takes effect.
        insertDocument(buffer, function () {
            sCount += buffer.length;
            buffer = []; // clear buffer
            c = 0;
            rl.resume();
        });
    }
});

rl.on('close', function () {
    insertDocument(buffer, function () {}); // flush whatever is left
    console.log("\nProcessed [ Lines:" + lines + "]");
});
```
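`insertDocument` above is a bulk-insert helper that the post doesn't show. One possible shape for it, assuming the official `mongodb` driver in callback style, an already-connected `db` handle, and a hypothetical collection name:

```js
// A sketch only -- `db` and 'COLLECTION_NAME' are assumptions, not from the post.
function insertDocument(docs, done) {
    if (docs.length === 0) return done(); // the close handler may call this with an empty buffer
    db.collection('COLLECTION_NAME').insertMany(docs, function (err) {
        if (err) throw err;
        done();
    });
}
```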
- Since Node.js is asynchronous, `line` events keep arriving without waiting for the previous handler to finish. You might get 10,000 lines per second, depending on disk activity.
- We buffer the lines so that fewer calls are made to the database, but until the database finishes processing a request, Node may push the same buffer for processing many times over.
- To solve this, you might think of pausing the reader until the records are processed. Cool! But it won't work as expected, because readline doesn't pause immediately: a few `line` events still fire after `rl.pause()`, so you might miss those lines – https://nodejs.org/api/readline.html#readline_rl_pause
- You might think of adding a delay of a few ms, but that isn't a scalable solution. A fairly easy fix is to keep a second buffer and a flag that marks the start of an insert: events that fire during the insert go into the new buffer, and when the database finishes, you flip the flag and copy the buffer over (see the sketch below).
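A minimal sketch of that double-buffer idea, reusing `rl` and the assumed `insertDocument` helper from above:

```js
var active = [],       // buffer currently being filled
    pending = [],      // lines that arrive while an insert is in flight
    inserting = false; // the "bit" marking the start of an insert

rl.on('line', function (line) {
    var k = line.split(",");
    (inserting ? pending : active).push({ p: k[0], q: k[1] });

    if (!inserting && active.length >= 100000) {
        inserting = true; // from here on, new events land in `pending`
        insertDocument(active, function () {
            active = pending; // copy the buffer over
            pending = [];
            inserting = false;
        });
    }
});
```

Note that `pending` is unbounded while an insert is running, so this only smooths things out; it doesn't apply real backpressure.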
A better solution is to skip the hand-rolled importer entirely: Mongo (`mongoimport`) and MySQL (`mysqldump`) ship with built-in tools for processing large CSV files.
```
mongoimport --db DATABASE --collection COLLECTION_NAME --type csv --headerline --file SOME_LARGE.csv
```
It works efficiently and comes with a nice CLI that shows import progress.
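If you still want to drive the import from Node, you can spawn `mongoimport` as a child process. A sketch, assuming `mongoimport` is on your PATH and using the same placeholder database/collection/file names as above:

```js
var spawn = require('child_process').spawn;

var imp = spawn('mongoimport', [
    '--db', 'DATABASE',
    '--collection', 'COLLECTION_NAME',
    '--type', 'csv',
    '--headerline',
    '--file', 'SOME_LARGE.csv'
]);

// mongoimport's progress/log lines typically go to stderr.
imp.stderr.on('data', function (chunk) {
    process.stdout.write(chunk);
});

imp.on('close', function (code) {
    console.log('mongoimport exited with code ' + code);
});
```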