2017-04-24 63 views

How do I fix a "JavaScript heap out of memory" error when importing data into MongoDB?

Can anyone point me in the right direction as to why I can't import the data into MongoDB? When I try to import only the first 100 lines of the file, I get:

➜ database-operations git:(master) ✗ node import_acparts_to_mongdb.js
(node:10216) Warning: Possible EventEmitter memory leak detected. 11 close listeners added. Use emitter.setMaxListeners() to increase limit
➜ database-operations git:(master) ✗

I'm trying to import from the same file, a CSV file with 600,000 lines, structured like this:

facility;item_number;part_name;part_description;net_weight;customs_statistical
PBL;5535210444;COVER;COVER;0;84314980
D37;5535211545;BRACKET;BRACKET FIRE SUPP TANK A101-20;2939;72169110
PBL;5535211234;BRACKET;BRACKET FIRE SUPP TANK A101-20;2,939;84314300
PBL;5535212478;RING-SNAP;RING-SNAP;0,045;84314980
...

➜ database-operations git:(master) ✗ node import_acparts_to_mongdb.js

<--- Last few GCs --->

38787 ms: Mark-sweep 1384.9 (1436.8) -> 1384.8 (1436.8) MB, 1181.9/0.0 ms [allocation failure] [GC in old space requested].
39964 ms: Mark-sweep 1384.8 (1436.8) -> 1384.8 (1436.8) MB, 1177.7/0.0 ms [allocation failure] [GC in old space requested].
41199 ms: Mark-sweep 1384.8 (1436.8) -> 1385.8 (1420.8) MB, 1234.0/0.0 ms [last resort gc].
42429 ms: Mark-sweep 1385.8 (1420.8) -> 1386.9 (1420.8) MB, 1229.8/0.0 ms [last resort gc].

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x4962c9cfb39
1: $__validate [/Users/isaklafleur/Dropbox/Isak/Coding/Other/autoMDM/node_modules/mongoose/lib/document.js:~1404] [pc=0xe52ebc4fd97] (this=0x383867c1f221, callback=0x383867c201e1)
2: validate [/Users/isaklafleur/Dropbox/Isak/Coding/Other/autoMDM/node_modules/mongoose/lib/document.js:~1324] [pc=0x...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/usr/local/bin/node]
 2: node::FatalException(v8::Isolate*, v8::Local, v8::Local) [/usr/local/bin/node]
 3: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/usr/local/bin/node]
 4: v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/usr/local/bin/node]
 5: v8::internal::Runtime_AllocateInTargetSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
 6: 0xe52eb8079a7
[1]    10085 abort      node import_acparts_to_mongdb.js
➜ database-operations git:(master) ✗


const mongoose = require('mongoose'),
  parse = require('csv-parse'),
  path = require('path'),
  fs = require('fs'),
  ACpart = require('./models/acparts');

mongoose.Promise = require('bluebird');

mongoose.connect('mongodb://localhost/automdm_test');

const db = mongoose.connection;

db.on('error', console.error.bind(console, 'connection error:'));

db.once('open', function() {
  // we're connected!

  const p = path.join(__dirname, '/../', 'file-operations', 'csv-files');
  //console.log(p);

  const parser = parse({delimiter: ';'}, function(err, data) {
    //console.log(data);
    const facility = data.map((item, i) => data[i][0]);
    const item_number = data.map((item, i) => data[i][1]);
    const part_name = data.map((item, i) => data[i][2]);
    const part_description = data.map((item, i) => data[i][3]);
    const net_weight = data.map((item, i) => data[i][4]);
    const customs_statistical = data.map((item, i) => data[i][5]);

    // Looping and storing the data into mongodb
    for (let i = 1; i < data.length; i++) {
      const newACpart = new ACpart();
      newACpart.facility = facility[i];
      newACpart.item_number = item_number[i];
      newACpart.part_name = part_name[i];
      newACpart.part_description = part_description[i];
      newACpart.net_weight = net_weight[i];
      newACpart.customs_statistical = customs_statistical[i];
      newACpart.save()
        .then(function() {
          mongoose.disconnect();
        })
        .catch(function(err) {
          console.log('There was an error', err);
        });
    }
  });
  fs.createReadStream(p + '/mrsparts.csv').pipe(parser);
});

Answer


You won't be able to fit everything in memory if it is bigger than your heap. Use a streaming CSV parser, and then send the rows to the database in batches instead of all at once.


Thank you for your answer. As you can see, I'm already using a csv parser. Could you point me in the right direction on how to split the data into batches and send it batch by batch? :) –