
I just started learning to build Node.js applications. Once I figured out how things work, I decided to test my application with a large data set, so I created a JSON file with 1 million records in it.

I import the data using:

mongoimport --host 127.0.0.1 --port 27017 --collection customers --db Customer --file "path to json file mock.json" --jsonArray

A sample of the JSON file:

[{"fname":"Theresia","lname":"Feest","email":"aileen@okeefe.name"},
{"fname":"Shannon","lname":"Bayer","email":"tristian.barrows@christiansenvandervort.com"},
{"fname":"Cora","lname":"Baumbach","email":"domenico.grimes@lesley.co.uk"},
{"fname":"Carolina","lname":"Hintz","email":"betty@romaguerasenger.us"},
{"fname":"Dovie","lname":"Bartell","email":"rogers_mayert@daniel.biz"}]

but it is taking too much time, approx. 14 hours.

Please suggest any other, more optimized way to do the same.

A.T.
  • check [this](http://stackoverflow.com/questions/27884838/the-speed-of-mongoimport-while-using-jsonarray-is-very-slow) post – kkites Feb 03 '15 at 10:09
  • @KrishnaPullakandam thanks for the right direction, it was really helpful. A JSON array is somehow slower than JSON data with one document per line – A.T. Feb 04 '15 at 06:41
  • Glad to know. mongoimport might be splitting the file to import in parallel. With an array I would guess it has to put the data into a buffer and load it. How much time does it take now? – kkites Feb 04 '15 at 11:39
  • earlier it was 20/sec, now it is 8k/sec to 9k/sec. :-), it was a huge difference. – A.T. Feb 04 '15 at 12:40

1 Answer


Split your single JSON file into multiple files, then run parallel mongoimport commands, one for each file.
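A minimal sketch of that approach, assuming the data has already been converted to one JSON document per line (the `mock.ndjson` name and the chunk size are illustrative; the mongoimport flags are copied from the question):

```shell
# Generate a tiny stand-in file for demonstration: 100 one-line documents.
seq 1 100 | sed 's/.*/{"n": &}/' > mock.ndjson

# Split into 25-line chunks: chunk_aa, chunk_ab, chunk_ac, chunk_ad.
split -l 25 mock.ndjson chunk_

# Launch one mongoimport per chunk in the background, then wait for all
# of them to finish. Without --jsonArray, mongoimport reads one document
# per line.
for f in chunk_*; do
  mongoimport --host 127.0.0.1 --port 27017 \
              --db Customer --collection customers \
              --file "$f" &
done
wait
```

Keep the number of parallel imports roughly in line with your CPU core count; past that point the imports mostly contend with each other for disk and lock time.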

yeniv