
I was assigned a database project where the dataset I downloaded was 50 GB. After extracting it I have 31 folders, each of which contains 23 folders, and each of those contains 59 JSON files (00-59).json. I need to feed this data into MongoDB. I was writing this command for it: mongoimport --db twitter --collection twitterCol --file /media/shamsad/1E8A00A88A007E91/archiveteam-twitter-stream-2013-08/08/01/00/00.json

But it is tiresome to write this command for 31*23*59 .json files, so I searched for a way to feed all the data at once. I found this answer, but I didn't understand it. Is there another way to do this, or, if this is the way, what does that script do and how does it work? PS: I'm using Ubuntu 16.04.

    you can write a script (bash, for example) to go through the directories, locate the JSON files and call mongoimport – rzysia Apr 27 '17 at 15:08
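
A minimal sketch of the kind of script that comment suggests, assuming the archive root path, database name, and collection name from the question (adjust them to match your setup):

    #!/bin/bash
    # Walk the extracted archive and run mongoimport once per JSON file,
    # so the 31 x 23 x 59 directory layout never has to be hard-coded.
    # ROOT is the path from the question; change it for your system.
    ROOT="/media/shamsad/1E8A00A88A007E91/archiveteam-twitter-stream-2013-08"

    find "$ROOT" -type f -name '*.json' | while read -r file; do
        echo "Importing $file"
        # < /dev/null keeps mongoimport from consuming the piped file list.
        mongoimport --db twitter --collection twitterCol --file "$file" < /dev/null
    done

Each mongoimport invocation appends to the same twitterCol collection, so the end result is one collection holding the documents from every file.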
