I am new to MongoDB and want to know about importing a JSON file from one server to another. I tried the following command, and it works fine for me:
mongoimport -d test -c bik check.json
Now I want to know: when there are multiple JSON files, how do I import all of them in a single go? I could not find any documentation saying this is not possible. Is this possible, and if so, how?

13 Answers
I came up with a more elegant way to automatically import ALL collections:
ls -1 *.json | sed 's/\.json$//' | while read col; do
    mongoimport -d db_name -c "$col" < "$col.json";
done
I hope this is helpful.

- Worked like a charm! I actually needed to import TSV files, so I had to include `--headerline` and `--type tsv`. Pretty simple, thanks. – agarcian Dec 10 '14 at 18:13
- It worked and made my day, but I had to add `--jsonArray` to the mongoimport command (`mongoimport -d tpsmnew -c $col < $col.json --jsonArray;`) because I was getting the error "Failed: cannot decode array into a D". – mohit jain Jun 27 '21 at 08:35
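For reference, here is a minimal sketch of the TSV variant agarcian describes; the collection and file names are hypothetical:
# --type tsv switches the input format; --headerline takes the field
# names from the first row of the file.
mongoimport -d db_name -c users --type tsv --headerline --file users.tsv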
You can always write a small shell script:
colls=( mycoll1 mycoll2 mycoll5 )
for c in "${colls[@]}"
do
    mongoimport -d mydb -c "$c" --file "$c.json"
done

- Just a query: is it not possible to define a directory name and just specify the path of the directory to import all the `json` files from it, instead of specifically specifying the filenames? If there are several files, we have to specify all the filenames and then do the `for` loop, which is the same as manually performing the `mongoimport` command for each file. Can it be done more dynamically? – user850234 Jul 23 '12 at 09:58
- I tried that and it didn't work. It works with `mongorestore`, though. – Sergio Tulentsev Jul 23 '12 at 10:05
- Just an FYI: importing data as JSON has one drawback, it will not preserve the data types. Longs convert to floats, etc. Don't be surprised if you get number format exceptions in your application (in our case we use Java). – titogeo Jul 26 '13 at 05:10
- @titogeo I'm glad that happens. I didn't know it did that. I was hoping it wouldn't stringify everything just because it was a big JSON string. – Kristian Nov 17 '15 at 23:12
- @SergioTulentsev: This creates a separate collection for each JSON file, and the name of the JSON file becomes the name of the resulting collection. What if we want to send all JSON files into one collection, e.g. "test"? What would the change be in your current script? – Faaiz Nov 03 '21 at 16:53
- @Faaiz: use the version from romaninsh instead. It looks similar, but in it the collection name is separate from the file name. – Sergio Tulentsev Nov 03 '21 at 17:01
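To illustrate Faaiz's question, a minimal sketch that sends every file into one collection (this assumes newline-delimited JSON files and a target collection named `test`):
# Import every JSON file in the current directory into a single collection.
for f in *.json; do
    mongoimport -d mydb -c test --file "$f"
done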
Windows batch version:
@echo off
for %%f in (*.json) do (
    "mongoimport.exe" --jsonArray --db databasename --collection collectionname --file %%~nf.json
)

- Thank you Tomi. I had to omit `--jsonArray` as I had multiple JSON files which were not enclosed in `[]`. – ankit9j Jul 30 '17 at 08:49
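To make that distinction concrete, a small sketch with made-up file contents (it assumes a reachable local mongod): mongoimport's default input is one document per line, while `--jsonArray` expects the whole file to be a single JSON array.
# Default input format: newline-delimited documents, no --jsonArray needed.
printf '%s\n' '{"a": 1}' '{"a": 2}' > docs.json
mongoimport -d test -c bik docs.json
# A file that contains one JSON array needs --jsonArray.
echo '[{"a": 1}, {"a": 2}]' > array.json
mongoimport -d test -c bik --jsonArray array.json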
You can also do it this way:
for filename in *; do mongoimport --db <Database> --collection <Collection Name> --file "$filename"; done

This worked for me on macOS:
find . -regex '.*/[^/]*\.json' | xargs -L 1 mongoimport --db DB_NAME -u USER_NAME -p PASSWORD --collection COLLECTION_NAME --file

A Windows batch file version. This works well when you have a folder full of JSON files and each collection name matches its file name:
@echo off
for %%f in (*.json) do (
    "mongoimport.exe" --db databasename --collection %%~nf --drop --file %%f
)
pause

Not sure whether it's a new feature, but mongoimport can now actually read from stdin, so importing multiple JSON files is as simple as:
cat *.json | mongoimport --uri "mongodb://user:password@host/db?option=value" --collection example
I'm using mongodb-tools v4.2.0, by the way.
UPDATE
mongoimport can potentially consume a large amount of memory, which may cause the program to be killed by the system OOM killer. My machine has 32 GB of RAM, and this happened consistently when I tried to import ~10 GB of data stored on a RAM disk.
To divide a relatively large job into batches:
#!/usr/bin/env bash
declare -a json_files=()
for f in *.json; do
    json_files+=("$f")
    # Flush a batch of 1000 files at a time to bound memory usage.
    if [[ "${#json_files[@]}" -ge 1000 ]]; then
        cat "${json_files[@]}" | mongoimport --uri="mongodb://user:pass@host/db" --collection=examples -j8 #--mode=upsert --upsertFields=id1
        json_files=()
    fi
done
# Import whatever is left over from the last partial batch.
if [[ "${#json_files[@]}" -gt 0 ]]; then
    cat "${json_files[@]}" | mongoimport --uri="mongodb://user:pass@host/db" --collection=examples -j8
fi

Another one-line solution (assuming you are in the folder where the JSON files are):
ls | sed 's/\.json$//' | xargs -I{} mongoimport -d DATABASE_NAME -c {} {}.json

Linux:
> cat one.json two.json > three.json
> mongoimport --db foo --collection baz --file "three.json"
Or, for all files in the folder:
> cat *.json > big.json
> mongoimport --db foo --collection baz --file "big.json"

One-line solution for the Windows command prompt:
for /F %i in ('dir /b c:\files\*.json') do mongoimport.exe -d db -c files --file c:\files\%i

I'm going to show how to import many collections efficiently using only the Linux terminal (it also works on Mac).
You must have all the JSON files in the same folder, and each file's name should be the name of the collection it will be imported to in your database.
So, let's begin: open the folder that contains your JSON files, replace <DATABASE> with your database name, then execute the line below:
for collection in $(ls | cut -d'.' -f1); do mongoimport --db <DATABASE> --collection ${collection} --file ${collection}.json; done
But what is going on there?
First of all, keep in mind that the command substitution in $( ) is executed first. In this case, it creates a list of all files, keeping just the name of each file (removing its extension).
Secondly, the list is fed to a "for" loop through a local variable called collection (this variable's name could be anything you want).
Thirdly, the "do" executes the import line (*).
Finally, the "done" finishes the loop.
(*) The import line is composed of "mongoimport", which requires the database name ("--db"), the collection name ("--collection"), and the file name ("--file"). These requirements are filled in by the ${collection} variable created by the "for" loop.
I hope this helped someone! Good luck, guys :)

I used the solutions here to add a shell function to my bash profile for doing this quickly.
My example depends on the mongo export outputting each collection as a file with the collection name and a .metadata.json extension.
function mimport() {
  for filename in *; do
    collection="${filename%.metadata.json}";
    mongoimport --db "$1" --collection "$collection" --file "$filename";
  done
}
Use it in the path of the export files, passing the DB name to the command:
mimport my_db
This will load all collections into the DB at localhost.

Python:
from pathlib import Path
import subprocess

jsons_folder = "./out/"
mongodb_host = "172.22.0.3"
mongodb_port = "27017"
mongodb_user = "root"
mongodb_password = "1234"

for f in Path(jsons_folder).glob("*.json"):
    cmd = [
        "mongoimport", "--host", mongodb_host, "--port", mongodb_port,
        "--authenticationDatabase", "admin",
        "-u", mongodb_user, "-p", mongodb_password,
        "--db", "callscoring", "--collection", "scoring_result_entry",
        "--file", str(f.absolute()),
    ]
    subprocess.run(cmd)
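Note that `subprocess.run` does not raise on a non-zero exit status by default; passing `check=True` would make the script stop on the first failed import instead of silently continuing.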
