I have a JSON file that is 2.37 GB with about 2.1 million records. I want to use jq to go through the file and create a new file every 100,000 records, i.e.:
part1.json part2.json part3.json part4.json part5.json etc.
Has anyone done this with jq?
You could use jq in conjunction with split to write those files. Running jq in --stream mode keeps memory usage low, since the 2.37 GB file never has to be loaded all at once; split then chops jq's output into files of 100,000 records each.
$ jq -nc --stream 'fromstream(1|truncate_stream(inputs))' large_file.json |
    split -dl 100000 --additional-suffix=.json - part
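Note that split will write files like part00.json, part01.json, and so on, each holding up to 100,000 newline-delimited JSON records rather than a JSON array. If you need each part to be a valid JSON array again, one approach (a rough sketch; the array_ output names are just illustrative) is to slurp each part back up with jq:

$ for f in part*.json; do jq -s '.' "$f" > "array_$f"; done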