I would like to chain two Linux commands together so that command 1 can continually stream data to command 2. In my case, command 1 generates a line-by-line database export, which I would like to pipe into command 2 (which will call a function to write each exported line to a new database).
E.g.
command 1 | command 2
This Python-based solution looks really nice, but it blocks and only reads stdout line by line AFTER it's all available: How to pipe input to python line by line from linux program?
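To be concrete, the kind of consumer I have in mind looks roughly like this sketch. `write_to_new_db` is just a placeholder for my real per-line import function; the key part is iterating over stdin so each line is handled as it arrives rather than after the whole export finishes:

```python
import sys

def write_to_new_db(line):
    """Placeholder for the real call that writes one exported line to the new db."""
    print(f"imported: {line}")

def process_stream(stream):
    """Consume lines from `stream` as they arrive and hand each one to the writer.

    Returns the number of lines processed.
    """
    count = 0
    for line in stream:
        write_to_new_db(line.rstrip("\n"))
        count += 1
    return count

if __name__ == "__main__":
    # command1 | python3 this_script.py
    process_stream(sys.stdin)
```

Is this the right shape, or is there buffering (on either side of the pipe) that will still make it wait for the full export?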
By way of background, I'm attempting to export a 10 TB Cassandra database, which I can do using dsbulk. My preference is to avoid building up a 10 TB export file and then processing it; I would like to process it "in flight".
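One thing I've been wondering about is whether the stall is upstream block buffering rather than the Python side, in which case forcing line-buffered output on command 1 might help. A small sketch of the idea (with `seq` standing in for the real export command, and the `while read` loop standing in for my importer):

```shell
# stdbuf -oL forces the producer's stdout to be line-buffered,
# so each line crosses the pipe as soon as it is written.
stdbuf -oL seq 3 | while read -r n; do
    echo "got $n"   # stand-in for writing the line to the new db
done
```

Would the same `stdbuf -oL` trick apply to a real export command, or do some tools ignore it?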
Any pointers appreciated, thanks.