I have the following pipeline:
readFile > parseCSV > otherProcess
The readFile step is the standard Node.js createReadStream, while parseCSV is a Node.js transform stream (module link).
I want to iterate through a CSV file line by line and handle a single line at a time, so streams and async iterators are a perfect match.
I have the following code, which works properly:
const fs = require('fs');
const { parse } = require('csv-parse'); // assuming the parser comes from the csv-parse module

async function* readByLine(path, opt) {
  const readFileStream = fs.createReadStream(path);
  const csvParser = parse(opt);
  // Pipe the file stream into the CSV parser, then iterate the parsed records
  const parser = readFileStream.pipe(csvParser);
  for await (const record of parser) {
    yield record;
  }
}
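For context, this is how I consume the generator (just a minimal usage sketch; the file name and parser options are placeholders, and otherProcess is the downstream handling mentioned above):

(async () => {
  for await (const record of readByLine('./data.csv', { columns: true })) {
    await otherProcess(record); // handle a single record at a time
  }
})();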
I'm quite new to Node.js streams, but I've read from many sources that the stream.pipeline module is preferred over the .pipe method of readable streams. How can I change the code above to use stream.pipeline (actually the promise version obtained from util.promisify(pipeline)) while still yielding one line at a time?
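Something along these lines is what I have in mind, but I'm not sure whether kicking off the pipeline without awaiting it and iterating the parser directly is the right approach (pipelineAsync is just my name for the promisified pipeline):

const fs = require('fs');
const { parse } = require('csv-parse'); // assuming the csv-parse module, as above
const { pipeline } = require('stream');
const { promisify } = require('util');

const pipelineAsync = promisify(pipeline);

async function* readByLine(path, opt) {
  const readFileStream = fs.createReadStream(path);
  const csvParser = parse(opt);
  // Start the pipeline without awaiting it, so records can be consumed as they are parsed
  const done = pipelineAsync(readFileStream, csvParser);
  for await (const record of csvParser) {
    yield record;
  }
  await done; // surface any error from the underlying streams
}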