Situation: 1. One app (not Node.js) writes data to a file; the size is unknown. 2. Node.js should pipe the contents of that file while it is still open for writing by the first app.
Is this possible?
There are a few possible solutions.
writeStream.js is a simple test script that writes to the file, as a demo:
const fs = require('fs')

// Open the file in 'w+' mode, keep the fd open, and stream stdin into it
const fd = fs.openSync('./openedFile.txt', 'w+')
const stream = fs.createWriteStream(null, {
  fd,
  encoding: 'utf8'
})
process.stdin.pipe(stream)
readStream polling: this is simple. A read stream always closes when it reaches the end of the file (EOF) and an 'end' event fires, so you can't keep an open fd to a file once you have read it all. Instead, reopen the file on an interval and resume from the last byte already read. (This example doesn't check whether the previous stream is still open, which can happen if it hasn't finished reading the whole file.)
const fs = require('fs')

let streamed = 0

function readFd () {
  const fd = fs.openSync('./openedFile.txt', 'r')
  const stream = fs.createReadStream(null, {
    fd,
    encoding: 'utf8',
    start: streamed // resume from the last byte already piped
  })
  stream.on('data', function (chunk) {
    // `start` is a byte offset, so count bytes rather than characters
    streamed += Buffer.byteLength(chunk, 'utf8')
  })
  stream.pipe(process.stdout)
}

setInterval(readFd, 1000)
readStream via tail (the tail command does the polling for you):
const { spawn } = require('child_process')

// tail -f follows the file and keeps emitting data as it is appended
const child = spawn('tail', ['-f', 'openedFile.txt'])
child.stdout.pipe(process.stdout)