4

I am using node.js and reading input from a serial port by opening a /dev/tty file. I send a command, read the result of the command, and want to close the stream once I've read and parsed all the data. I know that I'm done reading data when I see an end-of-data marker. I'm finding that once I've closed the stream, my program does not terminate.
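Roughly, the pattern looks like this (the device path, command, and end-of-data marker below are placeholders, not the real values):

var fs = require('fs');

// open the serial device for reading and writing (placeholder path)
var port = fs.createReadStream('/dev/ttyS0');
var cmd = fs.createWriteStream('/dev/ttyS0');

var buffered = '';
cmd.write('SOME_COMMAND\r\n');  // placeholder command

port.on('data', function (chunk) {
    buffered += chunk.toString();
    if (buffered.indexOf('<END>') !== -1) {  // placeholder end-of-data marker
        // parse `buffered` here...
        port.close();  // ...then close, but the process never exits
    }
});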

Below is an example of what I am seeing, but it uses /dev/random to slowly generate data (assuming your system isn't doing much). What I find is that the process will only terminate once the device generates more data after the stream has been closed.

var util = require('util'),
    PassThrough = require('stream').PassThrough,
    fs = require('fs');

// If the system is not doing enough to fill the entropy pool
// /dev/random will not return much data.  Feed the entropy pool with:
//  ssh <host> 'cat /dev/urandom' > /dev/urandom
var readStream = fs.createReadStream('/dev/random');
var pt = new PassThrough();

pt.on('data', function (data) {
    console.log(data);
    console.log('closing');
    readStream.close();  // expect the process to terminate immediately
});

readStream.pipe(pt);

Update 1:

I am back on this issue and have another sample; this one just uses a pty and is easily reproduced in the node repl. Log in on two terminals and, in the call to createReadStream below, use the pty of the terminal you're not running node in.

var fs = require('fs');
var rs = fs.createReadStream('/dev/pts/1'); // a pty that is allocated in another terminal by my user
// wait just a second, don't copy and paste everything at once
process.exit(0);

At this point node will just hang and not exit. This is on node 0.10.28.

ShaneH
  • Maybe this will help? http://stackoverflow.com/questions/16399476/readstream-pipe-does-not-close – Matt Browne Nov 15 '13 at 02:48
  • I'm guessing it didn't help? In case you missed it, the OP of that question put their solution at the bottom of the question (to me at least, it wasn't obvious at first that it wasn't just more questions at the bottom). – Matt Browne Nov 15 '13 at 19:45
  • This is a weird behavior. The OP's code exits immediately on Mac OS (node v0.10.21)… – Paul Mougel Nov 15 '13 at 21:51
  • Possibly I'm missing it, but that question was resolved by closing the connection to their database (mongoose). I have no connections other than the stream created in the code. – ShaneH Nov 18 '13 at 13:15

3 Answers

1

Instead of using

readStream.close(), 

try using

readStream.pause().

But if you are using the newest version of node, wrap the read stream with a Readable object from the stream module (by isaacs), like this:

var Readable = require('stream').Readable;
var myReader = new Readable().wrap(readStream);

and use myReader in place of readStream after that.
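Something like this (untested) against the original snippet:

var Readable = require('stream').Readable,
    PassThrough = require('stream').PassThrough,
    fs = require('fs');

var readStream = fs.createReadStream('/dev/random');
var myReader = new Readable().wrap(readStream);
var pt = new PassThrough();

pt.on('data', function (data) {
    console.log(data);
    myReader.pause();  // pause the wrapped stream instead of closing it
});

myReader.pipe(pt);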

Best of luck! Tell me if this works.

sam100rav
  • Well, I get a nice stack trace when I do that. :) _stream_readable.js:730 throw new Error('Cannot switch to old mode now.'); – ShaneH Nov 18 '13 at 15:07
  • This is related to [this issue](https://github.com/isaacs/readable-stream/issues/16), which was fixed in [this commit](https://github.com/isaacs/readable-stream/commit/8595376912748ca7984df9047a4a7498fc61f720). – Paul Mougel Nov 18 '13 at 18:34
  • I've changed my answer according to your error. Actually my answer was relevant only to old node versions. Sorry for that. – sam100rav Nov 20 '13 at 12:42
0

You are closing the /dev/random stream, but you still have a listener for the 'data' event on the pass-through, which will keep the app running until the pass-through is closed.

I'm guessing there is some buffered data from the read stream, and until that is flushed the pass-through is not closed. But this is just a guess.

To get the desired behaviour you can remove the event listener on the pass-through like this:

pt.on('data', function (data) {
  console.log(data);
  console.log('closing');

  pt.removeAllListeners('data');
  readStream.close();
});
mamapitufo
  • Nope, the process still just sits there after printing closing. – ShaneH Nov 14 '13 at 20:01
  • Which version of node and which platform are you running this on? I tried your code with 0.10.16 on Windows 7 and got the described behaviour; removing the listener on `pt` fixed it. I don't have `/dev/random`, so I was reading from a big enough file that the reads were made in 64k chunks. The process returned after reading the first chunk with my changes; it waited for the second chunk with the original code. – mamapitufo Nov 15 '13 at 14:19
  • I'm running 0.10.21 on a Linux platform. I would expect your test to work because your stream is going to continuously have data; /dev/random does not. One can open /dev/random, attach an on('data') listener, and it will just sit there until there's some data to send. Your big file sends the first chunk, you get the data event, and then you remove your listeners and close the stream while the stream still has data. – ShaneH Nov 15 '13 at 18:59
  • I just tried with node 0.10.18, reading from `/dev/random` on OSX and I observed the same behaviour: if I keep the listener on pt then I get 2 "closing" messages, but if I remove the listener on the first 'data' event then I don't get the second. I have no idea why it doesn't work for you on Linux. – mamapitufo Nov 17 '13 at 22:38
  • It looks like your system is doing enough to keep /dev/random delivering data. Would you be able to try removing the close and the removeAllListeners and see if there are long pauses between receiving data events, or if you receive a steady stream of events? – ShaneH Nov 18 '13 at 15:03
  • Actually, calling `.close()` [triggers an unpipe](https://github.com/joyent/node/blob/master/lib/_stream_readable.js?source=c#L564-L567), which [causes a cleanup](https://github.com/joyent/node/blob/master/lib/_stream_readable.js?source=c#L492-L494), which [removes the `data` event handler](https://github.com/joyent/node/blob/master/lib/_stream_readable.js?source=c#L509-L529)… Also, once a `data` event listener is set, the stream goes into flowing mode: data isn't likely to be kept in an internal buffer. – Paul Mougel Nov 18 '13 at 18:42
  • The 'data' listener is added to the PassThrough stream, but `close()` is called on the ReadStream. If I understand correctly, in the code you linked the PassThrough stream is `dest`, not `src`, so its 'data' listener won't be cleaned up. – mamapitufo Nov 18 '13 at 22:38
0

I am actually piping to an HTTP request, so for me it's about:

pt.on('close', () => {
  req.abort();
});
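For context, the surrounding setup looks roughly like this (the URL and variable names are just for illustration):

const http = require('http');
const PassThrough = require('stream').PassThrough;

const pt = new PassThrough();

// illustrative request; the real code pipes whatever response is relevant
const req = http.get('http://example.com/', (res) => {
  res.pipe(pt);
});

pt.on('close', () => {
  req.abort();  // stop the underlying request so the process can exit
});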
Abdennour TOUMI