
Assume we have a program like this:

// imagine string1 to string1000 are very long strings that will take a while to be written to the file system
var arr = ["string1",...,"string1000"];
for (let i = 0; i < arr.length; i++) {
  fs.write("./same/path/file.txt", arr[i], {flag: "a"});
}

My question is: will string1 to string1000 be guaranteed to be appended to the same file in order?

Since fs.write is an async function, I am not sure how each call to fs.write() is actually executed. I assume the call for each string is queued somewhere (in another thread, like a call stack?) and that the next call can only execute once the previous one is done.

I'm not really sure if my understanding is accurate.

Edit 1

As the comments and answers point out, fs.write is not safe for multiple writes to the same file without waiting for the callback. But what about a write stream?

If I use the following code, would it guarantee the order of writing?

// imagine string1 to string1000 are very long strings that will take a while to be written to the file system
var arr = ["string1",...,"string1000"];
var fileStream = fs.createWriteStream("./same/path/file.txt", { flags: "a+" });
fileStream.on("error", (err) => { /* do something */ });
fileStream.on("finish", () => { /* do something */ });
for (let i = 0; i < arr.length; i++) {
  fileStream.write(arr[i]);
}
fileStream.end();

Any comments or corrections will be helpful! Thanks!

Lubor
  • If you mean [`fs.writeFile()`](https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback), then the documentation states that _"...it is unsafe to use `fs.writeFile` multiple times on the same file without waiting for the callback"_. – robertklep Oct 27 '16 at 19:41
  • Note that due to how buffering works in node.js, not waiting for the callback does not affect the order of writes. What's unsafe is the possibility of overflowing the buffer. So the risk is missing writes, not unordered writes. – slebetman Oct 27 '16 at 19:55
  • @slebetman, thanks for replying. But what does "buffer" really mean here? – Lubor Oct 27 '16 at 19:57
  • @Lubor: The core node.js manages file/disk I/O by spawning one thread per open file. When you write, you don't actually write to the file. What you do instead is send a message to this I/O thread, so this I/O thread needs to store the message somewhere in RAM. This is the I/O buffer. I believe its size is fixed at compile time. The thread then runs a proper async I/O loop, writing data from this buffer to disk whenever the file is writable (when the OS write buffer for the file is empty; your OS, again, does not write to disk immediately but buffers in a similar way). – slebetman Oct 27 '16 at 20:02
  • @Lubor: From what I've read of the code when answering a similar question, I can see that the ordering of strings entering this I/O buffer and being taken from this buffer to be written to disk is guaranteed. The buffer can't be rearranged, so the order you write is the order that will be written to disk. But if the buffer is full, your writes will be ignored. – slebetman Oct 27 '16 at 20:05
  • @slebetman Thank you for the detailed explanation. But when we use a `WriteStream`, how can it be safer than directly using `fs.write()`? Don't we need the I/O buffer for the `WriteStream` too? – Lubor Oct 27 '16 at 20:09
  • @Lubor: I need to check on the implementation, but it could be that write streams implement another layer of buffering, which is easier to do in js (arrays can grow easily) than in C. – slebetman Oct 27 '16 at 20:19

4 Answers


The docs say that

Note that it is unsafe to use fs.write multiple times on the same file without waiting for the callback. For this scenario, fs.createWriteStream is strongly recommended.

Using a stream works because streams inherently guarantee that the order of strings being written to them is the same order that is read out of them.

var stream = fs.createWriteStream("./same/path/file.txt");
stream.on('error', console.error);
arr.forEach((str) => { 
  stream.write(str + '\n'); 
});
stream.end();
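
If the strings really are very large, also note that `write()` returns `false` once the stream's internal buffer passes its high-water mark. The data isn't lost (the stream keeps buffering it in memory), but you can respect that backpressure via the `'drain'` event. A minimal sketch, where `writeAll` is just an illustrative helper and not part of any API:

var stream = fs.createWriteStream("./same/path/file.txt", { flags: "a" });
stream.on('error', console.error);

// illustrative helper: write the next string only after any backpressure clears
function writeAll(i) {
  if (i >= arr.length) {
    stream.end();
    return;
  }
  // write() returns false when the internal buffer is above the high-water mark
  if (stream.write(arr[i] + '\n')) {
    writeAll(i + 1);
  } else {
    stream.once('drain', () => writeAll(i + 1));
  }
}
writeAll(0);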

Another way to still use fs.write but also make sure things happen in order is to use promises to maintain the sequential logic.

function writeToFilePromise(str) {
  return new Promise((resolve, reject) => {
    fs.write("./same/path/file.txt", str, {flag: "a"}}, (err) => {
      if (err) return reject(err);
      resolve();
    });
  });
}

// for every string, 
// write it to the file, 
// then write the next one once that one is finished and so on
arr.reduce((chain, str) => {
  return chain
   .then(() => writeToFilePromise(str));
}, Promise.resolve());
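
Because the whole chain is one promise, it's worth ending it with a `.catch` (not shown above, just a common addition) so a failed write doesn't fail silently:

arr.reduce((chain, str) => chain.then(() => writeToFilePromise(str)), Promise.resolve())
  .then(() => console.log('all writes finished'))
  .catch((err) => console.error('a write failed:', err));
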
nem035
  • Then what about `fs.createWriteStream`? If I use a file stream, can we guarantee the order? – Lubor Oct 27 '16 at 19:47
  • Thanks! But are you saying stream.write(str) is a synchronous operation? I have been thinking of it as async. – Lubor Oct 27 '16 at 20:00
  • @Lubor Just for future reference: A stream, by definition, guarantees that the order of writing to it is the same as the order of the "output" (in this case, to a file). – tcooc Oct 27 '16 at 20:02
  • @tcooc Wow, that definition makes a lot of sense! Thanks! – Lubor Oct 27 '16 at 20:03
  • @Lubor my mistake, my phrasing was wrong. `stream.write` is async. – nem035 Oct 27 '16 at 20:06
  • Your first example is about using streams. From what I have read on the internet (I am far from node.js), streams are async. Is it safe to use your code when we need synchronous writing? – Pavel_K Nov 16 '18 at 19:57

You can synchronize access to the file using read/write locking for node; see the following example, and you can also read the rwlock documentation.

var fs = require('fs');
var ReadWriteLock = require('rwlock');

var lock = new ReadWriteLock();

// fileName and addToFile are assumed to be defined elsewhere
lock.writeLock(function (release) {
  fs.appendFile(fileName, addToFile, function (err) {
    if (err)
      console.log("write error"); // logging error message
    else
      console.log("write ok");

    release(); // unlock
  });
});
Rahul Suresh

I had the same problem and wrote an NPM package to solve it for my project. It works by buffering the data in an array and waiting until the event loop turns over to concatenate and write the data in a single call to fs.appendFile:

const SeqAppend = require('seqappend');

const writeLog = SeqAppend('log1.txt');

writeLog('Several...');
writeLog('...logged...');
writeLog('.......events');
Gregory

You can use the json-stream package to achieve this.
