I have an array with, say, 100,000 objects. I use the map function and, on each iteration, I build a string and write the content to a CSV like so:
entriesArray.map((entry) => {
    let str = entry.id + ',' + entry.fname + ',' + entry.lname + ',' +
        entry.address + ',' + entry.age + ',' + entry.sex + '\n';
    writeToFile(str);
});
The writeToFile function:
const writeToFile = (str) => {
    fs.appendFile(outputFileName + '.csv', str, (err) => {
        if (err) throw err;
    });
};
This works as expected, but I'm concerned that having so many asynchronous write operations could lead to data inconsistencies. So my question is: is this safe? Or is there a better way to do it?
By the way, the same code on macOS threw the error Error: ENFILE: file table overflow, open 'output.csv'. After a bit of research, I learned that this is due to macOS having a very low open-file limit. More details on this can be found here.
Again, I'm hoping an improvement to my file-write mechanism could sort this issue out as well.