
Scenario

I need to run two tasks in parallel, and run a third task after firstTask and secondTask are done. I have been using the async library and the code works, but I want to know if there is something I could improve about my code.

Details

Task 1: readFileNames: reads a folder and returns an array of file names.

Task 2: copyFile: copies a config file from the src folder to a destination folder.

Task 3: writeConfig: writes the result of readFileNames into the config file located in the destination folder.

Questions

Should I combine the parallel control flow with eachSync? Also, I was wondering if promises would help me achieve what I am trying to do. And which approach is better in terms of performance: async vs. Q, or should I use a more abstracted library like orchestrator?

Below is what I have so far. It works, but I was wondering if there is a better way of doing it:

Code

var async = require("async");
var fs = require("fs-extra");
var dir = require("node-dir");
var path = require("path");
var _ = require("underscore");

var readFileNames = function (src, cb) {
  dir.files(src, function (err, files) {
    if (err) { return cb(err); }
    return cb(files);
  });
};

var copyFile = function (src, dest, cb) {
  fs.copy(src, dest, function (err) {
    if (err) { return cb(err); }
    return cb();
  });
};

var writeConfig = function (destFile, content, cb) {
  fs.appendFile(destFile, content, function (err) {
    if (err) { return cb(err); }
    return cb();
  });
};

var modulesFolder = path.join(__dirname, "modules");
var srcFile = path.join(__dirname, "src", "config.json");
var destFile = path.join(__dirname, "dest", "config.json");

async.parallel(
  [
    function (callback) {
      readFileNames(modulesFolder, function (files) {
        callback(null, files);
      });
    },
    function (callback) {
      copyFile(srcFile, destFile, function () {
        callback(null, "");
      });
    }
  ],
  // last callback
  function (err, results) {
    var toWrite = _.flatten(results);
    toWrite.forEach(function (content) {
      if(content) {
        writeConfig(destFile, content + "\n", function () {
        });
      }
    });
    console.log("done");
  }
);

Files

├── dest
├── main.js
├── modules
│   ├── module1.txt
│   └── module2.txt
├── node_modules
│   ├── async
│   ├── fs-extra
│   └── node-dir
├── package.json
└── src
    └── config.json
AJ Meyghani
  • What exactly do you mean by "more elegant"? Or by "kind of works"? It either works or it doesn't; if it isn't giving you the output you want, show us the output it's giving you and an example of what you want. – Kevin B May 07 '15 at 18:17
  • Just updated the question. The code works, but I am wondering if there is a pattern for doing what I am trying to achieve. I don't have much experience with Node patterns, and that's why I am posting this question. – AJ Meyghani May 07 '15 at 18:22
  • I don't see a reason for you to use eachSync. However, `async.eachSeries` would be applicable in place of `toWrite.forEach`, since `writeConfig` is async. – Kevin B May 07 '15 at 18:23

2 Answers


I have been using the async library and the code works, but I want to know if there is something I could improve about my code.

You are using way too many anonymous function expressions. Just pass the callback you are receiving around! All those helper functions (including the named ones) are pretty superfluous.

Also, your readFileNames function does not follow the Node callback convention (error first). I assume you don't intend to write the error into your result file?

Your final callback also ignores errors.

Should I combine the parallel control flow with eachSync?

I guess you mean eachSeries, not eachSync? Yes, that would be reasonable if you expect the appendFile calls to chain anyway. But you could just as well use parallel again, which would more closely mirror the .forEach you're currently doing. In any case, you should use an async iteration method, because currently you are logging "done" too early.

So rather do

var async = require("async");
var fs = require("fs-extra");
var dir = require("node-dir");
var path = require("path");

var modulesFolder = path.join(__dirname, "modules");
var srcFile = path.join(__dirname, "src", "config.json");
var destFile = path.join(__dirname, "dest", "config.json");

async.parallel([
  function (callback) {
    dir.files(modulesFolder, callback);
  },
  function (callback) {
    fs.copy(srcFile, destFile, callback);
  }
], function(err, results) {
  if (err)
    return console.error(err);
  var toWrite = results[0]; // ignore the result of the copy action
  async.eachSeries(toWrite, function(content, callback) {
    // the if (content) guard is probably not needed at all
    fs.appendFile(destFile, content + "\n", callback);
  }, function(err) {
    if (err)
      return console.error(err);
    console.log("done");
  });
});
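
If you wanted the parallel variant mentioned above instead, only the iteration call changes; here is a rough sketch (not part of the original answer, reusing the same variables):

// Sketch: drop-in replacement for the eachSeries call inside the final callback.
// async.each starts all appendFile calls at once and calls back when every one is done.
async.each(toWrite, function (content, callback) {
  fs.appendFile(destFile, content + "\n", callback);
}, function (err) {
  if (err)
    return console.error(err);
  console.log("done");
});

Note that with parallel appends the order of lines in the destination file is no longer guaranteed.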

Also, I was wondering if promises would help me achieve what I am trying to do?

Yes, they could do this as well, possibly even more simply and straightforwardly (if you're accustomed to them). They'd also simplify error handling a lot.
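
For illustration, a minimal sketch of the same flow with Q (untested, reusing the dir, fs, modulesFolder, srcFile and destFile definitions from above):

var Q = require("q");

Q.all([
  Q.nfcall(dir.files, modulesFolder),   // task 1: read the file names
  Q.nfcall(fs.copy, srcFile, destFile)  // task 2: copy the config file
]).spread(function (files) {
  // task 3: append the file names one after another
  return files.reduce(function (chain, content) {
    return chain.then(function () {
      return Q.nfcall(fs.appendFile, destFile, content + "\n");
    });
  }, Q());
}).then(function () {
  console.log("done");
}).catch(console.error);

A single .catch at the end handles an error from any step, which is the error-handling simplification mentioned above.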

And which approach is better in terms of performance?

Neither. The performance of this little script is bound by the IO capabilities of your system and by the degree of parallelisation your algorithm uses, which can be achieved with either library.

Bergi

Your use of async.parallel() looks fine to me, and if it gets the job done then you're done. Regarding performance, have you parallelized all tasks that can be done in parallel? How fast is your disk IO? These questions matter significantly more than your choice of which async/promise library to use.

With that said, promise libraries like Q would typically slow things down a bit when compared to async because they'll tend to do a process.nextTick at times when it's not strictly necessary, but this performance degradation is quite small. In the vast majority of cases, performance concerns shouldn't dictate your choice of async/promise library.
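
For a rough illustration of that deferral (a sketch using Q; the exact scheduling is an implementation detail of the library): a value that is already available is still delivered on a later tick, whereas a plain callback may fire synchronously.

var Q = require("q");

console.log("before");
Q(42).then(function (v) {
  // runs on a later tick, even though 42 was available immediately
  console.log("promise delivered:", v);
});
console.log("after"); // logs before the promise handler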

Andrew Lavers