
I use axios to get data from an API and then consume that data in my node.js app. The data is an array of 300 objects like this one:

{
  'location': 'us',
  'date': '156565665',
  'month': '5.1',
  'day': '6',
  'type': 'default',
  'count': '7',
  'title': 'good',
  'filter': 'no',
  'duration': 'short',
  'options': 'no',
}

After I get this array of objects I need to transform each object: replace some of its keys with new ones and convert some of the string values into proper numeric types:

{
  'loc': 'us',
  'date_new': parseInt('156565665'),
  'month': parseFloat('5.1'),
  'day': parseInt('6'),
  'type': 'default',
  'count': parseInt('7'),
  'title': 'good',
  'filter': 'no',
  'duration': 'short',
  'options': 'no',
}

For now I just use a for loop and convert the keys and values of each object in every iteration (a minimal sketch of this loop is shown below). But there will be thousands of objects like these, and a worker will be processing these data. What is the best way to process them in node.js?
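A minimal sketch of the current per-object conversion, assuming the array from axios is held in a variable called items (the variable name and the transformItem helper are only for illustration; the field names follow the example objects above):

// Rename keys and convert numeric strings, as described in the question.
function transformItem(item) {
  return {
    loc: item.location,
    date_new: parseInt(item.date, 10),
    month: parseFloat(item.month),
    day: parseInt(item.day, 10),
    type: item.type,
    count: parseInt(item.count, 10),
    title: item.title,
    filter: item.filter,
    duration: item.duration,
    options: item.options,
  }
}

// Plain loop over the array received from axios.
var transformed = []
for (var i = 0; i < items.length; i++) {
  transformed.push(transformItem(items[i]))
}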

I am going to use a ready-made queue like bee-queue or resque, but even then it would be good to write the code the "node.js way" so that processing this array of objects does not slow down the event loop. Maybe push each object into an array of promises and pass them to Promise.all() (but then there would be 300 promises in Promise.all())? A sketch of that idea follows. What is the best way to do heavy calculations like this in node.js?
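For illustration only, this is the Promise.all() variant described above, reusing the transformItem helper and the items variable from the earlier sketch; note that the transform still runs synchronously on the event loop, which is what the comments below point out:

// One promise per object, as described in the question.
var promises = items.map(function (item) {
  return Promise.resolve(transformItem(item))
})

Promise.all(promises).then(function (transformed) {
  // All 300 results are available here, but the conversions themselves
  // were still executed synchronously, one after another.
  console.log('processed', transformed.length, 'objects')
})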

Stas Coder
  • *What is the best way to make hard calculations like this in node.js?* You don't! **Put individual objects in a database** as and when you receive them, and then **pick them up individually or in a paginated manner** to process them. – gurvinder372 Nov 10 '17 at 13:35
  • @gurvinder372 But I get the array of 300 objects from the API all at once. I can, however, get them as a stream from axios. – Stas Coder Nov 10 '17 at 13:37
  • Can you control the page-size from 300 to less than 10? – gurvinder372 Nov 10 '17 at 13:38
  • @gurvinder372 I can't, but I can get them as a stream from axios – Stas Coder Nov 10 '17 at 13:38
  • You can use https://github.com/dominictarr/JSONStream and https://stackoverflow.com/questions/11874096/parse-large-json-file-in-nodejs – gurvinder372 Nov 10 '17 at 13:42
  • @gurvinder372 I will give it a try and give feedback – Stas Coder Nov 10 '17 at 13:45
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/158696/discussion-between-stas-coder-and-gurvinder372). – Stas Coder Nov 10 '17 at 13:49
  • 300 object transforms like that is pretty light-weight stuff. Keep it simple and make it work, then test, don't do optimisation till you know there's an issue, then you can address the issue directly. Promises won't help using them the way you describe and are not needed. – lecstor Nov 10 '17 at 14:00

1 Answer


But there will be thousands of objects like these ones. It will be a worker for processing these data. What is the best way to process them in node.js?

I would recommend parsing the response as a stream with JSONStream and event-stream, so you handle one object at a time instead of buffering the entire array in memory.

Example

var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream')

// Stream the HTTP response and parse it incrementally.
// The path passed to JSONStream.parse() selects which objects get emitted:
// 'rows.*' emits every element of a "rows" array; use '*' if the response
// is a top-level array like the one in your question.
request({url: 'URL'})
  .pipe(JSONStream.parse('rows.*'))
  .pipe(es.mapSync(function (data) {
    // Each parsed object arrives here one at a time.
    console.error(data)
    return data
  }))
  • After parsing, store the objects in a database instead of processing them immediately, since heavy calculations on a big object will hold up Node.js's event loop.

  • Pick them up from the database one by one for processing (a rough sketch combining both steps follows below).
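
A rough sketch of the parse-and-store step, using the same per-object transform as in the question; saveToDb is a hypothetical stand-in for whatever database write you use (e.g. Firebase):

var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream')

// Hypothetical helper: persist one transformed object for later processing.
function saveToDb(obj) {
  // ... database write of your choice ...
}

request({url: 'URL'})
  .pipe(JSONStream.parse('*')) // emit each element of the top-level array
  .pipe(es.mapSync(function (item) {
    var transformed = {
      loc: item.location,
      date_new: parseInt(item.date, 10),
      month: parseFloat(item.month),
      day: parseInt(item.day, 10),
      type: item.type,
      count: parseInt(item.count, 10),
      title: item.title,
      filter: item.filter,
      duration: item.duration,
      options: item.options,
    }
    saveToDb(transformed) // store now, run the heavy calculations later
    return transformed
  }))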

gurvinder372
  • How do I pick each object one by one from the DB? I use Firebase. I can only fetch the whole array of objects from the db. – Stas Coder Nov 10 '17 at 14:03
  • I think you can keep a single JSON as a document as well https://firebase.google.com/docs/database/web/structure-data – gurvinder372 Nov 10 '17 at 14:05
  • So maybe I don't need to save the JSON in the db? Maybe I should just process it right away? – Stas Coder Nov 10 '17 at 14:08
  • @StasCoder You mentioned in your question that you will do some hard calculations in your processing, so I recommended postponing the calculations until you can pick those records up individually. – gurvinder372 Nov 10 '17 at 14:12