
I'm using Node.js. I want to do something with 1,000,000 rows without loading all the rows into memory (a for-each style iteration).

Previously, when I used Classic ASP, I did:

do while not rec.eof
   ' do something with the current record
   rec.movenext
loop

In node-mysql I couldn't find anything similar to a cursor, only:

connection.query('select * from bigdata',function(err,rows))

The problem is that I don't want to load all the rows at once.

I found an answer and posted it below. I'm keeping the question to help others who run into the same problem.

Aminadav Glickshtein

2 Answers


Thanks to Ziggy Jonshon, I found something similar to a cursor in node-mysql, using streams.

I can get the query result as a stream, and it pauses automatically under back-pressure:

See Issue #1370 on node-mysql:

const mysql = require('mysql');
const stream = require('stream');

// Create the connection (fill in your own credentials)
const connection = mysql.createConnection({
  host: 'localhost',
  user: 'user',
  password: 'password',
  database: 'database'
});

connection.query('SELECT * FROM `bigdata`')
  .on('error', function(err) {
    // Do something about an error in the query
  })
  .stream()
  .pipe(new stream.Transform({
    objectMode: true,
    transform: function(row, encoding, callback) {
      // Do something with the row of data, then call callback()
      // so the stream's back-pressure admits the next row
      callback();
    }
  }))
  .on('finish', function() {
    connection.end();
  });

This way I don't need to load all the data into the Node process's memory.
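
For reference, node-mysql also documents an event-based alternative with manual flow control, using `connection.pause()` and `connection.resume()`. A minimal sketch, assuming a hypothetical `processRow` function that handles one row and invokes a callback when done:

const mysql = require('mysql');

const connection = mysql.createConnection({
  // fill in your own credentials
});

connection.query('SELECT * FROM `bigdata`')
  .on('error', function(err) {
    // Handle an error in the query
  })
  .on('result', function(row) {
    // Pause the connection so no further rows arrive
    // while this one is being processed
    connection.pause();

    // processRow is a hypothetical async handler for one row
    processRow(row, function() {
      connection.resume();
    });
  })
  .on('end', function() {
    // All rows have been received
    connection.end();
  });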

Javier Elices
Aminadav Glickshtein
  • I have edited the answer to fix a few typos in the code. The `pipe` invocation needs to set the option `objectMode: true` in order to deal with the data row as an object. I have also added the bit to deal with a possible error in the SQL and the code to include the necessary modules. – Javier Elices Mar 04 '19 at 14:32
connection.query('select * from bigdata', function(err, rows) {
    rows.forEach(function(row) {
        console.log(row);
    });
});
Kevin Grosgojat
  • Thanks. But I don't want to load all rows into memory. I want to get the rows one by one. – Aminadav Glickshtein Mar 15 '16 at 15:24
  • You want to make 1,000,000 individual calls to the database? That's probably not a desirable thing to do. – Jake Wilson Mar 15 '16 at 15:25
  • It's okay. Or maybe make 1,000 calls, each fetching 1,000 rows. – Aminadav Glickshtein Mar 15 '16 at 15:25
  • You have to put an iterator around the loop and change your SQL statement with the iterator's conditions (see the sketch after these comments). – Kevin Grosgojat Mar 15 '16 at 15:33
  • While this answer is probably correct and useful, it is preferred if you [include some explanation along with it](http://meta.stackexchange.com/q/114762/159034) to explain how it helps to solve the problem. This becomes especially useful in the future, if there is a change (possibly unrelated) that causes it to stop working and users need to understand how it once worked. Thanks! – Hatchet Mar 15 '16 at 18:05
  • @JakeWilson - If OP's rows contain large amounts of data, and he's selecting a lot of them, memory is going to be exhausted, which is why he asked the question he did. – Lee Goddard Nov 24 '20 at 11:54
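
For completeness, here is a minimal sketch of the batching approach suggested in the comments above: fetch the rows in chunks of 1,000 using LIMIT/OFFSET, so only one batch is in memory at a time. The table name and batch size follow the question; on very large tables, keyset pagination on an indexed column scales better than a growing OFFSET.

const mysql = require('mysql');

const connection = mysql.createConnection({
  // fill in your own credentials
});

const batchSize = 1000;

function fetchBatch(offset) {
  connection.query(
    'SELECT * FROM `bigdata` LIMIT ? OFFSET ?',
    [batchSize, offset],
    function(err, rows) {
      if (err) throw err;

      rows.forEach(function(row) {
        // Do something with each row
        console.log(row);
      });

      if (rows.length === batchSize) {
        // There may be more rows; fetch the next batch
        fetchBatch(offset + batchSize);
      } else {
        connection.end();
      }
    }
  );
}

fetchBatch(0);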