
How can I stream a response using an in-memory DB?

I'm using LokiJS as an in-memory DB. There is a particular resource where I must return the entire contents of a table (it cannot be paginated), and that table can grow to around 500,000 items, which is about 300 MB.

In other cases, I have used fs.createReadStream to get a file and stream it back to the user:

fs.createReadStream('zips.json')
  .on('data', function(chunk) {
    res.write(chunk);
  })
  .on('end', function() {
    res.end();
  })

This has worked great for large files, but how can I do something equivalent using an in-memory DB?

const items = lokiDb.addCollection('items');
items.insert('a bunch of items ...');

// I would now like to stream items via res.write
res.write(items)

Currently, res.write(items) causes memory problems, since Node tries to build and return the entire response at once.

mcranston18

2 Answers


As far as I can tell, there is no native stream provider in LokiJS, though I may have missed it. What you can do instead is listen for the collection's 'insert' event and write each inserted document as it arrives, like so:

const items = lokiDb.addCollection('items');
items.on('insert', (results) => {
  res.write(JSON.stringify(results)); // res.write needs a string or Buffer, not a raw object
});

items.insert('a bunch of items ...');
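
If the documents are already in the collection when the request arrives (as in the question), you can also iterate the collection yourself and write one document per chunk. Here is a minimal sketch, not a LokiJS API, assuming res is an http/Express response and collection.find() with no arguments returns all documents; it respects backpressure by waiting for 'drain' whenever res.write returns false:

// Minimal sketch: write the already-stored documents one JSON chunk at a time,
// pausing whenever the response's internal buffer is full.
function streamCollection(collection, res) {
  const docs = collection.find();          // all documents (already in memory)
  let i = 0;
  res.write('[');
  (function writeNext() {
    while (i < docs.length) {
      const chunk = (i > 0 ? ',' : '') + JSON.stringify(docs[i++]);
      if (!res.write(chunk)) {             // false means the buffer is full
        res.once('drain', writeNext);      // resume once it has drained
        return;
      }
    }
    res.end(']');                          // close the JSON array and the response
  })();
}

streamCollection(items, res);

The documents themselves stay in memory either way (that is the point of an in-memory DB), but this avoids building one giant response string and keeps the amount buffered in the response small.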
Paul

If I understand correctly, your problem is that fs.createReadStream only reads from files, while you want to read from an in-memory data structure. One solution is to define your own readable stream class by overriding the stream.Readable._read method:

"use strict";

var util = require('util');
var stream = require('stream');

var defaults = {
    highWaterMark:  16384,
    encoding:       null,
    objectMode:     false
};

util.inherits(InMemoryStream, stream.Readable);

function InMemoryStream(userDefinedOptions, resource){
    // copy the defaults, then overwrite them with any user-defined options
    var options = {};
    for (var key in defaults){
        options[key] = defaults[key];
    }
    if (userDefinedOptions){
        for (var key in userDefinedOptions){
            options[key] = userDefinedOptions[key];
        }
    }

    this.resource = resource;   // the in-memory data to stream (Buffer or string)
    this.begin = 0;             // per-instance read position, not module-level state
    this.end = 0;
    stream.Readable.call(this, options);
}

InMemoryStream.prototype._read = function(size){
    if (this.begin >= this.resource.length){
        this.push(null);        // nothing left: signal end-of-stream
        return;
    }
    this.end += size;
    this.push(this.resource.slice(this.begin, this.end));
    this.begin += size;
};

exports.InMemoryStream = InMemoryStream;
exports.readStream = function(userDefinedOptions, resource){
    return new InMemoryStream(userDefinedOptions, resource);
};

You convert your in-memory data structure (in the following example, a file's contents read into a Buffer) to a readable stream and pipe it to a writable stream, as follows:

"use strict";

var fs = require('fs');
var InMemoryStream = require('/home/regular/javascript/poc/inmemorystream.js');

var stored=[], writestream, config={};

config = {
    encoding: null,
    fileToRead: 'raphael.js',
    fileToWrite: 'secondraphael.js'
}

fs.readFile(config.fileToRead, function(err, data){
    if (err) return console.log('Error when opening file', err);
    stored = data;

    var inMemoryStream = InMemoryStream.readStream({encoding: config.encoding}, stored);
    writestream = fs.createWriteStream(config.fileToWrite);
    inMemoryStream.pipe(writestream);

    inMemoryStream.on('error', function(err){
        console.log('in memory stream error', err);
    });


});
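
Applied to the original question, one way to use this class (just a sketch, assuming res is an http/Express response, items is the LokiJS collection from the question, and building the JSON string up front is acceptable) would be:

// Sketch only: serialize the LokiJS collection once, then stream it to the
// response in highWaterMark-sized chunks via the InMemoryStream above.
var json = JSON.stringify(items.find());   // the whole table as one string
var responseStream = InMemoryStream.readStream({encoding: 'utf8'}, json);
responseStream.pipe(res);                  // res is writable, so pipe() handles backpressure

Note that the full JSON string is still built in memory first; the streaming only prevents the response from buffering a second full copy of it.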
mycena
  • Sorry, the example isn't great; I wasn't thinking straight: stored simply becomes a Buffer. For array data, change the push() in the prototype _read as follows and you should be okay, at least for string-based items: this.push(this.resource.join(' ').slice(this.begin, this.end)); – mycena Oct 26 '17 at 13:05