
I've got a fairly standard MEAN project setup with the angular-fullstack generator using Yeoman.

What I'm finding is that when GETting a largish (over 65536 bytes) JSON result, it is encoded using gzip and chunked, but the JSON returned is not valid, whether viewed in Chrome or consumed by my Angular client's $resource, because it contains TWO responses! e.g. {name:'hi'}{name:'hi'} for a single id, or [{..},{..}][{..},{..}] for an array.
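For reference, the duplication can be reproduced outside the browser with a small Node script. This is only a sketch; it assumes the dev server runs on localhost:9000 and the endpoint is /api/worlds, so adjust both to your setup:

var http = require('http');
var zlib = require('zlib');

// Request the endpoint with gzip enabled so the compression path is exercised.
http.get({
  host: 'localhost',
  port: 9000,
  path: '/api/worlds',
  headers: { 'Accept-Encoding': 'gzip' }
}, function (res) {
  console.log('content-encoding:', res.headers['content-encoding']);
  console.log('transfer-encoding:', res.headers['transfer-encoding']);

  // Decompress if the response is gzipped, otherwise read it as-is.
  var stream = res.headers['content-encoding'] === 'gzip'
    ? res.pipe(zlib.createGunzip())
    : res;

  var body = '';
  stream.on('data', function (chunk) { body += chunk; });
  stream.on('end', function () {
    try {
      JSON.parse(body);
      console.log('valid JSON,', body.length, 'chars');
    } catch (e) {
      console.log('invalid JSON:', e.message);
    }
  });
});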

The server api endpoint was autogenerated from the angular-fullstack generator and looks something like:

// Get list of worlds
exports.index = function(req, res) {
  World.find(function (err, worlds) {
    if(err) { return handleError(res, err); }
    res.json(200, worlds);
  });
};

If I slice the data so it's not chunked, then the JSON is well formed. I've checked the mongo db and the data is fine there too, and debugging the worlds variable I can JSON.stringify it and get the expected string result without any duplicates. But the moment it's sent, the JSON in the response is doubled up.
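One way to confirm the data is intact right before it leaves the controller is a temporary check in the handler itself. A debugging sketch, not part of the generated code:

// Get list of worlds -- with a temporary sanity check before sending.
exports.index = function (req, res) {
  World.find(function (err, worlds) {
    if (err) { return handleError(res, err); }
    var payload = JSON.stringify(worlds);
    // If this length matches the expected data and the string parses back
    // cleanly, the duplication is happening later in the middleware stack,
    // not in mongoose or the database.
    console.log('payload bytes:', Buffer.byteLength(payload));
    res.json(200, worlds);
  });
};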

Update for comment

angular-fullstack 2.0.4

the schema looks like:

'use strict';

var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

var WorldSchema = new Schema({
  name: String,
  info: String,
  active: Boolean,
  tiles: [Schema.Types.Mixed]
});

module.exports = mongoose.model('World', WorldSchema);

seeded with:

var newWorld = new WorldModel({
    _id: planet._objectId,
    name: "SimDD World",
    tiles: seed()
});
newWorld.save();

...

var seed = function () {
    var data = [];
    for (var i = 0; i < planet.HEIGHT; i++) {
        for (var j = 0; j < planet.WIDTH; j++) {
            data.push({
                coords:{
                    x:i,
                    y:j
                },
                type:'.'
            });
        }
    }
    return data;
};
Joe
  • What version of af are you using? Can you post a model and seed data? – Andy Gaskell Jul 19 '14 at 03:23
  • @AndyGaskell updated the question, although I don't think the data itself is the issue, since I'm able to get the expected structure back with a smaller dataset as long as there is no chunking – Joe Jul 19 '14 at 03:29
  • Wouldn't that be related to Mongo's [GridFS](http://docs.mongodb.org/manual/core/gridfs/) ? – Goodzilla Jul 19 '14 at 10:18
  • @Goodzilla interesting read, but I don't think so... firstly, I'd have to be using `cursor = db.fs.chunks.find({files_id: myFileID}).sort({n:1});` fs-related commands to do my finds, right? The scaffolded server controller doesn't seem to do this at all. And second, the response only comes up to 120k gzipped, nowhere near 16 megs, but over 65k. – Joe Jul 19 '14 at 10:48
  • What browser (and version) are you using, and have you tried to replicate it on any other? – meeDamian Jul 21 '14 at 15:27

4 Answers


Looks like this is being caused by the compression middleware; removing app.use(compression()); from the express config seems to fix it.
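In an angular-fullstack 2.x project that middleware is registered in the generated express config (typically server/config/express.js, though the exact path may differ per project), so the workaround is just commenting out that one line:

// server/config/express.js (excerpt; other middleware omitted)
var compression = require('compression');

module.exports = function (app) {
  // ...other middleware registered by the generator...
  // app.use(compression());  // disabled: avoids the doubled JSON at the
                              // cost of uncompressed responses
};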

DaftMonk
  • hey! DaftMonk :) thanks for the angular fullstack generator, I've learned heaps of best practices just from seeing how you've constructed the server/client controllers, socket, models etc etc. – Joe Jul 24 '14 at 23:34
  • Re your answer: yes, it does fix it, but it increases the transfer size by 10x. Not ideal, but hey, I'm not a greedy guy, this is good enough. Sounds like it's not the root cause though? – Joe Jul 24 '14 at 23:35
  • The fix for this, if you're using `yo-angular-fullstack` (PS thanks for the awesome work!) or the `connect-livereload` module, was to upgrade `connect-livereload` to a newer version `>= 0.4.1`. See the comments in the [issue here](https://github.com/expressjs/compression/issues/20#issuecomment-121787739) – julian soro Mar 22 '16 at 19:45

The issue is seen in browsers but not in Postman. I checked the HTTP request headers, and when I add an 'Accept' header of html in Postman, the same problem appears in Postman as well. So I believe browsers handle the html Accept type differently.
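A quick way to see the difference is to hit the endpoint with both Accept values and compare what comes back. A sketch, assuming a local /api/worlds endpoint on port 9000; both requests opt into gzip so the compression middleware is exercised either way:

var http = require('http');

['application/json', 'text/html'].forEach(function (accept) {
  http.get({
    host: 'localhost',
    port: 9000,
    path: '/api/worlds',
    headers: { Accept: accept, 'Accept-Encoding': 'gzip' }
  }, function (res) {
    var bytes = 0;
    res.on('data', function (chunk) { bytes += chunk.length; });
    res.on('end', function () {
      // If the doubling only happens for one Accept type, the byte counts differ.
      console.log(accept, '->', bytes, 'bytes',
                  '(' + (res.headers['content-encoding'] || 'identity') + ')');
    });
  });
});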

// app.use(require('connect-livereload')());

I came across the same problem when building my angular-fullstack app (thanks, DaftMonk). After some extensive debugging using node-inspector, it turns out the JSON data gets passed to the livereload module and comes out duplicated. Disabling this middleware (the commented-out line above) eliminated the problem for me.
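For reference, the generated express config only registers this middleware in development, so either commenting it out there or upgrading connect-livereload to >= 0.4.1 (as noted in the comments on the compression answer above) avoids the corruption. A sketch of what that block looks like; the exact surrounding code may differ:

// Development-only block in the generated express config (sketch).
if ('development' === env) {
  // Either comment out the livereload middleware...
  // app.use(require('connect-livereload')());
  // ...or upgrade connect-livereload to >= 0.4.1, which no longer mangles
  // responses that also go through the compression middleware.
}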

Mrvicadai

Does this work for you? I don't see a reason why it shouldn't.
I assume you have a planet object with HEIGHT, WIDTH, and _objectId properties.

Remember that if you modify a Mixed type you need to tell Mongoose the value changed and then save it.
http://mongoosejs.com/docs/schematypes.html#mixed

var mongoose = require('mongoose');
var WorldModel = require('../api/world/world.model');
var planet = require('planetSeedData');

var seed = function() {
  var data = [];
  for (var i = 0; i < planet.HEIGHT; i++) {
    for (var j = 0; j < planet.WIDTH; j++) {
      data.push({
        coords: {x:i, y:j},
        type: '.'
      });
    }
  }
  return data;
};

var myPlanet = {
  _id: mongoose.Types.ObjectId(planet._objectId),
  name: "SimDD World",
  tiles : seed()
};

WorldModel.create(myPlanet);

// if you later modify a Mixed path (like tiles) on a fetched document,
// you would flag it and then save:
//   world.markModified('tiles');
//   world.save();
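For completeness, markModified() is called on a document instance rather than on the model. A small sketch of that, assuming a world has already been saved (the tile edit below is hypothetical):

// Mongoose cannot detect changes inside Mixed paths automatically, so after
// editing a tile on a fetched document, flag the path before saving.
WorldModel.findOne({ name: 'SimDD World' }, function (err, world) {
  if (err || !world) { return; }
  world.tiles[0].type = '#';     // hypothetical edit to one tile
  world.markModified('tiles');   // tell Mongoose the Mixed path changed
  world.save();
});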
Josue Alexander Ibarra
    Bit uncomfortable where the answer is heading because as stated in the question, I have no problems with mongoose, creating/saving data to the mongodb or getting it out in my node controller. It's only when I send that data to the client that the response thinks it sensible to duplicate chunks. I don't see how this is much different to how I store/update the data or how this will affect the response. Even when I did this: `res.json(200, JSON.stringify(worlds));` I was still getting issues. This is just sending a string back, no relations to mongoose anymore. – Joe Jul 21 '14 at 06:04
  • Wow, that's really odd. I'll see if I can improve my answer after replicating your results – Josue Alexander Ibarra Jul 21 '14 at 06:12
  • I could not replicate it; the bug is somewhere else. I don't think it's in the database, but more likely in the compression method. Could you make a gist with the minimal code? On the other hand, I think you can patch it with this: `var json = JSON.stringify(worlds); res.json(200, json.substr(0, json.length / 2));` It's ugly but it should work. – Josue Alexander Ibarra Jul 21 '14 at 18:03
  • hum... the worlds array is expanding, but the compression kicks in when it goes over a fixed size (65k). Dividing it by 2 isn't going to cut it. – Joe Jul 24 '14 at 23:38