12

I am working on a Big Data client-side application. The server language is Java. The frontend is mostly vanilla JavaScript, with AngularJS as the MVC framework.

Problem

Dealing with big data analysis, a single REST API response is around 1.5 MB to 3 MB. Using this data to construct the DOM is a pain:

  • First, it takes around 5 to 10 seconds to load the JSON.
  • Then I build the UI (DOM).
  • Once the DOM is constructed, based on the user's interaction with the data, I have to send the same JSON back to the server with updated values.

What options do I have to optimize page responsiveness?

A couple of things I have in mind:

  • Break the JSON into chunks of 1000 at a time; once the DOM is loaded, fetch the remaining data silently and update the UI.
  • GZIP the JSON on the server and decode it back on the client.

Give me your concrete workarounds!

Sample JSON could be:

var data = [
    {
        prop: val,
        prop2: {},
        prop3: [
            {
                id: val,
                prop4: [{}, {}, {}, {}],
                prop5: [[], [], []]
            }
        ]
    },
    {},
    {},
    {}
];

Some Use cases

  • Data size could be 10,000+ objects, nested at a minimum of six or seven levels deep.
  • I need to construct a grid (table); the row count is roughly the same as the number of objects, with a minimum of 100 columns.
  • All data cells have a custom context menu, headers are nested, all columns are sortable, rows are sortable, and the sorted order is sent to the server as soon as the user changes it. But I do have a one-second threshold.
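The one-second threshold for syncing sort order can be handled with a small debounce helper. This is only a sketch; the endpoint in the usage comment is made up:

```javascript
// Debounce: run fn once the user has stopped triggering it for waitMs,
// so rapid re-sorts collapse into a single server request.
function debounce(fn, waitMs) {
  var timer = null;
  return function () {
    var args = arguments;
    clearTimeout(timer);
    timer = setTimeout(function () { fn.apply(null, args); }, waitMs);
  };
}

// Hypothetical usage with the one-second threshold:
// var syncSortOrder = debounce(function (order) {
//   $http.post('/api/sort-order', order);  // '/api/sort-order' is made up
// }, 1000);
```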

A very basic example is here: http://shekhardesigner.github.io/OuchGrid/

absqueued
  • 3,013
  • 3
  • 21
  • 43
  • 1
    Hi, I don't know your design, but what about lazy loading? Maybe you are displaying a large list of data and you could load only the part of the list that is visible to the user. [Here](http://binarymuse.github.io/ngInfiniteScroll/) is a link to ngInfiniteScroll, which does exactly what I said. – Jonas Nov 11 '14 at 11:05
  • Could you please post a sample of the data you are sending to the client? – Yury Tarabanko Nov 11 '14 at 11:10
  • You should configure your server to gzip the response -- especially on slower internet connections, the time taken to compress/decompress should be less than the decrease in download time for such a large file. Remember that the majority of browsers are able to natively handle gzipped responses if you set the correct HTTP headers. – Qantas 94 Heavy Nov 11 '14 at 11:10
  • The data is confidential so I can't post it. – absqueued Nov 11 '14 at 11:11
  • I just need the structure, not the data itself. Fake it LOL – Yury Tarabanko Nov 11 '14 at 11:11
  • Tried gzip, no help. The connection is good enough; it's an intranet app, so users are at executive levels. The only issue is that a 10,000+ rows × columns grid has to be constructed at once. – absqueued Nov 11 '14 at 11:12
  • If you have an array of objects `[{name: value},{name: value}...10000]` you can replace it with an array of arrays `[[value], [value] ...]`, drastically reducing payload size by eliminating repetitive property names. – Yury Tarabanko Nov 11 '14 at 11:16
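A minimal sketch of that array-of-arrays idea, assuming the objects are flat and uniformly shaped (nested structures would need a recursive variant):

```javascript
// Encode an array of same-shaped objects as one list of keys plus value rows,
// so repeated property names are transmitted only once.
function encodeRows(objects) {
  var keys = Object.keys(objects[0]);
  return {
    keys: keys,
    rows: objects.map(function (obj) {
      return keys.map(function (k) { return obj[k]; });
    })
  };
}

// Rebuild the original objects on the client.
function decodeRows(payload) {
  return payload.rows.map(function (row) {
    var obj = {};
    payload.keys.forEach(function (k, i) { obj[k] = row[i]; });
    return obj;
  });
}
```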

4 Answers

9

Some of my advice:

  1. Paginate your response data on the server side first to reduce the size of the JSON object.
  2. Render the UI in parallel, chunk by chunk.
  3. Don't deep-watch the data if it is too big; for example, don't use ngRepeat over too many items if two-way binding is not necessary. That will make your application very slow.
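Points 1 and 2 can be sketched as a paging plan built on the client; the `offset`/`limit` parameters and the `/api/rows` endpoint in the usage comment are assumptions, not a real API:

```javascript
// Given a total row count, compute the page requests needed to fetch it in
// slices instead of one huge response.
function pagePlan(totalRows, pageSize) {
  var pages = [];
  for (var page = 0; page * pageSize < totalRows; page++) {
    pages.push({ offset: page * pageSize, limit: pageSize });
  }
  return pages;
}

// Hypothetical AngularJS usage:
// pagePlan(10000, 1000).forEach(function (p) {
//   $http.get('/api/rows', { params: p }).then(appendRowsToGrid);
// });
```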
Sean
  • 2,990
  • 1
  • 21
  • 31
  • I have done all the possible Angular optimizations: avoiding nested ngRepeat, using bind-once and track by, and building custom directives with careful interceptors. – absqueued Nov 11 '14 at 11:21
  • "A single REST API response is around 1.5MB to 3MB" doesn't make sense. Have you paginated your data on the server side, or do you paginate on the client side? – Sean Nov 11 '14 at 11:23
  • I haven't implemented pagination on the server; that is what I am thinking of doing next. Client-side pagination is not a requirement, so lazy loading is what I am considering as well, but I'm thinking about responsiveness. – absqueued Nov 11 '14 at 11:26
  • Yep, pagination on the server side is crucial. And if the data is static, that is, immutable, then consider using a template to render the HTML directly. – Sean Nov 11 '14 at 11:32
5

"First it takes around 5 to 10 seconds to load the JSON. Then I build the UI (DOM)"

  1. Is it not possible to do these 2 steps asynchronously? For example, loading the DOM first and then waiting for an AJAX callback.

  2. I am not sure if there is a way, since I am lacking details, but maybe you want to rethink your whole process to load "smaller objects" only when you need them.

  3. Think about compressing the object/string somehow.

These are the first 3 ways I can think of right now to optimize. Depending on your use case you might be able to adapt these suggestions.

I hope this helps - feel free to add feedback
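The asynchronous rendering idea can be sketched as a batch renderer; `renderChunk` is a placeholder for whatever code actually appends a slice of rows to the grid:

```javascript
// Render rows in small batches, yielding to the event loop between batches
// so the browser can repaint and handle input instead of freezing.
function renderInBatches(rows, batchSize, renderChunk, onDone) {
  var index = 0;
  function step() {
    renderChunk(rows.slice(index, index + batchSize));
    index += batchSize;
    if (index < rows.length) {
      setTimeout(step, 0); // let the render queue run in between
    } else if (onDone) {
      onDone();
    }
  }
  step();
}
```

Only the first batch renders synchronously; every later batch is scheduled through `setTimeout`, which is what keeps the page responsive while 10,000+ rows stream in.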

Max Bumaye
  • 1,017
  • 10
  • 17
  • Keeping both actions in sync is an issue for two reasons: 1) As soon as the user sees the UI, they start interacting with it and I have to post back the updates. 2) Updating the DOM as soon as I receive data is a bottleneck, as the response is really huge here. I am talking about 10,000+ rows at a time with a minimum of 100 columns. If I keep updating the DOM, the browser freezes. FYI, it's an analytics app. – absqueued Nov 11 '14 at 11:20
  • 1
    Browser freezes means you're spamming your stack with all these updates... you're not giving the render queue enough time to re-render the page -> freeze. You can do the DOM update asynchronously by putting your DOM changes into the queue one by one. This allows the render queue to "render in between" and will look more fluid... not sure if that applies to you either, though. – Max Bumaye Nov 11 '14 at 11:28
  • If you want to update your DOM in batches, just wrap it in a setTimeout(update, 0); /* update is your DOM-update function */ and have it pushed into your queue... the user will see the DOM changes apply and can still interact, since the render queue can push its renders in between your update() calls. – Max Bumaye Nov 11 '14 at 11:34
  • 100 columns? You display all 100 columns at the same time? – Petr Averyanov Nov 11 '14 at 11:40
  • When you have all the data from the server, on the client side just think about pushing "small parts" to the queue and it will render much better than waiting to update EVERYTHING ;D – Max Bumaye Nov 11 '14 at 11:45
  • @MaxBumaye seems like an idea worth trying. @PetrAveryanov: yes, the columns have to be there. Out of the 100, 10+ can be pinned for comparison and the rest would scroll horizontally. – absqueued Nov 11 '14 at 13:11
3

For one of my assignments, we used our own solution. Most of the data was a collection or array, so we implemented a simple algorithm to remove all the redundant property names: we send just one set of property names plus a collection of value arrays. We saw a decent reduction in size, and as a next step some libraries could probably be used to compress further. Again, this works only on structured data; you may need a different algorithm if the array objects have different structures.

Among other techniques, I would recommend looking at natural compression such as removing all whitespace from the JSON and making field names shorter.

Alternatively, you could look at a specification like Protocol Buffers, which will probably yield a much smaller payload. Look at https://github.com/dcodeIO/ProtoBuf.js

If processing your data takes time and freezes the rendering of the screen, you can try Web Workers (in recent browsers) to offload the processing logic, so the main thread/event loop stays available for UI rendering and responding to user actions.
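A sketch of the Web Worker idea: `columnTotals` below is a made-up stand-in for whatever heavy processing the app actually does, and `worker.js` is a hypothetical file name.

```javascript
// A pure aggregation step heavy enough to be worth moving off the main thread.
// Summing each column is only an illustrative workload.
function columnTotals(rows) {
  return rows.reduce(function (totals, row) {
    row.forEach(function (value, i) {
      totals[i] = (totals[i] || 0) + value;
    });
    return totals;
  }, []);
}

// Browser-only wiring (sketch). worker.js would contain columnTotals plus:
//   self.onmessage = function (e) {
//     self.postMessage(columnTotals(JSON.parse(e.data)));
//   };
// And the main thread would do:
//   var worker = new Worker('worker.js');
//   worker.onmessage = function (e) { updateGrid(e.data); };  // updateGrid is hypothetical
//   worker.postMessage(rawJsonString);  // JSON.parse happens off the main thread
```

Note that the raw JSON string is posted to the worker, so even the expensive `JSON.parse` of a multi-megabyte payload stays off the UI thread.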

Suresh Balla
  • 189
  • 2
  • 16
1

"First it takes around 5 to 10 seconds to load the JSON. Then I build the UI (DOM)"

I recommend you load the UI and data asynchronously, but disallow the user from taking certain actions until the appropriate data is loaded.

Once the data is loaded into your variable(s)/service(s), use front-end pagination to minimise the strain on the browser. JavaScript can store a lot of data, but the DOM will struggle to render large quantities of HTML.
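Front-end pagination can be as simple as slicing the in-memory array and only handing the DOM one page at a time; the page size here is illustrative:

```javascript
// Return the slice of rows belonging to a 1-based page number.
// The full dataset stays in JavaScript; only this slice is rendered.
function getPage(rows, pageNumber, pageSize) {
  var start = (pageNumber - 1) * pageSize;
  return rows.slice(start, start + pageSize);
}

// e.g. render page 2 of a large dataset, 500 rows per page:
// renderRows(getPage(allRows, 2, 500));  // renderRows is hypothetical
```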

Shoreline
  • 798
  • 1
  • 9
  • 26