I am working on a big-data, client-side application. The server language is Java. The frontend is mostly vanilla JavaScript, with AngularJS as the MVC framework.
Problem
With big-data analysis, a single REST API response is around 1.5MB to 3MB at a time. Building the DOM from this much data is a pain:
- First, it takes around 5 to 10 seconds just to load the JSON.
- Then I build the UI (DOM).
- Once the DOM is constructed, the user interacts with the data, and I have to send the same JSON back to the server with the updated values (roughly the flow sketched below).
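In simplified form, the current flow looks like this (just a sketch; /api/analysis is a placeholder endpoint name, not my real code):

    // Simplified sketch of the current flow; /api/analysis is a placeholder.
    fetch('/api/analysis')                            // single 1.5MB to 3MB JSON response
        .then(function (res) { return res.json(); })  // takes around 5 to 10 seconds
        .then(function (data) {
            // build the entire grid DOM in one pass (this is the painful part)
        });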
What options do I have to optimize page responsiveness?
A couple of things I have in mind:
- Break the JSON into chunks of 1000 objects at a time; once the initial DOM is loaded, fetch the remaining data silently and update the UI (see the sketch after this list).
- GZIP the JSON on the server and decompress it on the client.
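Roughly what I mean by the chunking idea (just a sketch; the offset/limit paging on /api/data is hypothetical and would need support on the Java side):

    // Sketch of chunked loading; the paged /api/data endpoint is hypothetical.
    function loadInChunks(offset, chunkSize, onChunk) {
        fetch('/api/data?offset=' + offset + '&limit=' + chunkSize)
            .then(function (res) { return res.json(); })
            .then(function (chunk) {
                onChunk(chunk);                     // render only this chunk's rows
                if (chunk.length === chunkSize) {   // a full chunk means more data may remain
                    loadInChunks(offset + chunkSize, chunkSize, onChunk);
                }
            });
    }

    loadInChunks(0, 1000, function (rows) {
        // append this chunk's rows to the grid
    });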
Give me your concrete workarounds!
Sample JSON could be:
var data = [
    {
        prop: val,                          // val is a placeholder
        prop2: {},
        prop3: [
            {
                id: val,
                prop4: [ {}, {}, {}, {} ], // array of nested objects
                prop5: [ [], [], [] ]      // array of nested arrays
            }
        ]
    },
    {},
    {},
    {}
];
Some use cases
- Data size could be around 10,000 objects, nested at least six or seven levels deep.
- I need to construct a grid (table); the number of rows is roughly the number of objects, with a minimum of 100 columns.
- All data cells have a custom context menu, the grid has nested headers, and all columns and rows must be sortable. The sorted order hits the server as soon as the user changes it, though I do apply a one-second threshold (sketched below).
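For reference, this is roughly how I apply the one-second threshold today (a sketch; /api/sortOrder is a placeholder endpoint name):

    // Debounce sketch for the one-second threshold; /api/sortOrder is a placeholder.
    var sortTimer = null;
    function onSortChanged(sortState) {
        clearTimeout(sortTimer);
        sortTimer = setTimeout(function () {
            fetch('/api/sortOrder', {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(sortState)
            });
        }, 1000); // wait one second after the last change before hitting the server
    }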
A very basic example is here: http://shekhardesigner.github.io/OuchGrid/