
I am working on a React and Redux application that uses ImmutableJS to store all of its state. The app receives data from a sensor at about 100 Hz. I need to draw a graph that updates in real time and displays this data. I have been using React-Vis for the graph; the problem is that it expects an array of plain objects, not an ImmutableJS data structure.

I have solved this by converting the ImmutableJS data structure to an array like this:

const data = this.props.HEGPercentage.toIndexedSeq().toArray()

This works, but I am encountering massive lag when I run it with real data, I think because it has to create a new array every time.

How can I create a high performance solution for this and still use ImmutableJS?

  • It sounds like you are dispatching 100 state-changing actions per second. I would think that could slow things down significantly by itself, even without the huge additional overhead of actually trying to re-render a graph with that data at 100fps, so I'm not sure the use of `toIndexedSeq().toArray()` is the problem. If you skip the graph and just do something simple like `console.log(this.props.HEGPercentage)`, does it still make the page lag (making it hard to scroll, type in input fields, or similar)? Just trying to establish whether it's worth optimizing the `toIndexedSeq().toArray()`. – jonahe Sep 18 '17 at 10:57
  • I have some code to simulate sensor data input; I set it to a rate that should be similar and there was no lag. Now I have also refactored the reducer to use regular JS objects instead of Immutable, and it seems to have improved performance a lot. I would still like to use Immutable if possible for consistency with the rest of the app... – Magnus Brantheim Sep 18 '17 at 13:46
  • Ok. Is the sensor data so precise (or the fluctuations so big) that values almost never re-appear? If not, maybe it would be possible to optimize through some kind of caching. If the values are needlessly precise (maybe the screen resolution couldn't represent the difference between 0.00001, and 0.00002 anyway) maybe the input data could be rounded so that values don't change as often (fewer re-renders) and repeat more often (better for caching). Maybe look at https://github.com/reactjs/reselect . – jonahe Sep 18 '17 at 14:01
  • Otherwise I don't know. It seems you'd have to make the translation from Immutable js to an array sooner or later (if that's what the graph lib wants as input). I should say though, that neither performance nor Immutable.js is something I'm an expert in. So yeah.. read my advice as pure speculation. – jonahe Sep 18 '17 at 14:07
  • I am using selectors. I can try the rounding and see if it helps. Would you then do the rounding before you store it in the Redux store? At the moment it usually has around 13 decimal places, which may be unnecessary. I have also implemented averaging so it takes 10 sensor values, averages them, and then stores them, which cuts the updates down to about 10 Hz. – Magnus Brantheim Sep 18 '17 at 14:17
  • Good question. I actually don't know. I would probably start by trying to do it before storing it, yes. But I'm starting to wonder if Redux is the right tool for this. Is it important to have a history of all these actions etc.? Because otherwise it sounds more like a problem suited for [rx.js](https://github.com/reactivex/rxjs) or one of those "reactive" libraries. This part of the program is essentially a flow of transformations on a continuous stream of data: you have a stream source providing sensor data, you group the data points by 10, do averaging, then update the view (a rough sketch of that flow follows these comments). – jonahe Sep 18 '17 at 15:32
  • (But I can also see the argument for not introducing a new library just for this.) – jonahe Sep 18 '17 at 15:34
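For reference, the stream-based grouping and averaging jonahe describes could look roughly like the sketch below. This is only an illustration under a few assumptions: it uses RxJS 6, `sensorReadings$` and the `ADD_READING` action are hypothetical names, and the sensor values are assumed to be plain numbers.

```javascript
import { Subject } from 'rxjs';
import { bufferCount, map } from 'rxjs/operators';

// Hypothetical stream that the sensor callback pushes raw ~100 Hz values into,
// e.g. onSensorData(value => sensorReadings$.next(value));
const sensorReadings$ = new Subject();

// Group readings into batches of 10 and emit their average, bringing the
// update rate down to roughly 10 Hz before anything reaches Redux or the chart.
const averagedReadings$ = sensorReadings$.pipe(
  bufferCount(10),
  map(batch => batch.reduce((sum, value) => sum + value, 0) / batch.length)
);

averagedReadings$.subscribe(average => {
  // e.g. store.dispatch({ type: 'ADD_READING', payload: average });
});
```

Whether this lives alongside Redux or replaces it for the sensor path is the trade-off discussed above; the batching and averaging are the same either way.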

1 Answer


Converting between plain JS objects and Immutable.js objects can be very expensive. In particular, the fromJS() and toJS() operations are the most expensive operations in the Immutable.js library, and should be avoided as much as possible (and especially in Redux mapState functions).

It sounds like you're already at least somewhat on the right track. Use memoized selectors to cut down on expensive transformations, try to round those numbers if possible so that there are fewer meaningless changes, and cut down on the total number of updates overall.
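As an illustration of the memoized-selector point, the conversion from the question could be wrapped in reselect (which the question author mentions already using). This is only a sketch; the `state.HEGPercentage` shape is an assumption, since the real state shape isn't shown.

```javascript
import { createSelector } from 'reselect';

// Hypothetical input selector; adjust to the actual state shape.
const selectHEGPercentage = state => state.HEGPercentage;

// The expensive Immutable -> array conversion now reruns only when the
// HEGPercentage reference actually changes, not on every render.
export const selectHEGPercentageArray = createSelector(
  [selectHEGPercentage],
  HEGPercentage => HEGPercentage.toIndexedSeq().toArray()
);
```

The rounding and averaging discussed in the comments are best applied before the values are stored, so that the Immutable structure (and therefore this selector's output) changes less often.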

My React/Redux links list has a large section of articles on improving performance for React, Redux, and Immutable.js, which may be helpful.

It's also worth noting that I personally advise against use of Immutable.js for a variety of reasons.

markerikson
  • Alright, after reading some of the articles you linked I think I am leaning towards removing Immutable.js from the project, since one of the main concerns of the application is drawing the graph in real time and that seems to require conversion to a plain JS structure. Now I am wondering whether I should just mutate the array. I am using concat each time I add a value, but as far as I know this creates a new array, which seems a bit wasteful. My main concern is real-time rendering of sensor data, preferably as fast and responsive as possible. – Magnus Brantheim Sep 19 '17 at 09:19
  • If you're using React-Redux, then it does rely on immutable updates to produce new references so that it can detect changes and re-render. If you're _not_ using React-Redux, that may not be a concern. However, I'd still go ahead and try doing it the "right" way first, and see how things perform (a sketch of that approach follows these comments). – markerikson Sep 19 '17 at 15:52
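For completeness, the "right" (immutably updating) approach with plain JS arrays could look roughly like the reducer below. `MAX_POINTS` and the `ADD_READING` action are made-up names for illustration; capping the history keeps each copy cheap and keeps the array handed to the chart small.

```javascript
// Keep only the most recent points so every update copies a bounded amount of data.
const MAX_POINTS = 500;

function hegPercentageReducer(state = [], action) {
  switch (action.type) {
    case 'ADD_READING': {
      // Spreading (like concat) creates a new array reference, which is what
      // lets React-Redux detect the change and trigger a re-render.
      const next = [...state, action.payload];
      return next.length > MAX_POINTS
        ? next.slice(next.length - MAX_POINTS)
        : next;
    }
    default:
      return state;
  }
}
```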