I'm using indexOf in a React component to style a button based on whether an object is in a MobX observable array.
The button is for favoriting: clicking it pushes the object for that specific list item into an observable array of objects in the store called 'favorites'.
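To make the setup concrete, here's a minimal sketch of the store shape I'm describing. The names (`Store`, `addFavorite`, `item`) are placeholders for my actual code, and for simplicity `favorites` is a plain array here; in the real app it's wrapped with MobX's `observable`:

```javascript
// Minimal sketch of the store (hypothetical names; in the real app
// `favorites` is a MobX observable array, not a plain one).
class Store {
  constructor() {
    this.favorites = []; // observable array of favorite objects in the real app
  }
  addFavorite(item) {
    this.favorites.push(item); // the button handler pushes the list item's object
  }
}

const store = new Store();
const item = { id: 1, title: 'First item' };
store.addFavorite(item);
console.log(store.favorites.indexOf(item) > -1); // true with a plain array
```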
Here's the ES6 template literal from my button:
className={`btn-group ${((this.props.store.favorites.indexOf(this.props.data) > -1)) ? 'success' : 'info'}`}
Basically, it checks whether the object is in the array: if so, the className will be 'success'; if not, 'info'.
This works perfectly fine when the favorites array is in local state. I get that objects look different once they are inside the observable array, and that the observable favorites array is not the same as a plain local array.
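For context on why it works locally, here's my understanding of the plain-array case: `indexOf` matches objects by identity, so it only finds the exact same reference I pushed, not a structurally equal copy. A small standalone demonstration (no MobX involved):

```javascript
// Array.prototype.indexOf compares objects by reference, not by contents.
const item = { id: 1, name: 'Widget' };
const favorites = [item];

console.log(favorites.indexOf(item) > -1);                      // true: same reference
console.log(favorites.indexOf({ id: 1, name: 'Widget' }) > -1); // false: different object
```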
But how do I test whether an object is in an observable array of objects? I tried slice() and peek() and findIndex, but no dice.