I'm looking to accomplish this in a low-overhead way, but I'm also looking to learn what the best practice would be.
I have a React / Zustand / Immer app that calls a 3rd-party API every ~20 seconds; the response is JSON with ~100 entities in it.
Currently, I'm looping through the entities and CRUDing them one at a time through a Zustand store (I'm not using removeBus yet, so that code is rough):
import create from 'zustand'; // zustand v3-style import; v4+ uses `import { create } from 'zustand'`
import { produce } from 'immer';

// Shapes inferred from the initial state and the actions below.
interface Bus {
  id: string;
  latitude: number;
  longitude: number;
  timestamp: string;
  label: string;
  dupes: number;
}

interface MetroState {
  busses: Bus[];
  movement: number;
  addBus: (payload: Bus) => void;
  removeBus: (payload: string) => void;
  patchBus: (payload: Bus) => void;
}

export const useBussesStore = create<MetroState>((set) => ({
  busses: [
    {
      id: '-1',
      latitude: 38.627003,
      longitude: -90.199402,
      timestamp: '1649680408',
      label: 'Welcome to our performance',
      dupes: 1,
    },
  ],
  movement: 0,
  // Append a newly seen bus.
  addBus: (payload: Bus) =>
    set(
      produce((draft: MetroState) => {
        draft.busses.push(payload);
      })
    ),
  // Remove a bus by id (guarded so an unknown id doesn't splice out the last element).
  removeBus: (payload: string) =>
    set(
      produce((draft: MetroState) => {
        const dramaIndex = draft.busses.findIndex((el) => el.id === payload);
        if (dramaIndex !== -1) draft.busses.splice(dramaIndex, 1);
      })
    ),
  // Update an existing bus's position in place.
  patchBus: (payload: Bus) =>
    set(
      produce((draft: MetroState) => {
        const bus = draft.busses.find((el) => el.id === payload.id);
        if (!bus) return;
        bus.latitude = payload.latitude;
        bus.longitude = payload.longitude;
      })
    ),
}));
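For context, the polling side is roughly this shape (fetchBusses and the response handling are simplified placeholders, not the exact code):

const pollBusses = async () => {
  // fetchBusses stands in for the real 3rd-party API call (~100 entities as JSON).
  const incoming: Bus[] = await fetchBusses();
  const { addBus, patchBus } = useBussesStore.getState();

  for (const bus of incoming) {
    // One store update per entity: patch it if it's already tracked, otherwise add it.
    const known = useBussesStore.getState().busses.some((b) => b.id === bus.id);
    if (known) {
      patchBus(bus);
    } else {
      addBus(bus);
    }
  }
};

setInterval(pollBusses, 20_000); // poll every ~20 seconds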
Doing 100 state updates in a row seems wrong to me. I've had this issue for months, and Googling hasn't helped.
Do you have a more logical way to do this?
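The only alternative I've come up with is collapsing the whole poll into a single batched action, so there's one set() per response instead of ~100. Something like this (setBusses is a name I made up; I don't know if this is the idiomatic Zustand/Immer way to do it):

// Hypothetical action to add inside create<MetroState>((set) => ({ ... })) above.
setBusses: (incoming: Bus[]) =>
  set(
    produce((draft: MetroState) => {
      for (const next of incoming) {
        const existing = draft.busses.find((b) => b.id === next.id);
        if (existing) {
          // Update in place so unchanged busses keep their identity.
          existing.latitude = next.latitude;
          existing.longitude = next.longitude;
          existing.timestamp = next.timestamp;
        } else {
          draft.busses.push(next);
        }
      }
      // Optionally drop busses that no longer appear in the feed.
      const ids = new Set(incoming.map((b) => b.id));
      draft.busses = draft.busses.filter((b) => ids.has(b.id));
    })
  ),

The poll would then just call setBusses(incoming) once per response.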
I also have a Redux version of this that just replaces the whole list of entities. It performs well, but it isn't as cool as handling each entity individually. I've been considering a real-time database, but if it comes to that, I might just use my Redux version and bury this project.
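For comparison, the Redux version is essentially just a slice that swaps the whole array on every poll (rough sketch with placeholder names, using Redux Toolkit):

import { createSlice, PayloadAction } from '@reduxjs/toolkit';

const bussesSlice = createSlice({
  name: 'busses',
  initialState: { busses: [] as Bus[] },
  reducers: {
    // One dispatch per poll: throw away the old list and keep the new one.
    replaceBusses(state, action: PayloadAction<Bus[]>) {
      state.busses = action.payload;
    },
  },
});

export const { replaceBusses } = bussesSlice.actions;
export default bussesSlice.reducer;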