Django 2.2
I need to fetch 4000-10000 data rows from a particular datatable (let's call it commonsites), amongst others, to display a webpage.
I can narrow it down to just 3 fields of these 4000-10000 rows: id, name, business_id.
My traffic is low, but I was wondering whether it's a good idea to use caching to serve these 4000-10000 rows.
The data in these rows is unlikely to change. But in case a row does change or get deleted, how do I update/remove that individual row in the cache, rather than invalidating the entire cache?
Or is this even a good idea?
My installs are :
- redis==3.3.11 # https://github.com/antirez/redis
- django-redis==4.11.0 # https://github.com/niwinz/django-redis
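For reference, my `settings.py` would use the standard django-redis 4.x backend configuration along these lines (the Redis URL and database number here are just placeholders for a local instance):

```python
# Assumed CACHES entry for django-redis 4.x; LOCATION is illustrative.
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
        },
    }
}
```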
Update
Adding more clarity: the webpage is read-only. Once the page request is made, the JavaScript frontend makes an API call. This API call then fetches these 4000-10000 data rows from the datatable.
So these data rows are pre-existing data.
The data is sent as a list inside the JSON response.
Just to be clear, the data will not be paginated; it will all be displayed. I haven't measured the payload, so I cannot say exactly how large it is, but I doubt it is more than 5 MB.