
We have APIs written in the back end (Flask) that are consumed by the front end (React). In the back end we merge CSV files using pandas, so after merging all the CSVs we have around 1 million records that we need to send to the front end to show to the end user.

So what are the various ways to send such a large amount of data to the front end, and how can the front end send that data back to the back end after the user updates it? The web application is deployed on GCP App Engine. Any help would be highly appreciated. Thanks!
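For context, the merge step described above can be sketched as follows. This is a minimal illustration, not the asker's actual code; the glob pattern and file layout are assumptions.

```python
# Sketch of merging many CSV files into one DataFrame with pandas.
# The "data/*.csv" pattern is hypothetical.
import glob

import pandas as pd


def merge_csvs(pattern="data/*.csv"):
    # Read each CSV and concatenate them into a single DataFrame,
    # renumbering the index so rows stay unique after the merge.
    frames = [pd.read_csv(path) for path in glob.glob(pattern)]
    return pd.concat(frames, ignore_index=True)
```

With ~1 million rows the merged frame itself is usually fine in memory; the problem is shipping it to the browser in one response, which is what the comments below address.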

  • Request data in chunks, apply pagination, and avoid under- or over-fetching. It all depends on your use case. – pzaenger Aug 28 '23 at 09:20
  • Use lazy loading if needed. – Lokkesh Aug 28 '23 at 09:46
  • @pzaenger After merging the CSV files, I upload the merged CSV to a GCS bucket, then download and read it from the bucket and return the data to the client in chunks. We show the data in the UI like Excel, and the user can update any record from the UI. So what are the ways to get the updated data back to the back end? – Abhishek Pore Aug 28 '23 at 10:51

1 Answer


I think this post has some good ideas that can help you handle the large amount of data you're dealing with:

Restful API - handling large amounts of data.