I have a large CSV file that needs to be parsed in order to make POST requests to a REST API. The POST request body is a JSON string. The first line of the file contains the keys and the following lines contain the values, e.g.
FirstName,LastName
John,Doe
Mark,Twain
So the POST body will be something like {"FirstName":"John", "LastName":"Doe"}.
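To illustrate the mapping I have in mind, here is a minimal sketch in Python (my actual stack may differ); csv.DictReader pairs the header row with each data row:

```python
import csv
import io
import json

# Inline sample matching the CSV above; in practice this comes from a file.
csv_text = "FirstName,LastName\nJohn,Doe\nMark,Twain\n"

# DictReader uses the first row as keys, so each data row
# becomes a dict ready to serialize as a JSON request body.
bodies = [json.dumps(row) for row in csv.DictReader(io.StringIO(csv_text))]

for body in bodies:
    print(body)
```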
This file will be used to create test data for the developers. I will provide a simple page where a developer can enter an account number and hit Submit. The goal is to parse this file and make POST requests to a REST API service.
I want to avoid reading the file each time a request comes in. Instead, I'd like to parse it once on startup and cache the resulting request bodies, so that when a request comes in the body simply needs to be retrieved from the cache. Is caching these POST request bodies on startup the right way to go here?
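What I'm picturing for the startup cache is something like this (a Python sketch; "testdata.csv" is a placeholder path, and the real loading would happen in whatever startup hook my framework provides):

```python
import csv
import json

def load_bodies(path):
    # Parse the CSV once; each row becomes a ready-made JSON body string.
    with open(path, newline="") as f:
        return [json.dumps(row) for row in csv.DictReader(f)]

# Called once at startup; request handlers only read from this list.
# CACHED_BODIES = load_bodies("testdata.csv")
```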
Also, there might be a need to add more CSV files to provide a variety of data. What would be a good way to make this scalable?
The format (key names) of the CSV files will be the same, so each file can be parsed in the same way.
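For the multiple-files case, since every file is parsed identically, I was thinking of scanning a directory so that adding data is just a matter of dropping in another CSV (again only a sketch; the directory layout and cache shape are assumptions):

```python
import csv
import glob
import json
import os

def load_all(data_dir):
    # Parse every *.csv in the directory the same way; the cache maps
    # each file's base name to its list of JSON request bodies.
    cache = {}
    for path in glob.glob(os.path.join(data_dir, "*.csv")):
        with open(path, newline="") as f:
            cache[os.path.basename(path)] = [
                json.dumps(row) for row in csv.DictReader(f)
            ]
    return cache
```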