I have a Flask application that reads a CSV file using pandas and returns some data from it. No manipulation is done to the DataFrame. I have the DataFrame stored in pickle format, so for every request that comes in, the application unpickles the file, reads the data, and returns it to the client.
from flask import Flask, request, jsonify, abort
from flask_cors import CORS, cross_origin
import pandas as pd
import os
import json

application = Flask(__name__)
CORS(application)

@application.route('/Getdata', methods=['GET'])
@cross_origin()
@Validate_API_Key  # custom auth decorator, defined elsewhere
def index():
    fid = request.args.get('Fid', default=0, type=int)
    # The pickle is read from disk on every single request
    df = pd.read_pickle(os.getenv('DFFileName'), compression='gzip')
    res = get_fid_data(fid, df)
    data = res.to_dict(orient='records')
    return jsonify(data=data)
This is how get_fid_data() is set up:

def get_fid_data(fid, df):
    frecord = pd.DataFrame()
    # Certain rows are selected from df based on fid and
    # appended to frecord, which is then returned.
    return frecord
My question is: is there a way to make the df global after reading it initially? Unpickling the df for every request seems unnecessary if it's possible to "persist" the DataFrame in memory for as long as the Flask application is running. I'd like every incoming request to read the df from memory rather than from the file. Is there a way to achieve this?
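What I have in mind is something like the following rough sketch: a load-once helper that unpickles on the first call and hands back the cached DataFrame afterwards. load_df is a name I made up here, and I'm not sure this is the idiomatic way to do it in Flask:

```python
import pandas as pd
from functools import lru_cache

@lru_cache(maxsize=1)
def load_df(path):
    # Unpickles only on the first call; subsequent calls with the same
    # path return the DataFrame already held in memory.
    return pd.read_pickle(path, compression='gzip')
```

The route would then call load_df(os.getenv('DFFileName')) instead of pd.read_pickle(...), so only the first request pays the unpickling cost.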