UPDATE: I have realized that every new run was creating a new Python console, which was causing the extra memory consumption. I had to turn off the setting that creates a new console for each run. For some reason this feature got enabled automatically when I upgraded to PyCharm Professional. Now memory consumption is steady.
My project creates a CSV file named 'pressure_drop.csv' and I want to create a new pandas DataFrame from it using the code below. In this example, pressure_drop.csv has 10150 rows and 12 columns. As you can see, I am deleting some columns that don't need to be shown and then creating a DataFrame by assigning a row and column index. Finally, the DataFrame is written to a new, more readable .csv file that I will use to create interactive charts etc.
The problem is that Python takes up more memory every time the code is run in the console, and it eventually crashes if the code is run enough times. Can you help me understand why this is happening?
For example, Python consumes roughly 100 MB more each time the code is run on the data set above.
import pandas as pd
import numpy as np

def data_frame_creator(result_array):
    # Load the raw results into a NumPy array (project helper)
    array = results_csv_loader(result_array)
    # Drop the columns that don't need to be shown
    array = np.delete(array, [3, 4, 5, 6, 7], 1)
    row_count = array.shape[0] + 1
    # Build the DataFrame with a 1-based row index and the project's column names
    df = pd.DataFrame(data=array, index=np.arange(1, row_count), columns=columns.dataframe_columns)
    df.to_csv('Output.csv')

data_frame_creator('pressure_drop.csv')
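For reference, here is a minimal sketch of how the same transformation could be done with pandas alone, skipping the NumPy intermediate: the unwanted columns are dropped at read time with usecols, and the DataFrame is explicitly released afterwards. The keep_cols list, the no-header assumption, and the function name data_frame_creator_pandas are my assumptions based on the 12-column layout and the np.delete call above; the column names would still come from the project's columns.dataframe_columns.

import gc
import pandas as pd

def data_frame_creator_pandas(result_csv, column_names):
    # Keep every column except positions 3-7, mirroring the np.delete call above (assumed layout)
    keep_cols = [0, 1, 2, 8, 9, 10, 11]
    # Assumption: pressure_drop.csv has no header row
    df = pd.read_csv(result_csv, header=None, usecols=keep_cols)
    df.columns = column_names          # e.g. columns.dataframe_columns from the project
    df.index = range(1, len(df) + 1)   # 1-based row index, as in the original code
    df.to_csv('Output.csv')
    # Drop the reference and collect, so repeated console runs don't keep old objects alive
    del df
    gc.collect()

# data_frame_creator_pandas('pressure_drop.csv', columns.dataframe_columns)

This avoids the extra copy created by np.delete, but whether it changes the per-run memory growth depends on what is actually holding the references; as the update above notes, in this case the growth came from PyCharm opening a new console for each run rather than from the code itself.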