I am using the ARIMA model from pmdarima for time series analysis. To train incrementally, I pickle each fitted model, upload it to S3, and load it back every day to update it with the newest data. I do this for around 300 features, and the pickled models are far too big.
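For context, the per-feature pipeline looks roughly like this (the bucket/key names, synthetic data, and `auto_arima` settings below are simplified placeholders, not my exact code):

```python
import pickle

import boto3
import numpy as np
import pmdarima as pm

s3 = boto3.client("s3")
BUCKET = "my-model-bucket"    # placeholder bucket name
KEY = "models/feature_X.pkl"  # placeholder object key

def save_model(model, key):
    # Serialize the fitted model and upload the bytes to S3.
    s3.put_object(Bucket=BUCKET, Key=key, Body=pickle.dumps(model))

def load_model(key):
    # Download and deserialize a previously saved model.
    return pickle.loads(s3.get_object(Bucket=BUCKET, Key=key)["Body"].read())

# --- initial fit (the real series has ~1M rows; this is a small stand-in) ---
y_train = np.random.default_rng(0).standard_normal(1_000).cumsum()
model = pm.auto_arima(y_train, seasonal=False, suppress_warnings=True)
save_model(model, KEY)

# --- daily incremental update ---
y_new = np.random.default_rng(1).standard_normal(24).cumsum() + y_train[-1]
model = load_model(KEY)
model.update(y_new)  # append the latest observations to the fitted model
save_model(model, KEY)
```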
For example, for one feature X the dataset has about 1 million rows with 2 columns (datetime and y), yet the pickled model for that feature alone is around 4 GB. I don't know what is causing the model to grow this large. With more than 300 features, the storage works out to roughly 300 * 4 GB = 1200 GB per day, and I need to reduce this.
Can someone help me understand why these pickled models are so large, and how I can shrink them?