
I am using the ARIMA model from pmdarima for my time-series analysis. I want to do incremental training, so I save my models to S3 using pickle and then load them to update the model on a daily basis. I train around 300 features. The model sizes are too big. For example, for a feature X, the dataset has 1 million rows with 2 columns, datetime and Y.

The model for this feature is around 4 GB, and I don't know what is causing this much size increase. Since I have more than 300 features, my total model size will be 300 × 4 GB = 1200 GB on a daily basis, and I want to reduce this.

Can someone help me understand this?
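For reference, the daily save/load/update cycle described above can be sketched as follows. This is a minimal sketch, not the actual modelling code: `DummyModel` is a hypothetical stand-in for the fitted pmdarima model (it only mimics the `update()` API that pmdarima's `ARIMA` exposes), and a plain dict stands in for the S3 bucket where `boto3` would be used in practice.

```python
import pickle

class DummyModel:
    """Hypothetical placeholder mimicking the update() API of pmdarima's ARIMA."""
    def __init__(self):
        self.n_obs = 0

    def update(self, y):
        # pmdarima's ARIMA.update(y) refits with the new observations appended;
        # here we only count them.
        self.n_obs += len(y)
        return self

def save_model(model, bucket):
    # pickle.HIGHEST_PROTOCOL is noticeably more compact than the default.
    # If the real model is statsmodels-backed (as pmdarima's ARIMA is),
    # calling model.arima_res_.remove_data() before pickling may shrink the
    # pickle by dropping cached training arrays -- verify against your version.
    bucket["model.pkl"] = pickle.dumps(model, protocol=pickle.HIGHEST_PROTOCOL)

def load_model(bucket):
    return pickle.loads(bucket["model.pkl"])

# Daily incremental-training cycle described in the question:
s3 = {}                        # stand-in for the S3 bucket
save_model(DummyModel(), s3)   # day 0: initial fit, then persist

model = load_model(s3)         # day 1: load yesterday's model,
model.update([1.2, 3.4, 5.6])  # update with the new observations,
save_model(model, s3)          # and persist again

assert load_model(s3).n_obs == 3
```

The point of the sketch is the round trip: whatever the fitted model object holds internally (cached training data, residual arrays) is serialized on every save, which is a likely driver of the per-model size.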

Rohan0980
  • Just to clarify. You said that for a "feature X" you have 1M observations with datetime and Y. So your features are actually your time-series. Is this correct? What are the orders of the ARIMA model you are training? – Hemerson Tacon Aug 02 '21 at 12:19
  • Yes. ARIMA is for time-series forecasting only. My order is (3,1,3) – Rohan0980 Aug 02 '21 at 13:07

0 Answers