I have been playing with XGBoost (specifically, the XGBRegressor) in Python. I used it to create a model with 200 estimators and a max_depth of 14, trained on about 2M data points. It has only 3 features and 1 output. The model is extremely accurate, and I also checked that it is not overfitting. But when I save the model, it is huge! It takes up 160 MB on disk, and when I convert it to C (using Treelite) it is 490 MB on disk. I eventually have to deploy it on machines without Python, and it has to interface with other software that can only load C files. Such a huge size is a major implementation challenge.
Is there something I am not doing right? Or are XGBoost models typically this large? I have looked around on the web, but haven't been able to find anything about this issue.
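For context on whether this size is expected, here is a rough back-of-envelope node count I tried. It assumes the trees grow close to their full depth and guesses about 25 bytes per serialized node; both numbers are assumptions, not anything I measured from the XGBoost format. It lands in the same ballpark as the 160 MB file:

```python
# Back-of-envelope estimate of ensemble size (all figures are rough assumptions).
n_trees = 200
max_depth = 14

# A full binary tree of depth d has 2**(d+1) - 1 nodes.
nodes_per_tree = 2 ** (max_depth + 1) - 1   # 32,767
total_nodes = n_trees * nodes_per_tree      # ~6.55 million

# Guess ~25 bytes per node (split feature, threshold, child links, stats).
bytes_per_node = 25
size_mb = total_nodes * bytes_per_node / 1e6
print(f"~{total_nodes:,} nodes, ~{size_mb:.0f} MB")  # ~6,553,400 nodes, ~164 MB
```

So if deep trees really do fill out, the size may simply scale like `n_estimators * 2**max_depth`, which is why I suspect the depth of 14 more than the 200 estimators.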
I am using Python 3.7 and Anaconda to build my models. I am not sure what other details I should post.