[20:44:18] Allocated 994MB on [0] GeForce MX150, 670MB remaining.
[20:44:19] Allocated 151MB on [0] GeForce MX150, 517MB remaining.
[20:44:19] Allocated 37MB on [0] GeForce MX150, 475MB remaining.
Traceback (most recent call last):
  File "tune_hyper_para.py", line 86, in <module>
    seed=42, metrics=['auc'], verbose_eval=True)
  File "C:\Users\chang\Anaconda3\envs\xgb-gpu\lib\site-packages\xgboost-0.71-py3.6.egg\xgboost\training.py", line 406, in cv
    fold.update(i, obj)
  File "C:\Users\chang\Anaconda3\envs\xgb-gpu\lib\site-packages\xgboost-0.71-py3.6.egg\xgboost\training.py", line 218, in update
    self.bst.update(self.dtrain, iteration, fobj)
  File "C:\Users\chang\Anaconda3\envs\xgb-gpu\lib\site-packages\xgboost-0.71-py3.6.egg\xgboost\core.py", line 894, in update
    dtrain.handle))
  File "C:\Users\chang\Anaconda3\envs\xgb-gpu\lib\site-packages\xgboost-0.71-py3.6.egg\xgboost\core.py", line 130, in _check_call
    raise XGBoostError(_LIB.XGBGetLastError())
xgboost.core.XGBoostError: b'[20:44:22] C:/dev/libs/xgboost/src/tree/updater_gpu.cu:537: GPU plugin exception: c:\\dev\\libs\\xgboost\\src\\tree\\../common/device_helpers.cuh(467): out of memory\n'

The full traceback is listed above. I am taking part in a Kaggle InClass competition and am trying to use XGBoost's GPU support to speed up hyperparameter tuning.

I use pandas' read_csv to load a dataset of 73000 × 300 numerical values, around 150 MB, so I am confused about why there is an "out of memory" error. I have tried slicing the dataset smaller, but the code only works when the slice is less than 1/5 of the original.
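One thing worth checking on the host side: pandas parses numeric CSV columns as float64 by default, so the in-memory array is larger than the ~150 MB file suggests, and XGBoost copies it again when building the DMatrix. A minimal sketch of the footprint arithmetic (the file name `train.csv` is a placeholder, not from the question):

```python
import numpy as np
import pandas as pd

# Hypothetical file name; the real path is not given in the question.
# df = pd.read_csv("train.csv", dtype=np.float32)  # float32 halves host memory

# Footprint of a 73000 x 300 numeric table:
rows, cols = 73000, 300
mb_float64 = rows * cols * 8 / 2**20   # pandas default dtype
mb_float32 = rows * cols * 4 / 2**20   # half the footprint
print(round(mb_float64), round(mb_float32))
```

This only reduces host RAM and transfer size; the GPU-side tree builder allocates additional working memory on top of the data itself, which is likely where the OOM actually comes from.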

Is there some configuration I still need to do? I installed the GPU support with a pre-compiled binary from "Download XGBoost Windows x64 Binaries and Executables". Is that relevant? Or is my GPU simply too weak? Yet Task Manager shows less than half of the GPU's memory in use while the code runs, so I suspect there may be ways to utilize the device more fully.

If the problem is my device, why does the program use so much memory and only manage to process a dataset of around 30 MB?
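For what it's worth, in xgboost 0.7x the exact GPU updater (`gpu_exact`/`updater_gpu`, which the traceback points at) is far more memory-hungry than the histogram-based `gpu_hist` method, which quantizes features into bins. A hedged sketch of parameters that may fit a 2 GB card better — the specific values (`max_bin=16`, `gpu_id=0`) are illustrative assumptions, not taken from the question:

```python
# Sketch: parameters for xgb.cv. 'gpu_hist' builds quantized histograms and
# typically needs much less device memory than the exact GPU updater.
params = {
    'objective': 'binary:logistic',
    'tree_method': 'gpu_hist',   # histogram-based GPU tree construction
    'max_bin': 16,               # fewer bins -> smaller histograms (default 256)
    'gpu_id': 0,                 # the MX150, not the Intel iGPU
    'max_depth': 6,
    'eta': 0.1,
}
# cv_results = xgb.cv(params, dtrain, num_boost_round=500, nfold=5,
#                     seed=42, metrics=['auc'], verbose_eval=True)
print(params['tree_method'])
```

Also note that during cross-validation each fold holds its own copy of the training data on the device, which multiplies the memory requirement compared to a single `train()` call.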

  • What is your GPU? I think the problem comes from how xgboost was installed with regard to your hardware – Mohamed AL ANI May 26 '18 at 14:28
  • Check the available VRAM from your display adapter settings; if it is less than your computation requires, you end up with an OOM error – Surya Tej May 26 '18 at 14:32
  • @Mohamed: My laptop uses an NVIDIA GeForce MX150, which has 2 GB of VRAM, so I think it should be able to handle a dataset of around 200 MB. I also have an Intel integrated graphics card with 1 GB of VRAM. Is that relevant? I was somewhat lazy when installing xgboost, which now seems to have been a bad idea. – Yaomin Chang May 27 '18 at 05:48

0 Answers