Questions tagged [lightgbm]

LightGBM is a gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and efficient with the following advantages: ... Support of parallel and GPU learning. Capable of handling large-scale data.

LightGBM is a high-performance gradient boosting (GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks. It is under the umbrella of the DMTK (http://github.com/microsoft/dmtk) project of Microsoft.

676 questions
8 votes, 2 answers

LightGBM: ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

I was running lightgbm with categorical features: X_train, X_test, y_train, y_test = train_test_split(train_X, train_y, test_size=0.3) train_data = lgb.Dataset(X_train, label=y_train, feature_name=X_train.columns, …
MJeremy • 1,102 • 17 • 27
8 votes, 4 answers

During installation of lightgbm it says that I should install CMake first, even though I have already installed it

I want to install the GPU version of lightgbm on Ubuntu, based on the following command: pip install lightgbm --install-option=--gpu During installation, an error occurs saying "Please install CMake first". After installing CMake, I get the…
Hossein • 2,041 • 1 • 16 • 29
8 votes, 2 answers

Python - LightGBM with GridSearchCV is running forever

Recently, I have been running multiple experiments to compare Python XGBoost and LightGBM. It seems that LightGBM is a new algorithm that people say works better than XGBoost in both speed and accuracy. This is LightGBM GitHub. This is LightGBM…
Cherry Wu • 3,844 • 9 • 43 • 63
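One way to keep the run time tractable is to start from a deliberately small grid with a capped n_estimators and parallel workers, and only widen the search once it finishes. A minimal sketch; the parameter values are illustrative assumptions, not tuned recommendations:

```python
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)

# Keep the grid small at first; each extra value multiplies the number of fits.
param_grid = {
    "num_leaves": [15, 31],
    "learning_rate": [0.05, 0.1],
    "n_estimators": [100, 200],
}

search = GridSearchCV(
    estimator=LGBMClassifier(random_state=42),
    param_grid=param_grid,
    cv=3,           # 3 folds x 8 combinations = 24 fits
    n_jobs=-1,      # use all cores
    verbose=2,      # print progress so a stalled run becomes visible
)
search.fit(X, y)
print(search.best_params_)
```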
7 votes, 1 answer

Need help implementing a custom loss function in lightGBM (Zero-inflated Log Normal Loss)

I'm trying to implement this zero-inflated log normal loss function based on this paper in lightGBM (https://arxiv.org/pdf/1912.07753.pdf) (page 5). But, admittedly, I just don't know how. I don't understand how to get the gradient and Hessian of…
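The ZILN gradient and Hessian themselves are beyond this excerpt, but the plumbing LightGBM expects from a custom objective is fixed: a function of (preds, dataset) returning per-sample grad and hess arrays. A sketch of that skeleton only, with a squared-error placeholder standing in for the ZILN math; the name ziln_objective is hypothetical:

```python
import numpy as np
import lightgbm as lgb

def ziln_objective(preds, train_data):
    """Skeleton of a LightGBM custom objective.

    The squared-error placeholder below stands in for the ZILN gradient
    and Hessian from the paper; only the plumbing is shown here.
    """
    y_true = train_data.get_label()
    grad = preds - y_true          # replace with d(ZILN loss)/d(raw score)
    hess = np.ones_like(preds)     # replace with the second derivative
    return grad, hess

# Usage sketch (dtrain is an lgb.Dataset built elsewhere); note that recent
# LightGBM versions take the callable via params["objective"] instead of the
# older fobj argument of lgb.train:
# booster = lgb.train({"verbosity": -1}, dtrain, num_boost_round=100,
#                     fobj=ziln_objective)
```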
7 votes, 0 answers

Using LightGBM with MultiOutput Regressor and eval set

I am trying to use LightGBM as a multi-output predictor as suggested here. I am trying to forecast values for thirty consecutive days. I have a panel dataset so I can't use the traditional time series approaches. I have a very large dataset so it…
azuber • 389 • 4 • 12
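sklearn's MultiOutputRegressor simply clones one LGBMRegressor per target column, which is why a single shared eval_set does not map onto it cleanly: each clone expects a 1-D validation target. A sketch of the plain wrapper plus a manual per-horizon loop as one possible way to keep early stopping; the shapes and values are illustrative assumptions:

```python
import numpy as np
import lightgbm as lgb
from lightgbm import LGBMRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)
X_train, X_valid = rng.random((500, 10)), rng.random((100, 10))
Y_train, Y_valid = rng.random((500, 30)), rng.random((100, 30))  # 30 daily horizons

# Plain wrapper: one cloned regressor per output column, no eval_set.
wrapped = MultiOutputRegressor(LGBMRegressor(n_estimators=200))
wrapped.fit(X_train, Y_train)

# Manual alternative: one regressor per horizon, each with its own eval_set,
# so early stopping can be used (recent versions take it as a callback;
# older ones used the early_stopping_rounds fit argument).
models = []
for j in range(Y_train.shape[1]):
    reg = LGBMRegressor(n_estimators=1000)
    reg.fit(
        X_train, Y_train[:, j],
        eval_set=[(X_valid, Y_valid[:, j])],
        callbacks=[lgb.early_stopping(stopping_rounds=50)],
    )
    models.append(reg)
```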
7 votes, 2 answers

How can I stop the log output of lightgbm?

I would like to know how to stop lightgbm logging. What kind of settings should I use to stop the log? Also, is there a way to output only my own log messages while the lightgbm log is suppressed?
musako • 897 • 2 • 10 • 26
7 votes, 0 answers

sklearn: FitFailedWarning: Estimator fit failed

As you can see, I have a problem with using sklearn (lightgbm, GridSearchCV). Please let me know how to solve this error. My code is the following: import lightgbm as lgb from lightgbm.sklearn import LGBMClassifier estimator =…
E.Kim • 95 • 1 • 2
7 votes, 1 answer

num_leaves selection in LightGBM?

Is there any rule of thumb to initialize the num_leaves parameter in lightgbm? For example, for a dataset with 1000 features, we know that with a tree depth of 10 it can cover the entire dataset, so we can choose num_leaves accordingly, and the search space for tuning…
Ankish Bansal • 1,827 • 3 • 15 • 25
7 votes, 2 answers

How to write a custom F1 score metric in LightGBM (Python) for multiclass classification

Can someone help me write a custom F1 score for multiclass classification in Python? Edit: I'm editing the question to give a better picture of what I want to do. This is my function for a custom eval F1 score metric for a multiclass problem…
Thanish • 79 • 1 • 3
7 votes, 1 answer

LightGBMError: b'Check failed: config->bagging_freq > 0 && config->bagging_fraction < 1.0f && config->bagging_fraction > 0.0f

I am working with lightGBM in Python and, as it doesn't have enough documentation, I have been unable to tackle this issue for a while. Please help me out with these few questions if anyone with lgb experience is available here. lgb.cv doesn't work when having…
Krithi07 • 481 • 2 • 7 • 18
7 votes, 3 answers

Installing GPU support for LightGBM on Google Colab

Has anyone had luck trying to install GPU support for lightgbm on Google Colab using the notebooks there?
Propanon • 109 • 1 • 6
7 votes, 1 answer

Disambiguating eval, obj (objective), and metric in LightGBM

I'm asking this in reference to the R library lightgbm but I think it applies equally to the Python and Multiverso versions. There are 3 parameters wherein you can choose statistics of interest for your model - metric, eval, and obj. I'm trying to…
Hack-R • 22,422 • 14 • 75 • 131
6 votes, 0 answers

Multi-Class or Multi-Label Classification with LightGBM

I am working on a classification project in which an outcome may belong to multiple classes. For example, the outcomes may belong to Class A, B, and/or C; e.g., A, B, A&B, A&C, B&C, etc. However, I want to predict the probability of a class. For…
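LightGBM's multiclass objective assumes exactly one class per row, so combinations like A&B point to a multi-label setup; one common route is to treat each class as its own binary problem via a one-vs-rest style wrapper, which also yields per-class probabilities. A minimal sketch of both options, with the data shapes as illustrative assumptions:

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(1)
X = rng.random((500, 10))

# Multi-CLASS: exactly one label per row; the sklearn wrapper picks the
# multiclass objective automatically when y has more than two classes.
y_single = rng.integers(0, 3, 500)
clf = LGBMClassifier()
clf.fit(X, y_single)
proba = clf.predict_proba(X)            # shape (500, 3), rows sum to 1

# Multi-LABEL: each row may belong to several classes -> one binary model per class.
Y_multi = rng.integers(0, 2, (500, 3))  # columns = classes A, B, C
multi = MultiOutputClassifier(LGBMClassifier())
multi.fit(X, Y_multi)
probas = [p[:, 1] for p in multi.predict_proba(X)]  # per-class P(label = 1)
```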
6 votes, 1 answer

Reproduce LightGBM Custom Loss Function for Regression

I want to reproduce the custom loss function for LightGBM. This is what I tried: lgb.train(params=params, train_set=dtrain, num_boost_round=num_round, fobj=default_mse_obj) With default_mse_obj being defined as: residual = y_true -…
Franc Weser • 767 • 4 • 16
6 votes, 1 answer

How to implement custom logloss with identical behavior to binary objective in LightGBM?

I am trying to implement my own loss function for binary classification. To get started, I want to reproduce the exact behavior of the binary objective. In particular, I want that: the loss of both functions has the same scale; the training and…
Joel • 15,496 • 7 • 52 • 40