Questions tagged [lightgbm]

LightGBM is a gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and efficient, with the following advantages: ... support for parallel and GPU learning; capable of handling large-scale data.

LightGBM is a high-performance gradient boosting (GBDT, GBRT, GBM, or MART) framework based on decision tree algorithms, used for ranking, classification, and many other machine learning tasks. It is under the umbrella of Microsoft's DMTK (http://github.com/microsoft/dmtk) project.

676 questions
6 votes · 2 answers

Bayesian optimization for a LightGBM model

I can successfully improve the performance of my XGBoost model through Bayesian optimization, but the best I can achieve with Bayesian optimization for LightGBM (my preferred choice) is worse than what I was able to achieve by…
xxyy · 109
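
For the question above, a minimal sketch of Bayesian optimization over LightGBM hyperparameters with the bayes_opt package and lgb.cv; the synthetic dataset, search ranges, and boosting rounds are illustrative assumptions, not the asker's setup.

    import lightgbm as lgb
    from bayes_opt import BayesianOptimization
    from sklearn.datasets import make_classification

    X, y = make_classification(n_samples=5000, n_features=20, random_state=42)

    def lgb_cv_auc(num_leaves, feature_fraction, min_child_samples):
        # Cross-validated AUC for one candidate parameter set
        params = {
            "objective": "binary",
            "metric": "auc",
            "verbosity": -1,
            "num_leaves": int(round(num_leaves)),
            "feature_fraction": feature_fraction,
            "min_child_samples": int(round(min_child_samples)),
        }
        cv = lgb.cv(params, lgb.Dataset(X, label=y),
                    num_boost_round=200, nfold=5, seed=42)
        return max(cv["valid auc-mean"])   # key is "auc-mean" on LightGBM < 4.0

    optimizer = BayesianOptimization(
        f=lgb_cv_auc,
        pbounds={"num_leaves": (15, 255),
                 "feature_fraction": (0.5, 1.0),
                 "min_child_samples": (5, 100)},
        random_state=42,
    )
    optimizer.maximize(init_points=5, n_iter=20)
    print(optimizer.max)   # best AUC found and the parameters that produced it
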
6 votes · 1 answer

LightGBM ignore warning about "boost_from_average"

I am using LightGBM (version 2.2.1). It shows the following warning during training: [LightGBM] [Warning] Starting from the 2.1.2 version, default value for the "boost_from_average" parameter in "binary" objective is true. This may cause significantly…
Mikhail_Sam · 10,602
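
A hedged sketch for the question above, on the assumption that the warning fires because the parameter is left at its changed default: state boost_from_average explicitly, and/or drop verbosity below zero so LightGBM reports only fatal messages. The data and round count are placeholders.

    import lightgbm as lgb
    from sklearn.datasets import make_classification

    X, y = make_classification(n_samples=1000, random_state=0)

    params = {
        "objective": "binary",
        "boost_from_average": True,   # state the choice explicitly; use False to match pre-2.1.2 models
        "verbosity": -1,              # < 0 keeps only fatal messages, hiding [Warning] output
    }
    booster = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=50)
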
6 votes · 2 answers

Lightgbm inside docker libgomp.so.1: cannot open shared object file

I have LightGBM installed on my Mac and tested it earlier for a different project. Now I am inside a Docker container with Python 3.6 on my Mac. As soon as I add import lightgbm as lgbm to my Flask application, I get the error OSError: libgomp.so.1: cannot open…
nad · 2,640
6 votes · 1 answer

Evaluating a test dataset using eval() in LightGBM

I have trained a ranking model with LightGBM using the 'lambdarank' objective. I want to evaluate my model to get the nDCG score for my test dataset at the best iteration, but I have never been able to use lightgbm.Booster.eval() nor…
John · 309
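
A minimal sketch for the question above, with synthetic features, graded labels, and made-up query group sizes: the held-out Dataset has to be built with reference= the training Dataset (or registered via valid_sets) before the Booster can score it, after which the NDCG values are available from booster.best_score.

    import numpy as np
    import lightgbm as lgb

    rng = np.random.default_rng(0)
    X_train, y_train = rng.normal(size=(400, 10)), rng.integers(0, 4, size=400)
    X_test, y_test = rng.normal(size=(100, 10)), rng.integers(0, 4, size=100)
    group_train, group_test = [40] * 10, [20] * 5   # documents per query, per split

    dtrain = lgb.Dataset(X_train, label=y_train, group=group_train)
    # reference= shares the training bin mappers, which evaluation sets require
    dtest = lgb.Dataset(X_test, label=y_test, group=group_test, reference=dtrain)

    params = {"objective": "lambdarank", "metric": "ndcg",
              "eval_at": [5, 10], "verbosity": -1}
    booster = lgb.train(params, dtrain, num_boost_round=100,
                        valid_sets=[dtest], valid_names=["test"])

    print(booster.best_score["test"])   # e.g. {'ndcg@5': ..., 'ndcg@10': ...}
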
6 votes · 0 answers

How to plot a decision tree from a lightgbm model in R?

How do you plot a lightgbm decision tree? I have searched everywhere but I could not find a solution. Here is an example of the model: data(agaricus.train, package = "lightgbm") train <- agaricus.train dtrain <- lgb.Dataset(train$data, label =…
aSRA · 67
6 votes · 1 answer

Loading lightgbm model and using predict with parallel for loop freezes (Python)

I need to use my model to make predictions in batches and in parallel in Python. If I load the model and create the data frames in a regular for loop and use the predict function, it works with no issues. If I create disjoint data frames in…
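
A hedged sketch for the question above of the workaround usually suggested when predict() hangs under fork-based parallelism (OpenMP thread pools do not survive a fork): restrict LightGBM to one thread inside each worker, or use a spawn/loky backend. The model file name and batch shapes are hypothetical.

    import numpy as np
    import lightgbm as lgb
    from joblib import Parallel, delayed

    booster = lgb.Booster(model_file="model.txt")    # hypothetical saved model

    def predict_batch(batch):
        # num_threads=1 keeps OpenMP thread pools out of the worker processes
        return booster.predict(batch, num_threads=1)

    batches = [np.random.rand(1000, booster.num_feature()) for _ in range(4)]
    preds = Parallel(n_jobs=4, backend="loky")(      # loky starts fresh processes instead of forking
        delayed(predict_batch)(b) for b in batches)
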
6 votes · 1 answer

How to set the frequency of metric output in LightGBM?

The documentation says that we can set the parameter metric_freq to set the frequency. I have also tried the verbose parameter; the parameters are set as params = { 'task': 'train', 'boosting_type': 'gbdt', 'objective': 'binary', …
Yu Gu · 2,382
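
A minimal sketch for the question above, assuming the Python API is being used: there the printing interval comes from the log_evaluation callback (verbose_eval in older releases) rather than metric_freq, which targets the CLI/config-file interface. Data and parameters are illustrative.

    import lightgbm as lgb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, random_state=0)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

    dtrain = lgb.Dataset(X_tr, label=y_tr)
    dvalid = lgb.Dataset(X_va, label=y_va, reference=dtrain)

    params = {"objective": "binary", "metric": "auc", "verbosity": -1}
    booster = lgb.train(
        params, dtrain, num_boost_round=100, valid_sets=[dvalid],
        callbacks=[lgb.log_evaluation(period=10)],   # print the metric every 10 rounds
    )                                                # older versions: verbose_eval=10 in lgb.train
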
5 votes · 4 answers

I am not able to run lightgbm on Mac because of an OSError: 'libomp.dylib' (no such file)

I am not able to run lightgbm on my Mac. I have already tried all the solutions I found, but I can't get rid of the problem. When I import lightgbm, the following OSError…
Pedro Silva · 61
5 votes · 1 answer

Optuna LightGBM LightGBMPruningCallback

I am getting an error in my LightGBM modeling while searching for the optimal AUC. Any help would be appreciated. import optuna from sklearn.model_selection import StratifiedKFold from optuna.integration import LightGBMPruningCallback def…
Tinkinc · 449
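
A minimal sketch for the question above of wiring LightGBMPruningCallback into an Optuna study that maximizes AUC; the data, search space, and trial count are illustrative. The metric name handed to the callback has to match what LightGBM actually reports ('auc' on the default 'valid_0' set here).

    import lightgbm as lgb
    import optuna
    from optuna.integration import LightGBMPruningCallback
    from sklearn.datasets import make_classification
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, random_state=0)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

    def objective(trial):
        params = {
            "objective": "binary",
            "metric": "auc",
            "verbosity": -1,
            "num_leaves": trial.suggest_int("num_leaves", 15, 255),
            "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        }
        dtrain = lgb.Dataset(X_tr, label=y_tr)
        dvalid = lgb.Dataset(X_va, label=y_va, reference=dtrain)
        booster = lgb.train(
            params, dtrain, num_boost_round=300, valid_sets=[dvalid],
            callbacks=[LightGBMPruningCallback(trial, "auc")],   # prunes unpromising trials mid-training
        )
        return roc_auc_score(y_va, booster.predict(X_va))

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=20)
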
5 votes · 1 answer

LightGBM with Tweedie loss; I'm confused about the gradient and Hessian used

I'm trying to figure out custom objective functions in LightGBM, and I figured a good place to start would be replicating the built-in functions. The equation LightGBM uses to calculate the Tweedie metric…
Sinnombre · 346
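
For the question above, a sketch of a custom objective that mirrors my reading of LightGBM's built-in Tweedie loss (log link, raw score s = log(mu), variance power rho); treat the formulas as a reconstruction from the source rather than an official statement.

    import numpy as np

    def tweedie_objective(rho=1.5):
        """Custom objective replicating the assumed built-in Tweedie loss:
           grad = -y * exp((1 - rho) * s) + exp((2 - rho) * s)
           hess = -y * (1 - rho) * exp((1 - rho) * s) + (2 - rho) * exp((2 - rho) * s)
           where s is the raw score, i.e. the log of the predicted mean."""
        def _objective(preds, train_data):
            y = train_data.get_label()
            grad = -y * np.exp((1 - rho) * preds) + np.exp((2 - rho) * preds)
            hess = (-y * (1 - rho) * np.exp((1 - rho) * preds)
                    + (2 - rho) * np.exp((2 - rho) * preds))
            return grad, hess
        return _objective

Passed as the training objective (via params["objective"] in LightGBM 4.x, or the fobj argument in 3.x), this should closely track the built-in 'tweedie' fit, apart from the initial-score handling that custom objectives skip.
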
5 votes · 1 answer

Suppress LightGBM warnings in Optuna

I am getting the warnings below while using Optuna to tune my model. How can I suppress these warnings? [LightGBM] [Warning] feature_fraction is set=0.2, colsample_bytree=1.0 will be ignored. Current value:…
ffl · 91
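
A hedged sketch for the question above: that particular warning appears when two aliases of the same parameter (feature_fraction and colsample_bytree here) both reach LightGBM, so pass only one of them; verbosity=-1 hides most remaining C++-side warnings, and optuna.logging quiets Optuna's own output. The data and search space are illustrative.

    import lightgbm as lgb
    import optuna
    from sklearn.datasets import make_classification

    optuna.logging.set_verbosity(optuna.logging.WARNING)   # silence Optuna's INFO lines

    X, y = make_classification(n_samples=1000, random_state=0)

    def objective(trial):
        params = {
            "objective": "binary",
            "verbosity": -1,   # keep LightGBM itself quiet
            # use feature_fraction only; also passing colsample_bytree triggers the alias warning
            "feature_fraction": trial.suggest_float("feature_fraction", 0.4, 1.0),
        }
        cv = lgb.cv(params, lgb.Dataset(X, label=y), num_boost_round=100, nfold=3)
        return min(cv["valid binary_logloss-mean"])   # key lacks "valid " on LightGBM < 4.0

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=10)
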
5 votes · 4 answers

Optuna pass dictionary of parameters from "outside"

I am using Optuna to optimize some objective functions. I would like to create a custom class that "wraps" the standard Optuna code. As an example, this is my class (it is still a work in progress!): class Optimizer(object): def…
Mattia Surricchio · 1,362
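
For the question above, a minimal sketch (illustrative names, toy objective) of one way to pass a parameter dictionary in "from outside": make the objective a callable class that stores the dictionary, since study.optimize only needs a callable that accepts a trial. A functools.partial over a plain function achieves the same thing.

    import optuna

    class Objective:
        def __init__(self, fixed_params):
            self.fixed_params = fixed_params        # supplied by the caller

        def __call__(self, trial):
            params = dict(self.fixed_params)        # merge fixed and searched values
            params["x"] = trial.suggest_float("x", -10.0, 10.0)
            return (params["x"] - params["offset"]) ** 2

    study = optuna.create_study(direction="minimize")
    study.optimize(Objective({"offset": 2.0}), n_trials=30)
    print(study.best_params)
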
5 votes · 2 answers

LightGBM on Numerical+Categorical+Text Features >> TypeError: Unknown type of parameter:boosting_type, got:dict

I'm trying to train a LightGBM model on a dataset consisting of numerical, categorical, and textual data. However, during the training phase, I get the following error: params = { 'num_class':5, 'max_depth':8, 'num_leaves':200, 'learning_rate':…
redwolf_cr7 · 1,845
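
A hedged sketch of what typically produces this exact TypeError: the parameter dict is passed positionally to LGBMClassifier, so it lands in the first constructor argument, boosting_type, and fails at fit time. Unpacking with ** is the usual fix; the parameter values below are illustrative.

    import lightgbm as lgb

    params = {"num_leaves": 200, "max_depth": 8, "learning_rate": 0.05}

    # clf = lgb.LGBMClassifier(params)    # wrong: the dict becomes boosting_type
    clf = lgb.LGBMClassifier(**params)    # right: keys become keyword arguments
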
5 votes · 0 answers

Hyperparameter Optimization with LGBMRanker

I'm using LGBMRanker for a ranking problem and want to optimize the hyperparameters with GridSearchCV. I have three splits of data (X, y): X_1, X_2, X_3, y_1, y_2, y_3. I also have the query group sizes for each split (three lists): gp_1, gp_2, gp_3. I'm…
Ellie · 51
5 votes · 1 answer

LightGBM error: ValueError: For early stopping, at least one dataset and eval metric is required for evaluation

I am trying to train a LightGBM model with grid search, and I get the error below when I try to train the model: ValueError: For early stopping, at least one dataset and eval metric is required for evaluation. I have provided a validation dataset and evaluation…
deep · 91
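
A minimal sketch for the question above, with synthetic data and an illustrative grid: the ValueError goes away once fit() receives an eval_set and a metric, which GridSearchCV forwards as fit parameters; newer LightGBM releases expect the early-stopping callback instead of the old early_stopping_rounds argument.

    import lightgbm as lgb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, train_test_split

    X, y = make_classification(n_samples=2000, random_state=0)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

    grid = GridSearchCV(
        lgb.LGBMClassifier(n_estimators=500),
        param_grid={"num_leaves": [31, 63], "learning_rate": [0.05, 0.1]},
        cv=3,
    )
    grid.fit(
        X_tr, y_tr,
        eval_set=[(X_va, y_va)],              # without this there is nothing to evaluate
        eval_metric="auc",
        callbacks=[lgb.early_stopping(50)],   # older versions: early_stopping_rounds=50
    )
    print(grid.best_params_)
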