- optuna.integration.lightgbm.train(params, train_set, num_boost_round=1000, valid_sets=None, valid_names=None, feval=None, feature_name='auto', categorical_feature='auto', keep_training_booster=False, callbacks=None, time_budget=None, sample_size=None, study=None, optuna_callbacks=None, model_dir=None, verbosity=None, show_progress_bar=True, *, optuna_seed=None)
Wrapper of LightGBM Training API to tune hyperparameters.
It optimizes the following hyperparameters in a stepwise manner: lambda_l1, lambda_l2, num_leaves, feature_fraction, bagging_fraction, bagging_freq and min_child_samples. It is a drop-in replacement for lightgbm.train(). See a simple example of LightGBM Tuner which optimizes the validation log loss of cancer detection.
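A minimal usage sketch of the drop-in replacement, assuming optuna and lightgbm are installed. The function name `tune_binary_classifier` and the data-split argument names are illustrative, not part of the API; the expensive tuning call is wrapped in a function so nothing trains on import.

```python
def tune_binary_classifier(X_train, y_train, X_valid, y_valid):
    # Imports are local so the sketch can be defined without the
    # libraries installed; calling it requires optuna and lightgbm.
    import lightgbm as lgb
    import optuna.integration.lightgbm as lgb_tuner

    dtrain = lgb.Dataset(X_train, label=y_train)
    dvalid = lgb.Dataset(X_valid, label=y_valid)

    params = {
        "objective": "binary",
        "metric": "binary_logloss",
        "verbosity": -1,
    }
    # Same call shape as lightgbm.train(); the tuner sweeps the
    # stepwise hyperparameters listed above.
    booster = lgb_tuner.train(
        params,
        dtrain,
        valid_sets=[dvalid],
        time_budget=600,  # stop tuning after 10 minutes
        optuna_seed=42,   # reproducible sampling
    )
    return booster
```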
time_budget (int | None) – A time budget for parameter tuning in seconds.
study (Study | None) – A Study instance to store optimization results. The Trial instances in it have the following user attributes:
elapsed_secs is the elapsed time since the optimization starts.
average_iteration_time is the average time of iteration to train the booster model in the trial.
lgbm_params is a JSON-serialized dictionary of LightGBM parameters used in the trial.
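A hypothetical helper (not part of Optuna) showing how these user attributes could be read back from a finished study. It relies only on each trial exposing `number` and `user_attrs`, so any trial-like object works.

```python
def summarize_trials(trials):
    """Return (trial_number, elapsed_secs, average_iteration_time) tuples
    from the user attributes the tuner records on each trial."""
    return [
        (
            t.number,
            t.user_attrs.get("elapsed_secs"),
            t.user_attrs.get("average_iteration_time"),
        )
        for t in trials
    ]
```

Typically this would be called as `summarize_trials(study.trials)` once tuning finishes.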
optuna_callbacks (list[Callable[[Study, FrozenTrial], None]] | None) – List of Optuna callback functions that are invoked at the end of each trial. Each function must accept two parameters with the following types in this order: Study and FrozenTrial. Please note that this is not a callbacks argument of lightgbm.train().
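A sketch of one such callback: any callable with the (study, frozen_trial) signature qualifies. The logging format here is illustrative, and it reads the elapsed_secs user attribute documented above.

```python
def log_trial(study, frozen_trial):
    # Called by the tuner after each trial finishes; `study` holds the
    # overall optimization state, `frozen_trial` the finished trial.
    print(
        f"trial {frozen_trial.number}: value={frozen_trial.value} "
        f"elapsed={frozen_trial.user_attrs.get('elapsed_secs')}s"
    )

# Passed as: train(..., optuna_callbacks=[log_trial])
```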
model_dir (str | None) – A directory to save boosters. By default, it is set to None and no boosters are saved. Please set a shared directory (e.g., directories on NFS) if you want to access get_best_booster() in distributed environments. Otherwise, it may raise ValueError. If the directory does not exist, it will be created. The filenames of the boosters will be {model_dir}/{trial_number}.pkl.
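A sketch of reloading one trial's pickled booster from a shared model_dir, assuming the filename pattern above. The helper name `load_booster` is hypothetical, not part of Optuna.

```python
import os
import pickle

def load_booster(model_dir, trial_number):
    # Boosters are saved one per trial as {model_dir}/{trial_number}.pkl,
    # so a worker in a distributed setup can reload any of them by number.
    path = os.path.join(model_dir, f"{trial_number}.pkl")
    with open(path, "rb") as f:
        return pickle.load(f)
```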
verbosity (int | None) –
A verbosity level to change Optuna's logging level. The level is aligned to LightGBM's verbosity.
Deprecated in v2.0.0.
The verbosity argument will be removed in the future. The removal of this feature is currently scheduled for v4.0.0, but this schedule is subject to change.
show_progress_bar (bool) –
Flag to show progress bars or not. To disable the progress bar, set this to False.
Progress bars will be fragmented by logging messages of LightGBM and Optuna. Please suppress such messages to show the progress bars properly.
optuna_seed (int | None) –
A seed of TPESampler for the random number generator that affects sampling for num_leaves, bagging_fraction, bagging_freq, lambda_l1, and lambda_l2.
The deterministic parameter of LightGBM makes training reproducible. Please enable it when you use this argument.
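A reproducibility sketch pairing optuna_seed with LightGBM's deterministic parameter, as the note above recommends. The specific parameter values here are illustrative assumptions, not required settings.

```python
# Fix both sides of the randomness: `optuna_seed` controls the sampler,
# while these LightGBM parameters make training itself repeatable.
params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "deterministic": True,   # makes LightGBM training reproducible
    "force_row_wise": True,  # pins the histogram mode, which LightGBM
                             # recommends alongside `deterministic`
    "seed": 0,
}
# Then: train(params, dtrain, valid_sets=[dvalid], optuna_seed=0)
```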
train_set (lgb.Dataset) –
num_boost_round (int) –
valid_names (Any | None) –
feval (Callable[..., Any] | None) –
feature_name (str) –
categorical_feature (str) –
keep_training_booster (bool) –
callbacks (list[Callable[..., Any]] | None) –
sample_size (int | None) –
- Return type: