optuna.integration.lightgbm.train(*args: Any, **kwargs: Any) → Any

Wrapper of LightGBM Training API to tune hyperparameters.

It tunes important hyperparameters (e.g., min_child_samples and feature_fraction) in a stepwise manner. It is a drop-in replacement for lightgbm.train(). See the simple example of LightGBM Tuner, which optimizes the validation log loss of cancer detection.

train() is a wrapper function of LightGBMTuner. To use Optuna features such as suspended/resumed optimization and/or parallelization, use LightGBMTuner instead of this function.

Arguments and keyword arguments for lightgbm.train() can be passed.