optuna.integration.lightgbm.LightGBMTunerCV
class optuna.integration.lightgbm.LightGBMTunerCV(params: Dict[str, Any], train_set: lgb.Dataset, num_boost_round: int = 1000, folds: Optional[Union[Generator[Tuple[int, int], None, None], Iterator[Tuple[int, int]], BaseCrossValidator]] = None, nfold: int = 5, stratified: bool = True, shuffle: bool = True, fobj: Optional[Callable[[…], Any]] = None, feval: Optional[Callable[[…], Any]] = None, feature_name: str = 'auto', categorical_feature: str = 'auto', early_stopping_rounds: Optional[int] = None, fpreproc: Optional[Callable[[…], Any]] = None, verbose_eval: Optional[Union[bool, int]] = True, show_stdv: bool = True, seed: int = 0, callbacks: Optional[List[Callable[[…], Any]]] = None, time_budget: Optional[int] = None, sample_size: Optional[int] = None, study: Optional[optuna.study.Study] = None, optuna_callbacks: Optional[List[Callable[[optuna.study.Study, optuna.trial._frozen.FrozenTrial], None]]] = None, verbosity: int = 1)

Hyperparameter tuner for LightGBM with cross-validation.
It employs the same stepwise approach as LightGBMTuner. LightGBMTunerCV invokes lightgbm.cv() to train and validate boosters, while LightGBMTuner invokes lightgbm.train(). See a simple example which optimizes the validation log loss of cancer detection.

Arguments and keyword arguments for lightgbm.cv() can be passed except metrics, init_model and eval_train_metric. The arguments that only LightGBMTunerCV has are listed below:

Parameters
time_budget – A time budget for parameter tuning in seconds.

study – A Study instance to store optimization results. The Trial instances in it have the following user attributes: elapsed_secs is the elapsed time since the optimization started, average_iteration_time is the average time per iteration spent training the booster model in the trial, and lgbm_params is a JSON-serialized dictionary of LightGBM parameters used in the trial.

optuna_callbacks – List of Optuna callback functions that are invoked at the end of each trial. Each function must accept two parameters with the following types in this order: Study and FrozenTrial. Please note that this is not the callbacks argument of lightgbm.train(). A minimal callback sketch is shown below.
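The following is a minimal usage sketch, not taken from the Optuna documentation: the dataset, the parameter values, and the logging_callback helper passed through optuna_callbacks are illustrative assumptions.

    import lightgbm as lgb
    import optuna.integration.lightgbm as lgb_tuner
    from sklearn.datasets import load_breast_cancer

    # Binary-classification data wrapped in a LightGBM Dataset.
    X, y = load_breast_cancer(return_X_y=True)
    dtrain = lgb.Dataset(X, label=y)

    params = {
        "objective": "binary",
        "metric": "binary_logloss",
        "verbosity": -1,
    }

    # An optuna_callbacks function: it is called after every trial with the
    # Study and the finished FrozenTrial.  This logging helper is only an
    # illustration; any callable with this signature works.
    def logging_callback(study, frozen_trial):
        print(f"Trial {frozen_trial.number} finished: value={frozen_trial.value}")

    tuner = lgb_tuner.LightGBMTunerCV(
        params,
        dtrain,
        num_boost_round=1000,
        early_stopping_rounds=100,
        nfold=5,
        optuna_callbacks=[logging_callback],
    )
    tuner.run()  # Runs the stepwise tuning with cross-validation.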
__init__(params: Dict[str, Any], train_set: lgb.Dataset, num_boost_round: int = 1000, folds: Optional[Union[Generator[Tuple[int, int], None, None], Iterator[Tuple[int, int]], BaseCrossValidator]] = None, nfold: int = 5, stratified: bool = True, shuffle: bool = True, fobj: Optional[Callable[[…], Any]] = None, feval: Optional[Callable[[…], Any]] = None, feature_name: str = 'auto', categorical_feature: str = 'auto', early_stopping_rounds: Optional[int] = None, fpreproc: Optional[Callable[[…], Any]] = None, verbose_eval: Optional[Union[bool, int]] = True, show_stdv: bool = True, seed: int = 0, callbacks: Optional[List[Callable[[…], Any]]] = None, time_budget: Optional[int] = None, sample_size: Optional[int] = None, study: Optional[optuna.study.Study] = None, optuna_callbacks: Optional[List[Callable[[optuna.study.Study, optuna.trial._frozen.FrozenTrial], None]]] = None, verbosity: int = 1) → None

    Initialize self. See help(type(self)) for accurate signature.
Methods

__init__(params, train_set[, …])                      Initialize self.
compare_validation_metrics(val_score, best_score)
higher_is_better()
run()                                                 Perform the hyperparameter-tuning with given parameters.
sample_train_set()                                    Make subset of self.train_set Dataset object.
tune_bagging([n_trials])
tune_feature_fraction([n_trials])
tune_feature_fraction_stage2([n_trials])
tune_min_data_in_leaf()
tune_num_leaves([n_trials])
tune_regularization_factors([n_trials])

Attributes

best_params                                           Return parameters of the best booster.
best_score                                            Return the score of the best booster.
property best_params

    Return parameters of the best booster.

property best_score

    Return the score of the best booster.
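Assuming the tuner object from the sketch above has finished running, these properties give access to the result, for example:

    # Hypothetical follow-up to the earlier sketch: read the tuning result.
    print("Best CV score:", tuner.best_score)    # Best value of the chosen metric.
    print("Best params:", tuner.best_params)     # LightGBM parameters of the best booster.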