Integration¶
class optuna.integration.ChainerPruningExtension(trial, observation_key, pruner_trigger)[source]¶

Chainer extension to prune unpromising trials.

Example

Add a pruning extension which observes validation losses to a Chainer Trainer.

trainer.extend(
    ChainerPruningExtension(trial, 'validation/main/loss', (1, 'epoch')))

Parameters:

- trial – A Trial corresponding to the current evaluation of the objective function.
- observation_key – An evaluation metric for pruning, e.g., main/loss or validation/main/accuracy. Please refer to the chainer.Reporter reference for further details.
- pruner_trigger – A trigger to execute pruning. pruner_trigger is an instance of IntervalTrigger or ManualScheduleTrigger. An IntervalTrigger can be specified by a tuple of the interval length and its unit, e.g., (1, 'epoch').
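The mechanism behind the extension can be illustrated without Chainer: each time the trigger fires, the observed metric is reported to the trial, and training is aborted if the pruner judges the trial unpromising. The sketch below is a minimal illustration of that pattern; TrialPruned, FakeTrial, pruning_extension, and run_training are stand-ins invented for this example, not Optuna or Chainer APIs, and the threshold-based pruning rule is a toy assumption.

```python
class TrialPruned(Exception):
    """Raised to abort an unpromising trial (mirrors Optuna's exception)."""


class FakeTrial:
    """Stand-in for a trial: prunes once a reported loss exceeds 0.5 (toy rule)."""
    def __init__(self):
        self.reported = {}

    def report(self, value, step):
        self.reported[step] = value

    def should_prune(self, step):
        return self.reported[step] > 0.5


def pruning_extension(trial, observation, epoch):
    # What a pruning extension does when its trigger fires: report the
    # observed metric, then stop training if the trial should be pruned.
    trial.report(observation, step=epoch)
    if trial.should_prune(epoch):
        raise TrialPruned('Trial pruned at epoch {}.'.format(epoch))


def run_training(trial, losses):
    # Simulates a training loop whose per-epoch validation losses are given.
    for epoch, loss in enumerate(losses):
        pruning_extension(trial, loss, epoch)
    return 'completed'


print(run_training(FakeTrial(), [0.4, 0.3, 0.2]))  # completed
try:
    run_training(FakeTrial(), [0.4, 0.6, 0.2])
except TrialPruned as e:
    print(e)  # Trial pruned at epoch 1.
```

In the real extension, the reported value comes from the Chainer Reporter under observation_key, and the trigger decides at which iterations or epochs the check runs.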
class optuna.integration.ChainerMNStudy(study, comm)[source]¶

A wrapper of Study to incorporate Optuna with ChainerMN.

See also

ChainerMNStudy provides the same interface as Study. Please refer to optuna.study.Study for further details.

Example

Optimize an objective function that trains a neural network written with ChainerMN.

comm = chainermn.create_communicator('naive')
study = optuna.Study(study_name, storage_url)
chainermn_study = optuna.integration.ChainerMNStudy(study, comm)
chainermn_study.optimize(objective, n_trials=25)

Parameters:

- study – A Study object.
- comm – A ChainerMN communicator.

optimize(func, n_trials=None, timeout=None, catch=(<class 'Exception'>, ))[source]¶

Optimize an objective function.

This method provides the same interface as optuna.study.Study.optimize() except for the absence of the n_jobs argument.
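The coordination pattern the wrapper relies on can be sketched without MPI: one rank interacts with the study, and each decision (run another trial, or stop) is shared with the other ranks so that every process trains the same trial. The sketch below is a simplified illustration under that assumption; FakeComm and optimize are stand-ins for this example, not ChainerMN or Optuna APIs.

```python
class FakeComm:
    """Stand-in for a ChainerMN communicator with `size` ranks."""
    def __init__(self, size):
        self.size = size

    def bcast(self, obj):
        # A real communicator would send obj from rank 0 to every rank;
        # here we just return what each rank would receive.
        return [obj] * self.size


def optimize(comm, n_trials):
    # Rank 0 decides which trial to run next; the decision is broadcast so
    # all ranks evaluate the same trial cooperatively.
    executed = []
    for trial_id in range(n_trials):
        messages = comm.bcast(('run', trial_id))
        executed.append([m[1] for m in messages])
    comm.bcast(('stop', None))  # tell every rank the study is finished
    return executed


comm = FakeComm(size=4)
result = optimize(comm, n_trials=3)
print(result)  # each trial id appears once per rank
```

This is why ChainerMNStudy drops the n_jobs argument of Study.optimize(): parallelism comes from the MPI processes themselves, not from threads inside one process.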
class optuna.integration.LightGBMPruningCallback(trial, metric, valid_name='valid_0')[source]¶

Callback for LightGBM to prune unpromising trials.

Example

Add a pruning callback which observes validation scores to the training of a LightGBM model.

param = {'objective': 'binary', 'metric': 'binary_error'}
pruning_callback = LightGBMPruningCallback(trial, 'binary_error')
gbm = lgb.train(param, dtrain, valid_sets=[dtest], callbacks=[pruning_callback])

Parameters:

- trial – A Trial corresponding to the current evaluation of the objective function.
- metric – An evaluation metric for pruning, e.g., binary_error or multi_error. Please refer to the LightGBM reference for further details.
- valid_name – The name of the target validation. Validation names are specified by the valid_names option of the train method. If omitted, valid_0 is used, which is the default name of the first validation. Note that this argument will be ignored if you are calling the cv method instead of the train method.
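LightGBM invokes each callback after every boosting iteration with an environment whose evaluation_result_list holds tuples of (validation name, metric name, value, is_higher_better). The sketch below illustrates how a pruning callback can pick out the (valid_name, metric) entry, report it, and abort training; it is a simplified stand-in for this example, not Optuna's actual implementation, and FakeTrial's threshold rule is a toy assumption.

```python
from collections import namedtuple

# Simplified stand-in for the environment LightGBM passes to callbacks.
CallbackEnv = namedtuple('CallbackEnv', ['iteration', 'evaluation_result_list'])


class TrialPruned(Exception):
    """Raised to abort an unpromising trial (mirrors Optuna's exception)."""


class FakeTrial:
    """Stand-in trial: prunes once a reported error exceeds 0.3 (toy rule)."""
    def report(self, value, step):
        self.last = (step, value)

    def should_prune(self, step):
        return self.last[1] > 0.3


def make_pruning_callback(trial, metric, valid_name='valid_0'):
    def callback(env):
        # Find the target validation/metric pair among all evaluation results.
        for name, metric_name, value, _ in env.evaluation_result_list:
            if name == valid_name and metric_name == metric:
                trial.report(value, step=env.iteration)
                if trial.should_prune(env.iteration):
                    raise TrialPruned('Pruned at iteration {}.'.format(env.iteration))
    return callback


trial = FakeTrial()
cb = make_pruning_callback(trial, 'binary_error')
cb(CallbackEnv(0, [('valid_0', 'binary_error', 0.25, False)]))  # below threshold
print(trial.last)  # (0, 0.25)
try:
    cb(CallbackEnv(1, [('valid_0', 'binary_error', 0.4, False)]))
except TrialPruned as e:
    print(e)  # Pruned at iteration 1.
```

The valid_name filter is why the real callback defaults to valid_0: without explicit valid_names, LightGBM labels the first validation set valid_0.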
class optuna.integration.XGBoostPruningCallback(trial, observation_key)[source]¶

Callback for XGBoost to prune unpromising trials.

Example

Add a pruning callback which observes validation errors to the training of an XGBoost model.

pruning_callback = XGBoostPruningCallback(trial, 'validation-error')
bst = xgb.train(param, dtrain, evals=[(dtest, 'validation')], callbacks=[pruning_callback])

Parameters:

- trial – A Trial corresponding to the current evaluation of the objective function.
- observation_key – An evaluation metric for pruning, e.g., validation-error or validation-merror. Please refer to eval_metric in the XGBoost reference for further details.
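XGBoost calls each callback with an environment whose evaluation_result_list holds ('<data name>-<metric>', value) pairs, which is why observation_key combines the name given in evals with the metric, e.g. validation-error. The sketch below illustrates this matching; it is a simplified stand-in for this example, not Optuna's actual implementation, and FakeTrial's threshold rule is a toy assumption.

```python
from collections import namedtuple

# Simplified stand-in for the environment XGBoost passes to callbacks.
CallbackEnv = namedtuple('CallbackEnv', ['iteration', 'evaluation_result_list'])


class TrialPruned(Exception):
    """Raised to abort an unpromising trial (mirrors Optuna's exception)."""


class FakeTrial:
    """Stand-in trial: prunes once a reported error exceeds 0.3 (toy rule)."""
    def report(self, value, step):
        self.last = (step, value)

    def should_prune(self, step):
        return self.last[1] > 0.3


def make_pruning_callback(trial, observation_key):
    def callback(env):
        # Results are keyed as '<data name>-<metric>', e.g. 'validation-error'.
        for key, value in env.evaluation_result_list:
            if key == observation_key:
                trial.report(value, step=env.iteration)
                if trial.should_prune(env.iteration):
                    raise TrialPruned('Pruned at iteration {}.'.format(env.iteration))
    return callback


trial = FakeTrial()
cb = make_pruning_callback(trial, 'validation-error')
cb(CallbackEnv(0, [('train-error', 0.5), ('validation-error', 0.2)]))
print(trial.last)  # (0, 0.2) — the train-error entry is ignored
```

Note that the train-error entry does not trigger a report: only the metric named by observation_key drives pruning decisions.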