Pruners

class optuna.pruners.MedianPruner(n_startup_trials=5, n_warmup_steps=0, interval_steps=1)

Pruner using the median stopping rule.

Prune if the trial’s best intermediate result is worse than the median of the intermediate results of previous trials at the same step.
Example

We minimize an objective function with the median stopping rule.

>>> from optuna import create_study
>>> from optuna.pruners import MedianPruner
>>>
>>> def objective(trial):
...     ...
>>>
>>> study = create_study(pruner=MedianPruner())
>>> study.optimize(objective)
Parameters:  n_startup_trials – Pruning is disabled until the given number of trials finish in the same study.
 n_warmup_steps – Pruning is disabled until the trial reaches the given number of steps.
 interval_steps – Interval in number of steps between the pruning checks, offset by the warmup steps. If no value has been reported at the time of a pruning check, that particular check will be postponed until a value is reported.
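The decision rule described above can be sketched in a few lines of plain Python. This is a minimal illustration of the median stopping rule, not Optuna's implementation; the function name `should_prune_median` is hypothetical, and a minimization objective is assumed (so "worse" means "greater").

```python
# Sketch of the median stopping rule (illustrative, not Optuna's code).
# Prune the running trial if its best intermediate value so far is worse
# than the median of intermediate values that previous trials reported
# at the same step. Assumes minimization: lower values are better.
from statistics import median

def should_prune_median(best_so_far, previous_values_at_step):
    """best_so_far: the running trial's best (lowest) intermediate value.
    previous_values_at_step: values reported by earlier trials at this step."""
    if not previous_values_at_step:
        return False  # nothing to compare against yet
    return best_so_far > median(previous_values_at_step)

# Previous trials reported 0.2, 0.4, 0.6 at this step (median is 0.4).
print(should_prune_median(0.5, [0.2, 0.4, 0.6]))  # True: 0.5 is worse than 0.4
print(should_prune_median(0.3, [0.2, 0.4, 0.6]))  # False: 0.3 beats the median
```

In the real pruner, `n_startup_trials`, `n_warmup_steps`, and `interval_steps` gate when this comparison is made at all; the sketch shows only the comparison itself.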

class optuna.pruners.NopPruner

Pruner which never prunes trials.
Example
>>> from optuna import create_study
>>> from optuna.pruners import NopPruner
>>>
>>> def objective(trial):
...     ...
>>>
>>> study = create_study(pruner=NopPruner())
>>> study.optimize(objective)

class optuna.pruners.PercentilePruner(percentile, n_startup_trials=5, n_warmup_steps=0, interval_steps=1)

Pruner to keep the specified percentile of the trials.

Prune if the best intermediate value is in the bottom percentile among trials at the same step.
Example
>>> from optuna import create_study
>>> from optuna.pruners import PercentilePruner
>>>
>>> def objective(trial):
...     ...
>>>
>>> study = create_study(pruner=PercentilePruner(25.0))
>>> study.optimize(objective)
Parameters:  percentile – Percentile which must be between 0 and 100 inclusive (e.g., when given 25.0, the trials in the top 25% are kept).
 n_startup_trials – Pruning is disabled until the given number of trials finish in the same study.
 n_warmup_steps – Pruning is disabled until the trial reaches the given number of steps.
 interval_steps – Interval in number of steps between the pruning checks, offset by the warmup steps. If no value has been reported at the time of a pruning check, that particular check will be postponed until a value is reported. Value must be at least 1.
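As with the median rule, the percentile check can be sketched in plain Python. This is an illustrative sketch only, not Optuna's implementation: the function name `should_prune_percentile` is hypothetical, a minimization objective is assumed, and the cutoff uses a simple nearest-rank percentile (real percentile definitions vary in how they interpolate).

```python
# Sketch of a percentile-based pruning check (illustrative, not Optuna's code).
# For a minimization study, keep the running trial only if its best
# intermediate value lies within the best `percentile` percent of the
# values other trials reported at the same step.
def should_prune_percentile(best_so_far, others, percentile):
    if not others:
        return False  # nothing to compare against yet
    ordered = sorted(others)  # ascending: best (lowest) values first
    # Nearest-rank index of the cutoff delimiting the top `percentile` percent.
    k = max(int(len(ordered) * percentile / 100.0) - 1, 0)
    cutoff = ordered[k]
    return best_so_far > cutoff

# With percentile=25.0 and eight previous values, the cutoff is the 2nd best.
values = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
print(should_prune_percentile(0.15, values, 25.0))  # False: within the top 25%
print(should_prune_percentile(0.35, values, 25.0))  # True: outside the top 25%
```

With `percentile=50.0` this reduces to (roughly) the median stopping rule of MedianPruner above.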

class optuna.pruners.SuccessiveHalvingPruner(min_resource=1, reduction_factor=4, min_early_stopping_rate=0)

Pruner using the Asynchronous Successive Halving Algorithm.

Successive Halving is a bandit-based algorithm to identify the best one among multiple configurations. This class implements an asynchronous version of Successive Halving. Please refer to the paper of Asynchronous Successive Halving for detailed descriptions.
Note that this class does not take care of the parameter for the maximum resource, referred to as \(R\) in the paper. The maximum resource allocated to a trial is typically limited inside the objective function (e.g., step number in simple.py, EPOCH number in chainer_integration.py).

Example
We minimize an objective function with SuccessiveHalvingPruner.

>>> from optuna import create_study
>>> from optuna.pruners import SuccessiveHalvingPruner
>>>
>>> def objective(trial):
...     ...
>>>
>>> study = create_study(pruner=SuccessiveHalvingPruner())
>>> study.optimize(objective)
Parameters:  min_resource –
A parameter for specifying the minimum resource allocated to a trial (in the paper this parameter is referred to as \(r\)).
A trial is never pruned until it executes \(\mathsf{min}\_\mathsf{resource} \times \mathsf{reduction}\_\mathsf{factor}^{ \mathsf{min}\_\mathsf{early}\_\mathsf{stopping}\_\mathsf{rate}}\) steps (i.e., the completion point of the first rung). When the trial completes the first rung, it is promoted to the next rung only if its value is within the top \({1 \over \mathsf{reduction}\_\mathsf{factor}}\) fraction of all trials that have already reached that point (otherwise it is pruned there). If the trial wins the competition, it runs until the next completion point (i.e., \(\mathsf{min}\_\mathsf{resource} \times \mathsf{reduction}\_\mathsf{factor}^{ (\mathsf{min}\_\mathsf{early}\_\mathsf{stopping}\_\mathsf{rate} + \mathsf{rung})}\) steps) and repeats the same procedure.
 reduction_factor –
A parameter for specifying the reduction factor of promotable trials (in the paper this parameter is referred to as \(\eta\)). At the completion point of each rung, about \({1 \over \mathsf{reduction}\_\mathsf{factor}}\) of the trials will be promoted.
 min_early_stopping_rate –
A parameter for specifying the minimum early-stopping rate (in the paper this parameter is referred to as \(s\)).
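The rung-completion formula in the min_resource description is easy to evaluate directly. The sketch below, with the hypothetical helper name `rung_completion_step`, simply computes \(\mathsf{min}\_\mathsf{resource} \times \mathsf{reduction}\_\mathsf{factor}^{ (\mathsf{min}\_\mathsf{early}\_\mathsf{stopping}\_\mathsf{rate} + \mathsf{rung})}\) so you can see where promotion decisions would fall for a given configuration.

```python
# Compute the step at which a trial completes rung `rung`, following the
# formula in the parameter descriptions above (illustrative helper only).
def rung_completion_step(rung, min_resource=1, reduction_factor=4,
                         min_early_stopping_rate=0):
    return min_resource * reduction_factor ** (min_early_stopping_rate + rung)

# With the default parameters, promotion decisions happen at steps
# 1, 4, 16, 64, ... and about 1/4 of the trials survive each rung.
print([rung_completion_step(r) for r in range(4)])  # [1, 4, 16, 64]
```

Raising min_early_stopping_rate shifts the whole schedule later (e.g., with the defaults otherwise, a rate of 1 moves the first decision from step 1 to step 4), which makes the pruner more conservative early on.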