optuna.samplers
The samplers module defines a base class for parameter sampling, as described extensively in BaseSampler. The remaining classes in this module represent child classes, deriving from BaseSampler, which implement different sampling strategies.
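For example, a sampler instance is passed to optuna.create_study(); the toy objective below is made up for illustration:

```python
import optuna


def objective(trial):
    # Toy quadratic objective, purely for illustration.
    x = trial.suggest_float("x", -10, 10)
    return x ** 2


# Pass the desired sampler to create_study(); TPESampler is the default.
study = optuna.create_study(sampler=optuna.samplers.TPESampler())
study.optimize(objective, n_trials=20)
```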
See also
The 3. Efficient Optimization Algorithms tutorial gives an overview of the sampler classes.
See also
The User-Defined Sampler tutorial may be helpful if you want to implement your own sampler class.
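As a sketch of the pattern from that tutorial, a minimal custom sampler subclasses BaseSampler and implements three methods; SimplifiedSampler is a hypothetical name, and delegating to RandomSampler is just one possible choice:

```python
import optuna


class SimplifiedSampler(optuna.samplers.BaseSampler):
    """Hypothetical sampler that samples every parameter independently at random."""

    def __init__(self):
        self._fallback = optuna.samplers.RandomSampler()

    def infer_relative_search_space(self, study, trial):
        # An empty search space disables relative (multivariate) sampling.
        return {}

    def sample_relative(self, study, trial, search_space):
        return {}

    def sample_independent(self, study, trial, param_name, param_distribution):
        # Delegate each parameter to an independent RandomSampler.
        return self._fallback.sample_independent(
            study, trial, param_name, param_distribution
        )
```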
| | RandomSampler | GridSampler | TPESampler | CmaEsSampler | NSGAIISampler | QMCSampler | BoTorchSampler |
|---|---|---|---|---|---|---|---|
| Float parameters | ✅ | ✅ | ✅ | ✅ | ▲ | ✅ | ✅ |
| Integer parameters | ✅ | ✅ | ✅ | ✅ | ▲ | ✅ | ✅ |
| Categorical parameters | ✅ | ✅ | ✅ | ▲ | ✅ | ▲ | ✅ |
| Pruning | ✅ | ✅ | ✅ | ▲ | ❌ | ✅ | ▲ |
| Multivariate optimization | ▲ | ▲ | ✅ | ✅ | ▲ | ▲ | ✅ |
| Conditional search space | ✅ | ▲ | ✅ | ▲ | ▲ | ▲ | ▲ |
| Multi-objective optimization | ✅ | ▲ | ✅ | ❌ | ✅ (▲ for single-objective) | ▲ | ✅ |
| Batch optimization | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ▲ |
| Distributed optimization | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ▲ |
| Constrained optimization | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ | ✅ |
| Time complexity (per trial) (*) | \(O(d)\) | \(O(dn)\) | \(O(dn \log n)\) | \(O(d^3)\) | \(O(mnp)\) | \(O(dn)\) | \(O(n^3)\) |
| Recommended budgets (#trials) (**) | as many as one likes | number of combinations | 100 ~ 1000 | 1000 ~ 10000 | 100 ~ 10000 | as many as one likes | 10 ~ 100 |
Note
✅: Supports this feature. ▲: Works, but inefficiently. ❌: Causes an error, or has no interface.
(*): We assume that \(d\) is the dimension of the search space, \(n\) is the number of finished trials, \(m\) is the number of objectives, and \(p\) is the population size (an algorithm-specific parameter). This table shows the time complexity of the sampling algorithms; other terms that depend on the implementation in Optuna, including \(O(d)\) to call the sampling methods and \(O(n)\) to collect the completed trials, are omitted. This means that, for example, the actual time complexity of RandomSampler is \(O(d + n + d) = O(d + n)\). Also note that, with the exception of NSGAIISampler, all time complexities are written for single-objective optimization.
(**): The budget depends on the number of parameters and the number of objectives.
Note
- For float, integer, or categorical parameters, see the 2. Pythonic Search Space tutorial.
- For pruning, see the 3. Efficient Optimization Algorithms tutorial.
- For multivariate optimization, see BaseSampler. Multivariate optimization is implemented as sample_relative() in Optuna; see the documentation of each sampler for details.
- For conditional search spaces, see the 2. Pythonic Search Space tutorial and TPESampler. The group option of TPESampler allows it to handle conditional search spaces (see the sketch after this list).
- For multi-objective optimization, see the Multi-objective Optimization with Optuna tutorial.
- For batch optimization, see the Batch Optimization tutorial. Note that the constant_liar option of TPESampler allows it to handle batch optimization.
- For distributed optimization, see the 4. Easy Parallelization tutorial. Note that the constant_liar option of TPESampler also helps with distributed optimization.
- For constrained optimization, see an example, as well as the constraints sketch after this list.
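To make the TPESampler options mentioned above concrete, here is a minimal sketch combining them; the objective and parameter names are invented for illustration:

```python
import optuna


def objective(trial):
    # Conditional search space: "y" is only suggested on one branch.
    classifier = trial.suggest_categorical("classifier", ["svm", "tree"])
    x = trial.suggest_float("x", -10, 10)
    if classifier == "svm":
        y = trial.suggest_float("y", -10, 10)
        return x ** 2 + y ** 2
    return x ** 2


# multivariate=True enables relative (joint) sampling via sample_relative(),
# group=True lets multivariate TPE decompose the conditional search space,
# and constant_liar=True reduces duplicated suggestions in batch or
# distributed optimization.
sampler = optuna.samplers.TPESampler(multivariate=True, group=True, constant_liar=True)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=100)
```

For constrained optimization, the samplers marked ✅ in the table (e.g. TPESampler, NSGAIISampler, BoTorchSampler) accept a constraints_func argument. The following sketch assumes the usual Optuna convention that constraint values less than or equal to zero mean feasible; storing the value in user attributes is one common pattern, not the only one:

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    y = trial.suggest_float("y", -10, 10)
    # Record the constraint value (x + y <= 10 must hold to be feasible).
    trial.set_user_attr("constraint", x + y - 10)
    return x ** 2 + y ** 2


def constraints(trial):
    # Values <= 0 are interpreted as feasible.
    return (trial.user_attrs["constraint"],)


sampler = optuna.samplers.TPESampler(constraints_func=constraints)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=50)
```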
| Class / Function | Summary |
|---|---|
| BaseSampler | Base class for samplers. |
| GridSampler | Sampler using grid search. |
| RandomSampler | Sampler using random sampling. |
| TPESampler | Sampler using TPE (Tree-structured Parzen Estimator) algorithm. |
| CmaEsSampler | A sampler using cmaes as the backend. |
| PartialFixedSampler | Sampler with partially fixed parameters. |
| NSGAIISampler | Multi-objective sampler using the NSGA-II algorithm. |
| MOTPESampler | Multi-objective sampler using the MOTPE algorithm. |
| QMCSampler | A Quasi Monte Carlo Sampler that generates low-discrepancy sequences. |
| IntersectionSearchSpace | A class to calculate the intersection search space of a study. |
| intersection_search_space | Return the intersection search space of the given study. |
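Finally, a brief usage sketch for the classes listed above; GridSampler requires an explicit search space, and the grid values here are made up for illustration:

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", -100, 100)
    y = trial.suggest_int("y", -100, 100)
    return x ** 2 + y ** 2


# GridSampler exhaustively evaluates the Cartesian product of the given values.
search_space = {"x": [-50.0, 0.0, 50.0], "y": [-99, 0, 99]}
study = optuna.create_study(sampler=optuna.samplers.GridSampler(search_space))
study.optimize(objective, n_trials=9)  # 3 x 3 = 9 grid combinations
```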