optuna.study.Study
- class optuna.study.Study(study_name, storage, sampler=None, pruner=None)[source]
A study corresponds to an optimization task, i.e., a set of trials.
This object provides interfaces to run a new Trial, access trials’ history, and set/get user-defined attributes of the study itself.
Note that direct use of this constructor is not recommended. To create and load a study, please refer to the documentation of create_study() and load_study(), respectively.
Methods
add_trial(trial): Add trial to study.
add_trials(trials): Add trials to study.
ask([fixed_distributions]): Create a new trial from which hyperparameters can be suggested.
enqueue_trial(params[, user_attrs]): Enqueue a trial with given parameter values.
get_trials([deepcopy, states]): Return all trials in the study.
optimize(func[, n_trials, timeout, n_jobs, ...]): Optimize an objective function.
set_system_attr(key, value): Set a system attribute to the study.
set_user_attr(key, value): Set a user attribute to the study.
stop(): Exit from the current optimization loop after the running trials finish.
tell(trial[, values, state, skip_if_finished]): Finish a trial created with ask().
trials_dataframe([attrs, multi_index]): Export trials as a pandas DataFrame.
Attributes
best_params: Return parameters of the best trial in the study.
best_trial: Return the best trial in the study.
best_trials: Return trials located at the Pareto front in the study.
best_value: Return the best objective value in the study.
direction: Return the direction of the study.
directions: Return the directions of the study.
system_attrs: Return system attributes.
trials: Return all trials in the study.
user_attrs: Return user attributes.
- Parameters
study_name (str) –
sampler (Optional[samplers.BaseSampler]) –
pruner (Optional[BasePruner]) –
- add_trial(trial)[source]
Add trial to study.
The trial is validated before being added.
Example
import optuna
from optuna.distributions import FloatDistribution


def objective(trial):
    x = trial.suggest_float("x", 0, 10)
    return x**2


study = optuna.create_study()
assert len(study.trials) == 0

trial = optuna.trial.create_trial(
    params={"x": 2.0},
    distributions={"x": FloatDistribution(0, 10)},
    value=4.0,
)

study.add_trial(trial)
assert len(study.trials) == 1

study.optimize(objective, n_trials=3)
assert len(study.trials) == 4

other_study = optuna.create_study()

for trial in study.trials:
    other_study.add_trial(trial)
assert len(other_study.trials) == len(study.trials)

other_study.optimize(objective, n_trials=2)
assert len(other_study.trials) == len(study.trials) + 2
See also
This method should in general be used to add already evaluated trials (trial.state.is_finished() == True). To queue trials for evaluation, please refer to enqueue_trial().
See also
See create_trial() for how to create trials.
See also
Please refer to Second scenario: Have Optuna utilize already evaluated hyperparameters for a tutorial on manually specifying hyperparameters together with their already evaluated values.
- Parameters
trial (FrozenTrial) – Trial to add.
- Return type
None
- add_trials(trials)[source]
Add trials to study.
The trials are validated before being added.
Example
import optuna


def objective(trial):
    x = trial.suggest_float("x", 0, 10)
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=3)
assert len(study.trials) == 3

other_study = optuna.create_study()
other_study.add_trials(study.trials)
assert len(other_study.trials) == len(study.trials)

other_study.optimize(objective, n_trials=2)
assert len(other_study.trials) == len(study.trials) + 2
See also
See add_trial() for how each trial is added.
- Parameters
trials (Iterable[FrozenTrial]) – Trials to add.
- Return type
None
- ask(fixed_distributions=None)[source]
Create a new trial from which hyperparameters can be suggested.
This method is part of an alternative to optimize() that allows controlling the lifetime of a trial outside the scope of func. Each call to this method should be followed by a call to tell() to finish the created trial.
See also
The Ask-and-Tell Interface tutorial provides use cases with examples.
Example
Getting the trial object with the ask() method.

import optuna

study = optuna.create_study()

trial = study.ask()
x = trial.suggest_float("x", -1, 1)
study.tell(trial, x**2)
Example
Passing previously defined distributions to the ask() method.

import optuna

study = optuna.create_study()

distributions = {
    "optimizer": optuna.distributions.CategoricalDistribution(["adam", "sgd"]),
    "lr": optuna.distributions.FloatDistribution(0.0001, 0.1, log=True),
}

# You can pass the distributions previously defined.
trial = study.ask(fixed_distributions=distributions)

# `optimizer` and `lr` are already suggested and accessible with `trial.params`.
assert "optimizer" in trial.params
assert "lr" in trial.params
- Parameters
fixed_distributions (Optional[Dict[str, BaseDistribution]]) – A dictionary containing the parameter names and their distributions. Each parameter in this dictionary is automatically suggested for the returned trial, even when the suggest method is not explicitly invoked by the user. If this argument is set to None, no parameter is automatically suggested.
- Returns
A Trial.
- Return type
Trial
- property best_params: Dict[str, Any]
Return parameters of the best trial in the study.
Note
This feature can only be used for single-objective optimization.
- Returns
A dictionary containing parameters of the best trial.
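Example (an illustrative sketch, not from the original reference; it assumes a simple quadratic objective):

import optuna


def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=20)

# Parameters of the trial that achieved the best objective value so far,
# together with that value.
print(study.best_params)  # e.g. {"x": 0.012}
print(study.best_value)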
- property best_trial: FrozenTrial
Return the best trial in the study.
Note
This feature can only be used for single-objective optimization. If your study is multi-objective, use best_trials instead.
- Returns
A FrozenTrial object of the best trial.
See also
The Re-use the best trial tutorial provides a detailed example of how to use this method.
- property best_trials: List[FrozenTrial]
Return trials located at the Pareto front in the study.
A trial is located at the Pareto front if there are no trials that dominate the trial. A trial t0 is said to dominate another trial t1 if all(v0 <= v1 for v0, v1 in zip(t0.values, t1.values)) and any(v0 < v1 for v0, v1 in zip(t0.values, t1.values)) both hold.
- Returns
A list of FrozenTrial objects.
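Example (an illustrative sketch, not from the original reference; it assumes a small two-objective study with both objectives minimized):

import optuna


def objective(trial):
    x = trial.suggest_float("x", 0, 5)
    y = trial.suggest_float("y", 0, 3)
    # Two conflicting objectives, both minimized.
    return x + y, (5 - x) + (3 - y)


study = optuna.create_study(directions=["minimize", "minimize"])
study.optimize(objective, n_trials=30)

# Trials on the Pareto front, i.e. trials not dominated by any other trial.
for t in study.best_trials:
    print(t.number, t.values)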
- property best_value: float
Return the best objective value in the study.
Note
This feature can only be used for single-objective optimization.
- Returns
A float representing the best objective value.
- property direction: StudyDirection
Return the direction of the study.
Note
This feature can only be used for single-objective optimization. If your study is multi-objective, use directions instead.
- Returns
A StudyDirection object.
- property directions: List[StudyDirection]
Return the directions of the study.
- Returns
A list of StudyDirection objects.
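Example (an illustrative sketch covering direction and directions, not from the original reference):

import optuna
from optuna.study import StudyDirection

single = optuna.create_study(direction="maximize")
assert single.direction == StudyDirection.MAXIMIZE

multi = optuna.create_study(directions=["minimize", "maximize"])
assert multi.directions == [StudyDirection.MINIMIZE, StudyDirection.MAXIMIZE]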
- enqueue_trial(params, user_attrs=None)[source]
Enqueue a trial with given parameter values.
You can fix the parameters to be sampled by the next trial, which will then be evaluated by your objective function.
Example
import optuna


def objective(trial):
    x = trial.suggest_float("x", 0, 10)
    return x**2


study = optuna.create_study()
study.enqueue_trial({"x": 5})
study.enqueue_trial({"x": 0}, user_attrs={"memo": "optimal"})
study.optimize(objective, n_trials=2)

assert study.trials[0].params == {"x": 5}
assert study.trials[1].params == {"x": 0}
assert study.trials[1].user_attrs == {"memo": "optimal"}
- Parameters
- Return type
None
See also
Please refer to First Scenario: Have Optuna evaluate your hyperparameters for the tutorial of specifying hyperparameters manually.
- get_trials(deepcopy=True, states=None)[source]
Return all trials in the study.
The returned trials are ordered by trial number.
Example
import optuna


def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=3)

trials = study.get_trials()
assert len(trials) == 3
- Parameters
deepcopy (bool) – Flag to control whether to apply copy.deepcopy() to the trials. Note that if you set the flag to False, you shouldn’t mutate any fields of the returned trials; otherwise the internal state of the study may become corrupted and unexpected behavior may occur.
states (Optional[Container[TrialState]]) – Trial states to filter on. If None, include all states. A short filtering sketch follows the Return type below.
- Returns
A list of FrozenTrial objects.
- Return type
List[FrozenTrial]
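Example (an illustrative sketch of the states filter, not from the original reference; it assumes an objective that deliberately prunes some trials):

import optuna
from optuna.trial import TrialState


def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    if x < 0:
        # Prune roughly half of the trials so that several states appear.
        raise optuna.TrialPruned()
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=10)

# Keep only successfully completed trials; the deep copy is skipped for speed,
# so the returned trials must not be mutated.
completed = study.get_trials(deepcopy=False, states=(TrialState.COMPLETE,))
assert all(t.state == TrialState.COMPLETE for t in completed)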
- optimize(func, n_trials=None, timeout=None, n_jobs=1, catch=(), callbacks=None, gc_after_trial=False, show_progress_bar=False)[source]
Optimize an objective function.
Optimization is done by choosing a suitable set of hyperparameter values from a given range, using a sampler that implements value suggestion based on a specified distribution. The sampler is specified in create_study(), and the default choice for the sampler is TPE. See also TPESampler for more details on ‘TPE’.
Optimization will be stopped when a termination signal such as SIGINT or SIGTERM is received. Unlike other signals, a trial is automatically and cleanly failed when SIGINT (Ctrl+C) is received. If n_jobs is greater than one or if a signal other than SIGINT is used, the interrupted trial state won’t be properly updated.
Example
import optuna


def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=3)
- Parameters
func (Callable[[Trial], Union[float, Sequence[float]]]) – A callable that implements an objective function.
n_trials (Optional[int]) – The number of trials for each process. None represents no limit in terms of the number of trials. The study continues to create trials until the number of trials reaches n_trials, the timeout period elapses, stop() is called, or a termination signal such as SIGTERM or Ctrl+C is received.
See also
optuna.study.MaxTrialsCallback can ensure how many times trials will be performed across all processes.
timeout (Union[None, float]) – Stop study after the given number of second(s). None represents no limit in terms of elapsed time. The study continues to create trials until the number of trials reaches n_trials, the timeout period elapses, stop() is called, or a termination signal such as SIGTERM or Ctrl+C is received.
n_jobs (int) – The number of parallel jobs. If this argument is set to -1, the number is set to the CPU count.
Note
n_jobs allows parallelization using threading and may suffer from Python’s GIL. It is recommended to use process-based parallelization if func is CPU bound.
catch (Tuple[Type[Exception], ...]) – A study continues to run even when a trial raises one of the exceptions specified in this argument. Default is an empty tuple, i.e. the study will stop for any exception except for TrialPruned.
callbacks (Optional[List[Callable[[Study, FrozenTrial], None]]]) – List of callback functions that are invoked at the end of each trial. Each function must accept two parameters with the following types in this order: Study and FrozenTrial. A minimal callback sketch is shown after the Return type below.
See also
See the tutorial of Callback for Study.optimize for how to use and implement callback functions.
gc_after_trial (bool) – Flag to determine whether to automatically run garbage collection after each trial. Set to True to run the garbage collection, False otherwise. When it runs, it runs a full collection by internally calling gc.collect(). If you see an increase in memory consumption over several trials, try setting this flag to True.
show_progress_bar (bool) – Flag to show progress bars or not. To disable the progress bar, set this to False. Currently, the progress bar is an experimental feature and is disabled when n_trials is None, timeout is not None, and n_jobs != 1.
- Raises
RuntimeError – If nested invocation of this method occurs.
- Return type
None
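Example (an illustrative callback sketch, not from the original reference; logging_callback is a hypothetical name, and any callable accepting a Study and a FrozenTrial works):

import optuna


def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    return x**2


def logging_callback(study, frozen_trial):
    # Invoked once at the end of each trial with the study and the finished trial.
    print(f"Trial {frozen_trial.number} finished with value {frozen_trial.value}.")


study = optuna.create_study()
study.optimize(objective, n_trials=3, callbacks=[logging_callback])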
- set_system_attr(key, value)[source]
Set a system attribute to the study.
Note that Optuna internally uses this method to save system messages. Please use set_user_attr() to set users’ attributes.
- set_user_attr(key, value)[source]
Set a user attribute to the study.
See also
See user_attrs for the related attribute.
See also
See the recipe on User Attributes.
Example
import optuna


def objective(trial):
    x = trial.suggest_float("x", 0, 1)
    y = trial.suggest_float("y", 0, 1)
    return x**2 + y**2


study = optuna.create_study()

study.set_user_attr("objective function", "quadratic function")
study.set_user_attr("dimensions", 2)
study.set_user_attr("contributors", ["Akiba", "Sano"])

assert study.user_attrs == {
    "objective function": "quadratic function",
    "dimensions": 2,
    "contributors": ["Akiba", "Sano"],
}
- stop()[source]
Exit from the current optimization loop after the running trials finish.
This method lets the running optimize() method return immediately after all trials which the optimize() method spawned finish. This method does not affect any behaviors of parallel or successive study processes. This method only works when it is called inside an objective function or callback.
Example
import optuna


def objective(trial):
    if trial.number == 4:
        trial.study.stop()
    x = trial.suggest_float("x", 0, 10)
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=10)
assert len(study.trials) == 5
- Return type
None
- property system_attrs: Dict[str, Any]
Return system attributes.
- Returns
A dictionary containing all system attributes.
- tell(trial, values=None, state=None, skip_if_finished=False)[source]
Finish a trial created with ask().
See also
The Ask-and-Tell Interface tutorial provides use cases with examples.
Example
import optuna
from optuna.trial import TrialState


def f(x):
    return (x - 2) ** 2


def df(x):
    return 2 * x - 4


study = optuna.create_study()

n_trials = 30

for _ in range(n_trials):
    trial = study.ask()

    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)

    # Iterative gradient descent objective function.
    x = 3  # Initial value.
    for step in range(128):
        y = f(x)

        trial.report(y, step=step)

        if trial.should_prune():
            # Finish the trial with the pruned state.
            study.tell(trial, state=TrialState.PRUNED)
            break

        gy = df(x)
        x -= gy * lr
    else:
        # Finish the trial with the final value after all iterations.
        study.tell(trial, y)
- Parameters
trial (Union[Trial, int]) – A Trial object or a trial number.
values (Optional[Union[float, Sequence[float]]]) – Optional objective value or a sequence of such values in case the study is used for multi-objective optimization. The argument must be provided if state is COMPLETE and should be None if state is FAIL or PRUNED.
state (Optional[TrialState]) – State to be reported. Must be None, COMPLETE, FAIL or PRUNED. If state is None, it will be updated to COMPLETE or FAIL depending on whether validation of the reported values succeeds or not.
skip_if_finished (bool) – Flag to control whether an exception should be raised when values are told for an already finished trial. If True, tell is skipped without any error when the trial is already finished.
- Returns
A FrozenTrial representing the resulting trial. The returned trial is deep-copied, so the user can modify it as needed.
- Return type
FrozenTrial
- property trials: List[FrozenTrial]
Return all trials in the study.
The returned trials are ordered by trial number.
This is a short form of self.get_trials(deepcopy=True, states=None).
- Returns
A list of FrozenTrial objects.
- trials_dataframe(attrs=('number', 'value', 'datetime_start', 'datetime_complete', 'duration', 'params', 'user_attrs', 'system_attrs', 'state'), multi_index=False)[source]
Export trials as a pandas DataFrame.
The DataFrame provides various features to analyze studies. It is also useful to draw a histogram of objective values and to export trials as a CSV file. If there are no trials, an empty DataFrame is returned.
Example
import optuna
import pandas


def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=3)

# Create a dataframe from the study.
df = study.trials_dataframe()
assert isinstance(df, pandas.DataFrame)
assert df.shape[0] == 3  # n_trials.
- Parameters
attrs (Tuple[str, ...]) – Specifies the field names of FrozenTrial to include in the DataFrame of trials.
multi_index (bool) – Specifies whether the returned DataFrame employs a MultiIndex or not. Columns that are hierarchical by nature, such as (params, x), will be flattened to params_x when set to False. A MultiIndex sketch follows the Note below.
- Returns
A pandas DataFrame of trials in the study.
- Return type
pd.DataFrame
Note
If value is in attrs during multi-objective optimization, it is implicitly replaced with values.
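Example (an illustrative sketch of multi_index=True, not from the original reference):

import optuna
import pandas


def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    return x**2


study = optuna.create_study()
study.optimize(objective, n_trials=3)

# With multi_index=True, hierarchical columns such as ("params", "x") are kept
# as a pandas MultiIndex instead of being flattened to "params_x".
df = study.trials_dataframe(attrs=("number", "value", "params"), multi_index=True)
assert isinstance(df.columns, pandas.MultiIndex)
assert ("params", "x") in df.columns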
- property user_attrs: Dict[str, Any]
Return user attributes.
See also
See set_user_attr() for the related method.
Example
import optuna


def objective(trial):
    x = trial.suggest_float("x", 0, 1)
    y = trial.suggest_float("y", 0, 1)
    return x**2 + y**2


study = optuna.create_study()

study.set_user_attr("objective function", "quadratic function")
study.set_user_attr("dimensions", 2)
study.set_user_attr("contributors", ["Akiba", "Sano"])

assert study.user_attrs == {
    "objective function": "quadratic function",
    "dimensions": 2,
    "contributors": ["Akiba", "Sano"],
}
- Returns
A dictionary containing all user attributes.