# Lightweight, versatile, and platform agnostic architecture

Optuna is entirely written in Python and has few dependencies. This means you can quickly move on to real examples once you become interested in Optuna.

## Quadratic Function Example

Usually, Optuna is used to optimize hyperparameters, but as an example, let’s optimize a simple quadratic function: \((x - 2)^2\).

First of all, import `optuna`.

```
import optuna
```

In Optuna, a function to be optimized is conventionally named `objective`.

```
def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2
```

This function returns the value of \((x - 2)^2\). Our goal is to find the value of `x` that minimizes the output of the `objective` function. This is the “optimization.”
During the optimization, Optuna repeatedly calls and evaluates the objective function with different values of `x`.

A `Trial` object corresponds to a single execution of the objective function and is internally instantiated upon each invocation of the function.

The suggest APIs (for example, `suggest_float()`) are called inside the objective function to obtain parameters for a trial. `suggest_float()` selects parameters uniformly within the provided range; in our example, from \(-10\) to \(10\).
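To build intuition for this call-and-evaluate loop, here is a minimal pure-Python sketch: sample `x` uniformly from the range, evaluate the objective, and keep the best result. This is only an illustration of the cycle described above, not Optuna's algorithm; Optuna's default sampler (TPE) chooses parameters far more intelligently than this uniform random search.

```
import random

def objective_value(x):
    # Same quadratic as in the example: (x - 2)^2
    return (x - 2) ** 2

random.seed(0)  # make this sketch reproducible
best_x, best_value = None, float("inf")
for _ in range(100):  # 100 "trials"
    x = random.uniform(-10, 10)  # roughly what suggest_float("x", -10, 10) provides
    value = objective_value(x)   # one trial: a single objective evaluation
    if value < best_value:
        best_x, best_value = x, value

print(f"Best x: {best_x:.4f}, best value: {best_value:.6f}")
```

With 100 uniform samples in \([-10, 10]\), the best `x` typically lands within about \(0.1\) of the optimum at \(2\).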

To start the optimization, we create a study object and pass the objective function to the method `optimize()` as follows.

```
study = optuna.create_study()
study.optimize(objective, n_trials=100)
```

You can get the best parameter as follows.

```
best_params = study.best_params
found_x = best_params["x"]
print("Found x: {}, (x - 2)^2: {}".format(found_x, (found_x - 2) ** 2))
```

```
Found x: 1.988820391331749, (x - 2)^2: 0.00012498364997523206
```

We can see that the `x` value found by Optuna is close to the optimal value of `2`.

Note

When used to search for hyperparameters in machine learning, usually the objective function would return the loss or accuracy of the model.
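As a concrete illustration of that pattern, the sketch below defines a hypothetical objective that tunes two hyperparameters and returns a validation loss. The `FakeTrial` stub and `validation_loss` function are stand-ins invented for this example so it runs without any ML framework; with Optuna you would receive a real `Trial` object and compute a real model loss.

```
import random

class FakeTrial:
    """Minimal stand-in for a Trial object, for illustration only."""

    def suggest_float(self, name, low, high):
        return random.uniform(low, high)

    def suggest_int(self, name, low, high):
        return random.randint(low, high)

def validation_loss(learning_rate, num_layers):
    # Hypothetical stand-in for training a model and measuring its loss.
    return (learning_rate - 0.01) ** 2 + 0.1 * abs(num_layers - 3)

def objective(trial):
    # Suggest hyperparameters, "train" (here: fake), and return the loss.
    learning_rate = trial.suggest_float("learning_rate", 1e-5, 1e-1)
    num_layers = trial.suggest_int("num_layers", 1, 5)
    return validation_loss(learning_rate, num_layers)

print(objective(FakeTrial()))  # the loss observed in a single trial
```

With Optuna itself, the only change would be passing this `objective` to `study.optimize()`; the suggest calls inside it stay the same.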

## Study Object

Let us clarify the terminology in Optuna as follows:

- **Trial**: A single call of the objective function
- **Study**: An optimization session, which is a set of trials
- **Parameter**: A variable whose value is to be optimized, such as `x` in the above example

In Optuna, we use the study object to manage optimization.
The method `create_study()` returns a study object.
A study object has useful properties for analyzing the optimization outcome.

To get the dictionary of parameter names and parameter values:

```
study.best_params
```

```
{'x': 1.988820391331749}
```

To get the best observed value of the objective function:

```
study.best_value
```

```
0.00012498364997523206
```

To get the best trial:

```
study.best_trial
```

```
FrozenTrial(number=81, state=1, values=[0.00012498364997523206], datetime_start=datetime.datetime(2024, 11, 11, 5, 25, 20, 832978), datetime_complete=datetime.datetime(2024, 11, 11, 5, 25, 20, 837131), params={'x': 1.988820391331749}, user_attrs={}, system_attrs={}, intermediate_values={}, distributions={'x': FloatDistribution(high=10.0, log=False, low=-10.0, step=None)}, trial_id=81, value=None)
```

To get all trials:

```
study.trials
for trial in study.trials[:2]:  # Show first two trials
    print(trial)
```

```
FrozenTrial(number=0, state=1, values=[7.226835916728057], datetime_start=datetime.datetime(2024, 11, 11, 5, 25, 20, 515189), datetime_complete=datetime.datetime(2024, 11, 11, 5, 25, 20, 515837), params={'x': -0.6882774999482582}, user_attrs={}, system_attrs={}, intermediate_values={}, distributions={'x': FloatDistribution(high=10.0, log=False, low=-10.0, step=None)}, trial_id=0, value=None)
FrozenTrial(number=1, state=1, values=[46.89443987270024], datetime_start=datetime.datetime(2024, 11, 11, 5, 25, 20, 516088), datetime_complete=datetime.datetime(2024, 11, 11, 5, 25, 20, 516348), params={'x': 8.84795150922524}, user_attrs={}, system_attrs={}, intermediate_values={}, distributions={'x': FloatDistribution(high=10.0, log=False, low=-10.0, step=None)}, trial_id=1, value=None)
```

To get the number of trials:

```
len(study.trials)
```

```
100
```

By executing `optimize()` again, we can continue the optimization.

```
study.optimize(objective, n_trials=100)
```

To get the updated number of trials:

```
len(study.trials)
```

```
200
```

The objective function is so simple that the last 100 trials barely improve the result. However, we can check the result again:

```
best_params = study.best_params
found_x = best_params["x"]
print("Found x: {}, (x - 2)^2: {}".format(found_x, (found_x - 2) ** 2))
```

```
Found x: 1.9925623051834989, (x - 2)^2: 5.531930418340792e-05
```

**Total running time of the script:** (0 minutes 0.907 seconds)