optuna.integration.AllenNLPExecutor¶

class optuna.integration.AllenNLPExecutor(trial, config_file, serialization_dir, metrics='best_validation_accuracy', *, include_package=None, force=False, file_friendly_logging=False)[source]

AllenNLP extension to use Optuna with a Jsonnet config file.

This feature is experimental because a major AllenNLP release is expected soon. The interface may change without prior notice to correspond to that update.

See the examples of objective functions.

You can also see the tutorial on our AllenNLP integration in the AllenNLP Guide.

Note

From Optuna v2.1.0, users have to cast their parameters by using methods in Jsonnet. Call std.parseInt for integers, or std.parseJson for floating-point numbers. Please see the example configuration.
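Since std.extVar always returns a string, a config fragment might cast the injected values like this (a sketch; the variable names and trainer settings are illustrative):

```jsonnet
{
  trainer: {
    optimizer: {
      type: "adam",
      lr: std.parseJson(std.extVar("LEARNING_RATE")),  // cast to float
    },
    num_epochs: std.parseInt(std.extVar("NUM_EPOCHS")),  // cast to integer
  },
}
```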

Note

In AllenNLPExecutor, you can pass parameters to AllenNLP either by defining a search space with Optuna's suggest methods or by setting environment variables, just as with the AllenNLP CLI. If a value is set both in an Optuna search space and in an environment variable, the executor uses the value from the Optuna search space.
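The environment-variable path can be sketched as follows; the variable names are hypothetical and only need to match the `std.extVar(...)` calls in your config:

```python
import os

# Values set this way are read by AllenNLPExecutor through std.extVar(...)
# in the Jsonnet config, just as with the AllenNLP CLI. Note that any name
# also suggested via Optuna's search space would take precedence over these.
os.environ["LEARNING_RATE"] = "0.01"
os.environ["DROPOUT"] = "0.3"

print(os.environ["DROPOUT"])
```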

Parameters
• trial – A Trial corresponding to the current evaluation of the objective function.

• config_file – Config file for AllenNLP. Hyperparameters should be masked with std.extVar. Please refer to the config example.

• serialization_dir – A path to which model weights and logs are saved.

• metrics – An evaluation metric used as the value of the objective.

• include_package – Additional packages to include. For more information, please see the AllenNLP documentation.

• force – If True, an executor overwrites the output directory if it exists.

• file_friendly_logging – If True, tqdm status is printed on separate lines and the tqdm refresh rate is slowed.

Note

Added in v1.4.0 as an experimental feature. The interface may change in newer versions without prior notice. See https://github.com/optuna/optuna/releases/tag/v1.4.0.

Methods

run()[source]

Train a model using AllenNLP.

Return type

float