Welcome to Vigges Developer Community - Open, Learning, Share
Welcome To Ask or Share your Answers For Others

0 votes
3.8k views
in Technique by (71.8m points)

python - Best parameters of an Optuna multi-objective optimization

When performing a single-objective optimization with Optuna, the best parameters of the study are accessible using:

import optuna
def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study(direction='minimize')
study.optimize(objective, n_trials=100)

study.best_params  # E.g. {'x': 2.002108042}

If I want to perform a multi-objective optimization, this would become, for example:

import optuna
def multi_objective(trial):
    x = trial.suggest_float('x', -10, 10)
    f1 = (x - 2) ** 2
    f2 = -f1
    return f1, f2

study = optuna.create_study(directions=['minimize', 'maximize'])
study.optimize(multi_objective, n_trials=100)

This works, but accessing study.best_params fails with RuntimeError: The best trial of a 'study' is only supported for single-objective optimization.

How can I get the best parameters for a multi-objective optimization?



1 Answer

0 votes
by (71.8m points)

In multi-objective optimization, you often don't end up with a single best trial, but rather with a set of trials. This set is often referred to as the Pareto front. You can get this Pareto front, i.e. the list of non-dominated trials, via study.best_trials, and then look at the parameters of each individual trial, e.g. study.best_trials[some_index].params.

For instance, given your directions of minimizing f1 and maximizing f2, you might end up with one trial that has a small value for f1 (good) but at the same time a small value for f2 (bad), while another trial has a large value for both f1 (bad) and f2 (good). Both of these trials could be returned in study.best_trials.


...