
[tune] Support for multi-objective optimization? #8018

Closed
bitosky opened this issue Apr 14, 2020 · 8 comments
Labels
question Just a question :)

Comments

@bitosky

bitosky commented Apr 14, 2020

What is your question?

In the documentation, "metric" refers to "the training result objective value attribute". It should be set to a single string, such as "mean_loss" or "episode_reward_mean".
Is there any way to tackle a multi-objective optimization problem? For example, using ("mean_loss", "accuracy", "something else") as the metric.

@bitosky bitosky added the question Just a question :) label Apr 14, 2020
@richardliaw
Contributor

One way you can do this is to report a weighted sum of the objective values:

result = {"multiobjective": a * mean_loss + b * accuracy}

Some underlying packages such as Dragonfly (https://dragonfly-opt.readthedocs.io/en/master/getting_started_py/) and Ax (facebook/Ax#185) also have multi-objective optimization support. You should hypothetically be able to get them working with Ray Tune.
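The weighted-sum (scalarization) trick above can be sketched as a plain Python function. The weights `a` and `b`, the sign convention, and the `scalarize` name are assumptions for illustration, not part of the Tune API:

```python
# Hypothetical sketch of the weighted-sum (scalarization) trick:
# combine several objectives into one value that a single-metric
# optimizer can minimize.

def scalarize(mean_loss, accuracy, a=1.0, b=-1.0):
    """Combine two objectives into a single value to minimize.

    Lower loss is better and higher accuracy is better, so accuracy
    gets a negative weight when the combined metric is minimized.
    """
    return a * mean_loss + b * accuracy

# A trainable would then report the combined value, e.g.:
#   tune.report(multiobjective=scalarize(mean_loss, accuracy))
result = {"multiobjective": scalarize(mean_loss=0.3, accuracy=0.9)}
```

The choice of weights encodes how much you value each objective; a scheduler or searcher then sees only the single combined metric.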

@bitosky
Author

bitosky commented Apr 15, 2020

@richardliaw Thanks for answering my question, but I still have a problem.
The schedulers also use a single "metric", and they don't go through an underlying search package. What should I do to tackle a multi-objective optimization problem if I don't use the weighted-sum trick?

async_hb_scheduler = AsyncHyperBandScheduler(
    time_attr='training_iteration',
    metric='episode_reward_mean',
    mode='max',
    max_t=100,
    grace_period=10,
    reduction_factor=3,
    brackets=3)

Besides, is there a plan to support multi-objective optimization?

@richardliaw
Contributor

For existing schedulers, I have no immediate plan to support multi-objective optimization. However, I'm happy to take any contributions. Note that this is not hard to do - adding multi-objective optimization does not require any lower-level modifications to the framework.

I'll close this for now, but feel free to reopen if you have any questions.

@aswanthkrishna

were you able to get this to work? @bitosky

@bitosky
Author

bitosky commented Nov 9, 2021

were you able to get this to work? @bitosky

@aswanthkrishna I didn't solve it directly. I wrote my own scheduler and analyzer to deal with multi-objective optimization problems. It was not easy, and it took me quite a while.
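For readers following up: bitosky's code was never shared, but the core building block of such a custom multi-objective scheduler is typically a Pareto-dominance test between trial results. A minimal, hypothetical sketch (not the author's actual implementation; the `dominates` name and dict-based result format are assumptions):

```python
# Hypothetical sketch: Pareto-dominance check between two trial results.
# A custom scheduler could use this to decide which trials to stop,
# keeping only trials that are not dominated by any other trial.

def dominates(a, b, modes):
    """Return True if result `a` Pareto-dominates result `b`.

    `a` and `b` map metric names to values; `modes` maps each metric
    name to "max" or "min". `a` dominates `b` if it is at least as
    good on every metric and strictly better on at least one.
    """
    at_least_as_good = True
    strictly_better = False
    for metric, mode in modes.items():
        if mode == "max":
            better = a[metric] > b[metric]
        else:
            better = a[metric] < b[metric]
        if not (better or a[metric] == b[metric]):
            at_least_as_good = False
        if better:
            strictly_better = True
    return at_least_as_good and strictly_better
```

A scheduler built on this would stop trials whose latest result is dominated by some other running trial, instead of comparing trials on a single scalar metric.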

@sgbaird

sgbaird commented Nov 18, 2022

@bitosky do you mind sharing your custom scheduler and analyzer example?

Also came across the Optuna Multi-objective Example in the Ray docs. It mentions that schedulers may not work correctly with multi-objective optimization.

@sidharrth2002

Hi, @bitosky just wanted to follow up on this! An example of your custom scheduler would be super-helpful!

@sgbaird

sgbaird commented Jan 2, 2025

Maybe worth mentioning https://honegumi.readthedocs.io/ with the multi-objective selection set to True (without early stopping)


5 participants