
Passing parameter_constraints with bounds other than [0, 1] while also passing model_gen_kwargs: model_gen_options: optimizer_kwargs: inequality_constraints violates parameter constraints #1635

Closed
sgbaird opened this issue May 26, 2023 · 3 comments

Comments

@sgbaird
Contributor

sgbaird commented May 26, 2023

This is part of the root problem in #1430 (cc @bernardbeckerman), where I'm trying to implement an optimization with the following features:

$^*$ Using the Service API isn't required, but it's the one I have the most experience with, so I've typically been using AxClient instances.

$^{**}$ I might revert to generating initialization trials manually and reparameterizing the equality constraint as an inequality constraint by hiding a variable during the optimization. See #510 (comment). However, with this approach you lose some interpretability, since you no longer get feature importances for all parameters.
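
For concreteness, here is a minimal sketch of that "hidden variable" reparameterization with the Service API, assuming the five-component composition used in the reproducer below; the experiment name and objective name are placeholders, and this is only an illustration of the workaround, not code from the issue.

from ax.service.ax_client import AxClient, ObjectiveProperties

# resin_C is dropped from the search space; the inequality constraint keeps
# the hidden amount nonnegative, so the composition can still sum to 1.
ax_client = AxClient()
ax_client.create_experiment(
    name="hidden_variable_sketch",  # placeholder name
    parameters=[
        {"name": name, "type": "range", "bounds": [0.0, 1.0]}
        for name in ["filler_A", "filler_B", "resin_A", "resin_B"]
    ],
    parameter_constraints=["filler_A + filler_B + resin_A + resin_B <= 1"],
    objectives={"objective": ObjectiveProperties(minimize=False)},  # placeholder objective
)

params, trial_index = ax_client.get_next_trial()
params["resin_C"] = 1.0 - sum(params.values())  # recover the hidden component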

Colab Reproducer


Trying out different configurations

Relevant variables

loose_parameters

[{'name': 'filler_A', 'type': 'range', 'bounds': [0.0, 1.0]},
 {'name': 'filler_B', 'type': 'range', 'bounds': [0.0, 1.0]},
 {'name': 'resin_A', 'type': 'range', 'bounds': [0.0, 1.0]},
 {'name': 'resin_B', 'type': 'range', 'bounds': [0.0, 1.0]},
 {'name': 'resin_C', 'type': 'range', 'bounds': [0.0, 1.0]}]

tight_parameters

[{'name': 'filler_A', 'type': 'range', 'bounds': [0.0, 0.5862264918643882]},
 {'name': 'filler_B', 'type': 'range', 'bounds': [0.0, 0.5862264918643882]},
 {'name': 'resin_A', 'type': 'range', 'bounds': [0.0, 0.7585178857401957]},
 {'name': 'resin_B', 'type': 'range', 'bounds': [0.0, 0.7585178857401957]},
 {'name': 'resin_C', 'type': 'range', 'bounds': [0.0, 0.7585178857401957]}]

parameter_constraints

['filler_A + filler_B >= 0.24048211425980426',
 'filler_A + filler_B <= 0.5872264918643882']

equality_constraints

[(tensor([0, 1, 2, 3, 4]),
  tensor([1., 1., 1., 1., 1.], dtype=torch.float64),
  1)]
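
For reference, the tuple above follows BoTorch's optimize_acqf convention for equality constraints: (indices, coefficients, rhs), read as sum_i coefficients[i] * x[indices[i]] = rhs. A minimal sketch of how it can be built (the component count matches the five parameters above):

import torch

n_components = 5  # filler_A, filler_B, resin_A, resin_B, resin_C
equality_constraints = [
    (
        torch.arange(n_components),                     # indices of all five parameters
        torch.ones(n_components, dtype=torch.float64),  # coefficient 1.0 on each
        1,                                              # the components must sum to 1
    )
]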

Configurations

Loose parameters only ([0, 1] bounds)

  • Obeys equality constraint
ax_client.create_experiment(
    name=experiment_name,
    parameters=loose_parameters,
    objectives=objectives,
    immutable_search_space_and_opt_config=False,
)

# Total, 1.0, 1.0000000002848757, 1.000000000000022, 1.0000000000000044, 1.0

Tight parameters only (not [0, 1] bounds)

  • Violates equality constraint
ax_client.create_experiment(
    name=experiment_name,
    parameters=tight_parameters,
    objectives=objectives,
    immutable_search_space_and_opt_config=False,
)

# Total, 0.5862265232286588, 0.6174594081549808, 0.7585178857401957, 0.5862264918643882, 0.6720687245984942

Loose parameters and parameter_constraints

  • Obeys equality constraint
  • Obeys parameter constraints
  • However, the suggested points are identical within tolerance
ax_client.create_experiment(
    name=experiment_name,
    parameters=loose_parameters,
    parameter_constraints=parameter_constraints,
    objectives=objectives,
    immutable_search_space_and_opt_config=False,
)

# Total, 0.9999999999999999, 1.0000000262342321, 1.0000000000000002, 1.0000000000000457, 1.00000000000556

Tight parameters and parameter_constraints

  • Violates equality constraint
  • Obeys parameter_constraints
ax_client.create_experiment(
    name=experiment_name,
    parameters=tight_parameters,
    parameter_constraints=parameter_constraints,
    objectives=objectives,
    immutable_search_space_and_opt_config=False,
)

# Total, 0.5862264918643891, 0.5862264918643938, 0.5862264918643882, 0.629080942852503, 0.6337654556561465
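
For completeness, a minimal sketch of how the "Total" rows above can be reproduced, assuming an ax_client configured as in one of the blocks above; the metric name "objective" and the dummy outcome are placeholders:

totals = []
for _ in range(5):
    params, trial_index = ax_client.get_next_trial()
    totals.append(sum(params.values()))  # should be ~1.0 if the equality constraint holds
    # Attach a dummy outcome so the loop can continue; the real objective goes here.
    ax_client.complete_trial(trial_index=trial_index, raw_data={"objective": 0.0})

print("Total,", ", ".join(str(t) for t in totals))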

Takeaways

@Balandat
Contributor

Thanks for the repro!

I think the fundamental issue with the non-[0, 1] bounds here is that Ax automatically transforms the parameter ranges to [0, 1] internally. As a result, the inequality and equality constraints that you're passing via the code below should be defined in the transformed space, rather than the original space.

            model_gen_kwargs={
                "model_gen_options": {
                    "optimizer_kwargs": {
                        "inequality_constraints": inequality_constraints,
                        "equality_constraints": equality_constraints,
                    },
                }
            },
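
(For context, a minimal sketch of where this dict is typically attached: as model_gen_kwargs on a GenerationStep in a custom GenerationStrategy handed to AxClient. The Sobol-then-BoTorch step layout is an assumption, not taken from the issue; inequality_constraints and equality_constraints are the tensors shown earlier in the thread.)

from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.service.ax_client import AxClient

gs = GenerationStrategy(
    steps=[
        GenerationStep(model=Models.SOBOL, num_trials=5),
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_gen_kwargs={
                "model_gen_options": {
                    "optimizer_kwargs": {
                        "inequality_constraints": inequality_constraints,
                        "equality_constraints": equality_constraints,
                    },
                }
            },
        ),
    ]
)
ax_client = AxClient(generation_strategy=gs)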

If the constraints are passed to Ax at the level of the search space, then we automatically transform them under the hood and therefore can make sure that they're applied correctly (this is also the reason why you get the error of duplicate kwargs in #1430 - we're already passing the (transformed) constraints to the optimizer).

In the case of passing them in via the optimizer_kwargs as done above, this would have to be done manually: one would have to extract the scaling factors and modify the constraints accordingly. However, this can get dicey and complicated (not least because of the duplicate-kwargs issue), so the better approach would be to define the constraints in the original space and transform them internally; I guess the main issue then is that we currently don't support this for equality constraints.
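
For illustration, a minimal sketch of that manual rescaling, assuming Ax's unit-cube normalization x_norm = (x - lb) / (ub - lb): an original-space constraint sum_i a_i * x_i >= b becomes sum_i a_i * (ub_i - lb_i) * x_norm_i >= b - sum_i a_i * lb_i. The helper below is an illustration only, not Ax's internal code.

import torch

def rescale_constraint(indices, coefficients, rhs, lower, upper):
    # Map one (indices, coefficients, rhs) constraint from the original
    # parameter space into the [0, 1]-normalized space.
    lb, ub = lower[indices], upper[indices]
    new_coefficients = coefficients * (ub - lb)
    new_rhs = rhs - torch.sum(coefficients * lb).item()
    return indices, new_coefficients, new_rhs

# Example with the tight bounds from the reproducer:
lower = torch.zeros(5, dtype=torch.float64)
upper = torch.tensor(
    [0.5862264918643882] * 2 + [0.7585178857401957] * 3, dtype=torch.float64
)
idx, coef, rhs = rescale_constraint(
    torch.arange(5), torch.ones(5, dtype=torch.float64), 1.0, lower, upper
)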

@sgbaird
Contributor Author

sgbaird commented May 30, 2023

@Balandat, thanks for this! It's good to get some confirmation, and what you said makes sense. I get that there are a lot of considerations involved in hooking up this kind of functionality, so in the meantime the reparameterization as an inequality constraint (making one variable "hidden") still seems like the best way to go.

@lena-kashtelyan
Contributor

@sgbaird it seems to me that your question was answered, so I'm closing this issue. Please feel free to follow up on it, but if you do, please reopen the issue as we might not see your comment on a closed one.
